onyx.textdata.textdata – Support for writing and reading data using the Textdata format

A format and tool set for encoding line-based records into white-space-separated tokens and for decoding such records.

class onyx.textdata.textdata.TextdataBase

Bases: object

Base class for Textdata writer and reader

Holds some basic accessors, constants, and functions; includes some invariant assertions

exception StopDataIteration

Bases: exceptions.StopIteration

args
message
exception TextdataBase.StopHeaderIteration

Bases: exceptions.StopIteration

args
message
TextdataBase.check_configuration()
TextdataBase.file_type

The name of the type of data in the stream, or None; a read-only attribute

TextdataBase.file_version

The version of the type of data in the stream, or None; a read-only attribute

TextdataBase.iter_data(rawtokens)

Generator: for each line, yields a tuple of the decoded tokens

TextdataBase.iter_header(rawtokens)
static TextdataBase.iter_rawtokens(stream)

Generator: yields the list of raw tokens on each line, skipping blank lines
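
The line-to-token stage described above can be sketched as a simple generator. This is a minimal illustration of the documented behavior (whitespace-separated tokens, blank lines skipped), not the actual onyx implementation:

```python
import io

def iter_rawtokens(stream):
    """Yield the list of whitespace-separated raw tokens on each line,
    skipping lines that are blank or whitespace-only."""
    for line in stream:
        tokens = line.split()
        if tokens:  # blank and whitespace-only lines produce no tokens
            yield tokens

# Example: three lines, the middle one blank
lines = io.StringIO("a b c\n\nx y\n")
print(list(iter_rawtokens(lines)))  # [['a', 'b', 'c'], ['x', 'y']]
```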

class onyx.textdata.textdata.TextdataReader(instream, file_type=None, file_version=None, comment_prefix='#', escape_char='%', headerless=False)

Bases: onyx.textdata.textdata.TextdataBase

Reads the Textdata for an object from a stream, returning the unencoded tokens

Gives access to the header information

exception StopDataIteration

Bases: exceptions.StopIteration

args
message
exception TextdataReader.StopHeaderIteration

Bases: exceptions.StopIteration

args
message
TextdataReader.check_configuration()
TextdataReader.file_type

The name of the type of data in the stream, or None; a read-only attribute

TextdataReader.file_version

The version of the type of data in the stream, or None; a read-only attribute

TextdataReader.iter_data(rawtokens)

Generator: for each line, yields a tuple of the decoded tokens

TextdataReader.iter_header(rawtokens)
static TextdataReader.iter_rawtokens(stream)

Generator: yields the list of raw tokens on each line, skipping blank lines
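
A toy sketch of the data-iteration stage, assuming comment lines begin with the comment_prefix and each remaining line becomes a tuple of its tokens. The real TextdataReader.iter_data also reverses the escape_char encoding of each token, which is omitted here:

```python
def iter_data(rawtokens, comment_prefix='#'):
    """Toy decoder: skip comment lines, yield each remaining line's
    tokens as a tuple. (The real reader also undoes the escape_char
    encoding applied by the writer.)"""
    for tokens in rawtokens:
        if tokens[0].startswith(comment_prefix):
            continue  # comment line: discard
        yield tuple(tokens)

raw = [['#', 'a', 'comment'], ['alpha', 'beta'], ['gamma']]
print(list(iter_data(raw)))  # [('alpha', 'beta'), ('gamma',)]
```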

class onyx.textdata.textdata.TextdataWriter(outstream, file_type=None, file_version=None, comment_prefix='#', escape_char='%', initial_newline=True, headerless=False)

Bases: onyx.textdata.textdata.TextdataBase

Writes the encoded Textdata tokens for an object to a stream

Does the checking necessary to ensure that the contents written to the stream are a valid instance of a Textdata object

exception StopDataIteration

Bases: exceptions.StopIteration

args
message
exception TextdataWriter.StopHeaderIteration

Bases: exceptions.StopIteration

args
message
TextdataWriter.check_configuration()
TextdataWriter.close()

Writes markers to note the end of this textdata object in the stream; further attempts to write will raise an error; close() can be called multiple times

TextdataWriter.file_type

The name of the type of data in the stream, or None; a read-only attribute

TextdataWriter.file_version

The version of the type of data in the stream, or None; a read-only attribute

TextdataWriter.iter_data(rawtokens)

Generator: for each line, yields a tuple of the decoded tokens

TextdataWriter.iter_header(rawtokens)
static TextdataWriter.iter_rawtokens(stream)

Generator: yields the list of raw tokens on each line, skipping blank lines

TextdataWriter.write_comment(comment='')

Write the comment string, followed by a newline

TextdataWriter.write_header()
TextdataWriter.write_newline()

Writes a newline

TextdataWriter.write_token(token)

Ensures that the header has been written; encodes the string token to the output stream with spacing

TextdataWriter.write_token_newline(token)

Does write_token(token) followed by write_newline()

TextdataWriter.write_tokens(iterable)

Ensures that the header has been written; encodes each string token from iterable and appends each encoded token to the output stream with spacing

TextdataWriter.write_tokens_newline(iterable)

Does write_tokens(iterable) followed by write_newline()
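
The writing interface can be illustrated with a toy stand-in. Assumed behavior only: tokens are space-separated on a line, comments take the comment_prefix, and close() is idempotent and blocks further writes; the real TextdataWriter additionally writes header and end-of-object markers and applies the escape_char encoding to each token:

```python
import io

class ToyWriter:
    """Minimal stand-in for TextdataWriter: space-separates tokens,
    prefixes comments, and refuses writes after close()."""
    def __init__(self, outstream, comment_prefix='#'):
        self.out = outstream
        self.comment_prefix = comment_prefix
        self.closed = False

    def _check_open(self):
        if self.closed:
            raise ValueError("attempt to write after close()")

    def write_tokens_newline(self, iterable):
        self._check_open()
        self.out.write(' '.join(iterable) + '\n')

    def write_comment(self, comment=''):
        self._check_open()
        self.out.write(self.comment_prefix + ' ' + comment + '\n')

    def close(self):
        self.closed = True  # idempotent, like TextdataWriter.close()

out = io.StringIO()
w = ToyWriter(out)
w.write_comment('example data')
w.write_tokens_newline(['red', 'green', 'blue'])
w.close()
w.close()  # calling close() again is safe
print(out.getvalue())
# # example data
# red green blue
```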