All Classes and Interfaces
An InputFormat for Avro data files, which converts each datum to string form in the input key.
Compares Avro string data (data with schema "string").
A collector for map and reduce output.
Converts a Java object into an Avro datum.
Constructs converters that turn objects (usually from the output of a MR job)
into Avro data that can be serialized.
Converts AvroWrappers into their wrapped Avro data.
Converts BooleanWritables into Booleans.
Converts BytesWritables into ByteBuffers.
Converts ByteWritables into GenericFixed of size 1.
Converts DoubleWritables into Doubles.
Converts FloatWritables into Floats.
Converts IntWritables into Ints.
Converts LongWritables into Longs.
Converts NullWritables into Nulls.
Converts Text into CharSequences.
Deserializes AvroWrapper objects within Hadoop.
An InputFormat for Avro data files.
Setters to configure jobs for Avro data.
Utility methods for configuring jobs that work with Avro.
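As a minimal sketch of such job configuration (assuming the newer org.apache.avro.mapreduce API; the helper class `AvroJobSetup`, its method name, and the job name are illustrative, not part of the library):

```java
import org.apache.avro.Schema;
import org.apache.avro.mapreduce.AvroJob;
import org.apache.avro.mapreduce.AvroKeyInputFormat;
import org.apache.avro.mapreduce.AvroKeyOutputFormat;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class AvroJobSetup {
  // Configure a job that reads and writes Avro container files of AvroKey records.
  public static Job configure(Configuration conf, Schema inputSchema, Schema outputSchema)
      throws Exception {
    Job job = Job.getInstance(conf, "avro-example");
    // Read Avro container files whose records match inputSchema.
    job.setInputFormatClass(AvroKeyInputFormat.class);
    AvroJob.setInputKeySchema(job, inputSchema);
    // Write Avro container files whose records match outputSchema.
    job.setOutputFormatClass(AvroKeyOutputFormat.class);
    AvroJob.setOutputKeySchema(job, outputSchema);
    return job;
  }
}
```

The schema setters store the schema JSON in the job configuration, where the input/output formats and the Avro serialization pick it up at task time.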
The wrapper of keys for jobs configured with AvroJob.
The RawComparator used by jobs configured with AvroJob.
The RawComparator used by jobs configured with AvroJob.
Deserializes AvroKey objects within Hadoop.
A MapReduce InputFormat that can handle Avro container files.
FileOutputFormat for writing Avro container files.
A factory for creating record writers.
Reads records from an input split representing a chunk of an Avro container
file.
Writes Avro records to an Avro container file output stream.
A helper object for working with the Avro generic records that are used to
store key/value pairs in an Avro container file.
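A sketch of building one such generic key/value record with the helper (assuming the org.apache.avro.hadoop.io.AvroKeyValue API; the wrapper class `KeyValueExample` and its method name are illustrative):

```java
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.hadoop.io.AvroKeyValue;

public class KeyValueExample {
  // Wrap a (string, int) pair in the generic key/value record layout
  // used inside Avro container files of key/value pairs.
  public static AvroKeyValue<CharSequence, Integer> makePair(CharSequence key, Integer value) {
    // Build the record schema: {"key": string, "value": int}.
    Schema schema = AvroKeyValue.getSchema(
        Schema.create(Schema.Type.STRING), Schema.create(Schema.Type.INT));
    AvroKeyValue<CharSequence, Integer> pair =
        new AvroKeyValue<>(new GenericData.Record(schema));
    pair.setKey(key);
    pair.setValue(value);
    return pair;
  }
}
```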
A wrapper for iterators over GenericRecords that are known to be KeyValue
records.
A MapReduce InputFormat that reads from Avro container files of key/value
generic records.
FileOutputFormat for writing Avro container files of key/value pairs.
Reads Avro generic records from an Avro container file, where the records
contain two fields: 'key' and 'value'.
Writes key/value pairs to an Avro container file.
A mapper for Avro data.
This class supports Avro-MapReduce jobs that have multiple input paths, with a different Schema and AvroMapper for each path.
The AvroMultipleOutputs class simplifies writing Avro output data to multiple outputs.
The AvroMultipleOutputs class simplifies writing Avro output data to multiple outputs.
An OutputFormat for Avro data files.
Abstract base class for output formats that write Avro container files.
A RecordReader for Avro data files.
Abstract base class for RecordReaders that read Avro container files.
A reducer for Avro data.
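A word-count sketch of the mapper/reducer pair (assuming the older org.apache.avro.mapred API; the class names `WordCountMapper` and `SumReducer` are illustrative):

```java
import java.io.IOException;
import org.apache.avro.mapred.AvroCollector;
import org.apache.avro.mapred.AvroMapper;
import org.apache.avro.mapred.AvroReducer;
import org.apache.avro.mapred.Pair;
import org.apache.avro.util.Utf8;
import org.apache.hadoop.mapred.Reporter;

// Emits (word, 1) for each whitespace-separated word of an input line.
public class WordCountMapper extends AvroMapper<Utf8, Pair<Utf8, Long>> {
  @Override
  public void map(Utf8 line, AvroCollector<Pair<Utf8, Long>> collector, Reporter reporter)
      throws IOException {
    for (String word : line.toString().split("\\s+")) {
      collector.collect(new Pair<>(new Utf8(word), 1L));
    }
  }
}

// Sums the counts emitted for each word.
class SumReducer extends AvroReducer<Utf8, Long, Pair<Utf8, Long>> {
  @Override
  public void reduce(Utf8 word, Iterable<Long> counts,
      AvroCollector<Pair<Utf8, Long>> collector, Reporter reporter) throws IOException {
    long sum = 0;
    for (long c : counts) {
      sum += c;
    }
    collector.collect(new Pair<>(word, sum));
  }
}
```

Unlike plain Hadoop mappers, these operate directly on Avro datum types; the framework handles the AvroWrapper/NullWritable plumbing.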
A wrapper around a Hadoop SequenceFile that also supports reading and writing Avro data.
A reader for SequenceFiles that may contain Avro data.
A helper class to encapsulate the options that can be used to construct a
Reader.
A writer for an uncompressed SequenceFile that supports Avro data.
A helper class to encapsulate the options that can be used to construct a
Writer.
An input format for reading from AvroSequenceFiles (sequence files that
support Avro data).
A sequence file output format that knows how to write AvroKeys and AvroValues
in addition to Writables.
The Serialization used by jobs configured with AvroJob.
The Serialization used by jobs configured with AvroJob.
Serializes AvroWrapper objects within Hadoop.
The equivalent of TextOutputFormat for writing to Avro data files with a "bytes" schema.
An InputFormat for text files.
The wrapper of values for jobs configured with AvroJob.
Deserializes AvroValue objects within Hadoop.
The wrapper of data for jobs configured with AvroJob.
A combine input format that packs many small Avro key/value files into a single mapper's input split.
Adapts an FSDataInputStream to SeekableInput.
Encapsulates the ability to specify and configure an Avro compression codec from a given Hadoop codec, defined with the configuration parameter mapred.output.compression.codec. Currently five codecs are registered by default:
org.apache.hadoop.io.compress.DeflateCodec maps to deflate
org.apache.hadoop.io.compress.SnappyCodec maps to snappy
org.apache.hadoop.io.compress.BZip2Codec maps to bzip2
org.apache.hadoop.io.compress.GZipCodec maps to deflate
org.apache.hadoop.io.compress.ZStandardCodec maps to zstandard
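A sketch of this mapping in code (assuming org.apache.avro.hadoop.file.HadoopCodecFactory.fromHadoopString is the lookup entry point; the wrapper class `CodecMapping` is illustrative):

```java
import org.apache.avro.file.CodecFactory;
import org.apache.avro.hadoop.file.HadoopCodecFactory;

public class CodecMapping {
  // Translate a Hadoop codec class name into the matching Avro CodecFactory,
  // e.g. SnappyCodec -> snappy, GZipCodec -> deflate.
  public static CodecFactory forJob(String hadoopCodecClassName) {
    return HadoopCodecFactory.fromHadoopString(hadoopCodecClassName);
  }
}
```

A job would typically read the class name from mapred.output.compression.codec in its Configuration and pass it through this factory when opening the Avro container file writer.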
Transmit inputs to a map or reduce task sub-process.
Transmit inputs to a map or reduce task sub-process.
Transmit outputs from a map or reduce task to parent.
Transmit outputs from a map or reduce task to parent.
A key/value pair.
An InputFormat for sequence files.
A FileReader for sequence files.
A RecordReader for sequence files.
A SortedKeyValueFile is an indexed Avro container file of KeyValue records sorted by key.
Reads a SortedKeyValueFile by loading the key index into memory.
A class to encapsulate the options of a Reader.
Writes a SortedKeyValueFile.
A class to encapsulate the various options of a SortedKeyValueFile.Writer.
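A write-then-lookup sketch (assuming the org.apache.avro.hadoop.file.SortedKeyValueFile API with its Writer.Options/Reader.Options builders; the class `SkvfExample`, its method name, and the sample keys are illustrative):

```java
import org.apache.avro.Schema;
import org.apache.avro.hadoop.file.SortedKeyValueFile;
import org.apache.avro.util.Utf8;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;

public class SkvfExample {
  public static CharSequence writeAndLookup(Path path) throws Exception {
    Configuration conf = new Configuration();
    Schema str = Schema.create(Schema.Type.STRING);
    // Records must be appended in ascending key order.
    SortedKeyValueFile.Writer<CharSequence, CharSequence> writer =
        new SortedKeyValueFile.Writer<>(new SortedKeyValueFile.Writer.Options()
            .withKeySchema(str).withValueSchema(str)
            .withConfiguration(conf).withPath(path));
    try {
      writer.append(new Utf8("apple"), new Utf8("fruit"));
      writer.append(new Utf8("carrot"), new Utf8("vegetable"));
    } finally {
      writer.close();
    }
    SortedKeyValueFile.Reader<CharSequence, CharSequence> reader =
        new SortedKeyValueFile.Reader<>(new SortedKeyValueFile.Reader.Options()
            .withKeySchema(str).withValueSchema(str)
            .withConfiguration(conf).withPath(path));
    try {
      // Random access: the in-memory key index locates the data block,
      // which is then scanned for the exact key.
      return reader.get(new Utf8("carrot"));
    } finally {
      reader.close();
    }
  }
}
```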
Constructs and submits tether jobs.