Class AccumuloSampleDataForSplitPointsJobFactory
- java.lang.Object
-
- uk.gov.gchq.gaffer.accumulostore.operation.hdfs.handler.job.factory.AccumuloSampleDataForSplitPointsJobFactory
-
- All Implemented Interfaces:
JobFactory<SampleDataForSplitPoints>, SampleDataForSplitPointsJobFactory
public class AccumuloSampleDataForSplitPointsJobFactory extends Object implements SampleDataForSplitPointsJobFactory
-
-
Field Summary
-
Fields inherited from interface uk.gov.gchq.gaffer.hdfs.operation.handler.job.factory.JobFactory
MAPPER_GENERATOR, SCHEMA, VALIDATE
-
Fields inherited from interface uk.gov.gchq.gaffer.hdfs.operation.handler.job.factory.SampleDataForSplitPointsJobFactory
PROPORTION_TO_SAMPLE
-
-
Constructor Summary
Constructors
AccumuloSampleDataForSplitPointsJobFactory()
AccumuloSampleDataForSplitPointsJobFactory(org.apache.hadoop.conf.Configuration configuration)
-
Method Summary
All Methods Instance Methods Concrete Methods
-
org.apache.hadoop.mapred.JobConf
createJobConf(SampleDataForSplitPoints operation, String mapperGeneratorClassName, Store store)
Creates a JobConf to be used for the add from HDFS.
-
org.apache.hadoop.io.Writable
createKey()
-
byte[]
createSplit(org.apache.hadoop.io.Writable key, org.apache.hadoop.io.Writable value)
-
org.apache.hadoop.io.Writable
createValue()
-
int
getExpectedNumberOfSplits(Store store)
-
void
setupJob(org.apache.hadoop.mapreduce.Job job, SampleDataForSplitPoints operation, String mapperGeneratorClassName, Store store)
Sets up all parts of the Job to be used on the add from HDFS.
Methods inherited from class java.lang.Object
equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
-
Methods inherited from interface uk.gov.gchq.gaffer.hdfs.operation.handler.job.factory.JobFactory
createJobs
-
-
Method Detail
-
createSplit
public byte[] createSplit(org.apache.hadoop.io.Writable key, org.apache.hadoop.io.Writable value)
- Specified by:
createSplit
in interface SampleDataForSplitPointsJobFactory
-
createKey
public org.apache.hadoop.io.Writable createKey()
- Specified by:
createKey
in interface SampleDataForSplitPointsJobFactory
-
createValue
public org.apache.hadoop.io.Writable createValue()
- Specified by:
createValue
in interface SampleDataForSplitPointsJobFactory
-
getExpectedNumberOfSplits
public int getExpectedNumberOfSplits(Store store)
- Specified by:
getExpectedNumberOfSplits
in interface SampleDataForSplitPointsJobFactory
-
createJobConf
public org.apache.hadoop.mapred.JobConf createJobConf(SampleDataForSplitPoints operation, String mapperGeneratorClassName, Store store) throws IOException
Description copied from interface: JobFactory
Creates a JobConf to be used for the add from HDFS.
- Specified by:
createJobConf in interface JobFactory<SampleDataForSplitPoints>
- Parameters:
operation - The Operation.
mapperGeneratorClassName - Class name for the MapperGenerator class.
store - The store.
- Returns:
The JobConf.
- Throws:
IOException - For IO issues.
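A minimal usage sketch (not from the source), assuming an initialised Gaffer AccumuloStore and a configured SampleDataForSplitPoints operation are already available; the `store` and `operation` variables and the mapper generator class name below are hypothetical placeholders:

```java
// Sketch only: `store` (an AccumuloStore) and `operation` (a
// SampleDataForSplitPoints) are assumed to exist; they are not defined here.
AccumuloSampleDataForSplitPointsJobFactory factory =
        new AccumuloSampleDataForSplitPointsJobFactory();

// The mapperGeneratorClassName identifies the MapperGenerator used to turn
// input records into Gaffer elements; this class name is a placeholder.
org.apache.hadoop.mapred.JobConf jobConf =
        factory.createJobConf(operation, "com.example.MyMapperGenerator", store);
```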
-
setupJob
public void setupJob(org.apache.hadoop.mapreduce.Job job, SampleDataForSplitPoints operation, String mapperGeneratorClassName, Store store) throws IOException
Description copied from interface: JobFactory
Sets up all parts of the Job to be used on the add from HDFS.
- Specified by:
setupJob in interface JobFactory<SampleDataForSplitPoints>
- Parameters:
job - The Job to be executed.
operation - The Operation.
mapperGeneratorClassName - Class name for the MapperGenerator class.
store - The store.
- Throws:
IOException - For IO issues.
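A hedged sketch of configuring a Job through this factory (not from the source); as above, `store` and `operation` are hypothetical placeholders, and the mapper generator class name is illustrative only:

```java
// Sketch only: assumes an initialised AccumuloStore (`store`) and a
// SampleDataForSplitPoints operation (`operation`) are available.
org.apache.hadoop.conf.Configuration conf = new org.apache.hadoop.conf.Configuration();
org.apache.hadoop.mapreduce.Job job = org.apache.hadoop.mapreduce.Job.getInstance(conf);

AccumuloSampleDataForSplitPointsJobFactory factory =
        new AccumuloSampleDataForSplitPointsJobFactory(job.getConfiguration());

// Sets up all parts of the Job (mapper, reducer, formats, paths) for the
// add-from-HDFS sampling; the class name here is a placeholder.
factory.setupJob(job, operation, "com.example.MyMapperGenerator", store);
```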