Class AccumuloSampleDataForSplitPointsJobFactory
- java.lang.Object
-
- uk.gov.gchq.gaffer.accumulostore.operation.hdfs.handler.job.factory.AccumuloSampleDataForSplitPointsJobFactory
-
- All Implemented Interfaces:
JobFactory<SampleDataForSplitPoints>, SampleDataForSplitPointsJobFactory
public class AccumuloSampleDataForSplitPointsJobFactory extends Object implements SampleDataForSplitPointsJobFactory
-
-
Field Summary
-
Fields inherited from interface uk.gov.gchq.gaffer.hdfs.operation.handler.job.factory.JobFactory
MAPPER_GENERATOR, SCHEMA, VALIDATE
-
Fields inherited from interface uk.gov.gchq.gaffer.hdfs.operation.handler.job.factory.SampleDataForSplitPointsJobFactory
PROPORTION_TO_SAMPLE
-
-
Constructor Summary
Constructors Constructor Description
AccumuloSampleDataForSplitPointsJobFactory()
AccumuloSampleDataForSplitPointsJobFactory(org.apache.hadoop.conf.Configuration configuration)
-
Method Summary
All Methods Instance Methods Concrete Methods
Modifier and Type Method Description
org.apache.hadoop.mapred.JobConf createJobConf(SampleDataForSplitPoints operation, String mapperGeneratorClassName, Store store)
Creates a JobConf to be used for the add from hdfs.
org.apache.hadoop.io.Writable createKey()
byte[] createSplit(org.apache.hadoop.io.Writable key, org.apache.hadoop.io.Writable value)
org.apache.hadoop.io.Writable createValue()
int getExpectedNumberOfSplits(Store store)
void setupJob(org.apache.hadoop.mapreduce.Job job, SampleDataForSplitPoints operation, String mapperGeneratorClassName, Store store)
Sets up all parts of the Job to be used on the add from hdfs.
-
Methods inherited from class java.lang.Object
equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
-
Methods inherited from interface uk.gov.gchq.gaffer.hdfs.operation.handler.job.factory.JobFactory
createJobs
-
Method Detail
-
createSplit
public byte[] createSplit(org.apache.hadoop.io.Writable key, org.apache.hadoop.io.Writable value)
- Specified by:
createSplit in interface SampleDataForSplitPointsJobFactory
-
createKey
public org.apache.hadoop.io.Writable createKey()
- Specified by:
createKey in interface SampleDataForSplitPointsJobFactory
-
createValue
public org.apache.hadoop.io.Writable createValue()
- Specified by:
createValue in interface SampleDataForSplitPointsJobFactory
-
getExpectedNumberOfSplits
public int getExpectedNumberOfSplits(Store store)
- Specified by:
getExpectedNumberOfSplits in interface SampleDataForSplitPointsJobFactory
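Taken together, createKey(), createValue(), createSplit(...) and getExpectedNumberOfSplits(...) let a caller turn sampled records back into Accumulo split points. A minimal sketch, assuming a fully initialised AccumuloStore and omitting the sequence-file reading that would populate the Writables:

```java
import org.apache.hadoop.io.Writable;

import uk.gov.gchq.gaffer.accumulostore.AccumuloStore;
import uk.gov.gchq.gaffer.accumulostore.operation.hdfs.handler.job.factory.AccumuloSampleDataForSplitPointsJobFactory;

public class SplitPointSketch {
    public static void main(final String[] args) {
        final AccumuloSampleDataForSplitPointsJobFactory factory =
                new AccumuloSampleDataForSplitPointsJobFactory();

        // Empty key/value Writables of the store-specific types; in real use
        // they would be populated from the sampled sequence file output.
        final Writable key = factory.createKey();
        final Writable value = factory.createValue();

        // Serialise one sampled record into a candidate split point.
        final byte[] splitPoint = factory.createSplit(key, value);

        // Assumption: the store is initialised with a schema and properties,
        // so the factory can derive the split count from the cluster.
        final AccumuloStore store = new AccumuloStore();
        final int numSplits = factory.getExpectedNumberOfSplits(store);
        System.out.println(numSplits + " expected splits; first candidate is "
                + splitPoint.length + " bytes");
    }
}
```

This is a structural sketch only; running it requires Gaffer's accumulo-store module and Hadoop on the classpath.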
-
createJobConf
public org.apache.hadoop.mapred.JobConf createJobConf(SampleDataForSplitPoints operation, String mapperGeneratorClassName, Store store) throws IOException
Description copied from interface: JobFactory
Creates a JobConf to be used for the add from hdfs.
- Specified by:
createJobConf in interface JobFactory<SampleDataForSplitPoints>
- Parameters:
operation - The Operation.
mapperGeneratorClassName - Class name for the MapperGenerator class.
store - The store.
- Returns:
The JobConf.
- Throws:
IOException - For IO issues.
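A hedged usage sketch for createJobConf. The operation and store would normally arrive pre-configured from the Gaffer operation handler; the stub construction and the mapper generator class name below are illustrative assumptions, not part of this page's API:

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapred.JobConf;

import uk.gov.gchq.gaffer.accumulostore.AccumuloStore;
import uk.gov.gchq.gaffer.accumulostore.operation.hdfs.handler.job.factory.AccumuloSampleDataForSplitPointsJobFactory;
import uk.gov.gchq.gaffer.hdfs.operation.SampleDataForSplitPoints;

public class CreateJobConfSketch {
    public static void main(final String[] args) throws IOException {
        // Assumption: in practice these are supplied by the framework,
        // already configured; they are stubbed here for illustration.
        final SampleDataForSplitPoints operation = new SampleDataForSplitPoints();
        final AccumuloStore store = new AccumuloStore();

        final AccumuloSampleDataForSplitPointsJobFactory factory =
                new AccumuloSampleDataForSplitPointsJobFactory(new Configuration());

        // "my.package.MyMapperGenerator" is a hypothetical class name.
        final JobConf jobConf = factory.createJobConf(
                operation, "my.package.MyMapperGenerator", store);
        System.out.println("Job name: " + jobConf.getJobName());
    }
}
```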
-
setupJob
public void setupJob(org.apache.hadoop.mapreduce.Job job, SampleDataForSplitPoints operation, String mapperGeneratorClassName, Store store) throws IOException
Description copied from interface: JobFactory
Sets up all parts of the Job to be used on the add from hdfs.
- Specified by:
setupJob in interface JobFactory<SampleDataForSplitPoints>
- Parameters:
job - The Job to be executed.
operation - The Operation.
mapperGeneratorClassName - Class Name for the MapperGenerator class.
store - The store.
- Throws:
IOException - For IO issues.
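createJobConf and setupJob are naturally used together: build the JobConf, wrap it in a Job, then let the factory configure the Job's mapper, formats and paths. A sketch under the same assumptions as above (pre-configured operation and store, hypothetical mapper generator class name):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapreduce.Job;

import uk.gov.gchq.gaffer.accumulostore.AccumuloStore;
import uk.gov.gchq.gaffer.accumulostore.operation.hdfs.handler.job.factory.AccumuloSampleDataForSplitPointsJobFactory;
import uk.gov.gchq.gaffer.hdfs.operation.SampleDataForSplitPoints;

public class SetupJobSketch {
    public static void main(final String[] args) throws Exception {
        final SampleDataForSplitPoints operation = new SampleDataForSplitPoints(); // assumed pre-configured
        final AccumuloStore store = new AccumuloStore(); // assumed initialised
        final String mapperGenerator = "my.package.MyMapperGenerator"; // hypothetical

        final AccumuloSampleDataForSplitPointsJobFactory factory =
                new AccumuloSampleDataForSplitPointsJobFactory(new Configuration());

        // Build the JobConf first, then have the factory wire up the
        // mapper, reducer, input/output formats and paths on the Job.
        final JobConf jobConf = factory.createJobConf(operation, mapperGenerator, store);
        final Job job = Job.getInstance(jobConf);
        factory.setupJob(job, operation, mapperGenerator, store);

        // Submission shown for completeness; it needs a live Hadoop cluster.
        // job.waitForCompletion(true);
    }
}
```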
-