Class GetDataFrameOfElements

  • All Implemented Interfaces:
    Closeable, AutoCloseable, GraphFilters, OperationView, Output<org.apache.spark.sql.Dataset<org.apache.spark.sql.Row>>, Operation

    public class GetDataFrameOfElements
    extends Object
    implements Output<org.apache.spark.sql.Dataset<org.apache.spark.sql.Row>>, GraphFilters
    An Operation that returns an Apache Spark DataFrame (i.e. a Dataset of Rows) consisting of the Elements converted to Rows. The fields in each Row are ordered according to the ordering of the groups in the view, with Entities first, followed by Edges.

    Implementations of this operation should automatically convert all properties that have natural equivalents as a Spark DataType to that DataType. An implementation may allow the user to specify a conversion function for properties that do not have natural equivalents. Thus not all properties from each Element will necessarily make it into the DataFrame.

    The schema of the DataFrame consists of all properties from the first group, followed by any properties from the second group that are not already present in the first group, and so on for subsequent groups.
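
    The following is a minimal sketch of one way to run this operation, assuming the class's inner Builder and the usual Graph.execute pattern from the Gaffer core API; the group names "person" and "knows" and the import paths are illustrative assumptions rather than part of this page.

        import org.apache.spark.sql.Dataset;
        import org.apache.spark.sql.Row;
        import uk.gov.gchq.gaffer.data.elementdefinition.view.View;
        import uk.gov.gchq.gaffer.graph.Graph;
        import uk.gov.gchq.gaffer.operation.OperationException;
        import uk.gov.gchq.gaffer.spark.operation.dataframe.GetDataFrameOfElements;
        import uk.gov.gchq.gaffer.user.User;

        public final class GetDataFrameExample {

            // Returns the elements of the supplied graph as a Spark DataFrame.
            // "person" and "knows" are illustrative group names.
            public static Dataset<Row> elementsAsDataFrame(final Graph graph, final User user)
                    throws OperationException {
                final GetDataFrameOfElements operation = new GetDataFrameOfElements.Builder()
                        .view(new View.Builder()
                                .entity("person")  // entity groups come first in each Row
                                .edge("knows")     // edge groups follow
                                .build())
                        .build();
                return graph.execute(operation, user);
            }
        }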

    • Constructor Detail

      • GetDataFrameOfElements

        public GetDataFrameOfElements()
      • GetDataFrameOfElements

        public GetDataFrameOfElements(List<Converter> converters)
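
        A minimal sketch of using this constructor; the Converter implementation (for properties with no natural Spark DataType equivalent) is assumed to be supplied by the caller, and the import path for Converter is an assumption based on the standard Gaffer Spark library packages.

            import java.util.Collections;
            import java.util.List;
            import uk.gov.gchq.gaffer.spark.operation.dataframe.GetDataFrameOfElements;
            import uk.gov.gchq.gaffer.spark.operation.dataframe.converter.property.Converter;

            public final class ConvertersExample {

                // Wraps a single caller-supplied Converter in the List expected by the constructor.
                // The Converter implementation itself is defined elsewhere.
                public static GetDataFrameOfElements withConverter(final Converter customConverter) {
                    final List<Converter> converters = Collections.singletonList(customConverter);
                    return new GetDataFrameOfElements(converters);
                }
            }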
    • Method Detail

      • setConverters

        public void setConverters(List<Converter> converters)
      • getOptions

        public Map<String,String> getOptions()
        Specified by:
        getOptions in interface Operation
        Returns:
        the operation options. This may contain store-specific options such as authorisation strings or other properties required for the operation to be executed. Note that these options will probably not be interpreted in the same way by every store implementation.
      • setOptions

        public void setOptions(Map<String,String> options)
        Specified by:
        setOptions in interface Operation
        Parameters:
        options - the operation options. This may contain store-specific options such as authorisation strings or other properties required for the operation to be executed. Note that these options will probably not be interpreted in the same way by every store implementation.
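
        A short sketch of attaching options to the operation and reading them back with getOptions; the option key and value are purely illustrative and carry no meaning to any particular store.

            import java.util.HashMap;
            import java.util.Map;
            import uk.gov.gchq.gaffer.spark.operation.dataframe.GetDataFrameOfElements;

            public final class OperationOptionsExample {

                // Attaches store-specific options to a new operation instance.
                public static GetDataFrameOfElements withOptions() {
                    final GetDataFrameOfElements operation = new GetDataFrameOfElements();
                    final Map<String, String> options = new HashMap<>();
                    options.put("example.option.key", "exampleValue"); // illustrative key/value
                    operation.setOptions(options);
                    return operation;
                }
            }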
      • getOutputTypeReference

        public com.fasterxml.jackson.core.type.TypeReference<org.apache.spark.sql.Dataset<org.apache.spark.sql.Row>> getOutputTypeReference()
        Specified by:
        getOutputTypeReference in interface Output<org.apache.spark.sql.Dataset<org.apache.spark.sql.Row>>
      • shallowClone

        public GetDataFrameOfElements shallowClone()
        Description copied from interface: Operation
        Operation implementations should ensure a shallowClone method is implemented. It performs a shallow clone: a new instance is created and the field values are copied across, but the fields themselves are not cloned. If the operation contains nested operations, these must also be cloned.
        Specified by:
        shallowClone in interface Operation
        Returns:
        shallow clone
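
        A sketch illustrating the shallow-clone contract described above, assuming the class's inner Builder and the getView accessor inherited from OperationView; the "knows" group name is illustrative.

            import uk.gov.gchq.gaffer.data.elementdefinition.view.View;
            import uk.gov.gchq.gaffer.spark.operation.dataframe.GetDataFrameOfElements;

            public final class ShallowCloneExample {

                public static void main(final String[] args) {
                    final GetDataFrameOfElements original = new GetDataFrameOfElements.Builder()
                            .view(new View.Builder().edge("knows").build()) // illustrative group
                            .build();

                    final GetDataFrameOfElements clone = original.shallowClone();

                    // A new instance is created...
                    System.out.println(clone != original);                      // true
                    // ...but fields such as the view are copied across, not cloned.
                    System.out.println(clone.getView() == original.getView());  // expected true
                }
            }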