class S3Connector extends DBConnector

Linear Supertypes
DBConnector, AnyRef, Any

Instance Constructors

  1. new S3Connector()
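
A minimal sketch of the constructor and the connection lifecycle, assembled only from the member signatures documented below. How the Config value is obtained is a placeholder, and imports are omitted because this page does not list package paths.

    // Sketch only: S3Connector, Config and MetaData are assumed to be in scope.
    val config: Config = ???                 // placeholder: built by the application's own configuration parsing
    val connector = new S3Connector()

    connector.initialize_spark(config)       // Spark connection to S3 via the Hadoop properties and the aws-bundle library
    connector.initialize_db(config)          // create the tables, removing previous ones
    val metaData: MetaData = connector.get_metadata(config)   // combine stored metadata with the new config

    // ... read/write calls go here (see the sketches after the Value Members list) ...

    connector.write_metadata(metaData)       // persist metadata updates
    connector.closeSpark()                   // close the Spark connection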

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native() @IntrinsicCandidate()
  6. def closeSpark(): Unit

    Closes the Spark connection

    Definition Classes
    DBConnector
  7. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  8. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  9. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native() @IntrinsicCandidate()
  10. def get_metadata(config: Config): MetaData

    Constructs the appropriate metadata by combining what is already stored in the database with the new settings presented in the config object

    config

    contains the configuration passed during execution

    returns

    the metadata

    Definition Classes
    S3Connector → DBConnector
  11. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native() @IntrinsicCandidate()
  12. def initialize_db(config: Config): Unit

    Creates the appropriate tables, removing any previous ones

    Definition Classes
    S3Connector → DBConnector
  13. def initialize_spark(config: Config): Unit

    Initializes the Spark connection to S3 using the Hadoop properties and the aws-bundle library

    Definition Classes
    S3Connector → DBConnector
  14. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  15. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  16. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @IntrinsicCandidate()
  17. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @IntrinsicCandidate()
  18. def read_last_checked_table(metaData: MetaData): RDD[LastChecked]

    Loads data from the LastChecked Table, which contains the last timestamp per event type pair per trace

    metaData

    Object containing the metadata

    returns

    An RDD with the last timestamps per event type pair per trace

    Definition Classes
    S3Connector → DBConnector
  19. def read_sequence_table(metaData: MetaData, detailed: Boolean = false): RDD[EventTrait]

    Reads data as an RDD from the SeqTable (see the read-path sketch after this list)

    metaData

    Object containing the metadata

    returns

    The traces stored in the SeqTable, as an RDD

    Definition Classes
    S3Connector → DBConnector
  20. def read_single_table(metaData: MetaData): RDD[Event]

    Loads the single inverted index from S3, stored in the SingleTable

    metaData

    Object containing the metadata

    returns

    The stored data as an RDD

    Definition Classes
    S3Connector → DBConnector
  21. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  22. def toString(): String
    Definition Classes
    AnyRef → Any
  23. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  24. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  25. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  26. def write_count_table(counts: RDD[Count], metaData: MetaData): Unit

    Writes the counts to the CountTable

    counts

    The calculated basic statistics per event type pair, to be stored in the count table

    metaData

    Object containing the metadata

    Definition Classes
    S3Connector → DBConnector
  27. def write_index_table(newPairs: RDD[PairFull], metaData: MetaData): Unit

    Writes the combined pairs back to S3, grouped by the interval and the first event

    newPairs

    The newly generated pairs

    metaData

    Object containing the metadata

    Definition Classes
    S3Connector → DBConnector
  28. def write_last_checked_table(lastChecked: RDD[LastChecked], metaData: MetaData): Unit

    Stores the new last-checked records back in the database

    lastChecked

    Records containing the timestamp of the last completion for each event type pair per trace, combined with the records previously read from the LastChecked Table; this way duplicate reads are avoided

    metaData

    Object containing the metadata

    Definition Classes
    S3Connector → DBConnector
  29. def write_metadata(metaData: MetaData): Unit

    Persists the metadata

    metaData

    Object containing the metadata

    Definition Classes
    S3Connector → DBConnector
  30. def write_sequence_table(sequenceRDD: RDD[EventTrait], metaData: MetaData, detailed: Boolean = false): Unit

    Writes traces to the auxiliary SeqTable. Since the RDD is used as an intermediate result, it is already persisted and should not be modified. Additionally, updates the metaData object (see the write-path sketch after this list)

    sequenceRDD

    The RDD containing the traces with the new events

    metaData

    Object containing the metadata

    Definition Classes
    S3Connector → DBConnector
  31. def write_single_table(sequenceRDD: RDD[EventTrait], metaData: MetaData): Unit

    Writes traces to the auxiliary SingleTable. The RDD passed to this method is not persisted; the database should persist it before storing it and unpersist it at the end. Additionally, updates the metaData object

    sequenceRDD

    Contains the newly indexed events in the form of a single inverted index

    metaData

    Object containing the metadata

    Definition Classes
    S3Connector → DBConnector
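
The write_* members above are typically invoked together when a batch of newly indexed data is flushed to S3. The sketch below is illustrative only: the RDD values are placeholders for results produced elsewhere in the indexing job, the call order is not prescribed by this page, and connector and metaData come from the lifecycle sketch near the top.

    import org.apache.spark.rdd.RDD   // Spark's RDD; the model types (EventTrait, PairFull, Count, LastChecked) are assumed to be in scope

    // Placeholders (???) for data produced by the indexing job.
    val newTraces: RDD[EventTrait]       = ???   // traces containing the new events
    val newPairs: RDD[PairFull]          = ???   // newly generated pairs
    val newCounts: RDD[Count]            = ???   // basic statistics per event type pair
    val newLastChecked: RDD[LastChecked] = ???   // last-completion timestamps merged with previously read records

    connector.write_sequence_table(newTraces, metaData)          // append traces to the SeqTable (RDD already persisted by the caller)
    connector.write_single_table(newTraces, metaData)            // store the single inverted index (persisted inside the connector)
    connector.write_index_table(newPairs, metaData)              // pairs grouped by interval and first event
    connector.write_count_table(newCounts, metaData)             // per-pair statistics into the count table
    connector.write_last_checked_table(newLastChecked, metaData) // refresh the LastChecked Table
    connector.write_metadata(metaData)                           // persist the updated metadata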

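The read_* members compose in the same way; this sketch continues from the one above, again using only the signatures listed on this page.

    val traces: RDD[EventTrait]       = connector.read_sequence_table(metaData)      // detailed defaults to false
    val singleIndex: RDD[Event]       = connector.read_single_table(metaData)        // single inverted index stored in the SingleTable
    val lastChecked: RDD[LastChecked] = connector.read_last_checked_table(metaData)  // last timestamp per event type pair per trace
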
Deprecated Value Members

  1. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] ) @Deprecated
    Deprecated
