
trait DBConnector extends AnyRef

This trait is the interface that each database connector should extend. Three kinds of methods are described here:

  • the read and write methods, which must be implemented by each database, since databases may differ in indexing or structure
  • the combine methods, which describe how the data already stored in the database are combined with the newly computed data
  • the Spark-related methods: initialization, building the appropriate tables and, finally, termination

The tables that are utilized are:

  • IndexTable: contains the inverted index, which is keyed on event type pairs
  • SequenceTable: contains the traces that are indexed
  • SingleTable: contains the single inverted index (similar to Set-Containment), where the key is a single event type
  • LastChecked: contains the timestamp of the last completion of each event type per trace
  • CountTable: contains basic statistics, like max duration and number of completions, for each event type pair
  • Metadata: contains the information for each log database, like the compression algorithm, the number of traces and event type pairs etc.
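
As a sketch of how these pieces fit together, a hypothetical connector might implement the trait as follows (the object name and all method bodies are illustrative assumptions, not part of the actual API):

```scala
import org.apache.spark.rdd.RDD

// Hypothetical skeleton of a connector extending DBConnector.
// Only the structure is real; S3Connector and the bodies are illustrative.
object S3Connector extends DBConnector {

  // Spark-related methods: set up and tear down the execution context.
  override def initialize_spark(config: Config): Unit = {
    // e.g. build a SparkSession with storage credentials taken from config
  }

  override def initialize_db(config: Config): Unit = {
    // create IndexTable, SequenceTable, SingleTable, LastChecked,
    // CountTable and Metadata, removing stale versions of each
  }

  // Read and write methods: storage-specific, because each database
  // may differ in indexing or structure.
  override def read_sequence_table(metaData: MetaData, detailed: Boolean): RDD[EventTrait] = ???

  override def write_sequence_table(sequenceRDD: RDD[EventTrait], metaData: MetaData,
                                    detailed: Boolean): Unit = ???

  // ... remaining abstract members elided ...
}
```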
Linear Supertypes
AnyRef, Any

Abstract Value Members

  1. abstract def get_metadata(config: Config): MetaData

    This method constructs the appropriate metadata, based on what is already stored in the database and on what is provided in the config object.

    config

    contains the configuration passed during execution

    returns

    The constructed MetaData object
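
A connector might implement this method roughly as follows; the stored-metadata lookup and the MetaData helpers used here are assumptions for illustration, not the actual API:

```scala
// Sketch only: read_stored_metadata, MetaData.fromConfig and updatedWith
// are hypothetical helpers, not part of the real codebase.
override def get_metadata(config: Config): MetaData = {
  val stored: Option[MetaData] = read_stored_metadata(config)
  stored match {
    // Fresh database: build the metadata entirely from the config.
    case None    => MetaData.fromConfig(config)
    // Existing database: keep the stored values, overriding only
    // the fields that the new config changes.
    case Some(m) => m.updatedWith(config)
  }
}
```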

  2. abstract def initialize_db(config: Config): Unit

    Creates the appropriate tables and removes any previous ones.

  3. abstract def initialize_spark(config: Config): Unit

    Each connector has to initialize the Spark context for its particular database.

  4. abstract def read_last_checked_table(metaData: MetaData): RDD[LastChecked]

    Loads data from the LastChecked table, which contains the timestamp of the last check per event type pair per trace.

    metaData

    Object containing the metadata

    returns

    An RDD with the last timestamps per event type pair per trace
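
In an incremental pipeline, the returned RDD is typically keyed so that newly arrived events can be joined against the last processed timestamp. A minimal sketch, assuming LastChecked exposes the trace id, the event type pair and the timestamp (field names are assumptions):

```scala
// Key each record by (trace id, event type pair) so that new pairs can be
// joined against the last timestamp and already-processed work is skipped.
val lastChecked: RDD[LastChecked] = connector.read_last_checked_table(metaData)
val keyed = lastChecked.map(lc => ((lc.id, lc.eventA, lc.eventB), lc.timestamp))
```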

  5. abstract def read_sequence_table(metaData: MetaData, detailed: Boolean = false): RDD[EventTrait]

    Reads data as an RDD from the SeqTable.

    metaData

    Object containing the metadata

    returns

    An RDD containing the stored traces

  6. abstract def read_single_table(metaData: MetaData): RDD[Event]

    Loads the single inverted index from Cassandra, stored in the SingleTable.

    metaData

    Object containing the metadata

    returns

    An RDD containing the stored data

  7. abstract def write_count_table(counts: RDD[Count], metaData: MetaData): Unit

    Writes the counts to the CountTable.

    counts

    Calculated basic statistics per event type pair in order to be stored in the count table

    metaData

    Object containing the metadata

  8. abstract def write_index_table(newPairs: RDD[PairFull], metaData: MetaData): Unit

    Writes the combined pairs back to S3, grouped by the interval and the first event.

    newPairs

    The newly generated pairs

    metaData

    Object containing the metadata

  9. abstract def write_last_checked_table(lastChecked: RDD[LastChecked], metaData: MetaData): Unit

    Stores the new last-checked records back in the database.

    lastChecked

    Records containing the timestamp of last completion for each event type pair for each trace

    metaData

    Object containing the metadata

  10. abstract def write_metadata(metaData: MetaData): Unit

    Persists the metadata.

    metaData

    Object containing the metadata

  11. abstract def write_sequence_table(sequenceRDD: RDD[EventTrait], metaData: MetaData, detailed: Boolean = false): Unit

    This method writes traces to the auxiliary SeqTable. Since the RDD will be used for intermediate results, it is already persisted and should not be modified. Additionally, it updates the metaData object.

    sequenceRDD

    The RDD containing the traces with the new events

    metaData

    Object containing the metadata

  12. abstract def write_single_table(sequenceRDD: RDD[EventTrait], metaData: MetaData): Unit

    This method writes traces to the auxiliary SingleTable. The RDD passed to this method is not persisted; the database should persist it before storing it and unpersist it at the end. Additionally, it updates the metaData object.

    sequenceRDD

    Contains the newly indexed events in a form of single inverted index

    metaData

    Object containing the metadata
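
The persist/unpersist contract described for write_single_table can be sketched as follows; the storage write and the metadata field used here are placeholders, not the actual implementation:

```scala
import org.apache.spark.storage.StorageLevel

override def write_single_table(sequenceRDD: RDD[EventTrait], metaData: MetaData): Unit = {
  // The incoming RDD is NOT persisted by the caller: cache it so that the
  // write and the count below do not recompute its lineage twice.
  sequenceRDD.persist(StorageLevel.MEMORY_AND_DISK)
  try {
    store_single_index(sequenceRDD, metaData) // hypothetical storage-specific write
    metaData.events += sequenceRDD.count()    // update the metadata (field name assumed)
  } finally {
    sequenceRDD.unpersist()
  }
}
```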

Concrete Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native() @IntrinsicCandidate()
  6. def closeSpark(): Unit

    Closes the Spark connection.

  7. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  8. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  9. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native() @IntrinsicCandidate()
  10. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native() @IntrinsicCandidate()
  11. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  12. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  13. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @IntrinsicCandidate()
  14. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @IntrinsicCandidate()
  15. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  16. def toString(): String
    Definition Classes
    AnyRef → Any
  17. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  18. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  19. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Deprecated Value Members

  1. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] ) @Deprecated
    Deprecated
