MemoryStore

MemoryStore manages blocks of data stored in memory (either as deserialized values or as serialized bytes).

MemoryStore requires a SparkConf, a BlockInfoManager, a SerializerManager, a MemoryManager and a BlockEvictionHandler when created.

Table 1. MemoryStore Internal Registries
Name Description

entries

The blocks registered with the MemoryStore, i.e. a mapping of BlockId to MemoryEntry.

entries is Java's java.util.LinkedHashMap with the initial capacity of 32, the load factor of 0.75 and the access-order ordering mode (i.e. iteration is in the order in which the entries were last accessed, from least-recently to most-recently accessed).
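The access-order mode can be demonstrated with a plain java.util.LinkedHashMap, outside Spark entirely. The following standalone sketch uses the same constructor arguments; the block ids are made up for illustration:

```scala
import java.util.LinkedHashMap
import scala.jdk.CollectionConverters._

// Same constructor arguments as the entries registry:
// initial capacity 32, load factor 0.75, access-order mode (true).
val entries = new LinkedHashMap[String, Int](32, 0.75f, true)

entries.put("rdd_0_0", 10)
entries.put("rdd_0_1", 20)
entries.put("rdd_0_2", 30)

// Reading a key moves it to the end of the iteration order.
entries.get("rdd_0_0")

// Iteration goes from least-recently to most-recently accessed.
println(entries.keySet.asScala.mkString(", "))  // rdd_0_1, rdd_0_2, rdd_0_0
```

This access-order iteration is what makes a LinkedHashMap a natural fit for eviction decisions: the least-recently-used blocks come first.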

Caution
FIXME Where are these dependencies used?
Caution
FIXME Where is the MemoryStore created? What params provided?
Note
MemoryStore is a private[spark] class.
Tip

Enable INFO or DEBUG logging level for org.apache.spark.storage.memory.MemoryStore logger to see what happens inside.

Add the following line to conf/log4j.properties:

log4j.logger.org.apache.spark.storage.memory.MemoryStore=DEBUG

Refer to Logging.

putIteratorAsBytes Method

Caution
FIXME

putIteratorAsValues Method

Caution
FIXME

Evicting Blocks to Free Space

Caution
FIXME

Removing Block

Caution
FIXME

Acquiring Storage Memory for Blocks — putBytes Method

putBytes[T](
  blockId: BlockId,
  size: Long,
  memoryMode: MemoryMode,
  _bytes: () => ChunkedByteBuffer): Boolean

putBytes requests storage memory for blockId from the MemoryManager and, when granted, registers the block in the entries internal registry.

Internally, putBytes first makes sure that the blockId block has not already been registered in the entries internal registry. It then requests size bytes of storage memory (for the given memoryMode) from the MemoryManager.

If the memory request succeeds, putBytes "materializes" the _bytes byte buffer and makes sure that its size is exactly size. It then registers a SerializedMemoryEntry (for the buffer and memoryMode) for blockId in the entries internal registry.

You should see the following INFO message in the logs:

INFO Block [blockId] stored as bytes in memory (estimated size [size], free [bytes])

putBytes returns true only when the blockId block was successfully registered in the entries internal registry, and false when the storage memory could not be acquired.
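The control flow above can be sketched in a simplified, self-contained form. Note that MemoryManagerStub, MemoryStoreSketch and the entry type below are stand-ins invented for illustration, not Spark's actual classes:

```scala
import scala.collection.mutable

// Stand-ins for Spark's types (hypothetical, for illustration only).
sealed trait MemoryMode
case object OnHeap extends MemoryMode

final case class SerializedMemoryEntry(bytes: Array[Byte], memoryMode: MemoryMode)

// A toy memory manager that grants storage memory from a fixed pool.
final class MemoryManagerStub(var free: Long) {
  def acquireStorageMemory(blockId: String, size: Long): Boolean =
    if (size <= free) { free -= size; true } else false
}

final class MemoryStoreSketch(memoryManager: MemoryManagerStub) {
  private val entries = mutable.LinkedHashMap.empty[String, SerializedMemoryEntry]

  // Mirrors the putBytes flow: check, acquire memory, materialize, register.
  def putBytes(blockId: String, size: Long, memoryMode: MemoryMode,
               bytes: () => Array[Byte]): Boolean = {
    require(!entries.contains(blockId), s"Block $blockId is already present")
    if (memoryManager.acquireStorageMemory(blockId, size)) {
      val materialized = bytes()                    // materialize the lazily-provided bytes
      assert(materialized.length.toLong == size)    // the size must match exactly
      entries(blockId) = SerializedMemoryEntry(materialized, memoryMode)
      println(s"Block $blockId stored as bytes in memory " +
        s"(estimated size $size, free ${memoryManager.free})")
      true
    } else false
  }
}

val store = new MemoryStoreSketch(new MemoryManagerStub(free = 100))
println(store.putBytes("rdd_0_0", 40, OnHeap, () => Array.fill(40)(0: Byte)))  // true
println(store.putBytes("rdd_0_1", 80, OnHeap, () => Array.fill(80)(0: Byte)))  // false: only 60 bytes free
```

The sketch highlights a design point of the real method: the bytes are passed as a thunk (`() => …`), so nothing is materialized unless the storage memory was actually acquired.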

Settings

Table 2. Spark Properties
Spark Property Default Value Description

spark.storage.unrollMemoryThreshold

1k

Initial amount of memory to request for unrolling (materializing) a block in memory.
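The property can be set on the SparkConf when building an application. A configuration sketch follows; the application name and the 2m (i.e. 2 * 1024 * 1024 bytes) threshold are arbitrary example values:

```scala
import org.apache.spark.SparkConf

// Raise the initial unroll memory request (the value is in bytes).
val conf = new SparkConf()
  .setAppName("memory-store-demo")  // hypothetical application name
  .set("spark.storage.unrollMemoryThreshold", (2 * 1024 * 1024).toString)
```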
