This document describes the data import and export utilities provided by StableNet for managing blockchain data, snapshots, and database contents.
These utilities enable file-based chain synchronization, blockchain backups, historical data archiving, and database manipulation.
For live chain synchronization over the P2P network, see Chain Synchronization. For database architecture and storage mechanisms, see Database Layer and Merkle Patricia Tries, and Ancient Store and Data Lifecycle.

Overview

StableNet provides a set of command-line utilities for importing and exporting various types of blockchain data.
These utilities operate in a file-based manner rather than via network connections and serve the following purposes:
  • Offline synchronization: Node bootstrapping using trusted chain dumps
  • Backup and recovery: Chain state backups for disaster recovery
  • Historical archiving: Preservation of historical block data using the Era1 format
  • Database migration: Movement of raw database data between nodes
  • Testing and development: Selective import/export of specific chain ranges
All utilities automatically detect gzip compression via the .gz extension and
support interrupt-based termination for long-running operations.

Chain Import and Export

ImportChain

ImportChain reads blocks from an RLP-encoded file and inserts them into the blockchain.
Processing is performed in batches to limit memory usage.
  • Batch size: 2500 blocks
  • The genesis block is automatically skipped
  • Existing blocks are not re-inserted
  • SIGINT / SIGTERM signals are detected for safe interruption
Missing block detection uses the following criteria:
  • At or below the current head: HasBlock
  • Above the current head: HasBlockAndState
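The batching and duplicate-skipping behavior described above can be sketched as follows. This is a simplified model, not the real implementation: integer block numbers stand in for blocks, and a `known` map stands in for the HasBlock / HasBlockAndState checks.

```go
package main

import "fmt"

const importBatchSize = 2500 // matches the documented batch size

// importChain models ImportChain: it walks blocks in batches, skips the
// genesis block and any block already present, and inserts the rest.
func importChain(blocks []int, known map[int]bool) (inserted int) {
	for start := 0; start < len(blocks); start += importBatchSize {
		end := start + importBatchSize
		if end > len(blocks) {
			end = len(blocks)
		}
		for _, b := range blocks[start:end] {
			if b == 0 { // the genesis block is always skipped
				continue
			}
			if known[b] { // existing blocks are not re-inserted
				continue
			}
			known[b] = true
			inserted++
		}
	}
	return inserted
}

func main() {
	blocks := make([]int, 6000)
	for i := range blocks {
		blocks[i] = i
	}
	known := map[int]bool{1: true, 2: true}
	fmt.Println(importChain(blocks, known)) // 5997: all but genesis and the two known blocks
}
```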

ExportChain

ExportChain exports the entire blockchain into a single RLP file.
  • Overwrites the target file if it already exists
  • Applies gzip compression when using the .gz extension
  • Internally uses blockchain.Export

ExportAppendChain

ExportAppendChain appends a specified block range to an existing file.
  • Suitable for incremental backups or segmented exports
  • Writes blocks from first to last (inclusive)

Era1 History Format

Era1 is a standard historical data format defined by the Ethereum Portal Network specification.
A single Era1 file contains approximately 8192 blocks (one epoch) and includes the following data:
  • Block headers and bodies
  • Receipts
  • Total difficulty

ImportHistory

Restores the chain from genesis using an Era1 directory. Requirements:
  • The current block height must be 0
  • A checksums.txt file must be present
  • Era1 file naming conventions must be followed
Processing flow:
  1. List Era1 files in the directory
  2. Verify SHA256 checksums
  3. Process blocks per Era1 file
  4. Skip the genesis block
  5. Insert data in the order HeaderChain → ReceiptChain
  6. Periodically log progress

ExportHistory

Exports chain data in the Era1 format.
  • The step value is typically 8192
  • File names include the accumulator root hash
  • checksums.txt is generated automatically

Preimage Import and Export

Preimages are the original keys (account addresses and storage slots) whose Keccak256 hashes serve as keys in the state trie.

ImportPreimages

Restores preimage data from an RLP file into the database.
  • Processed in batches (1024 entries)
  • Uses the Keccak256 hash as the key
  • This feature is deprecated and scheduled for removal
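The 1024-entry batching can be sketched as below. SHA-256 stands in for Keccak256 (which is not in the Go standard library), and the plain map stands in for the real database batch; both are illustrative assumptions.

```go
package main

import (
	"crypto/sha256"
	"fmt"
)

const preimageBatchSize = 1024 // matches the documented batch size

// importPreimages keys each preimage by its hash and flushes the in-memory
// batch to db whenever it reaches preimageBatchSize entries.
func importPreimages(db map[[32]byte][]byte, preimages [][]byte) (flushes int) {
	batch := make(map[[32]byte][]byte, preimageBatchSize)
	flush := func() {
		for k, v := range batch {
			db[k] = v
		}
		batch = make(map[[32]byte][]byte, preimageBatchSize)
		flushes++
	}
	for _, p := range preimages {
		batch[sha256.Sum256(p)] = p
		if len(batch) >= preimageBatchSize {
			flush()
		}
	}
	if len(batch) > 0 {
		flush() // final partial batch
	}
	return flushes
}

func main() {
	preimages := make([][]byte, 2500)
	for i := range preimages {
		preimages[i] = []byte(fmt.Sprintf("key-%d", i))
	}
	db := make(map[[32]byte][]byte)
	fmt.Println(importPreimages(db, preimages), len(db)) // 3 2500
}
```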

ExportPreimages

Exports all preimages stored in the database to a file.

ExportSnapshotPreimages

Extracts only the preimages required for a specific snapshot root.
  • Account address preimages
  • All storage key preimages for those accounts
  • Fails if any required preimage is missing

LevelDB Data Import and Export

These low-level database operations transfer raw key-value data between nodes for migration or recovery.

LDB Export Header Structure

  • Magic: "gethdbdump"
  • Version: 0
  • Kind: Data type
  • UnixTime: Export timestamp

ImportLDBData

Imports raw key-value pairs into the database.
  • Operation codes: Add / Delete
  • Immediate flush when batch size is exceeded
  • Interrupt checks every 1000 entries
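The operation loop can be sketched as follows. The `opAdd`/`opDelete` codes and the map-backed store are illustrative stand-ins, not the real constants or database.

```go
package main

import "fmt"

const (
	opAdd    byte = 0 // illustrative operation codes
	opDelete byte = 1
)

type kvOp struct {
	op         byte
	key, value []byte
}

// applyOps replays raw key-value operations into store, checking the
// interrupt channel every checkEvery entries, as ImportLDBData does
// every 1000 entries.
func applyOps(store map[string][]byte, ops []kvOp, interrupt <-chan struct{}, checkEvery int) (applied int, interrupted bool) {
	for i, o := range ops {
		if i%checkEvery == 0 {
			select {
			case <-interrupt:
				return applied, true
			default:
			}
		}
		switch o.op {
		case opAdd:
			store[string(o.key)] = o.value
		case opDelete:
			delete(store, string(o.key))
		}
		applied++
	}
	return applied, false
}

func main() {
	store := map[string][]byte{}
	ops := []kvOp{
		{opAdd, []byte("a"), []byte("1")},
		{opAdd, []byte("b"), []byte("2")},
		{opDelete, []byte("a"), nil},
	}
	n, _ := applyOps(store, ops, make(chan struct{}), 1000)
	fmt.Println(n, len(store)) // 3 1
}
```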

ExportChaindata

A generalized database export utility.
Processes multiple data sources via the ChainDataIterator interface.
type ChainDataIterator interface {
  // Next returns the operation code, key, and value of the next entry,
  // plus a flag signalling whether iteration can continue.
  Next() (byte, []byte, []byte, bool)
  // Release frees any resources held by the iterator.
  Release()
}
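A toy in-memory implementation shows how such an iterator is consumed. The `sliceIterator` type and the drain loop are illustrative, and the assumption here is that the final bool reports whether an entry was produced; the real semantics may differ.

```go
package main

import "fmt"

// ChainDataIterator yields (operation code, key, value, ok) tuples.
type ChainDataIterator interface {
	Next() (byte, []byte, []byte, bool)
	Release()
}

type entry struct {
	op         byte
	key, value []byte
}

// sliceIterator is a toy ChainDataIterator over in-memory entries.
type sliceIterator struct {
	entries []entry
	pos     int
}

func (it *sliceIterator) Next() (byte, []byte, []byte, bool) {
	if it.pos >= len(it.entries) {
		return 0, nil, nil, false // exhausted
	}
	e := it.entries[it.pos]
	it.pos++
	return e.op, e.key, e.value, true
}

func (it *sliceIterator) Release() {} // nothing to free for a slice

// drain consumes an iterator to completion, as an exporter would.
func drain(it ChainDataIterator) int {
	defer it.Release()
	count := 0
	for {
		if _, _, _, ok := it.Next(); !ok {
			return count
		}
		count++
	}
}

func main() {
	it := &sliceIterator{entries: []entry{
		{0, []byte("k1"), []byte("v1")},
		{0, []byte("k2"), []byte("v2")},
	}}
	fmt.Println(drain(it)) // 2
}
```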

Data Integrity and Validation

Checksum Verification

During Era1 imports, each file is fully verified using SHA256 before processing.
This guarantees integrity prior to parsing.

Block Validation

  • Duplicate block elimination
  • Header and body validation via the consensus engine
  • State availability checks

Interrupt Handling

All long-running operations follow this pattern:
  1. Create an interrupt channel
  2. Handle SIGINT / SIGTERM
  3. Close the channel upon signal reception
  4. Periodically check the channel in the main loop
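The four steps above can be sketched as a runnable program. The worker loop is illustrative; in the real utilities the channel check in step 4 happens between batches.

```go
package main

import (
	"fmt"
	"os"
	"os/signal"
	"syscall"
)

// run processes work items, stopping early if interrupt is closed.
func run(work int, interrupt <-chan struct{}) (done int) {
	for i := 0; i < work; i++ {
		select {
		case <-interrupt: // channel closed: stop at a safe point
			return done
		default:
		}
		done++ // stand-in for processing one batch
	}
	return done
}

func main() {
	// Steps 1-3: create the channel and close it on SIGINT / SIGTERM.
	interrupt := make(chan struct{})
	sigs := make(chan os.Signal, 1)
	signal.Notify(sigs, syscall.SIGINT, syscall.SIGTERM)
	go func() {
		<-sigs
		close(interrupt)
	}()
	// Step 4: the main loop checks the channel periodically.
	fmt.Println(run(1000, interrupt)) // 1000 when uninterrupted
}
```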

File Format Summary

Format       Extension  Compression   Content          Use Case
RLP Chain    .rlp       Optional .gz  Block sequence   Full backup
Era1         .era1      None          Epoch blocks     History archive
LDB          .ldb       Optional .gz  Raw KV           DB migration
Preimages    .rlp       Optional .gz  Trie preimage    State recovery

Automatic Compression Detection Logic

if strings.HasSuffix(fn, ".gz") {
  if reader, err = gzip.NewReader(reader); err != nil {
    return err
  }
  writer = gzip.NewWriter(writer)
}

Progress Reporting

  • Default log interval: 8 seconds
  • Reported information:
    • Operation type
    • Current position
    • Cumulative elapsed time

Command-Line Integration

Command                    Function           Description
gstable import             ImportChain        Import RLP chain
gstable export             ExportChain        Export full chain
gstable export first last  ExportAppendChain  Export range
gstable import-history     ImportHistory      Import Era1
gstable export-history     ExportHistory      Export Era1
gstable import-preimages   ImportPreimages    Import preimages
gstable export-preimages   ExportPreimages    Export preimages
gstable db import          ImportLDBData      Import raw DB
gstable db export          ExportChaindata    Export raw DB