Conformance test runner #638

Merged: 25 commits into main from austin/tvframework, Aug 25, 2020

Commits (25)
41f4795  Setup test vector deserialization and types (austinabell, Aug 18, 2020)
ec69df0  Update submodule version (austinabell, Aug 18, 2020)
27a31fb  update loading car to be bulk written to store (austinabell, Aug 18, 2020)
9827bdb  Put buffered blockstore behind feature (austinabell, Aug 18, 2020)
14c2c07  add car file decoding (austinabell, Aug 18, 2020)
d326879  Update rand to be trait and finish off test runner (austinabell, Aug 19, 2020)
7190601  Cleanup (austinabell, Aug 19, 2020)
75e892c  Merge branch 'main' into austin/tvframework (austinabell, Aug 19, 2020)
97ff54b  cleanup implicit changes (austinabell, Aug 19, 2020)
c0c6510  bump sm commit (austinabell, Aug 20, 2020)
cdc72f0  Merge branch 'main' of github.com:ChainSafe/forest into austin/tvfram… (austinabell, Aug 20, 2020)
b86761f  Setup regex file ignoring for test runner (austinabell, Aug 20, 2020)
650c5d3  Merge branch 'main' of github.com:ChainSafe/forest into austin/tvfram… (austinabell, Aug 24, 2020)
f40a969  fix (austinabell, Aug 24, 2020)
121148d  fmt and update makefile (austinabell, Aug 24, 2020)
0bc18d3  separate vector and unit tests in CI (austinabell, Aug 24, 2020)
b19e7ba  Update CIs (austinabell, Aug 24, 2020)
ddfff68  Update workflow (austinabell, Aug 24, 2020)
5f79f7d  Update test runner code (not fail on first error) (austinabell, Aug 24, 2020)
89480bb  Merge branch 'main' of github.com:ChainSafe/forest into austin/tvfram… (austinabell, Aug 24, 2020)
fe079a1  typo (austinabell, Aug 24, 2020)
e1abcf5  Fix dep (austinabell, Aug 24, 2020)
d2b0a0e  bump submodule commit (austinabell, Aug 24, 2020)
317262a  Update regex (austinabell, Aug 24, 2020)
110b551  Update actions workflow (austinabell, Aug 24, 2020)
Files changed (from all commits)
17 changes: 13 additions & 4 deletions .circleci/config.yml

@@ -126,12 +126,18 @@ jobs:
     steps:
       - build_setup
       - restore_cargo_package_cache
-      - run:
-          name: Build Unit Tests
-          command: make test-all-no-run
       - run:
           name: Run Unit Tests
-          command: make test-all
+          command: make test
+  test-vectors:
+    executor: test-executor
+    description: Run serialization and conformance tests
+    steps:
+      - build_setup
+      - restore_cargo_package_cache
+      - run:
+          name: Run test vectors
+          command: make run-vectors
   install:
     executor: test-executor
     description: Install forest binary
@@ -152,3 +158,6 @@ workflows:
       - test:
           requires:
             - prefetch-crates
+      - test-vectors:
+          requires:
+            - prefetch-crates
11 changes: 7 additions & 4 deletions .github/workflows/ci-rust.yml

@@ -32,12 +32,15 @@ jobs:
           profile: minimal
           toolchain: stable
           override: true

+      - name: Pull submodules
+        run: git submodule update --init
+
-      - name: Cargo Build Tests
-        run: make test-all-no-run
+      - name: Run all unit tests
+        run: make test

-      - name: Cargo test all
-        run: make test-all
+      - name: Run test vectors
+        run: make run-vectors

   fmt:
     name: rustfmt
3 changes: 3 additions & 0 deletions .gitmodules

@@ -8,3 +8,6 @@
 [submodule "ipld/tests/ipld-traversal-vectors"]
     path = ipld/tests/ipld-traversal-vectors
     url = git@github.com:ChainSafe/ipld-traversal-vectors.git
+[submodule "tests/conformance_tests/test-vectors"]
+    path = tests/conformance_tests/test-vectors
+    url = https://github.com/filecoin-project/test-vectors.git
26 changes: 26 additions & 0 deletions Cargo.lock

Some generated files are not rendered by default.
1 change: 1 addition & 0 deletions Cargo.toml

@@ -28,6 +28,7 @@ members = [
     "ipld/graphsync",
     "utils/bigint",
     "tests/serialization_tests",
+    "tests/conformance_tests",
     "utils/bitfield",
     "utils/test_utils",
     "utils/commcid",
10 changes: 8 additions & 2 deletions Makefile

@@ -1,4 +1,5 @@
 SER_TESTS = "tests/serialization_tests"
+CONF_TESTS = "tests/conformance_tests"

 install:
     cargo install --path forest --force
@@ -53,14 +54,19 @@ release:
 pull-serialization-tests:
     git submodule update --init

-run-vectors:
+run-serialization-vectors:
     cargo test --release --manifest-path=$(SER_TESTS)/Cargo.toml --features "submodule_tests"

+run-conformance-vectors:
+    cargo test --release --manifest-path=$(CONF_TESTS)/Cargo.toml --features "submodule_tests"
+
+run-vectors: run-serialization-vectors run-conformance-vectors
+
 test-vectors: pull-serialization-tests run-vectors

 # Test all without the submodule test vectors with release configuration
 test:
-    cargo test --all --exclude serialization_tests
+    cargo test --all --all-features --exclude serialization_tests --exclude conformance_tests

 # This will run all tests will all features enabled, which will exclude some tests with
 # specific features disabled
2 changes: 1 addition & 1 deletion README.md

@@ -78,7 +78,7 @@ Will show all debug logs by default, but the `forest_libp2p::service` logs will
 # To run base tests
 cargo test # add --release flag for longer compilation but faster execution

-# To pull serialization vectors submodule and run serialization tests
+# To pull serialization vectors submodule and run serialization and conformance tests
 make test-vectors

 # To run all tests and all features enabled
2 changes: 1 addition & 1 deletion blockchain/state_manager/Cargo.toml

@@ -12,7 +12,7 @@ db = { path = "../../node/db/" }
 encoding = { package = "forest_encoding", path = "../../encoding/" }
 num-bigint = { path = "../../utils/bigint", package = "forest_bigint" }
 state_tree = { path = "../../vm/state_tree/" }
-blockstore = { package = "ipld_blockstore", path = "../../ipld/blockstore/" }
+blockstore = { package = "ipld_blockstore", path = "../../ipld/blockstore/", features = ["buffered"] }
 forest_blocks = { path = "../../blockchain/blocks" }
 thiserror = "1.0"
 interpreter = { path = "../../vm/interpreter/" }
7 changes: 3 additions & 4 deletions blockchain/state_manager/src/lib.rs

@@ -18,7 +18,6 @@ use cid::Cid;
 use clock::ChainEpoch;
 use encoding::de::DeserializeOwned;
 use encoding::Cbor;
-use fil_types::DevnetParams;
 use flo_stream::Subscriber;
 use forest_blocks::{Block, BlockHeader, FullTipset, Tipset, TipsetKeys};
 use futures::channel::oneshot;
@@ -180,7 +179,7 @@ where
         let mut buf_store = BufferedBlockStore::new(self.bs.as_ref());
         // TODO possibly switch out syscalls to be saved at state manager level
         // TODO change from statically using devnet params when needed
-        let mut vm = VM::<_, _, DevnetParams>::new(
+        let mut vm = VM::<_, _, _>::new(
             ts.parent_state(),
             &buf_store,
             ts.epoch(),
@@ -256,7 +255,7 @@ where
         span!("state_call_raw", {
             let block_store = self.get_block_store_ref();
             let buf_store = BufferedBlockStore::new(block_store);
-            let mut vm = VM::<_, _, DevnetParams>::new(
+            let mut vm = VM::<_, _, _>::new(
                 bstate,
                 &buf_store,
                 *bheight,
@@ -331,7 +330,7 @@ where
             .map_err(|_| Error::Other("Could not load tipset state".to_string()))?;
         let chain_rand = ChainRand::new(ts.key().to_owned());

-        let mut vm = VM::<_, _, DevnetParams>::new(
+        let mut vm = VM::<_, _, _>::new(
             &st,
             self.bs.as_ref(),
             ts.epoch() + 1,
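Dropping the `use fil_types::DevnetParams;` import works because the third type parameter is now inferred at each call site rather than pinned to devnet params. A minimal, self-contained sketch of that inference pattern (every name here is a hypothetical stand-in, not a forest API):

```rust
use std::marker::PhantomData;

// Hypothetical stand-ins illustrating `VM::<_, _, _>`-style inference.
trait NetworkParams {
    const NAME: &'static str;
}

struct Devnet;
impl NetworkParams for Devnet {
    const NAME: &'static str = "devnet";
}

struct Vm<P: NetworkParams>(PhantomData<P>);

impl<P: NetworkParams> Vm<P> {
    fn new() -> Self {
        Vm(PhantomData)
    }
    fn network(&self) -> &'static str {
        P::NAME
    }
}

fn main() {
    // The params type is pinned once by the binding instead of being
    // spelled out in the turbofish at every construction site.
    let vm: Vm<Devnet> = Vm::<_>::new();
    assert_eq!(vm.network(), "devnet");
}
```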
3 changes: 2 additions & 1 deletion ipld/blockstore/Cargo.toml

@@ -9,7 +9,8 @@ cid = { package = "forest_cid", path = "../cid" }
 db = { path = "../../node/db" }
 encoding = { package = "forest_encoding", path = "../../encoding" }
 forest_ipld = { path = "../" }
-commcid = { path = "../../utils/commcid" }
+commcid = { path = "../../utils/commcid", optional = true }

 [features]
 rocksdb = ["db/rocksdb"]
+buffered = ["commcid"]
6 changes: 4 additions & 2 deletions ipld/blockstore/src/buffered.rs

@@ -1,6 +1,8 @@
 // Copyright 2020 ChainSafe Systems
 // SPDX-License-Identifier: Apache-2.0, MIT

+#![cfg(feature = "buffered")]
+
 use super::BlockStore;
 use cid::{
     multihash::{Code, MultihashDigest},
@@ -163,12 +165,12 @@
     {
         self.base.bulk_read(keys)
     }
-    fn bulk_write<K, V>(&self, keys: &[K], values: &[V]) -> Result<(), Error>
+    fn bulk_write<K, V>(&self, values: &[(K, V)]) -> Result<(), Error>
     where
         K: AsRef<[u8]>,
         V: AsRef<[u8]>,
     {
-        self.base.bulk_write(keys, values)
+        self.base.bulk_write(values)
     }
     fn bulk_delete<K>(&self, keys: &[K]) -> Result<(), Error>
     where
1 change: 1 addition & 0 deletions ipld/blockstore/src/lib.rs

@@ -3,6 +3,7 @@

 mod buffered;

+#[cfg(feature = "buffered")]
 pub use self::buffered::BufferedBlockStore;

 use cid::{multihash::MultihashDigest, Cid};
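With the gate in place, `BufferedBlockStore` only exists when a dependent crate enables the `buffered` feature, as `state_manager`'s Cargo.toml now does. A rough usage sketch (hedged: `MemoryDB` as the base store is an assumption; only the `new` constructor appears in the diffs here):

```rust
// Compiles only when ipld_blockstore is built with features = ["buffered"].
use blockstore::BufferedBlockStore;
use db::MemoryDB;

fn main() {
    let base = MemoryDB::default();
    // Writes are staged in the buffer; the underlying store is only
    // touched when the buffered state is flushed back (e.g. after a VM run).
    let buffered = BufferedBlockStore::new(&base);
    let _ = &buffered;
}
```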
2 changes: 1 addition & 1 deletion ipld/car/Cargo.toml

@@ -5,7 +5,7 @@ authors = ["ChainSafe Systems <info@chainsafe.io>"]
 edition = "2018"

 [dependencies]
-unsigned-varint = "0.5"
+unsigned-varint = { version = "0.5", features = ["futures-codec"] }
 cid = { package = "forest_cid", path = "../cid", features = ["cbor"] }
 forest_encoding = { path = "../../encoding" }
 blockstore = { package = "ipld_blockstore", path = "../blockstore" }
13 changes: 11 additions & 2 deletions ipld/car/src/lib.rs

@@ -71,10 +71,19 @@ pub fn load_car<R: Read, B: BlockStore>(
 ) -> Result<Vec<Cid>, Error> {
     let mut car_reader = CarReader::new(buf_reader)?;

+    // Batch write key value pairs from car file
+    let mut buf: Vec<(Vec<u8>, Vec<u8>)> = Vec::with_capacity(100);
     // TODO revisit, seems possible buffer could be empty when underlying reader isn't
     while !car_reader.buf_reader.buffer().is_empty() {
         let block = car_reader.next_block()?;
-        s.write(block.cid.to_bytes(), block.data)
-            .map_err(|e| Error::Other(e.to_string()))?;
+        buf.push((block.cid.to_bytes(), block.data));
+        if buf.len() > 1000 {
+            s.bulk_write(&buf)
+                .map_err(|e| Error::Other(e.to_string()))?;
+            buf.clear();
+        }
     }
+    s.bulk_write(&buf)
+        .map_err(|e| Error::Other(e.to_string()))?;
     Ok(car_reader.header.roots)
 }
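The loop above replaces a per-block `write` with chunked `bulk_write` calls of at most ~1000 pairs, plus one final flush for the remainder. The batching pattern in isolation (a sketch; `write_chunk` stands in for `BlockStore::bulk_write`):

```rust
// Accumulate (key, value) pairs and flush them in fixed-size batches.
fn write_chunk(batch: &[(Vec<u8>, Vec<u8>)]) {
    println!("flushing {} pairs in one batch", batch.len());
}

fn main() {
    const CHUNK: usize = 1000;
    let mut buf: Vec<(Vec<u8>, Vec<u8>)> = Vec::with_capacity(CHUNK);

    for i in 0u32..2_500 {
        buf.push((i.to_be_bytes().to_vec(), vec![0u8; 32]));
        if buf.len() >= CHUNK {
            write_chunk(&buf);
            buf.clear();
        }
    }
    // Final flush for the (possibly empty) remainder; bulk-writing an
    // empty slice is a harmless no-op.
    write_chunk(&buf);
}
```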
7 changes: 3 additions & 4 deletions ipld/car/src/util.rs

@@ -5,11 +5,10 @@ use super::error::Error;
 use cid::Cid;
 use std::io::Read;

-pub(crate) fn ld_read<R: Read>(mut buf_reader: &mut R) -> Result<Vec<u8>, Error> {
-    let l =
-        unsigned_varint::io::read_u64(&mut buf_reader).map_err(|e| Error::Other(e.to_string()))?;
+pub(crate) fn ld_read<R: Read>(mut reader: &mut R) -> Result<Vec<u8>, Error> {
+    let l = unsigned_varint::io::read_u64(&mut reader).map_err(|e| Error::Other(e.to_string()))?;
     let mut buf = Vec::with_capacity(l as usize);
-    buf_reader
+    reader
         .take(l)
         .read_to_end(&mut buf)
         .map_err(|e| Error::Other(e.to_string()))?;
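For reference, `ld_read` decodes the CAR framing of a u64 varint length followed by that many payload bytes. A simplified, self-contained version (String errors instead of the crate's `Error`; the `unsigned-varint` encode helpers are used only to build the test frame):

```rust
use std::io::{Cursor, Read};

// Simplified ld_read: read a varint length prefix, then `len` bytes.
fn ld_read<R: Read>(mut reader: &mut R) -> Result<Vec<u8>, String> {
    let len = unsigned_varint::io::read_u64(&mut reader).map_err(|e| e.to_string())?;
    let mut buf = Vec::with_capacity(len as usize);
    reader
        .take(len)
        .read_to_end(&mut buf)
        .map_err(|e| e.to_string())?;
    Ok(buf)
}

fn main() {
    // Frame b"hello" as varint(5) followed by the payload.
    let mut framed = Vec::new();
    let mut len_buf = unsigned_varint::encode::u64_buffer();
    framed.extend_from_slice(unsigned_varint::encode::u64(5, &mut len_buf));
    framed.extend_from_slice(b"hello");

    let mut cursor = Cursor::new(framed);
    assert_eq!(ld_read(&mut cursor).unwrap(), b"hello".to_vec());
}
```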
9 changes: 3 additions & 6 deletions node/db/src/lib.rs

@@ -49,16 +49,13 @@ pub trait Store {
     }

     /// Write slice of KV pairs.
-    fn bulk_write<K, V>(&self, keys: &[K], values: &[V]) -> Result<(), Error>
+    fn bulk_write<K, V>(&self, values: &[(K, V)]) -> Result<(), Error>
     where
         K: AsRef<[u8]>,
         V: AsRef<[u8]>,
     {
-        if keys.len() != values.len() {
-            return Err(Error::InvalidBulkLen);
-        }
-        keys.iter()
-            .zip(values)
+        values
+            .iter()
             .map(|(key, value)| self.write(key, value))
             .collect()
     }

Review comment from a ChainSafe member on the new `bulk_write` signature: "Nice. This actually makes a lot more sense."
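A quick sketch of the new call shape (hedged: `MemoryDB` implementing `Store` is assumed, as in the test changes below):

```rust
use db::{MemoryDB, Store};

fn main() {
    let store = MemoryDB::default();

    // Keys and values now travel together as tuples, so a length
    // mismatch is unrepresentable and the InvalidBulkLen check is gone.
    let pairs = [
        (b"k1".to_vec(), b"v1".to_vec()),
        (b"k2".to_vec(), b"v2".to_vec()),
    ];
    store.bulk_write(&pairs).unwrap();
    assert!(store.exists(b"k1".to_vec()).unwrap());
}
```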
9 changes: 2 additions & 7 deletions node/db/src/rocks.rs

@@ -89,18 +89,13 @@ impl Store for RocksDb {
         Ok(self.db()?.delete(key)?)
     }

-    fn bulk_write<K, V>(&self, keys: &[K], values: &[V]) -> Result<(), Error>
+    fn bulk_write<K, V>(&self, values: &[(K, V)]) -> Result<(), Error>
     where
         K: AsRef<[u8]>,
         V: AsRef<[u8]>,
     {
-        // Safety check to make sure kv lengths are the same
-        if keys.len() != values.len() {
-            return Err(Error::InvalidBulkLen);
-        }
-
         let mut batch = WriteBatch::default();
-        for (k, v) in keys.iter().zip(values.iter()) {
+        for (k, v) in values {
            batch.put(k, v);
         }
         Ok(self.db()?.write(batch)?)
13 changes: 7 additions & 6 deletions node/db/tests/subtests/mod.rs

@@ -68,10 +68,9 @@ pub fn bulk_write<DB>(db: &DB)
 where
     DB: Store,
 {
-    let keys = [[0], [1], [2]];
-    let values = [[0], [1], [2]];
-    db.bulk_write(&keys, &values).unwrap();
-    for k in keys.iter() {
+    let values = [([0], [0]), ([1], [1]), ([2], [2])];
+    db.bulk_write(&values).unwrap();
+    for (k, _) in values.iter() {
         let res = db.exists(k.clone()).unwrap();
         assert_eq!(res, true);
     }
@@ -83,7 +82,8 @@
 {
     let keys = [[0], [1], [2]];
     let values = [[0], [1], [2]];
-    db.bulk_write(&keys, &values).unwrap();
+    let kvs: Vec<_> = keys.iter().zip(values.iter()).collect();
+    db.bulk_write(&kvs).unwrap();
     let results = db.bulk_read(&keys).unwrap();
     for (result, value) in results.iter().zip(values.iter()) {
         match result {
@@ -99,7 +99,8 @@
 {
     let keys = [[0], [1], [2]];
     let values = [[0], [1], [2]];
-    db.bulk_write(&keys, &values).unwrap();
+    let kvs: Vec<_> = keys.iter().zip(values.iter()).collect();
+    db.bulk_write(&kvs).unwrap();
     db.bulk_delete(&keys).unwrap();
     for k in keys.iter() {
         let res = db.exists(k.clone()).unwrap();
2 changes: 1 addition & 1 deletion node/rpc/src/state_api.rs

@@ -16,7 +16,7 @@ use clock::ChainEpoch;
 use fil_types::SectorNumber;
 use jsonrpc_v2::{Data, Error as JsonRpcError, Params};
 use message::{
-    json::MessageReceiptJson,
+    message_receipt::json::MessageReceiptJson,
     unsigned_message::{json::UnsignedMessageJson, UnsignedMessage},
 };
 use serde::{Deserialize, Serialize};
32 changes: 32 additions & 0 deletions tests/conformance_tests/Cargo.toml
@@ -0,0 +1,32 @@
[package]
name = "conformance_tests"
version = "0.1.0"
authors = ["ChainSafe Systems <info@chainsafe.io>"]
edition = "2018"

[features]
submodule_tests = []

[dependencies]

[dev-dependencies]
base64 = { version = "0.12.1" }
walkdir = "2.3"
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
clock = { path = "../../node/clock" }
cid = { package = "forest_cid", path = "../../ipld/cid", features = ["cbor", "json"] }
forest_message = { path = "../../vm/message", features = ["json"] }
vm = { package = "forest_vm", path = "../../vm" }
db = { path = "../../node/db/" }
blockstore = { package = "ipld_blockstore", path = "../../ipld/blockstore/" }
forest_car = { path = "../../ipld/car" }
flate2 = "1.0"
encoding = { package = "forest_encoding", path = "../../encoding" }
interpreter = { path = "../../vm/interpreter/" }
runtime = { path = "../../vm/runtime/" }
fil_types = { path = "../../types" }
crypto = { package = "forest_crypto", path = "../../crypto" }
address = { package = "forest_address", path = "../../vm/address" }
regex = "1.0"
lazy_static = "1.4"
4 changes: 4 additions & 0 deletions tests/conformance_tests/src/lib.rs
@@ -0,0 +1,4 @@
// Copyright 2020 ChainSafe Systems
// SPDX-License-Identifier: Apache-2.0, MIT

#![cfg(feature = "submodule_tests")]
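For orientation: each vector in the submodule is a JSON document carrying a base64-encoded, gzip-compressed CAR plus pre- and postconditions. An illustrative sketch of the deserialization side (field names follow the filecoin-project/test-vectors schema of the time; the PR's actual types, and its CID handling via forest_cid's json feature, may differ):

```rust
use flate2::read::GzDecoder;
use serde::Deserialize;
use std::io::Read;

// Rough shape of a "message"-class test vector (illustrative only).
#[allow(dead_code)]
#[derive(Deserialize)]
struct TestVector {
    class: String,
    /// base64-encoded, gzip-compressed CAR holding the state trees.
    car: String,
    preconditions: serde_json::Value,
    apply_messages: Vec<serde_json::Value>,
    postconditions: serde_json::Value,
}

// Decode the vector's CAR payload: base64 first, then gunzip.
fn decode_car(gz_b64: &str) -> Result<Vec<u8>, String> {
    let compressed = base64::decode(gz_b64).map_err(|e| e.to_string())?;
    let mut out = Vec::new();
    GzDecoder::new(&compressed[..])
        .read_to_end(&mut out)
        .map_err(|e| e.to_string())?;
    Ok(out)
}

fn main() -> Result<(), String> {
    // Build a tiny gzipped payload so the example is self-checking.
    use flate2::{write::GzEncoder, Compression};
    use std::io::Write;
    let mut enc = GzEncoder::new(Vec::new(), Compression::default());
    enc.write_all(b"not a real CAR, just bytes")
        .map_err(|e| e.to_string())?;
    let gz = enc.finish().map_err(|e| e.to_string())?;

    let raw = serde_json::json!({
        "class": "message",
        "car": base64::encode(&gz),
        "preconditions": {},
        "apply_messages": [],
        "postconditions": {}
    })
    .to_string();

    let vector: TestVector = serde_json::from_str(&raw).map_err(|e| e.to_string())?;
    let car_bytes = decode_car(&vector.car)?;
    assert_eq!(car_bytes, b"not a real CAR, just bytes".to_vec());
    println!("class={}, decoded {} CAR bytes", vector.class, car_bytes.len());
    Ok(())
}
```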
1 change: 1 addition & 0 deletions tests/conformance_tests/test-vectors
Submodule test-vectors added at e3b6da