Blockchain Network Analysis

Project information

The Portfolio for Blockchain Network Analysis: A Comparative Study of Decentralized Banks


Figure: flow chart of the blockchain network analysis methodology.

CC-BY 4.0 Datasets on Harvard Dataverse

Yufan Zhang; Zichao Chen; Yutong Sun; Yulin Liu; Luyao Zhang, 2023, "Replication Data for: Blockchain Network Analysis: A Comparative Study of Decentralized Banks", https://doi.org/10.7910/DVN/CZSB6C, Harvard Dataverse, V2, UNF:6:0afchMq55rHG86p/ikY02g== [fileUNF]

@data{DVN/CZSB6C_2023,
author = {Yufan Zhang and Zichao Chen and Yutong Sun and Yulin Liu and Luyao Zhang},
publisher = {Harvard Dataverse},
title = {{Replication Data for: Blockchain Network Analysis: A Comparative Study of Decentralized Banks}},
UNF = {UNF:6:0afchMq55rHG86p/ikY02g==},
year = {2023},
version = {V2},
doi = {10.7910/DVN/CZSB6C},
url = {https://doi.org/10.7910/DVN/CZSB6C}
}

Repository structure

This repository includes supplementary resources, data, and code.

* The structure below is generated by the ASCII Tree Generator.

.
├── Code
│   ├── analysis.ipynb
│   ├── extract_feature.py
│   ├── query_tx.ipynb
│   └── utils
│       ├── cp_test.py
│       └── network_fea.py
├── Data
│   ├── processedData
│   │   ├── AAVE_2022-07-13
│   │   ├── COMP_2022-07-13
│   │   ├── Dai_2022-07-13
│   │   ├── LQTY_2022-07-13
│   │   └── LUSD_2022-07-13
│   └── queriedData
│       ├── AAVE_2022-07-13.csv
│       ├── COMP_2022-07-13.csv
│       ├── Dai_2022-07-13.csv
│       ├── LQTY_2022-07-13.csv
│       └── LUSD_2022-07-13.csv
├── Figure
│   ├── AAVE_2020-10-02-2022-07-12
│   ├── BoxPlots
│   ├── COMP_2020-03-06-2022-07-12
│   ├── Dai_2019-11-18-2022-07-12
│   ├── LQTY_2021-04-05-2022-07-12
│   └── LUSD_2021-04-05-2022-07-12
├── README.md
└── requirements.txt

Data

Token Name    Queried Data (you need to download these yourself)    Processed Data
AAVE ./Data/queriedData/AAVE_2022-07-13.csv ./Data/processedData/AAVE_2022-07-13/*
COMP ./Data/queriedData/COMP_2022-07-13.csv ./Data/processedData/COMP_2022-07-13/*
Dai ./Data/queriedData/Dai_2022-07-13.csv ./Data/processedData/Dai_2022-07-13/*
LQTY ./Data/queriedData/LQTY_2022-07-13.csv ./Data/processedData/LQTY_2022-07-13/*
LUSD ./Data/queriedData/LUSD_2022-07-13.csv ./Data/processedData/LUSD_2022-07-13/*

Code

Content File
Query the transaction data in a Kaggle kernel (see the query sketch below) ./Code/query_tx.ipynb
Extract the network features ./Code/extract_feature.py
Analyze the features and produce the visualizations ./Code/analysis.ipynb
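
The query notebook pulls per-token ERC-20 transfer records from the public Ethereum dataset on BigQuery. The snippet below is only a minimal sketch of that kind of query, not the exact SQL in ./Code/query_tx.ipynb; it assumes the Kaggle notebook has the BigQuery integration enabled and uses the public bigquery-public-data.crypto_ethereum.token_transfers table (the LUSD contract address is taken from the table in the How to use section).

    # Minimal sketch (not the repository's query_tx.ipynb): pull one token's
    # transfer records from the public Ethereum dataset on BigQuery.
    from google.cloud import bigquery

    client = bigquery.Client()  # assumes the Kaggle BigQuery integration is enabled

    LUSD = "0x5f98805A4E8be255a32880FDeC7F6728C6568bA0"
    QUERY = f"""
        SELECT from_address, to_address, value, block_timestamp
        FROM `bigquery-public-data.crypto_ethereum.token_transfers`
        WHERE token_address = LOWER('{LUSD}')
          AND DATE(block_timestamp) < '2022-07-13'
    """

    df = client.query(QUERY).to_dataframe()
    df.to_csv("LUSD_2022-07-13.csv", index=False)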

How to use

Contract address

Token Protocol Contract Address Start Date
LUSD Liquity 0x5f98805A4E8be255a32880FDeC7F6728C6568bA0 2021-04-05
LQTY Liquity 0x6DEA81C8171D0bA574754EF6F8b412F2Ed88c54D 2021-04-05
AAVE Aave 0x7Fc66500c84A76Ad7e9c93437bFc5Ac33E2DDaE9 2020-10-02
COMP Compound 0xc00e94Cb662C3520282E6f5717214004A7f26888 2020-03-04
Dai MakerDAO 0x6B175474E89094C44Da98b954EedeAC495271d0F 2019-11-13
  1. Create a conda environment with Python>=3.8

    conda create --name bna python=3.8
    conda activate bna
  2. Install required packages

    pip install -r requirements.txt
  3. Query token transaction records via the Kaggle integration with BigQuery (see ./Code/query_tx.ipynb and the query sketch in the Code section above)

  4. Extract network features and the core-periphery test results (a minimal feature-extraction sketch follows this list)

    cd ./Code
    nohup python extract_feature.py --token-name LQTY >> ./logs/LQTY.txt
    nohup python extract_feature.py --token-name LUSD >> ./logs/LUSD.txt
    nohup python extract_feature.py --token-name AAVE >> ./logs/AAVE.txt
    nohup python extract_feature.py --token-name COMP >> ./logs/COMP.txt
    nohup python extract_feature.py --token-name Dai >> ./logs/Dai.txt
    • Note: If you are working over SSH, you may want to run the Python scripts with nohup, since the core-periphery test can take hours.
    • The output data will be saved to ./Data/processedData/{token_name}_{data_collected_date}/
  5. Register an Infura project for Ethereum API access, and note the project's endpoint URL

    • The Infura endpoint is used, through Web3.py, to detect whether an address is a contract account (CA) or an externally owned account (EOA); a minimal sketch follows this list
  6. Run analysis.ipynb to analyze the features and produce the visualizations (a minimal plotting sketch follows this list)

    • The output figures will be saved to ./Figure/{token_name}_{start_date}_{end_date}
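
For orientation, here is a minimal sketch of daily network-feature extraction with networkx. It is not the repository's extract_feature.py (which computes more features and runs the core-periphery test via ./Code/utils/), and the column names assume the queried CSV keeps the BigQuery schema (from_address, to_address, block_timestamp).

    # Minimal sketch of step 4: build a daily transfer network and compute a few
    # global features. The repository's extract_feature.py computes more features
    # and the core-periphery test.
    import pandas as pd
    import networkx as nx

    df = pd.read_csv("../Data/queriedData/LQTY_2022-07-13.csv",
                     parse_dates=["block_timestamp"])

    rows = []
    for day, day_df in df.groupby(df["block_timestamp"].dt.date):
        g = nx.from_pandas_edgelist(day_df, source="from_address",
                                    target="to_address", create_using=nx.DiGraph)
        rows.append({
            "date": day,
            "num_nodes": g.number_of_nodes(),
            "num_edges": g.number_of_edges(),
            "density": nx.density(g),
            "avg_clustering": nx.average_clustering(g.to_undirected()),
        })

    pd.DataFrame(rows).to_csv("daily_features_sketch.csv", index=False)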
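
Step 5 uses Web3.py with an Infura endpoint to tell contract accounts (CA) apart from externally owned accounts (EOA). A minimal sketch, assuming web3.py >= 6 and a placeholder Infura project ID:

    # Minimal sketch of CA/EOA detection: a contract account has deployed
    # bytecode at its address, an EOA does not. Replace <PROJECT_ID> with the
    # ID of your own Infura project.
    from web3 import Web3

    ENDPOINT = "https://mainnet.infura.io/v3/<PROJECT_ID>"
    w3 = Web3(Web3.HTTPProvider(ENDPOINT))

    def is_contract(address: str) -> bool:
        """Return True for a contract account, False for an EOA."""
        return len(w3.eth.get_code(Web3.to_checksum_address(address))) > 0

    # The Dai token contract from the table above should be classified as a CA.
    print(is_contract("0x6B175474E89094C44Da98b954EedeAC495271d0F"))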
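
Step 6's figures come from analysis.ipynb. As a minimal sketch of the plotting side (the file and column names follow the feature-extraction sketch above, not the repository's actual output format):

    # Minimal plotting sketch: visualize one daily network feature over time and
    # save the figure. The repository's analysis.ipynb saves its figures under
    # ./Figure/{token_name}_{start_date}_{end_date}/.
    import pandas as pd
    import matplotlib.pyplot as plt

    features = pd.read_csv("daily_features_sketch.csv", parse_dates=["date"])

    fig, ax = plt.subplots(figsize=(8, 4))
    ax.plot(features["date"], features["num_nodes"])
    ax.set_xlabel("Date")
    ax.set_ylabel("Number of nodes")
    ax.set_title("LQTY daily transfer-network size (sketch)")
    fig.savefig("num_nodes_sketch.png", dpi=150)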

Acknowledgements

Code derived and adapted from: BNS
