Improve Exchange Scraper Document #596

Open · wants to merge 7 commits into base: master
79 changes: 62 additions & 17 deletions documentation/tutorials/exchangescrapers.md
@@ -8,15 +8,15 @@ Now, let's assume you want to scrape a data source that provides trade information

```go
type APIScraper interface {
io.Closer
// ScrapePair returns a PairScraper that continuously scrapes trades for a
// single pair from this APIScraper
ScrapePair(pair dia.Pair) (PairScraper, error)
// FetchAvailablePairs returns a list with all available trade pairs (usually
// fetched from an exchange's API)
FetchAvailablePairs() (pairs []dia.Pair, err error)
// Channel returns a channel that can be used to receive trades
Channel() chan *dia.Trade
}
```
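
As a rough orientation, a scraper type that satisfies this interface typically bundles the trade channel together with its per-pair scrapers. The following is a minimal sketch only; the type and field names (`MySourceScraper`, `MySourcePairScraper`) are placeholders, not existing code:

```go
// Illustrative skeleton of an APIScraper implementation; adapt names and fields
// to your data source.
type MySourceScraper struct {
	// chanTrades is handed out via Channel() and receives every scraped trade.
	chanTrades chan *dia.Trade
	// pairScrapers keeps track of all pairs that are currently being scraped.
	pairScrapers map[string]*MySourcePairScraper
	closed       bool
}

// Channel returns the channel on which scraped trades are delivered.
func (s *MySourceScraper) Channel() chan *dia.Trade {
	return s.chanTrades
}

// ScrapePair, FetchAvailablePairs, and Close (for io.Closer) have to be
// implemented as well in order to satisfy the interface.
```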

@@ -27,15 +27,52 @@ Also, please take care of proper error handling and cleanup. More precisely, you
Furthermore, in order for our system to see your scraper, add a reference to it in `Config.go` in the dia package, and to the switch statement in `APIScraper.go` in the scrapers package:

```go
// pkg/dia/scraper/exchange-scraper/APIScraper.go
func NewAPIScraper(exchange string, key string, secret string) APIScraper {
	switch exchange {
	case dia.MySourceExchange:
		return NewMySourceScraper(key, secret, dia.MySourceExchange)
	default:
		return nil
	}
}
```
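
The entry in `Config.go` is usually just a named constant for the new source. A minimal sketch, assuming the naming convention of the existing exchange constants:

```go
// pkg/dia/Config.go (sketch; the exact position among the existing constants may differ)
const (
	// ...existing exchange constants...
	MySourceExchange = "MySource"
)
```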

If you are working on the Ethereum chain with a decentralized exchange, you can obtain your node API key from an environment variable.

```go
func NewMySourceScraper(exchange dia.Exchange) *MySourceScraper {
	// some initial stuff...
	log.Infof("Init rest and ws client for %s.", exchange.BlockChain.Name)
	// restDial and wsDial are the scraper's fallback node endpoints.
	restClient, err := ethclient.Dial(utils.Getenv(strings.ToUpper(exchange.BlockChain.Name)+"_URI_REST", restDial))
	if err != nil {
		log.Fatal("init rest client: ", err)
	}
	wsClient, err := ethclient.Dial(utils.Getenv(strings.ToUpper(exchange.BlockChain.Name)+"_URI_WS", wsDial))
	if err != nil {
		log.Fatal("init ws client: ", err)
	}
	// other stuff here...
}
```

You are not limited to this approach; feel free to establish the connection to the blockchain in whatever way fits your scraper.

For a working example, check out `CurvefiScraper.go` or other decentralized exchange scrapers.
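
For instance, once the ws client is available, one common pattern is to subscribe to new block headers and process the trades found in each block. The sketch below is illustrative; `processBlock` is a placeholder for your own trade-extraction logic:

```go
// Sketch: watch new block headers over the ws connection and hand each one
// to the scraper's own processing routine.
headers := make(chan *types.Header)
sub, err := wsClient.SubscribeNewHead(context.Background(), headers)
if err != nil {
	log.Fatal("subscribe to new heads: ", err)
}
go func() {
	for {
		select {
		case err := <-sub.Err():
			log.Error("header subscription: ", err)
			return
		case header := <-headers:
			// processBlock is a placeholder for your own trade-extraction logic.
			s.processBlock(header.Number)
		}
	}
}()
```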

For a centralized exchange, follow the provider's API documentation and establish the connection accordingly. For an illustration you can have a look at `KrakenScraper.go`.
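
As a rough illustration of the CEX case, most scrapers open a websocket (or poll a REST endpoint) against the provider's documented trade stream. The endpoint, subscription payload, and message format below are placeholders and do not correspond to any particular exchange's real API:

```go
// Sketch using gorilla/websocket; replace URL and payload with your provider's values.
conn, _, err := websocket.DefaultDialer.Dial("wss://api.myexchange.example/ws", nil)
if err != nil {
	log.Fatal("connect to ws API: ", err)
}
// Subscribe to the trade stream for the pairs you want to scrape.
if err := conn.WriteJSON(map[string]interface{}{"op": "subscribe", "channel": "trades"}); err != nil {
	log.Fatal("subscribe: ", err)
}
go func() {
	defer conn.Close()
	for {
		var msg map[string]interface{}
		if err := conn.ReadJSON(&msg); err != nil {
			log.Error("read ws message: ", err)
			return
		}
		// Parse msg into a dia.Trade and send it to the scraper's trade channel here.
	}
}()
```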

Also, if you want to read data from a smart contract, install `abigen` and generate Go bindings from the ABI provided by the exchange.

```sh
go install github.com/ethereum/go-ethereum/cmd/abigen@latest

abigen --abi myexchange/myexchange.abi --pkg myexchange --type MyExchange --out myexchange/myexchange.go
```

Put your ABI file and the generated code into a folder under `exchange-scrapers/` named after the exchange.
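
The generated package provides a typed binding that you instantiate with the contract address and one of the clients from above. A minimal sketch (the constructor name follows the `--type` flag passed to abigen; the address is a placeholder):

```go
// NewMyExchange is the constructor generated by abigen from the ABI.
exchangeContract, err := myexchange.NewMyExchange(common.HexToAddress("0x0000000000000000000000000000000000000000"), restClient)
if err != nil {
	log.Fatal("bind exchange contract: ", err)
}
// Call the generated view methods, which mirror your contract's ABI, from here.
_ = exchangeContract
```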

## Steps to run a scraper locally

1. Navigate to the `deployments/local/exchange-scraper` directory of the project.
2. Run the required services using `docker-compose up -d`; this starts and prepares the Redis, PostgreSQL, and InfluxDB databases.
3. Set the required environment variables using the following commands:
@@ -54,13 +91,21 @@ export REDISURL=localhost:6379

Or simply by sourcing the `local.env` file inside the `deployments/local/exchange-scraper` directory.
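
For example, from the repository root (assuming a bash-compatible shell):

```sh
source deployments/local/exchange-scraper/local.env
```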

If you are working on the Ethereum chain, export your node API endpoints from the terminal:

```sh
export ETHEREUM_URI_REST=${YOUR_API_REST_ENDPOINT}
export ETHEREUM_URI_WS=${YOUR_API_WS_ENDPOINT}
```

Tip: Find your favourite Ethereum node API provider at [EthereumNodes](https://ethereumnodes.com).

Also, if you are working on another Ethereum-compatible chain, simply replace the `ETHEREUM` prefix with your chain's name as it is defined in the code.
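
For example, for a chain that the code base refers to as `MYCHAIN`, the exports would become:

```sh
export MYCHAIN_URI_REST=${YOUR_API_REST_ENDPOINT}
export MYCHAIN_URI_WS=${YOUR_API_WS_ENDPOINT}
```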

4. Execute `main.go` from `cmd/services/pairDiscoveryServices` to fetch the available pairs and store them in the Redis database.
5. Finally, run the collector with the `-exchange` flag set to your source:

```sh
cd cmd/exchange-scrapers/collector
go run collector.go -exchange MySource
```

For a full illustration of a centralized exchange scraper you can have a look at `KrakenScraper.go`.

14 changes: 7 additions & 7 deletions documentation/tutorials/ratescrapers.md
@@ -1,10 +1,10 @@
# Write your own rate scraper

These instructions concern writing scrapers for single units characterised by a (floating point) number. For scrapers describing the relation between pairs of units, i.e. exchange rates, see the instructions in [exchangescrapers.md](exchangescrapers.md).

## Instructions for the addition of a rate scraper

In order to add your own scraper for a new data source, you must adhere to our format. Create the package file `UpdateMYRATE.go` in the package `/internal/pkg/ratescrapers`. The central method is `UpdateMYRATE()`. This method acts on a RateScraper struct which is defined in `RateScraper.go` in the ratescrapers package. For instance, for the Euro Short-Term Rate (ESTER) issued by the ECB, `UpdateESTER.go` would look like

```go
func (s *RateScraper) UpdateESTER() error {
@@ -16,14 +16,14 @@ The scraped data has to be written into a struct of type InterestRate from `pkg/

```go
type InterestRate struct {
Symbol string
Value float64
Time time.Time
Source string
}
```

and sent to the channel `chanInterestRate` of the RateScraper `s`. In order to write a new scraper, it is not imperative to understand the architecture of the pathway from top to bottom, but it might be helpful. For a first impression you can have a look at the following [diagram](https://github.com/diadata-org/diadata/tree/master/documentation/tutorials/rate_scraper_diagram_down.pdf).
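
Concretely, the tail of such an Update method fills the struct and pushes it onto the channel. The sketch below assumes the struct lives in a `models` package and that the channel carries pointers; adjust both to how `RateScraper.go` declares them in your tree:

```go
// Sketch: package the scraped rate and hand it over for further processing.
rate := models.InterestRate{
	Symbol: "ESTER",
	Value:  scrapedValue,    // float64 parsed from the ECB data source
	Time:   publicationTime, // timestamp parsed from the ECB data source
	Source: "ECB",
}
s.chanInterestRate <- &rate
```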

**Remark**: Parsing XML in Go is not always straightforward. A useful resource for parsing can be found here: [github.com/gnewton/chidley](https://github.com/gnewton/chidley)

3 changes: 1 addition & 2 deletions documentation/tutorials/write-your-own-rate-scraper.md
@@ -1,6 +1,6 @@
# Write your own rate scraper

These instructions concern writing scrapers for single units characterised by a \(floating point\) number. For scrapers describing the relation between pairs of units, i.e. exchange rates, see the instructions in [exchangescrapers.md](exchangescrapers.md).

## Instructions for the addition of a rate scraper

@@ -46,4 +46,3 @@ type InterestRate struct {
```

and sent to the channel `chanInterestRate` of `s`. In order to write a new scraper, it is not imperative to understand the architecture of the pathway from top to bottom, but it might be helpful. For a first impression you can have a look at the following [diagram](https://github.com/diadata-org/diadata/tree/master/documentation/tutorials/rate_scraper_diagram_down.pdf).