Write your own exchange scraper

Add your own scraper

Before you begin writing a scraper, please check whether the exchange under consideration offers integration through websockets. If it does, please implement your scraper using the websocket instead of a RESTful API.
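For the websocket case, the read loop typically dials the exchange's stream once and then consumes messages until the connection closes. The following sketch uses github.com/gorilla/websocket; the URL, the helper name, and the message handling are assumptions for illustration only:

```go
package scrapers

import (
	"log"

	"github.com/gorilla/websocket"
)

// readTradesWS is a hypothetical helper illustrating a websocket-based
// read loop: dial the stream once, then consume messages until the
// connection closes.
func readTradesWS(wsURL string) {
	conn, _, err := websocket.DefaultDialer.Dial(wsURL, nil)
	if err != nil {
		log.Println("dial:", err)
		return
	}
	defer conn.Close()
	for {
		_, message, err := conn.ReadMessage()
		if err != nil {
			return // connection closed or read error
		}
		// parse message into a dia.Trade and forward it on the
		// scraper's trade channel (source-specific)
		_ = message
	}
}
```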
Now, let's assume you want to scrape a data source that provides trade information. Create a new file in exchange-scrapers/ and call it MySourceScraper.go. In order for us to be able to call your scraper from our system, you should introduce a type MySourceScraper struct that implements the APIScraper interface from the scrapers package:
```go
type APIScraper interface {
	io.Closer
	// ScrapePair returns a PairScraper that continuously scrapes trades for a
	// single pair from this APIScraper
	ScrapePair(pair dia.Pair) (PairScraper, error)
	// FetchAvailablePairs returns a list with all available trade pairs (usually
	// fetched from an exchange's API)
	FetchAvailablePairs() (pairs []dia.Pair, err error)
	// Channel returns a channel that can be used to receive trades
	Channel() chan *dia.Trade
}
```
From the MySourceScraper type you derive a MySourcePairScraper type which restricts the scraper to a specific pair. Next, write a function with signature NewMySourceScraper(key string, secret string, exchangeName string) *MySourceScraper that initializes the scraper (this matches the call in the switch statement further below).

We suggest that this function starts a method func (s *MySourceScraper) mainLoop() in a goroutine, which constantly emits trade information through the trade channel of MySourceScraper as long as the channel is open. The collection of new trade information inside mainLoop() should be done by an update method with signature func (s *MySourceScraper) Update(). Finally, in order to implement the APIScraper interface, include a ScrapePair method that returns a MySourcePairScraper for a specific pair, so our main collection method can iterate over all available trading pairs. A minimal skeleton along these lines is sketched below.
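Putting these pieces together, one possible skeleton looks as follows. This is a sketch only: the struct fields, the polling interval, and the fetchTrades helper are illustrative assumptions rather than part of the dia codebase, and the lifecycle methods (Error(), Close(), cleanup()) follow in the next paragraph:

```go
package scrapers

import (
	"errors"
	"sync"
	"time"

	"github.com/diadata-org/diadata/pkg/dia"
)

// MySourceScraper implements the APIScraper interface for the
// hypothetical data source "MySource".
type MySourceScraper struct {
	exchangeName string
	shutdown     chan struct{} // signals the main loop to stop
	shutdownDone chan struct{} // signals that cleanup has finished
	errorLock    sync.RWMutex  // guards access to error
	error        error
	closed       bool
	pairScrapers map[string]*MySourcePairScraper // pairs we are subscribed to
	chanTrades   chan *dia.Trade
}

// MySourcePairScraper restricts the scraper to a specific pair.
type MySourcePairScraper struct {
	parent *MySourceScraper
	pair   dia.Pair
	closed bool
}

// NewMySourceScraper initializes the scraper and starts its main loop
// in a goroutine.
func NewMySourceScraper(key string, secret string, exchangeName string) *MySourceScraper {
	s := &MySourceScraper{
		exchangeName: exchangeName,
		shutdown:     make(chan struct{}),
		shutdownDone: make(chan struct{}),
		pairScrapers: make(map[string]*MySourcePairScraper),
		chanTrades:   make(chan *dia.Trade),
	}
	go s.mainLoop()
	return s
}

// mainLoop periodically collects new trades until the scraper is closed.
func (s *MySourceScraper) mainLoop() {
	for {
		select {
		case <-time.After(5 * time.Second): // polling interval is an assumption
			s.Update()
		case <-s.shutdown:
			s.cleanup(nil)
			return
		}
	}
}

// Update fetches the latest trades for all subscribed pairs and sends
// them to the trade channel.
func (s *MySourceScraper) Update() {
	for _, ps := range s.pairScrapers {
		trades, err := s.fetchTrades(ps.pair)
		if err != nil {
			continue
		}
		for _, t := range trades {
			s.chanTrades <- t
		}
	}
}

// fetchTrades is a hypothetical helper standing in for the
// source-specific API call that retrieves recent trades for a pair.
func (s *MySourceScraper) fetchTrades(pair dia.Pair) ([]*dia.Trade, error) {
	return nil, nil // placeholder
}

// ScrapePair registers a pair and returns a PairScraper for it.
func (s *MySourceScraper) ScrapePair(pair dia.Pair) (PairScraper, error) {
	if s.closed {
		return nil, errors.New("MySourceScraper: Call ScrapePair on closed scraper")
	}
	ps := &MySourcePairScraper{parent: s, pair: pair}
	s.pairScrapers[pair.ForeignName] = ps
	return ps, nil
}

// FetchAvailablePairs would query the data source's pairs endpoint;
// stubbed here for brevity.
func (s *MySourceScraper) FetchAvailablePairs() (pairs []dia.Pair, err error) {
	return nil, nil
}

// Channel returns the channel on which scraped trades are delivered.
func (s *MySourceScraper) Channel() chan *dia.Trade {
	return s.chanTrades
}
```

Keeping a map of pair scrapers lets Update() iterate over exactly the pairs the collector has subscribed to.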
Also, please take care of proper error handling and cleanup. More precisely, include a method Error() that returns an error as soon as the scraper's channel closes, as well as methods Close() and cleanup() that handle the closing and shutting down of channels. A possible layout for these methods is sketched below.
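Continuing the sketch above, these lifecycle methods could be structured as follows. The MySourcePairScraper methods are included because a pair scraper also needs to satisfy the PairScraper interface (Close, Error, and Pair); the bodies remain illustrative assumptions:

```go
// Error returns an error as soon as the scraper's trade channel has
// closed, and nil otherwise.
func (s *MySourceScraper) Error() error {
	s.errorLock.RLock()
	defer s.errorLock.RUnlock()
	return s.error
}

// Close signals the main loop to shut down and waits for cleanup.
func (s *MySourceScraper) Close() error {
	if s.closed {
		return errors.New("MySourceScraper: Already closed")
	}
	close(s.shutdown)
	<-s.shutdownDone
	s.closed = true
	return s.Error()
}

// cleanup stores a shutdown error (if any) and closes the channels.
func (s *MySourceScraper) cleanup(err error) {
	s.errorLock.Lock()
	defer s.errorLock.Unlock()
	if err != nil {
		s.error = err
	}
	close(s.chanTrades)
	close(s.shutdownDone)
}

// Close unsubscribes the pair scraper from its pair.
func (ps *MySourcePairScraper) Close() error {
	ps.closed = true
	return nil
}

// Error returns an error once the parent scraper's channel has closed.
func (ps *MySourcePairScraper) Error() error {
	return ps.parent.Error()
}

// Pair returns the pair this scraper is subscribed to.
func (ps *MySourcePairScraper) Pair() dia.Pair {
	return ps.pair
}
```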
Furthermore, in order for our system to see your scraper, add a reference to it in Config.go in the dia package, and to the switch statement in APIScraper.go in the scrapers package:
```go
func NewAPIScraper(exchange string, key string, secret string) APIScraper {
	switch exchange {
	case dia.MySourceExchange:
		return NewMySourceScraper(key, secret, dia.MySourceExchange)
	default:
		return nil
	}
}
```
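The reference in Config.go usually amounts to a name constant for the new exchange. A minimal sketch, assuming the constant used in the switch above:

```go
// in Config.go (dia package) -- sketch
const MySourceExchange = "MySource"
```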

Steps to run a scraper locally

1. Navigate to the deployments/local/exchange-scraper directory of the project.
2. Run the required services using docker-compose up -d. This starts and prepares the Redis, PostgreSQL, and InfluxDB databases.
3. Set the required environment variables using the following commands:
```bash
export USE_ENV=true
export INFLUXURL=http://localhost:8086
export INFLUXUSER=test
export INFLUXPASSWORD=test
export POSTGRES_USER=postgres
export POSTGRES_PASSWORD=password
export POSTGRES_HOST=localhost
export POSTGRES_DB=postgres
export REDISURL=localhost:6379
```
Alternatively, you can simply source the local.env file inside the deployments/local/exchange-scraper directory.
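For example, in a POSIX shell:

```bash
cd deployments/local/exchange-scraper
source local.env
```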
4. Execute main.go from cmd/services/pairDiscoveryServices to fetch the available pairs and store them in the Redis database.
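A minimal invocation for this step, assuming main.go needs no flags here (check the service's flag definitions):

```bash
cd cmd/services/pairDiscoveryServices
go run main.go
```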
5. Finally, run the scraping executable with the exchange flag set as follows:
```bash
cd cmd/exchange-scrapers/collector
go run collector.go -exchange MySource
```
For a full illustration, have a look at KrakenScraper.go.