Add a new exchange scraper

Add your scraper

Before you begin writing a scraper, please check if the exchange offers integration through WebSockets. If it does, implement the scraper using the WebSocket protocol instead of a RESTful API.

To scrape data from an exchange data source, called "MySource" in this example, follow these steps:

  1. Create a new file in the directory pkg/dia/scraper/exchange-scrapers/ with the filename MySourceScraper.go.

  2. To allow the platform to call your scraper, make it conform to the APIScraper interface from the scrapers package:

type APIScraper interface {
	io.Closer
	ScrapePair(pair dia.Pair) (PairScraper, error)
	FetchAvailablePairs() (pairs []dia.Pair, err error)
	Channel() chan *dia.Trade
}

Start building your scraper by writing a function with the signature NewMySourceScraper(exchangeName string) *MySourceScraper. We suggest that this function start a mainLoop() method in a goroutine that continuously receives trade information and pushes it into the trade channel of MySourceScraper for as long as the channel is open, as sketched below.
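
A minimal sketch of such a constructor and main loop is shown below. It assumes a WebSocket connection via the gorilla/websocket package and a made-up endpoint URL (mySourceWSEndpoint); the subscription messages and the mapping of exchange messages into dia.Trade are exchange-specific and only indicated as comments. The cleanup() method referenced here is sketched further below.

package scrapers

import (
	"log"

	"github.com/diadata-org/diadata/pkg/dia"
	"github.com/gorilla/websocket"
)

// Placeholder endpoint; replace it with MySource's real WebSocket URL.
const mySourceWSEndpoint = "wss://api.mysource.example/ws"

// MySourceScraper collects trades from the (hypothetical) MySource exchange.
type MySourceScraper struct {
	wsClient     *websocket.Conn
	chanTrades   chan *dia.Trade
	shutdownDone chan struct{}
	exchangeName string
	err          error
	closed       bool
}

// NewMySourceScraper connects to the exchange and starts mainLoop() in a
// goroutine, so trades keep flowing into the trade channel until it is closed.
func NewMySourceScraper(exchangeName string) *MySourceScraper {
	s := &MySourceScraper{
		chanTrades:   make(chan *dia.Trade),
		shutdownDone: make(chan struct{}),
		exchangeName: exchangeName,
	}
	conn, _, err := websocket.DefaultDialer.Dial(mySourceWSEndpoint, nil)
	if err != nil {
		log.Println("MySourceScraper: dial:", err)
		s.err = err
		return s
	}
	s.wsClient = conn
	go s.mainLoop()
	return s
}

// mainLoop receives exchange messages and forwards them as trades until the
// connection fails or Close() is called.
func (s *MySourceScraper) mainLoop() {
	defer s.cleanup()
	for {
		var msg map[string]interface{}
		if err := s.wsClient.ReadJSON(&msg); err != nil {
			s.err = err
			return
		}
		// Translate the exchange-specific message into a dia.Trade here;
		// the field mapping depends on MySource's payload format.
		s.chanTrades <- &dia.Trade{}
	}
}

// Channel returns the channel on which scraped trades are delivered.
func (s *MySourceScraper) Channel() chan *dia.Trade {
	return s.chanTrades
}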

To implement the interface, include the ScrapePair method, which returns a MySourcePairScraper for a specific pair so that the main collection method can iterate over all available trading pairs. Note that MySourcePairScraper must itself implement the PairScraper interface (see the sketch below).
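
Continuing the sketch above, ScrapePair and a stub FetchAvailablePairs could look roughly like this. It assumes the PairScraper interface requires Pair(), Error(), and io.Closer, as in the existing scrapers, and it needs the errors package added to the imports.

// MySourcePairScraper implements the PairScraper interface for a single pair.
type MySourcePairScraper struct {
	parent *MySourceScraper
	pair   dia.Pair
	closed bool
}

// ScrapePair subscribes to trades for the given pair on the exchange and
// returns a PairScraper for it, so the collector can iterate over all pairs.
func (s *MySourceScraper) ScrapePair(pair dia.Pair) (PairScraper, error) {
	if s.err != nil {
		return nil, s.err
	}
	if s.closed {
		return nil, errors.New("MySourceScraper: ScrapePair called on closed scraper")
	}
	// Send MySource's (exchange-specific) subscription message for this
	// pair over the WebSocket connection here.
	return &MySourcePairScraper{parent: s, pair: pair}, nil
}

// FetchAvailablePairs returns all pairs tradable on the exchange; the REST
// endpoint and response format are exchange-specific, so this is a stub.
func (s *MySourceScraper) FetchAvailablePairs() (pairs []dia.Pair, err error) {
	return []dia.Pair{}, nil
}

// Pair returns the pair this PairScraper is subscribed to.
func (ps *MySourcePairScraper) Pair() dia.Pair {
	return ps.pair
}

// Error returns the parent scraper's error once its trade channel has closed.
func (ps *MySourcePairScraper) Error() error {
	return ps.parent.Error()
}

// Close stops scraping this pair; unsubscribe here if the exchange supports it.
func (ps *MySourcePairScraper) Close() error {
	ps.closed = true
	return nil
}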

Also, to ensure proper error handling and cleanup, include an Error() method, which returns an error as soon as the scraper's channel closes. In addition, Close() and cleanup() methods should handle shutting down the connection and closing the channels, as in the sketch below.
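
One possible way to sketch these lifecycle methods, continuing the example above: Close() shuts down the WebSocket connection, which makes mainLoop() return and run cleanup(), so the trade channel is closed exactly once.

// cleanup closes the trade channel and signals that the main loop has stopped.
func (s *MySourceScraper) cleanup() {
	close(s.chanTrades)
	close(s.shutdownDone)
}

// Error returns an error as soon as the scraper's trade channel is closed,
// and nil otherwise.
func (s *MySourceScraper) Error() error {
	return s.err
}

// Close shuts down the WebSocket connection, which ends mainLoop() and
// triggers cleanup().
func (s *MySourceScraper) Close() error {
	if s.closed {
		return errors.New("MySourceScraper: Close called on closed scraper")
	}
	s.closed = true
	if s.wsClient != nil {
		s.wsClient.Close()
		<-s.shutdownDone
	}
	return s.err
}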

For a better understanding of how to set up a scraper, refer to CoinBaseScraper.go, an existing exchange scraper that illustrates the overall structure and logic.

  3. To make the scraper visible to the system, add a reference to it in Config.go in the dia package:

const (
  MySourceExchange = "MySource"
)
  4. Add a case for your scraper in the switch statement in the pkg/dia/scraper/exchange-scrapers/APIScraper.go file:

func NewAPIScraper(exchange string, key string, secret string) APIScraper {
	switch exchange {
	case dia.MySourceExchange:
		// Pass key and secret to your constructor as well if the exchange
		// requires authenticated endpoints.
		return NewMySourceScraper(dia.MySourceExchange)
	}
}
  5. Finally, add the exchange's pairs to the config/MySourceExchange.json config file:

{
  "Coins": [
    {
      "Exchange": "MySource",
      "ForeignName": "QUICK-USD",
      "Ignore": false,
      "Symbol": "QUICK"
    }
  ]
}

Steps to run the scraper locally

  1. Modify the build/Dockerfile-genericCollector and build/Dockerfile-pairDiscoveryService files and add the following two lines before the RUN go mod tidy step:

COPY . /diadata
RUN go mod edit -replace github.com/diadata-org/diadata=/diadata
  2. Build the necessary service containers by running the following commands:

minikube image build -t diadata.pairdiscoveryservice:latest -f build/Dockerfile-pairDiscoveryService .
minikube image build -t diadata.exchangescrapercollector:latest -f build/Dockerfile-genericCollector .
  3. Create a manifest for the new exchange scraper by creating a new mysource.yaml file. You can refer to existing files for guidance or use the following template:

apiVersion: "v1"
kind: Pod
metadata:
  name: "exchangescraper-mysource"
spec:
  containers:
  - name: exchangescraper-mysource
    image: diadata.exchangescrapercollector:latest
    imagePullPolicy: Never
    command: ["collector"]
    args: ["-exchange=MySource", "-mode=current", "-pairsfile=true"]
    env:
    - name: USE_ENV
      value: "true"
    - name: POSTGRES_USER
      value: "postgres"
    - name: POSTGRES_PASSWORD
      value: "password"
    - name: POSTGRES_DB
      value: "postgres"
    - name: POSTGRES_HOST
      value: "postgres.default.svc.cluster.local"
    - name: INFLUXURL
      value: "http://influx.default.svc.cluster.local:8086"
    - name: INFLUXUSER
      value: "test"
    - name: INFLUXPASSWORD
      value: "testtest"
    - name: REDISURL
      value: "redis.default.svc.cluster.local:6379"
    - name: KAFKAURL
      value: "kafka.default.svc.cluster.local:9094"
  initContainers:
  - name: pairdiscovery-mysource
    image: diadata.pairdiscoveryservice:latest
    imagePullPolicy: Never
    command: ["pairDiscoveryService"]
    args: ["-exchange=MySource", "-mode=verification"]
    env:
    - name: USE_ENV
      value: "true"
    - name: POSTGRES_USER
      value: "postgres"
    - name: POSTGRES_PASSWORD
      value: "password"
    - name: POSTGRES_DB
      value: "postgres"
    - name: POSTGRES_HOST
      value: "postgres.default.svc.cluster.local"
    - name: INFLUXURL
      value: "http://influx.default.svc.cluster.local:8086"
    - name: INFLUXUSER
      value: "test"
    - name: INFLUXPASSWORD
      value: "testtest"
    - name: REDISURL
      value: "redis.default.svc.cluster.local:6379"
    - name: KAFKAURL
      value: "kafka.default.svc.cluster.local:9094"
  4. Before running the manifest, create a new entry in the database for the new exchange:

kubectl exec -it deployment/postgres -- psql -U postgres -c "INSERT INTO exchange (exchange_id, \"name\", centralized, bridge, contract, blockchain, rest_api, ws_api, pairs_api, watchdog_delay, scraper_active) VALUES(gen_random_uuid(), 'MySource', true, false, '', '', '', 'wss://mysource.com', 'https://mysource.com', 300, true);"
  5. Deploy the manifest using the following kubectl command:

kubectl create -f mysource.yaml

Hooray 🎉 Your scraper should now be running.


Exchange Commands

CEX

pairDiscoveryService -exchange=ExchangeName -mode=verification
go mod tidy -go=1.16 && go mod tidy -go=1.17 && go install && collector -exchange=ExchangeName -mode=current -pairsfile=true

DEX

go mod tidy -go=1.16 && go mod tidy -go=1.17 && go install && collector -exchange=ExchangeName -mode=current -pairsfile=true

Exchange Test

go mod tidy -go=1.16 && go mod tidy -go=1.17 && go install && collector -exchange=EXCHANGE_NAME -mode=current -pairsfile=true

Centralized:

  • Bitfinex: go mod tidy -go=1.16 && go mod tidy -go=1.17 && go install && collector -exchange=Bitfinex -mode=current -pairsfile=true

    • rm ./collector.go && cp /mnt/env-context/cmd/exchange-scrapers/collector/collector.go ./collector.go && go mod edit -replace github.com/diadata-org/diadata=/mnt/env-context && go mod tidy -go=1.16 && go mod tidy -go=1.17 && go install -a && collector -exchange=Bitfinex -mode=current -pairsfile=true

  • Bittrex: go mod tidy -go=1.16 && go mod tidy -go=1.17 && go install && collector -exchange=Bittrex -mode=current -pairsfile=true

  • CoinBase: go mod tidy -go=1.16 && go mod tidy -go=1.17 && go install && collector -exchange=CoinBase -mode=current -pairsfile=true

  • MEXC: go mod tidy -go=1.16 && go mod tidy -go=1.17 && go install && collector -exchange=MEXC -mode=current -pairsfile=true

Decentralized:

  • PlatypusFinance

  • Orca: go mod tidy -go=1.16 && go mod tidy -go=1.17 && go install && SOLANA_URI_REST=https://try-rpc.mainnet.solana.blockdaemon.tech/ collector -exchange=Orca -mode=current -pairsfile=true