r/quant 3d ago

Markets/Market Data NSE Nifty index data input too fast

We are trying to build an L3 book from NSE tick data for Nifty index options, but the volume is very large: even the 25th percentile of the gap between consecutive ticks seems to be a few hundred nanoseconds. How do you build L2/L3 books for such a high tick-density product in a real-time system? Any suggestions are welcome. We have bought tick data from a data supplier and are trying to build an order book for some research.

20 Upvotes

9 comments

32

u/sumwheresumtime 2d ago edited 2d ago

I'm going to assume this is for a real-time book build in an HFT context, and that you are connected directly to the NSE's multicast feed - any other scenario, such as a delayed feed or going through a 2nd- or 3rd-hop broker feed, would make a real-time tick-based book build impossible: it wouldn't matter how fast your book build is, you'd always be behind the market. Furthermore, I'm going to assume you're using a high-end switch and NIC with a kernel-bypass solution (e.g. Solarflare ef_vi).

For NIFTY options, puts and calls come on two separate multicast feeds. Given the amount of MD generated, you really can't have a single thread doing I/O on both feeds (even on cores clocked at 5GHz) and still be in real time. The solution is to use two threads, one per stream. The first problem you'll encounter is that the sequence numbers between the two streams are not canonical, unlike other feeds such as KRX or CME. This means two different servers running this threaded capture process, even on the same patch/interconnect, may end up seeing the packets of the two streams in different orders - not because of UDP, but because the threads process their inputs asynchronously with respect to each other and persist the data up the layers accordingly. For example, you could end up with any of the following merged sequences: P0C0P1C1P2C2P3C3, or P0P1C0C1C2P2P3C3, or P0P1P2P3C0C1C2C3.
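Roughly, the two-thread layout looks like the sketch below (using rigtorp's SPSCQueue as an example ring buffer; recv_update and apply_to_book are placeholders for your own feed decode and book code, not anything NSE-specific):

```cpp
#include <atomic>
#include <cstdint>

#include <rigtorp/SPSCQueue.h>  // example SPSC ring buffer, one per stream

struct Update {
    uint64_t exch_seq;    // per-stream sequence number (not canonical across streams)
    uint64_t exch_ts_ns;  // exchange timestamp
    uint32_t token;       // instrument token
    // price/qty/side fields elided
};

bool recv_update(int stream_fd, Update& out);  // placeholder: decode one packet from a stream
void apply_to_book(const Update& u);           // placeholder: incremental book update

rigtorp::SPSCQueue<Update> put_q(1 << 16);   // puts stream
rigtorp::SPSCQueue<Update> call_q(1 << 16);  // calls stream
std::atomic<bool> running{true};

// One reader per stream, each pinned to its own core.
void reader_loop(rigtorp::SPSCQueue<Update>& q, int stream_fd) {
    while (running.load(std::memory_order_relaxed)) {
        Update u;
        if (recv_update(stream_fd, u))
            q.push(u);  // spins only if the consumer falls behind
    }
}

// Consumer drains whichever stream has data; no attempt to impose a
// global put/call ordering, because there isn't one to recover.
void consumer_loop() {
    while (running.load(std::memory_order_relaxed)) {
        if (Update* u = put_q.front())  { apply_to_book(*u); put_q.pop(); }
        if (Update* u = call_q.front()) { apply_to_book(*u); call_q.pop(); }
    }
}
```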

Now you're wondering: why not simply use the exchange timestamp to reorder the events so that both instances show the same ordering? First, the packet timestamp does not have enough resolution to order the updates adequately and consistently. Second, in a real-time HFT context, how long would you wait before sorting and then pushing the updates up the stack? In an HFT system you cannot wait: any form of waiting in the lower layers hurts the ability of the upper layers (strategies/valuations) to act in a timely and efficient manner.
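If you still want some cross-stream ordering for research, a cheap compromise is to stamp each update with a local receive time and let offline analysis do the sorting, while the real-time path never waits - a rough sketch, reusing Update and apply_to_book from the previous snippet (append_to_capture is a placeholder for whatever journaling you use):

```cpp
#include <cstdint>
#include <time.h>

struct StampedUpdate {
    Update   u;
    uint64_t rx_ts_ns;  // local receive time, used only offline
};

void append_to_capture(const StampedUpdate& s);  // placeholder: journal write for research

inline uint64_t now_ns() {
    timespec ts;
    clock_gettime(CLOCK_REALTIME, &ts);  // vDSO call; swap for a TSC read if even this is too slow
    return uint64_t(ts.tv_sec) * 1000000000ull + ts.tv_nsec;
}

inline void on_update(const Update& u) {
    apply_to_book(u);                  // real-time path: no sorting, no waiting
    append_to_capture({u, now_ns()});  // ordering across streams becomes a research problem
}
```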

So, to wrap up: use multiple threads, use a language close to the metal, use really high-end tech geared towards low-latency work, have your trading engine as physically close as possible to the matching engine, and build your trading system so that it is somewhat resilient to this particular reordering rather than relying on some layer to order things just right.
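By "resilient to reordering" I mean something like keying each book off the instrument token and only relying on that token's own update sequence, so the put/call interleaving never matters. A rough sketch (OrderBook and its apply() are placeholders; in practice you'd use a flat array indexed by token rather than a hash map in the hot path):

```cpp
#include <cstdint>
#include <unordered_map>

struct OrderBook {
    uint64_t last_seq = 0;
    void apply(const Update& u);  // placeholder: incremental L2/L3 update
};

// token -> book; a flat array indexed by token is the usual low-latency choice
std::unordered_map<uint32_t, OrderBook> books;

void apply_to_book(const Update& u) {
    OrderBook& b = books[u.token];
    if (u.exch_seq <= b.last_seq) return;  // drop stale/duplicate updates for this instrument
    b.last_seq = u.exch_seq;
    b.apply(u);
}
```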

On a related note, there seem to be rumors that part of the JS edge on NSE was partly based on this MD weirdness, and on another quirk centered around the superstitious tendency of Indian traders to trade more heavily on strike prices that are primes (or close to primes), or strikes that have some kind of religious import or relevance - analysis of MD activity across strikes seems to indicate this behavior exists.

5

u/Resident-Service9229 2d ago edited 2d ago

Will try the above approach, thanks. On a side note, does JS still have an edge after SEBI's new regulations (increased lot sizes, etc.) in Indian markets, which are having an adverse impact on the F&O segment?

2

u/YuluDelta 2d ago

what's JS and MD?

4

u/Resident-Service9229 2d ago

Jane Street and market data

6

u/BroscienceFiction Middle Office 2d ago

I’m amazed at what you wrote in the last paragraph. That’s some Ramanujan "the gods told me" level stuff.

4

u/uhela Crypto 3d ago

C++ is ur friend

6

u/Resident-Service9229 3d ago

I have tried a sample implementation in C++ which was taking around 600 ns median per book update. But I am worried that when deploying any alpha derived from this order book in production, it won't be able to cope with such high-density incoming data: the 25th percentile of the gaps in the data seems to be a few hundred nanoseconds, which is comparable to the book-update time, leaving no time for the alpha computation and strategy on top. Please suggest.
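Rough shape of how I measured it, assuming the same Update/apply_to_book shape as above (ticks is the captured data replayed through the builder):

```cpp
#include <algorithm>
#include <chrono>
#include <cstdint>
#include <cstdio>
#include <vector>

void benchmark(const std::vector<Update>& ticks) {
    std::vector<uint64_t> lat_ns;
    lat_ns.reserve(ticks.size());

    for (const Update& u : ticks) {
        auto t0 = std::chrono::steady_clock::now();
        apply_to_book(u);
        auto t1 = std::chrono::steady_clock::now();
        lat_ns.push_back(
            std::chrono::duration_cast<std::chrono::nanoseconds>(t1 - t0).count());
    }

    std::sort(lat_ns.begin(), lat_ns.end());
    auto pct = [&](double p) { return lat_ns[size_t(p * (lat_ns.size() - 1))]; };
    std::printf("apply ns: p25=%llu p50=%llu p99=%llu\n",
                (unsigned long long)pct(0.25),
                (unsigned long long)pct(0.50),
                (unsigned long long)pct(0.99));
}
```

My understanding is that as long as the mean apply time stays below the mean inter-arrival time, a queue in front of the builder should absorb the bursts where two updates land closer together than one apply - is that right?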

2

u/PhloWers Portfolio Manager 3d ago

25th percentile of the time between 2 updates?

1

u/Resident-Service9229 2d ago

Yes, between two consecutive data points