Governing the Algorithm: Implications of SEBI's Proposal for Retail Algorithmic Trading: Part II
Abhishek Sanjay and Tanya George*
The foregoing analysis in Part I demonstrated that SEBI’s decision to permit retail participation in algorithmic trading, while ostensibly aimed at enhancing market accessibility, introduces substantial risks that remain unaddressed. The regulatory framework, as it stands, lacks the safeguards needed to preempt the destabilizing effects of high-frequency, algorithm-driven trades, and the evidence suggests that the assumption that such trading enhances liquidity in all contexts is, at best, incomplete. These concerns necessitate a closer examination of the adequacy of existing regulatory mechanisms and the extent to which they can adapt to the complexities of algorithmic trading. Part II of the paper turns to this question, assessing the gaps in oversight and outlining a regulatory approach that seeks to reconcile technological advancement with market integrity.
I. REGULATORY CHALLENGES IN GOVERNING RETAIL ALGORITHMIC TRADING
While SEBI’s initiative to extend algorithmic trading to retail investors is ostensibly aimed at creating a level playing field, achieving such parity appears implausible and poses considerable regulatory challenges. At the outset, it is important to differentiate between the governance needs of retail and institutional players. While institutional traders are already covered under the present regime, retail investors, who primarily fund the small-cap stock market, receive hardly any mention.
The primary challenge faced by regulatory bodies is the detection of illegal trading. Even advanced investigative agencies such as the CFTC and the SEC took almost five years to catch the culprit behind the 2010 flash crash. The simple reality is the vast disparity between the technological capabilities of regulatory bodies and the sophisticated algorithms employed in today's financial markets. Algorithms continuously refine their datasets and trading strategies, often outpacing the surveillance systems designed to detect market abuses. In the Indian context, SEBI operates with a modest annual budget of approximately ₹250 crore. In stark contrast, large trading firms and high-net-worth individuals (HNIs) execute transactions that frequently exceed this figure in daily trading volumes. In fact, SEBI itself has admitted that it cannot stop all instances of manipulative HFT due to the sheer technological disparity.
Through a 2018 circular, SEBI prescribed permissible order-to-trade ratios (OTRs) along with penalties for exceeding them. OTRs appeal to regulators because they target HFT firms in aspects of their liquidity-supply activity that may both cause “message overload” and be used in manipulative attempts based on temporary bursts of order entries and cancellations. While OTRs are expected to reduce the frequency of “non-bona fide” orders, OTR caps have been shown to inadvertently discourage market makers from providing liquidity, particularly in illiquid securities. For instance, research by Hendershott and Menkveld reveals that stringent OTR limits reduce order book depth, as market participants place fewer orders to avoid penalties, thereby widening bid-ask spreads. Similarly, a 2018 study by the European Securities and Markets Authority (ESMA) observed that OTR restrictions in EU markets led to a measurable decline in liquidity for small-cap stocks. Moreover, the constraint is easily circumvented: a trader can generate a larger number of smaller trades so as to inflate the denominator of the ratio. OTRs have accordingly been shown to encourage traders to execute trades for the sole purpose of staying under the cap, as illustrated in the sketch below.
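To make the denominator-padding point concrete, consider a minimal sketch (with a purely hypothetical cap and illustrative figures, not SEBI's actual OTR slabs) of how the ratio is computed and gamed:

```python
# Illustrative sketch: OTR = messages sent per executed trade.
def order_to_trade_ratio(messages: int, trades: int) -> float:
    """Orders, modifications and cancellations divided by executed trades."""
    return messages / trades if trades else float("inf")

CAP = 50  # hypothetical regulatory cap, not an actual SEBI slab

# A strategy that floods the book: 10,000 messages, only 100 fills.
print(order_to_trade_ratio(10_000, 100) <= CAP)   # False: ratio = 100.0

# Same message flow, padded with 150 tiny trades executed solely to
# inflate the denominator: the flooding behaviour itself is unchanged.
print(order_to_trade_ratio(10_000, 250) <= CAP)   # True: ratio = 40.0
```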
Circuit breakers, designed to halt trading during significant price swings, have been integral to India's market stability since 2001. These mechanisms aim to prevent panic-induced irrational trading by temporarily suspending market activity when prices breach thresholds of 10%, 15%, or 20%, calculated daily based on the previous day’s closing level. However, empirical studies reveal that while circuit breakers aim to mitigate extreme volatility, they can inadvertently destabilize markets through a “magnet effect”. As prices approach circuit breaker thresholds, conditional price volatility intensifies, and returns become increasingly negatively skewed. This phenomenon stems from investor behaviour: anticipating a trading halt, market participants engage in aggressive selling to offload risk, exacerbating price declines rather than stabilizing them. The illiquidity risks imposed by trading halts further aggravate the situation. Investors, unable to rebalance portfolios during a halt, reduce their willingness to hold stocks, driving prices down further to attract new buyers.
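The trigger arithmetic itself is simple. A minimal sketch of the threshold check, assuming hypothetical index levels and omitting the time-of-day-dependent halt durations that the actual framework prescribes:

```python
# Sketch of the index circuit-breaker check: 10%, 15% and 20% trigger
# levels computed daily from the previous day's closing level. Actual
# halt durations also depend on the time of day and are omitted here.

def breaker_stage(prev_close: float, current_level: float) -> str | None:
    """Return the deepest breached threshold, or None if none is breached."""
    move = abs(current_level - prev_close) / prev_close
    for threshold in (0.20, 0.15, 0.10):          # check deepest first
        if move >= threshold:
            return f"{int(threshold * 100)}% breaker tripped"
    return None

print(breaker_stage(22_000, 19_500))   # ~11.4% fall -> '10% breaker tripped'
print(breaker_stage(22_000, 21_800))   # ~0.9% fall -> None
```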
As argued in an earlier section, SEBI’s kill switch mechanism is hardly adequate in the current market scenario. While SEBI’s initiatives undeniably reflect a proactive approach to refining the regulatory framework, the challenges outlined above persist. With these challenges in mind, the following section explores a possible way forward, drawing on comparative legislation and empirical data.
II. WAY FORWARD
To address the existing disparities between retail and institutional investors in algorithmic trading, it is imperative to consider policy interventions and regulatory frameworks. This section evaluates potential strategies aimed at fostering a more equitable trading environment, with a particular emphasis on enhancing market transparency, promoting fair access to trading technologies, and mitigating the risk of market manipulation.
a) The Location Dilemma
The technological prowess of algorithmic trading has displaced the notion that winning in the markets is a function of computing power alone; success is now essentially driven by communicative, or informational, power. This optimized mode of trading operates in two primary ways.
Firstly, algorithmic traders have the resources to compute micro-movements of information by profiling pending orders to ascertain the direction in which the market is moving, giving themselves the advantage of acting pre-emptively with the greatest precision, a make-or-break capability in this arena. This creates systemic latency arbitrage, which consistently allows algorithmic traders to stay one step ahead of the ordinary market by accurately anticipating its actions. Secondly, algorithmic traders can employ manipulative practices such as spoofing or quote stuffing, placing themselves in positions of power and profiting off the anticipated responses of the public, who are still attempting to process the sporadic shifts in the market. The latency advantage is further augmented by the countless examples of firms giving preferential treatment to algorithmic traders by providing them with information earlier.
This lays the foundation for the “finance financing finance” argument, which holds that the wealthiest investment firms will always be the front runners in the market because they hold the most technological prowess, thereby amplifying market trends in the directions they choose while the general population bears the losses. The problem with the lack of equitable co-location for retail investors alongside major investors is that so significant a disparity consistently allows major players to profit off micro-movements in the market at retail investors' expense. This is exemplified by the substantial investment required to become a co-located participant, stipulated to be upwards of ₹10 lakh. While major firms can easily absorb such costs, ordinary investors may not be able to do so. Further, co-location gives members access to the Tick-by-Tick (TBT) feed, which provides, at microsecond intervals, real-time details of the addition, modification and cancellation of orders and trades, details that the ordinary snapshots provided to members may omit.
This, in turn, raises concerns of market volatility and of adverse selection: trades are made against a counterparty who procures superior information using superior technology and quickly withdraws when the market is in need of liquidity. Further, while there has been a view that some latency advantage will always exist due to distance, a disparity of this size, as noted by the Madras HC in NSE v. SEBI, i.e., the NSE co-location scam case, clearly violates the fundamental principles of transparent price discovery and equal market access. The data clearly indicates the advantage conferred by co-location facilities, with a sizeable share of orders placed on exchanges originating from such facilities. Therefore, given the substantial costs associated with the lack of equitable co-location, these facilities must be provided to all market participants on fair and equitable terms.
A counter to this scenario is the establishment of equitable co-location, which would enable co-located traders to obtain information at the same speed, equidistant from the server, removing the aforementioned disparity. Previously, the NSE was found to have given preferential treatment to some of its co-located members, allowing them to make lucrative trades at the expense of ordinary investors. As per SEBI, exchanges must ensure equitable access to data feeds and similar latency for all co-located participants to ensure transparency. However, this still confines the latency advantage to co-located participants alone. As stated in the Justice Sodhi Committee Report, generally available information must be defined as information that is available on a non-discriminatory basis. Considering that the modern market chiefly rewards speed rather than skill, denying non-co-located participants access to full information while their already advantaged counterparts receive it would prima facie constitute a discriminatory practice.
In this regard, the authors argue that non-co-located and co-located participants must be given the same information from data feeds at the same time, ensuring transparency. In order to remove any conflicts of interest, co-location facilities may be regulated by independent third parties rather than the exchanges themselves, which have an obvious interest. Further, such facilities or proximity-hosting schemes must ensure that rack space is not monopolized by certain stockbrokers or data vendors; instead, a fair pricing model should be adopted to make co-location accessible to a broad range of traders, including smaller firms and retail investors. Additionally, exchanges must be required to disclose detailed information about their co-location services, including pricing structures, latency statistics, and allocation procedures, to ensure sufficient transparency. Another aspect of equitable co-location is the standardization of latency across all co-located participants, achieved by ensuring uniform hardware and network configurations, thereby preventing certain traders from gaining a speed advantage.
Furthermore, considering that SEBI may already be ill-equipped to handle all the challenges brought on by modern technological trading, the onus of self-regulation may be shifted to the facilitators of co-location systems, albeit with SEBI still functioning as a supervisory authority. This can be executed by placing the onus on firms and brokers to implement stringent risk management controls that mitigate the financial and regulatory risks associated with high-frequency trading, as is done under the US Market Access Rule, i.e., Rule 15c3-5.
An alternative to this cost-incurring mode of action lies in maintaining two queues, one for co-located and one for non-co-located orders, such that orders are picked up from each queue alternately (see the sketch below). This architecture would give orders generated from a non-co-located space a fair chance of execution. While taxation of co-location facilities has also been proposed as an alternative to equitable co-location, this approach may be flawed, as it fails to account for the fact that major firms could easily meet these costs. Further, such an approach does nothing to counter the disparity between major and small investors.
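A minimal sketch of this dual-queue dispatch, with illustrative order labels and queue names assumed for the example:

```python
# The matching engine draws alternately from a co-located queue and a
# non-co-located queue, so orders arriving over slower links still get
# a fair shot at execution. Order labels here are illustrative.

from collections import deque
from itertools import cycle

colocated = deque(["C1", "C2", "C3"])     # orders from co-located racks
non_colocated = deque(["N1", "N2"])       # orders from outside the venue

def alternate_dispatch(q1: deque, q2: deque):
    """Yield orders by alternating between the two queues until both drain."""
    for q in cycle((q1, q2)):
        if not q1 and not q2:
            return
        if q:
            yield q.popleft()

print(list(alternate_dispatch(colocated, non_colocated)))
# ['C1', 'N1', 'C2', 'N2', 'C3'] -> non-co-located orders are interleaved
```

The interleaving guarantees that a burst of co-located orders cannot starve the non-co-located queue, though it necessarily trades away some of the co-located participants' raw speed advantage.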
b) De-fragmenting the fragmented
Another alternative under discussion is the minimum resting period, propounded by Mary Schapiro. Under this proposal, there must be a minimum period for which an order remains open, and if the order matches with a sell counterpart during that period, a valid transaction results. This approach aims to address the manipulative practices advanced by algorithmic trades, acting as a deterrent against order cancellation and thereby persuading investors to engage only in non-artificial trades. However, while this attempts to modulate the information-transmission advantage, it becomes problematic where artificial adverse selection is concerned. Because the model prevents passive orders from being cancelled, it creates an avenue for them to be exploited and then sold at substantially higher prices. This would occur especially with algorithmic traders capable of processing information in fractions of a second, who would analyse the data, buy orders marked at stale prices, and profit from this regulatory gate. The result is wider bid-ask spreads and heightened transaction costs for ordinary retail investors, along with reduced liquidity, as aggressive orders by algo traders remove passive orders. Notably, no country has adopted such a model to date. While the Australian Securities and Investments Commission sought feedback on adopting this model, it ultimately opted against it.
While the resting-period approach may not bode well given its drawbacks, the authors believe that the spirit of the model, which seeks to organise and stabilize the fragmented market, is promising. One might argue that attempts to slow down algo-traders run against the very construct of algo-trading, which raises an interesting question: what is the right level of latency? On this premise, the Frequent Batch Auction (FBA) method might hold more benefit than the minimum resting period. This model addresses the latency-advantage problem by dividing the trading day into extremely frequent but discrete intervals and conducting batch auctions at those intervals to match orders. This dissolves the winner-takes-all speed game of algorithmic trading and hosts a system in which the best price wins rather than the fastest order. It also reduces the load on the server, as trades do not occur continuously. Furthermore, matching trades at random points of time within each interval would dissipate the anticipatory advantage held by traders and help prevent false orders placed with an intent to cancel, thereby targeting both major regulatory lacunae brought on by algorithmic trades. This method therefore shifts competition on speed to competition on pricing, enhances market liquidity and ends the arms race while simplifying the market computationally.
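A minimal sketch of one batch interval, assuming illustrative order data and a simplified uniform price (taken here as the midpoint of the last crossing pair rather than a full supply-demand intersection):

```python
# One batch interval of a frequent batch auction: orders accumulate,
# then cross together at a single price, so speed within the interval
# confers no priority. Order data are illustrative; the "uniform" price
# here is simplified to the midpoint of the last crossing pair.

def clear_batch(bids, asks):
    """Cross price-sorted (price, qty) bids and asks; return volume, price."""
    volume, price, b, a = 0, None, 0, 0
    while b < len(bids) and a < len(asks) and bids[b][0] >= asks[a][0]:
        qty = min(bids[b][1], asks[a][1])
        volume += qty
        price = (bids[b][0] + asks[a][0]) / 2     # simplified clearing price
        bids[b] = (bids[b][0], bids[b][1] - qty)
        asks[a] = (asks[a][0], asks[a][1] - qty)
        if bids[b][1] == 0:
            b += 1
        if asks[a][1] == 0:
            a += 1
    return volume, price

bids = [(100.2, 50), (100.1, 30), (100.0, 40)]    # sorted high -> low
asks = [(99.9, 20), (100.0, 60), (100.2, 50)]     # sorted low -> high
print(clear_batch(bids, asks))                    # (80, 100.05)
```

Because every order collected within the interval is crossed together, arrival speed inside the interval confers no priority; only price does.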
On a cost-benefit analysis of a continuous market versus a market with batch auctions, it is found that a continuous market, while producing a higher trading volume, entails higher price volatility, lower prices on average and a lower trading surplus. Further, under a continuous system, once the order book is depleted, any surplus orders are matched at progressively lower prices, creating a cascading reduction in the trading surplus and increasing crash severity. A drawback concomitant to this method is the mechanism it creates for order sniping, wherein market participants try to “snipe” stale quotes by sending a message to the exchange attempting to buy at the old ask price before the liquidity providers can adjust, thereby creating an arms race. However, the volatility this causes can be reduced in an ideal market by liquidity providers incorporating the cost of being sniped into the bid-ask spread they charge, a technical cost paid to sustain the market design. Another difficulty is how such a method would first be implemented. This may be answered by initially fragmenting the market into separate queues, allowing continuous trading alongside the FBA, with participants gradually shifting to the latter. The shift can be encouraged through a tiered taxation system wherein participants in the continuous trading system, who have access to better technological mechanisms, are taxed at higher rates than small investors trading on exchanges modelled on the Frequent Batch Auction method. This incentivizes them to switch to the FBA market and ensures that only persons on an equal footing operate within continuous markets, thereby reducing the latency advantage.
While the regulatory burden of ensuring strict implementation of this model is undeniably large, the benefits it provides appreciably outweigh it in the long run. The advantages concomitant to the Frequent Batch Auction method would therefore exceed its prospective disadvantages, which technological innovation may, in any case, help mitigate.
c) Other Alternatives
IOSCO’s proposed trading control mechanisms, which include trading halts, volatility interruptions, and limit-up/limit-down (LULD) systems, are designed to curb excessive price fluctuations and prevent disorderly markets. While India already employs circuit breakers at index level under SEBI regulations, these mechanisms could be refined by incorporating dynamic volatility controls at the individual-security level, similar to those used under the European Union’s Markets in Financial Instruments Directive II (MiFID II). This would allow real-time responsiveness to algorithm-driven volatility rather than reliance on broad-based circuit breakers that may not address security-specific disruptions. IOSCO also recommends harmonising trading controls across venues to enhance market stability, a particularly relevant measure for India given the presence of multiple exchanges such as the NSE, BSE, and MCX. Divergent trading controls across these platforms can create regulatory arbitrage; mandating uniform volatility control measures across all trading venues would prevent market distortions and ensure a consistent approach to risk mitigation, especially for interrelated instruments like derivatives and their underlying assets. Countries such as the United States have implemented cross-market trading halts through mechanisms like the Securities Information Processor, which aggregates real-time data across exchanges. In furtherance of IOSCO’s recommendation for standardised trade cancellation, SEBI could mandate a uniform framework for rectifying error trades, similar to the U.S. Securities and Exchange Commission’s “Clearly Erroneous Trade” rule, which allows exchanges to swiftly nullify trades that deviate from the prevailing market price by a predetermined percentage.
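A minimal sketch of such an error-trade check, assuming a flat 3% band for illustration (the SEC's actual thresholds vary by price tier and trading session):

```python
# Sketch of a "clearly erroneous trade" check: flag a trade whose price
# deviates from the prevailing reference price by more than a preset
# band. The flat 3% band is a placeholder, not the SEC's actual table.

def clearly_erroneous(trade_price: float, reference_price: float,
                      band: float = 0.03) -> bool:
    """True if the trade deviates from the reference by more than `band`."""
    return abs(trade_price - reference_price) / reference_price > band

print(clearly_erroneous(97.0, 100.0))   # False: exactly at the 3% band
print(clearly_erroneous(89.0, 100.0))   # True: 11% away -> nullify
```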
Simultaneously, taxation provides another avenue for implicitly regulating algorithmic trades, an approach adopted by market regulators in jurisdictions such as France (0.01%) and Italy (0.02%). The intent of this method lies in its implicit curbing of disruptive and manipulative market practices: such a tax weighs heavily on an algo trader engaging in a million trades per second while imposing only a marginal transaction cost on the everyday trader. However, a report by the European Commission indicated that the adoption of such a tax caused a 10% decline in liquidity.
India already levies a Securities Transaction Tax, albeit a negligible amount, which automatically applies to algorithmic trades. While algorithmic trades could be subjected to higher taxes to deter disruptive practices, India is still developing its algorithmic trader base. In a country where retail algo-traders are still getting their wings, imposing a tax at this point may disincentivize technological innovation. Adopting taxation as a mode of regulation at present may therefore prove harmful to the nascent stage of algorithmic trading in India.
While taxation may serve as a deterrent against manipulative trading practices, its broad application can impose direct financial burdens on all market participants. In contrast, solutions such as the FBA method and the tiered taxation system offer a better approach, targeting efficient market regulation rather than imposing an equal financial burden on unequally situated participants. These solutions aim to create a fairer trading environment while still allowing algorithmic trading to evolve, unlike taxation, which directly increases operational costs and may push firms, especially smaller or emerging players, away from investing in new trading technologies. The targeted approach encourages high-speed traders to move into a system that levels the playing field, ensuring that innovation is rewarded through improved market fairness and efficiency rather than being dampened by a blanket tax.
III. CONCLUSION
As shown above, the effects of algorithmic trading are dual-edged: its capacity to enhance liquidity and efficiency is juxtaposed with its propensity to increase volatility and facilitate manipulative practices. The authors' central argument is that while SEBI's move to admit retail investors into algorithmic trading is a significant step towards inclusivity and equity, it inadvertently amplifies systemic risks owing to insufficient regulatory oversight; the newfound complexities brought on by algorithmic trading accordingly necessitate a proactive regulatory approach to safeguard market stability.
Considering the palpable shift in the fundamentals of the market, SEBI has done well to open the doors to technological advancement incrementally so as to retain a higher level of regulatory scrutiny. However, as Geoff Mulgan has observed, there are two methods of engendering monetary prosperity: the first is innovation and the resourceful creation of value, and the second is to seize value through predation and other mala fide means. Unfortunately, human ingenuity, particularly in the wake of new technology, inclines towards the latter. Additionally, the Indian market remains susceptible to gross fluctuations on the basis of FDIs, FIIs and global market cues, potentially creating predictable patterns for algorithmic traders to game to their advantage.
The regulatory loopholes concomitant to technology are here to stay. It therefore becomes imperative to implement a framework that enables ex-ante regulation to account for mala fide technological innovation. As put forth in the preceding section, the authors propose adopting solutions attuned to Indian economic needs, developed through comprehensive research so as not to stifle technological development. Through regulatory mechanisms such as the Frequent Batch Auction method or equidistant informational mechanisms, SEBI may gain an initial advantage in countering potential technological threats aimed at disruption.
*Abhishek Sanjay is a 2nd-year B.A. LL.B. (Hons.) student at NALSAR University of Law. Tanya George is a 3rd-year B.A. LL.B. (Hons.) student at MNLU, Mumbai.