One topical subject is the need for restricting ‘naked’ short-selling. With immediate trading and settlement, this problem disappears. In order to strike a deal, the seller has to provide the securities and the buyer the money. The ownership will be recorded directly after the trade. Nobody short-sells or short-buys in this kind of system. Securities trading will fall back to the common situation on other markets: participants can only sell what they own.
This will affect liquidity by limiting everybody to dealing with their current portfolios. This might look like a severe limitation at first sight, but with immediate trading and settlement investors can turn around their portfolios any number of times during the day. The result of each trade can immediately be used for the next trade. This implies continuous immediate netting across the assets in the portfolios. For example, full-scale algorithmic trading will still be possible, but the algorithms have to respect the short-trading limitation of selling only what is in the portfolio and buying only with the money in the portfolio. If/when somebody assigns wrong parameters to an algorithm, trading will automatically stop when the assets in the portfolio have been sold.
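The portfolio constraint described above can be illustrated with a minimal sketch. All names here are hypothetical; a real system would operate on book-entry balances, but the rule is the same: sell only what is held, buy only with cash on hand, and update balances immediately after each trade.

```python
# Illustrative sketch of the "trade only within your portfolio" constraint.
# All function and field names are hypothetical.

def check_order(portfolio, side, asset, quantity, price):
    """Return True if the order can settle immediately from the portfolio.

    portfolio -- dict mapping asset codes (and 'CASH') to current balances
    side      -- 'SELL' or 'BUY'
    """
    if side == 'SELL':
        return portfolio.get(asset, 0) >= quantity
    if side == 'BUY':
        return portfolio.get('CASH', 0) >= quantity * price
    return False

def settle(portfolio, side, asset, quantity, price):
    """Apply an accepted order; ownership is recorded directly after the trade."""
    if not check_order(portfolio, side, asset, quantity, price):
        raise ValueError('order rejected: insufficient assets or cash')
    amount = quantity * price
    if side == 'SELL':
        portfolio[asset] -= quantity
        portfolio['CASH'] = portfolio.get('CASH', 0) + amount
    else:
        portfolio['CASH'] -= amount
        portfolio[asset] = portfolio.get(asset, 0) + quantity
    return portfolio
```

Because each settled trade updates the balances at once, the proceeds of a sale are immediately available for the next purchase, which is the "turn the portfolio around any number of times" property, and a misconfigured algorithm simply stops when the balances run out.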
Securities lending will be possible in this kind of environment. The stakeholders can/will agree upon suitable collateral for repayment. This provides the possibility for ‘covered’ short-selling in a secured way. The market-makers will also face the same constraint – they need to operate within their real-time portfolios and/or lending capabilities.
As the overall balance of a given security held by any custodian will always be the sum of the individual balances of its customers, the custodian can never run out of settlement assets in the form of securities. However, there will be an increased requirement for custodians and/or settlement banks to meet the immediate liquidity needs and fluctuations in money assets due to customer trades. As a consequence, custodians/settlement banks will need improved cash liquidity planning and sufficient intraday credit facilities at the central bank. The liquidity need will be netted by offsetting customer trades, which will even out intraday fluctuations.
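The netting effect mentioned above can be shown with a trivial sketch (names and data are illustrative): the custodian's intraday cash need is the sum of its customers' signed cash flows, so offsetting buys and sells largely cancel out.

```python
# Illustrative sketch: a custodian's net intraday cash need is the sum of
# its customers' signed cash flows. Buys consume cash, sells release it.

def net_cash_need(trades):
    """trades: list of (side, amount) tuples across the custodian's customers.
    Returns the net cash position; a negative result is the amount the
    custodian must fund, e.g. via intraday central bank credit."""
    flow = {'BUY': -1, 'SELL': +1}
    return sum(flow[side] * amount for side, amount in trades)
```

For example, customer buys of 100 and 50 against a customer sale of 80 leave a net funding need of 70, not 150, because the sale proceeds offset part of the purchases.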
Efficient, Interoperable STP
The industry has in the past initiated different straight-through processing (STP) projects, but these have failed because of the lack of interest by legacy service providers to make co-ordinated changes to their systems – or even agree on what the common standards should be. STP can only be realised when all systems in the processing chain use the same common transaction standards, address keys and processing conventions.
Making book-entry transfers requires basic sets of common standardised transactions. With the current level of automation, the necessary information is available in the system (in the cloud of interlinked servers), but it has to be forced into common interoperable standards in order to make it efficiently available for all processing phases. Within the SWIFT/ISO community, the standardisation work on transaction standards started with the ISO 15022 set of standards, which grew into the ISO 20022 standards. These are descriptions of metadata for processes and messages. In order for the transaction standards to become useful, the market participants also need to agree on actual interoperable addressing and reference standards, which route the transactions to the right processes and refer them to the correct data sets in the different databases. There is also a need for common processing conventions for these transactions.
The most important address information to agree upon would be a common custody account numbering for investors’ assets. We need to agree upon something that could be called an international custody account number (ICAN), which identifies any asset account in the same way as the IBAN identifies any money account within the EU area.
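Since the text proposes the ICAN as a direct analogy to the IBAN, one can sketch how such a number might be built and validated using the IBAN's own check-digit scheme (ISO 7064 mod-97-10). The ICAN itself is only a proposal, and the account layout below is purely illustrative.

```python
# Hypothetical ICAN sketch using IBAN-style mod-97 check digits.
# The account body format is an assumption for illustration only.

def _to_digits(s):
    """Map characters the IBAN way: digits stay, A=10, B=11, ..., Z=35."""
    return ''.join(str(int(c, 36)) for c in s)

def make_ican(country, account):
    """Compute check digits over account + country + '00' placeholder,
    then prepend country code and check digits, as IBAN does."""
    check = 98 - int(_to_digits(account + country + '00')) % 97
    return country + str(check).zfill(2) + account

def validate_ican(ican):
    """Valid when the rearranged number is congruent to 1 modulo 97."""
    return int(_to_digits(ican[4:] + ican[:4])) % 97 == 1
```

The mod-97 check detects any single-character error, which is the practical benefit of reusing the proven IBAN construction rather than inventing a new one.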
Each transaction processed in the network also has to be identified via a common structured identifier. This allows transactions to be traced through the system and matched exactly and individually. It can be compared to the parcel number created by a courier firm, for example, in order to trace any parcel in real time throughout shipment. In most cases, securities transactions carry much larger values than parcel shipments and would therefore require at least the same kind of real-time process monitoring. This kind of code could, for example, be called an international security transaction identifier (ISTI).
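A minimal sketch of what such an ISTI might look like follows. The field layout (country code, institution code, trade date, sequence number, check digits) and the mod-97 check are assumptions for illustration; the text proposes the identifier but not its structure.

```python
# Hypothetical ISTI sketch: a structured, unique transaction identifier
# that can be matched and traced at every processing step, like a parcel
# tracking number. Field layout and check scheme are assumptions.

import itertools

_seq = itertools.count(1)  # per-process sequence counter for uniqueness

def make_isti(country, institution, trade_date):
    """Compose country + institution + YYYYMMDD date + sequence + check digits."""
    body = f'{country}{institution}{trade_date}{next(_seq):010d}'
    digits = ''.join(str(int(c, 36)) for c in body)
    return body + str(98 - int(digits) % 97).zfill(2)

def validate_isti(isti):
    """Recompute the trailing check digits and compare."""
    body, check = isti[:-2], isti[-2:]
    digits = ''.join(str(int(c, 36)) for c in body)
    return str(98 - int(digits) % 97).zfill(2) == check
```

Every party in the chain could then reconcile and trace a transaction by this one key, exactly as a courier traces a parcel by its tracking number.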
In an automated environment, all participants have databases containing their view of the transaction process. The transaction processes are dialogues where initiated transactions will, in a specified timeframe (generally split seconds in a real-time environment and days in current batch systems), generate replies or confirmations. The originator would then need automated facilities to find the original information for reconciling, error processing and other purposes. This would require clear references created by each participant to be used all through a given chain of tasks. In a flat environment it would require references for the sending investor, the receiving investor, the sending custodian and the receiving custodian, which, in line with the other identifiers, could be called RFSI, RFRI, RFSC and RFRC (reference codes for sending and receiving investors and custodians). These references could have a very general structure (just a maximum length and control digits), making it possible for each participant to create the desired internal structures.
In addition to message and data field standards, modern technology allows for the development of common reusable application modules stored in a common library. Each stakeholder could pick the standardised modules they need from this type of library. (It is like a store of standardised ‘lego’ bricks everyone can use.) The programming task at individual stakeholders will decrease drastically when standardisation is extended to standardised processing modules.
The prerequisites for increased efficiency and competition are to get the very basic transaction ‘plumbing’ interoperable, based on common transaction and addressing standards. Without this concrete base, STP will be unachievable. In the same way as within the single euro payments area (SEPA), the market lacks incentives to co-ordinate these due to legacy privileges and barriers, and it seems inevitable that the required co-ordination can only be achieved via authority regulations (compare with the IBAN, ISO 20022, end-date, pricing and interchange fee regulations required for SEPA). A Europe-wide interoperability project for the processing of securities implies launching an electronic single European securities area (eSESA) undertaking, which should carry the e-prefix from the start to emphasise a new generation of processing technology.
Automated Tax Information
The member states have common interest in ensuring correct and efficient taxation of investor assets and their returns. This can be divided into two different tasks:
Automation of the taxation processes.
Incentivising and controlling actual tax payments.
Due to the increase in cross-border connections, tax collection is more and more a task requiring international co-operation among tax authorities. The very basic requirement for tax collection automation will be to create a taxpayer identifier for cross-border usage, which could be called the international taxpayer identifier (ITID). The structure of the ITID could use an IBAN analogy, that is, ITID = country code + check digit + current local tax identifier (TIN). In this way, each country could keep its current tax identifiers and the benefit would be an improved check digit feature. The tax authorities would provide the ITID to their citizens, who can then forward it to their service providers in the different parts of the EU (or even outside). This is a very basic requirement in order to make international tax collection efficient.
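Following the IBAN analogy given above, the ITID construction could be sketched as follows. The text proposes the overall layout (country code, check, local TIN); the use of two mod-97 check digits here is an assumption borrowed from the IBAN scheme.

```python
# Sketch of the proposed ITID: country code + check digits + existing
# national tax identifier (TIN), with IBAN-style mod-97 check digits.
# The two-digit check is an assumption taken from the IBAN analogy.

def make_itid(country, tin):
    """Wrap an existing national TIN into a cross-border ITID."""
    digits = ''.join(str(int(c, 36)) for c in (tin + country + '00'))
    check = 98 - int(digits) % 97
    return country + str(check).zfill(2) + tin

def validate_itid(itid):
    """Valid when the rearranged number is congruent to 1 modulo 97."""
    digits = ''.join(str(int(c, 36)) for c in (itid[4:] + itid[:4]))
    return int(digits) % 97 == 1
```

The local TIN survives unchanged inside the ITID, so each country keeps its current identifiers while gaining a cross-border check-digit feature, which is exactly the benefit the text describes.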
The necessary information for tax collection is always available in a book-entry infrastructure. Authorities need to create the incentives to make the proper reports using ITID coding. One simple way to create such incentive would be to levy higher tax rates on securities, returns and transactions where proper tax coding is omitted.
Next Generation Regulatory Policy
The basic objectives of market regulations will remain the same: sufficient competition, processing efficiency and stability. However, the future regulatory stance can select between two main alternatives:
- Continuing along the current path using current technology resulting mainly in moving from national monopolies/oligopolies to similar European level structures.
- Turning towards open competitive network structures tuned towards competition and technology efficiency.
These changes place an emphasis on efficiency and competition issues, as these are the areas of change that will also require changes in government policies. The flat and lean model presented here was based on the following common international policy:
- Same kind of clear-cut industry structures.
- Common interoperable standards.
- Openness and portability with equal stakeholders in an interoperable ‘level-playing-field’ network.
- Common license, supervision, etc, regulatory requirements.
The proposed development objective would imply profound changes by moving from national systems to truly international systems, from monopoly silos to open networks, from high barriers for service provider changes to efficient provider ‘portability’, from institutions with mixed functions/services to streamlined, single-purpose entities and from legacy batch technology to modern real-time transaction-based processing. The market and industry will face a completely new competitive environment. However, the full benefits can be achieved only via profound restructuring within a coherent overall new design, which has been the situation in many other industries facing large ‘generation’ changes.
One important policy issue related to enhanced competition is the possibility of cross-ownership among entities with different licenses. This could lead to ‘disguised’ silo constructions and thereby a risk for abuse of market power and cross-subsidisation. A strong pro-competition stance would require strong limitations over all kinds of cross-ownerships in the competitive processing chain.
Governments can select different policy stances to these developments, such as:
- Passively following market developments.
- Correcting afterwards negative developments.
- Pushing for developments in desired directions.
- Actively building parts of the new infrastructure.
As future policy changes are currently unclear, all of these stances will probably be used in parallel. An active authority push in desired directions could end the current maintenance of the inefficient status quo. It could speed up developments and benefits would be achieved earlier. A clear ‘next generation’ vision provided by governments, and their proposals for measures to reach the objectives would provide clarity to the market on expected developments. Today, developments seem to be at a standstill due to mixed signals, conflicting regulations and defensive actions of legacy service providers (such as increased consolidation). Governments will be in a very important position regarding the speed, content and customer benefits of developments in this industry. The legacy burden of the current industry structure is so heavy that it lacks the internal power for starting a restructuring process.
Introducing an Efficient Change Process
The efficiency of the change process itself will be an important factor in determining cost. A lengthy and unco-ordinated change process can increase the cost of change even dramatically. A lengthy process will also postpone the benefits of restructuring. A large variety of conflicting views voiced by stakeholders, whose positions in the processing chain will change due to the new structures, is typical of these kinds of change processes. In order for the change process to run smoothly, co-ordinated decisions have to be made and implemented. These include the design of the future processing chains and the transaction/data standards used in these.
Lengthy discussions on the design starting point are also typical, i.e. whether to build on current legacy designs or start from scratch. These decisions will also affect the implementation process, which can either be a stepwise co-ordinated update of current infrastructures or the construction of a new parallel infrastructure to which the volumes are shifted in a co-ordinated way. Moving to real-time processing in an open network environment will be such a radical change that updating old batch processes will most probably be an unfeasible solution with much higher costs than implementing a new parallel structure. In this case the process could start by extending the modern real-time trading platforms, adding real-time post-trading processes as immediate follow-ups to the trading processes.
All stakeholders, service providers, customers and authorities alike, need to be aboard in the design phase and especially the end-users, investors and issuers, as a major part of the overall benefits of a new infrastructure relates to STP automation benefits in the customer interfaces. Conflicting views are very hard to avoid in large restructuring projects, which will need government involvement as a neutral arbitrator of conflicting interests. Government involvement can be beneficial when the development speed can be enhanced, because the cost of change and foregone benefits will increase during a longer changeover period.
It will be important to clearly distinguish which parts of such a process will be government-driven and which market-driven. An efficient transfer from old to new technologies becomes more and more demanding because of the growing system integration and automated stakeholder relationships. A successful approach will therefore require the co-ordination of public and private efforts.
As it seems evident that this industry faces a major restructuring effort sometime in the future, the main question seems to be when to start this undertaking: now or later?