Category Archives: Early Stage

Invest in Biotech and Information Sciences

If we invest heavily in biotechnology and information sciences companies (especially genomics, networked centralized computing, neurology, neural network predictive applications, and nerve regeneration) over the next 50 years, many people alive today may have an opportunity to achieve substantially improved quality of life, longer lifespans, and indefinitely extended sentience.

It’s more than a financial return, but it can still be evaluated financially. The return on these investments should be calculated as the return on the securities themselves, plus the return on your other investments over the period by which your life and investment horizon are extended. It is possible, then, that the net return on biotech and information science investments may be substantially higher than the direct change in value of those securities.
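As a rough illustration of that accounting, here is a minimal sketch in Python. The growth rates, portfolio sizes, and the number of extra years of horizon are hypothetical placeholders, not estimates.

    # Hypothetical illustration: the "net" return on a longevity-related investment
    # counts both its own appreciation and the extra years of compounding it buys
    # for the rest of the portfolio. All figures below are made-up placeholders.

    def future_value(principal, annual_return, years):
        """Compound a principal at a fixed annual return for a number of years."""
        return principal * (1 + annual_return) ** years

    biotech_stake = 10_000       # amount invested in biotech / info-science securities
    other_portfolio = 90_000     # the rest of the portfolio
    biotech_return = 0.07        # assumed annual return on the biotech stake
    portfolio_return = 0.05      # assumed annual return on everything else
    base_horizon = 30            # investment horizon without any life extension
    extra_years = 10             # hypothetical extra years of life/horizon gained

    direct_gain = future_value(biotech_stake, biotech_return,
                               base_horizon + extra_years) - biotech_stake

    # Gain on the rest of the portfolio attributable purely to the longer horizon
    extended = future_value(other_portfolio, portfolio_return, base_horizon + extra_years)
    baseline = future_value(other_portfolio, portfolio_return, base_horizon)
    horizon_gain = extended - baseline

    print("direct gain on the securities:", round(direct_gain))
    print("extra gain from the extended horizon:", round(horizon_gain))
    print("net gain credited to the biotech stake:", round(direct_gain + horizon_gain))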

SOAP, .net, and the ubiquitous internet cloud

Microsoft’s recent major push to develop the .net platform is an attempt to aggregate and brand all internet services into the Windows operating system. And it just might work. SOAP and .net are sometimes referred to as a “cloud” because of the distributed nature of the processing: your machine accesses a server that renders your display from interface components and applications potentially stored on other machines anywhere, controlled by anyone. I think this is a key new technology and that it will play an important role in the development of communications technology over the next few years.

What is this technology? And where does it take us?


SOAP and .net use techniques that enable distributed computing and web serving. In other words, they allow web applications to run on independent computers, independent of the look and feel of the site that presents them. Application developers will want to adopt this technology because they can focus on the application and spend less time on the user interface. Portals will want to adopt it because they can integrate many external services and make them available to their users. Microsoft, I believe, is building this technology into the Windows operating system so that any internet application can be run without leaving the Microsoft-controlled environment.
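To make the idea concrete, here is a minimal sketch, in Python, of what a SOAP call looks like on the wire. The endpoint, namespace, and GetQuote operation are hypothetical placeholders rather than a real service; a real service publishes these details itself.

    # Minimal sketch of calling a SOAP 1.1 web service over plain HTTP.
    # The endpoint, namespace, and operation (GetQuote) are hypothetical.
    import urllib.request

    ENDPOINT = "http://example.com/stockquote"            # hypothetical service URL
    SOAP_ACTION = "http://example.com/stockquote/GetQuote"

    envelope = """<?xml version="1.0" encoding="utf-8"?>
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Body>
        <GetQuote xmlns="http://example.com/stockquote">
          <Symbol>MSFT</Symbol>
        </GetQuote>
      </soap:Body>
    </soap:Envelope>"""

    request = urllib.request.Request(
        ENDPOINT,
        data=envelope.encode("utf-8"),
        headers={
            "Content-Type": "text/xml; charset=utf-8",
            "SOAPAction": SOAP_ACTION,
        },
    )

    # The response is another XML envelope; any client (portal, desktop
    # application, wearable device) can parse it and render the result
    # with its own user interface.
    with urllib.request.urlopen(request) as response:
        print(response.read().decode("utf-8"))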

There is a programming design heuristic based on the separation of model, content, controller, and view. SOAP is analogous in that it enables the model to be separated out. The dominance of this design in programming has been very strong, and the dominance that SOAP's separation enables will likely be just as strong. Effectively, the potential of XML is captured through the definition of protocols for the distributed exchange of applications.
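Here is a small sketch of that separation in Python, using the model, view, and controller pieces. The remote model stands in for a SOAP-backed service like the one sketched above; the class names and the hard-coded quote are illustrative assumptions, not any real API.

    # Sketch of the separation described above: the "model" lives behind a
    # remote service, while the controller and view stay local.

    class RemoteQuoteModel:
        """Model: owns the data, which may live on someone else's server."""
        def fetch(self, symbol):
            # In a real system this would issue the SOAP request shown earlier.
            return {"symbol": symbol, "price": 42.0}

    class TextView:
        """View: decides how the data looks; knows nothing about where it came from."""
        def render(self, quote):
            return f"{quote['symbol']}: {quote['price']:.2f}"

    class QuoteController:
        """Controller: wires user input to the model and the result to the view."""
        def __init__(self, model, view):
            self.model, self.view = model, view
        def show(self, symbol):
            return self.view.render(self.model.fetch(symbol))

    print(QuoteController(RemoteQuoteModel(), TextView()).show("MSFT"))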

What is the risk?

If .net is successful in becoming the dominant channel for web services, then there is a strong likelihood that Windows will combine the operating system and portal functionality to provide the complete computing experience. Microsoft will have the ability to target services, advertising, applications, communications, and other information to each individual. Further, web services will be conveniently available through integrated Windows applications, reducing the need for browser-based web access. Specifically, it will mean that the internet will be able to be re-faced with a Microsoft-branded front-end, and a selection of web services defined by Microsoft.

What is the potential?

In order for .net to become only one of many popular web service aggregators, the SOAP protocol must never give Microsoft an advantage over other aggregators. If SOAP (which stands for Simple Object Access Protocol) remains open in such a way that any portal can aggregate any SOAP-enabled web service, then the result will be wonderful. Specifically, it will mean that the entire internet will be able to be re-faced with a customizable front-end, and your selection of web services will be personalized and context-dependent based on specifications you select.

Centralized Computing Platform

The world needs a platform for centralized computing that enables anyone to commercially publish their intellectual property through any networked device, using their own interface. This platform could be supplemented with advanced and semantic search across all of that IP, as well as access to any distributed web service through SOAP.

Wearable Computing

Digital Convergence is about accessing the functionality of a broad array of devices from fewer, more pervasive devices. The logical result of SOAP, wireless connectivity, open source software, and increasingly compact hardware is a trend toward a small wearable computer with access to any web service, including personal information, through a customizable interface. In combination with remote device control, biofeedback input devices, and systems for enhancing senses, the implications are astounding.

Centralized network computing will win

I know it’s a big debate right now, but centralized network computing will win in the end.

Centralized network computing is the term used to describe a system of networked web servers (or a single web server) that provides integrated applications and storage for multiple users who can access the system through distributed terminals. The system can be geographically distributed or not, but will share a common (integrated) network of applications, probably using a software interface standard to encourage and enable multiple independent application development teams.
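Here is a minimal sketch of what such a software interface standard could look like, assuming a hypothetical CentralApp contract that independent teams implement and register with the platform; none of these names come from a real product.

    # Sketch of an interface standard a centralized platform might publish so
    # that independent teams can plug applications into it.
    from abc import ABC, abstractmethod

    class CentralApp(ABC):
        """Contract every hosted application implements for the platform."""

        name: str

        @abstractmethod
        def handle(self, user_id: str, request: dict) -> dict:
            """Process one user request and return a displayable result."""

    class CalendarApp(CentralApp):
        name = "calendar"

        def handle(self, user_id: str, request: dict) -> dict:
            # Real storage would live in the platform's shared data layer.
            return {"user": user_id, "events": [], "action": request.get("action")}

    REGISTRY: dict[str, CentralApp] = {}

    def register(app: CentralApp) -> None:
        """The platform aggregates independently developed apps into one system."""
        REGISTRY[app.name] = app

    register(CalendarApp())
    print(REGISTRY["calendar"].handle("alice", {"action": "list"}))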

Centralized networks are inevitable because of self-reinforcing competitive advantages. Economies of scale and market forces will lead to substantial change in the way we compute, and the systems we use now are simply the seeds that will grow into (and merge together to form) global centralized information service providers. There are already some very strong indicators that this trend is happening, and the potential points to this trend being a very long one.

  1. There are economies of scale in processing. Load balancing can optimize processor utilization and provide both faster interactivity and reduced hardware investment (see the sketch after this list).
  2. There are competitive advantages in information and application aggregation. Integrations can break down the walls between programs, improving functionality through better integration and data sharing. You can analyze data in more dimensions and with more flexibility. Development rates can improve as it becomes possible to separate more software components, and the people who work on them.
  3. Load balancing improves transmissions. Transfer rates improve because fewer nodes are required and data traffic can be optimized. Information served to you from a server at your edge is faster and more reliable than information sent to you from another server through your edge.
  4. End-user transparency. The front-end possibilities under centralized computing are no more limited than those of other systems. This implies that there will not be a selection bias away from centralized systems, because end users will not prefer or even recognize the difference between systems. That is not to say that they will all be the same – only that they all could be. The opportunity set in one system is available in the other.
  5. The outsourced storage industry exists. This implies a willingness to adopt on the part of the owners of data.
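As referenced in item 1, here is a minimal sketch of the load-balancing idea, assuming a hypothetical pool of servers tracked by current load; it illustrates the principle, not any particular product.

    # Minimal illustration of load balancing across a centralized server pool:
    # each request goes to the least-loaded server, so processors stay evenly
    # utilized and less total hardware is needed for the same peak demand.
    # The server names and loads are hypothetical.

    servers = {"node-a": 0, "node-b": 0, "node-c": 0}   # current open requests

    def dispatch(request_id):
        """Send a request to whichever server currently has the least work."""
        target = min(servers, key=servers.get)
        servers[target] += 1
        return target

    assignments = {req: dispatch(req) for req in range(9)}
    print(assignments)   # requests spread evenly across the three nodes
    print(servers)       # each node ends up with roughly the same load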

You can see the markets already rewarding companies that are moving to take advantage of this trend. Many of these companies provide application services along with ISP connectivity, and they are capturing traffic. The users behind that traffic invest time and thought in signaling their own preferences: personalizing page layout and content, and often even using system-wide wallets and e-mail. Giving users what they prefer is a huge competitive advantage. The time it takes to personalize a competing system is a high transaction cost, especially relative to the low cost of inertia.

Eventually, you will be using only a browser. All your computing will occur on a centralized system and be sent to your browser for translation into your interface. All of your applications will be centrally hosted, so your profile and applications – essentially everything you do on your computer – will be available to you from any machine, at any time.

Multiple systems will compete for scale, reducing marginal costs and creating meaningful and irreproducible competitive advantages. This race will likely be won permanently by 2050. Before that time, ASP services will consolidate rapidly along with economic cycles. The early players will rely on loss-leader models to attract user bases, and will transition to profitability as their scale reaches the tipping point. The companies that make it to profitability first and reinvest in their technology platform will improve their integration, breadth, and quality to further support their competitive advantages.

In the first decade or two of this trend, there will probably be dozens of smaller companies that are able to enter and gain market share against their larger competitors. These companies will have competitive advantages most likely based on data storage, traditional media integration, wireless adoption, software platform architecture, applications suite integrations, and possibly international comparative advantage. After 20 years, the marginal advantages possible from these characteristics will not pose a meaningful threat to the aggregation and scale advantages of the top few market participants.

Consolidate or die will be the mantra of information companies.

Forecast Changes in Asset Values Based on Measurable Cultural Influences

The sum total of the media contributed to the internet over a period approximates the attention of society during that period. Analysis of this media therefore indicates trends in attention and preferences, and those trends create signals about the direction of values for securities and other assets. Recommended starting points for analytical comparison are the frequency and trend direction of keyword usage, the volume of content in certain classifications, and the level and type of media file contributions, each compared against sector and industry pricing trends.
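A minimal sketch of that comparison, assuming an invented keyword-mention series and an invented sector index purely to show the mechanics:

    # Compare a keyword's usage trend on the internet against a sector price
    # trend over the same periods. The counts and prices are placeholders.

    keyword_mentions = [120, 150, 180, 260, 310, 400]   # e.g. monthly "genomics" mentions
    sector_prices    = [100, 103, 110, 118, 131, 150]   # matching sector index levels

    def pct_changes(series):
        """Period-over-period percentage changes."""
        return [(b - a) / a for a, b in zip(series, series[1:])]

    def correlation(xs, ys):
        """Plain Pearson correlation, no external libraries."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        vx = sum((x - mx) ** 2 for x in xs) ** 0.5
        vy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (vx * vy)

    attention = pct_changes(keyword_mentions)
    pricing = pct_changes(sector_prices)
    print("correlation of attention trend with pricing trend:",
          round(correlation(attention, pricing), 3))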

Competition in the information age

Consolidation is the result of economies of scale – essentially horizontal integration, vertical integration, and resource sharing. These methods create competitive advantages in powerful ways that make it difficult for smaller players to compete in the same markets. There is nothing necessarily wrong with this trend, but it creates large barriers to entry and often leads to larger profit margins than would be otherwise possible.

In the information age – yes, now – this effect is greatly increased, and the limitations of transportation and capacity have been eliminated. Integrating and sharing resources is much easier, and new extra-strength synergies are created. For example, if a website allows you to shop for both books and music, then it is possible to tailor your music shopping experience based on your book purchasing preferences. This is a very simple example of a much more powerful trend. It may be impossible to enter into any sort of competition with the large information companies 20 years from now.
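A toy illustration of that book-to-music synergy: purchases in one category inform recommendations in another. The purchase history and the topic-to-genre mapping below are invented for the example.

    # Cross-domain recommendation sketch: book purchases feed music suggestions.

    book_purchases = ["cyberpunk novel", "jazz biography", "another jazz biography"]

    # Hypothetical mapping from book topics to music genres
    topic_to_genre = {"cyberpunk": "electronic", "jazz": "jazz"}

    def recommend_music(purchases):
        """Count topic hits across book purchases and suggest matching genres."""
        scores = {}
        for title in purchases:
            for topic, genre in topic_to_genre.items():
                if topic in title:
                    scores[genre] = scores.get(genre, 0) + 1
        return sorted(scores, key=scores.get, reverse=True)

    print(recommend_music(book_purchases))   # ['jazz', 'electronic']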

You can already see it beginning to happen: Yahoo builds from scratch any web business that seems to make sense. Then, because of its existing market coverage and its ability to integrate new businesses with existing businesses and data, Yahoo captures so much synergistic value that it gains an insurmountable competitive advantage. In this way, I think that Yahoo and the other major aggregators and integrators are great companies.

There are risks. Big ones. And the FTC may not be able to do anything about it.

It may be inevitable that the consolidation will lead to a stable equilibrium under monopoly – where there would be no reason to be a competitor because the types of services being provided rely on historical information and broad business integration that is impossible to recreate or beat. Then this monopolist would have virtually limitless pricing discretion, and the ability to manipulate markets and cultures in unprecedented ways. Humanity, in many ways, would be at the mercy of the monopolist. (I hope that its leaders are benevolent democrats with philosophically sound motivations and long time horizons – but what if they are not?)

The only way to eliminate this market dynamic is to eliminate the factors that make it possible, namely, the opportunity to use your market dominance in one field to create dominance in another field. More specifically, eliminate the competitive advantage created by archival data. This can be accomplished by sharing archival data freely. But what about my privacy? Good question. We have a big problem here. The private information about you and your preferences plays a large role in creating the value that leads to this consolidation. If you want to eliminate this competitive advantage, then you either eliminate the value or you share private information.

There is another way.

What if users owned their own archival data? Amazon could still track my clickstreams and do whatever it wanted with them. But I would also be tracking my own use, and would have control over my own preferences and historical data. Amazon would quickly learn that its personalization algorithms produce much more valuable customization from the user's own data than from the Amazon archives, and that implication is what makes market entry viable for such a standard. Now what happens if you go to a small competitor, one with little history but better value than the others? It would be able to provide you with services that take advantage of your archival data, just as the monopolist would have. Competition is restored, and the advantages for humanity are regained as well.

Somebody should create a standard – probably using an XML document editable from within your browser. I’d love to help. Somebody has to do it eventually, and the sooner the better for all of us (except the monopolist, of course!)
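As a sketch of what that XML document might contain, here is a hypothetical user-held file and the few lines any site would need to read it; the element names and the userdata format are assumptions, since no such standard exists yet.

    # A user-owned preference record: a small XML document the user keeps and
    # edits, which any site can read (with permission) to personalize itself.
    import xml.etree.ElementTree as ET

    USER_DATA = """<?xml version="1.0"?>
    <userdata owner="alice">
      <preferences>
        <preference domain="books" value="science fiction"/>
        <preference domain="music" value="jazz"/>
      </preferences>
      <history>
        <purchase site="amazon.com" item="jazz biography"/>
      </history>
    </userdata>"""

    def load_preferences(xml_text, domain):
        """Return the user's stated preferences for one domain, e.g. 'music'."""
        root = ET.fromstring(xml_text)
        return [p.get("value")
                for p in root.find("preferences")
                if p.get("domain") == domain]

    # Any site, large or small, could call this against the user's own document
    # instead of relying on its private archives.
    print(load_preferences(USER_DATA, "music"))   # ['jazz']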