Citation

Internet Future Strategies: How Pervasive Computing Services Will Change the World

Author:
Amor, Daniel
Year:
2001

From the Book: Foreword

Trends in Information Technology

The combination of high-speed computers and intelligent devices is the exciting trend of information technology in the 21st century. So-called pervasive computing will provide smaller, faster and less expensive technology. Such devices, ranging from scientific instruments to online databases, will be completely interconnected by wired or wireless networks and accessible anywhere in the world. This will have major impacts on education, manufacturing, health care and other fields. These new trends are eagerly anticipated by modern industries such as biotechnology. On the other hand, there is a definite need to focus on users and their tasks rather than only on computing devices and technology.

Business and the idea of services

Effective business is based on the challenging principle of facing relentless competition and still being better than all of one's competitors together. To exaggerate this notion only slightly: every company that wants to be at the forefront, and remain there, must be able to offer and sell more products, of higher quality, in greater numbers and in shorter time periods than its competitors. And with better service. Especially with better service. This is what pervasive computing has to be about: providing services. It is about making them common on a day-to-day basis, up to the point where they become ubiquitous and yet, for the most part, unnoticed. We are already surrounded by all sorts of computing services in our daily life (e.g. flight reservation systems, electronic car systems). Enabling these services to use existing and future network functionalities is the next logical step, one that only a few companies have begun to think about, let alone started to implement. An understanding of technologies and their social, and therefore potentially economic, impact is essential for the success of every business concept. Companies have to integrate new technologies seamlessly into their existing conceptual framework to be able to survive the fierce competition that inevitably arises when there is money to be made. Services in the life-sciences and health care area, for example, are of strategic importance for the years to come. Pharmaceutical companies and doctors alike increasingly rely on third parties to answer specific questions that lie outside their usual knowledge base. Interestingly enough, a paradigm shift is taking place at the same time: biology is gradually moving from a bench-based to a computer-based science, while biotechnology is moving towards a robot-based, computer-instructed technology. The common point of both trends is that the specialized knowledge of a few experts is made available to a broad audience through interconnected knowledge management and the ability to share and transfer vast amounts of pharmaceutical, genetic, medical and other information. One outcome of this development is the emergence of industrial bioinformatics and a boost to the vitality of biotechnology companies.

Business meets Science

Companies have to handle business needs such as managing customer relationships, electronic commerce, supply chain management, and enterprise resource planning. Science, on the other hand, requires effective ways to organize, store, manage and retrieve the exponentially growing volumes of data that accumulate every day. Typically, advances in science lead to advances in technology, and these in turn enable business to create new products and services.
However, customers tend to be selective in their acceptance of new technologies, as the life span of base technologies shortens with every innovation cycle. Not all technologies will therefore be able to generate profit, and this is one of the major problems that solution engineers have to overcome today. Companies can improve their competitive positioning by using enterprise-wide computational systems with strong organizational relationships. Industrial bioinformatics applications are a very good example of a pervasive technology. Industrial bioinformatics is not simply based on science, invention or capital, but rather on the combination of these forces. Bioinformatics is an example of one of the emerging service industries that bring together resources, technologies, information and highly skilled workers to form an integrated, high-throughput environment. The effort needed to analyze billions of molecular interactions demands researchers highly skilled in physics, chemistry and computer science working together. Lone researchers are consequently being replaced by interdisciplinary teams working in more or less tightly coupled environments. Industrial bioinformatics is an intellectual fusion of biomedicine, automation technology and intensive computing, allowing us to scan biology in its entirety and to dig for answers in the mass of data. It involves partnerships between diverse disciplines: application scientists and engineers, biologists and doctors, applied mathematicians, computer scientists, and robot engineers. One advantage of industrial bioinformatics is the potential for broad corporate viewing of both data and data models, encouraging interactions among individuals and improving discoveries. The biotechnology industry is therefore rapidly developing and modifying the infrastructure, the robotics, the network management and so on required to handle these pervasive networks.

Service meets Customers

While the technological prerequisites have evolved and been improved continually, the most critical factor in business, the customers, still do not get the attention they deserve. Many companies had to learn the hard way that rethinking the relationship with the customer is a complex process subject to continuous development. Optimizing company processes alone is bound to fail whenever efficient customer relationship strategies are not applied consistently. But generating services is not enough. The general bottleneck is teaching the customer to access those services, understand them, and use them on a regular basis. Have a look at today's first steps towards integrated Internet services (which are mostly web based at the moment). Regardless of how intuitive and easy to use they may seem, there is no way to predict how the human factor will react in certain circumstances. Take, for example, a company that invests heavily in its online order technology. Normally, the last page a customer sees is a summary of the order and a big button labeled "Order Now!" which commits the order when pressed. In a few obscure cases, however, customers do not press this button but instead print out that very same order page from their web browser and fax the printout to the company. Needless to say, the company really should ask why customers behave like this; but these things do happen and will happen over and over again during the transitional phases of the business/customer relationship over the Internet.
One of the most effective strategies for preventing problems like the one above is called collaborative commerce. Collaborative commerce describes the interactions among and within organizations that surround a transaction. It improves satisfaction by meeting the customer's needs at the first point of contact and creates a competitive advantage by providing a continuous stream of services to customers. It has to be viewed as a critical piece of any business initiative.

Reacting to customer needs

Interestingly, the drive to use pervasive computing comes not from the IT people in the company but rather from the business process engineers and business evolution teams. What they see is an opportunity to offer new services, to offer additional services and to win new customers. A good example is the life science industry. Genomic information provides many potential targets for drug discovery. One of the challenges is to convert this information into drugs that treat diseases; diseases that affect people. Knowledge of the genetic factors will allow the development of drugs that deal with the roots of the disorders. The comprehensive knowledge of genetics available today is distributed worldwide across public and private databases and accessible via the Internet. In the future, tight linking of these resources will allow biomedical research to arrive at a molecular definition and diagnosis of diseases rather than the mainly clinical definition and diagnosis used today. The advantage will be cost-effective medicine and the ability to prevent particular diseases, ultimately benefiting the people actually suffering from them. The purpose of genes is to provide cells with instructions for molecular functions, e.g. making proteins that kill infected or degenerated cells. Biomedical researchers now need the expertise to merge information technology with science in a productive way. These new pervasive methods should be complemented with deep support and collaboration from experts in allied fields. There is an ever-increasing need for the development of new, more efficient and more sensitive computational methods as our understanding of the complex biological interactions within living organisms grows.

The ever-changing world

Modern biotechnology has brought the costs of biomedical studies down a hundredfold in the last ten years. As a result, functional genomics has spurred automation and miniaturization. Data generated in the high-throughput areas of biological research has to be processed using automated modular components to ensure (1) high-quality data, (2) low computation cost and (3) quick exchange of applications/modules in response to changing market conditions or the availability of new methods, programs and technologies. Flexibility of process support is a key requirement for current and future business applications of industrial bioinformatics. Properly deployed mobile computing allows this flexibility to be streamlined.
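The modular approach described above can be illustrated with a small sketch. The following is not from the book; it is a minimal, hypothetical Python example of a processing pipeline whose individual modules (here a quality filter and a normalization step, both with invented thresholds) can be exchanged without touching the rest of the chain.

    from typing import Callable, Iterable, List

    # A "module" is simply a function that transforms a list of measurements.
    Module = Callable[[List[float]], List[float]]

    def quality_filter(values: List[float]) -> List[float]:
        # Drop readings outside a plausible range (hypothetical threshold).
        return [v for v in values if 0.0 <= v <= 100.0]

    def normalize(values: List[float]) -> List[float]:
        # Scale readings relative to the largest remaining value.
        peak = max(values) if values else 1.0
        return [v / peak for v in values]

    def run_pipeline(values: List[float], modules: Iterable[Module]) -> List[float]:
        # Apply each module in turn; swapping a module changes the behaviour
        # of the pipeline without altering the surrounding code.
        for module in modules:
            values = module(values)
        return values

    if __name__ == "__main__":
        raw = [12.5, 250.0, 48.0, 97.3]  # raw high-throughput readings (invented)
        print(run_pipeline(raw, [quality_filter, normalize]))

Exchanging the normalization step for a different method, or inserting an additional module, only requires changing the list passed to run_pipeline.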
Pervasive computing brings the data from its source to the field, truly ubiquitously, through an enterprise data center, to those who need it most. Starting with the use of devices for managing corporate information such as sales orders, the devices will come to be used for a variety of applications such as inventory and medical computing, ending up in a data mobility model. In a production pipeline, for instance, where capturing barcode data plays an integral role, the mobile devices can be used for locating or tracking the customized product. Using an open wireless Ethernet, the data can be synchronized with the corporate network.

Managing the company information flow

For industrial computing and service applications, pervasive computing solutions are essential to extending the value of ERP systems. In a typical company, a number of technologies exist as islands of information with virtually nonexistent interconnection. The transfer of data to, from and within departments in a recognizable and secure format has challenged scientists and software developers. Custom interfaces are mostly expensive to develop and maintain. Furthermore, communication with corporate software solutions such as manufacturing execution systems (MES), enterprise relationship management (ERM), supply chain management (SCM), sales force automation (SFA), computer-aided selling (CAS), computer-integrated manufacturing (CIM), management information systems (MIS) and e-commerce requires intimate knowledge of each software component. One possible approach for linking two systems that both use a relational database is simply to use SQL directly to read and write the data between the two systems. Two major problems arise when doing this: (1) complexity and integrity problems rise exponentially when more than two systems are to be linked, and (2) more importantly, all security and business agreement rules contained within the applications are circumvented. In the past few years, however, the XML (Extensible Markup Language) standard has emerged as a widely accepted method for transferring data in business applications. A minimum of communication between service provider and service customer remains necessary. In some cases, this communication can be reduced to the point where fully automated processes can kick in (for example, with XML-formatted messages); other cases can only be handled through intensive coaching, e.g. via contact centers. Even at a higher level of automation, advanced search engines, intelligent agents, and researcher profiling tools must exist. Although intelligent systems approach some aspects of human capability, people are still the best adaptive general problem solvers. By implementing pervasive computing capabilities, one can dramatically improve mobile staff's access to ERP solutions, giving customers a significant competitive advantage.
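To make the XML-based approach just described more concrete, here is a small sketch. It is not taken from the book; it is a hypothetical Python example that builds and parses an XML-formatted order message with the standard xml.etree.ElementTree module, and the element names and values are invented for illustration.

    import xml.etree.ElementTree as ET

    def build_order_message(customer_id: str, product: str, quantity: int) -> str:
        # Compose an XML order message instead of writing rows into the
        # partner system's database directly via SQL.
        order = ET.Element("order")
        ET.SubElement(order, "customer").text = customer_id
        ET.SubElement(order, "product").text = product
        ET.SubElement(order, "quantity").text = str(quantity)
        return ET.tostring(order, encoding="unicode")

    def parse_order_message(message: str) -> dict:
        # The receiving system interprets the message through its own
        # application logic, so security and business rules are not bypassed.
        order = ET.fromstring(message)
        return {
            "customer": order.findtext("customer"),
            "product": order.findtext("product"),
            "quantity": int(order.findtext("quantity")),
        }

    if __name__ == "__main__":
        msg = build_order_message("C-4711", "custom-dna-array", 2)
        print(msg)
        print(parse_order_message(msg))

Because each system only ever sees messages rather than the other system's tables, adding a third or fourth system means agreeing on the message format instead of wiring databases together.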
MWG as a dynamic service provider

MWG-BIOTECH is a leading provider of biotech products using eBusiness and CRM solutions for the research community. The company uses a comprehensive suite of business process management, applications integration and customer relationship management tools, training, and consulting to accelerate the drive towards eBusiness, enabling customers to reap the full benefits of MWG-BIOTECH's service. The evolution of industrial bioinformatics into a customer care service is proving to be a spectacular development. Collaborative commerce has emerged to meet the growing demand for biomedical researcher interaction, and companies like MWG-BIOTECH are turning to it in order to boost researcher satisfaction. Drowning in a sea of irrelevant information, the researcher still has to track down all the available information worldwide about the particular gene or gene function in which he or she is interested. In the field of DNA arrays, for instance, where the complexity of the products and services rises dramatically, a divergence exists between the information available and how much biomedical researchers can absorb and interpret in the context of their particular needs. The result of this so-called complexity gap is that some biotech companies end up with orders for product configurations that cannot be produced. Therefore, MWG-BIOTECH has invested both in computer technology and in the founding of a center of excellence to support collaborative commerce interaction with customers. The philosophy behind this idea is the effective transport of information from one specialist to another. One aim of MWG-BIOTECH is to develop a framework of accepted, trusted and easy-to-use tools supporting an enterprise-wide knowledge management system successfully integrated into the research process of modern life sciences.

Finding of facts

It is the ability to easily upgrade services another notch that makes the Internet and pervasive computing services both powerful and attractive at once. Even if some services may sound simple, they can be bundled together in an integrative way to form powerful packages meeting the needs of customers. The book provides insights into the fundamental working mechanisms of ubiquitous services as they are being designed and created right now. A person with facts does not need opinions, and facts are the principles that serve as the cornerstones of this book. The author presents them in a way that is useful for the expert reader as well as the beginner, providing a sound foundation and supplying details where needed. 'Pervasive Computing' is the logical continuation of his previous bestselling textbook 'The E-Business (R)Evolution', and I hope the reader will have as much enlightenment and fun as I had while reading it.

Bernd Drescher
Life Science Information Director, MWG BIOTECH AG
Munich, Germany, 13th March 2001