Java, one of the most exciting technical developments of recent years, has been credited with everything from saving the Internet to reviving the concept of client-server computing. But does it really mean much to the UNIX world? Is it just another programming language (as Microsoft claims), or is there more to the story? In this article we’ll examine the UNIX industry’s view of Java.

In a nutshell, Java’s primary promise is to “write once, run anywhere.” The goal is to be truly system-independent (of both hardware and operating systems). Whether this is happening today is another matter. Java has made the World Wide Web more than just a medium for the delivery of static multimedia documents. The Web would have been popular anyway, but with Java, it excites the imagination. Finally, since Java initially was created to power set-top boxes, it has a design center that enables consistent programming from small non-PC computers to large enterprise servers.

Why Microsoft Is Scared

If these features were all that Java offered, it would not be the threat to Microsoft (and the potential friend to UNIX) that it is. To understand Microsoft’s reaction, we first must look at its basic business strategy.

Microsoft’s strategy has two major parts: leverage strong positions in one market (that has the potential to branch into new areas), and tie products together so users prefer to buy them both. For example, there is a strong synergy between Microsoft’s dominant position in desktop operating systems and its client-side productivity applications. In fact, these applications became dominant only when Microsoft took advantage of Windows’ popularity, while its competitors, Lotus and WordPerfect, lacked a similar application-OS relationship.

The synergy between Microsoft’s client-side operating system and applications and its server-side operating system (Windows NT) has contributed to NT’s popularity. And there is the fit between the server OS and Microsoft’s BackOffice (database, messaging, systems management) tools. Underlying everything are the APIs that hold it all together, making it difficult to build portable applications that are not tied to the operating system. And once a volume market (a consistent hardware base, an execution environment, or a dominant, extensible application package) emerges, there is an incentive for customers to buy the OS and for other companies to develop more products for it. This strategy has made Microsoft one of the most powerful companies on the planet.

Java Breaks This Strategy

Java is both a development and an application-execution environment. It also is portable, standardized, flexible, and already has a huge and growing installed base.

Portability and standardization (despite Microsoft’s efforts) result from both the design and history of the language and Sun’s de facto control of it. They also stem from Sun’s desire to collaborate with companies having a genuine interest in furthering Java’s potential.

The installed base is the result of early integration with an underlying environment, Netscape (Mountain View, CA) Navigator, that dominated the new, rapidly growing Internet market. This brilliant maneuver was so successful that all other browsers had to follow or be left behind. It means that as the Internet expands, so does Java. This potentially undermines Windows’ volume advantage.

Java stretches from very small computers to very large ones. Java in some form is programmed into smart cards, telephones, single-purpose information appliances, PCs, and a variety of servers. Combined with the Internet, this reach makes it possible to build globally distributed systems.

The Heterogeneous World

We all know that the computing world is not homogeneous, but the increasing complexity of information environments leads many to wish it were. Java’s story emphasizes computing heterogeneity. Even further, it reminds us that the current PC is not the end of our computing devices’ evolution, that we are entering a period of increasing diversity.

One good example of dealing with diversity is how IBM is solving the problem. For years it has struggled with its incompatible computing families and the difficulty of programming for all of them. Now, Java is on IBM systems, including OS/390, AS/400, AIX, OS/2, and network computers. While it is not the right environment for all applications, one can program once and run on all these disparate IBM systems.

It’s The Applications, Stupid!

People buy computers to get applications, and Java has several implications that favor UNIX. Java’s current uses emphasize servers, which are central to networked computing. And UNIX excels in servers, especially scalable ones.

Scalability is particularly important in the future phases of the Internet and electronic-commerce era. Java server-side applications will be popular because they are portable and can be made to scale as needed. UNIX will be the preferred server platform for Java servlets because it embraces Java and Java’s preferred distributed-component architecture, the Common Object Request Broker Architecture (CORBA), and because it has proven robustness.

On the client side, Java applications will run on UNIX, and one major effect will be the arrival of quality office-productivity applications. Word processing, for example, is not a typical use of powerful UNIX workstations, but it is needed occasionally. If the rest of an enterprise uses a standardized office suite and no UNIX version is available, that becomes yet another reason to switch away from UNIX. The coming Java-based office suites will eliminate this particular problem.

Java is widely believed to produce more robust software than C/C++. Although that is an arguable point, it does draw attention to the issue of robustness, a UNIX strength.

Solving UNIX Problems

Among the problems plaguing UNIX are incompatibilities among UNIX variants, lack of volume markets for applications, an image of being too complex (especially in user interface and system management), and cost.

Java can’t fundamentally solve the incompatibility problem but does hide it effectively. For the applications that fit Java, subtle system differences are irrelevant, user interfaces are identical, and if the Java Virtual Machines are built correctly, the applications behave the same.

Volume markets are more attractive to software developers, and this relationship is self-reinforcing. Java’s portability addresses the problem: applications can run on both clients (Windows, UNIX, and others) and servers, so volume-based arguments against UNIX become irrelevant.

In the application-development process, Java is the same no matter where the application runs. The development environments can be the same (both professional object-oriented environments and 4GL-like visual-component assembly environments). User-interface design is, however, a style choice.

The cost issue is more complex. Hardware costs center around design goals, volume, and a host of other considerations. UNIX vendors have moved to more commodity peripherals and, for like capability, have hardware that costs as little as or even less than the PC vendors’. But Java doesn’t really affect these hardware trends other than in the newest small portable devices.

Software costs are determined primarily by history and volume. UNIX software vendors are moving to compete with PC prices (witness the database vendors’ “same price on UNIX and NT” movement). Java’s run-anywhere capability means prices of Java software must be based on some criteria other than OS. The more Java software there is in the world, the less this can be used as an argument against UNIX.

If Java really flowers, and it looks like it will, UNIX will continue to compete against alternatives on its merits of performance, stability, and scalability. And we know that is a game at which UNIX excels. If you have any reason to advocate Java, do so–it will help foster a world where fair competition is the norm. This can only open opportunities everywhere, including in UNIX.


For a long time, Unix looked to be the most fertile ground for seeding Hydro-Agri North America’s extensive ERP applications. Installed at the agricultural chemical and fertilizer distributor were two high-powered Hewlett-Packard Co. Unix servers running SAP AG’s R/3, with a third off-site for disaster recovery purposes. Systems analyst Jim Wiedrich, who ran the operation, was a self-described Unix fan, dubbing the “backslash” mark in Windows/DOS “backwards.” His preference? Leave Windows to the desktops and other non-mission-critical applications.

Wiedrich began to consider plowing new fields, however. With information systems dollars scarce and a low crop yield of Unix programmers, Hydro-Agri last year decided to root its future enterprise resource planning applications in Windows NT. The Tampa Bay, Fla., company pulled out its Unix servers, save for the database server, and replanted six Compaq Computer Corp. ProLiant servers with SAP R/3 applications on NT.


Why the shift? As Wiedrich admits, conventional wisdom would dictate choosing the more scalable Unix platform, given that Hydro-Agri is on a rapid expansion path. The $900 million North American subsidiary of Norsk Hydro, of Oslo, Norway, is farming new terrain in chemical and fertilizer manufacturing as well as in distribution. But it was that very growth that precipitated the move to Windows NT. “As we grew, we needed to upgrade the performance of our systems,” Wiedrich explains. “With the HP systems, we had to have an HP guy come out and upgrade them after-hours. With the Compaq [Windows NT] platform … we can go in and add components without bringing the system down. And we can do it ourselves.”

Hydro-Agri’s shift to Windows NT is emblematic of what’s happening with ERP in general. At ERP leader SAP, more than half of all new licenses are for NT platforms. Even last year, with NT versions of R/3 modules only just becoming available, 22 percent of SAP’s new customers chose that platform. The trend is expected to continue this year–particularly among small and mid-size organizations (from $200 million to $2.5 billion in revenues annually) implementing ERP, according to Scott Lundstrom, director of research and enabling technologies at Advanced Manufacturing Research, a Boston-based consulting group.

One major reason is Microsoft Corp.’s dead aim at the enterprise market. “Microsoft is a much more focused application platform than Unix,” Lundstrom says. Despite many attempts to unify the “Heinz 57 varieties” of Unix flavors, each provider adds its own mix. And “backwards” slash or not, Microsoft has given application vendors a focused product, he says.

There are other reasons users are migrating from Unix into NT territory, Lundstrom says. “Windows NT is simply lower-cost than Unix solutions,” he says. A PC server capable of running an ERP application typically starts at around $10,000–about half the cost of a Unix server. Add to that some aggressive pricing by Microsoft Corp. on the operating system side, and the Unix system may be out of reach for most small to medium-size businesses, Lundstrom says.

But it’s not a perfect growing season for Windows NT. Performance and scalability issues need to be addressed, and some changes to the NT file structure are required before the fruits of the platform can be fully realized with ERP. Also, the fact that Unix is a mature operating system with a large contingent of ERP add-on software gives it an advantage.

For Hydro-Agri, Unix definitely had a leg up two years ago when the company sent out an RFP to 25 ERP vendors. A user panel from the company chose SAP, however, partly because Hydro-Agri liked its aggressive plans for NT, along with the software’s data accessibility and configuration flexibility. Hydro-Agri implemented Phase 1 of its ERP deployment in eight months, running initially on the HP 9000 Unix platform with an Oracle Corp. database. The company estimates that the installation of back-office functions, such as accounting, finance and cash management, has saved the company more than $1 million by better managing and controlling inventory flow and costs.

In Phase 2 of the installation, Hydro-Agri added profitability analysis and reporting applications to the system, giving managers real-time cost and revenue information online. Then, last year, the company began plans to migrate its SAP applications to NT from Unix, leaving the Oracle database on a single remaining Unix server. “It worked better than expected. On Friday, we were running on the Unix boxes, and on Monday morning, we had the applications running on Windows,” Wiedrich says. By the end of this year, the Oracle database will be transplanted as well, moving onto a new Compaq ProLiant 7000 server running under NT.

Widespread migration

It’s not just users that are finding NT to be fertile ERP ground. The operating system has become the testbed platform of choice for leading ERP vendors SAP and Baan Co. “You’ll see all of our newest versions of our applications show up first on NT,” says Allen Brault, director of NT business development at SAP’s Waltham, Mass., office.

While SAP has led the way among ERP vendors pushing the NT platform, other vendors are catching up. Baan’s NT products have only been available for eight months and they already comprise more than 10 percent of new shipments, says Anil Gupta, vice president of industry marketing at the Menlo Park, Calif., company.

The ERP vendors are finding that moving to NT opens up new markets among smaller, non-Fortune 500 companies. “Our growth market is now from the sub-$200 million- up to the $2.5 billion-size company,” says Steve Rietzke, head of SAP’s Microsoft partnership, based in Bellevue, Wash. “That’s where NT is making the most impact.”

Driving ERP’s success in that market is price. “Windows NT hardware and software are cheaper than the Unix versions,” Lundstrom says. That lower price comes from both prices for the software licenses and the cost of the hardware platform, he adds.

And even if Unix vendors lower hardware prices, as Sun Microsystems Inc. has done at times to counter Intel Corp. servers, PC servers still have an advantage, notes Hydro-Agri’s Wiedrich. “When I’m finished with a Unix server, it doesn’t go anywhere else for use. With a Compaq server, I can use it as a Notes server or, if worst comes to worst, I can cannibalize the parts for other servers and even desktops,” he explains.

That lower price also attracts customers who couldn’t afford ERP software on a Unix platform. That was the case at $47 million Green Mountain Coffee Roasters Inc., in Waterbury, Vt. “We wanted a solid ERP system, but it would have been nearly impossible to justify if we had chosen a Unix system,” notes Jim Prevo, CIO at the coffee retailer and wholesaler. Green Mountain opted for a three-tier client/server system after attempting to brew its own ERP system. Already a Compaq shop, the company decided the high-end servers would be more than adequate to meet its ERP needs. In early 1997, Green Mountain purchased 17 modules from PeopleSoft Inc., and by early June, seven were percolating: general ledger, accounts payable, purchasing, production management, bill of materials and routing, cost management, and inventory management.

Scaling new heights

The momentum behind NT doesn’t mean the combination of Unix and ERP is going away, however. “Among our large customers with large deployments, Unix remains their primary platform,” SAP’s Brault admits. NT is simply not ready for those large deployments, explains Lundstrom, because it lacks the fault tolerance and redundancy of Unix systems. Former Unix devotee Wiedrich agrees. “We run an 8 a.m.-to-8 p.m. shop. If we were 7-by-24, I don’t know if we could have chosen Windows NT at the time we did,” he says.

Others acknowledge that NT’s areas of vulnerability include its scalability. That may be remedied soon in NT 5.0 when Microsoft, of Redmond, Wash., includes an LDAP (Lightweight Directory Access Protocol) directory structure. Without the LDAP directory, converting files from the database to the NT front end is sluggish.

NT 5.0 will also include a single-sign-on feature that will allow users to log in to NT and get rights to SAP R/3 at the same time, says Edmund Yee, manager of network operations at Chevron Canada, a user of SAP on NT. Chevron Canada is beta testing NT 5.0. “Currently, you have to log on twice, so the single sign-on will make things a bit easier for the users,” Yee says.

Adding to NT’s limitations are its close ties to Microsoft’s own SQL Server database, which doesn’t scale as well as others, according to AMR’s Lundstrom. Coming soon, however, is a new release of SQL Server with improved scalability.

But perhaps one of the biggest hurdles to running ERP applications on NT is the relatively limited crop of add-on tools–for example, performance enhancement and application management software–that have been ported to the platform. “There are a number of those types of tools that simply aren’t available to Windows NT users,” Lundstrom says. Prevo at Green Mountain agrees, although he notes that many of those tools may not be of much use to a company of his size. “Even if we had those tools, it would have taken way too long to learn how to use them. That’s why we used an integrator and some PeopleSoft contractors–people with experience there–for our project,” he says.

However, notes Baan’s Gupta, many of those tools are being built into NT because it is application-oriented, which he sees as a plus for the operating system. “Backup and file copying are fully integrated into NT,” he says. “With a mainframe environment, you’d have to go buy a Tivoli [Systems Inc.] product or a [Computer Associates International Inc.] Unicenter product to do that.”

Further down the line, there are other technology improvements under way on the hardware side to strengthen NT’s roots in ERP. Compaq, in Houston, has made ERP a priority on its systems and has established several SAP competency centers to provide a testing environment for companies interested in SAP on an Intel platform. Compaq is also bundling Baan software along with Microsoft’s BackOffice on its servers. And Intel recently announced that it will begin working with SAP to ensure that the Intel platform is well-suited for ERP applications.

A close fit with the hardware will be key when the Intel platform moves to 64-bit microprocessing next year with Merced, making it roughly equivalent to Unix microprocessors. Realistically, though, it may be another year or so before NT servers are robust enough to stand shoulder-to-shoulder with large, entrenched Unix servers.

For one thing, apart from a few vendors, such as Sequent Computer Systems Inc., NCR Corp. and HP, large eight-way servers are not generally available on the Intel platform running NT. That will change with Merced, but it won’t happen overnight, analysts say. And even if it does, it would take some time and a compelling reason to dislodge bulletproof Unix installations. “You’d have to have a pretty compelling case to toss a system that’s working, just to replace it with new technology,” AMR’s Lundstrom says.

And Unix vendors themselves aren’t standing still. “As Windows NT is moving more towards 7-by-24 operations, Unix systems are moving to five 9′s [99.999 percent uptime] in reliability. So the bar is moving in both areas,” says James McDonnell, group marketing manager for personal information products at HP, in Palo Alto, Calif.

Dual approach

Vendors’ strong backing of NT has given some large companies the confidence to move at least some of their ERP applications to the platform. For example, Chevron Corp., in the United States, has standardized on SAP running on Unix, but the company’s smaller Canadian branch chose NT, says Yee, manager of network operations for the company, based in Vancouver, British Columbia. While the U.S. operation has more than 7,000 users, the group in Canada counts only about 180 active users. Because SAP has provided ways to migrate between the two platforms, the groups are able to share information. “We grab information from their servers when we need it,” Yee says.

R/3 can be configured to have the database and application running on Unix, while the presentation appears on Windows. In Chevron’s case, the Canadian database is run on NT, while the U.S. database runs on Unix. While the two systems don’t interact on a constant basis, Yee says there is no problem in converting the Unix database information into NT.

And while Unix has been appropriately advertised as a more stable platform than NT, Hydro-Agri’s Wiedrich has found a hidden advantage to moving to the more desktop-oriented platform. “When the Unix server went down, we had to either call me in or call someone from HP in to fix it. With the Compaq servers, about half the time, the desktop guys can make any adjustments. It’s nice not having your beeper go off every weekend,” he says. Even if that backslash does look “backwards” to him.


Who says last rites need to be administered to Unix? The Dealer Services Group of Automatic Data Processing Inc. is among the many faithful keeping the operating system alive and well. Last month, the $750 million technology division of ADP signed a $100 million deal with Digital Equipment Corp., committing to Digital’s 64-bit AlphaServer Unix system to drive enterprise applications across its 18,000 car and truck dealerships.

DSG is part of a much larger flock of followers that believe in the virtues of Unix for mission-critical and enterprise applications. Officials at companies across all industries who run some flavor of the operating system say its scalability, reliability and management capabilities still make it the most logical environment to host large applications, including SAP AG’s R/3 and Oracle Corp.’s Financials, along with data warehouses and all the back-office operations that make their organizations run.

While a Windows NT setup can cost significantly less, these Unix devotees maintain that Microsoft Corp. still has significant work to do before NT can compete with the traditional strengths of Unix. But these believers are by no means blind followers. They acknowledge that in some cases, platform preference will be dictated by the availability of appropriate third-party applications. Therefore, many are keeping a watchful eye on Microsoft’s progress to decide if–and when–the time is ripe to convert.

“In the enterprise area, we feel Unix today is the right place to be … and [that it] offers the best performance and reliability, although we’re carefully watching NT,” says Miles Lewitt, vice president of DSG’s Global Product Development, in Portland, Ore. “We intend to be pragmatists. It’s not a religious thing for us.”

The principal issue for Unix remains unchanged: If the operating system is to maintain its position in the market, vendors need to accelerate their progress in developing standard extensions as well as pare down the number of versions available. “ISVs are having to port to 20 different Unix systems, and that makes life very difficult,” says David Floyer, an analyst at International Data Corp., in Framingham, Mass. “The … Unixes have to consolidate … and that’s happening.” Floyer says there’s still work to be done to enhance Unix standards for clustering (an architecture that provides continuous, uninterrupted service), high availability (24-by-7) and administration.

Clearly, there is a market for NT, primarily in the desktop and workgroup arenas, analysts say. IDC forecasts NT spending to increase from $6.9 billion to $27 billion between 1997 and 2000. By contrast, worldwide Unix hardware sales will grow more slowly, from $24.7 billion to $39.5 billion over the same period.

Seamless Web integration

Internet Shopping Network Inc. counts itself among the disciples of Unix. Although the online computer superstore evaluated both NT and Sun Microsystems Inc. Unix systems in 1994, it chose Sun’s SPARC servers running Solaris as the platform to host its first Web site (www.isn.com). While NT admittedly didn’t have a strong presence in 1994, it did last year. It was then, once again, that ISN opted for Sun over NT to roll out its second site (www.firstauction.com), this time using UltraSPARC servers running 64-bit Solaris.

With page views for both sites running slightly over 1 million per day and daily order transactions for both just under 1,000 per day, ISN needed a platform that could accommodate its rapid growth. “Unix has been great for us, and we’ve stuck with it for three reasons: scalability, manageability and flexibility,” says Brett Colbert, director of quality assurance and IS for ISN, in Sunnyvale, Calif.

Unix also excels from a systems administration perspective, says Colbert, who cites the availability of tools compared with what’s out there for NT. For example, if an IS administrator wants to pinpoint a directory on a loaded hard drive to free up space, it requires only a simple command in Unix, Colbert explains. On NT, however, the administrator must go into the NT Explorer utility to view the system’s entire configuration as the first route for figuring out which directories are most full.

Similarly, says Colbert, the operating systems offer striking differences in their approaches to remote access. The Telnet protocol built into Unix allows an administrator to easily tap into clients via remote dial-up, while NT requires a third-party program such as Symantec Corp.’s pcAnywhere or Compaq Computer Corp.’s Carbon Copy to control the system remotely, he says. “That’s important if you get paged at 3 a.m. and you want to just log in, take care of the problem and go back to sleep,” says Colbert. “It’s much more complicated with NT.”

Scaling new heights

The ability of Unix to scale reliably as business needs grow is also unmatched by NT, users say. For running SAP R/3 alone, Chevron Corp. is using about 150 HP-UX servers from Hewlett-Packard Co. Granted, Chevron is more of an ardent Unix believer than most. It has been running other large applications on Unix for several years, specifically for engineering and processing geologic data, says Bob Washa, technical manager for SAP R/3 implementation at Chevron, in San Ramon, Calif.

“We did not consider NT a viable option at that time. … HP [Unix] met all of Chevron’s requirements, and it’s continued to get stronger each year in functionality, scalability and performance,” he says. “We feel today it’s the only viable choice” for handling Chevron’s needs, he adds. At Chevron, that’s no small task. The company’s SAP installation has 7,000 users connected to three HP 9000 servers with one 350GB database, Washa says. In that configuration, R/3 produces an average of 12 million online transactions per month, covering all aspects of corporate finance.

For now, Skyway Freight Systems Inc. also views Unix as the platform of choice to run its mission-critical business applications, including Oracle Financials. The $160 million logistics and supply chain management company does, however, deploy a number of NT servers to operate desktop applications, according to Tom Duck, vice president of IS at Skyway, in Watsonville, Calif. “Today we wouldn’t be able to run our systems on NT–it’s not big enough,” says Duck. “We have applications that need to have high-end, fast servers,” and Unix systems can handle that need, he adds.

Skyway’s Unix spread includes HP 9000 systems used for distributed processing, which can be expanded from a single machine with less than 1GB of memory to systems containing six processors and 4GB of memory, Duck explains. “We can move our processes from one [Unix] server to another fairly easily and upgrade boxes easily,” says Duck, who says Skyway is committed to Unix for the foreseeable future.

“Five years from now we may have a different opinion, but I don’t think things will change radically,” Duck says. The mix of desktop and PC application servers deployed on NT 4.0, along with database servers and large-scale application servers standardized on HP-UX, “seems to be a good combination,” he adds.

Skyway maintains an electronic data interchange server that processes 5,000 files a day and Concerto, its desktop-based supply chain management suite of applications, processes around 4,000 new shipments every day on HP-UX. “There has not been a circumstance where we could not upgrade to cover the growth,” says Duck. “As the business grows, we add CPUs. … We cluster Unix boxes together so if there is a hardware failure, the other box in a cluster can pick up processes and run with it so you don’t have an interruption.” Skyway will add several more HP-UX servers this year, primarily for expanding Concerto’s functionality, he says.

With the expected roll-out of the 64-bit Intel Merced Unix architecture late next year, loyal followers say they’re covered for any kind of growth. Nokia Mobile Phones, which has used primarily HP servers for the last 15 years, is one company that is very interested in Merced. “That will be a consideration in sticking with Unix,” says Bob Schultz, an IT manager at Nokia, in San Diego. Nokia is running about 200 Unix servers (including some Solaris) for research and development, mechanical design, and engineering applications. The company also has about a half-dozen NT servers in play. But, says Schultz, “a lot of high-end tools are not available on NT,” specifically office automation applications.

“While some other operating systems are spending their time fixing bugs and problems, Unix has been working well and has been improved upon for over 25 years,” observes Mike Dotson, program manager for professional development programs at Florida Institute of Technology, in Orlando, which is running SCO Unix on Intel processors. Longevity has given Unix vendors “the luxury of spending their time making improvements,” Dotson says. “The Unix [community] will be the first to come out with the 64-bit operating system,” and that will allow for even greater scalability, speed and functionality, he says.

In many cases, companies are dependent on application availability to dictate their choice of operating system platform. Take SSM Health Care, in St. Louis, which owns and manages 24 entities, including 19 acute-care hospitals. SSM has approximately 40 HP-UX and Sun SPARC servers running its back-end applications for all clinical day-to-day patient care functions, including lab, radiology, admissions, discharge, census information and patient accounting. The company also uses some NT and Novell Inc. NetWare servers.

If given the choice, SSM officials say they’d probably stick with Unix. However, that will not be an option, since the health care provider is committed to rolling out Pathways Care Manager, software from HBO & Co., of Atlanta, which will only be available on NT 4.0. PCM allows SSM’s member hospitals to manage multiple contracts with different health care providers.

“It is being rolled out on NT simply because it was only available on NT,” says Jack Adams, director of operations for SSM. “I think we’d standardize on Unix if we had a choice, but unfortunately we don’t.” PCM will, however, interface to the HBOC Unix system, Adams adds.

Striving for diversity

With many ISVs strengthening their commitment to NT, some Unix shops are hedging their bets and picking vendors that embrace both disciplines. Magnet Interactive Communications has chosen Silicon Graphics Inc.’s Origin system as the primary platform for its Web site, application development and hosting environment. However, the multimedia company also maintains a heterogeneous operations center that includes Unix systems from Sun, along with PC-based Unix and NT servers. Like others, Magnet’s IT officials maintain that NT is limited in the amount of processors it can support and in bus speeds. But they have “some doubts as to our vendors’ commitment to the Unix way of things,” admits David Brookshire, director of IT at Magnet, in Washington.

“SGI is making a major play into the NT market later this year,” Brookshire says. With the price of high-end graphics PCs coming down, “SGI is having to focus on NT-based systems to compete with the likes of Intergraph. … We’re a little insecure along these avenues for obvious reasons, but that is why we remain a diverse environment.”

Microsoft is “doing a great job of convincing software vendors to port to NT–sometimes exclusively,” agrees Mike Prince, CIO of Burlington Coat Factory Warehouse Inc., in Burlington, N.J. “So I think more and more, you’ll find an increase in the use of NT in the application server space.”

Before that can happen, though, NT will have to be weighed in the balance alongside Unix and prove its equal in all features–not just continue to excel in price/performance. And that may be a tough challenge, given the strategy of many Unix loyalists that “fewer is better.”

For example, Chevron’s Washa maintains it is ultimately less expensive to run fewer big Unix servers that can be integrated rather than lots of small, isolated NT machines. “Fewer is better in our book–they’re easier to manage and there’s a lower total cost of ownership associated with that [approach] in software maintenance and repairs,” he says.

Burlington Coat Factory’s Prince agrees. “Spreading enterprise processing to lots of little systems makes an administrative nightmare. You wind up with cross-dependencies in boxes,” he says. For example, one Unix box that provides file server capabilities might depend on another server for other functions, and they get to be interdependent. “Then it becomes hard to unravel the two when there is a problem,” Prince explains.

With an industry shift back to centralized computing, Microsoft faces the issue of being unable to build an NT system that’s big enough, says Prince. To compensate, you get into “racks of NT [servers], and administration of those racks is undesirable,” he explains. “We’re going back to a more centralized model.”

ISN’s Colbert also buys into the fewer-is-better model, noting that when ISN’s load grew significantly on the first auction site, “the beauty of the [Sun] box was we could pop in” six additional processors to double capacity.

Given NT’s limitations, Unix-based systems will remain the primary high-end server of choice for Magnet Interactive for the next few years, “but we’re not wearing blinders,” Brookshire says. With responsibility for managing more than 8,000 Unix servers in North America, DSG’s Lewitt is inclined to agree: “When [Unix] ceases to be the best product, it will be time for us to do something different.”

For that reason, DSG has also hedged its bets by committing to Digital, which offers both platforms. Lewitt says the company believes both Unix and NT systems have a role to play, which means preserving its customers’ investments over the long term. “We’re committed [to Unix] for as long as being committed is the right thing to do in the market,” he says.

So, for the foreseeable future, many users are remaining faithful, but the message is clear: No one is such a Unix zealot that they can’t be tempted to convert. Burlington Coat Factory’s Prince says it won’t be a problem for him if the market changes and NT gains momentum for enterprise applications.

“The truth of the matter is, what everybody ought to be concerned about is what’s the best way to deploy computing power,” Prince says. “Today, that answer is clearly Unix. That’s not a Unix bigot’s opinion. That’s the opinion of a lot of CIOs. … It’s what works best that counts.”


Real time is all about providing a result in a bounded amount of time. It is about juggling multiple inputs from the outside world and supplying outputs back to it exactly when needed. An example of a real-time application is the antilock brakes on your car (they must be accurately pulsed tens of times per second). To satisfy the needs of real-time systems, APIs must be available to support accurate timing, fast communications and I/O, and precise, priority-driven scheduling. Conventional wisdom held that without proprietary APIs and OSes, it was impossible to achieve the level of performance required to solve real-time problems. However, there are significant costs associated with coding solutions to a proprietary product.

Recognizing this, OS vendors, researchers, and users participated in an IEEE working group known as Posix.4. The group’s goal was to refine existing Posix APIs and develop new APIs to address the needs of the real-time environment. The result of this effort was the Posix 1003.1b-1993 standard, or Posix.4. To address the need for efficient communications, the group added APIs that support memory mapping, message queues, semaphores, signals, and asynchronous I/O, or extended existing calls to do so. It also added timers, memory locking, and programmable scheduling capabilities to support the accurate timing and scheduling necessary for time-critical tasks. Many legacy OSes offered one or more of these features, but often as a clumsy, heavyweight, kernel-based implementation, rather than the simple and fast implementation necessary for real-time applications.

Real-Time Additions

Posix specifies an OS environment where multiple processes operate independently, each with its own protected address space. While the protected address space ensures that processes do not affect a system’s integrity or that of other processes, this environment is confining for real-time communications. This is because programs must perform time-consuming OS calls to communicate with each other and with the outside world. The fastest form of communication is through memory itself. Therefore, for processes to interoperate at the highest possible speed with each other or with devices, they must be able to share physical memory. Posix.4 defines a sophisticated yet elegant function called mmap() that uses a file descriptor to establish shared-memory mapping. Here’s how it works.

First, the shm_open() call provides an easy means to obtain a file descriptor corresponding to a supplied path. If multiple processes call shm_open() with the same path argument, and each process supplies this returned file descriptor to mmap(), this effects a mapping to the same physical memory. That is, all the processes access the same block of memory, as shown in the figure “Client/Server Application of Shared Memory.” Mmap() is extremely powerful. It maps memory among processes. Also, you can use it to map a disk file into memory. You can read or alter data in the file through fast memory operations, rather than through the traditional set of open/read/write/close calls. To ensure safe operation, mmap() provides control over which processes are allowed to read or write given shared-memory areas.

Traditional synchronous I/O (e.g., writing to a disk file) puts an application to sleep while the I/O operation is pending (an unbounded time). Clearly, this action is not suitable for a real-time application that must always be ready to handle an event. To satisfy this requirement, Posix.4 provides asynchronous I/O (e.g., aio_read()), so that the application can continue executing and be notified (via a signal) when the operation completes. A list I/O (lio_listio()) function can execute many synchronous or asynchronous I/O operations via a single command.
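
To make the shared-memory part of this concrete, here is a minimal sketch, assuming a hypothetical object name (/rt_demo_shm), an arbitrary fixed 4KB region, and the -lrt link flag that many systems require. Run the same program once as a writer and once as a reader:

```c
/* Minimal sketch of Posix.4 shared memory: two cooperating processes map
 * the same region by name.  The object name and size are illustrative.
 * Compile (on many systems): cc shm_demo.c -o shm_demo -lrt
 */
#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/mman.h>
#include <unistd.h>

#define SHM_NAME "/rt_demo_shm"
#define SHM_SIZE 4096

int main(int argc, char **argv)
{
    /* Both the writer and the reader call shm_open() with the same path,
     * so mmap() gives them the same physical memory. */
    int fd = shm_open(SHM_NAME, O_CREAT | O_RDWR, 0666);
    if (fd < 0) { perror("shm_open"); return 1; }

    if (ftruncate(fd, SHM_SIZE) < 0) { perror("ftruncate"); return 1; }

    char *region = mmap(NULL, SHM_SIZE, PROT_READ | PROT_WRITE,
                        MAP_SHARED, fd, 0);
    if (region == MAP_FAILED) { perror("mmap"); return 1; }

    if (argc > 1 && strcmp(argv[1], "write") == 0) {
        /* Writer: store a message directly in memory; no read()/write()
         * system calls are needed once the mapping exists. */
        snprintf(region, SHM_SIZE, "sensor reading: %d", 42);
    } else {
        /* Reader: sees whatever the writer last stored. */
        printf("shared region holds: %s\n", region);
    }

    munmap(region, SHM_SIZE);
    close(fd);
    return 0;
}
```

Because both invocations open the same name, the reader sees the writer’s data through ordinary memory operations, which is the speed advantage the article describes.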

When a real-time process executes on a demand-paged OS, it must ensure that any memory it uses stays locked in physical RAM. If it does not, the OS might page that memory out to disk. When the process next accesses this memory, the memory management unit (MMU) raises a page fault and the OS must schedule a synchronous I/O operation to reload the page into RAM. Meanwhile, the process is put to sleep, leaving it vulnerable to missing time-critical events. Posix.4 provides the mlock() and mlockall() calls to accomplish the desired memory locking.
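
As a rough illustration, here is a minimal sketch of the locking call, assuming the simple policy of locking everything the process currently has and will allocate; doing so typically requires appropriate privileges:

```c
/* Minimal sketch of Posix.4 memory locking.  MCL_CURRENT | MCL_FUTURE is
 * one common policy for a small real-time process; a larger application
 * might lock only selected buffers with mlock(addr, len). */
#include <stdio.h>
#include <sys/mman.h>

int main(void)
{
    /* After this call, no page of the process may be written out to disk,
     * so a later access can never block waiting for a page-in. */
    if (mlockall(MCL_CURRENT | MCL_FUTURE) != 0) {
        perror("mlockall");
        return 1;
    }

    /* ... time-critical work runs here without paging delays ... */

    munlockall();
    return 0;
}
```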

Control and Access

Shared memory is great for fast interprocess communications (IPC), but some means must be provided to manage access to it. Confusion will result if a process attempts to read or write to shared memory that is being updated by another process. The Posix.4 solution to this problem is the semaphore. Both named and unnamed semaphores are provided. You create and access named semaphores via a path name. Access to named semaphores is through the sem_open() call. Unnamed semaphores are created directly in shared memory and managed by the user. Through the use of shared memory and unnamed semaphores, it is possible to build elaborate shared data structures with fine-grained locking.

Event notification is vital to real-time applications, because processes must react quickly to outside events–such as releasing the brake on a wheel just about to lock up. Posix.4 provides an extension to traditional Posix signals called real-time signals. Posix signals simply set a bit, so it is impossible to know how many signals were actually sent to a process, or why. In contrast, real-time signals are queued so that none are lost. They have also been extended to contain additional information. If a signal is sent by sigqueue(), an integer or pointer value can be passed to the recipient. This result provides some indication of the actual event to be processed.
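
Here is a minimal sketch covering both mechanisms, assuming a hypothetical semaphore name (/rt_demo_sem) and an illustrative integer payload. To stay self-contained it is a single process signalling itself; in practice the sender and receiver would be separate processes:

```c
/* Minimal sketch of a named semaphore guarding a critical section and a
 * queued real-time signal (SIGRTMIN) carrying an integer payload.
 * Compile (on many systems): cc coord_demo.c -o coord_demo -lrt -pthread
 */
#include <fcntl.h>
#include <semaphore.h>
#include <signal.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>

static void rt_handler(int sig, siginfo_t *info, void *ctx)
{
    /* Real-time signals are queued, not collapsed, and each carries the
     * value passed to sigqueue(); here, which "wheel" it concerns.
     * (printf is not async-signal-safe; acceptable only in a demo.) */
    (void)ctx;
    printf("signal %d received, payload %d\n", sig, info->si_value.sival_int);
}

int main(void)
{
    /* Named semaphore: any process that opens "/rt_demo_sem" by the same
     * name shares it.  Initial value 1 makes it a simple mutex. */
    sem_t *sem = sem_open("/rt_demo_sem", O_CREAT, 0666, 1);
    if (sem == SEM_FAILED) { perror("sem_open"); return 1; }

    sem_wait(sem);          /* enter critical section */
    /* ... update a shared-memory structure here ... */
    sem_post(sem);          /* leave critical section */

    /* Install a handler that receives the extra payload (SA_SIGINFO). */
    struct sigaction sa;
    memset(&sa, 0, sizeof(sa));
    sa.sa_flags = SA_SIGINFO;
    sa.sa_sigaction = rt_handler;
    sigemptyset(&sa.sa_mask);
    sigaction(SIGRTMIN, &sa, NULL);

    /* Queue a real-time signal with an integer attached.  Because the
     * process signals itself and SIGRTMIN is not blocked, the handler
     * runs before sigqueue() returns on conforming systems. */
    union sigval payload;
    payload.sival_int = 3;              /* e.g., wheel number 3 */
    sigqueue(getpid(), SIGRTMIN, payload);

    sem_close(sem);
    sem_unlink("/rt_demo_sem");
    return 0;
}
```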

Messaging is a staple of real-time applications. Posix.4 provides an elegant set of APIs to implement message queues. To address real-time requirements, Posix.4 message queues support at least 32 levels of priority and can use real-time signals to notify a recipient of delivery. Message queues are efficient: Tests show that they are two to four times faster than traditional communications mechanisms such as sockets. Message queues are accessed via mq_open(), which uses a path name to identify the message queue to access.

Real-time applications must ensure that operations occur on schedule. To meet these requirements, Posix.4 provides real-time clocks (clock_gettime()) with up to nanosecond resolution and real-time timers (timer_create()). Unlike traditional Unix timers, many real-time timers can coexist in one process. Posix.4 timers use real-time signals to notify the process when an interval has expired. If all that is needed is a simple time delay, the nanosleep() call can delay the current thread of execution for a precise amount of time. In addition, Posix.4 provides a set of scheduling APIs that let a process define, query, and alter the scheduling policies and characteristics that apply to that process.
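
A minimal sketch of the message-queue calls follows, assuming a hypothetical queue name (/rt_demo_mq), modest attribute values, and the -lrt link flag many systems need. It queues a low-priority and a high-priority message and shows that the higher-priority one is received first:

```c
/* Minimal sketch of a Posix.4 message queue with priorities.
 * Compile (on many systems): cc mq_demo.c -o mq_demo -lrt
 */
#include <fcntl.h>
#include <mqueue.h>
#include <stdio.h>
#include <string.h>
#include <sys/types.h>

#define MQ_NAME "/rt_demo_mq"

int main(void)
{
    struct mq_attr attr;
    attr.mq_flags   = 0;
    attr.mq_maxmsg  = 10;     /* queue depth */
    attr.mq_msgsize = 64;     /* bytes per message */
    attr.mq_curmsgs = 0;

    mqd_t mq = mq_open(MQ_NAME, O_CREAT | O_RDWR, 0666, &attr);
    if (mq == (mqd_t)-1) { perror("mq_open"); return 1; }

    /* Send two messages: the higher-priority one (31) will be received
     * first even though it was queued second. */
    mq_send(mq, "routine status", strlen("routine status") + 1, 0);
    mq_send(mq, "brake event",    strlen("brake event") + 1,   31);

    char buf[64];               /* must be at least mq_msgsize bytes */
    unsigned int prio;
    ssize_t n = mq_receive(mq, buf, sizeof(buf), &prio);
    if (n >= 0)
        printf("got \"%s\" at priority %u\n", buf, prio);

    mq_close(mq);
    mq_unlink(MQ_NAME);
    return 0;
}
```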

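Continuing in the same vein, here is a minimal sketch of the timing and scheduling calls, assuming an illustrative 100 ms timer period, a FIFO priority just above the minimum (fixed-priority scheduling typically requires privileges), and the -lrt link flag:

```c
/* Minimal sketch of Posix.4 timing and scheduling: read the real-time
 * clock, take a precise delay with nanosleep(), request fixed-priority
 * FIFO scheduling, and run a periodic timer that delivers SIGRTMIN.
 * Compile (on many systems): cc timing_demo.c -o timing_demo -lrt
 */
#include <sched.h>
#include <signal.h>
#include <stdio.h>
#include <string.h>
#include <time.h>
#include <unistd.h>

static volatile sig_atomic_t ticks = 0;

static void tick_handler(int sig) { (void)sig; ticks++; }

int main(void)
{
    /* Nanosecond-resolution clock read. */
    struct timespec now;
    clock_gettime(CLOCK_REALTIME, &now);
    printf("now: %ld s, %ld ns\n", (long)now.tv_sec, now.tv_nsec);

    /* A precise 100 ms delay for the current thread. */
    struct timespec delay = { .tv_sec = 0, .tv_nsec = 100000000 };
    nanosleep(&delay, NULL);

    /* Ask for fixed-priority, first-in-first-out scheduling. */
    struct sched_param sp;
    sp.sched_priority = sched_get_priority_min(SCHED_FIFO) + 1;
    if (sched_setscheduler(0, SCHED_FIFO, &sp) != 0)
        perror("sched_setscheduler");   /* often requires privileges */

    /* Periodic timer: SIGRTMIN every 100 ms (signal() kept for brevity). */
    signal(SIGRTMIN, tick_handler);

    struct sigevent sev;
    memset(&sev, 0, sizeof(sev));
    sev.sigev_notify = SIGEV_SIGNAL;
    sev.sigev_signo  = SIGRTMIN;

    timer_t tid;
    timer_create(CLOCK_REALTIME, &sev, &tid);

    struct itimerspec its;
    its.it_value.tv_sec     = 0;
    its.it_value.tv_nsec    = 100000000;   /* first expiry after 100 ms */
    its.it_interval.tv_sec  = 0;
    its.it_interval.tv_nsec = 100000000;   /* then every 100 ms */
    timer_settime(tid, 0, &its, NULL);

    /* Simplified wait: each expiry interrupts pause() via the handler. */
    while (ticks < 5)
        pause();

    printf("timer expired %d times\n", (int)ticks);
    timer_delete(tid);
    return 0;
}
```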

Anecdotal evidence suggests Lotus Notes expertise is still very much in demand, so maybe this has just been an abnormal quarter (last time its growth rate was 33%). None the less, it has fallen six places in the table as a result.

The biggest growth was shown by Java, with the jobs on offer up nearly threefold to 2,800, raising it 12 places in the table. This is testimony to the continuing rise in interest in Web-based applications, as shown also by a 72% increase in demand for generic Internet-related skills (now 27th) and a doubling in demand for HTML expertise (now 34th).

Ensconced

Only one other skill in the top 25 featured in more than twice as many ads as a year ago, and that was SQL, up from 2,800 to 5,700 posts. It remains firmly ensconced in the top 10 in eighth place. Two other skills that just failed to reach 100% growth were Windows NT – up from 4,900 to 9,700 posts – and, one place outside the top 25, SAP, which appeared in just under 1,500 advertisements this time.

The others that showed more than average growth in demand were, in descending order, Visual Basic, object-oriented programming, Access, Delphi, Oracle, RPG400, Ingres and Cics.

Apart from the last three, all the skills listed as growing in popularity in 1998 are new-wave products. This suggests the boom in IT recruitment is being fuelled by a combination of a flourishing UK economy and a chronic shortage of skilled IT personnel, rather than short-term factors such as the year 2000 issue.

RPG400 owes its return to the top 10 to a resurgence in recruitment by AS/400 sites – this sector showed the biggest increase in demand over the quarter. Cics, similarly, is showing a growth rate almost identical to that of IBM mainframe recruitment overall.

Curiously, DB2 has not shared in this growth – demand here was up just 28%, compared to 55% for IBM mainframe staff generally and 57% for Cics expertise.

Two older IBM mainframe database products, IMS and IDMS, showed much bigger growth. IMS appeared in twice the number of ads as a year ago (620) and IDMS in nearly three times as many (500, 75% of which were in IBM sites), and these two skills are now in 40th and 44th places respectively.

Other IBM mainframe legacy skills to appear in more than twice as many ads as a year ago were PL/I (850 posts, now 33rd in the list, its highest position since 1993), DL1 (490, 46th) and JCL (450, 50th). There is perhaps a year 2000 factor here – IBM mainframe sites are clearly looking for a greater proportion of staff with legacy skills than in early 1997.

But the numbers are so small in the context of 65,000 jobs on offer in total that this does not materially affect the overall picture.

Epitome

Cobol, the epitome of legacy skills, has shown growth of 52%, which is very much in line with the overall market growth and with the rise in IBM mainframe recruitment (more than four out of five Cobol jobs are in these sites). As a result it has actually fallen a place in the table, as demand for Visual Basic has risen at a significantly faster rate.

Taking a long-term view, it is instructive to look back four years to that first quarter of 1994, when Unix moved into the first place it has held till now. The changes in the skills most in demand then and now are remarkably few, and most of these were not widely forecast in 1994.

Of the 10 skills most in demand four years ago, eight remain in the top 10 today.

Those that have dropped out are Ingres and Lan, and their replacements are Windows NT (which was down in 43rd place in the first quarter of 1994) and Visual Basic.

Apart from Ingres, only four others dropped out of the top 20: MS-Dos, VMS, graphical user interface (GUI) and Informix. Their replacements are Java, Powerbuilder, Office and object programming.

So, comparing the two tables, we can see the rise of Windows NT, and the growth in interest in the Internet (as represented by Java), in object-oriented programming (both as a generic skill and in terms of the displacement of C by C++), and in more modern application development methods (as represented by Visual Basic and Powerbuilder).

Offsetting that, MS-Dos and Vax VMS have fallen from favour, while all the open systems databases apart from Oracle have lost significant popularity.

The biggest surprises, though, are what have not happened. Cobol and RPG400, far from disappearing, are today in more or less the same places as they were then.

The mainframe, which many in 1994 thought would be extinct by now, is still represented by DB2 (in exactly the same place) and by Cics (actually five places higher), as well as by Cobol.
