Oral-History:Fred Andrews

About Fred Andrews

Andrews worked at Bell Labs from 1948 to 1983, and then, with divestiture, moved to the Central Services Organization/Bellcore for the next seven years. His first research was on switching and line concentration. He then worked on early digitalization and determined that bipolar transmission was the best way to implement pulse code modulation. He describes Bell Labs' slow switch to digital electronic switching (Northern Telecom was more of a technological leader), which was not substantially complete until the 5ESS system of 1981. He then worked on transmission systems engineering, devising automatic systems to ensure voice quality. He was part of the Comité Consultatif International Téléphonique et Télégraphique (CCITT) group that decided there should be no more than one satellite hop in telephone transmissions, and later, through the CCITT and the IEEE, developed ways of making objective measurements of telephone set quality. He was strongly involved in the IEEE Communications Society. This interview provides an overview of Andrews' career and of communications engineering innovations.

About the Interview

FRED ANDREWS: An Interview Conducted by David Hochfelder, IEEE History Center, 4 November 1999

Interview # 380 for the IEEE History Center, The Institute of Electrical and Electronics Engineers, Inc.

Copyright Statement

This manuscript is being made available for research purposes only. All literary rights in the manuscript, including the right to publish, are reserved to the IEEE History Center. No part of the manuscript may be quoted for publication without the written permission of the Director of IEEE History Center.

Request for permission to quote for publication should be addressed to the IEEE History Center Oral History Program, IEEE History Center, 445 Hoes Lane, Piscataway, NJ 08854 USA or ieee-history@ieee.org. It should include identification of the specific passages to be quoted, anticipated use of the passages, and identification of the user.

It is recommended that this oral history be cited as follows:

Fred Andrews, an oral history conducted in 1999 by David Hochfelder, IEEE History Center, Piscataway, NJ, USA.

Interview

INTERVIEW: Fred Andrews

INTERVIEWER: David Hochfelder

DATE: 4 November, 1999

PLACE: Piscataway, New Jersey

Overview of advances in communications technology and communications engineering

Hochfelder:

Could you describe what you feel are the leading advances in communications technology and communications engineering since World War II?

Andrews:


Audio File
MP3 Audio
(380 - andrews - clip 1.mp3)


I started with Bell Laboratories in 1948. The situation at that time was that communications was beginning to be provided on a large scale with vacuum tube-based carrier transmission systems using frequency division multiplexing. These systems had their major impact on long distance communications. Also, at that time most of the long distance connections were being established by operators. We didn’t have any direct distance dialing. That was all ahead of us. The technologies that were available were fairly limited, primarily vacuum tubes, and of course discrete passive components - resistors, coils and capacitors.

The point contact transistor was invented in 1947, and the junction transistor in 1948, but they were very, very crude devices at that time. I can remember that when I was working in New York at the West Street location of Bell Laboratories, the director entered our research laboratory with a Mason jar in hand. Mounted on the inside surface of the lid was the first junction transistor. He said, “Look, you guys, see what you can do with this thing.”

At that time we had already begun experimenting with point contact transistors, which were very unpredictable things. We immediately set about trying to see how the junction transistor fit into the kinds of circuits that we’d been experimenting with. We soon learned that junction transistors had entirely different characteristics. One difference was that it was an NPN device instead of a PNP device, so you had to reverse all the biases and power sources. Also, it didn’t actually amplify current between the emitter and collector circuits. Entirely different kinds of circuits were required and we had to learn all over again about solid state devices. The advent of the transistor was the beginning of a lot of changes that were to come in the communications business.

Switching research; devices in communication systems

Andrews:

The work that I was doing at the time was called switching research. We were trying to apply whatever electronic devices were available to reduce the cost of providing the connections to customers, primarily what is now called "the last mile". This part of the telephone network hadn't changed much from the time of Alexander Graham Bell, who postulated a tree of cable pairs reaching out to customers from central offices.

Line concentrators were a way of sharing the use of cable pairs among a number of customers. These systems were based on the fact that not all customers were likely to make telephone calls at the same time. The concept was to deploy a switching system out near groups of customers and to concentrate the traffic of these customers on a smaller number of physical cable pairs.

Electron tubes weren't useful in building such a system because of their limited lives, high power consumption, and high maintenance costs. We turned to cold cathode gaseous discharge tubes and semiconductor diodes, anything we could find to somehow implement the concentration switching function. Eventually the line concentrator idea was implemented by others at Bell Labs, but with relays and not electronic technology. The idea was a good one, but was way ahead of our ability circa 1950 to build an electronic implementation at a cost that would actually save money. The further development of the transistor soon changed this.

My work continued to look at ways that various innovative devices could be used to implement the functions that were required to build communication systems. Among these were non-linear magnetic cores, which had two distinct states of magnetization. These binary states could be used to implement digital logic circuits, the basic tool of digital communications. Of course, the transistor very soon took over this role, particularly with the advent of integrated circuits and the mass production of many transistors on a single silicon chip.

Transistors and digital transmission

Andrews:

My real opportunity to apply transistors to digital communications came in 1956, at which time I was transferred to a department working on the development of the first commercial digital transmission system. The application was for providing interconnecting trunks among local central offices. These trunks were typically ten to twenty miles in length. The frequency division multiplexing techniques that had been used successfully for long distance transmission didn't prove-in at such short distances.

During World War II, considerable work had been done on digital transmission. It appeared that advances in pulse code modulation made in connection with military systems would be useful in developing low cost carrier terminals suitable for short trunks.

Pulse code modulation

Hochfelder:

Can you explain what pulse code modulation is, and why it’s desirable to digitize the trunk exchange lines?

Andrews:

Yes. One of the highest cost elements in the frequency division systems used up until that time was the individual channel filter. There was a separate filter design required for each of the channels so that channel signals could be modulated to an assigned frequency, combined on a common medium, and separated at the far end for demodulation.

On the other hand, a PCM system is implemented using a common design of simple low pass filter on each voice channel, confining the band to slightly less than four kilohertz. The system samples the voice signal at twice the highest frequency in that restricted band (the Nyquist sampling rate) and converts the amplitude of each sample into a digital representation using a time-shared coder. Our original work was with a seven-bit code, but later systems used an eight bit code.

The codes from the channels are combined on a common medium, but in separate time slots rather than separate frequency slots. At the far end the codes are decoded to reconstruct the original amplitude samples, which are then distributed to individual low pass filters to reproduce the original channel signal. Studies showed that the cost of implementing a system this way should be considerably less than that of a frequency division system.

Implementation turned out to be much easier said than done. The major problem was the faithful reproduction of voice over the full range of levels expected in a real-life conversation. With linear encoding, the steps that could be represented in a seven- or eight-bit code weren’t small enough for the low amplitude parts of speech. To beat this problem instantaneous companders were used to compress high level samples before coding. At first, we used the nonlinear characteristics of matched semiconductor diodes to compress at the sending end and expand at the receiving end. That turned out to be impractical for commercial applications.
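
To make the idea concrete, here is a minimal sketch in Python of the scheme described above: band-limit to roughly 4 kHz, sample at the 8 kHz Nyquist rate, compress low-level samples with an instantaneous companding characteristic, and quantize to 8 bits. The μ-law curve used here is one well-known form of the logical companding that later became standard on T-carrier; it is shown for illustration only and is not the original D1 design.

```python
import numpy as np

def mu_law_compress(x, mu=255.0):
    """Instantaneous compressor: gives low-level samples proportionally
    finer coding steps, as described above. Input is scaled to [-1, 1]."""
    return np.sign(x) * np.log1p(mu * np.abs(x)) / np.log1p(mu)

def mu_law_expand(y, mu=255.0):
    """Receive-side expander, the inverse of the compressor."""
    return np.sign(y) * np.expm1(np.abs(y) * np.log1p(mu)) / mu

def pcm_encode(samples, bits=8, mu=255.0):
    """Compress, then quantize each sample to a signed integer code."""
    levels = 2 ** (bits - 1) - 1
    return np.round(mu_law_compress(samples, mu) * levels).astype(int)

def pcm_decode(codes, bits=8, mu=255.0):
    """Decode integer codes back to amplitude samples."""
    levels = 2 ** (bits - 1) - 1
    return mu_law_expand(codes / levels, mu)

# A low-level 1 kHz tone in a ~4 kHz voice band, sampled at the 8 kHz rate.
fs = 8000
t = np.arange(0, 0.01, 1.0 / fs)
voice = 0.05 * np.sin(2 * np.pi * 1000 * t)
restored = pcm_decode(pcm_encode(voice))
print("worst-case error:", float(np.max(np.abs(voice - restored))))
```

With straight linear 8-bit coding the step size would be about 1/127 of full scale, swamping a sample of amplitude 0.05; the compressor keeps the relative error roughly constant across levels, which is the point of companding.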

Hochfelder:

To get the diode characteristics to match?

Andrews:

Yes, primarily because matching required very accurate temperature control. Fortunately that difficulty went away with the invention of nonlinear encoding and decoding that could be implemented logically instead of with temperature controlled diodes.

Then there was a problem of synchronizing the two ends of the system. We were dealing with a twenty-four channel system at that time. When you put all this together - twenty-four channels, an eight kilohertz sampling rate, and eight bits per sample - you have more than 1.5 megabits per second, the well-known T1 carrier rate.
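
The arithmetic behind that figure is worth spelling out; the single framing bit described a little further on is what brings the nominal T1 rate to 1.544 Mb/s.

```python
channels = 24
bits_per_sample = 8
frame_rate = 8000                         # frames per second = the 8 kHz sampling rate

payload_bits = channels * bits_per_sample # 192 bits of voice per frame
frame_bits = payload_bits + 1             # plus the single framing bit (see below)

print(payload_bits * frame_rate)          # 1,536,000 bits/s of voice
print(frame_bits * frame_rate)            # 1,544,000 bits/s, the T1 line rate
```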

That was viewed then as a lot of signal to transmit on a single cable pair. Pulses deteriorate because of the frequency distortion characteristics of the medium. We concluded that it would be necessary to regenerate the pulses every six thousand feet.

Hochfelder:

How is that done?

Andrews:


Audio File
MP3 Audio
(380 - andrews - clip 2.mp3)


First, you frequency equalize and amplify the pulses. You derive a clock signal from the incoming pulses for defining the time at which individual pulses are expected. A threshold device determines whether a pulse was actually present at the expected time. Finally, new pulses are launched at the output of the repeater with the same pattern of 0's and 1's as received.
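
A toy sketch of the decision step of such a regenerative repeater: given equalized samples taken at the recovered clock instants, slice against a threshold and relaunch clean pulses. Clock recovery and pulse shaping are taken as given here; the threshold value is illustrative.

```python
def regenerate(equalized_samples, threshold=0.5):
    """Decide whether a pulse was present at each clock instant.

    `equalized_samples` stands in for the amplified, frequency-equalized
    line signal sampled at times defined by the recovered clock; a real
    repeater derives that clock from the incoming pulse stream itself.
    """
    return [1 if sample > threshold else 0 for sample in equalized_samples]

# Attenuated, noisy pulses that are still clearly above or below threshold:
print(regenerate([0.82, 0.07, 0.64, 0.11, 0.71]))   # -> [1, 0, 1, 0, 1]
```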

The repeaters don't care about the beginning and end of the 24-channel frame, but the terminals do. The question was how to synchronize the terminals with a minimum of additional bits. Henry Mann and I were contemplating that problem one day and hit upon a scheme. Were we to add a single pulse position with a unique time sequence at the beginning of a frame at the sending end, it could be found to establish synchronization at the receiving end. We concluded that a pulse alternately on and off would work just fine. There wouldn’t be any other position in the pulse train that would be consistently on and off, since that corresponds to a frequency of 4 kHz, above the cutoff frequency of the low pass channel filters. In this way we solved the synchronization problem by adding only one pulse to the basic frame of 192 pulses. We were really proud of that. Only later did we realize that 193 is a prime number, posing added difficulties in designing the channel multiplexing circuits.
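
A minimal sketch of how a receiving terminal can exploit that alternating framing bit: examine each of the 193 candidate positions across several successive frames and keep the one that alternates every frame. The constants and search strategy here are illustrative, not the actual D1 terminal logic.

```python
import random

FRAME_LEN = 193          # 24 channels x 8 bits + 1 framing bit

def find_framing_position(bits, frames_to_check=12):
    """Return the bit position that alternates 1,0,1,0,... frame after frame.

    Voice-derived bits cannot sustain that pattern indefinitely, since a
    steady alternation at the 8 kHz frame rate corresponds to 4 kHz,
    above the cutoff of the channel low pass filters.
    """
    for pos in range(FRAME_LEN):
        column = [bits[pos + k * FRAME_LEN] for k in range(frames_to_check)]
        if all(a != b for a, b in zip(column, column[1:])):
            return pos
    return None   # not yet found; a real terminal keeps hunting on more data

# Toy test: random payload with the framing bit placed at position 0.
stream = []
for k in range(12):
    stream += [k % 2] + [random.randint(0, 1) for _ in range(192)]
print(find_framing_position(stream))   # almost always 0; a payload column can
                                       # mimic the pattern briefly, so a real
                                       # terminal keeps verifying after lock
```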

But the worst was yet to come. The exchange trunk application required the ability to run multiple systems on cable pairs within the same cable sheath, perhaps as many as ten or twenty. While the twist in the cable pairs provided a level of decoupling which was satisfactory at voice frequencies, it allowed too much interference between systems operating at 1.5 megahertz.

Hochfelder:

To let it cross the lines?

Andrews:

Yes, the interference made it difficult to make a clean decision on whether any given time slot contained a zero or a one. To make matters worse, the pattern of zeros and ones changed the DC component of the pulse train, causing the baseline of the signal to wander up and down. DC restoration was needed to keep the threshold firmly between the extremes of the pulses. That turned out to be a messy problem.

An idea that suddenly hit me was to make the pulses that represent a one alternately positive and negative, with no pulse still representing a zero. This eliminated any long term DC component and any baseline wander. Bob Aaron was quick to point out that this signal format also cut the highest discrete signal component in half, to about 750 kHz. That was enough to solve the crosstalk interference problem as well.
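
A minimal sketch of that bipolar line code, later known as alternate mark inversion (AMI): zeros send no pulse, and successive ones alternate in polarity, so the long-term DC component is zero and the strongest spectral content drops to half the bit rate.

```python
def bipolar_encode(bits):
    """Map 0 -> no pulse, 1 -> a pulse whose polarity alternates each time."""
    symbols, polarity = [], +1
    for bit in bits:
        if bit:
            symbols.append(polarity)
            polarity = -polarity       # the next mark gets the opposite sign
        else:
            symbols.append(0)
    return symbols

print(bipolar_encode([1, 0, 1, 1, 0, 1]))   # -> [1, 0, -1, 1, 0, -1]
```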

Hochfelder:

That’s interesting, because that’s Kelvin’s idea for reducing the capacitance effect and increasing data rate on submarine cables, where alternating dots were positive and dashes were negative, I believe.

Andrews:

It’s different in the sense that the ones were always a pulse, but those pulses were alternately positive and negative. Of course, you now have a three-level signal, and need dual decision levels above and below the baseline. Solving both the DC restoration and crosstalk problems was well worth the added circuit complexity. And there was one more bonus attribute. If there were ever two positive or negative pulses in a row you knew something was wrong.

Hochfelder:

This was also good error checking.

Andrews:

Right. You couldn’t actually correct the codes, but you knew when there was something wrong with the line. We called these events bipolar violations, and they escalate rapidly on bad lines. To the best of my knowledge, the bipolar transmission used on the first commercial T1 systems is still used when the transmission medium is cable pairs.
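
Checking for those bipolar violations is straightforward at any point along the line. A small sketch of the check, assuming the three-level symbols have already been recovered:

```python
def count_bipolar_violations(symbols):
    """Count marks that repeat the polarity of the previous mark.

    In a healthy bipolar signal the nonzero pulses strictly alternate in
    sign, so any repeat points to an error on the line.
    """
    violations, last_mark = 0, 0
    for s in symbols:
        if s != 0:
            if s == last_mark:
                violations += 1
            last_mark = s
    return violations

clean = [1, 0, -1, 1, 0, -1]
hit   = [1, 0, 1, 0, 0, -1]     # second mark received with the wrong polarity
print(count_bipolar_violations(clean), count_bipolar_violations(hit))   # 0 1
```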

As luck would have it, I actually was promoted to another job before the commercial implementation of T1, but the basic principles had been laid out. John Mayo succeeded me in the job and was instrumental in moving the system into final development at the Bell Labs unit located with Western Electric at Merrimack Valley. The first systems were introduced in 1962. As they say, the rest is history. There have since been a whole series of systems that are based on these digital principles. We had laid the foundations for digital transmission that would ultimately transform the telephone network.

Digital switching, electronic switching systems

Andrews:

From this work grew the concept of digital switching, where you could use the same techniques to convert voice signals to digital form, and then switch them on a time division switching basis using time slot interchange. The Bell system did ultimately go in that direction in the development of the 4ESS switching system. It was intended for long distance, rather than local, switching and in very large switching centers.

But I get ahead of my story. About the same time that the T carrier was introduced, the Bell System also introduced electronic local switching, the 1ESS. Connections were established by metallic contacts, but the system was electronically controlled using stored software programs. The metallic connections were provided by proprietary technology known as the ferreed switch. These devices were mechanical switches in a slim bottle controlled by an external magnetic field, a technology unique to the Western Electric Company.

Others in the communications switching equipment business didn’t have access to that technology. I am told that this was a major factor leading Northern Telecom to implement local electronic switching using digital time division principles. This was apparently true of others in the industry as well. They must also have seen that advances in technology would make digital switching more and more attractive as time went by. How right they were.

Northern Telecom really scooped Bell Laboratories with its DMS-10 system. Yes, the 1ESS was doing marvelously well, and finding a big market. In retrospect, Bell Laboratories was reluctant to head off in a whole new direction. We made economic studies about the cost of digital versus analog switching, concluding that it was about a standoff.

Finally it became clear that the digital technology learning curve was going to drive down the cost of digital components making the transition to digital switching the sensible thing to do. It was in 1981 that the 5ESS was introduced, nearly 20 years after the first digital transmission system.

Hochfelder:

What does the acronym ESS or 1ESS, 5ESS, etc. stand for?

Andrews:

ESS stands for Electronic Switching System.

Hochfelder:

Numbers are the generations?

Andrews:

Not so much a generation number as a number in a series of electronic switching systems, both analog and digital. While the push for electronic switching systems came from Bell Laboratories, others really provided the impetus for the further step into digital local switching.

Hochfelder:

And Northern Telecom?

Andrews:

To the best of my knowledge, Northern Telecom was key, though there were others as well. I know that Jack McDonald was actively involved in the development of digital switching at TRW in the same time frame.

T-carrier project; transmission systems engineering

Andrews:


Audio File
MP3 Audio
(380 - andrews - clip 3.mp3)


Let's jump back to 1962 when the T-carrier systems were introduced. Actually T was the designation of the digital line, and the terminals had a D designation. So the D1 terminals worked with a T1 line. The T1 lines became much more widely used for many different applications, and not just with PCM terminals. In that sense, the story gets more complicated than future readers of this interview probably care to learn.

When I left the T-carrier project, I became involved in a broader endeavor known as transmission systems engineering. This work was aimed at understanding the needs and requirements for transmission systems, as opposed to actually being involved in the design and development of systems.

A key issue at that time was providing a satisfactory level of voice quality when many systems were connected end-to-end to provide an overall voice connection. If you think about it, a telephone connection is assembled as a product in response to a customer's dialed digits. There are typically at least five transmission links in tandem on a long distance connection. Before the era of direct digital connections, each of those links had to be maintained to strict standards or the overall assembled product would not have adequate quality for a satisfactory conversation to take place.

The challenge was not only to meet strict requirements in design, but to operate systems in a way that guaranteed a quality connection. By means of automatic testing the telephone companies could periodically determine whether or not transmission links met the quality requirements. Those that failed could be automatically taken out of service. Automatic transmission testing was just the beginning of a whole new field in communications. The design and deployment of operations support systems to manage communications networks has become a major part of the business. During my decade of work in transmission engineering we laid some important foundations for the future of this work.

Assessment of career achievements

Digital transmission

Hochfelder:

What are the technical achievements you’re most proud of?

Andrews:

I’m most proud of the work that I did on digital transmission. It was really hands-on work that, in my opinion, had a significant impact on the future of communications networks. After that, I was primarily involved in what you might call technical management with less opportunity for a hands-on technical contribution.

Satellite communications and echo suppression

Andrews:

I am also proud of the recognition that I have received for contributions to international communications standards. One of the issues I was involved in was the effect of the long delays on telephone communications. When you communicate via satellite, in particular a geostationary satellite, the time for a signal to go from the earth to the satellite and back is about two hundred and sixty milliseconds. That means that, with all factors considered, the round trip for a signal going from point A to point B and then coming back by the same path is of the order of 550 milliseconds, more than half a second. That is a very detectable delay in a conversation, particularly when actively talking back and forth.
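
The numbers follow directly from the geometry. A worked calculation with the nominal geostationary altitude (values rounded; the margin above the straight-up-and-down path comes from slant ranges and terrestrial tails):

```python
C_KM_PER_S = 299_792.458      # speed of light
GEO_ALT_KM = 35_786           # nominal geostationary altitude above the equator

# Speech on a single hop travels earth -> satellite -> earth.
one_way_ms = 2 * GEO_ALT_KM / C_KM_PER_S * 1000
print(f"one way, zenith path: {one_way_ms:.0f} ms")           # ~239 ms

# Out to the far end and back again over the same path:
print(f"conversational round trip: {2 * one_way_ms:.0f} ms")  # ~477 ms; slant paths
                                                              # and ground links push
                                                              # this toward ~550 ms
```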

Hochfelder:

Now, is it the speed of light essentially?

Andrews:

Yes, and you can't beat it. The question we faced was how acceptable will those delays be to actual telephone users? An alternate technology at the time was the Bell System's low earth orbit experimental satellite, Telstar. Being in a low earth orbit of several hundred miles, the Telstar satellite was in constant motion around the Earth. Two tracking antennas had to be used to establish a communications link. But the advantage was a much shorter inherent transmission delay.

The concept of low earth orbit satellites has been revived in today's Iridium and Globalstar systems, but with much more sophisticated satellites and ground control. Delay is not an issue in these systems, but economic viability is.

So the question we faced in the 1960's was how acceptable would telephone connections be with these long delays, and would it be possible to actually allow multiple geostationary satellite hops? If you linked two satellite hops in tandem, you'd be up to 1,100 milliseconds. Now you're over a full second of delay. The problem with the delay is not just the fact that the conversation seems sluggish, but there are inevitable echoes in a telephone connection that must be controlled. You’re familiar with the difference between two-wire and four-wire transmission?

Hochfelder:

No.

Andrews:

Basically, a telephone set has just two terminals connected to one pair of wires, which serves both directions of transmission. That’s called two-wire transmission, and is almost universally used in local connections. Long distance transmission systems, on the other hand, have two separate paths for the two directions of transmission, and are called four-wire systems. Hybrid coils are used to connect four-wire systems to the two-wire end links leading to the telephone sets. With a perfect impedance match none of the incoming speech energy is sent back to the sending end as an echo, but the match is never perfect. The echo is separated from the original by the round-trip delay of the connection. There may be intermediate sources of echo as well, but the end link echo is controlling.

With the short delays of terrestrial connections, the echo problem is handled quite readily by adding a bit of loss with the delay of each added transmission link. The more delay in the echo getting back to you, the lower in level it must be so that it doesn't annoy you.

That method of controlling echo by putting in more loss is only workable up to a point. Eventually you must use echo suppressors, voice-switched devices that allow transmission in only one direction at a time. That was what was required on satellite connections because of the very long delays. However, the technology had to be refined to avoid very obvious chopping of the speech signals by the path switching action.

Eventually, echo suppression technology became very sophisticated, and instead of actually cutting the reverse path, a version of the incoming signal was used to cancel the echo signal going back. That was the concept of echo cancellation, allowing two directions of speech simultaneously but with the echoes cancelled. That principle has worked so well that the primary impact of a satellite connection is the delay itself. As we said before, the delay is inherent in the geometry.
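
A minimal sketch of an adaptive echo canceller of the kind described, using the well-known least-mean-squares (LMS) algorithm: the filter builds a replica of the echo from the far-end signal and subtracts it from what comes back. The algorithm, tap count, and step size here are illustrative choices, not the design of any particular commercial canceller.

```python
import numpy as np

def lms_echo_canceller(far_end, echo_path, taps=32, mu=0.05):
    """Cancel the echo of `far_end` that leaks back through `echo_path`.

    `echo_path` plays the role of the hybrid/end-link reflection, which the
    canceller does not know in advance; it learns an estimate adaptively.
    """
    rng = np.random.default_rng(0)
    echo = np.convolve(far_end, echo_path)[: len(far_end)]
    returned = echo + 0.001 * rng.standard_normal(len(far_end))  # echo plus noise

    w = np.zeros(taps)                        # adaptive filter coefficients
    residual = np.zeros(len(far_end))
    for n in range(taps, len(far_end)):
        x = far_end[n - taps:n][::-1]         # most recent far-end samples
        estimate = w @ x                      # predicted echo
        residual[n] = returned[n] - estimate  # what the far talker would hear
        w += mu * residual[n] * x             # LMS coefficient update
    return echo, residual

far_end = np.sin(2 * np.pi * 0.03 * np.arange(3000))      # stand-in for speech
echo_path = np.array([0.0, 0.0, 0.3, 0.15, 0.05])          # toy hybrid leakage
echo, residual = lms_echo_canceller(far_end, echo_path)
print("echo power before:", float(np.mean(echo[-500:] ** 2)))
print("echo power after: ", float(np.mean(residual[-500:] ** 2)))
```

The contrast with echo suppression is the point: a suppressor simply gates off the return path while the far end is talking, while cancellation lets both directions flow at once with the echo removed.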

Hochfelder:

Right. That’s not going to go away.

CCITT; international telephone standards

Andrews:

In 1962 I was asked to get involved with the meetings of the CCITT, the international consultative committee on telephone and telegraph standards. Study Group Twelve of that committee had the responsibility for recommending what limitations on delay should be put on end-to-end connections. Obviously, the outcome of those deliberations would have a big impact on the whole satellite business, then in its infancy.

We finally came to a recommendation, but only after rancorous discussion of a great deal of subjective testing data. Strangely enough, people were very skeptical of any test that didn’t involve real circuits and real traffic. It seemed perfectly logical and acceptable to simulate a long satellite connection by adding artificial delay to a real cable connection. We did this on transatlantic cable circuits, and set up connections in the real environment in which satellites were likely to be used.

After the calls were completed, we called back the people that had made the calls and asked them a series of questions about the quality of the call, without telling them anything about it, not giving them any idea that there was anything different about this circuit. In the early 60's when the tests were conducted, transatlantic calls weren’t all that common. The first transatlantic cable was laid in 1956, and had only 36 channels. So there wasn’t a lot of traffic at that time. We used the call-back data as the basis for determining the subjective difference in quality between circuits with long delays and short delays.

Hochfelder:

What were the responses?

Andrews:


Audio File
MP3 Audio
(380 - andrews - clip 4.mp3)


There was a significant difference, and it became a real concern with delays that were the equivalent of two satellite hops in tandem. We were able to reach an agreement on a recommendation that one satellite link was okay, but that two were not. That affected the routing plans used in international telephony but allowed satellites to play an important role. The one-hop rule, as far as I know, still stands. Of course, the whole global voice transmission business has been overtaken by undersea fiber optic cables, with satellites playing a more specialized role in global communications. Satellites were very important for a period of time, and I feel that we were wise to allow them to be included in global voice connections, but with a limitation of one hop. In fact, it was probably the only recommendation that would have been acceptable politically.

As a result of the delay issue, I became deeply involved in the international telephone standards business. I became chairman of Study Group Twelve of CCITT and served in that capacity for many years.

One of the things that Study Group Twelve did was to define standards and methods for measuring the quality of telephone connections. A difficult challenge was to rate the telephone set itself. At that time, telephone transmitters were still based on the use of carbon granules. Carbon transmitters were low cost, efficient, and suppressed low-level background noise. At the same time they were very unstable and difficult to measure. The CCITT maintained a laboratory with a 4-5 member staff that spent much of its time performing subjective tests of telephone sets.

One of the things that Study Group Twelve tackled was replacing those subjective methods for rating telephone sets with objective measurements. Instead of using real voices for testing, a voice-like signal source was defined that could be reproduced in laboratories throughout the world. This allowed objective measurements that could be replicated. That work led to my involvement with the Communications Society.

IEEE Communications Society

Andrews:

I had been a member of the IEEE for a long time, but had not been active in any of its technical committees. My boss, J. W. Emling, felt that IEEE ought to lead the way in the objective rating of telephone sets. At his instigation I formed a new standards committee within the Wire Transmission Committee, which was part of the Technical Committee on Communications, later to become the Communications Society. We successfully developed an IEEE telephone testing standard, which we carried forward to CCITT as input to an international recommendation.

That was the beginning of my nearly 40 years of involvement with that Society at various levels, including being President in 1986 and 1987. As Technical Vice President in 1984 and 1985, I pushed two initiatives. One was to establish a technical committee on operations support systems. As a result of my work at Bell Laboratories and Bellcore after 1984, I felt that this was an activity that was sorely missing within the Communications Society. Largely at my instigation, we established a committee called Network Operations and Maintenance Systems (NOMS). That activity went on to become a very active part of the Communications Society, with its own annual conferences and workshops.

My other major initiative was to develop a focus on quality control and quality assurance of the elements that were used to build communications networks. That led to a new committee on quality, which is still active and thriving.

One of the things I particularly enjoyed as President was writing a monthly column for Communications Magazine. I tried to put emphasis on technical issues as opposed to administrative matters, and I received many positive reactions from this. None were negative to the best of my knowledge. When my term of office was over, the Board presented me with a plaque that listed the subject of every one of the columns that I had written.

Hochfelder:

That’s a nice touch.

Andrews:

Yes, it really was. Another thing we started during my tenure was taking some of our major meetings overseas. We had a meeting in 1987 in Tokyo.

Hochfelder:

Involving foreign members has been a relatively recent thing?

Andrews:

At least for the Communications Society it has. I think it really began in the mid-’80s. Since then the effort has been much more intense and much more successful.

Technical Committee on Communications

Hochfelder:

You were on the Technical Committee on Communications, I believe.

Andrews:

Yes, on the Technical Committee on Communications and on the subcommittee that developed the constitution to change the organization to a society.

Hochfelder:

So you were there from almost the beginning?

Andrews:

Yes, right.

Hochfelder:

Early ’60s?

Andrews:

I’m not quite sure of the exact timing. I think it was Dick Kirby who was the chair of that committee. You haven’t talked to him yet?

Hochfelder:

Not yet, but he’s on the list.

Andrews:

I was one of the members of the reorganizing committee. Bill Middleton was also involved. In fact, I think he may have written most of the text of the constitution. What reminded me of this is that after I served as President I was given the task of rewriting parts of the constitution. As you might expect, it is much easier to make improvements after years of experience. For example, I proposed having the term of office of the President be two years instead of one, without the need for re-election. There had not been a single instance of anyone not serving two terms. We changed the term to two years and introduced the office of president-elect, to be elected just one year before the term of office was to begin.

Also we began a much harder push toward getting members from outside the United States on the Board of Governors. It was a role of the past president to run the nominations committee. I can remember that my committee tried very hard to add enough international candidates to the slate so that one or more had to win. Now having international leadership is just a natural part of ComSoc.

Communications industry, 1980s; divestiture and Bellcore

Andrews:

This discussion has taken me almost to 1990, when I retired from Bellcore. Let's go back to the early 80's and talk about what was going on in the communications industry.

Hochfelder:

The break up of AT&T?

Andrews:

Yes, and the establishment of Bellcore.

Hochfelder:

How did that affect your career?

Andrews:


Audio File
MP3 Audio
(380 - andrews - clip 5.mp3)


It was a major change. It was early in 1982 that the tentative agreement between AT&T and the Justice Department was reached and announced. I was scheduled to appear before Judge Greene about two weeks after the announced settlement. I was in the middle of getting my testimony together to explain why it would be very detrimental to break up the Bell System. Lo and behold, management decided to do just that.

I’ve gone back and looked over the testimony that I was to present. In it I had emphasized that without end-to-end control of the network, it would be much more difficult to introduce new national services. At that time we were on the verge of introducing a number of things, including high speed data services and custom calling services, including calling number forwarding. Also we postulated a packet switching service, a sort of Internet-like capability. My argument, which I never got to present, was that it would be extremely difficult to roll out these kinds of national services if you had separate organizations responsible for the parts of the connections involved.

That view turned out to be quite correct. It was a long time before we actually accomplished any new kinds of national end-to-end connections. Just take the matter of common channel signaling. We were at the point of interconnecting local common channel signaling with long distance common channel signaling, so that you could send calling numbers all the way from New York to California. We still can’t do it even today.

Hochfelder:

I think it may work within your local or regional provider.

Andrews:

It doesn’t work beyond the regional provider because they’ve never been able to work out the agreements necessary to properly compensate carriers for their contribution to the service. ISDN eventually happened, but even that was set back a long time, perhaps fatally, by the separation of the network into local and long distance parts.

I wasn’t really looking forward to going to Washington and being grilled in front of Judge Greene. I was relieved, in a sense. At the time of divestiture, I was responsible for switching systems engineering for Bell Laboratories. We began to work out everything that had to be implemented in order to make it possible to bring about this division between multiple long distance carriers and local carriers. There had to be a complete scheme for setting up connections so that all long distance carriers would be treated in an equal way. Up until that time, there were different access arrangements depending on whether you were going to complete the call by MCI, Sprint, or AT&T. That had to be changed, and we developed a plan for setting up calls on an equal access basis. That planning work had begun before the divestiture agreement was reached, but now it had to be completed much more urgently. The plan required the development and implementation of changes in virtually all of the switching systems that had been deployed.

Hochfelder:

Sounds like quite a challenge.

Andrews:

It was a huge undertaking. The people that were involved with the planning and implementation deserve a tremendous amount of credit for having pulled it off as quickly as they did.

Hochfelder:

Yet they probably won’t get the recognition they deserve because they dismantled the system as opposed to building it.

Andrews:

Right. They were the unsung heroes, if you will.

In early 1983, Bell System management was really getting serious about setting up an organization to provide the R and D support for the seven regional telephone companies before the actual divestiture took place on January 1, 1984. As I told you earlier, John Mayo took my job when I left the digital transmission project in 1956. Now Mayo was the Executive Vice President at Bell Laboratories. He said, “Fred, we have a great opportunity for you.” He told me that Irwin Dorros, the Chief Technical Officer of the newly created Central Services Organization, wanted to talk to me about becoming a vice president and founding officer of the CSO. I learned that the job Irwin had in mind for me involved the disciplines of transmission, switching, operations systems, and quality assurance. By then I’d had some experience in each of these areas, so I thought, “Well gee, I can do this.” I took the challenge, which meant ending a thirty-five-year career with Bell Laboratories and helping to form a brand new company. It turned out to be a great experience and great fun, too.

My job was to oversee the development of requirements such that equipment meeting these requirements would fit seamlessly into the local exchange networks of the telephone companies along with equipment from other manufacturers. You don’t buy a telephone network. You buy parts of it and put it together, and all of the parts have to fit.

In this process the telephone companies were the boss. They reserved the right to accept or reject our requirements in whole or in part. They decided what they would buy and from whom they would buy it. We simply provided them with the information that they needed in order to deal intelligently with their vendors. That was a complete reversal of mindset from the days when the operating companies were expected to accept the authority of AT&T on these matters. Also, there was a lot of skepticism on the part of the vendor community. Many thought that setting up the CSO was just a new way for the telephone companies to steer their business to their favorite suppliers. Of course, that wasn’t the intention at all. The goal was to open up the market and to have a really open process allowing as many vendors as possible to offer products that would satisfy telephone company needs.

Early in the life of the CSO I was invited to provide a seminar sponsored by USITA (United States Independent Telephone Association), primarily for its vendor subcommittee. I was to talk about the role of Bellcore: What does it do? How is it doing it? The moderator was Michael Birck, a former colleague at Bell Labs and a founder of Tellabs. His company provided equipment to the independent telephone market, but increasingly in recent years to the Bell System as well. He was among the skeptics about whether my Bellcore role was a viable one.

Over the seven years that I worked for Bellcore I believe that everyone became much more comfortable with our generic requirements process. A lot of new vendors successfully entered the local telephone network business. At the same time we maintained some pretty tough standards of performance. By and large, the integrity of the network was maintained while the market became very open to all competent vendors. I take pride in helping to make the massive breakup of AT&T work, and that may well have been the high point of my career.