
The Use of Amplifiers in Fiber Optics

Most experienced network administrators, network engineers and cable technicians have a solid understanding of how fiber optics are used to transmit data. Working with fiber hands-on, however, remains a niche area reserved for those who specialize in cabling, such as professional cable contractors and select employees of cable and phone companies. For anyone interested in learning a little more about fiber beyond textbook theory, this article explains some interesting real-world fiber optic facts.

Without fiber optic amplifiers, a signal can only travel a limited distance. On average, a signal can travel approximately 124 miles through a fiber optic cable without amplification. That range is too short for wide area networks, so amplifiers are necessary for the network to function properly.
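
As a rough illustration of where a figure like that comes from, the reach of an unamplified link can be estimated with a simple power budget: the transmitter's launch power minus the receiver's sensitivity gives the loss the link can absorb, and dividing by the fiber's attenuation per kilometer gives the distance. The short Python sketch below uses assumed example values (transmit power, receiver sensitivity, attenuation, margin), not figures taken from this article, so treat the result as a ballpark only.

# Rough link-budget estimate of unamplified fiber reach.
# All values below are illustrative assumptions, not figures from this article.

def max_unamplified_span_km(tx_power_dbm, rx_sensitivity_dbm,
                            fiber_loss_db_per_km, margin_db=3.0):
    """Distance at which fiber attenuation exhausts the available power budget."""
    budget_db = tx_power_dbm - rx_sensitivity_dbm - margin_db
    return budget_db / fiber_loss_db_per_km

# Assumed example: 0 dBm transmitter, -28 dBm receiver sensitivity,
# 0.2 dB/km attenuation at 1550 nm, 3 dB of safety margin.
span_km = max_unamplified_span_km(0.0, -28.0, 0.2)
print(f"Estimated unamplified reach: {span_km:.0f} km (~{span_km * 0.621:.0f} miles)")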

Three Places Amplifiers Are Commonly Used in Fiber Optics

There are three places where amplifiers are most commonly used in fiber optic transmission. Power amplifiers boost the signal before transmission over the optical cable even begins, extending the distance the signal can cover before any further amplification is required. Line amplifiers are placed at carefully planned points along a network to maintain signal strength; it is these amplifiers that make long-distance travel possible and can keep a signal going for hundreds or even thousands of miles. Finally, preamplifiers raise the signal level at the input of optical receivers so that the signal can be detected reliably.
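
To see how the three roles fit together, it helps to walk the signal power along a multi-span link: a booster raises the launch power, each fiber span attenuates the signal, a line amplifier restores it, and a preamplifier lifts the final level at the receiver. The Python sketch below does that bookkeeping with made-up gain, loss and span figures chosen only for illustration.

# Illustrative power walk along a multi-span amplified link.
# All gains, losses and span lengths are assumed example values.

FIBER_LOSS_DB_PER_KM = 0.2   # assumed attenuation
SPAN_KM = 80                  # assumed distance between line amplifiers

def walk_link(launch_dbm, booster_gain_db, line_amp_gain_db,
              preamp_gain_db, num_spans):
    power = launch_dbm + booster_gain_db          # booster before transmission
    print(f"After booster:     {power:6.1f} dBm")
    for span in range(1, num_spans + 1):
        power -= FIBER_LOSS_DB_PER_KM * SPAN_KM   # loss across one fiber span
        print(f"End of span {span}:    {power:6.1f} dBm")
        if span < num_spans:
            power += line_amp_gain_db             # line amplifier restores level
            print(f"After line amp {span}: {power:6.1f} dBm")
    power += preamp_gain_db                        # preamplifier before receiver
    print(f"At receiver input: {power:6.1f} dBm")

walk_link(launch_dbm=0.0, booster_gain_db=10.0,
          line_amp_gain_db=16.0, preamp_gain_db=20.0, num_spans=3)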

The Design of Fiber-Optic Amplifiers

Each type of amplifier is designed differently because each serves a different purpose. Power amplifiers, for instance, are built for high output power. Preamplifiers are designed to keep noise to a minimum, because low noise at the receiver is essential for them to work properly. Line amplifiers, which must carry signals over long distances, are built to combine high gain with low noise.
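
A worked example helps show why the preamplifier's noise matters so much. For a chain of amplifiers, the classic cascaded noise-figure relation (the Friis formula) says the first stage's noise dominates while the noise of later stages is divided by the gain in front of them. The Python sketch below applies that generic relation with assumed gain and noise-figure values, not measurements of any real amplifier.

import math

def db_to_linear(db):
    return 10 ** (db / 10)

def linear_to_db(x):
    return 10 * math.log10(x)

def cascaded_noise_figure_db(stages):
    """Friis formula: stages is a list of (gain_db, noise_figure_db) tuples."""
    total_f = 0.0
    gain_product = 1.0
    for i, (gain_db, nf_db) in enumerate(stages):
        f = db_to_linear(nf_db)
        if i == 0:
            total_f = f
        else:
            total_f += (f - 1) / gain_product
        gain_product *= db_to_linear(gain_db)
    return linear_to_db(total_f)

# Assumed example: a low-noise preamplifier followed by a noisier stage,
# compared with the same two stages in the opposite order.
good = [(20.0, 4.5), (15.0, 7.0)]   # low-noise stage first
bad  = [(15.0, 7.0), (20.0, 4.5)]   # noisy stage first
print(f"Low-noise stage first: NF = {cascaded_noise_figure_db(good):.2f} dB")
print(f"Noisy stage first:     NF = {cascaded_noise_figure_db(bad):.2f} dB")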

Other Places Fiber-Optic Amplifiers Are Commonly Used

There are other places on a network where fiber optic amplifiers are installed. One typical location, beyond the three described above, is inside a switching node, where an amplifier compensates for any loss that occurs in the switch fabric.

A Short History of Fiber-Optic Amplifiers

The invention of fiber-optic amplifiers is not a recent one. Many assume it is, since the hype around fiber-optic networks has only built up over the past seven years or so. In fact, the first amplifiers, known as semiconductor optical amplifiers, were created in 1987, but they did not perform well enough to have a real impact on signal transmission and were nowhere near as advanced as today's amplifiers.

As amplifiers have improved over the years, signals can now travel over networks almost endlessly. Those improvements came only recently, because research into amplifiers had largely been abandoned for a couple of decades. Now that attention has returned to amplifiers, fiber-optic communication has improved dramatically and is becoming commonplace around the world.

Jason Kane from Orlando Florida writes about internet technology and fiber-optics from distributors like FluxLight.


SOURCES OF SERVER AND NETWORK DOWNTIME

Unplanned server and network downtime can be caused by a number of different events:

• Catastrophic server failures caused by memory, processor or motherboard failures

• Server component failures, including power supplies, fans, internal disks, disk controllers, host bus adapters and network adapters

• Software failures of the operating system, middleware or application

• Site problems such as power failures, network disruptions, fire, flooding or natural disasters

To protect critical applications, you need to take steps to guard against each of these potential sources of downtime.

Eliminating potential single points of failure is a time-tested strategy for reducing the risk of downtime and data loss. Typically, network administrators do this by introducing redundancy into the application delivery infrastructure and by automating the monitoring and correction of faults so that problems are addressed quickly as they arise. Most leading companies that adopt best practices for protecting critical applications and data also consider the potential failure of an entire site, establishing redundant systems at an alternative site to protect against site-wide disasters.
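
Rough numbers make the case for redundancy concrete: components in series multiply their availabilities, so each single point of failure drags uptime down, while a redundant pair is only down when both members fail at once. The Python sketch below uses assumed per-component availability figures purely to illustrate that arithmetic.

# Rough availability arithmetic for serial vs. redundant components.
# The availability percentages below are assumed for illustration only.

MINUTES_PER_YEAR = 365 * 24 * 60

def series(*availabilities):
    """A chain of single points of failure: all components must be up."""
    result = 1.0
    for a in availabilities:
        result *= a
    return result

def redundant_pair(a):
    """Two redundant units: the service is down only if both fail."""
    return 1 - (1 - a) ** 2

server, network, storage = 0.999, 0.9995, 0.9992   # assumed per-component uptime

single = series(server, network, storage)
paired = series(redundant_pair(server), redundant_pair(network),
                redundant_pair(storage))

for label, a in [("No redundancy", single), ("Redundant pairs", paired)]:
    downtime_min = (1 - a) * MINUTES_PER_YEAR
    print(f"{label:16s} availability = {a:.5%}, "
          f"~{downtime_min:,.0f} minutes of downtime per year")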


THE IMPACT OF NETWORK AND/OR SERVER DOWNTIME

A failure of a critical Microsoft Windows application can lead to two types of losses:

• Loss of the application service – the impact of downtime varies with the application and the business. For example, for some businesses, email can be an absolutely business-critical service that costs thousands of dollars a minute when unavailable.

• Loss of data – the potential loss of data due to an outage can have significant legal and financial impact, again depending on the specific type of application.

In determining the impact of downtime, you must understand what downtime costs your business per minute or per hour. In some cases you can determine a quantifiable cost, such as orders not taken. Other, less direct costs may include loss of reputation and customer churn.
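
One way to put a ballpark figure on this is to translate an availability target into expected downtime per year and multiply by a per-minute cost. The figures in the Python sketch below are assumptions for illustration, not industry benchmarks.

# Ballpark downtime-cost estimate from an availability target.
# Availability levels and the per-minute cost are illustrative assumptions.

MINUTES_PER_YEAR = 365 * 24 * 60
COST_PER_MINUTE = 1_000   # assumed direct cost of an outage, in dollars

for availability in (0.99, 0.999, 0.9999):
    downtime_minutes = (1 - availability) * MINUTES_PER_YEAR
    print(f"{availability:.2%} uptime -> "
          f"{downtime_minutes:,.0f} min/year of downtime, "
          f"~${downtime_minutes * COST_PER_MINUTE:,.0f} in direct cost")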

The loss of production data can also be very costly, for a variety of reasons. In a manufacturing environment, lost data can affect regulatory compliance, leading to wasted product, fines and potentially hazardous situations. For example, if a pharmaceutical company cannot produce complete records of the data collected during its manufacturing process, the FDA could force it to discard the entire batch of drugs; because it is critical to know the value of every variable when manufacturing drugs, the company could also face fines for failing to comply with FDA regulations.

Publicly traded companies may need to ensure the integrity of financial data, while financial institutions must adhere to SEC regulations for maintaining and protecting data. For monitoring and control software, data loss and downtime interrupt your ability to react to events, alarms or changes that require immediate corrective action.

The bottom line: downtime is expensive, and preventing it should be a top priority in any business operation.


