Company network security management: a case study of Grenada Electricity Services Ltd.

Abstract

Continuous analysis, design modification, and methodology development at Grenada Electricity Services Ltd. to strengthen the company's protection are central to its network security management in the highly competitive industry of the 21st century.  This dissertation is anchored in this issue and offers readers apt measures for updating and upgrading on commercially viable terms.

Considering the speed at which the electronics industry evolves, regular updating is the logical course for Grenada Electricity Services Ltd. if it is to keep pace with the latest developments.  By the same token, the paper is relevant to every IT organization.

In today's fast-changing and often risky electronic business environment, it is likewise crucial for the IT department of Grenada Electricity Services Ltd. to secure its network systems effectively, as these are the arteries of its modern business and are vital for sharing information and for communication.  They are the lifeline of the company as an organization.

At the same time, Grenada Electricity Services Ltd.'s IT managers are expected to enhance worker productivity and implement new technologies that will drive competitive advantage for the business. Otherwise, dealing with outdated systems becomes a process that is both costly and time-consuming.

This case study describes how, by adopting an organized approach to security at Grenada Electricity Services Ltd., IT managers can spend less time cleaning up messes and more time helping their organization meet its goals.  It also reviews some of the history of networking and introduces risk management, network threats, firewalls, and special-purpose secure networking devices.

Furthermore, this report contains information related to networking: the history, topology, architecture, and equipment of Grenada Electricity Services Ltd., together with advantages and drawbacks, security, and threats. Also included are other causes of network security problems arising from the software used to maintain the company's overall system (the software crisis and software maintenance), as well as a brief treatment of wireless network security (wired vs. wireless, WEP, WPA, VPN, etc.) and other issues such as the use of real-time database systems and real-time systems for accurate results.

1. -    Brief overview of information management and security

        1.1. -            Information technology from the late 19th century

Looking back at the early beginnings of information technology in the late 19th century, it started humbly as the working, operational mode of calculating machines.

Communication was not the initial aim; the purpose was merely to facilitate the calculation of large sums of numbers, and capability was limited at the time.

That lineage is reflected in International Business Machines' initial computers, which were as large as a building could hold.  Nevertheless, the idea prompted enthusiasts to enhance the machines' capabilities, which brought about the latest innovations in technology today.

           1.1.1. -       The early years of calculating

In the early 1960s, entrepreneurial companies went into the business of host computing. Even governmental institutions computerized, especially in their treasury departments.  Enterprising and creative minds saw this as an opportunity to improve computer capability.  Thus was born the midrange computer platform, more cost-effective and efficient than the initial machines.

            1.1.2. -       The personal computer breakthrough

Computers have evolved from the early abacus, papyrus, and ENIAC to today's mainframes and supercomputers that can manipulate huge amounts of information across the globe.

In today's digital world, having a digital Information System (IS) fully equipped with precise, up-to-date network security management is important in order to compete globally. Taking advantage of the technologies of the digital world gives an extra push toward a flexible and feasible business. Essential Information Systems knowledge will be boosted too, helping a company like Grenada Electricity Services Ltd. to grow and prosper further.

Included among the objectives are reaching faraway locations, offering new products and services, and reshaping jobs and workflows, all of which will profoundly change the traditional ways of doing business. As the information age advanced, the need for enhancements in computer network security management became a profound challenge.

          1.1.3. - The steady growth of the internet in Grenada Electricity Services Ltd.

The use of computers for internetworking was in fact initiated by the military as a way to share information. After the first dial-up connection, the trend became popular with the general public, and eventually with Grenada Electricity Services Ltd. At present, computers are an indispensable part of its operations: from basic calculation to the storage and manipulation of terabits of information, the computer has become its everyday machine. As the information age advanced, Grenada Electricity Services Ltd.'s need for enhanced tools such as computer network connections became indispensable.

          1.2. - Starting security considerations for Grenada Electricity Services Ltd.

Implementing a system that will maintain and manage the security of Grenada Electricity Services Ltd. is central. With the advent of the electronic age, an upgrade toward efficient record-keeping is a must. Software will be installed to replace the existing system: an electronic system maintained by the network administrator and the employees. This will make it possible for all existing payroll documents to be converted into electronic forms that employees can access via the intranet. High-speed Internet service will be provided for employees and monitored through the local ISP.

        1.3. – The influence of internet on information security of Grenada Electricity Services Ltd.

Recent developments in Internet communications have posed a critical level of danger to the information security of Grenada Electricity Services Ltd. Occurrences of cybercrime are all over the papers, along with the danger of losing highly important communications and documents of economic transactions. The present security arrangement at Grenada Electricity Services Ltd. is a centralized system, which is weak compared with a three-tier client-server system because its security protocols serve only application security services. The client case management software that will then be installed will provide the highest-level security system on the market for Grenada Electricity Services Ltd.

        1.4. – Is complete information protection possible for Grenada Electricity Services Ltd.?

Total information protection may not be achievable at the moment. The two software architectures, SSL and VPN, each have their own advantages and disadvantages in fulfilling the requirements of Grenada Electricity Services Ltd. The main concern is the security of the company: a system feasible enough to protect and secure its information files at the highest possible level and to keep them under exemplary control.

2. -       Introduction to information security

        2.1. – Background of this study

Computers have evolved from the early abacus, papyrus, and ENIAC to today's mainframe computers and supercomputers that can manipulate huge amounts of information across the globe. The use of the Internet came about after a military initiative whose initial interest was sharing information. After the first dial-up connection, however, the Internet became a very popular mode of communication among the general public. Today, computers have become the most essential part of human existence, from basic calculation to the storage and manipulation of terabits of information.

Processing information has become easier than it was a decade ago. To maintain and enhance information, the process has to be fast, efficient, and feasible. Traditional ways of processing information, such as sending mail through the post office, using the telephone, or keeping paper-based documentation, became insufficient and old-fashioned. Computers and the Internet became the solution to these demands. Information Technology (IT), the processing of information using computers and computer software, is the answer, but it must likewise be upgraded and updated to suit market needs.

The objective of converting, storing, processing, protecting, retrieving, and transmitting information faster and more easily than ever before is in place. Many IT companies implement and provide services for such systems and are trying to change the traditional ways of doing business.  For example, electronic business (e-commerce) is commerce on the Internet covering the distribution, buying, selling, marketing, and servicing of products or services. E-commerce involves electronic funds transfer, supply chain management, e-marketing, online marketing, online transaction processing, and so on, using electronic communication such as the Internet.

Companies subsequently established research centers all over the world to develop new technologies, with an emphasis on managing the process in order to deliver products that are cheap and fast while maintaining a good information management strategy.

The handling of information through documentation, email, and other paperwork, however, became a source of potential information overload for many companies. Hence, employing new, advanced technologies to manage and maintain this information overload became inevitable.

For example, many companies provide services to educational institutions by going online, decreasing the cost of infrastructure and providing free services. Global SchoolNet (GSN) [2] works with schools, universities, communities, businesses, and other organizations to develop free or low-cost programs, educating students to compete in the global workforce. Another online ICT solution, Think.com [3], is an online environment for teachers and students that is global in coverage; teachers and students can easily communicate, share ideas, and enrich classroom learning experiences through real-time discussions.

The Internet is a set of interconnected networks, a system created by huge mainframes in research establishments connected to ordinary computers in homes and offices.  It can be accessed from anywhere in the world. The Internet was first developed in the 1970s out of the US Department of Defense's communications systems by interconnecting a collection of computers in which no central computer stores huge amounts of data; rather, information is dispersed.

Today, millions of individuals, companies, programmers, consultants, researchers, and students worldwide use the Internet to share information in a faster and easier way. The Internet influences the growth of businesses by providing new, fast, and efficient ways of advertising and new and different ways to reach the public and expand their organizations.

The Internet symbolizes a sense of freedom, uncensored and unregulated by government. However, use of the Internet has its drawbacks, notably various security issues. The main security concern, data access, must be handled in such a way that the privileges given to individuals or companies are controlled and can be managed.

Web-browsing habits, such as visiting illegal or unwanted web sites, also need to be controlled; hence, governing user access to email and web sites using passwords and encryption is essential. Keeping malicious users from accessing valuable company information, and from gathering other external information that could be used against GES staff and GES management, needs to be taken into consideration. Sending emails and other attachments must be allowed according to a hierarchy of access permissions.
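
As an illustration of such a hierarchy of access permissions, the Python sketch below checks actions against ranked roles. The role names, rank values, and actions are hypothetical; GES's actual access levels are not specified in this study.

    # Minimal sketch of hierarchical access permissions (hypothetical roles).
    ROLE_RANK = {"guest": 0, "staff": 1, "manager": 2, "network_admin": 3}

    # Minimum rank required for each action (illustrative values only).
    REQUIRED_RANK = {
        "send_external_email": 1,
        "send_attachments": 2,
        "view_payroll_records": 2,
        "change_firewall_rules": 3,
    }

    def is_allowed(role: str, action: str) -> bool:
        """Return True if the role sits high enough in the hierarchy."""
        return ROLE_RANK.get(role, -1) >= REQUIRED_RANK.get(action, 99)

    print(is_allowed("staff", "send_external_email"))    # True
    print(is_allowed("staff", "change_firewall_rules"))  # False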

Use of the Internet is mostly possible through computers or other devices such as mobile phones. The need for speedy information has become an important issue in today's business, and through such devices the Internet has caused a revolution in our society. "The power members of society: celebrities, professional criminals, fashion setters, etc become much less powerful than they are in real life. The social groups of advanced computer users, geeks, nerds, dweebs, hackers have a much larger power on the Internet because of their knowledge of its mechanics." [4].

People's shopping styles are also changing because of the Internet. Selling products over the Internet is cheap, fast, and easy for both the company and the customer. Hence, issues such as privacy and ethics are highly important and need attention from governments and educational institutions. Many of the ethical issues involve privacy; for example, privacy concerning email use by employees, a company's head office, and individuals.

In the late 1960s, networks existed only in the sense of huge mainframes and multiple networked terminals. Each terminal was connected through a hub to one big central processing unit with spinning tapes and rotating drives. Today, networking is so vast and broad that disregarding its security issues would be a great loss.

Security plays a main role in functions such as client/server network models, time-sharing, and multi-user, multi-tasking processors. "It was not until the end of the 1960s and into the 1970s that the environment for network security did evolve." [12].

Confidential data transmitted over public networks must be encrypted, and the network connection must be secured. For example, no machine should be connected to networks other than the GES corporate LAN, and a firewall should mediate communication over external (public and private) networks. GES email should be encrypted so that users can send information to important clients or other users, and when opening email, users should be aware of the risks of opening documents with macros, PostScript files, etc.
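
To make the principle concrete, the hedged sketch below uses Python's third-party cryptography package (an assumption; this study does not name a specific tool) to encrypt a confidential message before it crosses a public network and to decrypt it on the other side.

    # Sketch: symmetric encryption of a confidential message before transmission.
    # Requires the third-party `cryptography` package (pip install cryptography).
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()   # shared secret; must itself be distributed securely
    cipher = Fernet(key)

    token = cipher.encrypt(b"Quarterly payroll summary")          # sent over the network
    assert cipher.decrypt(token) == b"Quarterly payroll summary"  # recoverable only with the key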

The history of network security has been delineated, leading now into some of the numerous potential threats to information on a network.  Threats to network security range from harmless pranks to devastating crimes of destruction and theft. Breaches in network security occur internally by employees and externally by hackers. “In a recent attack on the Texas A&M University computer complex, which consists of 12,000 interconnected PCs, workstations, minicomputers, mainframes, and servers, a well-organized team of hackers was able to take virtual control of the complex.” [15]

The Texas A&M attack is one of many examples of the extreme threats facing any organization. To avoid such attacks, an organization needs to be fully equipped with the latest technologies and state-of-the-art software such as antivirus products (eScan [16]). "It is often impossible or very difficult to know if you are under attack and from whom and attackers sophistication has increased enormously in the last 5-10 years." [5]

Other threats, such as viruses, have increased at an alarming rate. However, the most common causes of security problems are as stated: "Human Error 52%, Dishonest people 10%, Technical Sabotage 10%, Fire 15%, Water 10% and Terrorism 3% and many computer crimes Money theft 44%, Damage of software 16%, Theft of information 16%, Alteration of data 12%, Theft of services 10%, Trespass 2%." [5]

Wireless communications offer organizations and users many benefits, such as portability, flexibility, and lower installation costs. Wireless technologies cover a broad range of capabilities geared toward different uses and needs. Wireless local area network (WLAN) devices, for instance, allow users to move their laptops from place to place within their offices and homes without wires and without losing network connectivity. However, risk is inherent in any wireless technology.

The most significant source of risk in wireless networks is that the technology's underlying communications medium, the airwaves, is open to intruders. Unauthorized users may gain access to network systems and information, corrupt the organization's data, consume network bandwidth, and launch attacks that prevent authorized users from accessing the network.

Wireless technologies have become increasingly popular in everyday business and personal lives. Unfortunately, no computer network is truly secure. However, some networks are built and managed much more securely than others. For both wired and wireless networks alike, the real question to answer becomes - is it secure enough?

Wired LANs use Ethernet cables and network adapters, with central devices like hubs, switches, or routers to accommodate more computers. Installing Ethernet cables is difficult and very expensive because the cables must run under floors or through walls. However, wired networking is extremely reliable; the only common failure is a loose cable.

Wired LANs give fast, superior performance, providing close to 100 Mbps of bandwidth. This is sufficient for file sharing, gaming, and high-speed Internet access. As for security, wired LAN hubs and switches do not have their own firewalls, but external firewall software products can be installed.

Wireless LANs use three main Wi-Fi communication standards: 802.11b, 802.11a, and 802.11g. 802.11b was the first standard used in wireless LANs; 802.11a is a standard used in business networks because it is faster. The 802.11g standard combines 802.11b and 802.11a, making it an expensive option for home networking.

Wireless adapters and access points can be three or four times as expensive as Ethernet cable adapters, and wireless performance depends on the standard used as well as the distance covered. Wireless LANs are less secure than wired LANs because their signals travel through the air and are exposed to many types of interception.

A wireless network seems to be a good option for the company due to the difficulty of cabling the company's buildings. Since the system is being implemented in an electricity company, security is a more important issue than cost or other concerns. Wireless is easier to install and its mobility is excellent, whereas wired is more difficult to install and offers limited mobility.

An increasing number of government agencies, businesses, and home users are using wireless technologies in their environments. Many wireless security technologies can be implemented for better security, for example WEP (Wired Equivalent Privacy), WPA (Wi-Fi Protected Access), and VPNs (virtual private networks).

WEP is a security protocol for WLANs defined in the 802.11b standard. The 802.11 standard describes the communication that occurs in a WLAN. The WEP algorithm is used to protect wireless communication from eavesdropping. It relies on a secret key that is shared between a mobile station and an access point. The secret key is used to encrypt packets before they are transmitted, and an integrity check is used to ensure that packets are not modified in transit.

WEP is designed to make up for the inherent insecurity of wireless transmission as compared to wired transmission.  Here the mobile station is, for example, a laptop with a wireless Ethernet card, and the access point is a base station. The standard does not discuss how the shared key is established; in practice, most installations use a single key shared between all mobile stations and access points. This is the Wired Equivalent Privacy (WEP) encryption used in 802.11b wireless networks.
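
The mechanism just described can be made concrete with a short Python sketch: a per-packet initialization vector (IV) is prepended to the shared secret key, the combination seeds the RC4 stream cipher, and a CRC-32 integrity check value is appended to the plaintext before encryption. This is a simplified teaching model of WEP, not a secure or standards-complete implementation.

    import os, zlib

    def rc4(key: bytes, data: bytes) -> bytes:
        """RC4 stream cipher: key scheduling, then keystream XOR."""
        S = list(range(256))
        j = 0
        for i in range(256):                    # key-scheduling algorithm
            j = (j + S[i] + key[i % len(key)]) % 256
            S[i], S[j] = S[j], S[i]
        out, i, j = bytearray(), 0, 0
        for byte in data:                       # pseudo-random generation
            i = (i + 1) % 256
            j = (j + S[i]) % 256
            S[i], S[j] = S[j], S[i]
            out.append(byte ^ S[(S[i] + S[j]) % 256])
        return bytes(out)

    def wep_encrypt(secret_key: bytes, plaintext: bytes) -> bytes:
        iv = os.urandom(3)                                 # 24-bit IV, sent in the clear
        icv = zlib.crc32(plaintext).to_bytes(4, "little")  # integrity check value
        return iv + rc4(iv + secret_key, plaintext + icv)  # IV || ciphertext

    # A 40-bit (5-byte) secret plus the 24-bit IV gives the "64-bit" setting
    # discussed below.
    frame = wep_encrypt(b"\x01\x02\x03\x04\x05", b"meter reading 4711")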

Once again, the clamor rises that the IEEE is using shoddy encryption, and that they are leaving the poor consumers and users of 802.11b networks open for the foulest kind of violations.  WEP is not, nor was it ever meant to be, an industrial data security algorithm.  It was never designed to protect your data from script kiddies and more intelligent crackers, who want to discover your secrets.  It is designed to make up for the inherent insecurity in wireless transmission, as compared to wired transmission.

When you have a wireless network, all the base stations and end nodes are transmitting their packets in a sphere, regardless of where you may want them to transmit.  In general, this sphere is about three hundred feet in diameter, although external and other factors can limit or enhance this range.  So, when you imagine your wireless network, it's important not to imagine a web of lines from point to point, but rather a series of interconnected bubbles, like the foam in a bubble bath (Welch).

Some manufacturers of wireless equipment refer to 64-bit encryption as 40-bit, but in reality the two labels represent the same degree of protection: the 40-bit secret key is combined with a 24-bit initialization vector, transmitted in the clear, to form the 64-bit per-packet key.  Devices using 40-bit encryption are able to communicate with devices using 64-bit and vice versa.  Also, the security enhancement of a 128-bit key over a 64-bit key is minimal at best.  In terms of performance, there is no extra computational cost to encrypting with a 128-bit key rather than a 64-bit key, but there is a cost to transmitting the extra data over the network.  If network performance is a concern, then 64-bit encryption is recommended.

Furthermore, when a wireless network is run with no security (the OFF setting), anyone within reasonable proximity can connect to that network and use its Internet connection.

Although WEP offers some protection for wireless networks, many free tools are widely and publicly available that can break, or "crack," WEP encryption.  A potential attacker can sniff network transmissions and then use these tools to determine the WEP encryption keys.

In 2001, research teams at Berkeley and the University of Maryland published separate papers disclosing security flaws in WEP, including in its encryption algorithms.  Because of such vulnerabilities, new security technology for wireless networks had to be developed; hence, Wi-Fi Protected Access was introduced in 2003.

The original design of WEP was to provide encryption and authentication as part of the 802.11 standard.  It uses an encryption algorithm that utilizes a key, a sequence of hexadecimal digits entered by the user. With WEP, wireless clients and access points are manually configured with the same key. Rather than having the user enter complicated hexadecimal key strings by hand, WEP also introduces the concept of a passphrase.

A passphrase is chosen like a password and then entered into the system, where it is converted into a complex encryption key. When using a stronger security setting, WEP requires four passphrases to implement a key. During transmission between wireless devices, WEP switches among the four keys to make traffic more difficult to intercept.

Regarding levels of security, WEP offers three settings: OFF (no security), 64-bit (weak security), and 128-bit (stronger security). For wireless devices to communicate, they must all use the same type of encryption.

WPA (Wi-Fi Protected Access) is a Wi-Fi standard designed to improve upon the security features of WEP, providing a far greater degree of protection. WPA has two significant advantages over WEP.

First, WPA utilizes an encryption key that differs in every packet of information transferred between wireless devices. The Temporal Key Integrity Protocol (TKIP) mechanism shares a starting key between devices; each device then changes its encryption key for every packet.  This makes it extremely difficult for hackers to read messages even if they have intercepted the data.

Second, certificate authentication is used to block a hacker posing as a valid user from gaining access to the network.  A certificate authority server is part of the recommended configuration, allowing computers with WPA software to communicate with other certified computers on the network.  To run WPA between two computers, both must have WPA software, as must all access points and wireless adapters between them.  WPA computers will fall back to WEP encryption if they cannot use WPA with a particular device.

WPA encryption offers several advantages over WEP.  WPA provides extremely strong security for wireless networks.  It adds authentication to WEP's basic encryption.  WPA has backward-compatible support for WEP devices that are not upgradeable.  WPA also integrates with servers to allow administration, auditing, and logging.
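
The per-packet keying idea behind TKIP can be sketched as follows. Real TKIP uses a specific two-phase key-mixing function; in this illustrative simplification a hash stands in for that function, purely to show how a shared starting key plus a packet sequence counter yields a different encryption key for every packet.

    import hashlib

    def per_packet_key(base_key: bytes, transmitter_mac: bytes, seq: int) -> bytes:
        """Derive a fresh 128-bit key per packet (hash replaces TKIP's mixing function)."""
        material = base_key + transmitter_mac + seq.to_bytes(6, "big")
        return hashlib.sha256(material).digest()[:16]

    base = b"shared-starting-key"
    mac = bytes.fromhex("00a0c9141516")   # transmitter address, example value
    k1 = per_packet_key(base, mac, 1)
    k2 = per_packet_key(base, mac, 2)
    assert k1 != k2                       # every packet uses a different key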

Moreover, larger networks almost always contain devices that are not WPA-upgradeable, such as older network interface cards and access points (in which case those devices use WEP encryption or none at all), so the network is still vulnerable to attacks.

Although WPA is a powerful tool for network security, it does have its drawbacks.  It can be complicated to set up, which makes it unsuitable for home users.  In most cases it requires firmware upgrades for existing products, and older firmware will usually not be upgraded to support it.  It is also not compatible with older operating systems such as Windows 95. And since WPA adds to packet size, transmission between devices takes longer.

The encryption and decryption software is generally slower, and some performance is lost. Despite the problems with WPA, new solutions are being developed to address the issues of wireless security, and one of those solutions is the VPN.

A virtual private network (VPN) is a network constructed by using public wires to connect nodes. A VPN enables a specific group of users to access private network data and resources securely over the Internet or other networks.  It is called "virtual" because it runs over a public network while inheriting the characteristics of a private network.

A VPN can make two main types of connection: between an individual machine and a private network (a client-to-server connection), and between a remote local area network and a private network (a server-to-server connection).  A routed network, a tunnel switch, and tunnel terminators are all needed to make up a VPN.

A routed network is needed to transport encrypted data packets. A tunnel switch is used to increase security and versatility.  Lastly, tunnel terminators act as virtual cable terminators, cutting off and restricting users' access to the network. A VPN is characterized by its concurrent use of tunneling, encryption, authentication, and access control over a public network.
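
The tunneling component can be illustrated in Python: the original (inner) packet is encrypted in its entirety and then wrapped in a new outer header addressed to the tunnel endpoint, so routers on the public network see only the outer addressing. The header layout here is invented for illustration, and the third-party cryptography package is assumed.

    # Illustrative VPN tunneling: encrypt the inner packet, wrap in an outer header.
    from cryptography.fernet import Fernet

    tunnel_key = Fernet.generate_key()   # negotiated between the two tunnel endpoints
    tunnel = Fernet(tunnel_key)

    def encapsulate(inner_packet: bytes, outer_src: bytes, outer_dst: bytes) -> bytes:
        """Encrypt the whole inner packet and prepend a cleartext outer header."""
        return outer_src + outer_dst + tunnel.encrypt(inner_packet)

    def decapsulate(frame: bytes) -> bytes:
        """The tunnel terminator strips the outer header and decrypts the payload."""
        return tunnel.decrypt(frame[8:])   # two 4-byte addresses in this sketch

    inner = b"SRC=10.0.0.5 DST=10.0.0.9 payload=meter-data"
    frame = encapsulate(inner, b"\xc0\xa8\x01\x01", b"\xc0\xa8\x02\x01")
    assert decapsulate(frame) == inner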

Whenever information is transmitted across the Internet, whether e-mail or shared files, security is always a main concern. What most people are starting to realize is that whenever data packets and files travel on a publicly shared network like the Internet, they are potential targets for malicious attackers. The only things that can make a VPN secure are solutions that integrate several mechanisms, from software to additional hardware devices.

VPN security is based mainly on two techniques: encryption and authentication.  Encryption is used to ensure data integrity and privacy. Authentication is used to verify that users have the right to access the private network, and which data they may access.  To provide strong encryption for a safe and secure VPN, one must first consider the two mechanisms that guarantee data confidentiality. The encryption algorithm provides the mathematical rules that convert the plaintext message into a random ciphertext message.

The algorithm provides the steps for converting the plaintext message using an encryption key, a combination of alphanumeric data that introduces the random element into the ciphertext message. The longer the secret encryption key, the longer it takes an attacker to test all possible values of the key.
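
The effect of key length can be made concrete with a little arithmetic. Assuming, purely for illustration, an attacker who can test one billion keys per second, the Python sketch below compares exhaustive-search times for several key sizes.

    # Brute-force search time as a function of key length (assumed 1e9 trials/sec).
    TRIALS_PER_SECOND = 1e9
    SECONDS_PER_YEAR = 365 * 24 * 3600

    for bits in (40, 56, 64, 128):
        keys = 2 ** bits
        years = keys / TRIALS_PER_SECOND / SECONDS_PER_YEAR
        print(f"{bits:3d}-bit key: {keys:.3e} keys, about {years:.3e} years to try all")

    # At this rate a 40-bit key falls in minutes, while a 128-bit key would take
    # around 1e22 years, vastly longer than the age of the universe.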

VPNs are considered highly secure because they layer multiple security mechanisms.  But how secure is secure enough?  Even the most popular VPN protocols have been shown to be vulnerable to certain types of attack.  The only things that can make a VPN, if not fully secure, at least secure enough, are VPN solutions that incorporate several mechanisms, from software patches to additional hardware devices and security standards.

As far as security standards are concerned, secure VPNs apply additional security protocols either to the network "tunnel" or to the data being transmitted in the tunnel. Most network systems include local authentication, which looks up user information in a database stored on the VPN device or VPN management station, but you should probably also provide a gateway to some external authentication database to avoid creating yet another password for users to forget.

In the absence of an official standard, a consensus has been built around Remote Authentication Dial-In User Service (RADIUS) as the best network-based authentication gateway available.  RADIUS servers add a layer of security to Windows-based security policies, as well as to other network systems using tokens or smart cards such as SecurID or Cryptocard.  All of that makes for a safer, more secure VPN.

While VPNs are fairly stable, they do have their pros and cons. VPNs enable secure broadband connections through cable modems and DSL, and they make it easy to manage T1 lines, phone and data lines, and remote access terminals.  A VPN can create significant communication savings, particularly when many remote users dial in from outside the local calling area. On the other hand, a VPN may provide less bandwidth than direct lines, and being mostly Internet-based, it depends on its connections staying up: if your ISP is down, so is your VPN.  Emergency dial-in access may be used as a limited, temporary backup.  These are just a few things to consider before deciding on a VPN.

The SSL protocol deals with encryption and authentication, helping to secure information transactions between client and server. It does not provide network-layer security services; rather, SSL is used for application security services. It is built into most Internet browsers, web servers, and e-mail applications to provide data encryption, authentication, and message integrity. Since no client software is required, anyone with proper authorization can access information from anywhere using nothing more than a browser.

The main task of SSL is to provide security by establishing a private session key, which in turn helps to protect against attackers such as hackers and crackers who eavesdrop on data and steal passwords and other valuable information.
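
Because SSL/TLS support is built into standard libraries, a client needs very little code to obtain an encrypted, authenticated channel. The sketch below uses Python's standard-library ssl module to wrap an ordinary TCP socket; the host name is a placeholder.

    # Sketch: opening an SSL/TLS-protected connection with the standard library.
    import socket
    import ssl

    context = ssl.create_default_context()   # loads trusted CA certificates

    hostname = "www.example.com"             # placeholder server name
    with socket.create_connection((hostname, 443)) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname=hostname) as tls_sock:
            # The handshake has authenticated the server and negotiated session
            # keys; everything sent from here on is encrypted.
            print(tls_sock.version())        # e.g. 'TLSv1.3'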

The IPsec protocol protects IP traffic at the network layer through encryption, authentication, confidentiality, data integrity, anti-replay protection, and so on. It creates a passageway through which one VPN server can securely communicate with another, and it secures all traffic between the devices and applications such as e-mail, databases, etc.

Unlike SSL, IPsec requires special client software to be installed. In return, users can access information remotely with the same privileges as if they were directly connected to the enterprise LAN.

A VPN provides fast, easy remote accessibility and very secure connections, similar to a local LAN connection, while SSL provides casual or on-demand access to applications. A VPN can offer secure network access using standardized client software that is managed, configured, and maintained by the company's IT department, whereas SSL's advantage is to allow access from any browser or application with embedded SSL capabilities.

A VPN has stronger security than SSL because it can control user access and maintain proper security measures when confidentiality is the issue. SSL is easier to install than a VPN, but a VPN is more secure and flexible.

A firewall is a tool used to enhance the security of computers connected to a network. When selecting a firewall, one must take into account factors such as ease of installation and configuration; reporting of attacks by time, location, and type; maintenance and monitoring requirements; and so on. The firewall techniques that should be used in companies are packet filtering, stateful packet inspection, application-level proxying, and network address translation (NAT).
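
To indicate what the simplest of these techniques, packet filtering, amounts to, the sketch below evaluates each packet against an ordered rule list with a default-deny policy. The rules shown are invented examples, not GES policy.

    # Sketch of a packet filter: first matching rule wins, default is deny.
    # Rule fields: (source prefix, destination port, action) - invented examples.
    RULES = [
        ("192.168.",  443, "allow"),   # corporate LAN may reach HTTPS
        ("192.168.",   25, "allow"),   # corporate LAN may reach SMTP
        ("",          135, "deny"),    # block RPC traffic from anywhere
    ]

    def filter_packet(src_ip: str, dst_port: int) -> str:
        for prefix, port, action in RULES:
            if src_ip.startswith(prefix) and dst_port == port:
                return action
        return "deny"                  # default-deny policy

    print(filter_packet("192.168.1.10", 443))   # allow
    print(filter_packet("203.0.113.7", 443))    # deny (no matching rule)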

A software package installed on a server operating system that acts as a full-fledged firewall is called a software-based firewall. As a firewall, it helps to protect applications such as web applications and email servers by using complex filters. For example, Check Point Integrity Secure Client [8], priced at $1,569.59, is suitable for any company because it provides advanced remote access connectivity, endpoint protection, and network access policy enforcement; Check Point FireWall-1 GX [9], which costs $73,520.63, is also appropriate for a company.

A hardware firewall is a hardware device, including network routers with additional firewall capabilities, designed to manage large amounts of network traffic. Hardware firewalls are used in combination with software firewalls: the hardware firewall filters the traffic, and the software firewall examines it. For example, the Nokia IP1220 Security Platform [7], which delivers long-term, high-performance perimeter security, is perfectly feasible for the company at a price of about $16,901.55, and the Check Point UTM-1 Model 450 [10], which costs $5,613.26, can also be very useful for any company.

Computer programs that help to track and eliminate viruses are very important for any company. They use two different techniques to accomplish this task: scanning every file and comparing it against a virus dictionary, or identifying suspicious behavior from any computer program, which may include data capture, port monitoring, and other methods.

Today there are many commercial antivirus products, such as Norton, McAfee, eScan, and Kaspersky. However, a company such as GES needs to be aware that antivirus software can considerably reduce performance, and that disabling antivirus protection recovers the performance but increases the risk of infection. Another issue is never to install more than one antivirus product, which can have a devastating effect on a computer.  It is also safer to scan for viruses in Windows Safe Mode and to keep the antivirus disabled during major operating system updates.
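
The first technique, dictionary scanning, can be sketched in a few lines: each file is hashed and the hash compared against a dictionary of known-malware signatures. Real products also match byte patterns inside files; the hash-only version below, with a hypothetical one-entry dictionary, is a deliberate simplification.

    # Simplified signature scan: compare file hashes against known-malware entries.
    import hashlib
    from pathlib import Path

    # Hypothetical signature dictionary; real products ship millions of entries.
    VIRUS_DICTIONARY = {
        "275a021bbfb6489e54d471899f7db9d1663fc695ec2fe2a2c4538aabf651fd0f": "EICAR-Test-File",
    }

    def scan_file(path: Path) -> str | None:
        """Return the malware name if the file's SHA-256 is in the dictionary."""
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        return VIRUS_DICTIONARY.get(digest)

    for file in Path(".").iterdir():
        if file.is_file() and (hit := scan_file(file)):
            print(f"Infected: {file} -> {hit}")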

The systems analyst and system users need to be aware that physical security plays an important role in the overall protection of GES networks. Disaster can strike GES at any time, so all PCs and hardware should be properly secured against events such as building destruction or extreme natural disasters. Most of the problems GES can face, however, concern application security. Therefore, all types of passwords, such as screen-saver passwords, BIOS passwords, and other software passwords, should be implemented properly.

Every GES system should be properly secured using passwords and other means of restricting access. The OS should be installed from the server on all computers connected to the network. As for file sharing, GES must restrict access to important files from unauthorized personnel. Use of P2P software must be restricted, and all output devices on every PC, such as CD drives and USB ports, should be blocked.
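
Passwords protecting GES systems should never be stored in plain text. A minimal sketch of the standard defense, salted key stretching with PBKDF2 from Python's standard library, follows; the iteration count is an illustrative choice.

    # Sketch: storing and checking passwords with salted PBKDF2 (standard library).
    import hashlib, hmac, os

    ITERATIONS = 600_000                     # illustrative work factor

    def hash_password(password: str) -> tuple[bytes, bytes]:
        salt = os.urandom(16)                # unique per user
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
        return salt, digest                  # store both, never the password

    def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
        return hmac.compare_digest(candidate, digest)   # constant-time compare

    salt, stored = hash_password("correct horse battery staple")
    assert verify_password("correct horse battery staple", salt, stored)
    assert not verify_password("guess", salt, stored)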

Best practice is an effective means of controlling the implementation of IT projects. It is important to apply best practice to IT projects such as software development.

Today, software exists in almost every imaginable device; more and more products are enhanced with software embedded in small-scale computers. As software grows more important, the need for quality in our software keeps increasing. A large software development project requires a team of programmers rather than a single individual.

Software products are not without their problems; some systems do not function as expected or do not function at all. Network systems are increasingly software-intensive, because software is replacing functionality once performed by people and hardware. Hence, getting developed software to work properly the first time is becoming more and more essential. Apart from the cost of developing the software, developers face serious challenges when building large-scale software.

Remi H. Bourgonjon, director of software technology at Philips Research Laboratory in Eindhoven, states that "The amount of code in most consumer products is doubling every two years" [18]. He also noted that 500 Kb of software would be implemented in a television, 2 Kb in an electric shaver, and 30,000 lines of code in General Motors cars. If software fails, it can affect us profoundly, both financially and in our daily lives.

Companies need large-scale software that is delivered on time, of good quality, and cost-effective. The basic problem developers face today is coping with technological change in both software and hardware. Today's software interacts ever more closely with the real-world environment; the challenging part is to create totally error- and bug-free software.

Many developers state that we are in the middle of a 'software crisis': the feeling of working in crisis mode never leaves us, and despite our hard work to make a difference, the basic methods used in software development have not changed at all.

Two common challenges feed the software crisis: information and uncertainty. Information: very soon it will be technically feasible to put everything we have online and remember it forever, but making use of that information is the main issue developers will face when building software. Uncertainty: interacting with the physical world necessarily involves dealing with uncertainty.

Computers that use software to interact with the real world will be so important that developers will face a tremendous number of problems developing the software. Hardware is becoming cheaper and more complex, making it hard for software to keep up. The models used to develop software could become obsolete, and new methods of running software projects will have to be invented.

According to Moore's law, the average PC will reach 100 gigahertz and will have hard-disk capacity measured in terabytes. "Processors will be smaller and faster and a lot more of them hiding in everything from cars to house-hold appliances" [19]. The problem, then, is for the developer to take advantage of all this improvement using traditional methods. The quality of the software development tools programmers use today is definitely better than it was thirty years ago, but the question is whether programmers will be able to use the same tools in the future. Unfortunately, they might need something more.

“Today's tools support a style of programming that is at odds with the needs of the types of applications that will soon dominate. They provide good support for the development of sequentially executing code, but have only limited support for multi-threading and concurrency” [19].

Today it is impossible to test and debug a distributed system, but it might not be so in the future. For example, "At Bell Labs, for instance, we are working on a tool called 'Spin' that is designed to help the programmer to systematically and reproducibly find the inevitable bugs in distributed system applications" [19].

The term software crisis was coined in the early days of the software engineering field. It described the consequences of the rapid increase in computational power and in the complexity of the problems that could now be tackled, and it refers to the difficulty of writing correct, understandable, and verifiable computer programs. The consequences of the software crisis were projects running over budget and over time, low quality, software that became unmanageable, and software that did not even meet its requirements.

Today, in the 21st century, software is everywhere imaginable: when you check out at the grocery store, use a credit card, drive your car, or listen to music on your new MP3 player, to name a few. How can this software-intensive era exist if the problems of the 1970s crisis are still valid? Software development has changed since its early days; awareness of the software crisis has forced engineers to address the problems through various processes and methodologies, and the industry is realizing that formal software processes lead to better products with higher quality and reduced costs. And yet the hallmarks of the software crisis are still here.

Software projects still run over budget; projects are late, contain large numbers of errors, and are completed against the wrong requirements. The software crisis of the 1970s has evolved hand in hand with the software engineering field. Software products can sometimes be very vague: as Mike Wooldridge says, "It's hard to claim a Bridge is 90% complete if there is not 90% of the bridge there" [6], but it is easy to claim that a software project is 90% complete even though there is no visible outcome of the product.

The problems are many and varied, and there seems to be no conclusive pattern, process, or testing regime that will solve quality issues and time to market, although it is clear that without processes any large-scale project will almost surely fail. Even though not all software projects fail, most do, and it can be hard to see why, given all the supposedly great tools and techniques that should work if applied correctly. Perhaps that is exactly what is not done: using the techniques and tools in the wrong way leads toward sure failure.

The crisis has been studied by many researchers over the decades, and their work has given us today's myriad models, tools, and practices; software engineering researchers should be encouraged to come up with more unified models that can become a standard for both large-scale and small-scale industry projects.

Owning and maintaining software can be very tedious and expensive for customer and company alike. "The cost of owning and maintaining software in the 1980's was twice as expensive as developing the software. The costs of ownership and maintenance have increased by 30% in the 1990's" [20]. In 1995, statistics showed that half of surveyed development projects were operational but were not considered successful. "The number of defect or problem arrivals is largely determined by the development process before the maintenance phase" [21].

“Software maintenance and evolution are characterized by their huge cost and slow speed of implementation. Yet they are inevitable activities - almost all software that is useful and successful stimulates user-generated requests for change and improvements” [22]. The cost of maintaining software rose from 1970 to 1990 for many companies, and the following figures show the scale of the increase: “$30 billion spent each year on maintenance ($20 billion in the United States). 50% of the data processing budgets going to maintenance and 50-80% of the time of an estimated one million programmers or programming managers spent on maintenance. In the UK, $1 billion spent annually on software maintenance and also maintenance was 49% of the total data processing budget in 1979 and 65% in 1986” [23]. “Normally maintenance costs are between two and four times the costs of development, but can be as high as 130 times development costs” [24].

To build reliable and successful applications, developers must adopt component-based technologies. A component is a reusable object or program that performs a specific function and communicates with other components and applications. Today, this technology is used everywhere to cut costs and speed up development while keeping application quality assured.

Component-based development builds software applications by connecting multiple components that are produced independently. For this communication to take place, there must be an interface between the components that supports developing the overall application as well as reusing the components in future applications. It is the compatibility between component interfaces, however, that determines the success of a component's implementation.

To be precise, an interface consists of a number of operations that can be invoked by the user. These operations play a vast role both in implementing the interface and in its use by the client or user. Today, software development is still more craft than science, and more and more talented, skilled software developers are emerging and turning it into a vast industry.
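
For instance, a component interface can be expressed as an abstract set of operations that any conforming implementation must supply. The Python sketch below, with invented operation names, shows client code depending only on the interface, so independently produced components can be swapped in.

    # Sketch: a component interface as an abstract set of operations.
    from abc import ABC, abstractmethod

    class BillingComponent(ABC):
        """Interface: the operations every billing component must offer."""
        @abstractmethod
        def calculate_charge(self, kwh_used: float) -> float: ...
        @abstractmethod
        def format_invoice(self, charge: float) -> str: ...

    class FlatRateBilling(BillingComponent):      # one independent implementation
        def calculate_charge(self, kwh_used: float) -> float:
            return kwh_used * 0.30
        def format_invoice(self, charge: float) -> str:
            return f"Amount due: ${charge:.2f}"

    def run_billing(component: BillingComponent, kwh: float) -> str:
        """Client code depends only on the interface, not the implementation."""
        return component.format_invoice(component.calculate_charge(kwh))

    print(run_billing(FlatRateBilling(), 120.0))  # Amount due: $36.00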

Currently, more and more people rely on software and hardware as opposed to traditional means of earning a living. However, the gap between software technology and hardware technology makes it increasingly difficult for software to take advantage of hardware improvements. To make matters worse, the attitude that "errors are inevitable", once acceptable among software developers, is unacceptable today because software is becoming ever more embedded in consumer electronics.

Getting software to work properly the first time is hard for many developers, leading both developers and companies into expensive testing. Owning and maintaining software in the 1980s was very expensive compared with developing it, and those costs increased by 30% in the 1990s. Statistics show that half of the development projects surveyed in 1995 were operational rather than successful, and most large software products that were operational did not meet customer requirements. Fortunately, awareness of the software crisis has galvanized developers and owners worldwide.

Software industry leaders are beginning to realize that following a formal software process leads to better products, with higher quality and reduced cost. After the specification phase comes the phase in which you devise your software solution in terms of a model: the design phase. When designing a solution to a large or complex system, it may be difficult to embrace the whole system at once; the usual practice is to divide the problem into smaller modules or sub-problems that are more manageable and easier to understand.

In addition to the design phase, it is good practice to create a risk analysis. No software development is without risks, the risks being that the software will not meet requirements, deadlines, cost constraints, or something else. While some risks can be eliminated and some cannot, it is always good to know about them. It should be clear to anyone that a risk eliminated early in the project has the potential to greatly reduce costs. After the design phase it is also sound to verify the system design.

In every organization there may be a primary sponsor, a project manager, and a project team. Other key members include the analyst, the client, the client project manager, and the designer. To develop projects, detailed planning is needed, which can take at least 50% of the development time; the bigger the project, the more thoroughly one should develop its outlines.

Analyst: To develop properly, the analyst must establish the requirements of the project as needed by the clients, such as the structure of the web applications. These must be properly documented before any development and implementation of the web application, as well as for future reference.

Designer: Understands the business requirements and creates a design solution to those requirements in order to meet the business needs. The designers determine the best approach for the web application, such as which browsers to focus on when determining the best design. Designers must also determine the best technology to use to create an optimal solution for the customers as well as the business, for example by selecting an overall model and framework for the solution.

Project Manager: This person has the authority to manage the project, including the development of all project deliverables and meeting the deadlines for each module. The PM should know the requirements of the programmers as well as of the overall project, for example requirements such as the use of different tools and the latest technologies, and should provide sufficient information to re-evaluate the design and development issues of the project.

Project Team: The team mostly contains the programmers who will do the coding of the application, and may also include the analysts and designers. Responsibilities include understanding, assigning, and accomplishing work within budget, timeline, and quality expectations. The team must also communicate with the project manager about scope changes, risks, quality concerns, and so on.

Network Administrator: To properly manage and maintain the GES network.

Hardware Administrator: To properly install and secure the company's hardware.

System Administrator: To administer the company's systems.

The technologies now available for the advancement of communication systems are very powerful.  They are also vulnerable to all sorts of new problems that any company can face.  Risk management is therefore critical to success, because companies are implementing distributed computing architectures using high-speed substation local area networks and process bus technology.

Several issues within a company can lead to cost growth or other problems on development projects, such as working on a project with an inefficient budget or schedule, or making company decisions before understanding the relationships between cost, performance, schedule, and risk. To mitigate as much risk as possible, both risk management and risk assessment must be taken into consideration.

To complement such trends, the objective of implementing a system that will maintain and manage the security issues of Grenada Electricity Services Ltd. is logical. Software is available that can be installed to replace the existing system: an electronic system maintained by the network administrator and the employees. All existing payroll documents can be converted to electronic forms that employees can access using the intranet. High-speed Internet service can be provided for employees and monitored through the local ISP.

One of the most important objectives of the plan is to maintain robust security throughout the company while doing business. The client case management software to be installed will be strongly secured for Grenada Electricity Services Ltd. The two software architectures, SSL and VPN, both have their advantages and disadvantages in fulfilling the company's requirements; the main concern, however, is the security of the company, where a system is feasible enough to protect and control the information.  This is anticipated to meet present developments in network security standards.

Authentication: The act of verifying the identity of a terminal, workstation, originator, or individual, to determine that entity's right to access specific categories of information. Also a measure designed to protect against fraudulent transmission by verifying the validity of a transmission, message, station, or originator.

Authorization: The granting of a user's right of access to a terminal, transaction, program, or process. Authorization is generally used in conjunction with the concept of authentication. Once a user has been authenticated, he or she may then be authorized for different types of access within a process, transaction, or program.

Availability: The condition that a given information resource will be accessible to authorized persons or programs in a timely way, in the form needed, and that the information it provides will be of acceptable integrity.

Backup Procedures: A documented plan, to be used for the recovery of data, software, and hardware following a system failure or other disaster resulting in the loss of systems, services, and/or data.

Classification: Separation of information into two or more categories, each having different protective requirements.

Computer Security: The protection of a computing system against internal failure, human error, attack, and natural catastrophe, with the goal of preventing improper disclosure, modification, destruction of information, or the denial of service.

Confidentiality: A condition ensuring that information is not made available or disclosed to unauthorized individuals, entities, or processes, reflecting the degree to which sensitive data, about both individuals and organizations, must be protected.

Contingency Plan: A documented strategy for emergency response, backup operations, and post-disaster recovery, maintained as a part of the unit security program. The contingency plan ensures the continued availability of critical resources and facilitates the continuity of operations in an emergency situation.

Custodian: A person or entity designated to have access to, and possession of, authorized information. The Custodian is responsible for providing proper protection, maintenance, and usage control of the information in an operational environment.

Data: A representation of information, knowledge, facts, concepts, or instructions that is being prepared or has been prepared in a formalized manner and is intended to be stored or processed, is being stored or processed, or has been stored or processed in a computer. Data may take many forms, including but not limited to, computer printouts, magnetic or optical storage media, and punch cards or it may be stored internally in computer memory.

Database: An organized file of related data.

Data Dictionary: A file, document, or listing that defines all items or processes represented in a data flow diagram or used in a system.

Data Integrity: The assurance that computerized data is the same as its source document form -- that is, that it has not been exposed to accidental or malicious modification, alteration, or destruction.

Denial of Service: The prevention of authorized access to resources.  It is also the unauthorized delay of time-sensitive operations.

Dial-up Access: Access to a computer system using telephone lines and a modem.

Distributed Processing: A form of system-to-system communication in which an application is divided across two or more systems in a fixed design. The decision as to where work will be done is made during the design stage and is bound during implementation. The system itself is designed for multiple users and provides each user with a fully functional computer. Distributed processing is designed to facilitate communications among the linked computers and shared access to central files. In personal computer environments, distributed processing takes the form of local area networks (LANs).

Electronic Data Interchange (EDI): The interchange of electronic business documents (e.g., purchase orders, invoices) among multiple organizations' computers (e.g., from the Grenada Electricity Services Ltd. to a bank or a vendor).

Encryption: The transformation of data by cryptographic techniques to produce cipher text.

Event Logging: A collection of information, generally of machine and user activities, that tracks a sequence of electronic transactions.

Federal Privacy Act: A federal law that allows individuals to request and view any information on file which pertains to them, and to know how that information is being used by government agencies and/or their contractors.

File Protection: The processes and procedures established to inhibit unauthorized access to, and contamination or elimination of, a file.

Identification: The recognition of users or resources as those previously described to a computing system, generally by using unique, machine-readable names.

Information Asset: Valuable or sensitive information in any form (i.e., written, verbal, oral, or electronic.)

Information Security: The practice of protecting information from accidental or malicious modification, destruction, disclosure, or denial of service. Similar to Data Security, Information Security implies a broader scope, encompassing both electronic and traditional, non-electronic forms of information.

Integrity: The condition of maintaining data, processes, or information resources in such a way that they are not improperly altered or destroyed.

Least Privilege: An access principle requiring that each user be granted only the most restrictive set of privileges needed to perform authorized tasks.