Jim Lloyd's Computer & Technical

JCLloyd.com

Introduction:

The majority of people reading this page are undoubtedly younger than I am, and most likely have been exposed to a more advanced environment. This is just a product of technology over time. What I learned in college some 30-plus years ago is most likely common knowledge or now taught as a high school prerequisite. When I was in school, the slide rule had just been abandoned for the calculator, which we were only allowed to use to cross-check our work. And, I learned to program on a breadboard with a Motorola 6800 processor and a hexadecimal display panel. The personal computer was still in its infancy. I bought my first personal computer in 1983… a Commodore 64. I almost bought an Atari 800, but Commodore dropped their price. That decision ultimately killed the company as we knew it, but that is another topic for a business section. Commodore is still in business, filling a small niche of followers. If you are interested, check out their web site: Commodore USA Home Page.

Since I began working in 1981, I have gone through the slow process of learning things on my own. I started out in field service repairing computer peripherals, along with analog bench technician work. By the late 1980s, the IBM PC had made major changes in how businesses used computers. Although still technical, it really became a business appliance, more like a technical typewriter, and eventually a really cool spreadsheet calculator. It wasn't until the mid-1990s that computer graphics really matured and Microsoft Windows began to dominate the business world. I have nothing against Apple's Macintosh, as I have used both. My work environment is dominated by Microsoft operating systems and applications, so that is the majority of my work.

In 2004, I tossed caution to the wind, had my mid-life crisis and moved to the Big Bear Valley, without a job prospect. I spent about 18 months trying to get a PC business off the ground. I studied Server 2003, got pretty good at making web pages, and had a halfway decent go of it. The PC repair side was where the money was, but not where my heart was.


One could probably cram my last 30 years of experience into less than a few semesters of college, today. And, that is how technology and other things in life work. One generation goes through the longer process of discovery in learning what the following generations are taught, or just pick up as common knowledge. Nonetheless, I hope some of my experiences documented here will prove useful in some way to at least some. That all being typed, here are some of my mental ramblings on Computer Technology…

Network Security:

This is the most recent part of my computer technology life. It has been a logical progression from tinkering to repairing, to programming, to configuring. Once I mastered PCs, I moved on to servers. I had a great job at a software development company where I progressed into the server world. I started with NT4 and was introduced to Windows 2000. I had become a Novell NetWare Engineer in 1994, but I hired into a Microsoft-centric company in 1995 and never really looked back.

I found a job as a network administrator, and began getting the network environment into shape. Since mid-2006, I have focused my efforts on 'security'. That is really a broad term, but in general I am talking about restricting a domain environment to limit user abilities on a network, the capabilities of a user account on a specific computer, and network devices. There are other items, such as backups, that are only important when something goes terribly wrong. Then, it is the difference between lost intellectual property and jobs, possibly even the ability for a company to remain viable. Read on…

Firewalls & Routers

Depending on the size of your environment, these may be the same device. Home and Small Office users are familiar with those name-brand routers… Linksys/Cisco, Netgear, D-Link and Belkin. Because they perform Network Address Translation (NAT), presenting a different internal (Local Area Network, or LAN) IP address scheme than what is on the outside (Wide Area Network, or WAN), they inherently provide basic firewall capabilities. The important thing is that they keep outsiders from probing into your network of devices and possibly gaining access to them.
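To make the inside/outside split concrete, here is a minimal Python sketch (the addresses are just examples) that checks whether an address falls inside the private ranges a NAT router typically hands out on the LAN side:

    import ipaddress

    # RFC 1918 private ranges (10.x.x.x, 172.16-31.x.x, 192.168.x.x) are what a
    # home or small-office router typically hands out on the LAN side; the WAN
    # side gets a public address from the ISP.
    for addr in ["192.168.1.10", "10.0.5.20", "8.8.8.8"]:
        ip = ipaddress.ip_address(addr)
        side = "LAN (private)" if ip.is_private else "WAN (public)"
        print(addr, "->", side)

Hosts out on the Internet cannot reach those private addresses directly, which is the 'free' firewall effect NAT gives you.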

Firewalls are designed for the point where your network accesses the Internet. Larger ones provide filtering for such things as restricting access to bad sites (a feature set also called Unified Threat Management, or UTM) and filtering traffic that needs to come into your network (e.g., to a mail or web server). Some offer VPN tunnel connectivity for remotely connecting back into the internal network from the outside. These features range from very basic and insecure to complex, with multiple authentication protocols for enhanced security.

Routers, in general, are designed for interconnecting multiple sites, either through a direct connection or a secured route over an Internet connection. These can be wired or wireless, and should include an encryption protocol to mask the data being transferred so that it cannot be read if intercepted. VPN tunnels are usually utilized, along with Network Address Translation, which provides at least a basic firewall boundary.

Smart Network Switches

Not all network switches are the same. There are capability categories, called “Layers”, based on the OSI model for network communications. A “Layer 1” network switch just forwards traffic, and usually has multi-path channels for multiple simultaneous paths. Other than that, these switches do not log or control anything.

“Smart” switches are “Layer 2” or “Layer 3”, inclusive of the lower layers. These switches provide levels of control and reporting that can segment and isolate traffic (Virtual Local Area Network, or VLAN), and allow exceptions for limited connectivity between them. I personally use Enterasys network switches, which add profile management. This level of functionality is similar to having a firewall on every network switch port, with profile definitions and network switch port groupings.

Traffic modeling is a major form of network security that occurs before taking into account workstation or server settings.

Windows Active Directory & Group Policy

The crux of a Windows Domain Environment… A Windows Network Administrator lives in these two environments. Active Directory defines every ‘object’ that exists in the Domain.

Active Directory Users and Computers is the place to start. Learn how to create users, computers, and groups. In each object, specific settings can be assigned, including unique logon scripts and network permissions for User objects. Groups are ways to organize objects to apply permissions. The most popular use is to determine the level of access, if any, to directories and network resources. The second is email distribution groups. This is the foundation of creating a secure network. You can start by searching through Microsoft's immense array of material. To get you started, here is a TechNet link… http://technet.microsoft.com/en-us/library/bb727067.aspx .
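If you ever want to script these objects rather than click through the console, the same directory can be reached over LDAP. Here is a rough Python sketch using the third-party ldap3 library; the server name, OU paths, and account names are made up for illustration, and in real life setting a password and enabling the account would also require an LDAPS (SSL) connection:

    from ldap3 import Server, Connection, MODIFY_ADD

    # Hypothetical domain controller and admin credentials.
    server = Server("dc01.example.local")
    conn = Connection(server, user="EXAMPLE\\admin", password="secret",
                      auto_bind=True)

    # Create a user object in a hypothetical Staff OU.
    user_dn = "CN=Jane Doe,OU=Staff,DC=example,DC=local"
    conn.add(user_dn,
             ["top", "person", "organizationalPerson", "user"],
             {"sAMAccountName": "jdoe", "givenName": "Jane", "sn": "Doe"})

    # Add the new user to a group that grants access to a file share.
    conn.modify("CN=FileShareUsers,OU=Groups,DC=example,DC=local",
                {"member": [(MODIFY_ADD, [user_dn])]})
    conn.unbind()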

Group Policy Management takes the Active Directory assignments and assigns functionality to Active Directory Organizational Units (OUs). There is a difference between a Group and an OU. The graphical difference in Active Directory is that the OU folder icon will have an image of a document on it while the Group folder will not. To start, make sure that an Organizational Unit folder is created in Active Directory for objects intended for Group Policy Management. Just about every Windows operating system control is customizable. Printers can be assigned, and functions (e.g., Remote Desktop and custom scripts) can be assigned or denied. To get you going, here is another TechNet link - http://technet.microsoft.com/en-us/library/cc753298.aspx .

 

Windows Event Logs

If you are going any direction towards Network Administration for Windows environments, become familiar with the Event Logs. They look like a mystery to the uninitiated, but the Windows operating system logs important information here. The primary logs are Application, Security and System. By far, the majority of these events are benign notations of things just doing what they should. The ones to look for are the ones showing issues. These can include services failing, credential issues, application problems, and attempts to do unwanted things (e.g., logging into a network device, whether remotely, directly or through an IP port).

To be sure, some alerts may be difficult to read. And, one may not even know what to look for. A few searches on your favorite search engine, and on Microsoft's site, will turn up lists of at least the first several hundred basics to watch for. To get you started on finding events of interest for your environment, here is a link from Microsoft for Server 2008R2 and Windows 7 - http://www.microsoft.com/en-us/download/details.aspx?id=21561. Don't stop with just these Microsoft alerts. There are plenty more. Try logging into a server and logging off, from both a console and a remote session. Many applications add event log entries that are instrumental in identifying issues with them.
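As a small taste of reading these logs programmatically, here is a rough Python sketch that uses the pywin32 package (assumed to be installed) to print the newest entries from the System log; it only scratches the surface of what the Event Viewer shows:

    import win32evtlog  # from the third-party pywin32 package

    log = win32evtlog.OpenEventLog(None, "System")  # None = the local machine
    flags = (win32evtlog.EVENTLOG_BACKWARDS_READ |
             win32evtlog.EVENTLOG_SEQUENTIAL_READ)

    # Read one batch of records, newest first, and print a one-line summary.
    for event in win32evtlog.ReadEventLog(log, flags, 0)[:10]:
        print(event.TimeGenerated, event.SourceName, event.EventID & 0xFFFF)

    win32evtlog.CloseEventLog(log)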

Automating the review of Event Logs is paramount, these days. No one really has time to perform daily reviews of the server logs. Luckily, many companies have software for sale. These include (hyperlinked to the vendor sites for your review): GFI LAN Guard, Solar Winds Log & Event Manager, and Manage Engine Event Log Analyzer, just to name a few. These applications are not inexpensive. But, they can be configured to generate near-real-time alerts by email and pager, as well as scheduled reporting for audit. Manage Engine allows their version to be run at no cost when monitoring up to 5 devices, and it happens to be the software I chose for my office environment. I initially ran the free version, until I was approved to purchase the Professional version and enough licenses for my environment. Today, I have over 300 alerts programmed.
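For a shop that cannot justify those price tags, even a scheduled script covers the basics. Below is a rough Python sketch of the idea, using the built-in wevtutil tool to pull failed-logon events (ID 4625) from the Security log and mail them out; the relay name and addresses are placeholders, and it assumes an internal mail server that accepts relay from the monitoring box:

    import smtplib
    import subprocess
    from email.mime.text import MIMEText

    # Query the Security log for failed logons (event ID 4625), newest first.
    events = subprocess.check_output(
        ["wevtutil", "qe", "Security",
         "/q:*[System[(EventID=4625)]]", "/c:25", "/rd:true", "/f:text"]
    ).decode("utf-8", "replace")

    if events.strip():
        msg = MIMEText(events)
        msg["Subject"] = "Failed logon events (4625)"
        msg["From"] = "alerts@example.local"              # placeholder addresses
        msg["To"] = "admin@example.local"
        with smtplib.SMTP("mail.example.local") as smtp:  # placeholder relay
            smtp.send_message(msg)

Run it every few minutes from Task Scheduler and you have a poor man's near-real-time alert.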

Sys Logs

Other non-Windows network devices (such as Layer 2/3 network switches, routers and firewalls) can be set to send Sys Log information to a logging server. These are similar to Windows Event Logs, and may supply important information regarding network traffic passing through them (or not). The level of data being sent depends on the particular device and what setting levels it may have. I have Enterasys network switches that allow me to monitor things such as: reboots, firmware updates, logging on or off of a device, and attempts to connect with improper SNMP credentials. I am able to monitor Sys Log with Manage Engine’s Event Log Analyzer.
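If you do not have a collector yet, standing up a bare-bones one is easy, since syslog is mostly just UDP text aimed at port 514. Here is a minimal Python sketch that writes whatever arrives to a flat file; a real collector would also parse the facility/severity prefix and rotate its logs:

    import socketserver

    class SyslogHandler(socketserver.BaseRequestHandler):
        def handle(self):
            # For UDP servers, self.request is a (data, socket) pair.
            message = self.request[0].decode("utf-8", "replace").strip()
            with open("syslog.txt", "a") as log:
                log.write("%s\t%s\n" % (self.client_address[0], message))

    if __name__ == "__main__":
        # Port 514 is the standard syslog port; binding to it needs admin rights.
        server = socketserver.UDPServer(("0.0.0.0", 514), SyslogHandler)
        server.serve_forever()

Point a switch or firewall at the machine running it and the messages start showing up in the file.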

Patch Management

Keeping a system updated is paramount these days. Just about every popular operating system and application is being exploited in an attempt to gain access to systems that access the Internet. Sticking to web sites deemed 'safe' no longer offers protection. Many web servers use applications that can be compromised (e.g., Apache, MySQL, Microsoft SQL Server, and Java). Sometimes, exploits are discovered in active use (called "zero day") and patches often just take time to be created. One user accessing a compromised web site can cause grief across an entire network.

For Windows environments, newer server versions have the option of installing Windows Server Update Services, also called WSUS. This adds no cost to a network environment and also updates workstations. It does have limitations, but any small office of under 100 computers needs to consider implementing this server Role. Aftermarket applications are available that can do this, as well as patch the major third-party applications. The same three companies mentioned in the "Windows Event Logs" paragraph also sell patch management products. Those would be: GFI LAN Guard, Solar Winds Patch Manager, and Manage Engine Desktop Central. These are linked for your easy review, and be aware that these applications get pricey… especially for larger networks. I personally had issues with GFI LAN Guard not working with my network security settings, and the network security part wasn't as good as I expected it to be. Manage Engine allows the Desktop Central application to run without cost on a network of 25 or fewer devices. My network has more than that, and I am using that application. Desktop Central has a number of added features that are indispensable to me, now that I am using it.
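Whatever product ends up doing the patching, it helps to be able to spot-check what a given machine has actually received. Here is a small Python sketch that shells out to PowerShell's Get-HotFix cmdlet to list the updates Windows says are installed; it assumes PowerShell is present, which it is on anything reasonably current:

    import subprocess

    # Ask Windows which hotfixes/updates it believes are installed.
    output = subprocess.check_output(
        ["powershell", "-NoProfile", "-Command",
         "Get-HotFix | Select-Object HotFixID, InstalledOn"]
    ).decode("utf-8", "replace")

    for line in output.splitlines():
        if line.strip():
            print(line.rstrip())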


Servers:

Small businesses, SOHO and home users most likely don't have a need for an in-house server. Even with off-site hosting (also called "Cloud Computing"), there is a need to have both redundancy and the convenience of local access. The versions of both Windows and Linux servers have become easier to configure with each newer release. That being said, one has to know what the settings are and how to set them. It just takes time and reading.

Local Computer Policy (gpedit.msc) is where Workstation or Member Server system settings are individually customized. On a Domain Controller, the Group Policy Management Console (gpmc.msc) is used to make global changes to a large number of servers. For like operating systems (i.e. – Server 2012 & Windows 8, Server 2008R2 & Windows 7, Server 2008 & Vista, or Server 2003R2 & Windows XP-SP3), the Local Policy settings are identical to the Group Policy settings. A Domain Controller doesn't have the Local Computer Policy; its changes are made in GPMC. It is easier to make changes in the GPMC when there are a number of servers in a network that includes a Domain Controller: one location to make changes that apply to all servers in the same groups.
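When a setting does not behave the way the policy says it should, the first question is which policies actually landed on the machine. The built-in gpresult tool answers that, and wrapping it in a script (or just running it at a prompt) is trivial; here is a minimal Python sketch:

    import subprocess

    # Resultant Set of Policy summary: which GPOs applied to this computer/user,
    # which were filtered out, and the security groups involved.
    # ("gpresult /h report.html" produces a full HTML report instead.)
    report = subprocess.check_output(["gpresult", "/r"]).decode("utf-8", "replace")
    print(report)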

Server 2008 introduced a new interface. A management console called Server Manager is pretty much a single-location application for everything one would want to do with a server. The most important thing to do is assign Roles to a server. Back in the day, hardware technology did not allow for adding too many Roles to a single server. With the advent of multi-core processors and advanced chipsets, it is possible to have enough resources to add all or most of the possible Roles. But, it is generally still not advised to mix certain Roles together. One example is Windows Server Update Services (WSUS), which uses an internal database based on SQL Server. SQL Server is not advised on a Domain Controller, because of its periodic security issues that could compromise the internal Active Directory database. Likewise, setting up a web server on the Domain Controller and directing outside traffic to it, even just port 80 or 443, is considered a risk because it allows outside traffic to contact the Domain Controller directly.

Server 2008 also introduced a virtual server manager, called Hyper-V, and allows the re-use of the primary server license key for one instance of a virtual server. Moving applications off of the actual Domain Controller onto a hosted server instance allows the separation and isolation of these applications from the Domain Controller environment. Properly setting up the network interface will direct all virtual server traffic through an independent network interface, completely isolating traffic between the two environments. In this configuration, a smaller office with limited resources can utilize a slightly more robust server without having to purchase two. Any issues with the virtual server will not affect the physical server. On the other hand, no amount of redundancy can fully make up for having everything in a single server.

It is also important to monitor a server's health. In the past, one had to log into a server and manually check things out. These days, software is available that can be programmed to do just that. And, it can send alerts or run scripts to take corrective actions when things go awry, and send email summary reports for audit purposes. Many companies offer such applications, including the three I have listed in the "Network Security" section. Without being redundant, I will say that I personally chose Manage Engine's Op Manager. As with the other Manage Engine products I have mentioned, there is a free version that monitors up to 10 devices, including many switches, routers and firewalls. Advanced features, which include Active Directory and Exchange monitoring, are not available after the evaluation period. I have even used the URL monitor to periodically test remote locations my office needs access to, which is a handy monitoring technique.
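That URL check is easy to reproduce with a scheduled script if all you need is a simple up/down test. Here is a rough Python sketch with placeholder addresses; the e-mail alerting from the Event Log sketch earlier could be bolted on in place of the print statements:

    import urllib.request

    URLS = [                                    # placeholder sites to test
        "https://www.example.com/",
        "https://remote-office.example.com/portal/",
    ]

    for url in URLS:
        try:
            urllib.request.urlopen(url, timeout=10)
            print("OK     ", url)
        except Exception as exc:
            print("FAILED ", url, "->", exc)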

Workstations:

This used to be my main item, and I could fill pages with endless banter on the topic. Things have really changed in this arena. Computers are quickly going from build projects to disposable tablets that cannot be opened without a heat gun. As of the beginning of 2013, the mouse (or other pointer device) is losing its place to touch screens and operating systems centric to them. If you are still interested in this old man's thoughts on the topic, go ahead and read on.

As it tends to go with older people, it is easy to reminisce over the days of old. In an attempt not to drift into that: in college, I learned analog and digital electronics, and one of the last things I did (in late 1980) was to put a Motorola 6800 processor on a breadboard with a couple of 7-segment LEDs and program hexadecimal code to make the output do what I wanted. For the rest of the 1980s, the industry just kept growing. IBM's introduction of the personal computer, with Bill Gates coming up with an operating system, grew slowly. With Apple Computer introducing the graphical operating environment, even though 'borrowed' from Xerox, companies vied to be the ultimate computing solution.

As hardware technology advanced, operating system complexity grew. Together, these last 20+ years have seen the age of assembling computers from parts being passed up for entire computing tablets and smart phones… for those with really good eyes. Wireless technology isn’t nearly as good as wired, but the convenience of not being harnessed to a network cable and external keyboard/ mouse is taking off. With the exception of niche and legacy applications, having to decide between Microsoft, Apple or Linux is becoming transparent to the end user. It is usually not difficult to find something that does what you want on any platform.

The only thing left that is currently looking to come out in mid-2013 is 4G-LTE-enabled tablets. This means that one can truly be data-connected while on the move, provided that there is a data carrier antenna nearby. What is currently unclear is whether the VoIP method of voice communications being adopted will be usable on a tablet. The Lenovo Helix looks to be an all-inclusive computer experience, complete with a docking station for desktop work and a Bluetooth keyboard to turn the tablet into a netbook. It is supposed to run Windows 8 Professional on a Core i5/i7 processor. Microsoft's Surface looks to be a similar competitor, but details are sketchy. I truly think that this is where computing devices are headed for the near term.

Operating Systems

I have been exposed to many operating systems over the last 30-plus years. The first IBM PC used a 16-bit processor and could run CP/M-86. That didn't go over well with the general public, and IBM worked with Bill Gates to come up with DOS v1.0. Yes, I started back then. I really don't have any qualms with one OS over another, especially in 2013. It really comes down to personal choice and whether there is a niche product, personal desire or financial requirement that dictates one over the other.

I am and have been Microsoft centric in business for over 20 years. Starting with DOS v1.0 and moving up to Windows 8 Professional today, each upgrade has had a period of discomfort until the first service pack came out and/or the software vendor I relied on made a change to their product to work properly on the new OS. I remember when XP first came out. I maintained workstations for over 150 software developers who absolutely hated what they referred to as the "Tyco Toy Desktop", and that the user profiles moved under a new folder and broke their applications.

The versions of Windows Server weren't any different. Moving from NT4 to Windows 2000 seemed almost crippling at the time. And, it was a nightmare until that magic first service pack came out. But, NT4 wasn't any different. Every even-numbered service pack seemed to have its issues. Skipping service pack 3 in an upgrade process was a disaster that could only be corrected by a rebuild.

Apple wasn’t any different. I remember OS updates that broke applications, and those updates were needed. Some have been better than others. Just like Windows, certain hardware versions are not capable of taking newer OS updates after a certain number of years.

Linux keeps chugging along. Certain flavors have particular benefits. Novell bought out SUSE, and those two work well with each other. Red Hat has a great following among hard-core developers. Ubuntu has really opened the Linux experience up to the average computer tinkerer. Some versions are stripped down to allow use of older hardware. But, the real benefit is toward the newer hardware.

IBM's OS/2 Warp is still hanging around. Discontinued in the late 1990s, it is so old that network exploits really don't exist for it. It is a proprietary, legacy environment, but you will still see it being used.

The other OSes of the 1980s seem to be around for those who just can’t leave it alone. Commodore has created a Linux version of their “64 Basic”. I don’t see any vintage recreations of an Atari 800XL around, but plenty of sites are devoted to it. If you are into nostalgia, just look around the Internet for the history of computers. They had to run on an operating system…

 

 

Computer Technology

Electronics Technology is a unique industry. It is the only category I can think of that obsoletes itself while it is still functional. To restore a vintage automobile or antique piece of furniture will undoubtedly produce something both enjoyable and usable for one’s needs. But, not our electronic devices. Technology advances happen at enough of a rate and with enough of a desirable change that most of us make regular switch outs of working electronic devices for the ‘better’ thing. Most of us trade up our cell phones every two years, along with our contract renewals. The old phone probably worked great, but it didn’t have the new features or capabilities. A Pentium 3 processor is on the verge of being useless, and the 1GHz+ version was released in 2001.

The software side isn't much different. Operating systems are slated for obsolescence on a schedule, mainly because usage changes cause the need to redevelop the product. That costs money, paying staff to do the work. So, companies find a compelling reason to produce and sell new versions on a regular basis. That is, to make more revenue. Most have a lesser upgrade price and then the full price for first-time buyers.

Hardware companies are on the same path. It isn't easy to make a faster chip that simply replaces the one in an existing computer, so development over the last 10 years or so has been to use the latest chipsets to make an entirely new computer. To save more money, software companies collaborate with major computer builders, like Dell, HP, Lenovo and the others, to have software preinstalled at time of purchase. This helps subsidize the manufacture of a computer and gets the software out to a market segment. Most of us have had that problem where someone with a newer computer has sent us a file that we cannot open because our software is too old. Even though the option to save files in an older version is usually available, most people begin the process of deciding that they need to upgrade their software, and possibly the entire computer. And, the cycle is complete.

If we thought in the realm of appliances for purpose, we would buy just what we needed. Unfortunately, we often don't know exactly what we need. That causes us to pay a little more for something that can do more than what we originally wanted, just in case we may need it to do more later. The alternative could be early replacement of an entire system.

The basic office productivity software has done everything we have needed, for decades. Word processing, spreadsheets and presentations really do the same things they did twenty years ago. It is the new features and functionality that draw us in, if not the fact that the newer software came with a new computer purchase and the old license was not transferable.

If technology didn’t keep progressing, we as a society would not be advancing. Therein lies the conundrum. I will stop here, to avoid getting philosophical…

Miscellaneous

Programming

I started out programming machine code on 6800 processors in college. My first job (1981) was with Tektronix, when they had a computer peripherals division. They used BASIC on their 4050-series smart terminals, which had a tape drive. The Commodore was a logical progression, with its built-in BASIC. I even helped a person make a penny market trending program in 1982. I went into aerospace in 1983 and spent the rest of the decade bouncing around, doing a little Pascal and dabbling with an interesting program called Enable Office Automation, a DOS-based spreadsheet, word processor, database & modem application bundle. By then, my work experience had migrated from actual programming to application use. I got into HTML in 1998, programming in Notepad, then eventually Dreamweaver. After starting this latest employment in 2006, I haven't had much time for these things, past a few basic things that NVU could do for me. Now, I am using Microsoft's free application, Visual Studio Express 2012 for Web. It has its limitations, but is very desirable for free!

Last Updated: 2013-04-09