PrezNotes Oct 2011


A Bit Of History

SPAUG is the follow-on to the Homebrew Computer Club, which imploded about 28 years ago. The Homebrew crowd was on the bleeding edge of the technology available at the time. Into the void created by Homebrew's demise, through a happy convergence of events, people, and circumstance, a new club formed, taking the somewhat inglorious name the Stanford Palo Alto User Group (for PC), or SPAUG.

The Stanford portion of the name came from the fact that, at the time of incorporation, SPAUG was meeting at Polya Hall at Stanford, and a number of Stanford students were an integral part of the club. I joined about 4 months after incorporation.

To commemorate and document about 28 years of SPAUG operation, a team of SPAUG members decided to write a history of the club. Thus was born a project that has been going on for about two years, occupying the time and efforts of its leader Robert Mitchell, and John Sleeman. Stan Hutchings, Maury Green, John Buck, and Bev Altman also worked on the history document.

In order to present the entire 28-year history of PrintScreen, the team amassed an incredible 28-year library of the newsletter, which in itself is a great achievement. In addition to the PrintScreen library, the group has written a 40-page historical epistle covering the evolution of the PC and the activities of SPAUG.

From the beginning, the intent has been to present the 300-plus-issue PrintScreen library and a master bound copy of the History of SPAUG to the Computer History Museum in Mountain View, a process that is just now beginning. In addition, we are going to try to make the History of SPAUG available to all members via a special bulk order through the printing house, with copies distributed at the monthly meeting. We do not know the pricing yet; because of the document's size and color requirements, we are still researching this and other options.

You May Be Vulnerable To Failures

Many computers come into the Clinic with insufficient or no backups—often at the loss of some customer data—an absolute disaster for the user/owner.

1. Ensure that you have a relatively empty backup HDD (Hard Disc Drive) that is:

If for a laptop: connected via the highest-data-rate port you have, and lying flat so it cannot tip over and cause data loss.
If for a desktop: installed internally, for the best protection from shock.

2. Take the time to check the backup HDD.

a. Open My Computer by double-clicking its icon on your desktop. You should see a list of the hard drives on your machine.
b. Right-click on the backup drive, then select Properties in the popup menu.
c. Click on the Tools tab.
d. Click the Check Now button.
e. Put check marks in both boxes in the popup window.
f. Click Start in the popup window.

The Error Checking process may take several hours, and is essential to avoid surprises. If the result shows bad blocks, reboot the computer; the check will then re-run automatically before Windows loads, which allows it to repair the bad blocks on the HDD.

3. Obtain, install, and run the latest version of Acronis True Image Home.

a. Go to UGR.COM, the website of User Group Relations.
b. Purchase an Acronis True Image Home license key; the cost is $25.
c. When UGR has acknowledged your payment (usually within a day or so), review the information they send for a document with the word STARTER in the title. Print the document out, in color, and follow its instructions to the letter, except change the number of incremental backups between full backups from “6” to “13”. This process has been quite effective at ensuring that backups are taken automatically.

Once every few months, you will have to delete old backups to release space for newer ones.
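To get a feel for how much room those backup chains will need on the drive, here is a rough back-of-the-envelope sketch. The “13 incrementals between fulls” figure comes from the STARTER-document change above; the backup sizes and the number of chains kept are hypothetical assumptions for illustration, not Acronis defaults.

```python
# Rough estimate of backup-drive space for a full-plus-incrementals rotation:
# each "chain" is one full backup followed by N incremental backups.
# All sizes below are hypothetical assumptions for illustration.

def rotation_space_gb(full_gb, incremental_gb, incrementals_per_full, chains_kept):
    """Space consumed by `chains_kept` chains, each one full backup
    plus `incrementals_per_full` incrementals."""
    chain = full_gb + incrementals_per_full * incremental_gb
    return chain * chains_kept

# Example: 80 GB full backups, ~3 GB per incremental (assumed),
# 13 incrementals between fulls, keeping 2 complete chains.
print(rotation_space_gb(80, 3, 13, 2))  # 238 GB
```

If the total approaches the backup drive's capacity, that is the signal to delete the oldest chain.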

How Long Should You Keep XP Going?

Microsoft support for Windows XP will end in April 2014. That means that services and capabilities that Microsoft routinely updates, such as security patches and virus-removal definitions, will cease to be effective.

In all fairness, Windows XP has been running for 10 years, which is a long time for an operating system. There is no urgent need to abandon XP, as it does the job and you already know how to deal with it and know how to use it effectively.

There will, however, be a compelling need to move before the end of 2013, so plan now for how to move your data over, as just about everything else will be obsoleted by Win7. The non-Microsoft programs you need have already made the changes Win7 requires, and they will have routines to bring over your data in a new format if needed; the transition is usually quite painless. But do it you will have to.

Databases will be the most complicated to move, simply because of the way their data is structured. And if your computer just plain dies and you have to replace it because parts are no longer available, you might have to take the leap to Windows 7 (Win7) sooner than you'd planned.

How To Deal With The Win7 Human Interface

The Win7 user interface is causing some trauma because nothing is where you are used to finding it. Most of the essential and most used routines are still available, but finding them has become a gigantic puzzle.

I’m surprised that no one has made an interface routine that lets you state what you need and then displays its icon on the desktop. My way of handling the problem is to go through the agony of finding the icon I need and then copying it to the desktop. Simple but effective. You will find that many things are much snappier with Win7. The speed gain is often not from Microsoft’s software but from the 8–10 years of advances in the speed and structure of the computers the software runs on. The newer CPUs are screamers, and SATA (http://tinyurl.com/opbc9) is faster than before. You may need to upgrade your data communications capabilities to keep up with these newer computers.

Get As Much Fiber Optic Speed As You Can, Using Sonic.Net

If you are on DSL, and your options are not attractive, consider the Santa Rosa-based ISP Sonic.net. Sonic.net was chosen by Google to supply Stanford with fiber optic services as a test case.

Open Sonic.net’s Speed Test (http://speedtest.sonic.net/ookla/) and give them a call; find their local phone number via http://www.sonic.net/popf2/. They will do some testing and, depending on where you are located geographically, give you an estimate of the speed you can receive over the phone company’s copper wires from their fiber-fed equipment.

When I went through the above drill, I was told that I was about 9700 feet from the fiber optic origin point, and therefore that I should expect their signal, which starts at 20 Mbps, to degrade to “about” 6.8 Mbps at my site. I was getting 2.8 Mbps from AT&T. When watching something on YouTube, the playback used to outrun the downloaded portion of the movie, causing lots of stops and pauses; the higher speed ended that. Going from AT&T’s 2.8 Mbps to Sonic.net’s 6.8 Mbps cost me only $3.05 more per month.
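The arithmetic behind that decision can be sketched in terms of price per megabit. The two speeds and the $3.05/month difference come from my bills as described above; the AT&T base price used below is a hypothetical assumption, since only the difference matters here.

```python
# Compare two DSL offerings by dollars per Mbps of delivered speed.
# Speeds (2.8 vs 6.8 Mbps) and the $3.05/month difference are from
# the text; the $30 AT&T base price is a hypothetical assumption.

def dollars_per_mbps(monthly_cost, speed_mbps):
    return monthly_cost / speed_mbps

att_speed, sonic_speed = 2.8, 6.8     # Mbps, measured vs. estimated
att_cost = 30.00                      # hypothetical base price, $/month
sonic_cost = att_cost + 3.05          # $3.05/month more

print(round(dollars_per_mbps(att_cost, att_speed), 2))     # 10.71
print(round(dollars_per_mbps(sonic_cost, sonic_speed), 2)) # 4.86
print(round(sonic_speed / att_speed, 2))                   # 2.43x the speed
```

Whatever the base price, roughly 2.4 times the speed for about three dollars more is hard to argue with.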

What Other Suppliers Offered For Data Communications

AT&T also hassled me in another way. In order to install their U-verse capability, I would have no service at all for a 7-day period while the AT&T-owned line was allowed to “cool off” and its “availability” was verified. It took 7 days to assure that the outside vendor (Sonic.net) had actually released the line.

I filed a complaint with the FCC, arguing that the real reason was that AT&T’s records were in such disarray that the “elephant could not jump,” hence the imposed 7-day cooling-off period. One AT&T argument was that my two lines needed to be treated as a single entity, unlike those of any regular corporation with multiple lines, because it was easier to lump their records into one. I didn’t accept that. About two months later, Sonic.net was listed as the “owner” of the line; my Sonic bill went up by the cost of the line, and the amount owed to AT&T decreased by the same amount.

During the time that I was at another location, I had no choice about communication services—Comcast told me that they were supplying the building with 12 Mbps. That number was published everywhere, including by the sales personnel who, when pressed, said the speed “might be a little bit less”. Was it ever! My speed there never got above 2.3 Mbps.

So, just how does one predict what speed may be had in your area, and at what cost? The best advice I can offer is to see what your neighbors have. Beyond that, keep your existing service and get the new service under a 30- to 90-day trial installation. If no trial period is offered, tell ’em to take a hike.
