OpenID: When Versions Conflict

OpenID was supposed to provide web-based single sign-on; however, conflicts between versions can cause confusion – and prevent sign-on altogether.

When presented with an OpenID sign-on box, you should sign in with:

userid.openidserver.com

For example, with a userid of jdoe at OpenID provider myopenid.com, enter this into the OpenID text box:

jdoe.myopenid.com

(OpenID.net has a more detailed description of the process.)

The trouble with OpenID comes when people try to use providers like Google.com and Yahoo.com with sites like Toodledo.com: Toodledo.com only connects with providers that support OpenID 1.0, yet it displays no message suggesting that a provider's version is unsupported. Google and Yahoo support only OpenID 2.0; other providers may or may not support OpenID 1.0.

Will Norris has a list of OpenID providers, broken down by the OpenID features each supports. Look for providers that support the following:

  • openid-html
  • signon-10
  • sreg-10

Providers that support these are, I suspect, most likely to support OpenID 1.0 (this worked for me!). Also, if you are evaluating these providers in order to choose one, favor a provider that supports many of these OpenID features.
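The openid-html feature refers to the HTML-based discovery step in the OpenID 1.x specifications: the identity URL serves a page whose head carries a link tag pointing at the provider's server endpoint. Here is a minimal sketch of what a consumer looks for on that page – the identity page and URLs below are illustrative, not taken from any real provider:

```python
from html.parser import HTMLParser

class OpenIDLinkParser(HTMLParser):
    """Collect <link rel="openid.server"/"openid.delegate"> tags
    from an OpenID 1.x identity page."""
    def __init__(self):
        super().__init__()
        self.endpoints = {}

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            a = dict(attrs)
            rel, href = a.get("rel", ""), a.get("href")
            if href and rel in ("openid.server", "openid.delegate"):
                self.endpoints[rel] = href

# A made-up identity page, of the kind a 1.0 consumer would fetch:
page = """<html><head>
<link rel="openid.server" href="https://www.myopenid.com/server">
</head><body>jdoe's identity page</body></html>"""

p = OpenIDLinkParser()
p.feed(page)
print(p.endpoints)  # {'openid.server': 'https://www.myopenid.com/server'}
```

OpenID 2.0 providers generally rely on XRDS/Yadis discovery instead, which is one reason a 1.0-only consumer and a 2.0-only provider fail to find each other.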

OpenID.net has the specifications for all the versions of OpenID and the features of each.

I chose myopenid.com as my OpenID provider; so far so good – and it works with Toodledo.com (vital!). Another benefit – at least with myopenid.com – is an identity page that others can see (I have one).

Another OpenID provider is WordPress.com; if you have a login on WordPress.com, you have an OpenID. There is no word on whether WordPress supports OpenID 2.0.

Restoring Data for GNOME Evolution

Evolution is the personal information manager (PIM) for GNOME desktops, and includes email, calendar, contacts, to-dos, memos, and Palm integration.

Recently, I migrated from one desktop to another, moving the data in my home directory over. Most applications were perfectly happy to find their data in the hidden directories preconfigured for them (VirtualBox was one of these).

Evolution, however, refused to recognize the data as copied, and instead asked for all of the relevant information to set up a new mail account. To move the data properly, it is necessary to first back up everything using Evolution's backup process (from the File menu). The backup file can then be transferred to the new machine and restored. Passwords, however, are not restored as part of this process: they are not included in the backup.

The passwords can simply be re-entered if necessary. If you've forgotten them (as I did), you can pull them from the GNOME keyring using the Seahorse application, found in every GNOME installation. Run seahorse from the command line, or start it from the menu (in Ubuntu Karmic Koala: Applications > Accessories > Passwords and Encryption Keys).

Patrick Ahlbrecht over at onyxbits.de has an excellent article about recovering passwords from Evolution. Older versions of Evolution stored passwords base64-encoded in a plain text file (i.e., not encrypted at all).
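To see just how weak base64 "protection" is, here is a short sketch. The password is made up, and the on-disk format of those old Evolution versions is simplified here to a bare base64 string – the point is only that decoding requires no key or passphrase at all:

```python
import base64

# What ends up on disk: the password, base64-encoded.
# Base64 is an encoding, not encryption -- anyone can reverse it.
stored = base64.b64encode(b"hunter2").decode("ascii")
print(stored)      # aHVudGVyMg==

# What anyone with read access to the file gets back:
recovered = base64.b64decode(stored).decode("ascii")
print(recovered)   # hunter2
```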

The next time you save passwords in an application, think about that base64-encoded password file…

Software Bugs: Good or Bad?

Recently, Karl Fogel wrote about bugs and “technical debt” in response to a mailing list thread about the future of Subversion in 2010. This resonates with me: I recently found myself struggling with bugs in Ubuntu, only to find that they would not be fixed (in my case, the missing embedded ROM code for a USB-serial adapter – code normally included with the Linux kernel).

Karl’s article was then reported on by Joe Brockmeier of OStatic.

Much of this reporting makes me think of TeX, the typesetting system created by Donald Knuth, one of computing’s “founding fathers” (so to speak, despite coming on the scene later). TeX has remained unchanged except for bug fixes for several decades, and shows no sign of slowing down or dying – in contrast to what the articles report.

I also think of the regular Ubuntu distributions contrasted with the Ubuntu LTS (“Long Term Support”) distributions – a split that mirrors the difference between Fedora and Red Hat Enterprise Linux. It is possible to have a system without bugs – or at least, with few bugs. Fixing bugs in a rapid and constant fashion improves the user experience, and builds up the goodwill value of your name rather than leaving you known for bugs that aren’t fixed.

A complaint often heard is that the “fix” for a problem with a commercial product is to “upgrade to the new version” (normally at substantial cost). This should not be the way things are done.

Bug removal should come first, and solid, reliable program operation should be goal number one. As a user – and an enterprise user – reliability is paramount. A product whose history shows that bug fixes take a back seat to upgrades and new features will not last long. This is the very reason that products like Ubuntu LTS and Red Hat Enterprise Linux exist.

I agree with several premises in Joe’s article on OStatic: bug removal should not be the only focus, and an increase in bug reports is not all bad. More bug reports mean more users are using the product, and each report is an opportunity to make the software more reliable. Users would much rather apply patches and update to a more reliable version than upgrade to something entirely new, with newly introduced bugs not yet fixed.

LexisNexis Tools Come to Microsoft Office

At the LegalTech conference taking place in New York City, LexisNexis announced a partnership with Microsoft. The competition has its own tools, but this partnership has all the markings of a competition killer.

LexisNexis research tools will be built into Microsoft Office products – in particular, Microsoft Word, Microsoft Outlook, and Microsoft SharePoint. This means that no matter what Westlaw comes up with, and no matter what Bloomberg comes up with, Microsoft Office comes ready to use LexisNexis out of the box.

Thus, I would expect Microsoft Office upgrades to be high on every lawyer’s agenda shortly. Your corporate counsel is likely to be begging for it as soon as they hear about it.

Intel Itanium Tukwila CPU Out Soon?

ComputerWorld reports that Intel has started shipping the Itanium Tukwila processor. The Itanium processor drives the HP Integrity line of servers, as well as the HP NonStop servers.

In the near future (second or third quarter?) HP is expected to announce Integrity servers based on the Tukwila processor. These new servers are predicted to be blade servers, and it is also suggested that Superdome will receive a complete overhaul – which is uncomfortably close to suggesting a “forklift upgrade” (i.e., pulling out the entire server and replacing it) for Superdome. The Superdome system infrastructure is ten years old, so it may be time – but an expensive upgrade like that is never welcome.

At the International Solid-State Circuits Conference (ISSCC) next week, both Sun (UltraSPARC “Rainbow Falls”) and IBM (POWER7) are expected to announce new chips. Both chips received some coverage at the Hot Chips conference in August; ExtremeTech covered them well in its conference preview. In September, the Register managed to snap up a copy of the Sun SPARC roadmap, which shows the Rainbow Falls chip being introduced in 2010. As for Tukwila, Intel is rumored to be making its formal announcement at the ISSCC.

We shall see…

The Microsoft Windows 7 Time Bomb

A while back, I received the Windows 7 pre-release version (Windows 7 RC apparently). I was excited to try it, but decided not to install it after seeing that it had an operational time limit.

Now the time is upon us: Microsoft’s Windows 7 RC will begin notifying users on February 15 that it will start shutting down on March 1. From that date, Windows 7 will shut down every two hours, without warning, potentially causing data loss. The Windows Blog has an article that clarifies these points.

After the June 1, 2010, expiration date passes, Windows 7 RC will flag itself as “not genuine” and will have a black background specifying that fact for all the world to see. Not a pleasant thing to have happen, to be sure.

Even for those who decide to upgrade, an in-place upgrade is not possible; this opens another avenue for data loss during reinstallation. (Another reason to store your data on a separate drive – whether a network drive, USB drive, or separate partition.)

This entire thing is nothing less than a time bomb penalizing the Windows customer for using Windows 7 RC. I am relieved that I, for one, did not install it.

I can only imagine the problems faced by a small shop that installed Windows 7 RC on several clients, now being forced to reinstall Windows 7 from scratch. I can also just imagine what would happen if a UNIX release did this…

Botnet Making Fake SSL Connections

A recent report from CNet relates how a botnet is making fake SSL connections to a variety of popular hosts in order to hide the botnet’s central command-and-control center.

The list of affected hosts (from the botnet fighters at shadowserver.org) is enormous; it includes hosts from such organizations as Ubuntu Linux, Twitter, the US CIA, Last.fm, the National Science Foundation (NSF), Dropbox, NASA, the US Army, the US Navy, the Pirate Bay, Wisconsin Unemployment Insurance, the IEEE, the US National Institutes of Health (NIH), Symantec, Sun, and many more. Shadowserver.org has more about these fake SSL connections in its January calendar.

The Pushdo botnet is responsible; it has reportedly been around since 2007 and is the second-largest botnet in the world. TrendMicro did an in-depth analysis of Pushdo a while back, SecureWorks has a nice analysis as well, and Microsoft’s Matt McCormack wrote a widely read article on it.

These SSL connections are never completed and are mostly just a nuisance for web operators. The botnet itself, however, is a serious problem – the second largest in the world, after all. We can only hope that those in the know manage to shut it down soon.
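What makes these connections “fake” is that they never carry a real TLS handshake: a genuine handshake opens with a ClientHello, whose first byte on the wire is 0x16 (the TLS “handshake” record type), while the bot traffic reportedly sends junk to port 443 instead. The following self-contained sketch illustrates the distinction on a local socket – it is an illustration of the idea, not a real detector, and the junk bytes are made up:

```python
import socket
import threading

def classify_first_byte(data: bytes) -> str:
    # 0x16 is the TLS handshake record type (the start of a ClientHello).
    return "tls-handshake" if data[:1] == b"\x16" else "junk"

results = []

def accept_one(listener: socket.socket) -> None:
    # Accept a single connection and classify its first byte.
    conn, _ = listener.accept()
    results.append(classify_first_byte(conn.recv(1)))
    conn.close()

listener = socket.create_server(("127.0.0.1", 0))  # stand-in for port 443
t = threading.Thread(target=accept_one, args=(listener,))
t.start()

# A bot-style "fake SSL" client: connects, then sends non-TLS junk.
client = socket.create_connection(listener.getsockname())
client.sendall(b"\x00nonsense")
client.close()
t.join()
listener.close()
print(results)  # ['junk']
```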

Alpha Emulators

Emulators are an excellent way to replace aging hardware, saving electricity, rack space, and support costs. (Don’t think you’ll save on administration costs though: the operating system still requires support….)

However, finding emulators for architectures other than the i386 and its ilk can be difficult, particularly for recent orphans. Really old processors fare better: the PDP series and others are handled by the SIMH emulator, and the System/370 and its ilk by the Hercules emulator.

Emulators for the DEC Alpha are out there, but they are not that easy to find. Stromasys has several, including PersonalAlpha, which can be used for personal use, and Charon-AXP, a commercial product. For Charon-AXP, they now offer the Charon-AXP NCE (Non-Commercial Edition), which runs on Linux. Charon-AXP has long been the best-known Alpha emulator out there, and there are a lot of recommendations for this product from those in the know.

There is also the open source project ES40, which aims to create an open source ES40 emulator. ES40 has a presence on Ohloh and on Sourceforge. There doesn’t seem to have been any activity on the project over the last year, which is unfortunate.

There is another emulator, FreeAXP, now entering beta. FreeAXP emulates an AlphaServer 400 and is a prelude to a commercial Alpha emulator product from Migration Specialties; it will be available for both commercial and non-commercial use. The current FreeAXP beta appears to be for 64-bit Windows only; a 32-bit Windows version is to come later.

Both FreeAXP and PersonalAlpha appear to be for Windows XP or Windows 7 only; neither lists Windows 2000 as an option, and neither runs on Linux or Unix. There is, however, a Charon-AXP for OpenVMS.

News about Alpha emulators can often be had over at the OpenVMS Hobbyist Portal. After all, what better to run on an Alpha than OpenVMS?
