Sunday, November 14, 2010

Legacy icons in Windows XP

When your computer is a bit busy, you might notice something like this while Windows Media Player is loading:


The icons in the upper right-hand corner show while it is busy loading. A similarly dated prompt appears when there is an error accessing your CD or DVD drive:


Notice that this prompt does not use the Windows XP styles.

I have always thought that this is because the XP visuals are simply a layer over Windows 3.1 or Windows 2000 functionality. If a window gets stuck (as in the Windows Media Player example), you can see the old layout before the XP one is rendered over it. Rendering the interface twice like this is obviously a waste of performance.

On the other hand, I think the CD/DVD error was simply forgotten and left as it was.

Tuesday, October 26, 2010

LaTeX: MiKTeX undefined references

For the past three days I have been trying to get started with LaTeX on Windows. Using MiKTeX, I ended up in a situation where (a) my bibliography wasn't showing at all, (b) citations showed up as [?], and (c) the log complained about undefined references.

It turned out that the error messages were obscure and unrelated to the cause. The references and citations were fine; it was the bibliography style command that was incorrect. It needs to look like this to work:

\bibliographystyle{plain}
\bibliography{mybib}
\nocite{*}

You need to use \bibliographystyle and not \bibstyle, which appears on some websites. The style you name must actually exist, or it will not work. The bibliography file must be specified without the .bib extension. \nocite{*} is optional; it makes LaTeX include references even if they are not cited.

Also, you may need to run LaTeX (and BibTeX) two or three times for the references to be properly linked.
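For context, here is a minimal complete document that uses this setup. The mybib file name comes from the snippet above; the knuth1984 citation key is just a placeholder for whatever keys your own .bib file contains:

```latex
\documentclass{article}

\begin{document}

Some text citing a source~\cite{knuth1984}. % placeholder key from mybib.bib

\bibliographystyle{plain} % must be \bibliographystyle, NOT \bibstyle
\bibliography{mybib}      % note: no .bib extension here
\nocite{*}                % optional: also list entries that were never cited

\end{document}
```

Compile with latex, then bibtex, then latex twice more, and the [?] markers should resolve into numbered citations.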

Saturday, October 16, 2010

inet_ntoa() segmentation fault issue

inet_ntoa() is a C library function used in UNIX socket programming to obtain the dotted-quad string form of an IPv4 address from its binary in_addr form.

I have encountered an issue where its use results in the following warning:

warning: assignment makes pointer from integer without a cast

Attempting to run a program with this warning on inet_ntoa will result in a segmentation fault.

The cause of the problem is not incorrect C, but a missing header file. Without a prototype in scope, the compiler implicitly assumes inet_ntoa() returns an int, so the returned pointer gets truncated, hence the warning and the crash. Shachar Shemesh wrote that the solution to this obscure problem is simply to include the arpa/inet.h header file:

#include <arpa/inet.h>

Saturday, September 4, 2010

Errors: helpful vs useless

Today I tried to install the Windows Phone 7 Developer Tools beta, and was given this lovely error message while the setup tried to start:

"A problem has been encountered while loading the setup components. Canceling setup."

Thanks. What problem? Why? What can I do about it?

This really sounds like the installer crashed for no stated reason, as if the developer just put in a try..catch block that swallows any kind of problem regardless of what the problem is. The downside is that the end user sees a failure and has no clue what happened.

The least you could do as a developer is include an error number. The user does not know what it means, but it gives him room to (a) ask the developer what went wrong, or (b) Google it, knowing that other users have definitely run into it and posted about it somewhere.

The best way to show errors, though, is to indicate (a) what exactly went wrong (e.g. SQL error -264), (b) where it went wrong (e.g. a function name), and (c) what the user can do about it (e.g. contact customer support, read a FAQ, change a setting, etc).

This is informative for the user (who knows what went wrong and what action to take) and also for the developer (who can debug the software if necessary and trace the source of the problem).

Sunday, March 28, 2010

Smart meters and use of IT in critical environments

An article titled "`Smart' meters have security holes" shows that smart meters can be hacked, allowing malicious parties both to tamper with a user's electricity bill and to actually control the supply of power.

I think that use of IT systems in critical circumstances such as this (because your home's power supply is indeed critical) is a big mistake. Our work and leisure (not to mention time and money) are already at stake every day with the vulnerabilities that our computers contain. Extending this insecurity to the power grid is simply insane. In a few years' time, when IT becomes even more central in our lives, hackers will probably be able to control the doors to our homes.

Saturday, November 7, 2009

defrag interface: one year later

In my first post to this blog last year, I showed the difference between the Windows Disk Defragmenter on Windows XP and that on Windows Vista. The bottom line was that the Vista version lost the graphical interface, leaving the user with no idea of what is going on in the background.

With the release of Windows 7, one would hope that this annoyance has been taken care of. Sadly, this is not the case. The interface does give some information on what it is doing, but the user gets no idea of how much progress has been made, and the graphical display we have come to love in XP is long gone.


Fortunately, the good folks at Piriform have created a defragmentation utility called Defraggler that shows the defragmentation operations on the hard disk in real time. This is even better than XP's defrag utility. The visual display is, of course, only one of the great features of this utility. I only gave it a quick try (and loved the Quick Defrag option), but the interface was enough to amend the sorrows caused by Vista and Windows 7.

Wednesday, October 14, 2009

Using wget

With Geocities going down in less than 2 weeks' time, I found myself needing to archive a number of websites hosted there that would otherwise disappear. For this purpose, one can go through the frustrating experience of saving a webpage's files one by one, but that would be stupid when there exist tools that automate the whole process.

The tool for the job is GNU Wget. While I've used this tool before for similar purposes, Geocities has several annoying quirks that forced me to learn the tool a bit better.

For starters, this is how to use the tool:

wget http://www.geocities.com/mitzenos/

Great, that downloaded index.html. But we want to download the whole site! So we use the -r option to make the download recursive. This means that wget will follow references to the files used by each webpage, via attributes such as href and src. While this recursion could potentially go on forever, it is limited by the (default) recursion depth (i.e. references are only followed to a certain level) and by the fact that wget will, by default, not follow links that span different hosts (i.e. jump to different domains). Here's how the recursion works:

wget -r http://www.geocities.com/mitzenos/

OK, that downloads an entire site. In the case of Geocities, which hosts many different accounts, wget may end up downloading other sites on Geocities too. If /mitzenos/ links to /xenerkes/, for example, both accounts are technically on the same host, so wget will happily download them both. We can solve this problem with the -np (no-parent) switch [ref1] [ref2]. Note that combining -r and -np as -rnp does not work (at least not on Windows).

wget -r -np http://www.geocities.com/mitzenos/

So that solved most of the problems. But when we try downloading /xenerkes/ separately, Geocities ends up taking the site down for an hour because of bandwidth restrictions, and the wget output fills with 503 Service Temporarily Unavailable errors. This is because Geocities imposes a 4.2MB hourly bandwidth limit (bastards). Since the Geocities webspace limit is 15MB, this makes it difficult to download any site between 4.2MB and 15MB in size.

The solution to this problem is to force wget to download files at a slower rate, so that if the site is, say, 5MB, then the bandwidth will be spread over more than one hour. This is done using the -w switch [ref: download options], which takes an argument in seconds by default (you can also specify minutes, etc). For Geocities, 40-60 seconds per file should be enough, if the files aren't very large. Back when Geocities was popular, it wasn't really normal to have very large files on the web, so that isn't really an issue. This is the line that solves it:

wget -r -np -w 60 http://www.geocities.com/mitzenos/

This command will obviously take several hours to download a site if there are a lot of files, so choose the download interval wisely. If you're exceeding the bandwidth limit then use a large interval (around 60 seconds); if there are lots of files and the download is too damn slow, then use a smaller interval (30-40 seconds).