Category Archives: Computers

Of computers and gadgets — why your screen goes blank, what kind of hosting you should get, how to get started on blogging etc.

Missing Events and Photos in iPhoto?

Let me guess – you got your new iMac. You had a recent Time Machine backup on your Time Capsule. Setting up the new iMac was ridiculously easy — just point to the backup. A few hours later, your new iMac is just like your old Mac, right down to the wallpaper and browser history. You shake your head in disbelief and say to yourself, “Man, this thing just works! This is the way it is supposed to be!”

A couple of days later, you fire up your iPhoto. It says it needs to update the database or whatever. No sweat. Just a couple of minutes — the new iMac is ridiculously fast. Hullo — what is wrong with the last four events? How come they have no photos in them? Well, actually, they do have something: you can see the thumbnails for a second, and then they disappear. The events seem to have the right number of photos. They even list the camera model and exposure data.

You scratch your head and say to yourself, “Well, maybe the Time Machine backup didn’t unpack properly or whatever. Maybe the version upgrade messed up some data. No sweat. I can use the Time Machine and find the right iPhoto Library.” You fire up the Time Machine — probably for the first time for real. You restore the last good backup of the iPhoto Library to your desktop, and launch iPhoto again. Database update again. Anxious wait. Hey, the damned events are still missing.

Panic begins to set in. Mad Google for answers. Ok, hold down the Option and Command keys, and launch iPhoto. Regenerate thumbnails. Repair the library. Rebuild the Database. Still, the ****** events refuse to come back.

How do I know all this? Because this is exactly what I did. I was lucky though. I managed to recover the events. It dawned on me that the problem was not with the restore process, nor the version update of iPhoto. It was the Time Machine backup process — the backup was incomplete. I had the old Mac and the old iPhoto library intact. So I copied the old library over to the new iMac (directly, over the network; not from the Time Machine backup). I then started iPhoto on the new machine. After the necessary database update, all the events and photos showed up. Phew!

So what exactly went wrong? It appears that Time Machine doesn’t back up the iPhoto Library properly if iPhoto is open (according to Apple). More precisely, the recently imported photos and events may not get backed up. This bug (or “feature”) was reported earlier and discussed in detail.

I thought I would share my experience here because it is an important piece of information and might save somebody some time, and possibly some valuable photos. And I feel it is disingenuous of Apple to tout the Time Machine as the mother of all backup solutions with this glaring bug. After all, your photos are among the most precious of your data. If they are not backed up and migrated properly, why bother with Time Machine at all?

To recap:

  1. If you find your photo collection incomplete after migrating to your shiny new iMac (using a Time Machine backup), don’t panic if you still have your old Mac.
  2. Exit gracefully from iPhoto on both machines.
  3. Copy your old iPhoto Library from the old Mac over to the new one (a sketch follows this list).
  4. Start iPhoto on the new Mac and enjoy.
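If you are comfortable with the Terminal, here is a rough sketch of steps 2 and 3. It assumes the old Mac’s home folder is mounted over the network at /Volumes/OldMac (a purely illustrative mount point; adjust it to whatever your setup actually shows in the Finder).

osascript -e 'quit app "iPhoto"'    # run on both machines: make sure iPhoto is closed
mv ~/Pictures/"iPhoto Library" ~/Pictures/"iPhoto Library.incomplete"    # set aside the broken library
cp -Rp "/Volumes/OldMac/Pictures/iPhoto Library" ~/Pictures/
# rsync works just as well, and can resume an interrupted copy:
# rsync -a --progress "/Volumes/OldMac/Pictures/iPhoto Library/" ~/Pictures/"iPhoto Library"/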

How to prevent this from happening

Before the final Time Machine backup from your old Mac, ensure that iPhoto is not running. In fact, it may be worth exiting from all applications before taking the final snapshot.
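If you like the command line, the same precaution can be scripted. This is only a sketch: tmutil ships with OS X Lion and later, so on older systems simply quit iPhoto and start the backup from the Time Machine menu instead.

osascript -e 'quit app "iPhoto"'    # make sure iPhoto is not running
tmutil startbackup                  # kick off a Time Machine backup right away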

If you want to be doubly sure, consider another automated backup solution just for your iPhoto Library. I use Carbon Copy Cloner.

Photo by Victor Svensson

Slow Time Machine with Time Capsule – SOLVED!

Let me guess — you bought a new Time Capsule, set up your Time Machine to back up half a terabyte of family photos and home videos, and expected it to be “hands-free” from then on? Then you got this progress bar saying that it will take 563 days (or some such ridiculous number) to sync?

Your next step was to trawl Google, which would have shown you that you are not alone. You would have tried Disk Utility to repair your Time Capsule disk, disabled Spotlight indexing, connected your Mac directly to the TC, etc. Nothing has helped so far? Fear not, here is what you need to do.

First of all, launch the Software Update pane from System Preferences on your Mac.

Ensure that you have the OS X Lion v10.7.5 Supplemental Update installed, which specifically addresses this problem.

Here is what Apple says about this update:

About OS X Lion 10.7.5 Supplemental Update
The OS X v10.7.5 Supplemental Update is recommended for all users running OS X Lion v10.7.5 and includes the following fixes:

  • Resolves an issue that may cause Time Machine backups to take a very long time to complete
  • Addresses an issue that prevents certain applications signed with a Developer ID from launching

If it is not installed, click on the “Scheduled Check” tab, and install it. Note that it may come bundled with other updates. So, as long as your Mac is up-to-date, you don’t have to worry too much about missing this particular update.
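If you prefer the Terminal, the softwareupdate command (which ships with OS X) does the same check:

softwareupdate --list                   # list updates that have not been installed yet
sudo softwareupdate --install --all     # install everything that is pending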

In all likelihood, this update is all that you will need to fix your slow Time Machine on Time Capsule. To verify, restart your machine and launch Time Machine. Give it a few minutes and see if the speed is acceptable (about 10-20 MB a second on your wired Gigabit network).

If it is not, or if you have other reasons for not installing the update, there are a few other tips you can try.

  • Quit applications that may be indexing the file system (Dropbox, QuickSilver etc.). Find their icons on your menu bar, right-click on them and select Quit.
  • Ensure that Finder is not set to calculate all sizes. Open a Finder window, hit Cmd-J to bring up the view options, and ensure that “Calculate all sizes” is not ticked.

    Note that this setting is not under the usual Finder preferences; it lives in the view options that Cmd-J brings up.

  • The last thing to try is to kill and relaunch Finder. Click on the Apple logo on the menu bar, select “Force Quit…” to bring up the Force Quit window, select Finder and hit the Relaunch button.

The last step (of killing and relaunching Finder) has been touted as something that definitely works. So do give it a try if nothing else helps. Another way of killing and relaunching Finder is to issue the command killall Finder from a terminal window.
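For the terminal-inclined, here is one way to do the last two tips in one go. The application names are just examples (quitting them from their menu bar icons works equally well), and Finder relaunches on its own after being killed.

osascript -e 'quit app "Dropbox"'
osascript -e 'quit app "Quicksilver"'
killall Finder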

If these tips didn’t work, you are pretty much out of luck. There is still one more thing you could try, which probably will not work. It certainly didn’t for me, but it gave me a sense that I was “fixing” the problem.

Connect your Time Capsule (TC) directly to your Mac. In order to do this, follow these steps.

  • First, connect your TC to your network, and set it up using the AirPort Utility.
  • Disconnect it from your network. (Disconnect the ethernet cable.)
  • Disconnect the ethernet cable from your Mac, and connect the TC (via one of its three LAN ports) directly to your Mac.

How to Avoid Duplicate Imports in iPhoto

For the budding photographer in you, iPhoto is a godsend. It is the iLife photo organization program that comes pre-installed on your swanky new iMac or MacBook Air. In fact, I would go as far as to say that iPhoto is one of the main reasons to switch to a Mac. I know, there are alternatives, but for seamless integration and smooth-as-silk workflow, iPhoto reigns supreme.

But (ah, there is always a “but”), the workflow in iPhoto can create a problem for some. It expects you to shoot pictures, connect your camera to your Mac, move the photos from the camera to the Mac, enhance/edit and share (Facebook, Flickr) or print or make photo books. This flow (with some face recognition, red-eye removal, event/album creation etc.) works like a charm — if you are just starting out with your new digital camera. What if you already have 20,000 old photos and scans on your old computer (in “My Pictures”)?

This is the problem I was faced with when I started playing with iPhoto. I pride myself on anticipating such problems. So, I decided to import my old library very carefully. While importing “My Pictures” (which was fairly organized to begin with), I went through it folder by folder, dragging-and-dropping them on iPhoto and, at the same time, labeling them (and the photos therein) with what I thought were appropriate colors. (I used the “Get Info” function in Finder for color labels.) I thought I was being clever, but I ended up with a fine (but colorful) mess, with my folders and photos sporting random colors. It looked impossible to compare and figure out where my 20,000 photos got imported to in iPhoto; so I decided to write my very first Mac App — iPhotoTagger. It took me about a week to write it, but it sorted out my photo worries. Now I want to sell it and make some money.

Here is what it does. It first goes through your iPhoto library and catalogs what you have there. It then scans the folder you specify and compares the photos in there with those in your library. If a photo is found exactly once, it will get a Green label, so that it stands out when you browse to it in your Finder (which is Mac-talk for Windows Explorer). Similarly, if the photo appears more than once in your iPhoto library, it will be tagged in Yellow. And, going the extra mile, iPhotoTagger will color your folder Green if all the photos within have been imported into your iPhoto library. Those folders that have been partially imported will be tagged Yellow.

The photo comparison is done using Exif data, and is fairly accurate. Note that iPhotoTagger doesn’t modify anything within your iPhoto library. Doing so would be unwise. It merely reads the library to gather information.
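To give you a feel for the idea (and this is only an illustration, not iPhotoTagger’s actual code), here is how a crude version of the comparison could be done with the third-party exiftool and a dash of AppleScript. It assumes the library keeps its originals under “Masters” (older iPhoto versions use “Originals”) and that Finder label index 6 is green and 3 is yellow.

LIBRARY="$HOME/Pictures/iPhoto Library/Masters"
for f in "$HOME/Pictures/My Pictures"/*.jpg; do
  stamp=$(exiftool -s3 -DateTimeOriginal "$f")
  [ -z "$stamp" ] && continue    # skip photos without Exif data
  hits=$(exiftool -q -if "\$DateTimeOriginal eq '$stamp'" -p '$FileName' -r "$LIBRARY" | wc -l)
  if [ "$hits" -eq 1 ]; then label=6      # green: imported exactly once
  elif [ "$hits" -gt 1 ]; then label=3    # yellow: imported more than once
  else continue
  fi
  osascript -e "tell application \"Finder\" to set label index of (POSIX file \"$f\" as alias) to $label"
done

The real app also rolls the results up to the folder level, which this sketch does not attempt.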

This first version (V1.0) is released to test the waters, as it were, and is priced at $1.99. If there is enough interest, I will work on V2.0 with improved performance (using Perl and SQLite, if you must know). I will price it at $2.99. And, if the interest doesn’t wane, a V3.0 (for $3.99) will appear with a proper help file, performance pane, options to choose your own color scheme, Spotlight comments (and, if you must know, probably rewritten in Objective-C). Before you rush to send me money, please know that iPhotoTagger requires Snow Leopard or Lion (OS X 10.6 or 10.7). If in doubt, you can download the lite version and play with it. It is fully functional, and will create lists of photos/folders to be tagged in Green and Yellow, but won’t actually tag them.

Your Virtual Thumbdrive

I wrote about DropBox a few weeks ago, ostensibly to introduce it to my readers. My hidden agenda behind that post was to get some of you to sign up using my link so that I get more space. I was certain that all I had to do was to write about it and every one of you would want to sign up. Imagine my surprise when only two signed up, one of whom turned out to be a friend of mine. So I must have done it wrong. I probably didn’t bring out all the advantages clearly enough. Either that or not many people actually lug their data around in their thumbdrives. So here I go again (with the same, not-so-hidden agenda). Before we go any further, let me tell you clearly that DropBox is a free service. You pay nothing for 2GB of online storage. If you want to go beyond that limit, you do pay a fee.

Most people carry their thumbies around so that they can access their files from any computer they happen to find themselves in front of. If these computers are not your habitual computers (i.e., your wife’s notebook, the kids’ PC, the office computer etc.), the virtual DropBox may not totally obviate the need for a real thumbdrive. For random computers, virtual just doesn’t cut it. But if you are a person of habits and shuttle from one regular computer to another, DropBox is actually a lot better than a real USB drive. All you have to do is to install DropBox on all those machines, which don’t even have to be of the same kind — they can be Macs, PCs, Linux boxes etc. (In fact, DropBox can be installed on your mobile devices as well, although how you will use it there is far from clear.) Once you install DropBox, you will have a special folder (or directory) where you can save stuff. This special folder/directory is, in reality, nothing but a regular one; it is just that there is a background program monitoring it and syncing it magically with a server (which is on a cloud), and with all the other computers where you have DropBox installed under your credentials. Better yet, if your computers share a local network, DropBox uses it to sync among them in practically no time.

In addition to this file synchronization, DropBox is an off-site mirror of your synced files. So if you keep your important files in the DropBox folder, they will survive forever. This is an advantage that no physical, real thumbdrive can offer you. With real thumbdrives, I personally have lost files (despite the fact that I am fairly religious about regular copies and mirrors) due to USB drives dying on me. With DropBox, that will never happen. You have local copies on all the computers where you have DropBox running and a remote copy on a cloud server.

But you might say, “Ha, that is the problem — how can I put my personal files on some remote location where anybody can look at them?” Well, DropBox says that they use industry-standard encryption that they themselves cannot unlock without your password. I chose to trust them. After all, even if they could decrypt it, how can they trawl through terabytes of data in random formats in the hope of finding your account number or whatever? Besides, if you are really worried about security, you can always create a TrueCrypt volume in DropBox.

Another use you can put DropBox to is in keeping your application data synced between computers. This works best with Macs and symbolic links. For instance, if you have a MacBook and an iMac, you can put your address book in your DropBox directory, create a symbolic link from the normal location (in ~/Library/ApplicationData/Mail.app) and expect to see the same address book in both the computers. A similar trick will work with other applications as well. I have tried it with my offline blogging software (ecto) and my development environment (NetBeans).
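Here is a minimal sketch of the symbolic-link trick. “SomeApp” below is a placeholder, not a real folder name; substitute the directory your application actually keeps its data in, and do back it up first, since not every application tolerates its data living behind a symbolic link.

mv ~/Library/SomeApp ~/Dropbox/SomeApp      # move the data into your DropBox folder
ln -s ~/Dropbox/SomeApp ~/Library/SomeApp   # leave a link where the app expects its data
# On your other Macs, remove (or rename) the local folder and create the same link.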

Want more reasons to sign up? Well, you can also share files with other users. Suppose your spouse has a DropBox of her own, and you want to share some photos with her. This can be easily arranged. And I believe the photos folder in DropBox behaves like a gallery, although I haven’t tested it.

So, if you find these reasons compelling enough to get a virtual thumbdrive in addition to (or instead of) a real physical one, do sign up for DropBox via any of the million links on this page. Did I tell you that if your friends signed up using your link, you would get 250MB extra for each referral?

Photo by Debs (ò‿ó)♪

Hosting Services

In today’s world, if you don’t have a website, you don’t exist. Well, that may not be totally accurate — you may do just fine with a facebook page or a blog. But the democratic nature of the Internet inspires a lot of us to become providers of information rather than just consumers. The smarter ones, in fact, strategically position themselves in between the providers and the consumers, and reap handsome rewards. Look at the aforementioned facebook, or Google, or any one of those Internet businesses that made it big. Even the small fries of the Internet, including small-time bloggers such as yours faithfully, find themselves facing technical issues of the web-traffic and stability variety. I recently moved from my shared hosting at NamesDirect.com to a virtual private host at Arvixe.com, and even more recently to InMotion. There, I have done it. I have gone and dropped technical jargon on my readers. But this post is on the technical choices budding webmasters have. (Before we proceed further, let me disclose the fact that the links to InMotion in this post are all affiliate links.)

When you start off with a small website, you typically go with what they call “shared hosting” — the economy class of web hosting solutions. You register a domain name (such as thulasidas.com) for $20 or $30 and look around for a place on the web to put your pages. You can find this kind of hosting for under $10 a month. (For instance, InMotion has a package for as low as $4 a month, with a free domain name registration thrown in.) Most of these providers advertise unlimited bandwidth, unlimited storage, unlimited databases etc. Well, don’t believe everything you see on the Internet; you get what you pay for. If you read the fine print before clicking “here” to accept the 30-page-long terms and conditions, you would see that unlimited really means limited.

For those who have played around with web development at home, shared hosting is like having XAMPP installed on your home computer with multiple users accessing it. Sure, the provider may have a mighty powerful computer, huge storage space and large pipe to the Internet or whatever, but it is still sharing. This means that your own particular needs cannot be easily accommodated, especially if it looks as though you might hog an unfair share of the “unlimited” resources, which is what happened with my provider. I needed a “CREATE TEMPORARY TABLE” privilege for a particular application, and my host said, “No way dude.”

Shared hosting comes in different packages, of course. Business, Pro, Ultimate etc. — they are all merely advertising buzzwords, essentially describing different sizes of the share of the resources you will get. The next upgrade is another buzzword — Cloud Hosting. Here, the resources are still shared. But apparently they reside on geographically dispersed data centers, optimized and scalable through some kind of grid technology. This type of hosting is considered better because, if you run out of resources, the hosting program can allocate more. For instance, if you suddenly have a traffic spike because of your funny post going viral on facebook and digg, the cloud could easily handle it. They will, of course, charge you more, but in the shared hosting scenario, they would probably lock you out temporarily. To me, cloud hosting sounds like shared hosting with some of the resource constraints removed. It is like sharing a pie, but with all the ingredients on hand, so that if you run out, they can quickly bake some more for you.

The “business class” of web hosting is VPS or Virtual Private Server. Here, you have a server (albeit a virtual one) for yourself. Since you “own” this server, you can do whatever you like with it — you have “root” access. And the advertised resources are, more or less, dedicated to you. This is like having a VirtualBox running on your home PC where you have installed XAMPP. The only downside is that you don’t know how many other VirtualBoxes are running on the computer where your VPS is running. So the share of the resources you actually get to enjoy may be different from the so-called “dedicated” ones. For root access and quasi-dedicated resources, you pay a premium. VPS costs roughly ten times as much as shared hosting. InMotion, for instance, has a VPS package for $40 a month, which is what I signed up for.

VPS hosting comes with service level agreements that typically state 99.9% uptime or availability. It is important to note that this uptime refers, not to your instance of VPS, but to the server that hosts the virtual servers. Since you are the boss of your VPS, if it crashes, it is largely your problem. Your provider may offer a “fully managed” service (InMotion does), but that usually means you can ask them to do some admin work and seek advice. In my case, my VPS started hanging (because of some FastCGI issues before I decided to move to DSO for PHP support so that APC worked — I know, lots of techie jargon, but I am laying the groundwork for my next post on server management). When I asked the support to help diagnose the problem, they said, “It is hanging because your server is spawning too many PHP processes. Anything I can help you with?” Accurate statement, I must admit, but not necessarily the kind of help you are looking for. They were saying, ultimately, the VPS server was my baby, and I would have to take care of it.
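For what it is worth, a quick way to see how many PHP processes your VPS is spawning at any moment (the [p] in the pattern keeps grep from counting itself):

ps aux | grep -c '[p]hp'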

If you are a real high-flying webmaster, the type of hosting you should go for is a fully dedicated one. This is kind of like the first-class or private-jet situation in my analogy. This hosting option will run you a considerable cost, anywhere from $200 to several thousand dollars per month. For that kind of money, what you will get is a powerful server (well, at least for the costlier ones of these plans) housed in a datacenter with redundant power supplies and so on. Dedicated hosting, in other words, is a real private server, as opposed to a virtual one.

I have no direct experience with a hosted dedicated server, but I do have a couple of servers running at home for development purposes. I run two computers with XAMPP (one real and one on a VirtualBox on my iMac) and two with MAMP. And I presume the dedicated-server experience is going to be similar — a server at your beck and call with resources earmarked for you, running whatever it is that you would like to run.

Straddling shared and VPS hosting is what they call a reseller account. This type of hosting essentially sets you up as a small web hosting provider (presumably in a shared hosting mode, as described above) yourself. This can be interesting if you want to make a few bucks on the side. InMotion, for instance, offers you a reseller package for $20, and promises to look after end-user support themselves. Of course, when you actually resell to your potential customers, you may want to make sure your offering has something better than what they can get directly from the company either in terms of pricing or features. Otherwise, it wouldn’t make much sense for them to come to you, would it?

So there. That is the spectrum of hosting options you have. All you need to do is to figure out where in this spectrum your needs fall, and choose accordingly. If you end up choosing InMotion (a wise choice), I would be grateful if you do so using one of my affiliate links.

We Are Moving…

Unreal Blog has moved to a more powerful server at Arvixe. [Disclosure: All the server links in this article are affiliate links.] For those interested in moving your hosting to a new server, I thought I would describe the “gotchas” involved.

This gotcha got me during a test migration of my old posts to the new server. I had over 130 posts to migrate. When I moved them to the new blog on the new server, they looked like new posts. To the unforgiving logic of a computer (that defies common sense and manages to foul up life), this pronouncement of newness is accurate, I have to admit — they were indeed new posts on the new server. So, on the 10th of January, my regular readers who had signed up for updates received over 100 email notifications about “new posts” on my blog. Needless to say, I started getting angry emails from my annoyed regulars demanding that I remove their names from my “list” (as one of them put it). If you were one of those who got excessive emails, please accept my apologies. Rest assured that I have turned off email notifications, and I will look long and hard into the innards of my blog before turning it back on. And when I do turn it on, I will prominently provide a link in each message to subscribe or unsubscribe yourself.

As you grow your web footprint and your blog traffic, you are going to have to move to a bigger server. In my case, I decided to go with Arvixe because of the excellent reviews I found on the web. The decision of what type of hosting you need makes for an interesting topic, which will be my next post.

Cloud Computing

I first heard of “Cloud Computing” when my friend in Trivandrum started talking about it, organizing seminars and conferences on the topic. I was familiar with Grid Computing, so I thought it was something similar and left it at that. But a recent need of mine illustrated to me what cloud computing really is, and why one would want it. I thought I would share my insight with the uninitiated.

Before we go any further, I should confess that I write this post with a bit of an ulterior motive. What that motive is is something I will divulge towards the end of this post.

Let me start by saying that I am no noob when it comes to computers. I started my long love affair with computing and programming in 1983. Those late night bicycle rides to CLT and stacks of Fortran cards – those were fun-filled adventures. We would submit the stack to the IBM 370 operators early in the morning and get the output in the evening. So the turn around time for each bug fix would be a day, which I think made us fairly careful programmers. I remember writing a program for printing out a calendar, one page per month, spaced and aligned properly. Useless really, because the printout would be on A3 size feed rolls with holes on the sides, and the font was a dirty Courier type of point size 12 in light blue-black, barely legible at normal reading distance. But it was fun. Unfortunately I made a mistake in the loop nesting and the calendar came out all messed up. Worse, the operator, who was stingy about the paper usage, interrupted the output on the fourth month and advised me to stop doing it. I knew that he could not interrupt it if I used only one Fortran PRINT statement and rewrote the program to do it that way. I got the output, but on the January page, there was this hand-written missive, “Try it once more and I will cancel your account.” At that point I ceased and desisted.

I started using email in the late eighties on a cluster of VAXstations that belonged to the high-energy physics group at Syracuse University. At first, we could send email only to users on the same cluster, with DECnet addresses like VAX05::MONETI. And a year later, when I could send a mail to my friend in the next building with an address like IN%”naresh@ee.syr.edu” or something (the “IN” signifying Internet), I was mighty impressed with the pace at which technology was progressing. Little did I know that a few short years later, there would be Usenet, Mosaic and e-commerce. And that I would be writing books on financial computing and WordPress plugins in PHP.

Despite keeping pace with computing technology most of my life, I have begun to feel that technology is slowly breaking free and drifting away from me. I still don’t have a Twitter account, and I visit my Facebook only once a month or so. More to the point of this post, I am embarrassed to admit that I had no clue what this cloud computing was all about. Until I got my MacBook Air, thanks to my dear wife who likes to play sugar mama once in a while. I always had this problem of synchronizing my documents among the four or five PCs and Macs I regularly work with. With a USB drive and extreme care, I could manage it, but the MBA was the proverbial straw that broke my camel of a back. (By the way, did you know this Iranian proverb – “Every time the camel shits, it’s not dates”?) I figured that there had to be a better way. I had played with Google Apps for a while by then, although I didn’t realize that it was cloud computing.

What I wanted to do was a bit more involved than office applications. I wanted to work on my hobby PHP projects from different computers. This means something like XAMPP or MAMP along with NetBeans on all the computers I work with. But how do I keep the source code sync’ed? Thumbdrives and backup/sync programs? Not elegant, and hardly seamless. Then I hit upon the perfect solution – Dropbox! This way, you store the source files on the network (using Amazon S3, apparently, but that is beside the point), and see a directory (folder for those who haven’t obeyed Steve Jobs and gone back to the Mac) that looks suspiciously local. In fact, it is a local directory – just that there is a program running in the background syncing it with your folder on the cloud.

Dropbox gives you 2GB of network storage free, which I find quite adequate for any normal user. (That sounds like the famous last words attributed to Bill Gates, doesn’t it? “640KB of memory should be enough for anyone!”) And, you can get 250MB extra for every successful referral you make. That brings me to my ulterior motive – all the links to Dropbox in this post are actually referral links. When you sign up and start using it by clicking on one of them, I get 250MB extra. Don’t worry, you get 250MB extra as well. So I can grow my online storage up to 8GB, which should keep me happy for a long time, unless I want to store my photos and video there, in which case I will upgrade my Dropbox account to a paid service.

Apart from giving me extra space, there are many reasons you should really check out Dropbox. I will write more on those reasons later, but let me list them here.

  1. Sync your (Mac) address book among your Macs.
  2. Multiple synced backups of your precious data.
  3. Transparent use for IDEs such as NetBeans.

Some of these reasons are addressed only by following some tips and tricks, which I will write about.

By the way, we Indian writers like to use expressions like ulterior motives and vested interests. Do you think it is because we always have some?

Blank Screen after Hibernate or Sleep?

Okay, the short answer: increase your virtual memory to more than the size of your physical memory.

Long version now. Recently, I had this problem with my PC that it wouldn’t wake up from hibernation or sleep mode properly. The PC itself would be on and churning, but the screen would switch to power save mode, staying blank. The only thing to do at that point would be to restart the computer.

Like the good netizen that I am, I trawled the Internet for a solution. But didn’t find any. Some suggested upgrading the BIOS, replacing the graphics card and so on. Then I saw this mentioned in a Linux group, saying that the size of the swap file should be more than the physical memory, and decided to try it on my Windows XP machine. And it solved the problem!

So the solution to this issue of blank screen after waking up is to set the size of the virtual memory to something larger than the memory in your system. If you need more information, here is how, in step-by-step form. These instructions apply to a Windows XP machine.

  1. Right-click on “My Computer” and hit “Properties.”
  2. Take a look at the RAM size, and click on the “Advanced” tab.
  3. Click on the “Settings” button under the “Performance” group box.
  4. In the “Performance Options” window that comes up, select the “Advanced” tab.
  5. In the “Virtual Memory” group box near the bottom, click on the “Change” button.
  6. In the “Virtual Memory” window that pops up, set the “Custom Size” to something more than your RAM size (that you saw in step 2). You can set it on any hard disk partition that you have, but if you are going through all these instructions, chances are you have only “C:”. In my case, I chose to put it on “M:”.

How to save a string to a local file in PHP?

This post is the second one in my geek series.

While programming my Theme Tweaker, I came across this problem. I had a string on my server in my PHP program (the tweaked stylesheet, in fact), and I wanted to give the user the option of saving it to a file on his computer. I would’ve thought this was a common problem, and all common problems can be solved by Googling. But, lo and behold, I just couldn’t find a satisfactory solution. I found my own, and thought I would share it here, for the benefit of all the future Googlers yet to come and go.

Before we go into the solution, let’s understand what the problem is. The problem is in the division of labor between two computers — one is the server, where your WordPress and PHP are running; the other is the client’s computer where the viewing is taking place. The string we are talking about is on the server. We want to save it in a file on the client’s computer. The only way to do it is by serving the string as an HTTP response.

At first glance, this doesn’t look like a major problem. After all, servers regularly send strings and data to clients — that’s how we see anything in the browser, including what you are reading. If it was just any PHP program that wanted to save the string, it wouldn’t be a problem. You could just dump the string into a file on the server and serve the file.

But what do you do if you don’t want to give the whole world a way of dumping strings to files on your server? Well, you could do something like this:

<?php
// Tell the browser to treat the response as a file download named style.css.
header('Content-Disposition: attachment; filename="style.css"');
header("Content-Transfer-Encoding: ascii");
header('Expires: 0');
header('Pragma: no-cache');
print $stylestr ;  // $stylestr holds the tweaked stylesheet computed earlier
?>

So, just put this code in your foo.php that computes the string $stylestr and you are done. But our trouble is that we are working in the WordPress plugin framework, and cannot use the header() calls. When you try to do that, you will get an error message saying that the headers have already been sent. For this problem, I found an ingenious solution in one of the plugins that I use. I forget which one, but I guess it is a common technique. The solution is to define an empty iFrame and set its source to what the PHP function would write. Since the iFrame expects a full HTML source, you are allowed (in fact, obliged) to give the header() directives. The code snippet looks something like:

<iframe id="saveCSS" src="about:blank" style="visibility:hidden;border:none;height:1em;width:1px;"></iframe>
<script type="text/javascript">
// Point the hidden iframe at the URL that will serve the stylesheet download.
var fram = document.getElementById("saveCSS");
<?php echo 'fram.src = "' . $styleurl . '";' ; ?>
</script>

Now the question is, what should the source be? In other words, what is $styleurl? Clearly, it is not going to be a static file on your server. And the purpose of this post is to show that it doesn’t have to be a file on the server at all. It is a two-part answer. You have to remember that you are working within the WordPress framework, and you cannot make standalone php files. The only thing you can do is to add arguments to the existing php files, or the plugins you have created. So you first make a submit button as follows:

<form method="post" action="<?php echo $_SERVER["REQUEST_URI"]?>">
<div class="submit">
<input type="submit" name="saveCSS" title="Download the tweaked stylesheet to your computer" value="Download Stylesheet" />
</div>
</form>

Note that the name attribute of the button is “saveCSS.” Now, in the part of the code that handles submits, you do something like:

<?php
// Append the "save" argument to the plugin's own admin page URL.
if (isset($_POST['saveCSS']))
    $styleurl = get_option('siteurl') . "/wp-admin/themes.php?page=theme-tweaker.php&save" ;
?>

This is the $styleurl that you would give as the source of your iFrame, fram. Note that it is the same as your plugin page URL, except that you managed to add “&save” at the end of it. The next trick is to capture that argument and handle it. For that, you use the WordPress API function, add_action as:

<?php
// $thmTwk is the plugin object; saveCSS is the method defined below.
if (isset($_GET['save'] ))
    add_action('init', array(&$thmTwk, 'saveCSS'));
else
    remove_action('init', array(&$thmTwk, 'saveCSS'));
?>

This adds a function saveCSS to the init part of your plugin. Now you have to define this function:

<?php
function saveCSS() {
    // Send the download headers before WordPress produces any output of its own.
    header('Content-Disposition: attachment; filename="style.css"');
    header("Content-Transfer-Encoding: ascii");
    header('Expires: 0');
    header('Pragma: no-cache');
    $stylestr = "Whatever string you want to save";
    ob_start() ;
    print $stylestr ;
    ob_end_flush() ;
    die() ;  // stop here so WordPress doesn't append its own output to the saved file
}
?>

Now we are almost home free. The only thing to understand is that you do need the die(). If your function doesn’t die, it will spew out the rest of the WordPress generated stuff into your save file, appending it to your string $stylestr.

It may look complicated. Well, I guess it is a bit complicated, but once you implement it and get it running, you can (and do) forget about it. At least, I do. That’s why I posted it here, so that the next time I need to do it, I can look it up.