Back to school sale

EDIT: The sale is over now. Thanks!

In celebration of the 200,000th download of Blurity, we’re having a back-to-school sale! From now until September 1st, 2015, take 25% off the normal price of Blurity when you buy a registration key. The discount will be applied automatically.

Want to save even more? While we can’t give away Blurity for free, we can give you a coupon code for an additional $5 off. Enter this code at checkout: BTS2015 (not valid for PayPal purchases)

Thanks for choosing Blurity for your photo deblurring needs!


The many definitions of blur

In the years I’ve been developing Blurity, one thing has become clear: not everybody thinks of “blur” the same way.

For me, and thus for Blurity, a “blurry” image is one that has had its high-frequency content obscured through either motion or poor focus. Other than the blur, the image is fine: correctly exposed, with minimal noise and acceptable resolution. This idea of a blurry image is common to many Blurity customers, particularly those with photography experience. Others have an entirely different view.

Through email exchanges with customers and potential customers, I’ve learned that many people consider “blur” to include image degradation of all types. Sometimes, the problem is excessive noise, whether from the sensor, from quantization, or from compression. Other times, the blur is nominally motion blur, but the image is so horribly over-exposed that all of the latent information has been lost.

When the image is noisy, what you really need is a good noise-removal program. The latest versions of Lightroom, Photoshop, and GIMP all have decent noise-removal tools.  My favorite is a third-party tool called Noiseware.

If the image is over- or under-exposed, again Lightroom, Photoshop, or GIMP are all your friends. It might not be possible to restore the areas that have clipped to white or black, but generally a decent amount of improvement is possible.

But those situations are relatively infrequent. By far the most common misconception of “blur” has to do with low-resolution images.

The exchange usually starts as a complaint. “It isn’t working!” the person will say about Blurity. I ask them to send me a photo, and what I get is a thumbnail-size, highly compressed hint of an image. Further communication reveals that the person was expecting Blurity to enlarge the photo and produce something with enough resolution to print at a large size.

If you want to enlarge this…

Blurity doesn’t do that. Nothing does that. It isn’t that Blurity doesn’t know how to scale up an image using something like bicubic interpolation (though some people seem to think that feeding in an already-enlarged image will magically work better than the original thumbnail). The problem is that increasing the resolution of an image is only distantly related to deblurring an image, and even then, there are practical limits to how much extra detail can be recreated.

…don’t expect much better than this

When Blurity deblurs an image, most of the newly revealed detail was already in the image; it was just hidden. Blind deblurring, which is what Blurity does, is what’s known as an ill-posed inverse problem, meaning that some assumptions are necessary to make the deblurring work, but they are relatively minor.
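
For the mathematically inclined, the standard way to picture this is a forward model: the blurry photo is the hidden sharp scene convolved with a blur kernel (the point spread function), plus noise. Here’s a minimal sketch of that model in Python; it’s illustrative only, not Blurity’s actual code.

```python
# Forward model of blur. Blind deblurring has to recover both `sharp` and
# `kernel` from `blurry` alone, which is what makes the problem ill-posed.
import numpy as np
from scipy.signal import fftconvolve

def simulate_blur(sharp, kernel, noise_sigma=0.01):
    """sharp: 2D grayscale array; kernel: small 2D PSF that sums to 1."""
    blurry = fftconvolve(sharp, kernel, mode="same")
    return blurry + np.random.normal(0.0, noise_sigma, blurry.shape)
```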

On the other hand, increasing the resolution of an image, a task called “super resolution,” requires many more assumptions about the nature of the enlarged image. There is a lot less data to work with than in the blurry-image case.

As I’ve mentioned before when writing about enlarging small images, there are a handful of utilities that can help slightly increase the resolution of photos. Results tend to be hit-or-miss, but it’s worth giving them a shot. Just don’t expect Blurity to do super-resolution.

Actually, there’s one more common type of support email: the person with the severely out-of-focus image. Although the Blurity user manual makes it very clear that only small focus blurs can be repaired, I still get a number of emails from people who accidentally turned off autofocus on their DSLR or had their cell phone camera decide to focus on the mountains in the background instead of the people in the foreground. Oftentimes, the unfocused images are of loved ones, handshakes with the famous, or cherished vacation souvenirs.

Generally, Blurity can recover a decent amount of detail in an out-of-focus image. The problem is that the photos tend to be so out of focus that there are large gaps in the information that’s left. Those gaps lead to visual artifacts, like ringing (discussed near the end of the user manual). While ringing isn’t a problem for forensic work, it’s highly noticeable on people’s faces, which leaves people unsatisfied.

So what to do about that? First, understand that Blurity is best used on either motion blur or relatively small focus blur. If plenty of detail gets revealed during the deblurring, but some visual artifacts are left over, those artifacts can sometimes be smoothed over manually with a photo retouching tool like Photoshop.

Just remember, have reasonable expectations, prepare for some artifacts, and don’t expect tiny images to magically become big.

 


Larger images now supported on Windows

Blurity no longer limits images on Windows to 24 megapixels. We heard this complaint time and again, with increasing frequency as high-pixel-count cameras became more common. Fundamentally, the problem was memory: Blurity requires a very large amount of memory to process images. The newest version of Blurity, at least on Windows, resolves the problem.

That limitation had been in place because Windows prevents 32-bit applications from allocating more than 2 GB of memory. Starting in version 1.4.167, the core deconvolution code is built as a 64-bit application, allowing it to use a virtually unlimited amount of memory. All that is required is enough RAM and a 64-bit version of Windows.

An informal survey of Blurity users showed that almost all have more than 2 GB of RAM, with 4 GB and even 8 GB being relatively common. A handful even had systems with 16 GB or more of RAM. Blurity can now make use of all of that extra space. On a system with 16 GB of RAM, Blurity should (in theory) be able to handle a 267 megapixel image.
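
Working backward from those numbers gives a rough feel for the memory budget. This is back-of-the-envelope only; the actual requirement depends on the blur size and other settings.

```python
# 16 GB of RAM spread over a 267 megapixel image works out to roughly
# 64 bytes of working memory per pixel during deconvolution.
ram_bytes = 16 * 1024**3
pixels = 267e6
print(ram_bytes / pixels)   # ~64 bytes per pixel
```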

If your system has less than the recommended amount of RAM (i.e., less than 4 GB of RAM), or you’re using a 32-bit version of Windows, Blurity now has better logic about when to sacrifice performance for the sake of image size. Therefore, large images on those systems have a higher probability of being processed, albeit at a slower speed relative to systems with the recommended amount of memory.

One other major improvement in version 1.4.167 of Blurity is that processing is roughly 10% faster than 1.4.165.

The Mac version of Blurity is still built as a 32-bit application, but because Mac OS X allows 32-bit applications to allocate up to 4 GB of memory, the size limit has always been higher: at least 33 megapixels, possibly more. The speed and small-memory optimizations will be released soon, so watch for a new 64-bit version for the Mac.


Advanced Blurity usage: custom blur models

Sometimes, not even Blurity can remove the blur from a photo on its own. If that happens, you might be able to help it along by using one of Blurity’s advanced features: opening a custom blur model.

In ordinary operation, Blurity builds a model of the blur in a photo by looking for clues about the hidden sharp image’s structure. Unfortunately, poor lighting, bad focus, and sensor noise can all cause that modeling to fail.

However, it is sometimes possible for a human to “see” the form of the blur in the image even when Blurity cannot. It is most visible as glints on rounded surfaces, where point sources of light trace out the movement of the camera. In effect, those glints show the “point spread function” that Blurity normally attempts to derive via an algorithm.

If a glint showing the point spread function is visible in the image, and — importantly — the glint represents an accurate model of the blur throughout the image, it’s possible to extract the glint from the image and then use Blurity’s “Open blur model” function to deblur the image using that model.

How about an example?

Here’s a photo of a bottle of vitamins and a bottle of cologne. I took it with my cellphone, a Motorola Droid X, which runs Android. The same procedures could be applied to any other camera, from a DSLR to an iPhone to a point-and-shoot.

The photo is in focus, but it is clearly motion blurred. Thus, it’s a good candidate for deblurring.

The blurry image, taken with an Android smartphone (Click image for full-resolution version)

Normally, you’d use Blurity in Basic or Advanced mode and let it figure out the blur model on its own, which it does easily with this image. You wouldn’t want to follow the steps outlined here in most situations.

However, for the sake of demonstration, we’ll assume that the automated ways failed.

Look closely at the cover of the cologne bottle. Notice how it’s shiny. In particular, notice how at one of the corners there’s a bright, slightly curved line. That’s a glint, and it’s tracing out the motion of the camera. We want to extract that glint so that we can feed it back into Blurity.

First, open up the blurry photo in your favorite photo editor. We’ll use Photoshop in this example.

Select the glint plus some margin around it.

Copy that selection into a new image.

The more accurate the blur model is, the better the deblurring results will be. Therefore, we want to eliminate all of the superfluous data from around the glint. Ultimately, we want to end up with something that looks like one of the computed blur models. (If the idea of the computed blur model is unfamiliar to you, please familiarize yourself with how to use Blurity by referring to the user manual.)

First, ensure that the blur model is grayscale by desaturating it.

Next, adjust the levels so that the glint is isolated.

You want the glint to be bright and everything else to be black. However, you don’t want the entire glint clipped to white; shades of gray in the blur model can be just as important to deblurring quality as its overall shape. The brightest parts of the blur model should sit just below the clipping point.

Depending on your situation, you might need to manually paint parts of your blur model black. Remember that you want to eliminate everything that isn’t part of the model of the blur. In this example, the blur model was sufficiently clean without manually painting any areas black.

Assuming all has gone well, the end result is a clean model of the blur. Save it as a PNG file.
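
If you’d rather script the cleanup than do it by hand in Photoshop, the same steps can be approximated in a few lines of Python with Pillow and NumPy. This is only a sketch: the file names and threshold values are placeholders that you would tune by eye.

```python
# Rough equivalent of the manual steps: desaturate, adjust levels so the glint
# is isolated, keep the peak just below clipping, and save as a PNG.
import numpy as np
from PIL import Image

glint = Image.open("glint_crop.png").convert("L")      # desaturate to grayscale
px = np.asarray(glint, dtype=np.float32)

black_point, white_point = 60.0, 230.0                 # tune these by eye
px = np.clip((px - black_point) / (white_point - black_point), 0.0, 1.0)
px = (px * 250.0).astype(np.uint8)                     # brightest parts stay just below 255

Image.fromarray(px).save("blur_model.png")
```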

Now it’s time to bring our blur model into Blurity. First, open the blurry image in Blurity. Once it’s open, select “Open blur model…” from the “Blur model” sub-menu in the “File” menu. Navigate to where you’ve saved the desaturated, level-corrected, cleaned-up blur model, and open it.

After a short processing period, Blurity produces the deblurred image. If you open the full-resolution version of the image, you can see that the glint has been resolved back into a point source of light. Seeing that indicates that the blur model was successfully developed and applied. Since the blur in the glint accurately represented the blur on the front of the vitamin bottle, the text there is also sharp.

The deblurred image (Click image for full-resolution version)

Obviously, it’s easier to use Blurity in one of its normal automated modes. However, if you need the extra power of extracting and defining your own blur models, that functionality is nice to have.

As a variation on this technique, you can use the “Save blur model…” menu option to save any blur models that Blurity has calculated. If you wish, you can then reapply those blur models to other images. That can be useful for batch processing, where many images are blurred the same way. A series of video stills might be an example of that.
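
Here’s a rough sketch of that batch idea in Python. Blurity’s own solver isn’t exposed as a library, so scikit-image’s Richardson-Lucy deconvolution stands in for it here, and the file names are hypothetical.

```python
# Apply one saved blur model (used as a PSF) to a folder of identically
# blurred frames. Richardson-Lucy is a stand-in for Blurity's deconvolution.
import glob
import numpy as np
from skimage import io, img_as_float
from skimage.restoration import richardson_lucy

psf = img_as_float(io.imread("blur_model.png", as_gray=True))
psf /= psf.sum()                                   # a PSF should sum to 1

for path in glob.glob("video_stills/*.png"):
    frame = img_as_float(io.imread(path, as_gray=True))
    deblurred = richardson_lucy(frame, psf)        # non-blind deconvolution
    out = (np.clip(deblurred, 0, 1) * 255).astype(np.uint8)
    io.imsave(path.replace(".png", "_deblurred.png"), out)
```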

Or, if you can see from the “Computed blur model” display that Blurity got close to a good blur model but isn’t quite there, you can save the model that Blurity calculated, touch it up on your own, and then open your revised model back in Blurity for use during the final deblurring.

Happy deblurring!


How we learned that Stripe alone was costing us customers

Like most of the tech world, we love Stripe for processing payments.  The API? Great. The integration? Painless. Unfortunately, we discovered that going with an all-Stripe solution was costing Blurity customers.

People just wanted to fix their blurry photos — and pay us money! — but we were turning them away.  Unknowingly.

Blurity loves Stripe

We first got the inkling that something was amiss when we looked at the map of Blurity customers.  Although there were many international customers — mostly in the United Kingdom, Canada, and Australia — there were some big names missing.  Germany, in particular.

We could see that many Germans were finding the Blurity web site and trying the Blurity demo, but nobody from that large, prosperous, technologically advanced country was buying.  We chalked it up to a language difference, and wrote off as a fluke the fact that we had many customers from Spain.

The problem escalated when German users started emailing to ask about possibly paying with PayPal. Were they simply reluctant to use their credit cards?  Did they have an aversion to buying anything on the internet except via PayPal, perhaps reinforced by too much time on eBay?

No.  The real reason: they didn’t have credit cards.

In the United States, seemingly everybody has at least one credit card or Visa/MasterCard-branded debit card, but that is not the case throughout the rest of the world.  It turns out that credit card penetration is remarkably low in some surprisingly large countries.  For example, according to payments company Adyen, only 26% of Germans have credit cards.

Users from those countries were practically begging to buy Blurity, and because Blurity supported only Stripe, they were getting the door slammed in their faces.  That’s not good business.

After getting a dozen such emails, we decided that something needed to be done, and that something was PayPal.  Though it’s clunky, expensive (1% higher for international purchases with PayPal versus Stripe), and sometimes irritating, PayPal does have one colossal advantage: you can use it to take just about any form of payment, from just about anywhere.  Also, its name recognition is hard to beat.

So, begrudgingly, we opened a PayPal account and spent 10 minutes adding a “buy with PayPal” button on a child page of the main Blurity purchase page.  It went live on September 20th, and the results were amazing.  Since we added the option to pay with PayPal, 26% of the purchases have been made using PayPal.

Blurity has now been purchased in at least 20 countries, in part thanks to PayPal

It’s hard to say how many of those users would have purchased anyway using Stripe, but based on their email addresses, IP addresses, and the prevalence of credit cards in their associated countries,  my wild guess is that somewhere around half would have been lost sales without PayPal.

Though it seems to be popular in tech circles to hate PayPal, it has its place.  For us, that place is alongside Stripe.

Blurity loves the set of Stripe and PayPal


Blurity got Mac

About a month ago, a new option quietly snuck onto the download page: Blurity is now available on the Mac!

Blurity, running on OS X, in the middle of the tour

That’s right: now blur can be a thing of the past for Mac owners, too.  Hooray!

The Mac version of Blurity has all of the capabilities of the Windows version.  It has the same great fixing of blurry photos, the same interface, and the same passionate technical support.

Actually, it’s slightly better in one way: the Mac version can handle larger images, at least up to 33 megapixels, compared to the roughly 24 megapixel limit in the Windows version. (This is due to memory allocation differences between the platforms.)

The only major open issue specific to the Mac is that the Blurity application is not digitally signed.  If you’re running OS X 10.8 (Mountain Lion) or newer, and you haven’t disabled Gatekeeper, you’ll see a scary-looking warning about an “unidentified developer” the first time you run Blurity after you complete the installation.  To get around the Gatekeeper warning, simply hold down the ctrl key, click on Blurity in your Applications folder, select “Open” from the pop-up menu that appears, and then choose “Open” again on the dialog box that pops up.

It took about two weeks of off-and-on work to get the Mac version out the door. Wondering how we did the Mac port so quickly?

Blurity was built from the beginning with cross-platform compatibility in mind.  The underlying image processing code is written in portable C++, allowing it to build with both Visual Studio and the Intel C++ compiler.  Higher up the stack, the GUI is written in Python using the wxPython toolkit, both of which are also cross-platform.  Equally critical was the availability of Intel’s Math Kernel Library, which was cross-platform, fast at FFTs, and far less expensive than FFTW to license.

Yes, there were a thousand little things that made the port tedious (e.g., how unnamed semaphores aren’t supported on OS X, how text colors don’t propagate correctly on OS X in wx, how command-line tools are invoked), but part of that was our relative lack of experience with writing desktop apps on OS X.
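
To give a flavor of the command-line-tool point, here’s the kind of platform check involved, written as an illustrative Python sketch rather than Blurity’s actual code; the helper name is made up.

```python
# Illustrative only: the native deconvolution helper is launched slightly
# differently on Windows and OS X.
import subprocess
import sys

def launch_solver(args):
    exe = "blurity_core.exe" if sys.platform == "win32" else "./blurity_core"
    return subprocess.Popen([exe] + list(args))
```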

Overall, it was accomplished with relatively few #ifdefs, and the upshot is that improvements to one platform are really improvements to both platforms.

 

 


Blurity is now faster

You like new features, right?  I’m betting you like improved stability as well, eh?  So do I.  That’s why I’m happy to announce that Blurity v1.1.87 is the fastest, most-stable, least-memory-hungry version of Blurity yet!

In particular, you’ll now find an option in the File menu for setting preferences.  Within the Preferences dialog box, you’ll see an option to enable accelerated processing.

The new Blurity preferences dialog box

If you check that little box, you’ll find that Blurity is up to 30% faster at processing.  It’s particularly noticeable when dealing with large blur and sample box sizes.

The trade-off is that the blur modeling can be (but won’t necessarily be) less accurate when acceleration is enabled.  Also, certain parameters need to be tweaked if you’re using accelerated processing in Advanced mode, most often the level of solver filtering (usually higher) and the number of solver iterations (also usually higher).

Give it a try; it’s off by default. If you like it, great! If you don’t, switching it off again is as simple as unchecking the box.

 


How I converted a software thief into a customer

**UPDATE 21 July 2012**

Due to the fantastically high traffic this blog post has seen, the Easter egg mentioned within is no longer much of a surprise.  Also, honest customers have written in wondering why only attempted thieves should get discounts.  Good points.  

Thus, the Easter egg coupon code for the almost-pirates has been removed, and I’ve decided to give everybody who reads this blog a coupon code just for being a good person: NOBLUR (good for 10% off a Blurity purchase between now and September 1, 2012)

Software piracy doesn’t really represent lost sales, claims by the software and entertainment industries notwithstanding.  That’s why I was absolutely giddy when I converted an attempted pirate to a customer today.

Not long after I launched Blurity, I started to notice a strange pattern.  A handful of people were trying the same two invalid registration numbers.  The numbers had the right form for valid keys, but they were clearly invalid.

I couldn’t figure out where those numbers were coming from.  The people trying the invalid numbers seemed to have nothing in common, and my Google-fu failed me.  What could those two numbers possibly have to do with removing blur from blurry photos?

So, as I often do in these situations, I asked my friend Tyler for advice.  Not only did he figure out the origin of the numbers almost immediately, he proposed an idea: show something to whoever tries one of those numbers.  Make them squirm.  Make them know I’m on to them.

I thought that was a great idea!  I added a few lines of code, and whenever either of those numbers was entered, the user would see this dialog:

Redacted for obvious reasons

That’s right: Not only would I call them out on their attempted piracy, I’d reward it.  I figured it would be better to get some money from them, if possible, rather than no money at all.
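
For the curious, the check itself really was only a few lines. Something in this spirit, though the key values, dialog text, and helper names here are all made up (the real ones stay redacted):

```python
# Sketch of the easter egg: the two well-known invalid keys trigger a special
# dialog instead of the normal "invalid key" message.
PIRATED_KEYS = {"XXXX-XXXX-XXXX-0001", "XXXX-XXXX-XXXX-0002"}   # placeholders

def handle_registration_attempt(key):
    if key in PIRATED_KEYS:
        show_dialog("That key is floating around crack sites. "
                    "Here's a discount code if you'd like to buy Blurity properly.")
        return False
    return validate_key(key)   # the normal registration path (hypothetical)
```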

Nobody tried the numbers again for a while, and I gradually forgot about the dialog box.

Until today.  And it worked better than I could have hoped.

An automated alert popped up in my inbox notifying me that somebody was trying an invalid registration key.  I noticed the number, figured that they’d be seeing the “easter egg” dialog box, smiled, and went back to what I had been doing.

Then, a couple minutes later, another new message popped up in my inbox: a purchase!

*Ding!* You've got money! (Note that Blurity was on sale at the time)

I checked the server logs, and indeed, it was the same person who had tried the special invalid registration keys.

Not only had the person made a purchase, they did so at the full price!

Had the pirate been shamed into becoming a customer?  Had they been too embarrassed to use the “I’m a thief” coupon code?  I’m not sure, but I’m happy I got paid.


How not using Internet Explorer put me out of touch and cost me dearly

It’s never good to scare away your customers.  It’s even worse if you don’t realize you’re doing it.  That was me.

Like most folks in the developer community, it’s been years since I last used Internet Explorer as my daily browser.  Oh sure, we all keep copies around for web development work, but Firefox, Chrome, and Safari now rule the web roost.  Unfortunately, that was not the case with the Blurity userbase.

I had long known about a steep drop-off between the number of people successfully downloading the Blurity installer and the number actually completing the installation.  Roughly 50% of downloaders were abandoning the product after downloading it but before trying it.  Was the installer crashing? Were they forgetting about it?  I wasn’t sure.  My peers told me that a 50% drop-off wasn’t out of the question.  Months rolled by.

Then two days ago, I was testing some changes to the web site in Internet Explorer and decided to try downloading the installer.  A big, scary, red warning box popped up:

That's kind of scary!

“Oh no!” I thought.  The scary warning and lack of clear options to proceed were alarming.  Also, I was a bit miffed by the assertion that the Blurity installer was unpopular.  So I clicked “Actions” only to have another scary warning box show its face:

Harm my computer? I don't want that!

Dang. People would just be trying to remove the blur from their blurry photos, and they’d get smacked in the face with messages that my software would destroy their computers.

Turned out I’d been snared by IE’s SmartScreen filter.

I wondered how many of my users were coming in with Internet Explorer. It turned out that 43% of the visitors to Blurity.com were using IE.  But it was worse than that, since Blurity runs only on Windows, and 86% of visitors were using Windows, so almost 50% of the eligible visitors were using IE. Shoot.

Distribution of visitors by browser

I then wondered how conversion rates were suffering.

About 80% of non-IE users who downloaded the Blurity installer successfully completed the installation.  However, only 20% of IE users who did the download actually got through the install.  That’s an expensive bug.

Here’s the worst part. About half of my sales were coming from IE users.  That’s right: those 20% of IE users, representing 10% of my total traffic, were accounting for 50% of my sales.  That’s a REALLY expensive bug.  I was forgoing many, many hypothetical dollars.

So how do you fix that problem?  One way is to wait for the installer to “age” a sufficient length of time, but the specifics are murky, and the problem comes back when a new installer is released.

The better solution is to get a code-signing certificate and sign the installer.  StartSSL had what appeared to be the best prices, so I parted with some money and got a certificate.

No SmartScreen warnings now, thanks to the certificate

Success!  Internet Explorer no longer complains when a user tries to download the Blurity installer!  No scary warnings. About 85% of IE users are now going from download to installation.

Will that translate into increased sales?  I’m not certain, but if I get just two additional sales from the certificate over the next two years, it will have been money well-spent.

Going forward, you can bet that I’ll be more in tune with my users’ habits.

 


Enlargement blur

Let’s say you have a small image and you want to make it big.  Perhaps you lost the larger version, or perhaps there never was a larger version. If you go to a normal photo editing program, like Photoshop, and try to enlarge the image, you’ll probably be somewhat disappointed.

The editing program is likely to use either bilinear or bicubic interpolation to create the missing pixels in the enlarged image.  The problem is that these methods look rather naively at just the neighboring pixels when creating each new pixel.  Obviously, they can’t magically “enhance” their way into adding new information to the image, but neither are they clever about using the data that is available.  Thus, the enlarged image ends up rather soft and blurry.
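
You can see the effect for yourself in a couple of lines of Python; SciPy’s cubic-spline zoom stands in here for Photoshop’s bicubic resample, and the file names are placeholders.

```python
# Upscale a thumbnail 4x with cubic interpolation. No new detail is added,
# so the result comes out soft. Assumes a color (H x W x 3) image.
import numpy as np
from scipy import ndimage
from skimage import io, img_as_float

small = img_as_float(io.imread("thumbnail.png"))
big = ndimage.zoom(small, (4, 4, 1), order=3)      # cubic spline, per channel
io.imsave("enlarged_but_soft.png", (np.clip(big, 0, 1) * 255).astype(np.uint8))
```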

So what can be done?  Unfortunately, Blurity isn’t well-suited for enlarging photos.  The problem of enlarging photos, called “superresolution”, is related to the problem of removing blur from photos, called “blind deconvolution”, but not in a way that makes Blurity useful for enlargement.  Instead, we need a different type of specialized tool.

One of the most interesting and successful approaches takes advantage of the fractal nature of a natural image.  In particular, visual elements are repeated within the image at different scales.  This is important for two reasons.


This is an extreme example, but almost all images have some degree of repetition. Notice how the yellow circles enclose what appear to be multiple copies of the same chimney. This repetition can be used to create a higher-resolution version of the chimney.

First, the visual elements are likely to be repeated within the image at sub-pixel misalignments.  In essence, the higher-resolution version of that repeated visual element has been sampled multiple times, which provides enough data to reconstruct a model of the higher-resolution visual element without directly observing it.  The extra sampling also enables the removal of noise and compression artifacts.

Inferring the latent high-resolution version of the patch from a variety of slightly differently sampled lower-resolution versions of the patch is the essence of classical superresolution.

Second, the visual elements at different scales can be used directly in the resized image.  For example, a circle might appear in the small version of the image at multiple scales.  A larger-scale copy of that circle could then be used in the enlarged image as a replacement for a smaller-scale one.

At the moment, most of this technology is limited to the lab.  A paper by Glasner et al. called “Super Resolution From a Single Image” includes some great examples of state-of-the-art image enlargement.
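
If you’d like an intuition for the classical multi-frame case described above, here’s a toy shift-and-add sketch. It is purely illustrative: real superresolution methods, including single-image approaches like Glasner’s, also have to estimate the shifts, model the blur, and regularize the result.

```python
# Toy shift-and-add superresolution: several low-res frames of the same scene,
# each offset by a known sub-pixel shift, are accumulated onto a finer grid.
import numpy as np

def shift_and_add(frames, shifts, factor):
    """frames: list of HxW arrays; shifts: (dy, dx) offsets in low-res pixels;
    factor: integer upscaling factor."""
    h, w = frames[0].shape
    acc = np.zeros((h * factor, w * factor))
    hits = np.zeros_like(acc)
    for frame, (dy, dx) in zip(frames, shifts):
        # Map each low-res sample to its sub-pixel position on the fine grid.
        yi = np.clip(np.round((np.arange(h)[:, None] + dy) * factor).astype(int),
                     0, h * factor - 1)
        xi = np.clip(np.round((np.arange(w)[None, :] + dx) * factor).astype(int),
                     0, w * factor - 1)
        acc[yi, xi] += frame
        hits[yi, xi] += 1
    return acc / np.maximum(hits, 1)   # average; unobserved cells stay zero
```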

If you’re not in the lab, never fear: there are several products that attempt to do the same thing, though bleeding-edge technology like this tends to be finicky, and the algorithms don’t always produce good results.

Happy enlarging, and don’t forget Blurity if you just need to remove some blur.

