Tuesday, May 29, 2012

Ubuntu 12.04: The Good. The Bad. The (Not So?) Ugly.

Hello friends!

I recently upgraded my home PC from Ubuntu 10.04 to Ubuntu 12.04. I decided to make a hardware change as well: specifically, upgrading my primary hard drive from a 320GB SATA to a 2TB SATA.

Now, I didn't actually "upgrade" 10.04 to Ubuntu 12.04 in place; I installed a fresh copy of Ubuntu 12.04 on the 2TB drive and manually migrated my programs, settings, and files over to the new system. I personally like to start with fresh installs when possible, and given all the custom modifications I had made to my system, I feared an upgrade would cause more problems than it was worth (if you've had a different experience, let us know).

I didn't take the choice of upgrading to Ubuntu 12.04 lightly. I knew I would have to move off 10.04 sooner or later, with only a year of 10.04 support left, and I didn't want to wait until the last minute. Having heard a lot of complaints about versions 10.10, 11.04, and 11.10 (the Unity UI, for example), I seriously considered switching to another distribution such as Linux Mint.

In the end, I decided to give Ubuntu 12.04 a try. From looking at the screenshots and reading the design philosophy, I decided Unity deserved a chance. Even if I didn't like it, I knew I could always install another desktop on top, like classic GNOME, Cinnamon, or GNOME 3, without having to switch distros.

So far, I'm happy with what I see, and think I'll be sticking around with Ubuntu, at least for now. The future is always open though!

The Good


Right off the bat, a lot of 12.04, including the GUI, impressed me. The install and format process was very fast, even on the 2TB drive. I haven't tried setting up a dual-boot system (Linux only), but that has been extremely easy since at least 9.10, so assuming it's been kept up to date with newer versions of other OSes, I don't anticipate much of an issue.

Nvidia Binary Drivers


The Nvidia binary drivers were detected and installed out of the box, no configuration needed. This gave me access to 3D acceleration and advanced multi-display configuration, a must-have for an HTPC. I did use the Hardware Drivers option to upgrade to the "bleeding edge" drivers, though whether this was necessary or not, I'm not sure.

HDMI Audio Support


Remember this post? Thanks to an updated ALSA, HDMI audio worked out of the box too. They also fixed the under-enumeration problem present in 10.04. I immediately had a (single) HDMI audio option in the Sound Devices that I could select and switch back and forth seamlessly, no fuss. Success!

New MythTV & HVR1600 support


Since one of the main purposes of my machine is to serve as an HTPC and DVR, I installed MythTV straight from the repositories and had no issues setting up my card. Just don't forget to add your cx18.conf file to /etc/modprobe.d :)
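For anyone new to this step, the cx18 driver picks up its module options from a file in /etc/modprobe.d. The sketch below is just an illustration of the file's format; the option values shown are placeholders, so use the settings from your own working cx18.conf rather than these:

```
# /etc/modprobe.d/cx18.conf
# Example only: the values below are placeholders, not known-good
# settings. Consult the cx18 module documentation for your tuner.
options cx18 cardtype=1
```

After copying the file into place, reload the module (sudo modprobe -r cx18 && sudo modprobe cx18) or reboot for the options to take effect.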

Webcam Support

In 10.04 I was never quite able to get my webcam and my HVR1600 working together. In 12.04, they both seem to work happily, though I haven't tried unplugging the webcam yet (that may still cause some issues to be aware of).

Improved Third Party Software

Firefox and Thunderbird are installed by default. Maybe you don't like them, but I do. Skype works fairly well and is easily installable. The Ubuntu Software Center is really nice, so far I haven't had any need to use another package manager (though I may need to at some point).

The Bad


USB3.0


I never got USB 3.0 working reliably on 10.04, and 12.04 doesn't seem to have improved much. Although the Linux kernel officially supports USB 3.0, I'm guessing there is something amiss with the drivers for my specific chipset (the 3.0 ports are built into the motherboard). I am certain the ports, cords, and external hard drive are all USB 3.0, but I've never gotten more than a sustained 60 MB/sec (480Mb/sec, i.e. USB 2.0) transfer speed when connected to the USB 3.0 ports. Worse, I get sporadic "unmounting" of the USB drive when connected over 3.0, and I cannot remount it without restarting the computer. I realize this is likely the fault of a driver or a missing kernel module (and not Ubuntu itself), but it's still irritating. Hopefully I can figure out the bottleneck and start getting reliable USB 3.0 support.
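If you're debugging a similar setup, a quick sanity check is to confirm the xhci (USB 3.0) driver is loaded and then measure raw sequential write speed with dd. This is only a sketch: the target path defaults to the current directory for safety, so point it at your drive's actual mount point to test the device itself.

```shell
# Rough USB transfer-speed check. TARGET should be the mount point of
# the drive under test; it defaults to the current directory here
# purely so the sketch is safe to run anywhere.
TARGET="${1:-.}"

# Is the USB 3.0 (xhci) driver loaded at all?
if lsmod 2>/dev/null | grep -q xhci; then
    echo "xhci driver loaded"
else
    echo "xhci driver NOT loaded (or lsmod unavailable)"
fi

# Sequential write test: 64 MB of zeros, flushed to disk before dd
# exits (conv=fdatasync), so the reported rate reflects the device
# rather than the page cache.
dd if=/dev/zero of="$TARGET/speedtest.bin" bs=1M count=64 conv=fdatasync
rm -f "$TARGET/speedtest.bin"
```

A sustained rate near 60 MB/sec on a drive plugged into a 3.0 port suggests it is negotiating at USB 2.0 speeds.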

Flash

So basically, YouTube videos have a bluish tint. Yep, you heard me right. Apparently, this is a bug in Flash, but since Adobe has officially discontinued Flash for Linux, with 11.2 being the last version, it is unlikely the bug will be fixed. Now, there are workarounds, the easiest being to disable hardware acceleration, but come on, something like this shouldn't be necessary, as now I have to drive my CPU hotter.

Another bizarre issue with Flash if you are using dual desktops: full screen sometimes only appears on your *primary* display, even if the originating window was on the second. Now, I didn't have this problem with YouTube (full screen was on the correct display), but I did have it with sites that use their own custom Flash-based players, such as video from the sites of TV networks. I found a workaround for this using gDevilspie (see this thread). But this only addressed the *position* of the window, *not* the size, so unless the two displays were running at the same resolution, the full screen was cropped and I could only see the top-left portion. My only workaround for this (for now) is to run both displays at the same resolution.

Hopefully, with Flash for Linux discontinued, Canonical (or someone out there) will do the right thing and take over the reins of supporting Flash going into the future. Even if you wish Flash would just die, pretending it's already dead is not the way to a good user experience.

LIRC

LIRC is a great program, unless your card just happens to not be supported. This isn't LIRC's fault; it tries its best, but for legal reasons there are certain remotes it just can't support. Getting LIRC to work with my HVR1600 card was a *huge* pain on 10.04, requiring me to build a custom kernel module. I did eventually get it working, but if memory serves, it took several months.

Unfortunately, my card *still* isn't supported in the newest version of LIRC shipped with 12.04. Likely, it never will be. On 10.04, I was able to build the kernel module thanks to the lirc-modules-source package and a very nice Fedora user, but it wasn't fun. Unfortunately, it appears lirc-modules-source is not available for 12.04! So until it is, I can't update the patch for 12.04, build my kernel module, or use my IR blaster/receiver.

UPDATE! I *have* gotten LIRC for my HVR1600 card working on 12.04, without too much difficulty. I happened across a pre-compiled version of the necessary module, lirc_zilog, here inside of a DEB file (32 and 64-bit versions available). I'll post a more comprehensive tutorial as soon as I have a spare moment :).

The (Not So?) Ugly

Unity

Hey, I like it. I've been using it for a few weeks now and I find it pretty fluid and easy to use. The HUD feature is very cool. Now, now, I know what you are thinking...that damn Unity bar is glued to the left side!!! While I agree that, in principle, this was a poor choice on the part of the Unity developers, you have to remember that Unity is just one desktop environment. You aren't forced to use it, and there are even forks out there (or you could fire one up yourself) to get the bar on the bottom if it's really important to you. Or use an alternative like Cairo dock, or another desktop environment. So it's not really a reason to abandon the distribution altogether, at least IMHO. And if you give it a try, you might just like it too. While I do hope they add a supported option to move the bar in the future, for now I can deal.

X-Session Weirdness

In Ubuntu 10.04, I ran GNOME on two separate X sessions, one on the PC display and one on the TV display. I liked this better than running a single desktop across both, since applications seemed to handle it better (e.g. full-screen Flash), even though it meant you couldn't move windows between displays (no biggie to me).

When I enabled dual X sessions (with the Nvidia configuration tool) in 12.04, I was very confused. On one display was the normal Ubuntu desktop, but the other was just a pure white background with an "X" for a cursor. Obviously 12.04/Unity wasn't designed to handle dual X sessions this way.

So, I flipped over to Twinview to see if I could configure that the way I wanted. To my vast surprise, I found two separate desktops (complete with Launcher and top menu bar), just as if I were using dual X sessions! But now I had the ability to move windows between desktops. Pretty cool, considering I had been expecting something similar to the way Windows handles extended displays: just a blank window space on the other monitor with no separate taskbar.

So it seems like they tried to take the best of dual X-sessions and the best of Twinview (extended desktop) and munge them together (as a new form of Twinview). And honestly, it works pretty well, except for two problems I've found so far:

1) The "Flash" full screen issue I mentioned above
2) Applications don't quite "understand" it.

What I mean by #2 is that I'm used to being able to launch an X program on either GUI from the command line simply by setting the "DISPLAY" environment variable. Well, in 12.04, no such luck: the DISPLAY environment variable has the same value (0) regardless of which display you are on. So launching applications on a particular display is a little trickier than it used to be, which is not a good thing. For the most part, applications do work correctly if you launch them on the correct display, but I haven't yet figured out how to launch an application on a specific display from the command line.

Unfortunately, MythTV was one of the applications that didn't quite behave; it was showing up on the primary display regardless of which display I launched it from. No good: I need Myth on my secondary (TV) display. Thankfully, I found an option in the MythTV appearance settings which allowed me to choose between display "0" and "1". Setting it to 1 put it on the TV display... just as if I had set the environment variable. So there is some support similar to the old methods, but it appears to be application-specific.

But sometimes (like when I'm running mythbackend), I *do* want it on the primary display. I used to have separate scripts I could run depending on which display I wanted it to appear on...but even when I try the "--display" option on the mythfrontend command line, it goes to whatever display I set in the options. So I have a workaround for now (setting the display option in the settings), but it's irritating; having a way to choose the display on the command line would be much preferred.
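For reference, the old scripts were essentially wrappers like the sketch below. The ":0.0"/":0.1" names assume the classic numbering of two separate X screens, which is exactly what no longer applies under the new Twinview-style arrangement; the sketch echoes the command it would run rather than executing it, so it is safe to try anywhere.

```shell
# Sketch of the old "launch on a given display" trick.
# ':0.0' and ':0.1' assume two separate X screens; under 12.04's
# Twinview-style setup both monitors share a single ':0', which is
# why this approach broke.
launch_on() {
    case "$1" in
        pc) screen=":0.0" ;;
        tv) screen=":0.1" ;;
        *)  echo "usage: launch_on pc|tv command..." >&2; return 1 ;;
    esac
    shift
    # Echo instead of exec'ing, so the sketch is safe to run anywhere.
    # A real script would run: DISPLAY=$screen "$@" &
    echo "DISPLAY=$screen $*"
}

launch_on tv mythfrontend
launch_on pc mythbackend
```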

It's only been a few weeks, but so far I like what I see in 12.04 and find it a worthy successor to 10.04 (I usually skip non-LTS versions...due to the work involved with upgrading). So I'll be sticking with Ubuntu for now, and hope others out there enjoy it as well.

Have a great day!

Saturday, May 12, 2012

The Art of Computer Programming


Greetings all!

Have you ever heard that computer programming (and by extension, Software Engineering) is an act/profession that doesn't require any imagination or creativity?

I certainly have, but such comments tend to be made by people with no knowledge or experience in the field, so their ignorance can be (at least partially) forgiven.

Still, it prompted me to write an article about why this is untrue: software development does indeed require imagination and creativity, in fact, quite a lot of it!

It's not too difficult to see how such an opinion might evolve among the general populace. If you look at the "stereotypical" view of a software developer in society, you'll find a person (often male) wearing a white collared shirt, sitting in a dull, grey-looking cubicle, typing on a dull black computer with meaningless lines of white (or green, if you are really lucky) text on the screen. How could someone in such a boring-looking world possibly have any sense of creativity?

If you examine software developers in real life, you actually find a much different picture. But I digress; this article isn't really about the people, it's about the work itself. There is another common misconception about programming: that it is completely formulaic, and that programmers just go through the motions dictated by a book or a superior and somehow miraculously end up with a perfectly working program. Nothing could be further from the truth.

Imagination can be defined in several ways, but to me, it typically refers to the ability to create images or sensory input in one's mind. For example, I can picture what a peanut butter sandwich looks like without my eyes actually seeing one. I can recall what it smells like, what it tastes like, and what it feels like to hold. I can even hear what I sound like while chewing it.

But this is only the memory aspect of imagination. There is a whole other layer of imagination on top of it: namely, the ability to create things that you have never actually experienced.

For example, I have never been to the Eiffel tower, but I can imagine standing on top of it. I have never been in space, but I can imagine floating in midair.

Our imagination can dream up things we've never thought of before, like the plot of a novel or a "Eureka" moment in a discovery. These ideas use building blocks to create something completely new. This also leads into creativity, typically the creation of something new, where "new" depends on the domain of what's being created.

Returning to the theme of developing software, there are many aspects of it which require imagination and creativity, for example:

1) Imagining the finished project from a high-level sketch. The idea for a piece of software typically comes from one of two places: a) your brain; b) someone else's brain. If you come up with an idea for a piece of software (perhaps because it's something you need, but doesn't exist yet), then you clearly exercised creativity. But even if you are developing the idea for someone else, you still need to exercise your imagination and creativity: in both cases, you must envision the new software even though you've never actually seen it.

Descriptions of software to be developed tend to start out extremely abstract...just a sentence or paragraph on what the software needs to do, without any idea of how it is going to be done. It is up to the developer to figure out how to get there by imagining the finished project. The developer might even come up with new things the software should or needs to do that the client never even thought of. Software gets very complicated very fast. For example, there can be many programs (not just one) involved, and communication between them, as well as documentation and many rounds of testing. Sometimes, it even takes creative methods to get the client to tell you what they need in their software, since often, they aren't sure themselves!

2) Building a cohesive whole from primitive building blocks. Computers have evolved over the years from lights blinking on a box to complex machines capable of displaying millions of pixels or performing trillions of operations per second. It is extremely rare that any piece of software developed today is built entirely "from scratch". If software always needed to be developed completely from scratch, we'd still be using boxes with blinking lights and flipping switches to represent binary input. Over the years, software developers have added layers of "encapsulation" on top of the basic circuitry inside a computer: primitive building blocks that allow us to make the computer do something useful, for example, moving a piece of data from one memory block to another, or turning a specific pixel a specific color.

When you go to develop a piece of software, you need to carefully understand the requirements, and then figure out how to combine the primitive operations available to form a cohesive whole. And sometimes you don't have the building blocks you need, so you have to find them or create them, using even more primitive operations.

Software is all about layers and building from smaller blocks. It's like when you were a child playing with your first batch of Legos. In the box, they are just thousands of individual blocks, useless by themselves. But with imagination and creativity, you can combine those blocks in an astronomical number of ways to form something completely different. Building software is much the same, and without imagination, you wouldn't be able to combine the pieces into something useful, nor keep track of the building blocks the software requires, which can easily number into the millions.

3) Visualizing what the ones and zeros actually mean. It can be hard for the novice user to imagine, but even today, at the heart of computing is nothing more than the manipulation of binary data, that is, ones and zeros (binary digits, or bits). And yet, when you look at your screen, you see text, buttons, pictures, even videos, and it's ALL just ones and zeros. When a programmer is creating a piece of software, they continually have to imagine what that data, the ones and zeros, actually represents. Now, there are building blocks to help: some of the most primitive operations the computer provides are used to interpret one piece of binary data as another, for example, turning a group of bits into the letter 'A', or the number 65, or the color of a pixel. But you can't lose track of what it is underneath, and to visualize ones and zeros forming a picture of a beautiful sunset certainly requires imagination.
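To make that concrete, here's a tiny shell demonstration of the exact example above: one byte holding the bit pattern 01000001, read two different ways.

```shell
# One byte, two interpretations of the same bits (01000001):
printf '%d\n' "'A"    # read the character 'A' as a number: prints 65
printf '\x41\n'       # read the number 0x41 (65) as a character: prints A
```

Nothing about the byte itself changed; only our interpretation of it did, and the programmer has to hold both views in mind at once.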

4) Solving problems in creative ways. Solving a problem with software usually requires solving a bunch of smaller problems, then combining the solutions of the smaller problems to solve the big one. Each smaller problem is usually solved with a concise sequence of logical statements, often mathematical in origin. The same task can usually be done in many different ways, using different sequences of logical statements.


Navigating a logical sequence of statements is rarely trivial, whether you are writing it or reading it. At each stage, you have to clearly imagine what each statement means (sometimes in terms of the ones and zeros in the computer, sometimes visually, like boxes and pipes) and keep it in your mind's eye across the hundreds or thousands of other logical statements you are writing, so the whole makes sense.

The ability to break the large problem up into smaller problems takes a lot of creativity, and solutions to the smaller problems themselves are often far from obvious.

5) Development of user interfaces. Programmers rarely write programs which do just a single thing all the time. Most software is capable of performing many different tasks, and requires input from a user, which can be entered in an enormous number of ways. So not only do programmers have to write the software to perform a function, they need to envision a way to let the user tell the software what to do.

A lot of software today provides a Graphical User Interface (GUI, pronounced 'gooey'). Creating a good GUI is a subject of much discussion, and can be considered an art form all on its own. The operating system you are using now (likely, though perhaps not) includes a GUI, as does the web browser you are reading this page on. It took a lot of time and creativity on the part of the developers to figure out how to make the GUI work well and be usable. And even then it's difficult, because a developer will tend to write a GUI that a developer wants to use...but often that isn't sufficient for the non-developer. As such, developers have to imagine what it's like to be an average user, and create the GUI for them, not themselves. Stepping outside of yourself like that is not easy, and requires a lot of imagination. Even with software that provides a non-GUI interface, like a text-only interface (yes, they still exist), the developer needs to think about the user and figure out a good way to describe all of the functions and operations of the software through text alone.

6) Breaking software. You might be surprised to learn that an important job of a software developer is actually trying to figure out how to break the software, let alone figuring out how to write it. All professional developers want their software to be completely bug free, but in reality, that goal isn't achievable. Still, as perfectionists, we strive to make our software the absolute best that it can be, and one possible metric by which to measure that is bugs. Bugs (and sometimes new "features") typically happen because the user does something with the software that the developer didn't anticipate, and so the software does something unexpected.

In fact, software development companies have entire departments dedicated to testing and trying to break software; sometimes these departments are even bigger than the development department. And the testers are effectively programmers themselves, just doing a different kind of programming. Part of their job is to write their own programs specifically for the purpose of testing the product program. If the software is supposed to do things "A", "B", and "C", then they write a program to exercise A, B, and C in different ways, then check that the software performs as expected.
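As a toy illustration of the idea (with the standard tr utility standing in for the "product" program, since any real product would be far larger), a test program just feeds the software known inputs and checks that the outputs match expectations:

```shell
# A miniature acceptance test. The "product" here is the standard
# tr utility; a real test suite would exercise your own software
# the same way: known input in, expected output checked.
check() {
    description=$1; expected=$2; actual=$3
    if [ "$actual" = "$expected" ]; then
        echo "PASS: $description"
    else
        echo "FAIL: $description (got '$actual', wanted '$expected')"
    fi
}

check "uppercases letters"  "HELLO" "$(printf 'hello' | tr 'a-z' 'A-Z')"
check "leaves digits alone" "123"   "$(printf '123' | tr 'a-z' 'A-Z')"
```

The creative part, as described next, is dreaming up the inputs the original developer never considered.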

But it's also their job to imagine ways to use the software in unexpected ways and see how it reacts. Such a task requires you to create inputs entirely from your imagination, and sometimes, to break the software, you need to get pretty creative, depending on how well the original developer considered the possible inputs. For the programmer, tracking down bugs and figuring out how to fix them while maintaining the integrity of the software is a strong exercise in creativity and imagination.

7) Documentation and help files. Software is useless unless you have a good way to tell the user how to use it. Developers also need to write extensive documentation on the software itself, so they can use it as reference material when updating and changing the software.

A GUI can only hold so much information, so most software has a Help section. The Help section describes all of the software in greater detail, so that the user can understand it at a deeper level and decide how to perform tasks. Writing this documentation requires you to imagine how the software works and what it looks like, especially since the person writing it might not have developed it. It's also important to organize the Help so that it is accessible to the user.

The documentation of the software can run into the thousands of pages and also needs to be well organized. Having creative ways to store and reference this documentation so it can be recalled by the developers when needed is critical. Trust me, you don't want the developer of a nuclear power plant's software writing code without a clear picture in his or her head of what it's going to do!

8) Considering the pace at which technology changes, and anticipating future needs. Technology moves very, very rapidly. As such, professional software is almost never just a "one-off" piece of work which is never touched again. Software evolves over time, and when programming, you can't just consider what the software needs to do today; you need to anticipate what it's going to need to do tomorrow, the next day, and the day after that. Some software systems survive for decades after their original development. If the original developers hadn't had any imagination or creativity, they wouldn't have been able to build the software to evolve, and we would be constantly replacing it. But, in reality, developers build software to evolve, to live, and to grow.

Even this list is hardly exhaustive; a professional programmer could probably add a few more ways on top of the ones listed in which programming requires creativity. Just because we might (emphasis on the might) not be able to paint a masterpiece or decorate a house hardly means that we lack imagination and creativity; it just means that we express it in different ways.

I hope that this blog post has been enlightening to you, and I encourage you to consider all the imagination, creativity, and hard work that goes into the software you use every day. Today, the world runs on software, and that's unlikely to change any time soon. Considering all of this, why one would think programming doesn't require imagination and creativity, especially when it's something they've never experienced, is beyond me. Or at least, beyond this particular blog post :)

Take care, and have a great day!