March 2012 Archives

Moving /tmp to a tmpfs

There is a move afoot in Linux-land to make /tmp a tmpfs file-system. For those of you who don't know what that is, tmpfs is in essence a ramdisk. openSUSE is considering the ramifications of this possible move. There are some good points to this move:

  • A lot of what goes on in /tmp is ephemeral files for any number of system and user processes, and backing that with RAM instead of disk makes it all go faster.
  • Since a lot of what goes on in /tmp is ephemeral files, backing it with RAM saves writes on that swanky new SSD you have.
  • Since nothing in /tmp should be preserved across reboots, a ramdisk makes sense.

All of the above make a lot of sense for something like a desktop-oriented distribution or use-case, much like the one I'm using right this moment.
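
For illustration, here is roughly what the fstab entry for a tmpfs-backed /tmp looks like (a sketch; the 2G size cap is my own example value, not any distribution's default):

    # /etc/fstab: mount /tmp as a RAM-backed tmpfs, capped at 2 GiB, world-writable with sticky bit
    tmpfs   /tmp   tmpfs   size=2G,mode=1777   0 0

The size= option matters: left off, tmpfs defaults to half of RAM, which feeds directly into the downsides below.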

However, there are a couple of pretty large downsides to such a move:

  • Many programs use /tmp as a staging area for potentially-larger-than-RAM files.
  • Some programs don't clean up after themselves, which leaves /tmp growing over time.

Working as I do in the eDiscovery industry, where transforming files from one type to another is a common event, the libraries we use to do that transformation can and do drop very big files in /tmp while they're working. All it takes is one bozo dropping a 20MB file with .txt at the end (containing a single SMTP email message with a MIME'd attachment) to generate an eight-thousand-page PDF file. Such a process behaves poorly when told "Out Of Space" for either RAM or /tmp.

And when such processes crash out for some reason, they can leave behind 8GB files in /tmp.

That would not be happy on a 2GB-RAM system with a tmpfs-style /tmp.

In our case, we need to continue to keep /tmp on disk. Now we have to start doing this on purpose, not just trusting that it comes that way out of the tin.
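
On a systemd-based distribution that has made the switch, keeping /tmp on disk looks to be a one-liner (a sketch; your distribution's documentation is the authority here):

    # Tell systemd never to mount a tmpfs over /tmp; /tmp then stays on the root (or its own) disk file-system
    systemctl mask tmp.mount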

Do I think this is a good idea? It has merit on high-RAM workstations, but it is a very poor choice for RAM-constrained environments such as VPSes and virtual machines.

Are the benefits good enough to merit coming that way out of the box? Perhaps, though I would really rather that the "server" option during installation default to disk-backed /tmp.

Tape!

Tape isn't going away, much like mainframes never actually went away. However, its utility is evolving somewhat.

The emergence of the Linear Tape File System is quite interesting. It's an open tape format (nothing new there) that looks to have broad acceptance (the new part), especially since the LTO governing body has adopted it, and LTO is the de-facto standard for tape in the industry right now.

Open standards make the long-term archive problem easier to tackle, since the implementation details are widely understood and are more likely to either still be in use or have industry expertise available to make it work should an old archive need to be read. They also allow interoperability; a "tier-5" storage tier consisting of tape could allow duplicates of the tape media to be housed in an archive built by another vendor.

In my current line of work, a data-tape using LTFS would be a much cheaper carrier for a couriered TB of data than a hard-drive would. We haven't seen this yet, but it remains a decided possibility.
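
To give a feel for the workflow (a sketch based on my reading of IBM's LTFS Single Drive Edition docs; device and directory paths are placeholders, and flags may vary by implementation):

    # Format the tape as an LTFS volume, then mount it like any other disk
    mkltfs -d /dev/IBMtape0
    ltfs /mnt/ltfs -o devname=/dev/IBMtape0
    cp -r /case-data/export /mnt/ltfs/

Once mounted, the tape behaves like a (slow, sequential) file-system, which is the whole point.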

Understandably, the video content industry is a big fan of this kind of thing since their files are really big and they need to keep them around for years. The same technology could be used to build a computer-image jukebox for a homebrew University computer-lab imaging system.

Looking around, it seems people are already using this with CloneZilla. Heh.

It's hard to get excited about tape, but I'll settle for 'interested'.

A complaint

This came across on a private twitter feed, otherwise I'd attribute the bejebers out of this one.
[Image: NoBros.png]
I know it's not my field, and probably not yours if you're a regular around here, but even I have noticed a steady uptick in ninja rockstars on job postings for people who do software engineering all day.

So, so true.

That said, it does bring up an opportunity for when you need to brush an employer off for being too skeevy.

"I'm sorry, it turns out I'm not a good fit for $Company. I'm more of a rockstar, not a ninja."

Time off

I'm not going anywhere, but yesterday the following article was thrown under my nose:

Unlimited Vacation Doesn't Create Slackers -- It Improves Productivity

The base premise is very roughly:

If you let employees take the time they need, when they need it, they'll perform better than if you tracked (and metered) the time they are allowed to take.

True, dat. My last three jobs:

  1. Current Job: Unlimited time off
  2. Last Job: 5 weeks of vacation a year, 3 weeks of sick time.
  3. Last +1 Job: Seniority-based vacation times, starting at 2 weeks vacation + 3 sick, topping out at 5 weeks vacation + 3 sick for 20-year veterans.

Since my previous two jobs tracked time, both of them had the 'use it or lose it' phenomenon when you hit the maximum time carry-over. At Last +1 Job that was December; at Last Job it was your hire anniversary. The fact that people were hitting their carry-over max shows that they were not using all of the vacation allotted to them (or, alternately, were consciously building up a severance-pay pad via vacation cash-out).


And I must say, when you work someplace that's "hey, whatever" about hours, you get less stressed about work. You're also tempted to 'work' outside your normal bounds. The article mentioned studies that show people working 'from vacations' or suchlike. They're out of the office (yay!) but still being productive. A win for the company, and a win for employee agency.

However, having lived with 5 weeks of vacation a year (or, as I've called it in the past, "five freaking weeks of vacation"), the effect is almost the same as unlimited. You just take time off when you need it and don't sweat it. When your carry-over hits max, that's when the differences show up; we had people take two-week 'call me if you need me, I'll be around' vacations just to burn it off.

As a highly paid IT professional, I've shown I have sound judgment. It's nice that my employer recognizes that and treats me like I can be trusted with company-time management.

Also, the "Fool's Errand" idea is genius. I'll have to see if we can do something like that at work.

The cost of insecurity

| 1 Comment
This rant is a familiar one to a lot of desktop-facing IT professionals.

Today the person who handles our bank stuff came to me with a problem: the check-scanner wasn't working. I poked around but couldn't make it work, and advised her to talk to the bank, since they provided the scanner, the scanner driver, and the IE plugins it worked with; Windows didn't even recognize it as a scanner.

So she did.

Their advice?

You need to turn off Windows Update. Every time it runs it changes things in IE and you have to go through and do a bunch of things to make it work.

I gave her the look. Then I remembered myself and redirected the look to a harmless corner until I could speak again. Even getting a piece of that baleful stare caused her to cringe. Oops.

Like all right-thinking Windows admins, I'm a believer in leaving Windows Updates turned on if I'm not doing something else to manage that particular risk. Our fleet of Windows machines is not big enough for me to bother with WSUS (we have a LOT of Mac users), and we most definitely do not do anything like blocking browsing at the border. So I want those updates, thanks.
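
For the record, pinning automatic updates on via local policy looks something like this (a sketch using the documented WindowsUpdate\AU policy values; run elevated, and AUOptions 4 means "auto-download and schedule the install"):

    reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU" /v NoAutoUpdate /t REG_DWORD /d 0 /f
    reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU" /v AUOptions /t REG_DWORD /d 4 /f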

So I pulled a laptop out of the dead pile. That laptop will now be a dedicated machine for talking to the bank and nothing else. All because our freaking bank can't run securable software. Makes you question your trust in them, it does.

In the last month-ish I've had a chance to read about and use two new graphical shells that impact my life:

  • Windows 8 Metro, which I already have on my work-phone
  • Gnome 3

Before I go further I must point out a few things. First of all, as a technologist, change is something I must embrace. Nay, cheer on. Attitudes like "if it ain't broke, don't fix it" are not attitudes I share, since 'broke' is an ever-moving target.

Secondly, I've lived through just enough change so far to be leery of going through more of it. That leaves me with some caution about change-for-change's-sake.



As you can probably guess from the lead-up, I'm not a fan of those two interfaces.

They both go in a direction the industry as a whole is going, and I'm not fond of that direction. This is entirely because I spend 8-12 hours a day earning a living using a graphical shell. "Commoditization" is a battle-cry for change right now, and that means building things for consumers.

The tablet in my backpack and the phones in my pocket are all small, touch-screen devices. Doing large-scale text-entry in any of those, such as writing blog posts, is a chore. Rapid task-switching is doable through a few screen-presses, though I don't get to nearly the window-count I do on my traditional-UI devices. They're great for swipe-and-tap games.

When it comes to how I interact with the desktop, the Windows 7 and Gnome 2 shells are not very different other than the chrome and are entirely keyboard+mouse driven. In fact, those shells are optimized for the keyboard+mouse interaction. Arm and wrist movements can be minimized, which extends the lifetime of various things in my body.

Windows 8 MetroUI brings the touch-screen metaphor to a screen that doesn't (yet) have a touch-screen. Swipe-and-tap will have to be done entirely with the mouse, which isn't a terribly natural movement (I've never been a user of mouse-gesture support in browsers). When I do get a touch-screen, I'll be forced to elevate my arm(s) from the keyboarding level to tap the screen a lot, which adds stress to muscles in my back that are already unhappy with me.

And screen-smudges. I'll either learn to ignore the grime, or I'll be wiping the screen down every hour or two.

And then there is the "One application has the top layer" metaphor, which is a change from the "One window has the top layer" metaphor we've been living with on non-Apple platforms for a very long time now. And I hate it on large screens. Apple has done this for years, which is a large part of why Gnome 3 has done it, and is likely why Microsoft is doing it for Metro.

As part of my daily job I'll have umpty Terminal windows open to various things and several Browser windows as well. I'll be reading stuff off of the browser window as reference for what I'm typing in the terminal windows. Or the RDP/VNC windows. Or the browser windows and the java-console windows. When an entire application's windows elevate to the top it can bury windows I want to read at the same time, which means that my window-placement decisions will have to take even more care than I already apply.

I do not appreciate having to do more work because the UI changed.

I may be missing something, but it appears that the Windows Metro UI has done away with the search-box for finding Start-button programs and control-panel items. If so, I object. If I'm on a desktop, I have a hardware keyboard, so designing the UI to minimize keystrokes in favor of swiping is a false economy.

Gnome 3 at least has kept the search box.



In summation, it is my opinion that bringing touch-UI elements into a desktop UI creates some bad compromises. I understand the desire to have a common UI metaphor across the full range from 4" to 30" screens, but input and interaction methodologies are different enough that some accommodation to form-factor needs to be made.