April 2008 Archives

Legal processes

Yesterday we received a Litigation Hold request. For those of you who don't know, this is the order given as part of a lawsuit ordering us to take steps to preserve data that could be used as part of the Discovery process of the suit. This is something that is becoming more and more common these days.

Our department has been pretty lucky so far. Since I started here in late 2003 this is the first Litigation Hold request we've had to deal with. We've had a few "public records requests" come through which are handled similarly, but this is the first one involving data that may be introduced under sworn testimony.

This morning we had an article pointed out to us by the Office of Finance Management at the state. WWU is a State agency, so OFM is in our chain of bureaucracy.

Case Law/Rule Changes Thrust Electronic Document Discovery into the Spotlight

It's an older PDF, but it does give a high level view of the sorts of things we should be doing when these requests come in. Two things that we don't have any processes for are the sequestration of held data and chain of custody preservation. We are now building those.

Guideline #4 has the phrase, "Consultants are particularly useful in this role," referring to overseeing the holding process and standing up before a court to testify that the data was handled correctly. This is very true! Trained professionals are the kind of people to know the little nuances that hostile lawyers can use to invalidate gathered evidence. Someone who has done a lot of reading and been to a few SANS classes is not that person.

Just because it is possible to represent yourself in court as your own lawyer doesn't make it a good idea. In fact, it generally is a very bad idea. Same thing applies to the above phrase. You want someone who knows what the heck they're doing when they climb up there onto the witness stand.

This is going to be an interesting learning experience.

Just this last weekend I went to Linuxfest Northwest, which is held here in Bellingham. This is nice! It's just a short drive.

One of the talks I went to was held by Ted Haeger, currently of Bungee Labs. The topic of the talk was one he had just posted to his blog, "Sharing Source Code In The Cloud".

One point he brought up that I hadn't heard of before is that the GPL triggers when you 'convey' the software to someone else. And that the GPL specifically excludes where the software is hosted on a server and users just use the software there, so long as the software itself never leaves the company in question. This is exactly what Google did and still does. All of their search IP was built on an OSS platform, but is still held as the crown jewels of their company; all because they haven't given the software to anyone else.

Apparently, this 'loophole' is being exploited by a LOT of new companies trying to get in on the software-as-a-service market. Such as Bungee Labs, as it happens. What effect will this have on the state of GPLed software? Hard to say, the market is still in its early days.

It makes you think.

And a gripe

2.5 hours is too freakin' long for "rug lu" to tell me which patches need application to this particular OES2 server. This needs fixing. I hope it's fixed in SLES10 SP2.

A couple days ago Richard Bliss had a long blog entry about "Novell's Cash Cow - How NetWare almost killed the company". It had some very interesting points. Some we knew:

"We are all familiar with NetWare, the dominant Network Operating system of the 1980s and 1990s. We are all familiar with Microsoft's tactics of penetrating the NOS market with Windows NT by focusing on using Windows as an application platform."

Apparently Richard worked for Novell around 2001. I find that interesting since my first BrainShare was 2001, and that was when they announced the release of NetWare 6.0. While there he saw what seemed to be an outright denial that NetWare had been passed up by Windows and something new needed to be done.

In 2001 I knew that Windows had for all intents and purposes won. The only place you ever really saw NetWare servers were as file-servers, or running GroupWise or the small handful of apps that used NetWare as an application server. The stalwart loyalists among us saw this as annoying, but not a major problem.

It was also good for Novell's bottom line. NetWare still accounted for a large percentage of their revenues. Even though the writing was on the wall, they were still making real money on it so didn't see a need to change. This is why NetWare 6.0 introduced the AMP stack to NetWare, as a way to better make NetWare an application server and to slow the loss of customers. At BrainShare 2001 there was open speculation about "NetWare 7.0" and what it would look like.

And there still was until 2005 when Novell announced what the next version of NetWare would be. This being after the SUSE and Ximian purchases, it would be based on Linux. This move had been rumored, and alternately derided and lauded, for some time. There was a great wailing and gnashing of teeth on the part of the stalwart NetWare loyalists. It also started an exodus of customers, as Novell's financial reports at the time point out.

Fortunately for the company, they started actively promoting (for certain values of 'active' that are higher than they were previously, but still in the theme of Novell Stealth Marketing) and developing their other products, like GroupWise, Novell Identity Management, ZenWorks, and most especially their Linux business. It took them until last quarter to turn in a quarter in the black, and NetWare revenues are under 20% of total now. So, they've turned the corner and are no longer dependent on the NetWare cash cow. They have a couple of them in the field now, which is a MUCH healthier place to be.

It's a funny thing, but one of the reasons why NetWare is such a kick-butt file-server compared to everything else is why it's a challenging environment to develop in. Had Novell seen the light earlier and bought SUSE (or rolled their own Linux distro) in... 1999 instead, right after the NW5.1 release, they still would have run into the fundamental architectural problems in 32-bit Linux that make it an inferior file-serving platform for large environments. By 2008 their server could have been a LOT more mature, and perfectly poised to take advantage of 64-bit Linux.

Novell in the 1990's is not an example of a 'nimble' company. It is trying to get there now through diversification. Not many companies (especially tech companies) have survived the loss of their prime money earner; Apple has done it through OSX, which required a fanatically loyal fan base to survive the dark years. This is the prime reason people kept predicting the imminent demise or buyout of Novell. Now that they're earning profits again, and have diversified away from just the OS sector, they're not going to be going out of business any time soon.

Now if only they had better SMB packages and programs. I hear repeatedly from peers who support SMBs that Novell's packages and programs in that space are lacking or exploitative. Significant revenue, and more importantly mindshare, are in the SMB market. Plus, today's SMB is tomorrow's large or global enterprise.

Beta attitudes

One thing I've noticed while working on this beta is a change in attitude. Specifically, attitude regarding problems. I've run into problems so far that would have had me throwing things across the room by now. Yet, instead I get that 'ahah!' feeling and proceed to figure out how it went poink exactly like that. And then report it. That feels good.

All of my prior bug-hunting has been post-release, when we ran into issues in production. Now, it's in pre-release and the bugs and issues I find now will be fixed by release (or at least documented so people know to expect it to break that way).

It's an interesting change in attitude.

On email, what comes in it

A friend recently posted the following:

"80-90% of ALL email is directory harvesting attacks. 60-70% of the rest is spam or phishing. 1-5% of email is legit. Really makes you think about the invisible hand of email security, doesn't it?"

Those of us on the front lines of email security (which isn't quite me, I'm more of a field commander than a front line researcher) suspected as much. And yes, most people, nay, the vast majority, don't realize exactly what the signal-to-noise ratio is for email. Or even suspect the magnitude. I suspect that the statistic of, "80% of email is crap," is well known, but I don't think people even realize that the number is closer to, "95% of email is crap."

Looking at statistics on the mail filter in front of Exchange, it looks like 5.9% of incoming messages for the last 7 days are clean. That is a LOT of messages getting dropped on the floor. This comes to just shy of 40,000 legitimate mail messages a day. For comparison, the number of mail messages coming in from Titian (the student email system, and unpublished backup MTA) has a 'clean' rate of 42.5%, or 2800ish legit messages a day.

People expect their email to be legitimate. Directory-harvesting attacks do constitute the majority of discrete emails; these are the messages you receive that have weird subjects, come from people you don't know, but don't have anything in the body. They're looking to see which addresses result in 'no person by that name here' messages and those that seemingly deliver. This is also why people unfortunate enough to have usernames or emails like "fred@" or "cindy@" have the worst spam problems of any organization.
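On the receiving side, the probing described above shows up in SMTP logs as one client trying many distinct recipients and mostly getting "no such user" replies. As an added example (not from the original post), this Python code flags such clients in a constructed log and lists the real addresses each one confirmed; the log format, addresses, thresholds and function names are all assumptions made for illustration.

```python
import re
from collections import defaultdict

# Constructed sample in a simplified, made-up log format (not a real MTA's):
# timestamp, connecting client, envelope recipient, SMTP reply code.
# 250 = recipient accepted, 550 = "no such user here" (assumed meaning).
SAMPLE_LOG = """\
2008-04-21T10:00:01 client=203.0.113.7 rcpt=alice@example.edu code=550
2008-04-21T10:00:01 client=203.0.113.7 rcpt=bob@example.edu code=550
2008-04-21T10:00:02 client=203.0.113.7 rcpt=cindy@example.edu code=250
2008-04-21T10:00:02 client=203.0.113.7 rcpt=dave@example.edu code=550
2008-04-21T10:00:03 client=203.0.113.7 rcpt=fred@example.edu code=250
2008-04-21T10:00:03 client=203.0.113.7 rcpt=greg@example.edu code=550
2008-04-21T10:00:04 client=203.0.113.7 rcpt=zed@example.edu code=550
2008-04-21T10:05:10 client=198.51.100.20 rcpt=cindy@example.edu code=250
2008-04-21T10:05:11 client=198.51.100.20 rcpt=cindi@example.edu code=550
2008-04-21T10:07:30 client=192.0.2.50 rcpt=fred@example.edu code=250
2008-04-21T10:07:30 client=192.0.2.50 rcpt=cindy@example.edu code=250
2008-04-21T10:07:31 client=192.0.2.50 rcpt=hal@example.edu code=250
2008-04-21T10:07:31 client=192.0.2.50 rcpt=ivy@example.edu code=250
2008-04-21T10:07:32 client=192.0.2.50 rcpt=jo@example.edu code=250
this line is malformed and is skipped
"""

LINE_RE = re.compile(
    r"^\S+ client=(?P<ip>\S+) rcpt=(?P<rcpt>\S+) code=(?P<code>\d{3})$"
)


def parse(log_text):
    """Yield (client_ip, recipient, reply_code) for each well-formed line."""
    for line in log_text.splitlines():
        m = LINE_RE.match(line.strip())
        if m:
            yield m["ip"], m["rcpt"].lower(), int(m["code"])


def find_harvesters(log_text, min_recipients=5, min_unknown_ratio=0.5):
    """Flag clients that try many distinct recipients and mostly miss.

    A legitimate sender with one typo'd address stays under min_recipients;
    a mailing list hitting many valid users stays under min_unknown_ratio.
    Returns {ip: {"tried": n, "unknown": n, "confirmed": [accepted addrs]}}.
    """
    tried = defaultdict(set)
    unknown = defaultdict(set)
    accepted = defaultdict(set)
    for ip, rcpt, code in parse(log_text):
        tried[ip].add(rcpt)
        if code == 550:
            unknown[ip].add(rcpt)
        elif 200 <= code < 300:
            accepted[ip].add(rcpt)

    flagged = {}
    for ip, rcpts in tried.items():
        ratio = len(unknown[ip]) / len(rcpts)
        if len(rcpts) >= min_recipients and ratio >= min_unknown_ratio:
            flagged[ip] = {
                "tried": len(rcpts),
                "unknown": len(unknown[ip]),
                # Addresses this client now knows are deliverable.
                "confirmed": sorted(accepted[ip]),
            }
    return flagged


if __name__ == "__main__":
    for ip, stats in find_harvesters(SAMPLE_LOG).items():
        print(ip, stats)
```

The "confirmed" list is the part worth acting on: those are the mailboxes that should expect more spam, which is what happens to the "fred@" and "cindy@" accounts mentioned above.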

As I've mentioned many times, we're actively considering migrating student email to one of the free email services offered by Google or Microsoft. This is because historically student email has had a budget of "free", and our current strategy is not working. The way it is not working is because the email filters aren't robust enough to meet expectation. Couple that with the expectation of effectively unlimited mail quota (thank you Google) and student email is no longer a "free" service. We can either spend $30,000 or more on an effective commercial anti-spam product, or we can give our email to the free services in exchange for valuable demographic data.

It's very hard to argue with economics like that.

One thing that you haven't seen yet in this article is viruses. In the last 7 days, our border email filter saw that 0.108% of incoming messages contain viruses. This is a weensy bit misleading, since the filter will drop connections with bad reputations before even accepting mail and that may very well cut down the number of reported viruses. But the fact remains that viruses in email are not the threat they once were. All the action these days is on subverted and outright evil web-sites, and social engineering (a form of virus of the mind).

This is another example of how expectation and reality differ. After years of being told, and in many cases living through the after-effects of it, people know that viruses come in email. The fact that the threat is so much more based on social engineering hasn't penetrated as far, so products aimed at the consumer call themselves anti-virus when in fact most of the engineering in them was pointed at spam filtering.

Anti-virus for email is ubiquitous enough these days that it is clear that the malware authors out there don't bother with email vectors for self-propagating software any more. That's not where the money is. The threat has moved on from cleverly disguised .exe files to cunningly wrought (in their minds) emails enticing the gullible to hit a web site that will infest them through the browser. These are the emails that border filters try to keep out, and it is a fundamentally harder problem than .exe files were.

The big commercial vendors get the success rate they do for email cleaning in part because they deploy large networks of sensors all across the internet. Each device or software-install a customer turns on can potentially be a sensor. The sensors report back to the mother database, and proprietary and patented methods are used to distill out anti-spam recipes/definitions/modules for publishing to subscribed devices and software. There is nothing saying that an open-source product can't do this, but the mother-database is a big cost that someone has to pay for and is a very key part of this spam fighting strategy. Bayesian filtering only goes so far.

And yet, people expect email to just be clean. Especially at work. That is a heavy expectation to meet.


My boss pointed us at an article this morning, about a topic near and dear to managers everywhere. Boomers are retiring, and for every 2 boomers leaving, 1.2 workers are entering the workforce. I know I've been watching a steady drum-beat of retirements the last few years.

In the article is this sentence:

"Statistically, Millennials are the most pluralistic, integrated, high-tech generation in American history—traits that make them ideally suited to our increasingly demanding, diverse and dispersed global workplace."

I had to snort. Not 10 years ago you could replace the word "Millennials" with "GenX" and it would have been true. And before that the "tweeners," the folk between GenX and the Boom, got the same treatment. And the boomers before them got it too. Each new generation is the most pluralistic, integrated, high-tech generation in American history. Whatever the people being born right now get called will be the same and the Millennials will get to feel a bit fuddy duddy.

My boss is a boomer, and our chief Unix admin is a boomer. That's it for Technical Services, so it doesn't apply as much to us as other groups. We're all GenX here, with one Millennial shared with Telecom who is moving on to something else soon. It's a bit different across the hall in ADMCS, but not a lot.

Protecting against Cosmic Rays

Apparently Intel filed a patent for a system to protect chips from cosmic rays.

This makes a lot of sense. I've explained to many people over the years just why it is that the computers that run the Space Shuttle are so much less capable than what they have on their desk. Part of that reason is due to cosmic rays. The smaller the transistor feature size, the more vulnerable the transistor is to charge flipping from things like cosmic rays. NASA has to deal with this any time it puts hardware in space.

The Cassini Probe around Saturn regularly goes into safe-modes due to Galactic Cosmic Rays that twiddle bits they aren't supposed to. Again, NASA expected these and engineered around them. Of scientific interest, they've run into different concentrations of these galactic cosmic rays during the cruise to Saturn and while in orbit around Saturn.

So why is Intel worrying about this here on the surface of the Earth? Because we also get cosmic rays down here too. Not nearly as many, but we get them. For years I've used the phrase, "Must have been a cosmic ray strike," when something computer-like breaks in truly weird ways. Only partially am I being flip about it.

In a wider scope, these 35nm feature size chips they're now coming out with are designed to work in very low radiation environments. Such as the type humans can live in unsupported. So when NASA/ESA/JAXA/Proton send laptops to the ISS, they're probably running older CPUs that are more radiation tolerant. Space is not a good place for supercomputing clusters.

Stupid user tricks

I had a case of this the other day. I was minding my own business, when suddenly one of my monitors starts going wonky. This is an LCD monitor, but an older one, so it isn't inconceivable that it could be going bad. How else would I explain the weird spots that were showing up on it? They looked like this:

[Image: Pretty spots]

Which looks like weird hot-spots in the screen. So I started muttering. Plus, the screen was noticeably dimmer. Futzing with the brightness and contrast settings didn't do a thing for it either. Plus it seemed to follow no matter which window I put on the hot spots.

Then, I realized what the problem was.

[Image: Pretty stars]

Compiz. Somehow, the rdesktop window that represents had been made slightly transparent, and the wall-paper was showing through. This screen shot is with the transparency fully down, you can barely make out the ConsoleOne icon in it.

So no, I didn't have a monitor going bad, I had a mouse mis-cue somewhere that caused that rdesktop window to go a bit transparent. No worries!

Should IT Shops Let Users Manage Their Own PCs?

It's a very Web 2.0 concept. And there is some merit to it. Back in the day when workstation lock-downs were getting common in workplace settings (ZENworks was good for that), there was a debate about some of this. At my old job one thing we wanted to lock down was the wall-paper. That one thing would help reinforce the idea that this was a WORK PC, not a home PC. The counter argument to this is that such user environment things are mostly harmless, so permitting them allows the lock-down to be less intrusive on the user.

This is another step in that direction. Workplaces have PC configuration standards for a variety of good reasons. You want all machines plugged into your network to not be festering hives of scum and malware, and these sorts of standards can prevent that. On the other end of the scale, high end users know the tools of their field better than your general IT desktop support person does and in theory can do more with the tools they know versus the tools forced upon them.

On the control end of the spectrum, you keep IT costs down by standardizing the configs in your enterprise. This keeps the Total Cost of Ownership down, a big thing for companies with the right internal costing controls (*nudge nudge*). One tech can support many more end users that way, since the range of things they support is kept to a minimum.

On the freedom end of the spectrum, the end user gets exactly the tools they want to do their job. They're happier that way. And since they support themselves, IT costs are controlled. One tech can support many more end users that way, since the bits they're supporting are significantly reduced.

The 'freedom' end of things runs smack into some standard industry practices, such as volume licensing and big-buy discounts. Dell, for instance, sells PCs cheaper if you buy them by the gross rather than in singles as users are onboarded. Specialized packages like AutoCAD also come cheaper if you buy them in packs of 10 rather than one at a time. Licenses all too often these days are timed and enforced, so you could have end users forgetting to renew the license on their Scrivener install and being non-productive for a few days while purchasing gets them a renewed license. The big 'endpoint management suites', what they seem to be calling the AntiVirus/Firewall package these days, all assume enterprise central control.

On the other hand, users like being treated like reasoning, intelligent people who are capable of making choices about their work environment. This makes for happier workers.

Also working in this favor is the trend to webify everything in the workplace. The days when you have a whonking big file-server to store all the company data on are slowly going away, and being replaced with things like SharePoint (which can get just as big, don't get me wrong). The fights we've had in the past about how to roll out a new Novell Client to all our desktops would be moot in such an environment as the 'client' is called 'Firefox' (or Gnome, or Office 2007).

On the downside of the 'freedom' end of things is piracy. Tools like Zen Asset Management are there to make sure that the software in use is actually legal. In this freedom environment there is the significantly increased probability of someone bringing their 'backup' copy of something from home to install on their work machine and creating legal liability for the company if they get audited.

Another downside is interoperability problems. The Microsoft Office users create document-macros that the WordPerfect Office users can't run, and the OpenOffice users can't read the WordPerfect files. The Microsoft Office users publish things to SharePoint, where the OpenOffice users drop their stuff onto a handy WebDAV server somewhere. Office peer-pressure will still work on software selection to a point, even if you absolutely love Package Q for your day-to-day work you won't use it if the software everyone else in the office uses can't do a thing with it.

The trade-off here is balancing the chaos and increased direct costs 'freedom' will introduce to the IT environment versus the productivity bonuses and intangible benefits (morale). That will decidedly depend on the culture of the office, and what it is that they do. I know some people who would leave their current jobs just to get the freedom to order the machine they want and use the software they want to use, even if it means somewhat fewer benefits.

A friend of mine recently changed jobs. The old job was Microsoft. Since Microsoft is a software development firm of some significant size, they try to dog-food their own stuff wherever possible; even if the tool is a poor fit for the task at hand. She spent a lot of time clubbing her software to do what it didn't really want to do, all the while knowing that there were two non-Microsoft packages that did exactly what she wanted. The new job is not with Microsoft, and the first day there they gave her an order sheet to order the software she wanted; they wanted results and trusted her to turn them in in an understandable format. Thus, the joys of freedom.

So, to answer the question, it depends. It depends on corporate culture to a significant degree, as well as the sector the company is in, as well as the work being done. In highly creative areas such as design, the benefits can be great. In highly regimented areas such as accounting, perhaps not so much or at least a high degree of freedom won't be worth the ultimate costs.

Slow blogging

I found out at BrainShare that WWU has been accepted as a Novell Authorized Beta site for OES2 SP1. And that's what I've been doing for the better part of the past week. Due to the NDA required, I can't talk about it. So, not much bloggable stuff to bring forward.

We requested entry into the program in part because of what I learned at BrainShare 2007. Specifically, Novell doesn't test for our scales of users. Therefore, it is in our best interest to make sure that organizations like us are in the beta. We have the hardware to make a go of it right now (all those new ESX boxes are liberating some still-useful 3-5 year old servers), and I have the time. Unfortunately, the only 64-bit testing we'll be doing will be in VMware, so the newest of the new code will have to be really tested by other people.

That's why I've been quiet.