LCA2010 – Day 1

The first real day of Linux.conf.au is always full of anticipation. I woke up a little early and nibbled a small breakfast as I walked from ustay to the venue. After the crap weather on the weekend things were starting to look a bit better.

The signup area at the venue was fairly quiet, with people being processed quickly and many having already signed up over the weekend.

First up was the Welcome talk, which had a few hitches. Due to illness it was being given by an understudy, who was a little unpracticed with the delivery and had to cope when the overhead screen went blank for 5 minutes due to technical problems (not sure if it was the screen or the laptop's fault). Highlights were a 42 Below ad for Wellington and everybody singing Happy Birthday to Rusty.

I spent the first couple of sessions at the Haechsen/LinuxChix Miniconf since most of the topics were interesting and for various reasons (mumble mumble) talk times between miniconfs were not sync’d so it was hard to move between them.

It looks like the video situation is fairly good this year. All miniconfs and main sessions are being both streamed live (although in WMA format, which caused some comment) and recorded for later download. Hopefully it'll all work out.

Talks I attended:

  • Version control for mere mortals by Emma Jane Hogbin was a good intro to VCS and practices, including a bit aimed at sysadmins and content maintainers rather than just coders. She obviously likes Bazaar a lot more than git. Good intro, and once again I feel guilty about not using version control more.
  • Happy Hackers == Happy Code by Sara Falamaki was an overview of what makes programmers happy. Mostly concentrating on tools but with some other bits and pieces mentioned. Great, especially the bit where Sara started throwing (often wildly) lollies to members of the audience who made good suggestions.
  • Through the Looking Glass by Elizabeth Garbee gave her perspective on using open source software at the high-school level. Interesting stuff on tools and how other teens viewed open source and programming, plus the scary story about how her school had a rule that any student who brought a computer to school running Linux/Unix would be expelled!
  • Creating Beautiful Documentation from Lana Brindley covered some high-level bits of the process Red Hat uses to create documentation, as well as a bit of an overview of what technical writers do and why their jobs rock 🙂
  • Getting your feet wet by Angela Byron gave ways and advice for getting involved with open source projects, including the old "woman's work" (my term, not hers) of documentation etc. Pretty good.
  • Code of our own from Liz Henry was about the first feminist-oriented talk of the day. Lots of stories and advice for women in open source, as well as a few bits where she gave her low opinion of how well some ideas have worked in practice.

Overall fairly interesting sessions. I noticed that for most of the two sessions the majority of people in the room were male, and quite a few of the audience questions/comments were from them. This didn't really cause a problem for most talks, which were on general topics, but I noticed the "male perspective" was less useful/welcome for Liz Henry's talk.

For lunch I wandered around a little and eventually found a place called "The Coffee Club" where I had a soy milkshake and a pesto bruschetta. Very nice.

For the last sessions I went to "The Business of Open Source" Miniconf and then "Libre Graphics".

  • The 100 Mile Client Roster from Emma Jane Hogbin was an interesting overview of the way her business and business model have evolved and where she thinks the next step is. Good talk and delivery, although it's a bit outside my area for me to give a good review of the content.
  • Building a service business using open source software by Cameron Beattie didn't really appeal to me. The talk was a bit flat and the delivery lacked spark.
  • Cheap Gimmicks to Make your Designs 'New' by Andy Fitzsimon suffered a bit from technical problems with the delivery, but it looked like there was a good talk in there somewhere that just needed a bit more prep.
  • Dynamic PDF reports via XSL and Inkscape by Peter Lieverdink was cool but a little over my head.
  • Inkscape: My Cheerleading Adventures by Donna Benjamin was a little sparse even for a 5-minute talk.

After the end of the day I went along to a Wikipedia meetup at the Southern Cross Hotel. The meetup was fairly small (just 3 other people) but the people were interesting and we had several hours of discussion. Some talk about a NZ Wikimedia chapter and also about helping with the Wikimedia stand at the LCA open day.

Last up I grabbed a coffee and cake at Midnight Espresso.

Overall not a bad day. Tomorrow will be the Sysadmin Miniconf all day, with the Speakers Dinner in the evening.


Hole in nbr.co.nz paywall

Update: NBR have fixed the hole

It looks like the National Business Review has a hole in their paywall. I don't know if this is an intentional hole, but as at the time I'm posting this it enables people to read articles that are "subscriber only content".

A sample restricted article by Chris Keall: "Did Paul Reynolds collect millions for hitting squishy targets?". If I browse to it via http://www.nbr.co.nz/article/did-paul-reynolds-collect-millions-hitting-squishy-targets-109070 I get an error message:

[Screenshot: blocked version of the article]

However if I take the article number (109070) and access it via the URL http://www.nbr.co.nz/print/109070 I can see the whole article content:

[Screenshot: visible version of the article]

I guess somebody made a little mistake with the way they set things up, or possibly this is designed to allow search engines like Google to still find and index NBR's content.


Tech Updates, looking to the future

A few things I’ve been looking at or intending to look at over the next few months.

  • I bought a new computer a couple of weeks ago for home. It is intended to replace the house server; the main functions will be as a file server and host for virtual machines. The big change is that I'll be switching from Xen to KVM as the virtualisation technology.
  • KVM + PXE + Kickstart + Ubuntu – I really want to build my virtual machines automatically and at the same time use a more general machine-building method. This page on the Ubuntu site looks like a good start, and I'll blog a bit when I get it all done (there's a rough sketch of what I have in mind below this list).
  • I need to do some work on Mondo Rescue; I have a bug I reported that is supposed to be fixed and that I need to test.
  • GlusterFS is a distributed network file system that looks really cool; I'm intending to play with it a bit.
  • Once again we've applied to run a Sysadmin Miniconf at the 2010 Linux.conf.au conference, and once again we hope to have a really good miniconf. However no fewer than 32 miniconfs have applied for just 12 slots, so I'm not sure if we'll get in. We were really popular last year but personally I've no idea what our chances are this year. I'm a bit down about the thought of not getting in, but I guess whatever happens will happen.
  • I keep getting good ideas for websites and products. Not being a programmer and having poor time management means most of these ideas are probably not going anywhere, but maybe I'll try a couple of them. I've also got some further ideas for technologies to play with, but I want to get the ones above sorted first.
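
For the KVM + PXE item above, something like the following virt-install call is the sort of thing I have in mind. This is just a sketch under assumptions: virt-install/libvirt is one option rather than necessarily what the Ubuntu page describes, the guest name, disk size and bridge name are made up, and flags vary a little between virt-install versions.

# create a new KVM guest and boot it straight off the PXE/Kickstart server
# (hypothetical names and sizes)
virt-install \
    --name testvm01 \
    --ram 1024 \
    --disk path=/var/lib/libvirt/images/testvm01.img,size=10 \
    --network bridge=br0 \
    --pxe \
    --nographics

The guest should then pick up its entire install from the PXE/Kickstart (or preseed) infrastructure, which is the "more general machine building method" part.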

Watching processes with monit

I've been having a small problem on one of my servers with the http daemon dying every week or two. It's not often enough to be a huge problem or worth investing a lot of time in, but enough of a nuisance to require a fix. So what I ended up doing was installing monit to look after things.

monit is a simple daemon that checks on server resources (mainly services and daemons, but also disk space and load) every few minutes and sends an alert and/or restarts the service if there is a problem. So after installing the package (apt-get install monit) I just created a series of rules like:

check process exim4 with pidfile /var/run/exim4/exim.pid
   start program = "/etc/init.d/exim4 start"
   stop program = "/etc/init.d/exim4 stop"
   if failed host 127.0.0.1 port 25 protocol smtp then alert
   if 5 restarts within 5 cycles then timeout

check process popa3d with pidfile /var/run/popa3d.pid
   start program = "/etc/init.d/popa3d start"
   stop program = "/etc/init.d/popa3d stop"
   if failed port 110 protocol pop then restart
   if 5 restarts within 5 cycles then timeout

for the main processes on the machine. Sample rules are available in the config file and documentation, and Google is fairly safe as long as you make sure you don't copy a 10th-generation rule off a "Ruby on Rails" site (RoR components apparently require frequent restarts). All up the whole install and configuration took me around half an hour and I'm now monitoring:

# monit summary

System 'crimson.usenet.net.nz'      running
Process 'lighttpd'                  running
Process 'sshd'                      running
Process 'named'                     running
Process 'exim4'                     running
Process 'popa3d'                    running
Process 'mysql'                     running
Process 'mailman'                   running
Device 'rootfs'                     accessible
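
monit also handles disk space and load (presumably something like the first rule below is what produces the Device 'rootfs' line in the summary). Rules for those look roughly like this - a sketch only, since the exact keywords vary a little between monit versions:

check device rootfs with path /
   if space usage > 90% then alert

check system crimson.usenet.net.nz
   if loadavg (5min) > 4 then alert
   if memory usage > 80% then alert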

Hacking InternetNZ Council Vote

InternetNZ is the main New Zealand Internet lobby and policy organisation. More or less, they take money from .nz fees and redirect it to benefit the New Zealand Internet and Internet users.

In a few days there is an election for its president and council. Following a post by Andy Linton to the NZNOG mailing list about the "need for a strong voice from the technical community", several technical people have put their names forward for council.

Following a discussion on the InternetNZ mailing list I realised that many people are unsure of the best way to rank a list of candidates to ensure the "best" result. Looking around I was unable to find a good reference for this online, so I thought I'd write a quick post here. I should give the disclaimer that I'm not an expert in this area, so possibly I've made an error. I'm also only addressing the council vote, not the President and Vice-President votes.

Voting System

The voting system for InternetNZ is outlined here, but what it simply means for the voter is that they rank the candidates from 1st to last. For each council seat the lowest-polling candidates are eliminated and their votes allocated to the next preference until one candidate has an absolute majority. For the next council seat it happens again, except that the ballots that had the previous round's winner as first preference are eliminated from any further consideration.

You can see what happened last year here. There were 9 candidates, 6 seats and 90 voters. Rounds 1 through 7 show people being eliminated and their votes transferred around until Jamie Baddeley is elected. In round 9 it starts again, but 16 votes have been removed from the pool; these are the people who voted for Jamie as their first preference.

In rounds 9 through 15 the eliminations continue until Michael Wallmannsberger is elected. Then his 16 first-preference votes are removed and it starts again, until all 6 seats are filled. The 2006 result is also online.

The interesting thing to notice is that the only ballots eliminated after each seat is decided are the ones that put the elected candidate as their 1st preference. So while the 16 people who voted for Jamie Baddeley helped elect him in the first round, they had no influence in later rounds. On the other hand, the 22 people who put Neal James, Carl Penwarden, Sam Sargent and Michael Payne as their first preference got to participate in all 6 rounds of the election.

So what is the trick?

So out of the candidates I would characterise the following people as technical: Lenz Gschwendtner, Glen Eustace, Stewart Fleming, Andrew McMillan, Dudley Harris, Gerard Creamer, Nathan Torkington and Hamish MacEwan. This is eleven out of the 17 candidates running for the four  council seats.

Now assuming that there is some level of support for technical candidates, the worst case would be that all "technical" voters put, say, Nathan Torkington (to pick a well-known name) as their first preference. Nathan is elected as the first candidate, and then the technical voters have no further influence on the other 3 council seats.

Instead we want to make sure that techie votes elect as many candidates as possible.

So what should I do?

Note: I am using the term "round" below to refer to each council seat election (6 in 2008, 4 in 2009).

If you have a group of voters and a group of candidates you have two main objectives:

  1. Avoid giving a first preference to a candidate who will be elected in the early rounds, so that your ballot participates in as many rounds as possible.
  2. Give your candidates enough first preferences to ensure they are not eliminated early in each round.

The first objective seems easy: don't give your first preference to a technical candidate who is going to be elected early anyway. However this is where the second objective comes in: you need to give the technical candidates enough first-preference votes that they are not eliminated early in every round.

I think the following should work:

  1. Rank all the candidates in your order of preference.
  2. Decide how far down the list you are "happy" with the candidates (ie the 11 techies listed above).
  3. Randomly (yes, really randomly) pick one of the acceptable people and put them as your first preference.

The idea now is that if, say, we have 40 technical people voting, then each of the 11 technical candidates will end up with at least 3 or 4 first-preference votes. As the lowest-ranked of these are eliminated, preferences will flow to the other technical candidates (in order of popularity). If a technical candidate is elected, only around 1/10 of the technical ballots will be removed from later rounds, so there is still a good chance of electing other candidates.

What could go wrong?

It's possible that the random allocation of first preferences will result in a popular candidate (eg Nathan Torkington) randomly getting a smaller number of first preferences and being eliminated early in every round. I think this is a small risk since:

  1. it is likely that popular candidates will get first preferences from other voters
  2. popular candidates will have a higher random chance of being put as first preference, since they will be in the "acceptable" list of more techie voters
  3. even if this does happen, others in the slate will still get in.

Feel free to let me know if you have any questions (or to point out horrible errors I've made).


What I want in a netbook for 2010

A recent thread about laptops on the NZLUG list reminded me that I'm not 100% happy with the way netbooks are evolving. When the EEE came out, the idea was that you'd buy a cheap, portable PC which would do 90% of what people used PCs for (email, browsing, simple documents, simple video and audio).

However the problem is that the portability and cheapness seem to be going out the window, as "netbooks" now cost as much as low-end laptops and are getting almost as big. So the big advantages of my existing EEE:

  • Small and light enough to carry in my bag all the time and not notice.
  • Cheap enough that I can leave it unused for 2 months and not feel like I've wasted money.
  • Cheap enough that people can give one to their kids and not worry about the kid breaking it.
  • Solid state so I don’t worry about dropping it.

are sort of lost with the new netbooks. Remember how the original EEE (nearly 2 years ago) was supposed to cost just $US199? That is the sort of price we need so people can buy them as "kids' toys", "play machines", "travel machines", etc.

I'm intending to buy a replacement for my EEE in 2010 (3-year replacement cycle). What I'd really like to get would be:

  • Case the same size as EEE70x or EEE90x series
  • Display 1024×768
  • 1GB RAM ( upgradeable would be nice )
  • 8 or 16GB built in flash drive
  • CPU fast enough to play video on full screen
  • Ports: 3xUSB , Ethernet, WiFi, SD-slot, VGA, Sound/Mic , Camera
  • 6+ hours battery
  • Ubuntu standard
  • no more than $US 300

I think having a standard Linux OS (I like Ubuntu, but that's me) that netbook makers can just install on their machines, or that targets a netbook platform, would be a big win. Even better if it's a "full status" version of Ubuntu that gets updates every 6 months, and best of all would be "standard" Ubuntu that would "just work" on a smaller machine.

I'm hoping the 3rd (4th?) generation of netbooks can be what I want. The 1st generation was just getting something out there (EEE 701), the second was upping the spec as people demanded more, and I hope with the 3rd that the performance is now "good enough" and the cost and size can be shrunk back down again.


A week of Twitter

About a week ago I signed up for a Twitter account and started micro-blogging. I'm at 89 updates, which is around a dozen a day, although this week things are busy with the Blackout Campaign against Section 92A of the new Copyright Act, so in a typical week there will probably be fewer (especially when the novelty wears off). Where possible I'm trying to make tweets that might be of interest to other people, especially things like links to good articles which in the past I sometimes posted to the main blog.

Following people is interesting. For now I just look at the last page of what somebody has posted, and if it looks interesting (on average) I'll add them. So I'm following around 50 feeds so far and I'll see how it goes, but since on average the impact of each feed is less than an RSS feed (ie I'll usually not scroll back to stuff I missed overnight) I'm not overwhelmed yet.

So far I am using the web interface a bit (which is good for looking at people, their followers and history), Twitux at home and TwitterFox at work. If you go to my actual blog website you'll see I've added an RSS feed of my tweets (it only updates every hour or so), and I added the Twitme WordPress plugin so that every time I post to my blog a tweet is sent.

Earlier today I was inspired by the nzpolice feeds that Sam Sargeant created and decided to create my own. So I've made the nz_quake Twitter bot, which updates whenever a new earthquake is reported on the GeoNet website. The actual bot is just a shell script that checks every few minutes to see if the status page has changed, and if it points to a new earthquake (they have unique IDs) it uses curl to post to Twitter's simple web interface.

It took me around an hour of playing around to implement in around 35 lines of shell script. I'll have to wait a few days to see how well it works, since the webpage only reports the earthquakes that people might have felt rather than every tiny little one.
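
The guts of it look roughly like this (a cut-down sketch rather than the real 35-line script: the GeoNet URL and the pattern used to pull the quake ID out of the page are placeholders, and the Twitter call is just the basic-auth statuses/update endpoint). Run it from cron every few minutes:

#!/bin/sh
# Sketch of the nz_quake bot: poll the GeoNet page and, if a new quake ID
# shows up, post it to Twitter. The URL and ID pattern are placeholders.
FEED="http://www.geonet.org.nz/..."      # placeholder for the recent-quakes page
STATE=/var/tmp/nz_quake.last

latest=$(curl -s "$FEED" | grep -o 'quakes/[0-9]*' | head -n 1)
[ -z "$latest" ] && exit 0               # page unreachable or format changed

if [ "$latest" != "$(cat "$STATE" 2>/dev/null)" ]; then
    curl -s -u nz_quake:PASSWORD \
         -d status="New earthquake reported: http://www.geonet.org.nz/$latest" \
         http://twitter.com/statuses/update.xml > /dev/null
    echo "$latest" > "$STATE"
fi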


Strike 1 against APRA

The big thing this week has been the protest against the new Section 92A of the New Zealand Copyright Act, which will cut off the Internet for people or organisations repeatedly accused of copyright violations. The new law says that after repeated infringement complaints (4 strikes is the current proposal in the code of practice) ISPs will have to close the account of their customer.

So I thought I'd have a look around the website of APRA (the Australasian Performing Right Association), when I came across this page, which is a cut-and-paste of a recent New Zealand Herald story. Since I started poking around it looks like the story has mostly been removed (which indicates APRA did not have permission to host it), but there is a Google cache of it here (mirrored below) and the original page still has a photo from the Herald story.

So it appears that while on one hand APRA is happy to write press releases about how "Those working in the creative industries need the protection from theft of their work", on the other hand it is quite happy to rip off material from websites when it thinks it might be of interest to its members.

Of course the law hasn't come into force yet, so APRA has managed to avoid the threat of having its Internet cut off and instead just gets a bitchslap from the Herald, but it is certainly an interesting combination of hypocrisy, stupidity and arrogance on their part (they can't claim ignorance, I'm afraid).

[Image: APRA ripping off the NZ Herald]
APRA vs NZH. Click for full-size image.

LCA09: Day 5: Friday, plus bonus Saturday writeup

The keynote this morning was from Simon Phipps of Sun. I thought he was quite good, especially since he was in front of an audience that was not 100% friendly. One of the interesting points he made was that it is hard for a company to install a free version of Redhat (say CentOS or Fedora) and then later start getting commercial support for it. As things are right now you would have to reinstall all your servers with RHEL in order for Redhat to support you. He felt that sooner or later Redhat would have to change this policy in order to allow an easy transition for people, although at least one Redhat person in the audience either missed his point or completely disagreed with it.

Next I went to a talk from Matthew Wilcox on solid-state drives. It was pretty interesting, although a little over my head.

Afterwards I hosted (to the extent that I stood up, wrote notes and pointed at people) a BoF for miniconf organisers. Around a dozen people showed up, including about three-quarters of this year's miniconfs plus at least one prospective one for next year. We had a good round of discussion, I wrote up a few notes (not really public, sorry, but contact me if you have a special interest), and I somehow volunteered to help put together a Miniconf Organiser's howto document.

After lunch I went to sessions on power management and usability. Both excellent, and they gave me a chance to pick up some information in areas outside of what I normally do.

Then it was a presentation from Terri Irving of Dreamhost. She did some overview stuff about how they do things and then a little bit about how they use their internal "servicectl" tool to provision and run their services, but not a lot of technical nitty-gritty. She used the second part of her talk to introduce the Ceph distributed file system that one of the Dreamhost people is working on (and which is publicly released), so the talk wasn't a total loss.

I had a bit of a headache so I skipped the lightning talks and the wind-up and announcement for next year. As expected it'll be in Wellington. Generally I think this should work out okay; the extra distance for Australians should be balanced by a good number of locals attending. I did hear some concern about the lack of publication of the bid documents and the fact that the Wellington organisation seems to be "company led" rather than "community led". Also I am not sure if cheap accommodation is going to be available; having the college dorms like in previous years is great, with lots of space, security and closeness to the other attendees at a low price. Hopefully Wellington can do something similar.

Anyway, I missed all of that and had a snooze till after 7, when I got up and went into the Sandy Bay shops with a few guys. On the way we went via the party but it didn't seem to be much fun. The drinks were "buy your own" (probably to avoid problems from previous years) and the food was of the sausage-roll type (I heard that Google was unimpressed with the food quality after having dropped a lot of $$$$ towards it).

We ended up going to a nice pizza place with 4-5 other guys and having a really nice pizza at a good cheap price (low $20s per head). The manager even gave us some free shots at the end (cool, although I don't really drink so I just had a small one).

The next day was the semi post-conference day, with the Open Day being the only real official thing happening, plus a cheesy-sounding "march" from Salamanca Place to the Open Day (at the Casino). I decided to skip these and went to the Salamanca market instead. I was really impressed with the whole thing: 200-300 stalls, mostly high quality, with plenty to choose from. Since I was flying back to NZ I couldn't buy much food to take away, but I got some nice fudge and ate at a couple of vendor stalls. A couple of galleries by other people are here and here.

And apart from an uneventful trip back that was my linux.conf.au for 2009.

Overall I felt the whole event was on par with previous years. I understand they had a few speakers and attendees drop out at the last minute, which was a bit unlucky, and the extra travel distance probably put some off. I got the impression there were not as many locals as in previous years, but I guess Hobart isn't a big place.

The organisers seemed pretty on the ball most of the time and largely kept in the background compared to previous years. The weather was pretty good (apart from some sprinklings of rain) and nothing extreme enough to cause problems. I'll definitely be back again.


RHEL 5.3 + HP = kernel panic

So I was upgrading an HP server to the latest Redhat Enterprise Linux 5.3 today. However, when I rebooted the machine after the kernel upgrade it died with a kernel panic. I wasn't anticipating this, since RHEL 5.3 has been in beta for months and the kernel is over a month old, which should have shaken all the bugs out.

It turns out there is a Redhat Knowledgebase article for it. The problem is that the ProLiant Support Pack (PSP) tools, which we use to monitor the hardware (RAID, fans, heat, etc) from the OS, use binary kernel modules. So when the HP daemons run and load those modules into a different kernel than the one they were built for, everything dies.

Even worse, HP have yet to release an update to the PSP that supports the new kernel. Pretty slack, and some people in the HP forums are not impressed. So roughly speaking I now have high-priced enterprise hardware that won't run the current version of the most common enterprise Linux distribution, unless I disable all the software that lets me talk to the "value added" hardware.
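
Disabling that software for now looks something like the following. A sketch only: the exact service names depend on which PSP version is installed (hp-health and hp-snmp-agents on recent packs, hpasm on older ones).

# stop the HP agents and keep them from loading their modules
# under the new kernel (service names vary by PSP version)
service hp-health stop
service hp-snmp-agents stop
chkconfig hp-health off
chkconfig hp-snmp-agents off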

Makes me wonder what I’m paying the extra 50% per box for.
