The meeting was, in general and in my opinion, a success. Lana Brindley gave the first talk, entitled "10 Reasons Why You Do Not Want To Install Linux. Ever.", which was (no surprises) really a "10 old myths about not using Linux and why you should ignore them" talk. It was clever, well presented and covered all the things Linux users get tired of explaining. Several times Lana would pose a myth about Linux and people would automatically call out objections or corrections - which I take to be a good sign that her talk dispelled the myths that we enthusiasts want cleared away.
My talk, unfortunately, I feel was doomed from the start. It was "Paul's Ten Tips About Bash", and the content was definitely useful to some people - and I think it says a lot about Linux users that even the most learned people in the room still learnt a few tricks and mentioned some that I didn't know. However, it wasn't a talk for everybody, and importantly it contradicted Lana's number one point: that it's perfectly possible to use Linux without ever coming near the command line. My disappointment is that I didn't think of this earlier - I got carried away by my own geekiness. It should have been "Paul's ten tips on avoiding the command line", which many more people would have learnt from. Heck, I could have learnt a lot putting that talk together.
I'll do it at the next Linux Learners meeting, which will be in August (I think we'll set a schedule of doing them every three months and see how that goes).
There's more information on my website, so please feel free to email me with your questions. If you want to come along and are interested in having a sausage for lunch, please also drop me an email by Friday!
This meeting is a 'fixfest' and learner session for people new to Linux or still finding their way around. (That's most of us!) We'll be having short talks on a variety of subjects, but the majority of the night will be given over to people helping one another fix problems and learn their way around Linux.
Lana Brindley will be starting the night with a talk entitled "10 Reasons Why You Do Not Want To Install Linux. Ever." and with a provocative title like that you can tell it's going to be interesting! Paul will then give a short talk on tips he's learnt in using Bash, the current standard command line shell in Linux.
You're welcome to bring your computer along but please email me (email@example.com) beforehand so I can get an idea of the numbers of machines involved.
"Awesome Things You've Missed In Perl" is a good way of updating seasoned Perl coders on the new things you can do with recent versions of Perl. It's more about the modules that have come out in recent years that make writing Perl much easier, such as Moose, autobox, and autodie (of course). But there's still much that Paul mentions that exists in Perl 5.10 that makes a coder's life easier: given/when, the smart match operator and named captures, just for starters. For those of us who (somewhat guiltily) still have a second-edition Camel book on our desks, it's an excellent refresher and a reminder to get with the times. It also makes the transition to Perl 6 much easier.
The meeting was very well attended - 18 people - including three from the class that Paul was teaching. For me it was a wonderful "small world" moment, as a friend of mine from Melbourne happened to be at the course - though, of course, my embarrassingly useless memory for people meant I had to ask her name. But it really was quite wonderful to see Louise again, albeit somewhat briefly. The main programmer for the water resources project that Paul was tutoring was interested to learn about the Canberra Perl Mongers group and will hopefully join and become a regular participant.
So, thank you Paul Fenwick for making this a really great night!
The first minor glitch was discovering that I couldn't record and play back simultaneously. So I could watch the levels moving up and down, and note that they were at least vaguely reacting to the speaker and the audience, but I couldn't check that what was recorded was actually sensible. I tried stopping one recording (again, before the first speaker started) and then continuing recording in another session and playing back in the first session. Audacity played nothing. I tried to normalise the first while the second was recording. Audacity crashed. Don't do that again, then.
The first clue that something was going wrong was that the levels weren't following David's talking closely enough. They'd pick up loud noises in the room, but not David's pauses. The second clue was that this changed before Tridge started talking - the levels went up - but they stayed even flatter. Tridge knows when to raise his voice - we have no amplification for the CLUG meetings - and I should have seen definite peaks. And the levels weren't flat, either - so it wasn't recording silence. I thought I hadn't changed anything when the levels went up, but as it turned out I had.
The third clue had to wait until I got the recordings home: I listened to one and I could hear my comment to Tridge sitting beside me fairly clearly, but not his response. Otherwise, it was mostly buzz. Diagnosis: something was working, but the noise from the circuitry on the rather ordinary on-board audio was pretty high (no surprises there, laptops are not renowned for their onboard sownd, er, sound). And why had the levels gone up for the second recording? What had I changed? I couldn't remember. It took a sleep for it to all congeal in my head, and the next morning the ghastly truth of my elementary error flooded into my brain unbidden:
I had put the microphone in the headphone jack, and vice versa.
This is why it picked up vague noises, and could hear my comments: headphones will act as passable microphones in a pinch (it's just the same circuit, only being read from rather than written to, if you will). This explained why the noise had gone up but no voice had actually been recorded in the second phase: I'd unplugged the headphones half-way through (since they were useless as monitors), and from then on it was picking up open-circuit noise. And what had probably done it was that the extension cord from the microphone looks a lot like the extension cord that I use at home to plug my audio output into my home-made switch board.
So: save up for a nice Edirol UA-1EX USB adapter, which can do 24-bit, 96kHz recording and playback and has a microphone input, and possibly a nicer microphone stand so I don't look quite like a reject from Woodworking 101 with it. And try to solve the hardware problems in my brain before doing this again.
I'd also managed to combine not preparing my talk until the last minute with forgetting that I had already said I'd go to see "For The Term Of His Natural Life" at the National Library with some friends. And I didn't realise this until I was basically committed to going to the CLUG meeting. So it all felt a bit rushed and unorganised.
To cover my bit of the talk: DAR is an archiver built for the modern world. It compresses, it slices, it encrypts, it allows you to include and exclude files easily, it seeks right to where you need to be on the disk rather than reading through an entire .tar.gz file, and it saves Extended Attributes so your SELinux contexts and Beagle annotations will be saved too. I'll put the slides of the talk up soon, but it wasn't a very in-depth look at the program anyway, so it's not really going to tell you any more than you'd get from reading the man pages.
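To give a flavour of it, an invocation looks something like this - a sketch from my reading of the man page, with made-up paths, so check the options against your version of dar before trusting it with real backups:

```shell
# Create a gzip-compressed backup of /home/paul, split into 700MB slices,
# skipping editor backup files (all paths here are made up for illustration):
dar -c ~/backups/home_full -R /home/paul -z -s 700M -X "*~"

# Restore the whole archive again from the slice set:
dar -x ~/backups/home_full -R /tmp/restore
```

The slicing is what lets you span CDs or DVDs without a separate split step.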
I'd forgotten, when I was standing up, to point out the absence of my laptop case cover. This is because, on this very day, I had picked up six pieces of 0.5mm stainless steel cut to my exacting specifications by a powerful jet of water, and taken them to a place where they can be bent into the right shape to fit in my laminated laptop case covers. I had to leave the case cover with Precision Metals so they could work out how to bend both pieces to the right shape. So I'm very close to actually being able to make them. With my dining table near completion, it looks like I'm actually finishing some of my projects!
Andrew's talk on OpenStreetMap was much more interesting, and he fielded a lot more questions as a result. OpenStreetMap solves the libre problem with the data on Google Maps and similar mapping sites: it's free to look at but not free to use in your own work. With OpenStreetMap you can correct the map if you think it's wrong (leaving aside the 'easter egg' issue), add to it, analyse the map data in new ways, and so forth. There were a lot of very good questions, which I think shows that geodata is one of the hot topics in computing these days.
Finally, I took orders for pizza (a new process to me!), Brad fired up his DVD burner, and we had the regular stand-around-and-talk that seems to be perennially popular at CLUG meetings.
We had a good turn-out, though, and people who had seen my last-minute email had still come along. But the day was really saved when David Collett and Michael Cohen did an impromptu talk about integrating Python with C. Those of us with laptops and internet connections went to the PyFlag code repository that David and Michael and others have been working on, and followed along as David showed us how they'd progressed from using SWIG to writing entirely in C, using the Python integration library to pass data in and out and to call Python methods from within C. David knows his code well, and he and Michael were able to demonstrate all the standard things you need to do to integrate the two languages, as well as why those methods were chosen. I was really impressed by their off-the-cuff presentation and it really saved the night.
And then somehow I found myself explaining all about my sequence counting program, why I'd used C instead of Perl to implement it, and what its limitations were. Though everyone was listening attentively, I was secretly fearing that it was turning into a conversation between myself and Michael, who was doing most of the questioning. And there were a couple of good ideas - things like using array-based trie systems, seeking through Clustal .ALN files, and using buffer systems to break the problem down - that he mentioned that I'll have to follow up. But I'm very annoyed with myself that it turned into exactly what I have felt all along that it should not be (q.v.).
Finally it came down to Owen, Ian, Rhys (or Ryan, I can't remember) and myself talking about esoteric things like Big Bang vs Steady State theory, the four quantum forces and their relative strengths, and the families and groups of the Periodic Table. So it ended on a good note after all.
Firstly, the main focus of the presentation was the PS3. Hugh Blemings and Jeremy Kerr (I think) gave a talk about the heart of the PS3, and several of IBM's large blade servers, the Cell processor. I'll gloss over a lot of the technical detail because it's pretty easy to find, but the key things to me were:
Nick Piggin also gave a talk about his work on getting better performance from NUMA machines. NUMA is Non-Uniform Memory Access, where each processor or core has direct access only to a subsection of the total memory in the system, and has to ask for other blocks to be sent to it if the block it wants is attached to another processor. Blocks that are read-only can be duplicated in each processor's local memory: for example, the pages containing libc. Blocks that are read-write can be replicated while everyone's only reading them, then flushed out and reloaded when a write occurs. So overall this was a night for the supercomputing enthusiasts amongst us (e.g. me).
(Note to self: I need to find a good way to talk to Nick about his presentation style.)
Once most of the presenting is over, the night is given over to eating and chatting. Usually at CLUG meetings this means a pizza feeding frenzy once famously compared to a gannet colony. Last night, however, we also had sushi. This was organised by myself with some assistance from Pascal Klein, and had to be arranged in advance. It was an experiment in alternative foods prompted by Hugh Fisher's talk on Open Source software communities. We had seven people, Hugh and Jeremy included, request sushi; one didn't show up, so despite my asking for an eighth person to join in we still only had seven people sharing the cost. So I got stuck with a bit of the bill, but it was worth it for the quantity of sushi. There was a good variety, of reasonably good quality, and enough wasabi to entirely destroy the sinuses of the entire CLUG attendance for the night. So I think this was a success; I'll do it next time.
I'm still casting around for other cuisines with small, easy-to-eat portions that don't require cutlery, can be sourced relatively quickly, and will hold their serving temperature between pick-up and serving. But sushi twice in a row won't be all that bad...
Addendum: I called my brother up in Brisbane to tell him of the PS3 coolness, and ended up spending more time talking to his friend Nick, who works there. Nick's a Linux user in a crowd of Windows geeks, like myself, so we ended up chewing the fat over processing coolness and Vista badness for a good hour or so. I also passed on to him the news about http://www.debian-multimedia.org - i.e. that it exists. That should save him the agony of getting MythTV to compile...
At the CLUG Programming SIG for February, we had Rusty Russell speaking on his project LGuest, a Linux x86_32 hypervisor system. He talked about his aims (to develop a framework for testing virtualisation in Linux) and how he overcame the various obstacles in his way, such as glibc's use of a segment calculation overflow to store information about the program code. He finished up with some benchmarks and a bit of a comparison between LGuest and the various other virtualisation and hypervisor packages out there - primarily Xen and VMware. There were at least a dozen people there, which was big for a PSIG meeting, and this included two people from ANU who had never been to any CLUG event before but had wanted to hear Rusty's presentation. The talk was very well received, and Rusty gave an excellent presentation in his usual engaging manner. He handled the constant interruptions from food arriving and plates departing neatly, and didn't miss out on his own meal either (this last is important, I think, for a speaker). My only apology to him was my brief and inelegant introduction, but since pretty much everyone knew who he was and what he was doing, I felt words were superfluous :-)
I've been working in the background trying to get various people that I know in the CLUG scene to give talks at the PSIG. There are essentially three things that I want to hear about:
There were two problems with the process, I felt. One was that a discussion like this works with up to about eight people - for a CLUG meeting of twenty or so it sometimes degenerated into a shouting match. I'm as guilty as the rest - I'd stick my hand up sometimes and wait patiently to be noticed, but then five minutes later I'd be calling out amusing comments or counterexamples with the rest.
The second problem was that Hugh's approach was basically to attack FOSS's dogmas and articles of faith. This often ended up with arguments coming from both sides - you can't say the Free Software Manifesto is equivalent to Marxism, and then say that there's nothing wrong with capitalism and proprietary software without ending up sounding like you're arguing about completely different things. And these are also the sorts of declarations that get Open Source practitioners somewhat riled up, which means that they want to go on the attack, which is hard if it's coming from the other side of the political arena.
Personally I don't have a problem with a lot of these statements. The Free Software Manifesto is a lot like classical Marxism - where people get confused is then thinking that it equals communism or Stalinism. Proprietary and free-but-closed-source software has a lot to teach Open Source programmers: Mark told me of two features of a Finder-replacement in Mac OS X which would make Nautilus or its KDE equivalent green with envy. Sure, we can copy features from them just as they copy features from us, but it seems a curious inversion of the "it's worth nothing if it's free" mentality to say that the only software worth using has no cost. I certainly don't mind paying for a game or an application I might use; I know the hidden cost is there that I've been locked into their file formats and so forth. I just factor that into the equation. It's the everyday equivalent of reading and understanding all the EULA stuff.
One interesting topic came up right at the start: for a group of people that prides itself on 'openness' and hates companies and governments putting up barriers to participation, we're an awfully White Anglo-Saxon Programmer group. That night was especially poignant, as we had no women among the twenty or so people there - and that's not uncommon either. As a contrast, even SLUG has a higher proportion of women. What are we doing wrong?
It's not that hard to deduce, when you ask the question 'where are all the beginners?' CLUG is, somewhat unashamedly, a very technical (and technical-for-the-sake-of-it) audience. Some newcomers (e.g. a former acquaintance) are driven away by the sheer technical complexity of the talks; others are driven away by the technical questions launched at the speaker from anywhere in the audience without warning. Others still, I would argue, are driven away by the way little cliques will form and gossip about geeky, technical, 'you have to have been reading the mailing list for three months to understand the joke' stuff given any opportunity - the speaker taking a breath, for instance. All of these and more drive women away - the guy at the back saying "I love women - I appreciate how they look" (or something like that) was just the tip of the iceberg.
I think there's a reasonable middle ground between being patronising and being gruffly neutral in our attempts to encourage women to come along. I think part of the problem is that, just because there are fewer women, we guys feel a bit uncertain. Women don't automatically think that we're talking to them because we're chatting them up or trying to impress them. You can treat someone as an equal without them having to know as much as you do about your little special interest fields. While I fear that the next woman who visits a CLUG meeting for the first time is going to be swamped with people trying to make her feel welcome ("have a chair!" "no, have mine!" "this one's warm!"), I think we may be obsessing over the problem too much to admit that the solution is just to be friendly and supportive - much as we (gender-neutral) all like being treated.
But also, we're going to try organising different meals to the standard Pizza Gannet Feeding Frenzy that CLUGgites call "a good way to wrap up a talk". My proposal is:
: Feature one: tabbed panes, so you can keep multiple places in the file system open and move between them easily. Feature two: a 'drop box' that you can collect files into and then drop into your destination. Saves all that confusing control-, shift-, and alt-left-click selecting in file list windows in order to grab the files you want.
Tonight at the Programmer's SIG we were 'supposed' to be having a sort of round-table discussion, with people with ideas meeting up with people who know how to implement them - or who, at least, have more knowledge of the way Linux is organised and might be able to recommend language choices, libraries to look for and people to speak to. If any of those people had actually turned up, this would have happened. But they didn't.
After the usual early round of "Hey have you seen this cool stuff / weird shit" as meals were served (amazingly quickly, this time), I tried to jump-start the thing by asking what people's ideas were. Maybe it's just me, but this didn't seem to get any real discussion started. Conversation kept revolving around Pascal Klein's idea of rewriting the Linux kernel in C#, and the multifarious reasons why this would be a Bad Thing. As amusing as it is to discuss bad language choices, the things we hate about customers, and what's new on Slashdot, this wasn't really doing it for me as someone who a) has ideas and b) is a programmer.
Despite the good nature of Steve Walsh's teasing, I do worry that I'm talking too much about my own ideas. I say this because we then had a long and quite spirited discussion about how to solve a problem with my backup process. It started with me noting that I'd thought of an improvement to rsync:
At the moment, rsync will only try to synchronise changes to a file if the destination directory has a file with that name. If you've renamed the file, or copied it into a new directory, then rsync (AFAIK) won't recognise that and will copy the entire file again. However, rsync already has a mechanism to recognise which data is the same - it generates checksums for the blocks of each file it examines and only transfers the blocks whose checksums differ. So the idea is for the receiver to check whether it already has a file with those checksums somewhere else. There's more to it than this, but I'll develop that in another post.
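The whole-file version of the idea can be sketched in the shell - here md5sum stands in for rsync's checksums and the paths are a made-up toy example, nothing like rsync's actual block-based protocol:

```shell
# The receiver already has the content under its old name:
mkdir -p /tmp/renamedemo/src /tmp/renamedemo/dst
echo "chapter one draft" > /tmp/renamedemo/dst/thesis-old.txt
# The sender has the same content under a new name:
cp /tmp/renamedemo/dst/thesis-old.txt /tmp/renamedemo/src/thesis-2006.txt

# Checksum the "new" file, then look for that checksum among existing files:
srcsum=$(md5sum /tmp/renamedemo/src/thesis-2006.txt | cut -d' ' -f1)
match=$(md5sum /tmp/renamedemo/dst/*.txt | grep "^$srcsum" | head -n1 | sed 's/^[^ ]* *//')
if [ -n "$match" ]; then
    echo "rename detected: copy $match locally instead of resending the data"
fi
```

The real trick, of course, is doing this per block and without checksumming the entire destination tree on every run.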
This all supports my partner's method of backing up her PhD - every once in a while, she takes all the files so far and copies them into a directory named 'Backup date'. Separately to this, I then rsync her entire directory up to my brother's machine in Brisbane, as an off-site backup. While I'm not especially worried about the time it takes or the amount of data transferred, since rsync's principle aim is to reduce both of these I thought it would be a useful improvement to optimise for the case where a file has been renamed on the client - why transmit the whole file again if you can just copy and delete on the server?
I suppose the thing I enjoyed was the idea of co-operatively solving a problem using the tools at everyone's disposal. Several people suggested that revision control systems would be better in this scenario, because they would only store the diffs and would give instant reversion to any point in time. Others suggested automated folders that would pick up the files in a 'drop' directory, put them in an appropriately labelled directory, and then start a remote copy of the appropriate folder on the remote server. Others again suggested that having two backups was overkill - that as long as I had the remote server updated I could retrieve backup copies should anything go wrong locally. All of these were good suggestions, and even though they didn't really solve the problem the way I wanted it solved, I did appreciate the new ideas and approaches.
That led me to my next question, which was: rsync is a largish and complicated piece of software. The philosophy of Open Source says that if you have an idea, you should modify the source rather than ask someone else to do it; and I can program in C so the source of rsync wouldn't be foreign to me. So where do I start? One approach suggested was to generate a tags file and start tracing through the execution of the main routine; another was to find the printed text messages that are generated at the time that I want my revision to be used, and start reading from there. A further approach was to draw a concept map - sketch out the top-down design of rsync in order to narrow down the code I had to read. All excellent suggestions, and when I have some spare time I shall try them.
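The 'find the printed message' approach is easy to demonstrate on a toy tree (the directory and message below are made up for illustration, not checked against rsync's real source); the tags route is just ctags -R . followed by :tag main in vim:

```shell
# Make a tiny stand-in for an unfamiliar source tree:
mkdir -p /tmp/codespelunk/src
cat > /tmp/codespelunk/src/vanished.c <<'EOF'
#include <stdio.h>

/* the message you saw at the moment of interest */
int main(void) { printf("file has vanished\n"); return 0; }
EOF

# Find where the message is produced, then read outward from that point:
grep -rn "file has vanished" /tmp/codespelunk
```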
Then we had some real nuts-and-bolts stuff; I showed Hugh how to do Doxygen documentation, and Daniel showed me a bit about autoconf/automake and how to integrate them into my coding. He also suggested a technique of checking for the existence of a library at runtime (e.g. libmagic) in order to determine whether we should call the libmagic routines to check file type; unfortunately I can't now remember what this magical call was. I should have been writing this nine days ago.
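(If I had to guess, the magical call was dlopen(3) - but that is purely my guess, not what Daniel actually said. From the shell you can at least sketch the same decision, checking for the library before enabling the feature:)

```shell
# Purely illustrative: decide whether to enable libmagic-based file-type
# detection by checking whether the shared library is installed at all.
if ldconfig -p 2>/dev/null | grep -q 'libmagic\.so'; then
    echo "libmagic available - use it for file-type detection"
else
    echo "libmagic missing - fall back to guessing from file extensions"
fi
```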
It started out not looking so good, but I think it was one of the better Programming SIGs I've been to.
P.S. I've also learnt tonight that, if my WiFi is connecting and then almost immediately disconnecting after showing no signal strength, unloading and reloading the kernel module (after stopping the ipw3945d service) will reset it; starting the ipw3945d service again will get things back on track. Or so it would seem from this initial test.
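For my own future reference, the sequence was something like this - run as root, with the module and service names as they are on my Fedora install:

```shell
# Stop the regulatory daemon before touching the module:
service ipw3945d stop
# Unload and reload the wireless driver to reset the card:
modprobe -r ipw3945
modprobe ipw3945
# Restart the daemon; the network should come back up after this:
service ipw3945d start
```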
It took me a while to work transcoding out, mainly because I didn't understand how things worked. When receiving an analogue signal, you have to run some form of compression because otherwise you'd chew up a 250GB hard disk in about two hours of recording. But with digital TV, you get MPEG2 at either 4 or 8 Mbps for standard definition picture, and probably in the 12 to 16 range for high def. So it's already compressed and just gets put into the file system as is. You still chew through 2GB or more per hour of recording, so at some point you can cut the commercials out of the programme and transcode it down to a smaller size, optionally shrinking the picture size and recompressing the audio as well. Or, if you want to move the data out of MythTV altogether - e.g. because you want to play it on a phone or an iPod - you can use nuvexport or user jobs to automatically make a smaller file with a reasonable file name (rather than something like 1001_200608251730000.nuv). But you do lose a lot of the metadata in the process since that file is no longer kept with the metadata in MythTV.
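The arithmetic behind those figures is easy to sketch (these are nominal stream rates, not measured ones):

```shell
# Storage per hour of recording at a given stream bitrate:
# megabits/sec * 3600 sec / 8 bits-per-byte / 1000 = gigabytes/hour.
for mbps in 4 8 16; do
    awk -v m="$mbps" 'BEGIN { printf "%2d Mbps -> %.1f GB/hour\n", m, m * 3600 / 8 / 1000 }'
done
```

So a 4 Mbps standard-definition channel lands at 1.8GB/hour, which squares with the 2GB-or-more figure once you allow for the audio and a bit of overhead.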
I hope that didn't bore the people who were just wanting a 'spotter's guide to MythTV', but everyone seemed reasonably happy, and after I'd finished my near-interminable rambling we ordered pizza and got back to the important task of getting machines working. Bob and George Bray from the University of Canberra had a great time getting UC's UDP multicast streams of various satellite broadcasts playing in the ANU lab subnet, and I had the great pleasure of watching the irrepressible Tridge and a couple of other guys get my infra-red remote control working (finally). The whole process was done from first principles - find out which card is receiving the IR, which device that corresponds to, how to get lirc to read that device (in my case, with an AverMedia DVB-T 771 card, it was /dev/input/event2 and you need -H dev/input in the lirc options file to get it to read the device correctly) and finally how that plugged into MythTV (or, more correctly, how MythTV plugged into lirc). Now I have a working remote control, and I just need to set up a ~/.mythtv/lircrc file to get the remote to do something in MythTV.
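For when I get around to it, a lircrc entry looks something like this - the button name below is a placeholder that has to match the remote definition in lircd.conf, and the key it sends has to match MythTV's key bindings:

```
begin
    prog = mythtv
    button = PLAY
    config = P
end
```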
I also spent some time with a few people getting them to register for the tvguide.org.au data. Unfortunately it was playing up, no doubt because we had a dozen or so machines going through NAT and appearing as a laptop upstairs. I also helped Rainer Klein get his MythTV database installed, although I kept being distracted by other things, the chief of which was upgrading Nick's machine to Fedora Core 5 and MythTV 0.19. That was a little more hassle than we really needed.
For no apparent reason, running the Fedora update process took the best part of two hours. Theories abounded, but I have no really good idea why; on other machines that I've run it on, it's been much faster. Then the ATI fglrx driver was out of date. In order to upgrade that we had to upgrade the kernel. yum upgrade threw up its hands at a lot of packages, mostly because they were ATRPMS dependencies, so I plugged in my livna and freshrpms setups, removed the worst-offending packages (i.e. half of the MythTV front end, not a good sign), and upgraded. Still no upgraded kernel. Yum was firmly convinced that, indeed, there were no kernels available. Finally, at 11:30, in desperation I scp'd the current version from my work mirror. Success! Everyone - the four people who were waiting on giving me a lift, Bob who had to secure the room, others who were just masochistic, and not least Nick - breathed a bit of a sigh of relief. Then we had a bit more fun trying to get the screen configuration working, but eventually it chose the right kernel and all was well. We left at about 12:15am.
I'm really thankful to Tridge, Tom Ciolek and the other guys who got my infra-red remote working. I'm indebted to the patience of Bob and the others who waited around for this machine to work. I think there was a mutual understanding then that we couldn't just deliver Nick's machine back to him unworking and semi-catatonic and say "Sorry, don't have time now, I'll be back on Saturday, I'll get in touch." I feel that that's something that is deep in the Hacker Ethos - you can't just leave a problem unsolved, especially if someone's relying on this solution to work for their continued health and happiness (or at least domestic harmony).
I turned up at 10:20AM outside the right building. I waited until 10:50AM before I set off - alone. I cycled as casually as I could up through Turner, across the barren wasteland that is the GDE, and arrived at Haydon Drive to find Dileepa waiting for me (as he said he would be). We then rode on, swapping Sun stories, with me nearly falling off my bike at one point. We arrived at the designated picnic spot at 12:00. I ate lunch (Dileepa had had a large breakfast) and we met up with Rainer Klein (who had gone kayaking) at 1:00. I then headed over to Das Kleinhaus and Dileepa headed back to Macquarie.
Steve Hanley organises a ride on a rainy day and gets half a dozen people; I organise one on a beautiful fine day and get two. Yay for me.
The second one, by Pascal Klein, was about the Tango Project, which is a project to create an icon set under a Creative Commons license, so that a consistent look and feel can be applied to GNOME, KDE, XFCE, and (if Pascal gets his way) XGL. Too many old-school hackers deride anything more complex (or simpler) than a command line as dumbing things down, usually in the same breath as they whine about how proprietary operating systems are taking over the planet. You cannot overstate how valuable it is - both for new users and old - to have a consistent interface. The same command-liners will probably cringe when you take their beloved emacs away from them and give them an ordinary GUI text editor, because it doesn't have their favourite alt-left-shift-control-spoon key combination for correctly indenting XML in a boustrophedonic environment. That's called the interface, you morons. Get with it.
The third, 'unofficial', presentation was by Chris Smart, showing off Kororaa and XGL. Heaps of funky stuff, some borrowed from Mac OS X, some completely new. Pascal made himself dizzy by holding down Ctrl-Alt-Shift-Right-Arrow and watching the cube of workspaces whizz around before his very eyes. Hopefully it will support i810 integrated graphics, because that's what Pascal's new laptop is going to have, and he's going to be a very disappointed boy if he doesn't get shiny and whizzy. Actually, I should lay off Pascal, because he copped enough stick from Steve over his double-edged-sword work with Mark Shuttleworth. But we do need to register www.ucultu.com.
The thing I wanted to note here is that one thing I'm worried about with things like XGL is that we're just going to have rubberised windows as the only behaviour because it's whizzy enough. I think there are a lot of ways of making things behave on a desktop, and I think Linux is all about choosing what behaviour you want. Just on the issue of window moving, I see several more ways to make windows behave as you move them around the screen:
All posts licensed under the CC-BY-NC license. Author Paul Wayper.