Building a new lab server, Ubuntu of course

Postby Doug Coulter » Sun Sep 18, 2011 12:46 pm

I thought I'd detail the build process of a new server for the lab here. While I'll spec the hardware, this thread will mostly be about getting the software set up the way I want it. Still, it's nice to know what we're working with.

I have a Core 2 Quad at 2.83 GHz and 8 GB of DDR2 memory floating around from a mobo (Asus P5QL/EPU) that had gone bad -- it had that known issue with the Intel chipset SATA controller failing, two boards' worth. Being a cheapskate (kind of), I wanted to reuse those, and having already used up the box they were originally in for something else, I had to buy a few more parts.

I got an ASUS P5QL-VM DO mobo. This one is a lot simpler and lower power than that bleeding-edge thing that failed -- no big copper fluid-cooling pipes for the chipset required here. It was, according to PCLand, about the last Core 2 board on earth with 4 memory slots. It has onboard video, but I wanted to run dual monitors, so I added an MSI video card, M8400 series (Nvidia chipset, which seems to be the best supported under Linux).
I bought an Intel SSD, 160 GB, to be root (more than required, but what was in stock). For the directories that get a lot of writing, I got a Seagate Barracuda 7200 rpm SATA drive, 2 TB (also more than needed, or so I hope).
That's all the hardware for now. I put it in a cheap generic mid-tower case with a 350 W power supply. The case has a bunch of USB and audio jacks prominent on the front, which I like and which will come in handy for data acquisition around the fusor if I ever have to pull the machine right on its rack (because eventually, some stray HV will toast that one).

I of course used another machine to download and burn an ISO of Ubuntu 10.04.2, 64-bit. I have slightly mixed feelings about 64-bit Linux at this point, which I'm also running on the *really* fast machine. This is largely due to Adobe Flash and video drivers not being ready for prime time, plus other slight issues -- session memory gets lost in the terminals, and other little annoyances. Since this install is fresh and easy to back off from, I may revert to 32-bit, which works perfectly and, with PAE, allows the use of all that RAM -- but only in chunks of 4 GB or less per process. That would seem good enough for almost anything, but you never know with realtime data aq.

First move was to partition the magnetic disk into 3 partitions, to be mounted over /tmp, /var, and /home. I used 20 GB for /tmp, 40 GB for /var, and the rest for /home, which also holds a network-shared directory I always call "pub" and make "promiscuous", to be able to move things around the network and for cross-machine backups. Doing this, and mounting the SSD over root, means that most of the writing that goes on during normal use gets pushed onto the magnetic drive, not the SSD, so it won't wear out. I may also disable journalling on the SSD to reduce writes to it and keep the speed high.
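For the record, here's a sketch of roughly what the resulting /etc/fstab entries look like -- the device names, sizes, and options are examples, not gospel; yours will differ:

    # SSD as root; noatime keeps routine reads from causing writes
    /dev/sda1  /      ext4  noatime,errors=remount-ro  0  1
    # the write-heavy stuff lives on the Barracuda
    /dev/sdb1  /tmp   ext4  defaults  0  2
    /dev/sdb2  /var   ext4  defaults  0  2
    /dev/sdb3  /home  ext4  defaults  0  2

If I do kill the journal on the SSD, it's one command from a live CD, since the filesystem has to be unmounted at the time:

    sudo tune2fs -O ^has_journal /dev/sda1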

This is a very fast machine. It's only barely noticeably slower than the fastest one here, which is a 3.5 GHz i7 with 12 GB of RAM, the same SSD-plus-magnetic-disk setup (more or less), and a super video card (getting into the big-fan sort, lotsa CUDA cores). I'm hoping that despite latency due to what we used to call the "pentium pause", it will keep up with some serious high-rate data aq. I don't think having only half the virtual cores will be a big deal, but we'll see. Linux does a real decent job of splitting up multiple processes across multiple cores, so they load up nicely in most usage situations, particularly if you plan for that when organizing the code package you'll be running and needing great performance out of.

The "pentium pause" seems to be something that happens to highly loaded intel cpus -- they auto throttle down without any warning when they get hot, and seemingly just go off on some demented errand of their own for milliseconds on end. This is why PC's stink for data aq with hard realtime deadlines, and why we build devices to buffer data a little bit on the way into the PC to overcome that issue.

The install went slick, as usual. Don't even think about installing Ubuntu on a machine that isn't hooked to the internet -- it's a waste of time, and it'll just require more time later to get the latest-greatest everything when you do finally get it connected. Once you have it basically in, invoke Update Manager (in the administration menu) until you've got all the possible updates. They're all good, but sometimes you have to run it more than once, since some updates depend on earlier ones. Then check again after installing any other software. Once set up, this is automatic from then on, and it works pretty sweetly compared to "those other opsys" that go off and eat bandwidth whenever *they* please, not caring if they're sucking bandwidth (and CPU) you need for the actual job at hand, and then refusing to let you shut down until they're done "phoning home".
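The terminal equivalent, if you'd rather skip the GUI, is the usual pair -- run it again if the first pass pulled in updates to the updater itself:

    sudo apt-get update     # refresh the package lists
    sudo apt-get upgrade    # install everything that's pending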

First thing to do -- get those darned window buttons back on the right side. I don't have iOS envy like the later output of the Ubuntu team seems to have; I just want an opsys to lie there quietly, no eye candy, no fancy wiggly windows -- just load my programs. I did the procedure linked here to get that done.
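In case that link rots: on 10.04 the whole fix boils down to one gconf key. This is the commonly posted recipe (a sketch, but it's the essence of what that procedure does):

    gconftool-2 --set /apps/metacity/general/button_layout \
        --type string "menu:minimize,maximize,close"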

Of course I'll be wanting a bunch of other software on this machine -- MySQL, VirtualBox, a buncha perl modules to support my own coding, and so on. Procedures for getting them all in and set up "right" will follow on this thread, along with any tips I (re)discover on the way. File sharing has become all too "interesting" on Ubuntu. It used to be you'd just install Samba, use all the defaults, maybe add a share in the .conf file, and you were done. It's not that simple now, and it's a long story how to get things working as well as they used to. They've added some "simplifications" to the defaults that assume you've got some always-on machine to be a domain server, which might not be the case -- I do a lot of peer-to-peer stuff, with no always-on machine wasting power just to make it possible for two other machines to share. Thus, I consider their new simplified "improvement" that requires one a bug. Further, the new default smb.conf just ain't right even if you do put in the full version; we had to tweak an older one to get things really right, and copy it around.
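To give the flavor of what "really working" looks like, here's a minimal sketch of the peer-to-peer style share in /etc/samba/smb.conf -- the path and workgroup are examples, and the old-style "security = share" mode is the part the new defaults dropped:

    [global]
       workgroup = WORKGROUP
       security = share              # peer-to-peer, no domain server needed

    [pub]
       path = /home/doug/pub         # the "promiscuous" share
       guest ok = yes
       read only = no
       create mask = 0666
       directory mask = 0777

Then restart Samba (sudo /etc/init.d/samba restart, or the smbd service on newer versions) and the share shows up on the network.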

I should make a list to save me time next time. Oh, I AM making a list!
Posting as just me, not as the forum owner. Everything I say is "in my opinion" and YMMV -- which should go for everyone without saying.

Re: Building a new lab server, Virtualbox

Postby Doug Coulter » Sun Sep 18, 2011 1:40 pm

The first thing you need to do is go here: http://www.virtualbox.org/wiki/Downloads
and pretty much follow their directions. There are some caveats, however.
The main one is that you should just right-click on both the key file and the USB extensions/upgrade file and download them as files. The fancy auto-open stuff DOES NOT WORK.

I cut-pasted the appropriate repository into Synaptic package manager. You get there by calling it up from the system/administration menu in Ubuntu, then clicking settings/repositories. That's where you add the repository and the key file. At that point, you can either search for VirtualBox in Synaptic (after reloading) or bring up a terminal and type what they tell you to (which is more entertaining and gets the very latest). Once you've done the install, go to system/administration again, and under the advanced button on your user, add (at least) the ability to use VirtualBox, or things will fail in dumb ways. You might also want to add yourself to some other groups at that point for other things. I usually give myself a lot of privileges, since I'm not a click monkey and won't break things too easily.
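The terminal version of all that, for 10.04 ("lucid"), looks about like this -- a sketch only; take the exact repository line and package name from their download page:

    echo "deb http://download.virtualbox.org/virtualbox/debian lucid contrib" | \
        sudo tee /etc/apt/sources.list.d/virtualbox.list
    wget -q http://download.virtualbox.org/virtualbox/debian/oracle_vbox.asc -O- | \
        sudo apt-key add -
    sudo apt-get update
    sudo apt-get install virtualbox-4.1   # or whatever the current version is
    sudo usermod -aG vboxusers $USER      # same as the users-admin checkbox; log out and back in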

Once you've installed VirtualBox, navigate to downloads and double-click the extensions file, and let VB install it. It will come up with the usual annoying license stuff -- if you don't see that, it didn't work. Without this, most things will fail and your VB clients won't have USB or much else, so it's important to get this right.
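There's a command-line route too, if the double-click misbehaves (the file name is whatever you downloaded):

    sudo VBoxManage extpack install Oracle_VM_VirtualBox_Extension_Pack-*.vbox-extpack
    VBoxManage list extpacks              # confirm it actually took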

At this point, you can create new VB machines and install to them just like you would blank hardware -- not much noticeable difference in what you do. You'll be exploring the menus for machine, devices, help and so on, but it's pretty straight-ahead for making a new Windows or Linux box in most versions. The usual deal is you decide how much disk and RAM to give them, and how many CPUs. Remember that the RAM comes out of your main system RAM in one big chunk -- it's not dynamic -- so save some for the host and the other VB machines! I don't make these system disks too big, and I make them dynamically expanding virtual disks, because you'll be making copies of them or exporting them. For bulk storage, it's better to use the VirtualBox provision to make a "fake" shared folder back to the host machine, or simply use the network sharing it makes possible for the huge stuff. As it is, copying any VB file to another machine is basically only possible over the network, since the files are far too large for FAT32 to handle, and trying to make a USB stick that uses ext4 (or any real filesystem, even NTFS) pretty much bricks the stick, so don't waste time trying that.
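The same setup can be scripted with VBoxManage; here's a sketch (machine name, sizes, and paths are examples):

    VBoxManage createvm --name winxp --ostype WindowsXP --register
    VBoxManage modifyvm winxp --memory 1024 --cpus 1       # carved out of host RAM, so be stingy
    VBoxManage createhd --filename ~/VMs/winxp.vdi --size 40000   # MB; dynamic, grows only as used
    VBoxManage storagectl winxp --name SATA --add sata
    VBoxManage storageattach winxp --storagectl SATA --port 0 --device 0 \
        --type hdd --medium ~/VMs/winxp.vdi
    VBoxManage sharedfolder add winxp --name pub --hostpath /home/doug/pub   # the "fake" shared folder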

Once you've installed some opsys in a new VB machine, install or re-install the guest additions before doing anything else. Nothing will work as well or as correctly if you don't. If you're doing a Windows install, be sure to turn off auto-update as far as you possibly can -- it is super annoying, maybe the main annoying Windows problem left once you get things slick enough to run Windows in a VB window. It will start massive downloads, phone home to MS (not sure what it's telling them, but it sends plenty of data back there), and not let you shut down until it decides it's done enough of that -- if it's started a huge download of a service pack, you're not going anywhere for a while; there's no way to stop it short of a hard "reset" of Windows, which has other liabilities. There are no such problems with the Linux VB machines -- they have good manners, ask first, and let you cancel something taking too darn long.

You will, of course, have a ton of fun making USB filters, mapping devices on and off any VB machine. This is best done with VB running but the client machine itself stopped, so you can use its settings GUI to set things up. Blank USB filters can give you issues with Windows clients going into a tizzy finding drivers for stuff you'd just as soon it never see (like hardware I'm developing right now), so beware. There are various ways to find the details needed to set up device-specific filters. One of the cool things here is the fine-grained control you have over what the client virtual machine can "see" -- you can make Windows utterly safe by simply not giving it a network connection, for example... after all, there's not a heck of a lot of reason to give it one if you're running it as a virtual machine, or at least not all the time. Ditto CD drives, serial ports and anything else. VB does its own keyboard and mouse thing, so don't let your client machines see those directly -- disaster awaits when they capture those and you can't get back to the host!
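Finding those details is easy from a terminal, and the filter itself can be set there too -- the IDs below are examples; use what usbhost reports for your gadget:

    VBoxManage list usbhost               # shows VendorId/ProductId of everything plugged in
    VBoxManage usbfilter add 0 --target winxp --name "my counter" \
        --vendorid 04d8 --productid 000a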

Otherwise, I've found this a real boon. Backing up a whole machine is as simple as exporting the "appliance", or using the more advanced features of VB that take "snapshots" and keep various older versions around for backup if you do them right. That's all in their help dox. Being able to have a machine NOT see anything you don't want it to see is a security boon, and makes that stuff available to the host or another virtual machine. It's really pretty nice once you find how you want to work with it. And a hell of a lot cheaper than VMWare or most other things that try to do this job. For now, though, remember that the litigious Oracle has bought this... who knows where they'll take it, but their track record is one of squeezing customers for all they can.
Luckily, this and MySQL, also now at Oracle, work great as-is -- there won't be a need to upgrade anytime soon, unless an opsys comes out that you want to run and it depends on something like "trusted platform hardware" that VB doesn't emulate now. I'd suggest simply skipping any of those; they'd have to offer something awfully compelling to make giving the opsys vendor your trust worth it -- and that's what they mean by that term: not that you can trust your machine, not hardly. It's hooks into your machine so that THEY can trust it in some sense. Theft of control and your privacy is what they have in mind -- DRM to the nth power.

So, on this install, I didn't bother to create a new virtual machine. Rather, I exported the WinXP machine I use for standard counter development and brought it into the new server so I could do that work there too. Another huge advantage: that particular Windows has everything under the sun on it already (Firefox, the CCS compiler, the driver for the Pfeiffer mass spectrometer, and so on and so forth). Simply importing this to another machine, rather than building it all again from scratch, is an ENORMOUS savings of time, and great flexibility. And it was only a 12 GB file copy, even though I made a 40 GB virtual disk for the original install -- the exported "appliance" doesn't take up any space the install didn't actually use. Verrrrry slick.
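The round trip is two commands (the machine name here is mine; yours will differ):

    # on the development box
    VBoxManage export winxp -o winxp.ova
    # on the new server, after copying the file over the network
    VBoxManage import winxp.ova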

Re: Building a new lab server, basic additions

Postby Doug Coulter » Sun Sep 18, 2011 2:31 pm

As shipped, a fresh Ubuntu lacks some things I almost always want or need, so we go out and add them. There's only so much you can cram into a CD ISO, after all, and this way you get the latest and best of everything.

MySQL -- I use this more and more, and it's nice to have it on more than one machine. I just put it in using Ubuntu Software Center; pay attention during the install and set and confirm a root user password when you do. You'll also want MySQL Query Browser and MySQL Administrator, so get them too -- they make it easy to manage a bunch of SQL databases no matter which machine on your network they live on.
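From a terminal instead, it's one line each -- the GUI tool package names here are what 10.04 calls them, so check if you're on something else:

    sudo apt-get install mysql-server                      # asks for the root password mid-install
    sudo apt-get install mysql-query-browser mysql-admin   # the two management GUIs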

Gnuplot -- a must-have for plotting; it saves a ton of work writing your own plot code. Though it has its own arcane CLI, you don't have to use it directly. It's actually easier to learn some perl and drive gnuplot from there, using perl to do any fancy separating of data from comments in files and any reformatting. This uses each tool for what it does best, and avoids having to learn as many "languages", since the gnuplot one is all its own (and only good for gnuplot, whereas perl is handy all over the place).
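Here's a taste of that division of labor -- a minimal sketch using the Graphics::GnuplotIF module from the perl additions listed further down. The log file name and format are made up for the example:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Graphics::GnuplotIF;

    # perl's job: pull the numbers out, skipping comment lines gnuplot would choke on
    my ( @t, @counts );
    open my $fh, '<', 'counter.log' or die "can't open log: $!";
    while (<$fh>) {
        next if /^#/;
        my ( $time, $n ) = split;
        push @t, $time;
        push @counts, $n;
    }
    close $fh;

    # gnuplot's job: draw it
    my $plot = Graphics::GnuplotIF->new( title => 'counts vs time', persist => 1 );
    $plot->gnuplot_plot_xy( \@t, \@counts );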

Octave -- the Linux version of MATLAB; some people think it's cool, and it ties in nicely with the scope capture tools I'll post up here for Tek or GW Instek scopes. Those tools went "off the web", evidently due to someone feeling they owned copyrights to the scope APIs and comm protocols, but I snatched a copy for everyone who has a digital scope and wants to do some very nifty things with it and the PC. Octave has endless add-ons. It's another "world of its own" when it comes to learning its peculiar language, and since I can program my own stuff, I've not yet bothered to learn it much. Like some CAD/CAM tools, they seem to expect that's it -- you spend every waking hour using their arcane tool and syntax rather than having a life, and so they see no problem with inventing some new language tailored to their thing alone, useless knowledge anywhere or anytime else. OK, if you think that's easier than using a perl module to multiply matrices, I'm not going to argue, but I find other approaches workable myself for most things.

So here they are, the GDS-2000 tools. Unpack them in your bin directory, and follow the directions to get fast screen captures and so on from the scope.
gds2000tools-0.14-source.tar.gz
Tools for Tek and GW Instek scopes. Really neat data capture and scope-setup stuff, and not hard to use.


You'll likely want gtkterm -- it's the Linux equivalent of HyperTerminal for Windows (but I like it much better, as it can do cooler things). It will connect to any serial port or stream of bytes, like USB serial devices, save files for logs -- all kinds of worthwhile stuff. We use it for looking at and debugging things like the standard counter. While we wrote some fancier software that plots, this is all you really need to get going with things like that.
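Getting it and aiming it at a gadget is quick (the device path and baud rate are examples for a USB-serial widget):

    sudo apt-get install gtkterm
    gtkterm -p /dev/ttyUSB0 -s 9600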

You have to have the GIMP. This is the Linux version of Photoshop. Yeah, it works differently -- knowing how to work one doesn't translate well to the other -- but it's good and has a ton of worthwhile features. Every photo I post has gone through it: adjusting exposure, sharpening edges, reducing noise, scaling with really good quality, tuning compression for the best tradeoff between size and quality. You name it, the GIMP is your tool for pictures.

To run some cool perl stuff I'm writing, you'll want some additions to perl. Here's a list -- just type them into the search in Synaptic, or more simply, Ubuntu Software Center, and click install on them. This is not as much fun as doing it the he-man command-line way, which exposes you to all the tests and demos of the fancy graphics stuff, but that's a lot of typing (for me to explain, not to do).

libgraphics-gnuplotif-perl
libgtk2-perl
libgtk2-gladexml-perl
libdevice-serialport-perl (the Debian package for CPAN's Device::SerialPort)
libglib-perl

These are needed to use the (fairly easy) Glade GUI designer with perl (if you're going to use Glade yourself, get it too), so perl can do real GUIs with some complexity, talk to gnuplot as though it were a subroutine, and use serial ports. In my install, the stuff for perl to talk to MySQL came along when I put in the SQL server, so no need to address that. All the above are needed to run our standard counter perl support stuff, and anything we'll be doing in the future (and we have a lot in mind already off this basic platform). So, while it's a pain, it's worth it -- it opens up a world where your PC does what you want (for free, otherwise) in terms of being useful in the lab. And by the way, this allows you to write a single perl program with a GUI that also runs in Windows with no changes, though you have to set up Windows with perl (free) and all this other stuff (also free).
It's more work to set up Windows to run all this, as less of it is there already and the pickings for Windows are a little more scattered around the internet, but it is possible, and it does work once you jump that hurdle the first time.
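To show why the serial module is on the list, here's a minimal sketch of poking a gadget like the counter from perl -- the device path and the "R" command are made-up examples:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Device::SerialPort;

    my $port = Device::SerialPort->new('/dev/ttyUSB0')
        or die "can't open port: $!";
    $port->baudrate(9600);
    $port->databits(8);
    $port->parity('none');
    $port->stopbits(1);
    $port->write_settings;

    $port->write("R\r");                     # hypothetical "send me a reading" command
    sleep 1;                                 # crude, but fine for a demo
    my ( $count, $line ) = $port->read(255); # grab whatever came back
    print "got $count bytes: $line\n";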

Now, once I have all this stuff, I like to put a few things in the menu bar for easy access. Terminal -- you bet. Calculator, gedit, gtkterm, Tomboy notes (or Gnote), easy scan, system monitor, maybe even a VNC viewer (for running computers remotely and getting their desktop locally in a window). This is done very easily by right-clicking on the menu bar and following the directions. After installing, all this stuff will also be in the Linux equivalent of the "start" menu someplace, so you can just select from there to get it happening. FWIW, I also reverse the taskbar and menu bars in Linux, to have the "start" at the bottom and the tasks at the top, autohide of course. That's also easily done via right-click on the bars. To swap them, move one to a side first -- it saves some ugly messing around with both bars on one edge of the screen (one hiding the other) at the same time. I *think* I like Gnote better than Tomboy, as it's written in real C, therefore quicker, and it doesn't depend on Microsoft not suing over the use of the .NET emulator Mono on Linux. But that's me; either is pretty cool and very handy in organizing "having a life".

You will also want at least some key scripts for Nautilus, which is the Linux version of Windows Explorer. What these give you is whatever they do when you select a file or directory and right-click it -- there will be a "Scripts" entry in the context menu you can select them from. While I'm not averse to the command line, it seems terribly stupid to have navigated to some complex subdirectory in the GUI, then have to type for two minutes to navigate a command-line shell to the same place, and then maybe sudo (get root) gedit to edit some privileged configuration file. These scripts eliminate that junk: you can just right-click something and select "gedit as root" -- bam -- best of all worlds.
NScripts.zip
Put in /home/youruser/.gnome2/nautilus-scripts

Think of this as a power tool for system work. 99% of making Linux do as you want involves editing about one word in some privileged file in /etc -- this makes that easy, along with various other tasks I do all the time. You have to navigate to your home directory and select "show hidden files" to see the .gnome2 directory so you can do this. Just put the scripts from this zip in that dir, and they appear on right-clicks in Nautilus as if by magic. Once you know this stuff, it still seems simpler than the windoze registry.
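If you want to roll your own, the whole trick is a tiny shell script dropped in that directory and marked executable (chmod +x). Here's a sketch of "gedit as root" -- Nautilus hands the selection over in an environment variable, one path per line, so this simple version chokes on paths with spaces:

    #!/bin/bash
    # open the selected files in gedit with root privileges
    gksudo "gedit $NAUTILUS_SCRIPT_SELECTED_FILE_PATHS"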

More later. Since this new machine is right next to a big-screen monitor (36", big for me) and a killer multichannel shop stereo, I'm going to set the network up for 1 Gb/s (I have to plug around a 100 Mb switch) and transfer my huge audio and video collection to it this afternoon while football is on. It's "only" a few hundred gigabytes, so I suspect that moving a cable for a bit to make the transfer faster will get it done today instead of this week. And it's always nice to have things backed up on more than one box. It literally took a couple of weeks of all-day sessions to rip all those CDs (thousands of them, many I've not even heard yet, as I inherited them due to deaths in the family).
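For the bulk move itself I'll use rsync over the gigabit link (paths and hostname are examples) -- it can be stopped and restarted without recopying what already made it across:

    rsync -av --progress /home/doug/music/ newserver:/home/doug/pub/music/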

Another goal of this build is to have a fully capable machine that has "everything" down in the shop. The idea is to spend more time down there doing real things (to then post up here) rather than sitting on my nice comfortable couch harassing the forums and just trading stocks. Better all 'round, and I can use the exercise. Some of that I'll be able to post here -- for example, I'd bet some would find my set of Firefox bookmarks extremely useful. But of course, I'll strip them of auto-logins to trading and bank accounts first! It can be real handy to have that 36" LCD you can see from across the room to monitor fusors or stocks while you work the lathe and so on. Maybe I'll even get from chaos to "creative" chaos down in the lab again!