Ubuntu and LTSP.

Linux Terminal Server Project.

Has anybody tinkered with this? I was recently asked to figure out how to add Ubuntu to our domain, with an eye toward slowly porting over to it to save licensing costs, and in the process I stumbled across LTSP, a thin client package for Linux. On a hunch, I fired it up on a spare desktop I had to see how it worked. I used the Edubuntu DVD to install; since we're an educational institution, I figured the extra software could benefit us. Not to mention, LTSP comes as an installable option in the Edubuntu installer. Installing LTSP was... no lie... a single checkbox during the install. I was SO surprised at how easy this was to set up.
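For anyone not going the Edubuntu route, my understanding is you can get roughly the same result on a stock Ubuntu server with a couple of packages (a rough sketch, not the exact checkbox route I used; the standalone package pulls in its own DHCP server, plain ltsp-server does not):

sudo apt-get install ltsp-server-standalone openssh-server   # or plain "ltsp-server" if you don't want its DHCP server
sudo ltsp-build-client --arch i386                           # builds the client chroot under /opt/ltsp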

Of course, once it's set up there are some config files sitting around if you want to customize the install further. For example, I did not want the LTSP box acting as a DHCP server since we have a Windows server handling that task, so I had to remove the service and edit the network config for the new IP address I wanted - we run a 10.x.x.x network and by default LTSP comes set up for 192.x.x.x. With a few DHCP entries on the Windows server so it would hand PXE booting systems off to the LTSP server to complete the boot process, we were golden.
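On the LTSP box itself, that boiled down to something along these lines (the addresses here are just placeholder examples, and the DHCP package name varies by release - dhcp3-server on older Ubuntu, isc-dhcp-server on newer):

sudo service dhcp3-server stop
sudo apt-get remove dhcp3-server        # or just disable it and leave it installed

# /etc/network/interfaces - give the server a static address on the 10.x network
auto eth0
iface eth0 inet static
    address 10.0.0.50
    netmask 255.255.255.0
    gateway 10.0.0.1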

I was skeptical of its performance, because I have seen several other thin client based systems - everything from Server 08 to Citrix to NComputing. I have yet to see one of them earn favorable reviews, to put it extremely lightly. Because of that, I instantly called "BS" on any thin client system I ran across. The exception, of course, is the department store thin client running its reliable old DOS-looking program that's so small there's barely anything to go wrong.

To my surprise, I was able to run 10 low-spec systems from my laptop. A quick check of local resource usage on the thin clients showed they were using very little local processing power; most of the work was being done on my laptop. My laptop is a Core 2 Duo with 4GB of RAM. While I was using nearly all of my RAM with a bunch of programs open across those 10 systems, my processor was doing just fine, which surprised me a little. On each box I had GIMP, OpenOffice, and several instances of Firefox open. On each box I pulled up graphics-intensive websites and hit Enter on all of them in a row. And still, it trucked along.

We also have one particular model of computer that for some reason just runs dog **** slow - I had pretty much given up trying to help these HP boxes. It's not just one of them, it's nearly all of them. Even when HP came out to warranty other, newer systems, they made interesting comments about these boxes, indicating they more or less "felt our pain." It's a 512MB box with a lower-end processor, but I've seen similarly specced systems run fresh installs of XP with complete ease. These boxes are pretty much vanilla XP installs with Firefox and OpenOffice. I have NO idea what it is about this hardware, but they are so slow that people at work avoid them. Painfully long login times, network slowness, sluggish program launches, etc. Of course, I had to give it a shot, so I pointed a few of these systems at my laptop... Instant logins, instant application starts, no bogging down while web browsing, etc. I am now confident we can save money and pump life into these old systems that would otherwise sit unused until the end of time.

There's still a lot to do and more testing to be done, but so far I am extremely pleased with the performance of LTSP. Between the ease of setup and the performance on low-end hardware, this could easily be the money-saving option for the future. I had no clue LTSP existed until two days ago, and I have already set up three test server environments without issue. That being said, has anybody ever used LTSP before? I'm curious what other users have to say about it.

For what it's worth, something completely awesome about this setup is that it relies on PXE boot. Anything that can boot a Linux kernel can act as a thin client, whether it's a desktop computer, a netbook, or an actual low-powered thin client box such as the Asus Eee Box. No proprietary vendor lock-in, and nothing spent on experimental testing as long as you use existing old hardware to give it a shot.

That being said, this is my first impression review. More to follow as (and if...) we introduce this to some of our lab environments.
 
I guess I always assumed that Linux had something like this. I've never seen one installed and running, though.

So basically you're just PXE booting and it connects right to the TS? No initial O/S to load or client viewer like XenDesktop or VMware View?


I'd be suspecting the hard drives in those slow computers. But since it's really a non-issue now, with all the work being done on the TS, I wouldn't waste any time on it.

Hope this new setup works out good for ya. It's always great when you come across something new that could potentially save a lot of money. Especially when it comes to licensing costs!
 

Yup. It's all done via PXE boot. The only semi-confusing part was that I did not want the Linux box to be a DHCP server, which threw me off on how I could possibly make this work. Fortunately, I RTFM'd and found an entry about integrating it with a Windows DHCP server. Three lines later and blam, I was in business.
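I don't have the exact lines in front of me, but it amounts to pointing the PXE scope options at the LTSP box - something like this on the Windows DHCP server (the scope and server IPs here are placeholders, and the boot file path may differ on your build):

netsh dhcp server scope 10.0.0.0 set optionvalue 066 STRING "10.0.0.50"
netsh dhcp server scope 10.0.0.0 set optionvalue 067 STRING "ltsp/i386/pxelinux.0"

Option 066 is the boot server host name and 067 is the boot file name; you can set the same two scope options through the DHCP console instead of netsh if you prefer.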

I find it absolutely awesome that it works off a standard PXE boot and that testing it out costs literally nothing. This could be huge for us, and I'm very excited to see if it takes off. The lack of proprietary vendor lock-in is very attractive. Another nice thing is the scalability: any box that can PXE boot (or any dedicated thin client you deploy) can connect to it. On some thin client setups you need to re-route the cabling in the server room so they're all on one switch, etc. Not here. PXE boot, what's up LTSP box, I'd like the Linux image. Oh thank you, image received. Hello there, login screen. Time elapsed - seconds. Time to get to work. Love it.

Secondly, *IF* this tanks on us for any reason, we simply reboot the systems, select the local hard drive, and blam - we're back on the local OS as if nothing ever happened. This is a great way to bounce back if things go south, so we can test with confidence knowing there's little to no downtime to switch back quickly until we can optimize it for our setup. Once we get it locked down, we can move forward with permanent setups.

The other key thing is finding Linux alternatives to the software currently used on the Windows side. Since we already utilize a TON of open source software, and considering the potential cost savings here, I'm hoping the higher-ups will work with us and bend a little to consider alternatives for the very few pieces of proprietary software we still have lying around.

About the HP boxes - I have no doubt at this point that it's the hard drives. I know 512MB of RAM isn't "a lot," but in no way, shape, or form should XP suffer the way it does on those boxes. They perform like Vista would on 256MB, which is a grade-A recipe for a bad joke.

I'll keep you posted as new changes arrive. Currently I'm trying to customize the login screen so it shows our mascot and the name of the school, just to personalize it a bit. Since the actual install is done, all I have left is to streamline it a little and give it some in-house eye candy. ;)
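From what I've pieced together so far (paths and names from memory, so double-check them on your own build), the plan looks roughly like this: copy the stock LDM theme inside the client chroot, drop in our own background, point lts.conf at it, and regenerate the boot image.

# inside the client chroot - copy the current LDM theme and swap in our own artwork
sudo cp -rL /opt/ltsp/i386/usr/share/ldm/themes/default /opt/ltsp/i386/usr/share/ldm/themes/ourschool
# ...then replace the background image in the new directory with the mascot graphic...

# /var/lib/tftpboot/ltsp/i386/lts.conf
[Default]
LDM_THEME = ourschool

# push the chroot change into the image the clients boot from
sudo ltsp-update-image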
 
Heh, don't ya love it when everything just comes together that easily? All those hair-pulling days of stupid stuff going on and BAM, you're rewarded with a very plausible deployment that cost nothing but your time.

Finding alternatives for software can be a major pain in the you know what sometimes though.
Wouldn't WINE help with the software that might not be replaceable, though? I've never used it myself, but I've heard it can do wonders for Windows-based software...
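From what I've read, the basic routine is supposedly as simple as something like this (the app names and paths here are made up):

sudo apt-get install wine
wine /path/to/SomeInstaller.exe                           # run the Windows installer
wine ~/.wine/drive_c/Program\ Files/SomeApp/someapp.exe   # then launch the installed app

Whether a given program actually behaves under WINE is another story, so it'd need testing app by app.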
I know users will give me reeeeally bad dirty looks (and not the good kind :sad:) when I change their software. If they can keep the same stuff, then it's no doubt a bonus.
However, if you can save the money...fricken go for it. Opens the budget to more goodies :lol:

Haha, and no doubt XP can perform on 512MB. I've done it myself many times. Heck, if you had 1GB back in those days you were laughin'... wow... can't believe I just wrote "back in those days" about XP...

Keep posting about the process - I'd like to know how this goes. I'm already looking for solutions for 2013 when my PC leases are up: either desktop virtualization or TS to save on hardware costs at the user end.
 
Today I put a small server in place (dual-core proc, 4GB RAM) in the library to use with 10 very slow systems that I cannot get to run any faster (note the HP comment above). I showed the librarian how the interface is different and how to do various tasks. Within a matter of minutes we were done and she had no questions. I fired up the systems and had them all running tasks to show her how the setup worked. She seemed very eager to put it to use, as the systems had been *that* slow that nobody was using them anymore. This is certainly a great way to breathe new life into older hardware.

Also today, my boss spoke to me about presenting this project to the superintendent and assistant superintendent next week. Along with that, we caught word that a neighboring school district (which is entirely Mac) is beginning to use this exact same process. This year may bring some huge changes...

On one hand, it's sad that it took a bad economy and struggling budgets for people to realize the power of this alternative software. On the upside, it's great to see people finally getting the hint. I don't want to speak too soon before this goes larger scale, but I sense a Linux revolution is upon us. Suddenly work is fun again. ;)
 
I figured I would update this thread with some of my findings in case it could benefit any other users in a similar situation.

We sat down a few weeks ago and really analyzed what we are doing. We removed all personal opinions from the table and asked a simple question.

What do we need to get the job done?

This led to a formal meeting with a series of higher-ups in the district, followed by a two-hour presentation on Ubuntu and LTSP, and now we're on the road to adopting it a bit more, pending further testing of course. An array of alternative software is being considered, since that's an obvious factor, but so far it's proving far easier than I anticipated. After all, curriculum drives technology in our environment, so what they need, we have to support. Of course, if they want Windows 7 and Office 2010, or Mac OS X and Office 2011, somebody needs to write some checks. Little FYI - nobody is writing any checks... That's where we come in with alternatives that are more affordable.

We currently have three thin client setups: two Windows based, one Linux based. The two Windows-based ones are NComputing and an HP product using a specific version of Server 2008 called MultiPoint, or something of that nature. Our results over the last few weeks/months have painted a pretty black-and-white picture of what works versus what doesn't. That being said... we will begin to introduce more Ubuntu-based thin clients. As of now I have two servers in place, neither of which has had to be touched since I implemented them - no reboots, random shut-offs, connectivity issues, nothing. The only issues so far have been minor ones we were able to recover from.

One server is running a true thin client setup, where the clients use the server for processing. Feedback has been very positive, with the only downside being unfamiliarity with the interface; however, I've been told it is "very easy to adapt to on the fly."

On the other server, due to its low amount of RAM (2GB) and the simple fact that 2GB won't push a lab of 30 thin clients, I had to use a fat client image. Fat clients use local resources, which takes load off the server. The advantage of fat clients is that instead of maintaining 30 local installs for a lab, you edit one image on the server - centralized management at its finest.

I ran into some issues with fat clients, which come down to the way they're designed and the way they interact with Windows domains through Likewise Open. On a thin client, the thin image is built as part of the server, so it has a direct connection to the domain. On a fat client, the fat image is built independently within the server, so the fat clients boot in much the same manner as an Ubuntu LiveCD. So ask yourself this - does a LiveCD session know it belongs to a Windows domain? My issue exactly. As a result, our users have to authenticate to whichever server they want to reach after they're already logged in. Kind of a pain, but that's just the way fat clients work with Windows domains. The fix? More RAM in the server, and we'll swap the image back to thin clients - a 10-minute fix. The other fix would be to use a Linux domain, which (from what I've read) authenticates in a way that retains the connection, but that's not really in the cards. So I'll be swapping the clients from fat to thin next week.
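For the curious, here's a rough sketch of the two pieces involved (the domain name is a placeholder and the exact rebuild steps may vary by release): the server itself is tied to AD with Likewise Open, and the fat-to-thin swap is basically rebuilding the client chroot as a plain thin image and regenerating what the clients boot from.

# joining the server to the Windows domain with Likewise Open
sudo apt-get install likewise-open
sudo domainjoin-cli join ourdomain.local Administrator

# fat -> thin swap: rebuild the client chroot as a plain thin image
# (after moving the old fat chroot out of the way), then refresh the boot files
sudo ltsp-build-client --arch i386
sudo ltsp-update-kernels
sudo ltsp-update-image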

There's also another option with this setup. So there's fat, which uses local resources; thin, which uses server resources; and lastly localapps. Localapps are used with thin clients, but you can choose which applications run on local resources. That means I can run a thin client lab but have, say, Firefox run off local resources while everything else runs off the server, which lets you balance server versus local processing a bit more. I'm still testing this avenue, but so far it's interesting to play with.
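From memory, the localapps piece is just a couple of lts.conf directives plus making sure the app actually exists inside the client chroot - something like this (paths per the stock Ubuntu layout, so verify on your own server):

# /var/lib/tftpboot/ltsp/i386/lts.conf - run Firefox on the client, everything else on the server
[Default]
LOCAL_APPS = True
LOCAL_APPS_MENU = True
LOCAL_APPS_MENU_ITEMS = firefox

# firefox has to be installed in the client chroot, then the image rebuilt
sudo ltsp-chroot -m apt-get install firefox
sudo ltsp-update-image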

For the time being, we will be banking on the time we have remaining with XP while it is still supported. We'll have to see what happens down the road, though, as we have a lot more testing to do to make sure this idea is a real solution for our environment.

So far, though... I'm more than pleased. I had to learn a lot on the fly because LTSP, while relatively easy to use, is very extensive in its capabilities, in how it works, and in how it interacts with a standard Ubuntu install. We'll ride out the rest of the year with these two labs and see what happens.
 
Glad to see things are working out with this project.
Curious - what hardware/software are you using with the NComputing? I ran NComputing with their Expanion thin clients for about three years and had nothing but problems, mostly with the terminals not being able to connect to the vSpace software. NComputing's fix... let's put out another terminal and immediately end all support for current releases. Heck, they didn't even notify their own help desk that they'd terminated those revisions.

And I hear ya about the people writin' the checks. Especially with how ugly the economy has been, it's nice to get something that works, works well, and where the only cost is the implementation time. Especially if you're trying to replace XP. Not that XP is bad, it's just well outdated now, and moving to 7 would require a major hardware update. And when you have ~4 years of security updates left, it's time to start movin'.
 

We are using L300s with a completely massive server. They would often have trouble connecting, but most of the time if we just rebooted the server they would come back - though they'd take quite a while to come back online. One time I rebooted it remotely from another building and must have sat on the phone for 25 minutes with the tech on site, waiting for him to watch the lab come back online and tell me everything was okay. The lab is a general-purpose research lab with LibreOffice, Firefox, and, uh... that's basically it. We've also been told they constantly have a fix coming out, but it never seems to do the job. Once they're made aware nothing changed - don't worry, another fix is on the way. You can only buy that excuse so many times.

The other thin client solution we were toying with is also experiencing issues - some different, others exactly the same. I began to wonder if there was a common denominator between the two setups, but I came up short: different hardware, different locations, different everything except the OS. Before any more time was wasted, the decision was made. So now it's just time to begin the software hunt and have it in place before XP expires.

XP might be an old operating system, but every school district I know is clinging to it for dear life. Some districts vow to run it well beyond whenever Microsoft stops supporting it; others are already making the bold decision to look at other platforms. It's just that when you take 2,000 computers and factor in the awesome discount Microsoft gives educational institutions (I'm not joking, it's a pretty sweet discount), you still have a big fat check to write just to put W7 on the systems. Where does it stop? I find it hard to believe we won't be in the EXACT same scenario in five years, when W7 is the new-age "XP" and Windows 9 is on the table with favorable reviews. It's nothing against the OS - it just works, for the most part. But the business model is not optimal, especially now when everybody's budget is cut yet everybody upstairs is barking that we need to keep this gear running.

I'm just thankful an alternative is on the table - one that will salvage our hardware, save us on new hardware purchases, and eliminate nearly all software costs. I'm sure we'll invest in the open source model with commercial support for some of our software choices, but even factoring that in, the savings would still be absolutely massive. I'm still testing on a larger and larger scale, with baby steps each time to see how far we can push the bar, but so far I've had nothing but favorable reviews. Most people are very welcoming of the new interface once they see the speed boost these systems offer, which is a relief to me, since some of the people I expected to downright boycott the new software were much more open to it when they logged in within seconds.

There will be quite a few hours of testing yet, and close monitoring just to make sure things are moving the way they should. I'm relatively confident, but still a little afraid of the whole thin client idea. I think that fear is only fueled by the conversations I have with other districts. One district I spoke to has invested over $200,000 in five different thin client vendors, all of which failed miserably. I just find it a little strange that I was able to install Ubuntu and LTSP, network boot a few PCs, and bingo - I'm running with no issues... I guess if it fails, I can always install Ubuntu locally on the systems, since that's the same interface and software - even though we'd still be using regular desktop hardware, there would still be software savings and we'd get more longevity out of the hardware. But at the same time, other districts in the world run ENTIRELY off LTSP. If they can make it work, I can too. Right? ;)
 