Friday, June 30, 2006

Geforce 6800GT Super Cooling


CardMod01
Originally uploaded by ModestOne.
While cleaning the dust bunnies from my video card's stock cooling setup, I decided to pop off the stock heatsink/fan assembly and have a look around. I found a very sloppy thermal paste application, and horrible thermal pads on the RAM.

I decided that since this card will eventually be water cooled, and the stock parts removed anyway, I may as well come up with my own air-cooled solution. Hopefully it will perform quite a bit better than the stock cooler.

Tonight is testing night; hopefully it doesn't blow up.

Monday, June 19, 2006

My quest for making a "Gaming Pod"



I have wanted a semi-enclosed "pod" specifically for computer use and gaming for a long time now. The idea has been thrown around many times, and has always seemed like something that would be easy to do and have a great benefit. However, it's just one of those things I have not gotten around to. Now that I have a full-sized arcade, I need a "cockpit"-like enclosure as well!

I want to design it in such a way that it is not only great for gaming, but can also make for a productive workstation. The main objectives? A semi-enclosed environment that removes unwanted distractions and acts as a sound buffer, with full access to all the types of control/computer input you would expect in a workstation and gaming environment.

Anyways, these are some photos from across the net of a particular design I like. While it is not enclosed, it could be made so pretty easily.

Wednesday, June 07, 2006

Dual monitors in Ubuntu

On my laptop I go from using dual monitors during the day to just the laptop display at night. In Windows this usually happens pretty seamlessly. However, Ubuntu does not seem to have a very evolved interface for setting up and using dual screens. I had to do a little digging to find the information necessary to get everything how I wanted it.

The requirements were:
  • Dual screens (duh)
  • An "extended" desktop, not "cloned"
  • Automatic detection of monitors, with X set up automatically for single or multi-screen use
I got most of these things working in short order, but getting automatic detection/configuration took a little more looking. I am going to try to include all the information required here, but will also link off to the pages where I found the bits of info.

The "xorg.conf" file, located in "/etc/X11/", is the main config file for the Xorg display system. It contains the information for monitors, video cards, and X screens, and the instructions on how to make them all work together. The first thing most people should do is plug in their second monitor and run "sudo xorgconfig". This makes X scan for hardware and update the xorg.conf file accordingly, so all the info needed for your two monitors and video devices will be in there. The result will probably be cloned, and it definitely will not auto-detect/adjust at boot.
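For reference, the dual-screen parts of an xorg.conf look roughly like this. This is a trimmed sketch, not my actual file: the identifiers, driver, and resolution here are placeholder values, and your card's driver and mode lines will differ.

```
# Two Device sections for the same card, one per head (placeholder values)
Section "Device"
    Identifier "Card0-Head0"
    Driver     "ati"          # whatever driver your card actually uses
    Screen     0
EndSection

Section "Device"
    Identifier "Card0-Head1"
    Driver     "ati"
    Screen     1
EndSection

# One Screen section per monitor, tying a Device to a Monitor
Section "Screen"
    Identifier "Screen0"
    Device     "Card0-Head0"
    Monitor    "LaptopPanel"
    SubSection "Display"
        Modes "1024x768"
    EndSubSection
EndSection

Section "Screen"
    Identifier "Screen1"
    Device     "Card0-Head1"
    Monitor    "ExternalMonitor"
    SubSection "Display"
        Modes "1280x1024"
    EndSubSection
EndSection

# The layout glues it together; Xinerama gives the "extended" desktop
Section "ServerLayout"
    Identifier "DualHead"
    Screen 0 "Screen0"
    Screen 1 "Screen1" RightOf "Screen0"
    Option "Xinerama" "on"
EndSection
```

The "RightOf" keyword is what places the second desktop beside the first instead of cloning it.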

Unfortunately, that step did not detect the additional monitor on my IBM ThinkPad, though it did work on my co-worker's laptop. Try it out and see what you get; I had to enter the information for my monitor manually.

I am not going to go into full detail on how to get all of this working. Instead, I am going to link to the pages that I used to get what was needed, and supply my own config files for use/examination. This way I don't have to rehash what they already cover.

This page here has a very good xorg.conf file and a guide to the file. There you will learn how xorg.conf works and what means what. That xorg.conf file is almost identical to mine.

This page here has lots of information; a little ways down is a section about dual monitors, and below that is a guide to getting what is needed for auto-detection of the monitors. This page also contains the script and packages needed to get it all working.

The auto-detection relies on a couple of small packages and a simple script. It also requires that you make two extra xorg.conf files: one for a single-screen setup, one for multiple screens. Mine are named "xorg.conf.single" and "xorg.conf.multi". You then register the script as a startup script. It runs before X, uses the "read-edid" package to check whether a second monitor is hooked up or not, and then copies the appropriate xorg.conf file over the last used one.

The script will need to be edited to point at your two xorg.conf files and their location. All of this information is available through that last page.
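The shape of that startup script is roughly this. A minimal sketch, assuming the get-edid/parse-edid commands from the "read-edid" package and the file names above; the page linked earlier supplies the canonical version.

```shell
#!/bin/sh
# Pre-X startup script sketch: probe for a second monitor's EDID and
# swap in the matching xorg.conf before the X server starts.

XORG_DIR="${XORG_DIR:-/etc/X11}"

# get-edid (from the read-edid package) fails to produce a parseable
# EDID block when no external display answers the probe.
second_monitor_present() {
    get-edid 2>/dev/null | parse-edid >/dev/null 2>&1
}

# Copy whichever prepared config matches the hardware over xorg.conf,
# so the X server that starts next picks it up.
select_xorg_conf() {
    if second_monitor_present; then
        cp "$XORG_DIR/xorg.conf.multi" "$XORG_DIR/xorg.conf"
    else
        cp "$XORG_DIR/xorg.conf.single" "$XORG_DIR/xorg.conf"
    fi
}

if [ "${1:-}" = "--run" ]; then
    select_xorg_conf
fi
```

Registered to run before gdm brings up X (as an early init script), this leaves the right config in place for whichever setup the machine boots with.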

While this may seem complicated, it really isn't too bad. I do, however, hope that the Ubuntu developers incorporate something like this in their next release!


Sunday, June 04, 2006

Ubuntu, Linux for all



So I have decided to jump back into Linux, with the intention of really only messing around a bit, and seeing how some things have come along in the many years since I used it exclusively. Ubuntu has changed all my "intentions".

I intended to partition my laptop so that I could dual boot, play around, and be back in Windows when I inevitably got frustrated trying to make things work that just don't want to play nice in Linux. So I went ahead, reformatted, partitioned out a small section of my drive for Linux, and installed Windows on the rest. I then set about deciding which distribution of Linux to use, and ultimately settled on Ubuntu 6.06 (Dapper Drake).

After going through the VERY nice install procedure for Ubuntu, I prepared myself for the inevitable: at least a couple of hours trying to get sound, WiFi, networking, or video to work, like I almost always had to do in the past with Linux.

To my complete surprise, I heard the Ubuntu startup sound. "Well, that's a good sign," I thought to myself. I then started thinking about how much of a pain it was going to be to get Linux drivers for my IBM internal WiFi card, only to look up by my clock and see that Ubuntu had already jumped online through my WiFi and found 3 system updates ready for download. WHOA! Not only did it get my card configured and working properly without any input from me, but it also has an automatic update feature! Awesome.

So far so good with Ubuntu. It seems to be a very refined Linux desktop. It is clear that they are trying to bring Linux to the masses, and they are doing a great job of it. There are still some things to learn if you are converting from Windows with no Linux experience, but Ubuntu takes care of all the pain-in-the-ass things and lets you learn the parts that are fun and interesting.

Expect to see more posts about Ubuntu in the near future!

Sunday, May 28, 2006

The Truth


*Comic by Ctrl-Alt-Del

Friday, May 19, 2006

"Computer license and registration please."




Some people may believe that being "computer illiterate" is a victimless crime, but I am here to dispel this belief once and for all. I will say, however, that being computer illiterate while also not having a computer is much less dangerous.

The problem lies in the fact that most people who do not believe they have time to spend learning how to properly use a computer also believe that the rest of the world that does know how is forever in their debt, and must therefore act as their personal slaves of technology. It almost seems as though they would rather spend their time talking to tech support than actually exploring, learning, and using the device they bought. Now you may say, "But James, don't people learn things when they call tech support?", and the simple answer is "no". People seem to click their brains off when they call for help. Suddenly the person is unable to make any decisions for themselves, let alone process what they are seeing on screen. They must now have extremely detailed instructions on how to close the active window!

So, we have now determined that computer illiterate people affect those in the IT field. However, this effect is minor, and some would argue it is "part of the job". So let's look a little closer at how this all pans out.

There are people who legitimately need tech support for the products they own. However, the bulk of Internet-related phone service calls go from "tech support" to "Computers 101" in the blink of an eye. Instead of troubleshooting a supposed "internet problem", suddenly you find yourself teaching someone how to use bookmarks, or how to set up email rules in Outlook, or giving a lecture on spyware and spam; the list goes on and on. The main problem with this scenario is that troubleshooting an internet connection is a relatively basic procedure with minimal variables, and in most cases the problem can be pinned down quite quickly. This all falls apart, however, when the person needing tech support suddenly also needs a personal computer tutor.

So what is the end result of all this, besides frustrated phone techs and lots of time on the phone? I believe it has been one of many contributing factors in businesses outsourcing phone support overseas. As more and more people who are unfamiliar with technology get involved with it, this problem intensifies. Suddenly service centers need to double and triple their staff, or offer less effective tech support with more hold time. In general, if every call to a given tech support center actually involved a legitimate problem, and both ends of the phone had people with a basic understanding of their respective computers, then tech support call centers would be drastically smaller and more effective.

Of course, in the real world it is a lot easier to pay a whole building of people in India to do your job, and for a lot less. The companies don't care; they are paying less for support than ever, and don't have to listen to you complain about it, because they don't even run the show.

Obviously there are many, many reasons for outsourcing jobs. I believe this is one reason for the large-scale move of tech support jobs. It is a sad truth.

So what will come first, a required license (or training at least) to operate a computer on the internet, or a couple generations of people using computers from childhood?

Monday, May 15, 2006

The Suffocation of MMORPGs and the Rise of the SuperCade

rant/

I have been into gaming, on and off, for most of my life. If you had asked me 5 years ago where I thought my gaming future would go, I would have seen nothing but MMORPGs in store. However, as of late I have found that I don't get the same sense of "awe" that I was once infused with through the likes of Ultima Online. That feeling of adventure, that the world around you is alive with people just waiting to create new experiences with each other.

Is this a product of getting older and not finding joy in the same things? Or is it the overall decline in innovation and progress in MMORPGs? Since my long bout with Ultima Online addiction, I have played a variety of more current MMORPGs: WoW, Anarchy Online, Guild Wars, Redmoon, Legend of Mir, and Project Entropia, and while I have not played EQ, I have been around it enough to know more than the average non-player.

While I have gotten into all of these games to varying degrees, there is one common thread: I get bored with the pre-built worlds, pre-built experiences, pre-built friends and enemies. They come and go with the same "static" feel, always slightly different, but always the same. This has led me down the same path each time: have some fun, like some of the new things, then lose all motivation to repeat the same things I have done for the last 8 years in past MMOs.

Ultima Online, I believe, is the one MMORPG that has a fully dynamic feel to it. Perhaps I feel that way because I have always played on player-run shards, which give complete control (or lack of it) over all parts of the game to the players. Because of this, I see UO living on forever in the player-run MMORPG world.

However, as it is now, I do not play Ultima Online, or any other MMO for that matter. My gaming bliss as of late has come from the past, when games were valued for their sheer enjoyment factor and not all the extra frills. When games HAD to be fun, at least enough to convince hundreds of teenagers to dump in all the quarters they had, just to see what was next or get that high score. This, my friends, is the true foundation of the current gaming industry.

There is something about that golden age of gaming that calls to me. I have had more fun trying to beat my friends' scores in the likes of Galaga, Gradius, and Joust (to name a very select few) than I can remember having in my entire time of killing Murlocs and bandits. There is just something about gathering around the dull glow of an arcade and competing to see who's the best at staying alive, all the while with full social interaction and tactile feedback from the people around you.

These games didn't need dual 3 GHz processors, the latest $700 video card, or a Cell processor. They survived on the fact that they were great fun and highly competitive. It is all too common these days for the main selling point of a game to be sheer graphics. The game looks great, but plays like a bucket of rocks. To which I ask myself, "Why not just watch a CG movie?" as I get 100% in the third bonus round of Galaga.

Well, ranting and raving aside, having an arcade at home with an unlimited supply of classic arcade games is pure gaming nirvana. Of course, there is some satisfaction in knowing you built it yourself as well.


/rant