Discussion in 'Article Discussion' started by Lizard, 1 Jun 2010.
I remember getting a 40MB hard drive for my Amiga 1200 and being the envy of my friends.
It's the short-sighted attitude of people like you that creates these barriers in the first place. So well done... this is all your fault.
Just because you don't doesn't mean no one else does either.
Can I ask, though... what's the point of that? In reality, you want faster load times for things like games, for loading large files into Photoshop or large video into Premiere, and for huge amounts of scratch space for video editing etc. So it will still be mechanical disks doing all this work. All you've achieved by moving everything off your boot drive and having a small partition on an SSD is fast boot times. That's pretty much it.
Seriously... I still do not get the whole SSD thing. They're too small to be useful unless you pay more for the drive than you did for the whole rig.
Don't doubt it. I have a bookcase FULL of DVD wallets - all full. It's actually 4.37GiB max on a single-layer DVD+R5 and 7.95GiB on a dual-layer DVD+R9. I have burned through so much media in the past 7-8 years that I have gone through at least two DVD burners per year. They just end up failing due to how much use they get. Thankfully a DVD burner is $25 or so now.
I admit, a restore would really suck. But these files are spread across multiple drives. I have a 1TB, 1.5TB and 2TB drive on my desk right now. Don't forget that a "2TB" drive isn't actually 2048GiB of storage capacity. It's about 1863GiB, because the manufacturer counts in decimal (10^12) bytes while the OS counts in binary.
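The marketed-vs-reported capacity gap is just a units conversion; a quick sketch of the arithmetic:

```python
def marketed_tb_to_gib(tb: float) -> float:
    """Convert a drive's marketed (decimal) terabytes to binary GiB.

    Drive makers count 1TB = 10^12 bytes; the OS reports in GiB (2^30 bytes).
    """
    return tb * 1e12 / 2**30

# A "2TB" drive as the OS sees it:
print(round(marketed_tb_to_gib(2)))  # → 1863
```

Same math explains why a "1TB" drive shows up as roughly 931GiB.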
With 2TB drives dropping like a rock in price, I will probably stop using DVD media. I will just copy data to a pair of external drives. One would be a permanent backup in place of the DVD media, used until it is full and then put away.
IBVs develop their code for reference platforms (a particular chipset and CPU combination, like the 'Thurley' platform the X58 is part of). Companies like Asus and MSI can then buy a license for the source code involved and work on it themselves. So stuff like ExpressGate, OC Genie, or things like that are typically developed by the OEM. UEFI is a totally different animal from BIOS; it is built on a totally different set of principles. Just for starters, it is written in C rather than ASM, so even where it is possible to reuse features developed for BIOS, it's really not very practical.
I believe a number of companies are doing extremely basic ports of reference code for their platform and shipping in the low end. This lets them get to work with UEFI without a lot of risk (from what I have seen, the transition from BIOS engineer to UEFI engineer is not an easy process).
Just as an example, my MSI Wind is running a UEFI BIOS (though I'm not sure which rev of the spec it is compliant with). I can't get at any of the 'fun' features of UEFI because it seems to be set up so I can only do a legacy boot, but it is a UEFI core. At some point, when I am not being totally lazy, I'll see about writing a DOS tool to scan memory for the EFI system table. That could prove interesting for telling exactly what is a UEFI core and what is a legacy core.
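The scan idea boils down to hunting for the EFI system table's 64-bit signature, 0x5453595320494249, which reads as the ASCII bytes "IBI SYST" when stored little-endian. A minimal sketch of the search logic, run here against a fake memory dump (a real tool would read physical memory, which Python can't do from DOS - this just shows the pattern match):

```python
# EFI_SYSTEM_TABLE_SIGNATURE from the UEFI spec, as it appears in RAM
# on a little-endian x86 machine.
EFI_SIGNATURE = (0x5453595320494249).to_bytes(8, "little")  # b"IBI SYST"

def find_efi_system_table(dump: bytes) -> int:
    """Return the offset of the first candidate table header, or -1."""
    return dump.find(EFI_SIGNATURE)

# Demo: plant the signature at offset 96 of a fake dump and find it.
fake_dump = bytes(96) + EFI_SIGNATURE + bytes(152)
print(find_efi_system_table(fake_dump))  # → 96
```

Finding the signature is only a first pass; a real scanner would then sanity-check the rest of the table header (revision, CRC) before trusting the hit.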
Who needs to back up porn when the intarwebs is the biggest distributed cloud storage datacenter there is? In fact, the only data I back up is that which I make myself.
After an HDD failure, leave uTorrent running for a few days and you'll have it all back.
I keep hearing about this issue, but it always seems to be in the theoretical sense. Is anyone actually having these sorts of issues, or is it just something that may happen someday? Given the number of servers out there with utterly massive arrays, it would seem that if this were as big an issue as some people say, we would be seeing unrecoverable crashes every day.
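Assuming the issue in question is unrecoverable read errors (UREs) hitting large-array rebuilds, the "theoretical" worry does have a back-of-envelope basis. The numbers below are illustrative assumptions, not measurements: a typical consumer-drive spec of one URE per 10^14 bits read, and a rebuild that has to read 10TB of surviving data.

```python
import math

URE_RATE = 1e-14    # assumed spec: one unrecoverable error per 1e14 bits read
BYTES_READ = 10e12  # assumed rebuild workload: 10TB of surviving data

# Probability of at least one URE over n bits at per-bit rate p:
# 1 - (1 - p)^n, approximated as 1 - e^(-p*n) since p is tiny.
p_fail = 1 - math.exp(-URE_RATE * BYTES_READ * 8)
print(f"{p_fail:.0%}")  # → 55%
```

That roughly coin-flip number is why the concern gets repeated, though enterprise drives are specced an order of magnitude better (10^15), and real-world error rates often beat the datasheet, which may be why the everyday crashes don't materialize.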