Burning a 700MB+ PS Game

Need help with your PC or Modding Projects?
Post Reply
User avatar
bawitback
64-bit
Posts: 355
Joined: Wed Mar 01, 2006 7:29 pm

Burning a 700MB+ PS Game

Post by bawitback »

Ever come across a large PS/SAT file? I have a (CCD/IMG/SUB) PS1 game that is 712MB. I didn't notice the massive size, and when I tried burning it to a 700MB CD-R, of course, it didn't burn! Is there a way to convert the file to another type (possibly BIN/CUE) to make it smaller? Or would that even help? BTW, without resorting to burning it on a DVD-RW..

:?:
User avatar
Mozgus
Next-Gen
Posts: 6624
Joined: Sat May 13, 2006 10:31 pm
Contact:

Post by Mozgus »

They should always burn if you use an 80-min CD-R. I've never come across such a game. It shouldn't matter if the actual image is more than 700MB, due to the burning method used. The shit's confusing, and honestly it was years ago when I read up on it. I forget exactly why 80-min CD-Rs can technically store 800MB. Maybe someone else can explain.
User avatar
lordofduct
Next-Gen
Posts: 2907
Joined: Sat Apr 01, 2006 12:57 pm
Location: West Palm Beach

Post by lordofduct »

The image isn't the size of the actual game. Depending on the format, the software adds some extra information to the image that isn't actually placed on the disc. The actual data that ends up on the disc after burning the image is going to be the same size no matter what format you use (NRG, ISO, BIN).

The 'actual' disc size may also fluctuate depending on the "error correction" used on the disc. See, a CD records the data tightly on the disc in plain binary... but each sector of data has an extra area, one you never see when reading the disc normally, that is used for error correction. Reading a CD is, at bottom, a physical process... interference can cause errors (scratches, scuffs, dust, speed distortion of the spindle). So this error-correction area is just extra check data computed from what should be in the sector, a kind of shorthand summary of it... similar to this:

Imagine the sector is 8 bits long... for the sake of explanation I'm gonna use arbitrary values. So:

10010011

is the data located in the sector proper. The computer knows the sector is supposed to be only 8 bits long, so extra digits can be placed at the end; it knows that anything past the eighth bit is error-correction data. The number of bits at the end is set by the standard for the error-correction scheme used. So the sector we are referring to may actually read:

10010011 1100

where the 1100 at the end is the error-correction code. This code is mainly used to quickly check the data just read and make sure the laser read the correct bits. The algorithm for this hypothetical string might say something like:

bit 1: is the value of the leading bit
bits 2-4: is the number of 1's in the sector.

So bit 1 of the error correction says the leading digit is 1, and the following 3 bits say there are 4 (100 is binary for 4) 1's in the sector. The computer can now look at the sector loaded in its buffer and ask itself: does this string start with a 1, and does it have four 1's in it?

10010011... yep... it matches.

Now it can continue on instead of having to readjust the laser and reread the same sector, which is more time-consuming. Of course the real algorithms are more complex... and the sectors are much longer... but I kept it short for example purposes.
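The toy check above could be sketched in Python like this (a made-up scheme for illustration only; real CD error correction uses Reed-Solomon codes, nothing this simple):

```python
def checksum(sector: str) -> str:
    """Toy 4-bit check: the leading bit, then the count of 1s in 3-bit binary."""
    return sector[0] + format(sector.count("1"), "03b")

def verify(sector: str, code: str) -> bool:
    """Recompute the check and compare -- like the drive checking its buffer."""
    return checksum(sector) == code

bits = "11010010"                 # a fresh hypothetical 8-bit sector
print(checksum(bits))             # leading 1, four 1s -> "1100"
print(verify(bits, "1100"))       # True: no reread needed
print(verify("11011010", "1100")) # a flipped bit fails the check -> False
```

Same idea as above: if the stored check matches the recomputed one, the drive moves on; if not, it knows to reread.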

BUT, a CD has a limited amount of actual physical space on it. So to gain more usable data space, the error correction can be shortened to jam more data in. The trade-off is that the disc is more prone to error... (i.e. the data structure of the GD-ROM is error-prone for several reasons, including exactly this kind of shortened error correction).

Some games on the PS and Saturn utilized the standard's shortened error correction (CD-ROM Mode 2/XA, whose Form 2 sectors drop most of the per-sector error-correction data) to jam more onto the disc. The TOC and sector headers inform the CD-ROM drive that this shorthand is being used...

Now, why did I say all this? To inform, and to explain why some games may seem to be over 700MB. It's also why the Sega CD was limited to ~600MB discs: being a single-speed CD-ROM, it had to carry longer error-correction code. But the main reason I explained all of this... is that the same structure explains something about CD-Rs.
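To put rough numbers on that (my own back-of-the-envelope sketch; the 2352- and 2048-byte figures are the standard raw and Mode 1 sector sizes):

```python
# Back-of-the-envelope CD capacity math (1 MB = 1024 * 1024 bytes here).
SECTOR_RAW = 2352       # bytes per sector in a raw dump (CCD/IMG, BIN)
MODE1_USER = 2048       # user-data bytes per sector in standard Mode 1
SECTORS = 74 * 60 * 75  # a 74-minute disc, read at 75 sectors per second

raw_mb = SECTORS * SECTOR_RAW / (1024 * 1024)
user_mb = SECTORS * MODE1_USER / (1024 * 1024)
# A 74-min disc holds ~747 MB raw but only ~650 MB of Mode 1 user data.
print(f"74-min disc: {raw_mb:.0f} MB raw vs {user_mb:.0f} MB user data")

# So a 712 MB *raw* image is nothing unusual -- it is only ~70 minutes:
image_sectors = 712 * 1024 * 1024 // SECTOR_RAW
print(f"712 MB raw image = {image_sectors} sectors "
      f"= {image_sectors / (75 * 60):.1f} minutes")
```

In other words, a 712MB CCD/IMG rip is counting the full raw 2352-byte sectors (data plus headers and error correction), not just the 2048 bytes of game data per sector, which is why the number looks scarier than it is.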

First off, when you make an image, the software recognizes the error-correction structure and keeps it intact.

Pressed CDs can utilize tighter tolerances because the pits are actually pressed mechanically. CD-Rs, by contrast, use a laser to darken spots in a dye layer to mimic the pits of a pressed disc. Because of this, CD-Rs were engineered around the common standards. Certain burning tools will stretch the limits of CD-Rs by forcing non-standard burning behavior. Nero calls this "OVERBURN". It pushes the recording further out toward the disc's edge than usual, into space normally reserved for the lead-out. The downside is that, again, the disc is more prone to error!

Now, WHY offer this? Well, as I said, when a CD was pressed they may have packed the data in tighter... making a larger image, as long as the hardware supports it. Say they did this on a Saturn game, so its data track is longer than usual. When you make an image of it, the image is going to be bigger than a standard CD-R holds. "OVERBURN" lets the burner lay down that same structure used on the pressed CD and push all that data onto the CD-R. As Nero will tell you when burning the disc, "This CD may not be readable in ALL CD-ROM drives." Which is true: if the computer or CD player you put it in doesn't support reading that far out... then it ain't gonna be able to read it. It's like trying to make you read something written in a court stenographer's shorthand.

SO, get yourself software that supports "OVERBURN" or something similar, like Nero or Alcohol 120%.
www.lordofduct.com - check out my blog

Space Puppy Studios - games for gamers by gamers
User avatar
Mozgus
Next-Gen
Posts: 6624
Joined: Sat May 13, 2006 10:31 pm
Contact:

Post by Mozgus »

Hahahaha, oh wow. *clap clap*
User avatar
Pullmyfinger
Next-Gen
Posts: 1470
Joined: Sat Jan 28, 2006 12:49 pm
Location: Orange County
Contact:

Post by Pullmyfinger »

also, didn't the playstation use 650 mb discs??
User avatar
lordofduct
Next-Gen
Posts: 2907
Joined: Sat Apr 01, 2006 12:57 pm
Location: West Palm Beach

Post by lordofduct »

Pullmyfinger wrote:also, didn't the playstation use 650 mb discs??

Its standard was that... but you can push it further. The letdown was that the PS was a single-speed CD-ROM as well, so the games that pushed it further were again more prone to error.
www.lordofduct.com - check out my blog

Space Puppy Studios - games for gamers by gamers
User avatar
Mozgus
Next-Gen
Posts: 6624
Joined: Sat May 13, 2006 10:31 pm
Contact:

Post by Mozgus »

lordofduct wrote:
Pullmyfinger wrote:also, didn't the playstation use 650 mb discs??

Its standard was that... but you can push it further. The letdown was that the PS was a single-speed CD-ROM as well, so the games that pushed it further were again more prone to error.

Was FF9 one of them? Cause that game gave me all sorts of trouble, and looked fine in terms of condition.
Post Reply