twoodcc
Aug 8, 02:42 PM
I've seen several people saying that it's starting to be a car encyclopedia rather than an enjoyable racing game. I kinda agree with that. My last experience with GT was GT2 on PS1 I think, but I'm looking forward to this game. Hopefully it will be what I expect: a good, solid driving game. I hope they have spent time on the actual driving too, not just on the cars and 3D stuff etc
i don't know, i still think the Gran Turismo series is the best as far as real driving simulation. by far. and the number of copies sold backs that up
me too!! i am So excited! i wont pre order or anything, might save for a steering wheel though. :)
yeah i still might pre-order the special edition one. i'm not sure yet

AngryCorgi
Apr 6, 04:16 PM
Since you have no clue how the Sandy Bridge Airs will perform, I'll take your statement as FUD.
I'll give you some insight into their potential. The desktop i7-2600k has been benchmarked to be roughly equivalent to a 9400m in performance (assuming similar CPU).
i7-2600K GPU clock = 850/1350 (normal/turbo) MHz
i5-2410M (13" MacBook Pro base) GPU clock = 650/1200 (normal/turbo) MHz
i7-2620M (13" MacBook Pro upg) GPU clock = 650/1300 (normal/turbo) MHz
i5-2537M (theorized 11"/13" MBA) GPU clock = 350/900 (normal/turbo) MHz
i7-2649M (theorized 13" MBA upg) GPU clock = 500/1100 (normal/turbo) MHz
As you can see, none of the mobile GPUs run quite as fast as the desktop part, but the 13" 2.7GHz upgrade CPU comes fairly close. Now, the 2.13GHz MBA + 320M combo matched or beat the i7-2620M in 75% of the tests (and was only narrowly defeated in the other 25%). There will be some random inconsistency regardless, due to driver variances in different apps.
The issue here is (and this shows up in Core 2 vs. i5/i7 testing on the Alienware M11x) that the Core 2 Duo very rarely gets beaten by the i5/i7 in gaming/video playback. This is because not many games are single-threaded anymore, and when 2+ threads are in use, the i5/i7 ULV won't raise the clock speed any. Further, the 2.13GHz was keeping up with and beating a 2.7GHz (a 27% higher clock!) in that test, because graphics are the bottleneck, not the CPU.
Take into account that NONE of the ULV Core i options match the 13" MBP 2.7GHz upgrade's GPU speed, and it's pretty clear that for graphics-intensive apps the older 320M would be the way to go. For most everything else, though, the i7-2649M would overtake the Core 2 2.13GHz. This includes a lot of non-accelerated (high-CPU-overhead) video playback.
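A quick back-of-the-envelope check on the clock figures quoted above (illustrative arithmetic only; real-world performance also depends on drivers, thermals, and memory bandwidth, so treat these as ballpark ratios, not benchmarks):

```python
# Clock numbers taken from the list in this post.
cpu_c2d_ghz = 2.13   # Core 2 Duo MBA
cpu_i7_ghz = 2.70    # i7-2620M, 13" MBP upgrade

# The "27% higher clock" claim:
cpu_advantage_pct = (cpu_i7_ghz / cpu_c2d_ghz - 1) * 100
print(f"i7 CPU clock advantage: {cpu_advantage_pct:.0f}%")

# GPU turbo clocks (MHz) from the list above, relative to the desktop part.
gpu_turbo_mhz = {
    "i7-2600K (desktop)": 1350,
    "i7-2620M (13in MBP upg)": 1300,
    "i7-2649M (theorized MBA upg)": 1100,
    "i5-2537M (theorized MBA base)": 900,
}
desktop = gpu_turbo_mhz["i7-2600K (desktop)"]
for chip, mhz in gpu_turbo_mhz.items():
    print(f"{chip}: {mhz} MHz turbo = {mhz / desktop:.0%} of desktop")
```

This confirms the post's 27% figure and shows the ULV parts topping out at roughly two-thirds of the desktop GPU clock.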
Something you guys need to be wary of is the 1333MHz memory topic. Likely, Apple will choose to run it at 1066MHz to conserve battery life. Memory speed hikes = gratuitous battery drain.
I for one am happy Apple is growing with the modern tech, but I hold no illusions as to the benefits/drawbacks of either system.
boncellis
Jul 20, 12:05 PM
double post, my apologies.
iansilv
Apr 25, 04:48 PM
wow, this has officially been blown out of proportion!
Yup!
The GOVERNMENT must get a warrant - that attorney is an idiot. Things like the iPhone tracking people's locations are not the same thing as a federal officer getting a warrant to track someone.
Hey attorney- thanks for making our profession look idiotic!
macintel4me
Aug 7, 07:46 PM
that's kind of a harsh requirement, I would think it will let you choose a local/external hard drive or network server.
But still, it will cost a lot of space, no matter where the space comes from.
From the Apple website...
Backup Disk: Change the drive or volume you're backing up to. Or back up to a Mac OS X server computer.
QCassidy352
Jul 20, 03:53 PM
I hate to burst everyone's bubble, but Kentsfield will not be appearing in any of the Pro machines for some time.
Apple will be using them exclusively in the Xserves, at least for most of 2007. This will finally give Apple another way to distinguish their server line from their pro line.
What? Apple differentiates the Xserves by making them 1U thick and rackmountable. One buys a rackmount server not because it's faster but because it's smaller and fits in a rack.
yeah, what he said. Apple does not have to distinguish powermacs from servers with processor speeds. People (businesses) who need servers are not going to buy powermacs to do the job even if they are a little bit faster or cheaper; they are going to buy real rack-mounted servers.
AppleScruff1
Apr 9, 10:01 PM
I'd wait for Haswell or maybe even Rockwell which will be the 16nm shrink of Haswell.
merk850
Jul 28, 05:23 PM
I respectfully disagree. I say take it back and be ready for a much faster iMac Core 2 Duo. You want the latest, take it back. It won't be the latest for many more weeks. Core 2 Duo will be the latest for two more years.
I appreciate the thoughts on my quandary over whether or not to return my 20" iMac and purchase after WWDC. Of course my decision is not any easier with one vote for and one vote against.
Thanks Grokgod and Multimedia for the thoughts...
Lollypop
Jul 20, 08:12 AM
seems the tragic days of the P4 are gone for Intel, good for us! :p With all the high-end stuff from Intel that's apparently going into the Mac, I'm a bit worried about the price of the systems though!
Iconoclysm
Apr 19, 08:24 PM
WRONG! They weren't invented at Apple's Cupertino HQ, they were invented back in Palo Alto (Xerox PARC).
Secondly, your source is a pro-Apple website. That's a problem right there.
I'll give you a proper source, the NYTimes (http://www.nytimes.com/1989/12/20/business/xerox-vs-apple-standard-dashboard-is-at-issue.html), which wrote an article on Xerox vs Apple back in 1989, untarnished, in its raw form. Your 'source' was cherry picking data.
Here is one excerpt.
Then Apple CEO John Sculley stated:
^^ that's a GLARING admission by the CEO of Apple, don't you think? Nevertheless, Xerox ended up losing that lawsuit, with some saying that by the time they filed it, it was too late. The lawsuit wasn't thrown out because Xerox didn't have a strong case against Apple, but because of how the lawsuit was presented at the time.
I'm not saying that Apple stole IP from Xerox, but what I am saying is that it's quite disappointing to see Apple fanboys trying to distort the past into making it seem as though Apple created the first GUI, when that is CLEARLY not the case. The GUI had its roots at Xerox PARC. That is a FACT.
http://upload.wikimedia.org/wikipedia/en/7/78/Rank_Xerox_8010%2B40_brochure_front.jpg
Actually, you're WRONG!!!! to say he's wrong. You're trying to say that every GUI element was created at Xerox? EVERY one of them? Sorry, but your argument here is akin to something Fox News would air.
DoFoT9
Aug 28, 07:18 PM
i am looking forward to this game, no matter if it's got standard and premium cars.
yeh im over the bitching - just make the physics right and ill play it in 8 bit colour!
Multimedia
Aug 18, 10:31 AM
If one were to buy a Mac Pro now, is the processor upgradeable to Clovertown in the future? Or is that not really worth it even if it is, because you would need a faster FSB, meaning a new logic board, to take advantage of its power?
I'm sure you know this, but just a reminder that you would be dealing with an extremely fragile and tricky upgrade process that could destroy your motherboard or fry the processor without the latest cooling system from Apple. Just my own caution against attempting this. Not worth the risk, I think. There will be a better video card with the dual Clovertown Mac Pro, as well as other changes to the system fixing bugs discovered between now and then. Too many changes in the works for me to want to fool with such a complex system.
dethmaShine
Apr 6, 10:11 AM
For a programmer working in Terminal, Xcode, NetBeans, Eclipse, etc. (not graphics-intensive software), would this MacBook Air be a better deal than the 13"/15" MacBook Pro?
Anyone?
Eidorian
Jul 20, 05:57 PM
According to Daily Tech Merom is already shipping! Intel announced it during Intel's Q2'06 earnings report. Is an upgraded MBP going to make an appearance at the WWDC?
http://www.dailytech.com/article.aspx?newsid=3421 What?!
theBB
Mar 31, 07:13 PM
If you're going to license your project as open source, then you do actually have to release the source. I know there's often a delay with commercial products. I suppose the tolerance of the open source community depends on the reason and the amount of time the code is held back.
Well, the rules of the GPL say you need to release the source code along with the software, and you actually have to offer it through the same channel, so that you cannot make it practically impossible for people to get the source even if it is theoretically available. Of course, the GPL is not the only "open source" license. This is Google's playground, so they get to define it any way they wish.
840quadra
Nov 28, 06:51 PM
Adds Universal to the list of companies I do not buy from...
Wait..
They are already on that list!
GTH Universal! I bought my iPod and every song on it, and will continue to do so. Stop extorting the public and maybe you'd actually have some fans, or people who want to deal with your crappy company!
HyperZboy
Apr 7, 11:22 PM
Having managed at several retail giants right out of college, I can give an answer as to why a company might withhold some stock and it's a very simple one...
What if the supplier is abnormally constraining stock of a popular item?
Do you prefer to be out of that item for a week, possibly weeks after it sells out or do you conserve some stock to have some in the store every day and tell some customers you're expecting more the next day?
From what I've read, Apple's shipments of iPads have been constrained.
Clearly, from a retail manager's perspective and even from corporate managers, I could easily see why Best Buy might conserve some stock until Apple gets ramped up and can hit demand. Otherwise your regular customers will get the impression that you're not carrying the product at all and just go buy it somewhere ELSE! At least if you tell them you'll have some more in stock tomorrow, there's a better chance they'll come back the next day.
Trust me, I'm not a big fan of Best Buy, but this appears to be Apple's doing, since they forced the issue by making sure their own Apple Stores were well stocked while the retail giants maybe were not.
Clearly not many people here have managed in sales. If you've got a product you KNOW is going to sell out in a particular time period, you've hit your sales quota, and you're not going to get any back in stock for 2-3 weeks, this is not a crazy thing to do.
In my opinion, Apple needs to get its supply chain act together and stop micromanaging other vendors' sales strategies instead.
61132
Aug 7, 08:22 PM
gosh, the Finder looks the same :( I don't want the brushed metal anywhere anymore!! Also, they should just integrate Address Book/iCal/Mail into one app!!!
blakbyrd
Aug 5, 04:07 PM
Reposting my prediction from another thread:
theBB
Aug 11, 07:28 PM
Confused.
Can somebody explain to me the differences between the cellphone markets in the US and Europe?
Will an 'iPhone' just be marketed in the US, or worldwide (as the iPod is)?
Well, let's see, about 20 years ago, a lot of countries in Europe, Asia and elsewhere decided on a standard digital cell phone system and called it GSM. About 15 years ago GSM networks became quite widespread across these countries. In the meantime the US kept on using analog cell phones. Motorola did not even believe that digital cell phones had much of a future, so it decided to stay away from this market, a decision which almost bankrupted the company.
The US started rolling out digital service only about 10 years ago. As the US government does not like to dictate to private companies how to conduct their business, they sold the spectrum and put down some basic ground rules, but for the most part they let the service providers use any network they wished. For one reason or another, these providers decided to go with about 4 different standards at first. Quite a few companies went with GSM, AT&T picked a similar but incompatible TDMA (IS-136?) standard, Nextel went with a proprietary standard they called iDEN, and Sprint and Verizon went with CDMA, a radically different standard (IS-95) designed by Qualcomm. At the time, other big companies were very skeptical, so Qualcomm had to not only develop the underlying communication standards but also manufacture cell phones and the electronics for the cell towers. However, once the system proved itself, everybody started moving in that direction. Even the upcoming 3G system for these GSM networks, called UMTS, uses a variant of CDMA technology.
CDMA is a more complicated standard compared to GSM, but it allows the providers to cram more users into each cell, it is supposedly cheaper to maintain, and it is more flexible in some respects. However, anybody in that boat has to pay hefty royalties to Qualcomm, dampening its popularity. While creating UMTS, the GSM standards bodies did everything they could to avoid using Qualcomm patents and those payments. However, I don't know how successful those efforts were.
Even though Europeans here on these forums like to gloat that US did not join the worldwide standard, that we did not play along, that ours is a hodge podge of incompatible systems; without the freedom to try out different standards, CDMA would not have the opportunity to prove its feasibility and performance. In the end, the rest of the world is also reaping the benefits through UMTS/WCDMA.
Of course, not using the same standards as everybody else has its own price. The components of CDMA cell phones cost more and the system itself is more complicated, so CDMA versions of cell phones hit the market six months to a year after their GSM counterparts, if at all. The infrastructure cost of a rarer system is higher as well, so AT&T had to rip apart its network and replace it with a GSM version about five years after rolling it out. Sprint will probably convert Nextel's system in the near future as well.
I hope this answers your question.
Stridder44
Nov 28, 09:06 PM
No guys, this sounds like a great idea....*cough*.....
satty
Jul 20, 08:48 AM
At some point you're going to hit diminished returns. Sure, multimedia apps can take advantage of a few more cores, but I don't see Mail running faster on 4 cores, never mind 2! The nice thing about Intel is that they seem to realise that and have invested in improved I/O as well; look at PCI Express and SATA. You can have the fastest processor in the world, but if you're running it with 512MB of memory you're going to slow down fast!
guzhogi
Jul 15, 11:20 AM
Something I liked about the power supply in my beige G3 was that not only did it have a power-in socket, but also a power-out one for a monitor or something.
LethalWolfe
Apr 10, 11:16 PM
The guy who 'botched' iMovie is the same person that created Final Cut and continues to work on Final Cut. Randy Ubillos has been the head of Apple's video editing suites/applications for as long as I can remember.
He's also the guy who headed up Adobe Premiere. Sure, the iMovie revamp wasn't a high point, but he laid the foundations for two of the three most popular NLEs, so he can't be all bad. ;)
Lethal

