
dukebound85
Dec 7, 05:07 PM
Would those of you that have played this game recommend getting it? Or are there too many cons (standard cars, multiple versions of one car, bad racing AI, bad damage physics especially on standard cars, etc.) that would lead to buyer's remorse?
Keep in mind, I have played quite a bit of Forza, but now I have a PS3 and want a good racing sim; I just keep hearing bad things about this game (largely that it's an incomplete game).

radesousa
Sep 13, 11:40 AM
So the question I have is can the latest iMac be CPU upgraded like the MacPro?
QCassidy352
Apr 6, 11:43 AM
Wirelessly posted (Mozilla/5.0 (iPhone; U; CPU iPhone OS 4_3_1 like Mac OS X; en-us) AppleWebKit/533.17.9 (KHTML, like Gecko) Version/5.0.2 Mobile/8G4 Safari/6533.18.5)
I have a 13" Ultimate of the current generation. The limiting factor for me is the graphics, not the processor, so going to Sandy Bridge with the Intel 3000 would make a less appealing machine for my uses than the current model. It's really too bad the Sandy Bridge Macs are tied to those garbage integrated graphics.
Since you have no clue how the sandy bridge airs will perform, I'll take your statement as FUD.
It's safe to say they won't outperform the 13" MBP, which has the same graphics and a faster processor. Which means the graphics performance will be a step back. And really, is the attitude necessary?
gregorypierce
Apr 11, 02:31 AM
Wow. You'd think a FCP Users group would be able to track down a halfway decent graphic artist to make their banner graphic...
It probably looks great when in motion on a TV screen.....
hey I tried :)
geerlingguy
Aug 16, 11:24 PM
When rendering in FCP, it's all about the CPU.
Fast hard drives contribute to real-time effects, but do NOT contribute to rendering.
Ram helps a little bit.
However, depending on what kind of rendering you're doing, the hard drive can be a limiting factor.
Say you're just rendering ten minutes' worth of a blur effect on video: the CPU says 'gimme all you got' and goes to town on the frames, blurring each one quickly. But the hard drive may have a hard time keeping up with the CPU, because 10 minutes of footage needs to be read, then re-written to the drive. For HD-resolution video, that can be a couple gigs of data. And that data also has to pass through the RAM (which acts like a high-speed buffer).
However, in the case of these benchmarks, one would think the testers would choose some more CPU-intense rendering, which would allow the hard drive to take its time while the CPU is overloaded with work.
But, to anyone configuring a graphics or video workstation: everything (CPU, hard drives, RAM, and even the GPU for some tasks) should be as fast and ample as possible. "A chain is only as strong as its weakest link." If you pair up a quad 3.0 GHz Xeon with a 5400 rpm USB 2.0 drive, you will be disappointed.
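The disk-versus-CPU argument above can be put into a back-of-the-envelope sketch. All the numbers below (stream bitrate, per-frame CPU time, drive throughput) are illustrative assumptions, not measurements:

```python
# Rough check: does a simple render become disk-bound or CPU-bound?
# Every number here is an illustrative assumption, not a measurement.

def render_bottleneck(stream_mb_s, fps, cpu_ms_per_frame, disk_mb_s):
    """Return ('disk' or 'cpu', disk throughput in MB/s the render demands).

    stream_mb_s:      data rate of the footage as stored on disk
    cpu_ms_per_frame: time the CPU spends rendering one frame
    disk_mb_s:        sustained throughput of the scratch drive
    """
    cpu_fps = 1000.0 / cpu_ms_per_frame  # frames the CPU can finish per second
    # Each frame is read from disk and the result written back, so a CPU
    # running flat out demands 2x the stream rate, scaled by how much
    # faster than real time it renders.
    required = 2 * stream_mb_s * (cpu_fps / fps)
    return ("disk" if required > disk_mb_s else "cpu"), required

# ~2 GB per 10 min of HD footage is roughly a 3.5 MB/s stream; a light
# blur at 2 ms/frame, on a slow external drive sustaining ~30 MB/s:
which, needed = render_bottleneck(stream_mb_s=3.5, fps=30,
                                  cpu_ms_per_frame=2, disk_mb_s=30)
print(which, round(needed, 1))  # disk 116.7 -- the drive can't keep up
```

A heavier effect (say, 50 ms/frame) drops the required throughput below what even a slow drive delivers, which is why CPU-intense renders hide the disk entirely.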
Chupa Chupa
Apr 5, 07:00 PM
4GB download with in-app purchases for content would be my guess.
4GB? Do you realize how many DVDs FCS is? Unless Apple is going to severely cut up the package and de-studio it, no way is 4GB nearly enough space. Aperture is fine as a download b/c it's a relatively small program. FCS is a monster. It needs to be on media. I can't hog up my bandwidth to d/l a 16+GB suite.
LagunaSol
Apr 6, 04:31 PM
Don't hate. I have money and I can spend it however. Maybe I'll buy an iPad and leave it in the bathroom for people to use as they're taking care of business.
Yeah, I also remember you always talking about how your mom had an iMac (or something like that) but how you thought Apple stuff was overpriced crap. Of course now you own a MBP and "love" Apple stuff, except for the iPad of course, cuz your XOOM has 2 native apps and runs widgets. (I also remember you bragging about how much money you made. So it is you.)
As for tablets-for-toilets, you have far more chance of seeing those pallets of XOOMs at Costco (as balamw mentioned) being firesold for high-tech bumwipes.

inkswamp
Jul 27, 02:22 PM
but is still more productive because it handles more calculations per clock cycle
I'm no processor geek. I have a basic understanding of the terminology and how things work so correct me if I'm wrong, but wasn't this one of the advantages that the PPC had over Intel chips? Does this mean Intel is moving toward shorter pipes? Are we talking more instructions per clock cycle or what? What does "calculations" mean in this context?

milo
Jul 27, 04:20 PM
You did say "successors" and "next generation" which I was pointing out they are not :D
It seems like you're just quibbling over semantics. Webster defines "successor" as "one that follows" which is exactly what the quad core chips will be doing (and "next gen" seems to imply the same thing). Kentsfield and cloverton follow conroe and woodcrest, and use the same sockets respectively. People will upgrade, and top of the line computers for sale will switch to the new chips.
You don't seem to be disagreeing as much as quibbling with my word choice. What would you suggest as an alternative to "successor" to describe these future chips?
craig jones
Sep 13, 01:10 PM
The OS takes advantage of the extra 4 cores already, therefore it's ahead of the technology curve, correct? Gee, no innovation here... please move along, folks. :rolleyes:
As for using a Dell, sure, they could've used that. Would Windows use the extra 4 cores? Highly doubtful. Microsoft has sketchy 64-bit support, let alone dual-core support; I'm not saying "impossible", but I haven't read jack squat about any version of Windows working well with quad cores. You think those fools (the same idiots who came up with Genuine Advantage) actually optimized their OS to run in an 8-core setup? Please pass along what you're smoking. :rolleyes:
How do you know these things? Is Windows' 64-bit support sketchier than OS X's? Of course not. OS X has little 64-bit support and none at all for Intel. Windows also supports far more than 2 or 4 cores (although there are license restrictions). Windows has run on far more than 8 cores for a long, long time. You realize they have an actual presence in the server market, don't you?
ecwyatt
Aug 11, 03:34 PM
I'd wager that whatever they do come out with will be considered a letdown, seeing as so much hype is building around it. It's kinda like those supposed summer blockbuster movies: all hype, but they don't really deliver.
Also, I wouldn't be surprised if it only held as many songs as the ROKR or SLVR (if any at all); anything more would threaten to encroach too much on the iPod line, and I don't think Apple is dumb enough to do that.
I'd be happier if it replaced my Palm, you know, a BlackBerry killer, since they don't communicate natively, only via third party. It would have to have flawless integration with Mail and zero-config Wi-Fi capabilities to make me even consider looking at it.
mcrain
Mar 17, 01:29 PM
Ron Paul believes in term limits, but keeps running and running and running...
Oh, and Rand didn't fall far from the tree. From wiki for anyone who is curious:
Controversial claims made in Ron Paul's newsletters, written in the first person narrative, included statements such as "Boy, it sure burns me to have a national holiday for that pro-communist philanderer Martin Luther King. I voted against this outrage time and time again as a Congressman. What an infamy that Ronald Reagan approved it! We can thank him for our annual Hate Whitey Day." Along with "even in my little town of Lake Jackson, Texas, I've urged everyone in my family to know how to use a gun in self defense. For the animals are coming." Another notable statement that garnered controversy was "opinion polls consistently show only about 5% of blacks have sensible political opinions, if you have ever been robbed by a black teen-aged male, you know how unbelievably fleet-footed they can be". An issue from 1992 refers to carjacking as the "hip-hop thing to do among the urban youth who play unsuspecting whites like pianos." In an article titled "The Pink House" the newsletter wrote that "Homosexuals, not to speak of the rest of society, were far better off when social pressure forced them to hide their activities." These publications would later create political problems for Paul and he considered retiring his seat. Wiki (http://en.wikipedia.org/wiki/Ron_Paul)
He won't ever be president, and he should have resigned his seat years ago.
It's one thing to vote against pay raises; it's another to actually do something about them. It's one thing to vote against many things that you know are going to pass, and another to stand up to your party when it counts.
Rodimus Prime
Feb 27, 09:39 PM
Assuming what the guy says is true, it looks like he has some pretty strong grounds for a wrongful-termination lawsuit.
dongmin
Jul 14, 04:07 PM
A 2.66 GHz Woodcrest will probably be faster than a 2.93 GHz Conroe. A 1.83 GHz Yonah is faster than a 3.2 GHz Pentium, right? ;) I thought the two processors were identical (in a single-processor config) except that the Woodcrests have a higher FSB (1,333 MHz vs. 1,066 MHz). According to the Anandtech review, the 1,333 MHz FSB gives you only about a 3% boost in speed.
Core 2 Duo
2.13 GHz - $224 (2MB L2 cache)
2.40 GHz - $316
2.67 GHz - $530
Xeon 5100 series
2.00 GHz - $316
2.33 GHz - $455
2.66 GHz - $690
It makes more sense to go with a 2.40 GHz Conroe for a single-processor config, since it's cheaper than the 2.33 GHz Woodcrest. What I'd like to see:
GOOD
2.40 GHz Core 2 Duo - $1499
BETTER
2 x 2.00 GHz Xeon - $1999
BEST
2 x 2.67 GHz Xeon - $2799
Of course, if Apple were REALLY ambitious, they would release a mini tower using Conroes and release the Mac Pros in quad-only configs.
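The Conroe-vs-Woodcrest value argument above comes down to dollars per GHz. A quick sketch using only the list prices quoted in the post (per-clock performance differences between the two lines are ignored here):

```python
# Dollars per GHz for the chips listed above (list prices from the post).
chips = [
    ("Core 2 Duo", 2.13, 224),
    ("Core 2 Duo", 2.40, 316),
    ("Core 2 Duo", 2.67, 530),
    ("Xeon 5100",  2.00, 316),
    ("Xeon 5100",  2.33, 455),
    ("Xeon 5100",  2.66, 690),
]

def dollars_per_ghz(ghz, usd):
    """Crude value metric: list price divided by clock speed."""
    return usd / ghz

for name, ghz, usd in chips:
    print(f"{name} {ghz:.2f} GHz: ${dollars_per_ghz(ghz, usd):.0f}/GHz")
```

The 2.40 GHz Conroe ($132/GHz) undercuts the 2.33 GHz Woodcrest ($195/GHz) both in absolute price and per clock, which is the point made above; the Xeon premium buys dual-socket capability, not clock speed.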
xxavier
Aug 5, 09:31 PM
With the iSight and IR sensor rumored to be integrated into the new line of Cinema Displays, I guess Apple's gonna adopt HDMI as the I/O interface, making Apple one of the first companies to do so. Plus, with an HDMI-enabled Mac Pro, Leopard would fully support it. Why? HDMI is just like ADC, plus it's an industry-standard port. You need only one cable to have all the communications (FW + USB + sound + ...) going, without having to clutter your desktop with multiple cables. I see it coming!

CmdrLaForge
Apr 10, 02:10 AM
I am really looking forward to seeing what Apple has in store for FCP. I will decide then whether I stay with Apple or move to Adobe Production Studio. If they go too far in the direction of iMovie, I will for sure not like it.
The takeover of the SuperMeet is very nasty, and it put the organizers in a very bad position, because either way they can only lose. Other companies will think twice in the future about whether they want to sponsor it, and if Apple doesn't have anything new they won't be present.
Apple can easily hold their own event; just book that building in SF and invite some journalists, or plan in advance!!
theBB
Aug 11, 07:28 PM
Confused.
Can somebody explain to me the differences in the cellphone market between the US and Europe?
Will an 'iPhone' just be marketed in the US, or worldwide (as the iPod is)?
Well, let's see: about 20 years ago, a lot of countries in Europe, Asia, and elsewhere decided on a standard digital cell phone system and called it GSM. About 15 years ago GSM networks became quite widespread across these countries. In the meantime, the US kept on using analog cell phones. Motorola did not even believe that digital cell phones had much of a future, so it decided to stay away from this market, a decision which almost bankrupted the company.
The US started rolling out digital service only about 10 years ago. As the US government does not like to dictate to private companies how to conduct their business, they sold the spectrum and put down some basic ground rules, but for the most part they let the service providers use any network they wished. For one reason or another, these providers decided to go with about 4 different standards at first. Quite a few companies went with GSM; AT&T picked a similar but incompatible TDMA (IS-136?) standard; Nextel went with a proprietary standard they called iDEN; and Sprint and Verizon went with CDMA, a radically different standard (IS-95) designed by Qualcomm. At the time, other big companies were very skeptical, so Qualcomm had to not only develop the underlying communication standards but also manufacture the cell phones and the electronics for the cell towers. However, once the system proved itself, everybody started moving in that direction. Even the upcoming 3G system for these GSM networks, called UMTS, uses a variant of CDMA technology.
CDMA is a more complicated standard compared to GSM, but it allows the providers to cram more users into each cell, and it is supposedly cheaper to maintain and more flexible in some respects. However, anybody in that boat has to pay hefty royalties to Qualcomm, which dampens its popularity. While creating UMTS, the GSM standards bodies did everything they could to avoid using Qualcomm patents and those payments. However, I don't know how successful those efforts were.
Even though Europeans here on these forums like to gloat that the US did not join the worldwide standard, that we did not play along, and that ours is a hodgepodge of incompatible systems, without the freedom to try out different standards CDMA would not have had the opportunity to prove its feasibility and performance. In the end, the rest of the world is also reaping the benefits through UMTS/WCDMA.
Of course, not using the same standards as everybody else has its own price. The components of CDMA cell phones cost more and the system itself is more complicated, so CDMA versions of cell phones hit the market six months to a year after their GSM counterparts, if at all. The infrastructure cost of a rare system is higher as well, so AT&T had to rip apart its network and replace it with a GSM version about five years after rolling it out. Sprint is probably going to convert Nextel's system in the near future as well.
I hope this answers your question.

Snowy_River
Jul 28, 05:37 PM
That looks stunningly beautiful. Wish there were 3 or 4 card slots, though.
Well, I was trying to hit the mid-point. The PM has four, and the Mini has none, so I put in two. If I had put in a third one, I would have had to make it taller.
(Of course, I realize that both the two and the four aren't quite accurate, as the PM has one slot taken up by the video card, so it's really three, as does my M++ so it's really only got one. But a strong argument can be made that people who need more than one expansion slot should really get a full sized system...)

nick123222
Mar 26, 12:23 PM
Looks like they are going for another Snow Leopard (aka disappointingly small) release.
Not sure about what everyone else wants out of the OS, but I certainly don't want ANY of the iOS-style features they have announced. I can see Launchpad becoming another unused feature (I'm looking at you, Dashboard!) that people forget about.
I guess we'll know just how committed Apple are to the Mac after this. We already know they couldn't give a damn about the hardware side of the business any more. The final stab in the back would be Xcode for Windows.
I really do fear that within 3-5 years Apple will have a tiny Mac lineup with all focus on iOS. No more yearly OS updates, no more updates to iLife, etc. They make peanuts from it compared to the iOS income.
Do you use Stacks for accessing applications? If so, then why wouldn't you want to use Launchpad? It is like the application stack, but it makes organising apps into folders so much easier and lets you find apps more easily. Yes, you could just use Spotlight to find apps quickly, but not everyone likes doing this.
Launchpad is one of the features that I am most looking forward to for easy app management and access.
Also, I use Dashboard every day, usually to see the time on an analogue clock (I find an analogue clock easier to visualise time with), currency conversion, stickies, a translator, and iStat Pro.
faroZ06
Apr 27, 08:41 AM
I think ALL the gooses should be cooked. No one should get the free pass.. so I don't think it's wrong to call Apple out on this.
Sharing a photo is actively giving out a location, just like Foursquare, tweeting, and updating Facebook. This issue is about giving out data that is involuntary, unencrypted, and can't be turned off.
And as for the latter half of your statement - it's a dangerous/slippery slope to start being apathetic about your right to privacy. Once it's all out there - it's that much harder to get it back.
And again - there's a difference between voluntarily and involuntarily releasing of private information.
The iPhone is voluntary. You enabled location services.
DotCom2
Apr 27, 09:21 AM
If I were a criminal or a terrorist I would be upset about this data collection. Since I am not, I would rather the data be kept on my phone if it will help my GPS work better/faster.:rolleyes:
Ktulu
Aug 25, 07:40 PM
My only dealings with Apple Support was a few years ago. On Christmas day the modem on my Pismo went out. I just for a lark called to see if anyone was in and not only was someone there I was taken care of quite nicely. The next day I had a box to send it off and three days later I had it back. Not bad for a notebook that was about two weeks short of the warranty expiring.
I'm not trying to be a wise a@@, but when did Apple make a Pismo? I do remember them, but not being made by Apple. I am sorry, I don't recall the manufacturer of them at this time. :confused:
Roessnakhan
Mar 22, 12:53 PM
So what is next year the year of? Phones again, let me guess.
Yeah, probably.
basesloaded190
Apr 6, 11:03 AM
I am shocked that anyone finds this as a positive.
So you all want a drop from 1.86/2.13 to 1.4GHz CPUs in your 13" MBA? That is a 30% drop.
You obviously don't know how powerful SB actually is compared to C2D
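For what it's worth, the size of the drop quoted above depends on which current model you compare against; here is the raw arithmetic (raw clock alone also says nothing about per-clock throughput, which is the reply's point):

```python
# Clock-speed drop from the current 13" MBA C2D options to a 1.4 GHz part.
def pct_drop(old_ghz, new_ghz):
    """Percentage reduction going from old_ghz to new_ghz."""
    return (old_ghz - new_ghz) / old_ghz * 100

for old in (1.86, 2.13):
    print(f"{old} -> 1.4 GHz: {pct_drop(old, 1.4):.0f}% lower clock")
# ~25% down from the 1.86 GHz model, ~34% down from the 2.13 GHz model
```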

