AngryCorgi
Apr 6, 04:16 PM
Since you have no clue how the Sandy Bridge Airs will perform, I'll take your statement as FUD.
I'll give you some insight into their potential. The desktop i7-2600K's integrated GPU has been benchmarked as roughly equivalent to a 9400M in performance (assuming a similar CPU).
i7-2600K (desktop) GPU clock = 850/1350 (normal/turbo)(MHz)
i5-2410M (13" MacBook Pro base) GPU clock = 650/1200 (normal/turbo)(MHz)
i7-2620M (13" MacBook Pro upg) GPU clock = 650/1300 (normal/turbo)(MHz)
i5-2537M (theorized 11"/13" MBA) GPU clock = 350/900 (normal/turbo)(MHz)
i7-2649M (theorized 13" MBA upg) GPU clock = 500/1100 (normal/turbo)(MHz)
As you can see, none of the mobile GPUs runs quite as fast as the desktop part, but the 13" 2.7GHz upgrade CPU's comes fairly close. Now, the 2.13GHz MBA + 320M combo matched or beat the i7-2620M in 75% of the tests (and was only narrowly defeated in the other 25%). There will be some random inconsistency regardless, due to driver variances across different apps.
The issue here (and this can be seen in Core 2 vs. i5/i7 testing on the Alienware M11x) is that the Core 2 Duo very rarely gets beaten by the i5/i7 in gaming/video playback. That's because few games are single-threaded anymore, and with 2+ threads in use, the i5/i7 ULV won't raise its clock speed at all. Further, the 2.13GHz was keeping up with and beating a 2.7GHz (a 27% higher clock!) in that test, because graphics, not the CPU, are the bottleneck. Take into account that NONE of the ULV Core i options matches the MBP 13" 2.7GHz upgrade's GPU speed, and it's pretty clear that for graphics-intensive apps, the older 320M would be the way to go. For most everything else, though, the i7-2649M would overtake the Core 2 at 2.13GHz, including a lot of non-accelerated (high-CPU-overhead) video playback.
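For perspective, here's a quick back-of-the-envelope script (Python; the clocks are the figures from the list above, and it compares raw clocks only - drivers, thermals, and architecture differences are ignored):

# Compare each IGP's turbo clock to the desktop i7-2600K's (MHz figures above).
clocks = {
    "i7-2600K (desktop)": (850, 1350),
    "i5-2410M (13in MBP base)": (650, 1200),
    "i7-2620M (13in MBP upg)": (650, 1300),
    "i5-2537M (rumored MBA)": (350, 900),
    "i7-2649M (rumored MBA upg)": (500, 1100),
}
desktop_turbo = clocks["i7-2600K (desktop)"][1]
for name, (base, turbo) in clocks.items():
    print(f"{name}: {turbo / desktop_turbo:.0%} of desktop turbo clock")

# The CPU clock gap mentioned above: 2.7GHz vs. 2.13GHz.
print(f"2.7 / 2.13 = {2.7 / 2.13:.2f} (~27% higher clock)")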
Something you guys need to be wary of is the 1333MHz memory question. Apple will likely choose to run it down at 1066MHz to conserve battery life. Memory speed hikes = gratuitous battery drain.
I for one am happy Apple is growing with the modern tech, but I hold no illusions as to the benefits/drawbacks of either system.
dornoforpyros
Jul 14, 02:57 PM
Eh, I'm willing to bet they stick with the G5-type case. I mean, the MacBook is the only "new" case we've seen with the Intel transition.
jmsait19
Aug 11, 03:37 PM
I just want a cell phone that works.
All these phones today (by "all these phones" I mean the Motorolas I have had, so maybe Motorolas just suck) have this ridiculous amount of latency when you are navigating the menus, because they have to have all this fancy crap flying around. It's like phones are using the same technology from five years ago but just piling features into them until they dog down. Overall, phones today seem to suck just a bit. My Nokia 8260 was the best phone I ever had, and it was monochrome with no camera or video or stupid crap like that...
Plus it seems that my phone's ability to get reception inside a building has gotten worse over time too. I used to get good reception inside my work, but now I don't. And it's the same building.
So all in all, just give me a phone that works and functions well and I'll be happy.
AppleScruff1
Apr 20, 11:55 AM
I think this was because Woolworths (Australian supermarket giant) applied for a blanket trademark that would allow it to apply its logo to anything - especially competing electronic goods, computers, music players, and branded phones. (I'm not saying it's right, just surfacing some more details.)
P.
I think you are correct. Still ridiculous, IMHO. The Woolworths logo was a fancy W.
igator210
Apr 27, 09:04 AM
The principle of any and every cell phone is that if it can connect to a cellular network, the network knows where you are. Based on each phone's unique cellular ID, the networks know how to route incoming calls and texts to you. If they didn't know that, how the h#!! do you think you'd get any calls? Right now, sitting at my desk, Verizon knows exactly where I am (based on triangulation from the nearest cell towers). They have my unique cell ID and my account information. My dumb phone even has a GPS 911 locator on it. I dial 911, they know where I am.
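For anyone curious how that triangulation actually works, here's a rough sketch - the tower coordinates and distances are made up for illustration, and real networks use messier inputs (timing advance, signal strength), but the geometry is the same idea:

import math

def trilaterate(t1, t2, t3):
    # Each tower is (x, y, estimated distance to the phone).
    (x1, y1, r1), (x2, y2, r2), (x3, y3, r3) = t1, t2, t3
    # Subtract the first circle equation from the other two -> 2x2 linear system.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-9:
        raise ValueError("towers are collinear; position is ambiguous")
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Phone is actually at (1, 1); three towers report their distances.
print(trilaterate((0, 0, math.sqrt(2)), (4, 0, math.sqrt(10)), (0, 3, math.sqrt(5))))
# -> approximately (1.0, 1.0)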
Side story: the credit card companies know exactly where I am better than the cell companies do. Every time I swipe my credit or debit card, they know where I am. When I travel for vacation, I am very likely to get a call from my credit card company (on my cell) asking where, when, and how long I will be traveling. They know every store and every purchase I've ever made on a credit card.
milo
Jul 14, 03:04 PM
Power Supply at the top is REALLY stupid.
Why?
Eraserhead
Mar 22, 01:47 PM
With regards to Libya, without the no-fly zone there would have been a massacre, and without bombing Gaddafi's troops there isn't much hope of anything other than a stalemate, which is also far from ideal.
With the rebels on the ground, it seems highly unlikely that we'll be in Libya for years to come or anything like that.
The big difference between Libya and Iraq is that in Iraq there wasn't a large insurgency controlling a decent proportion of the country before the troops went in.
Count Blah
Apr 6, 02:24 PM
Apple are kicking arse without the competition. Do they need it at this point?
Yes, now more than ever.
andiwm2003
Apr 11, 11:36 AM
I've been eligible for an upgrade since November, and my contract ended in March, I think.
But what really matters is that my 3GS shows low battery life, and I don't know if it will hold up till next year.
Delaying the release date would suck, because many users feel they "need" to upgrade once their contract is up, and then they feel they are "cheated" if they have to stay on a contract for more than two years without upgrading.
From a marketing perspective this would be a bad move.
marksman
Apr 11, 03:17 PM
Personally, a bigger screen > Retina Display.
So a 50" SD TV is better than a 42" high-def TV?
dethmaShine
Apr 20, 12:29 PM
I'd say even the icon grid claim is reaching. The pictures shown all show the Android application drawer. The actual home screen on Galaxy S devices - what shows up after unlocking - is not the icon grid with a dock. You have to dig into the phone to get to the grid of icons, which, frankly, has again been shown to be a pretty standard phone UI. Older Palm/Sony models had "icon grid" UIs in their phones also:
http://www.mobiledia.com/reviews/sonyericsson/t610/images/front.jpg
http://www.mobileincanada.com/images/unlock/att-palm-treo-600.jpg
Let's face it, the "icon grid" has been a UI staple for quite a while now:
http://www.computerhope.com/jargon/p/progman.jpg
http://i55.tinypic.com/jzzc53.png
http://www.guidebookgallery.org/pics/gui/system/managers/filemanager/cde15solaris9.png
And did all of them have a dock too? And the page-change indicator and similarly styled icons?
People fail to understand that Apple isn't suing over the grid layout. They are suing over the entire phone, which looks just like an iPhone. Simple.
puggles
Jun 14, 07:42 PM
OK, definitely not going to RadioShack... they changed the time from 7 AM to 1 PM and are now giving out PINs that will put your name on a list, and they will call you as phones arrive at the store... definitely not guaranteed! They also seemed really desperate for my business. I'm guessing they also picked the 1 PM time so you will miss other pre-orders and be stuck with them... unless you can pre-order with both Apple and RadioShack and cancel the Apple one if RadioShack does work out?
sjo
Aug 11, 03:34 PM
Well, only about 1.25 billion out of the 6+ billion people on Earth actually have cell service, and I'd suspect only about 300 million in Europe use cell phones (internetworldstats.com estimates 291 million in Europe use the internet... I'd assume cell usage is similar).
And factor in that the US, Canada, and many of the other countries with CDMA service are among the most wealthy in the world. Those 150+ million customers are nothing to sneeze at.
Well now, you ignorant Yankee ;) Firstly, mobile phone penetration in Europe is about 99%, or maybe slightly more. You should really travel a bit to get some perspective.
And secondly, GSM has a user base of over 1 billion, while CDMA, as you said, has some 60 million users. Which one do you think would be the more interesting market for a new mobile phone manufacturer to cover? And there is really no question of "we'll see which one wins," because GSM won a long, long time ago, hands down.
840quadra
Apr 27, 09:49 AM
Incorrect - it's not tracking your direct location as you assert.
For instance, when you're visiting "Harry's Sex Shop and under the counter Heroin sales" it doesn't track that you're actually at that business.
It tracks that your phone contacted "AT&T Cellular Site 601-2L" which might be within line of sight of such a business or it might be in the surrounding neighborhood or somewhat nearby.
My own phone shows that I travel all over the Twin Cities of Minneapolis/St. Paul, since I am an IT staffer who journeys between 25 different offices dispersed all over town - and I think you would be hard pressed to find out ANYTHING from looking at that picture. It's a giant mess of dots all over town and one satellite facility southeast of town.
Anyway. Yes, an enterprising thief with access to your phone could potentially use it. But as it is, collating that data would require some smarts and effort.
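For what it's worth, here's a sketch of what that collating might look like. The file name and table layout are assumptions pulled from the public write-ups, so treat it as illustrative only:

import sqlite3

# Hypothetical peek at the much-discussed location cache (assumed schema).
db = sqlite3.connect("consolidated.db")
rows = db.execute(
    "SELECT Timestamp, Latitude, Longitude FROM CellLocation "
    "ORDER BY Timestamp DESC LIMIT 10"
)
for ts, lat, lon in rows:
    print(ts, lat, lon)  # tower-level fixes, not exact street addresses
db.close()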
You stole my map!!!
TallGuy1970
Mar 31, 04:20 PM
Maybe, just maybe, Steve Jobs knows a bit about computing. You may not like his business model, but the man isn't stupid.
wmmk
Aug 20, 01:04 AM
Anyone ever check and see if QuickTime was Universal?
If I'm not mistaken, it's been Universal since OS X for Intel was released.
xStep
Apr 10, 04:58 AM
I'm a little confused... why was Avid presenting at a Final Cut Pro User Group meeting anyway? Do they just come in and say, "Hey, you've all made a mistake!" or something?
No, they come in and professionally present their product like they would do for any audience, as personally seen at an LAFCPUG (http://www.lafcpug.org/) meeting.
Michael Horton, who runs LAFCPUG and is one of the main organizers of the Supermeet, has the attitude that editors should be aware of all the tools available, including the competition to FCP. Also remember that not all people are tied to one tool.
The speculation of how Apple got into the meeting is humorous. Hopefully Michael will eventually give up some information.
Apple can easily hold their own event - just book that building in SF and invite some journalists, or plan in advance!!
Giving an in-depth presentation to this FCP-centric audience will likely get Apple much more buzz in the editing community than a standard announcement in front of journalists.
xPismo
Jul 14, 11:36 PM
s/apple/Intel/wh
s/mac community/all the Intel vendors/wh
You have been assimilated.
Apple == Dell == IBM == Gateway == Lenovo == ...
Apple existed pre-RISC, and they will exist post-IBM chips. Your fears are unfounded. Well-engineered hardware with well-engineered software, add a dash of the SJ RDF, and things will stay groooooovy.
fastlane1588
Jul 27, 12:19 PM
That's a pretty cool concept, I must say.
iawait
Apr 11, 10:01 PM
I just don't think I can wait and that is SO irritating I may have to jump ship!
Newton memories :mad:
sierra oscar
Sep 19, 09:54 AM
The tone has not been warm to this point. Read the first few pages of the posts. There was a lot of Apple-blasting on pretty silly grounds. It's not like it's months and months later (a pattern we used to have with Apple all the time). It's a matter of a couple weeks -- MAX. Like I said, you and others can wait if you want. Heck, I have a MB and a MBP and am probably going to sell the MBP soon and wait for a revision myself. But the implication that many posts had, such as that the world was coming to an end, was pretty darn ridiculous.
I don't really understand... are you saying that antisocial behavioural traits should be encouraged?
barkomatic
Mar 31, 03:58 PM
At a glance your statement sounds fine. But the same logic can be used to justify statements like the following:
1. I don't care what the US does to the rest of the world as long as I, as an American, can live a nice, prosperous life.
But I digress...
You're comparing a phone or a tablet to U.S. foreign policy? I'm sorry, I don't think gadgets are as important as that but apparently you do. I think you need a check on your perspective.
JimEJr
Apr 27, 10:30 AM
It's not about being a criminal or paranoid. This data exists for the sole purpose of letting marketers sell us crap.
Well, I'm tired of seeing ads everywhere I turn. You can't go to the bathroom now without seeing an ad shoved in your face, and it's becoming tiresome.
Well, Fry could have added our iPads and our phones too. It's disgusting already how much advertising has infiltrated our lives. You can't even read a news story on the internet without an ad being intrusively shoved in your face.
Well then shut your eyes and plug your ears... or kiss your content (aka what you DO want) goodbye, as those ads are what pays for you to enjoy that news story you refer to and most anything else that is free or cheaper than it would be without ads. You can't have it both ways. Want all bloggers, media, etc. to do everything without ads AND without a charge? You try running a biz that way... see how long you'll be able to pay your bills.
In reality, the more data advertisers have about you, the better they will be able to put forth ads that are much more relevant to you. If we're going to have ads, might as well have them be for something of genuine interest to each one of us.
shamino
Jul 14, 05:26 PM
Kind of odd/funny how we seem to be going backwards in processor speeds. Instead of 3.6 GHz Pentiums, we are looking at 2.x GHz Intel Cores. It would be interesting to see how well a single Core processor matches up to PowerPC, or a Pentium, or AMD.
It just means that Intel has finally publicly recognized the validity of the MHz Myth.
Raw clock speed is meaningless. You can get better performance at a slower clock speed if you can increase parallelism. This includes features like superscalar architecture (where multiple instructions are executed per clock), deep pipelining, hyperthreading, SIMD instructions, and multi-core chips.
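To put rough numbers on that, here's a toy throughput model - the figures are my own illustrative assumptions, not benchmarks:

# Relative throughput ~ cores * instructions-per-clock * clock (GHz).
def throughput(cores, ipc, ghz):
    return cores * ipc * ghz

p4_style = throughput(cores=1, ipc=1.0, ghz=3.6)    # deep pipeline, high clock
core_style = throughput(cores=2, ipc=2.0, ghz=2.4)  # wider core, two of them

print(f"P4-style chip:   {p4_style:.1f}")   # 3.6
print(f"Core-style chip: {core_style:.1f}") # 9.6 - faster despite the lower clock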
However, I am finding one of my predictions finally coming true... it appears that a ceiling has been hit on how fast the current line of processors can go, and now we are relying on multiple cores/processors to distribute work, instead of relying on just one fast chip.
That's a part of the equation, but not all of it.
Higher clock speeds are possible, but it's not worth the effort. Pumping up the clock speed creates serious problems in terms of power consumption and heat dissipation. Leaving the clock speed lower, but increasing parallelism will also boost performance, and keeps the power curve down at manageable levels.
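The back-of-the-envelope reason: dynamic power scales roughly as capacitance * voltage^2 * frequency, and higher clocks usually demand higher voltage, so power climbs faster than performance. Illustrative numbers only:

# Dynamic power ~ C * V^2 * f. The voltage/clock figures below are made up.
def dynamic_power(cap, volts, freq_ghz):
    return cap * volts**2 * freq_ghz

base = dynamic_power(1.0, 1.2, 2.4)
clocked_up = dynamic_power(1.0, 1.4, 3.6)  # +50% clock needs more voltage
dual_core = 2 * base                       # second core at the same clock

print(f"+50% clock:  {clocked_up / base:.2f}x power for ~1.5x performance")
print(f"second core: {dual_core / base:.2f}x power for ~2.0x performance")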
It's worth noting that Intel has shipped P4-series chips at 3.4GHz. But the new chips (Woodcrest and Conroe) aren't being sold at speeds above 3GHz.
So when will we start seeing 8 chips in a computer? Perhaps this will become the new measurement...not processor speeds, but the number of processors (or cores).
Pay attention. The answer is "sooner than you think".
There have already been technology briefings from Intel that talk about 4-core chips in early 2007 and 32-core chips by 2010. Similar offerings are expected from AMD.
And the Xeon-MP series processors (which will, of course, eventually get all this tech) are designed with 8-way SMP in mind. A theoretical Xeon-MP based on this 32-core tech would produce a system with 256 cores. Of course, it is doubtful that anything other than a large server would be able to take proper advantage of this, so I wouldn't ever expect to find one on a desktop.
(FWIW, Intel is looking to Sun as a rival here. Sun's latest chip - the UltraSPARC T1 (http://www.sun.com/processors/UltraSPARC-T1/) - currently ships in an 8-core configuration, with each core capable of running four threads at a time, and only consuming 72W of power. Even at 1.2GHz - the top speed they're currently shipping at - this makes for a very nice server.)