
pkson
Apr 5, 11:43 PM
I wonder if they're gonna add (slightly useless) stuff from iMovie like face recognition (It's a great idea, but it takes too long to go through all the clips..)
I hope FCP is just awesome beyond comprehension.

wovel
Mar 31, 03:20 PM
This is a smart move. It had to happen sooner or later.
John Gruber would eat Steve Job's ***** if he could. His opinion is extremely biased.
Yet what he said is 100% accurate..Weird how that can happen sometimes.
Except... he's right. This was a bait-and-switch from Google. I don't think it was a bad move for the future of the platform, but it does render a lot of their PR commentary through history as bogus. As for Gruber, you clearly don't like him, but while he is certainly a fan of Apple he is usually correct.
Despite what the fandroids think, the Android ecosystem is in a world of hurt. Fragmentation is a much bigger problem than even Jobs said, and they have almost no market at all for paid applications today. They will continue to dominate the worthless bottom of the market and nothing else if they do not do something to rein in these manufacturers.

dissdnt
Jun 9, 02:59 PM
Went down to Radio Shack today. They are doing a trade-in, but you're never gonna get the max price they offer unless it's out-of-the-box new. My 3GS has normal wear over the year, so they will only give $230 for it.
And I have a feeling trading in prices will drop when the iPhone 4 drops.

Machead III
Sep 19, 08:05 AM
Engadget have the situation nailed. (http://www.engadget.com/2006/09/19/so-where-the-hell-are-our-core-2-duo-macbooks/)

SiliconAddict
Jul 27, 03:29 PM
this blog was also written by Jason O'Grady, aka the PowerPage rumor site. his writing means nothing to me.
++
99.998% of what is posted on PowerPage is garbage. I love the crap about how he's against a buttonless iPod because touching the screen would scratch it. o.O No Jason....touching the screen would smudge it. Unless your hands are as hard as sandpaper....anyways. PP is pretty much crap.
Nice news from intel, good for WWDC ...
... Apple will probably announce right before, since SJ said long ago no hard announcements at WWDC.
Of course he reverses A LOT :eek:
BE PREPARED for the NASTY NEWS THAT COULD COME AT ANY TIME :eek: :eek:
MS will announce that they are dropping Mac development :eek: :eek: :eek: :mad:
DON'T YOU GET IT ...
... this is the plan: Bill G., the NICE GUY, leaves MS with his CLOD BULLDOG in charge AND YOU GET WHAT YOU GET, Ballmer cuts the Mac division and probably a TON of other jobs too and then they announce their MP3 player and all sorts of services !!!
This way their player looks more credible than Apple's, for a while anyway !!
If Apple was smart after all they would have hired me to test their spreadsheet app a long time ago, I am after all, the SPREADSHEET GOD :cool:
Dude. One word.....Decaf. :rolleyes:

2IS
Apr 7, 12:26 AM
I'm getting tired of Apple Macs being INTEL's BIATCH!
Integrated graphics on a laptop costing THAT MUCH? PLEASE!
Steve Jobs should threaten to switch to AMD/ATI solutions, even if just for leverage with Intel, to get discrete graphics chips in these machines.
If this is true, this is a pathetic technology compromise in my opinion.
lol... You really think Intel is the reason Apple laptops cost what they do? Really?

mikemac11
Mar 26, 01:29 AM
This post made me laugh. As a developer who is actively testing and reporting bugs, I can tell you that without a doubt this is 100% false. My dozens of bug reports, combined with a lot of different discussions happening in the developer forums, are a pretty clear indicator they have a while to go.
Side note: Really? Techcrunch?

fivepoint
Apr 27, 03:25 PM
I'd be fascinated to know exactly what you did to "discover" those layers, 5P. I have Photoshop and Illustrator too. Guess what? One layer. Nothing selectable. At least one of us is talking complete bollocks.
Open the file in Illustrator, use the white arrow (not black) tool, and the individual layers or objects will be individually selectable. If you look at the word 'none' in the center of the document, for example, you'll see that part of the word is darker than the other; one part is on one layer, the other is separate. I just don't understand how this would normally happen on a simple scanned PDF.
Like I said... Computer / operator fail @ OCR usage.
:rolleyes:
EDIT: although I do have to issue another "rolleyes" face at the people who dismiss 5P because "they tried but saw no layers".
I tried, I discovered layers.
Fact: There are "layers" if you can even call them that.
Another Fact: They mean nothing.
You're probably right... it's probably some type of OCR epic fail.
Also, it's not a fact. I'm a liar, you're a liar, if you don't think there's only one layer, you're a liar.
Since the messenger (you) has expressed huge distaste for Obama on almost a daily basis, I'd say my assumptions are fair.
Yes, I think Obama is a horrible president. That doesn't mean he was born in Kenya. Enough with the overly dramatic defense mechanisms. Just because you love the guy doesn't mean you get to live in a fairytale world where he has no flaws, or he can't be questioned or criticized in the least. Why not focus on figuring out why the document is weird so we can all move on!?!? Do you just have fun laying down baseless attacks for no reason instead? It's a simple question - aimed at graphic artists who know what they're talking about (not you) - so why even discuss it other than to disrupt this issue, misdirect the conversation, and accuse me of lying?
He didn't discover anything, he just bought in to the reactionary right wing propaganda spreading like wildfire on the internet.
If I had 'bought into it' I would have been on here saying, "look, look, it's a fake! He's not a citizen! Here's proof!". To the contrary, I said from the very beginning that there was likely a simple explanation and that I wanted to hear such an explanation which I think MattSepta (unlike the rest of you) has begun to offer. Are there any other expert opinions out there on this issue? I had hoped this issue would be laid to rest at this point, I almost think it's going to get worse based on what I'm seeing out there. :(
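For anyone who would rather not squint at this in Illustrator: a PDF that genuinely contains layers declares them as Optional Content Groups (OCGs) in its catalog under /OCProperties. A crude byte-level sniff can check for that marker; this is a heuristic sketch, not a real PDF parser (compressed object streams can hide these strings, so a negative result isn't conclusive):

```python
import re

def pdf_declares_layers(pdf_bytes: bytes) -> bool:
    """Heuristic: does this PDF declare Optional Content Groups ("layers")?

    Looks for the /OCProperties catalog entry or an /OCG object type in
    the raw bytes. Object streams may compress these away, so False only
    means "no marker found in plain view".
    """
    return (b"/OCProperties" in pdf_bytes
            or re.search(rb"/Type\s*/OCG\b", pdf_bytes) is not None)

# Demo on inline byte strings standing in for real files:
layered = b"%PDF-1.5 <</Type /Catalog /OCProperties <<>> >>"
flat = b"%PDF-1.4 <</Type /Catalog /Pages 2 0 R>>"
print(pdf_declares_layers(layered))  # True
print(pdf_declares_layers(flat))     # False
```

Note that multiple selectable objects in Illustrator's Direct Selection tool are not necessarily PDF layers: any OCR pass that overlays recognized text on top of a scanned image will produce separate content objects, which fits the "OCR fail" explanation offered above.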

janstett
Sep 15, 08:26 AM
And of course, NT started as a reimplementation of VMS for a failed Intel RISC CPU...
More pedantic details for those who are interested... :)
NT actually started as OS/2 3.0. Its lead architect was OS guru Dave Cutler, who is famous for architecting VMS for DEC, and naturally its design influenced NT. And the N-10 (Where "NT" comes from, "N" "T"en) Intel RISC processor was never intended to be a mainstream product; Dave Cutler insisted on the development team NOT using an X86 processor to make sure they would have no excuse to fall back on legacy code or thought. In fact, the N-10 build that was the default work environment for the team was never intended to leave the Microsoft campus. NT over its life has run on X86, DEC Alpha, MIPS, PowerPC, Itanium, and x64.
IBM and Microsoft worked together on OS/2 1.0 from 1985-1989. Much maligned, it did suck because it was targeted for the 286 not the 386, but it did break new ground -- preemptive multitasking and an advanced GUI (Presentation Manager). By 1989 they wanted to move on to something that would take advantage of the 386's 32-bit architecture, flat memory model, and virtual machine support. Simultaneously they started OS/2 2.0 (extend the current 16-bit code to a 16/32-bit hybrid) and OS/2 3.0 (a ground-up, platform-independent version). When Windows 3.0 took off in 1990, Microsoft had second thoughts and eventually broke with IBM. OS/2 3.0 became Windows NT -- in the first days of the split, NT still had OS/2 Presentation Manager APIs for its GUI. They ripped it out and created the Win32 APIs. That's also why to this day NT/2K/XP support OS/2 command-line applications, and there was also a little-known GUI pack that would support OS/2 1.x GUI applications.

dgree03
Mar 31, 02:39 PM
I've been wanting to say this for a very long time. Google's OS has no advantage over iOS. You could even say it has a disadvantage. Having to create a vanilla code base that needs to function on multiple pieces of hardware is complex, and more complexity creates a weaker system.
But here's my point. The ONLY ONLY reason why Android market share is anywhere near what it is today is because of the Buy One Get One options at most phone retailers. iOS has NEVER done that and hopefully never will. If you didn't care about the phone or service but needed two "Newer Smart Phones" one for you and one for your wife, why not go with the "Blah Blah" model from Verizon where if I buy one today I get the second for free (two year agreement and activation fees required).
Market share means nothing. This platform is doomed unless Google reins it in and gets control over it. If they do, providers will be less willing to work with them; if they don't, bye bye Android.
My Two Cents.
-LanPhantom
iPhones are sold BOGO and even just free on contract over in other countries.
Android has quite a few advantages over iOS. And as it stands right now, Android is still as "open" as it was one minute before this article got posted.
Nice try.

fehhkk
Apr 27, 09:47 AM
If they're not tracking location, why would the new update purge the location database when it's turned off... :p

Sirmausalot
Apr 10, 11:42 AM
I think the studio concept, as we know it, will be gone. It will all be one truly integrated application. Most importantly, full audio editing will be integrated, obviating the need for OMFs and conforms for the person who does all of their own work.
This will include a powerful titling tool, Motion graphics, compression, sound. There shouldn't be a need to launch an external application. Integrated Internet delivery will be comprehensive to social media, iDevices, and anything in the cloud.
DVD Studio Pro will get a full overhaul and fully support The Bag of Hurt Blu-ray -- on an external burner for the new iMacs, which will also be announced. Again, physical media gets an external treatment and the application will be the separate stepchild of the newly integrated Final Studio.

chasemac
Aug 7, 05:46 PM
can't believe only 8 people voted for 64-bit, it's the most profound change here.... all the others you can achieve with some 3rd party software.
Same here. To me it is one of the most significant upgrades of all of them.

Soba
Jul 28, 01:02 PM
you can't make a statement like that. that's like saying "i hate general electric air conditioners." what the heck? all CPUs (and air conditioners) do the same thing.
I'm not sure if this was intended as some kind of throwaway comment or not, but this is not even remotely true.
The original poster said he hated the P4, and honestly, the P4 was a lousy chip design from day 1. The original Pentium 4 chips released about 5 1/2 years ago were outperformed in some instances by an original Pentium chip running at 166MHz. The Pentium 4 was an awful architecture in many respects that simply could not be cleaned up enough to be viable; that would be why Intel abandoned it and based its current designs on the Pentium Pro's core (which was really a very decent server chip in the nineties).
When Apple announced last year they were going with Intel, a lot of people agreed it was a good choice based on the current state of the PowerPC architecture and based on Intel's planned chip designs. Personally, I was a bit unsure at the time, but was optimistic about the switch and figured we could scarcely do much worse than sticking with the G5, which was languishing. Turning back the clock a bit: if, instead of releasing the G5, Apple had announced a switch to Intel, I would have thought they were crazy. Intel's chips were awful at that time and there wasn't much of a light at the end of the tunnel, either.
CPUs can be very, very different even if the overall system architecture is similar. And I side with the original poster. The P4 was a dog, and thankfully it is about to be buried forever.

KnightWRX
Apr 12, 07:02 PM
The coverage and cost obviously.
Because if Apple releases an iPhone 5 with LTE, it will cost more and won't be backwards compatible... right... :rolleyes:
Obviously not a factor.

alexpaul
Mar 23, 05:19 AM
The features look pretty cool for this price tag, but what about the apps? If it supports only BB App World, then for sure they won't win!

samcraig
Apr 27, 08:51 AM
Ok then show me where it says that turning location services off will not stop the tracking. I've scanned the articles and did not find anything that said that. If it does still track when you turn it off, I'd like to know.
http://online.wsj.com/article/SB10001424052748704123204576283580249161342.html

Zadillo
Aug 7, 03:35 PM
anyone else a little underwhelmed with today's WWDC? There isn't anything that really jumped out at me besides the Mac Pro.
I don't know what there is to be underwhelmed about; the rumor has basically been that the main things being covered here would be the Mac Pro (which exceeded my expectations) and the first real glimpse at Leopard (which looks very cool from what I've seen). I didn't find either the Mac Pro or Leopard to be underwhelming, so I don't see anything that would make me feel underwhelmed.
I guess I would be underwhelmed if I had mistaken WWDC for Macworld or something, and expected a bunch of major new product announcements.

BlizzardBomb
Jul 27, 10:22 AM
Well there's always going to be some die-hard PPC and Core Duo users who will vote negative on this story :p
Well Apple, get those Core 2 Duos in the iMacs and MacBook Pros, and a Woodcrest... No... 2 Woodcrests in the Mac Pros.
PhantomPumpkin
Apr 27, 10:23 AM
Maybe that's what you heard.
I heard that the database couldn't be user purged (easily)
That the database kept data from day one
and that Location services being turned off didn't change the recording of the data.
Apple fans were "more correct". Wow. Ok - if you say so.... and if it helps you sleep at night
I'm still confused how you think the "hype" was correct then.
Your points don't even support it.
As was said before, this was way overblown.
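For context, the file at the center of this story was a SQLite database (consolidated.db) copied into desktop backups, and inspecting it takes only a few lines. The table and column names below follow the schema reported publicly at the time -- they are assumptions here and varied by iOS version -- and the snippet builds an in-memory stand-in so it runs without a real backup:

```python
import sqlite3

# In-memory stand-in for the reported consolidated.db schema, so the
# query below runs without an actual device backup.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE CellLocation (MCC INT, MNC INT, Timestamp REAL, "
    "Latitude REAL, Longitude REAL)"
)
conn.executemany(
    "INSERT INTO CellLocation (Timestamp, Latitude, Longitude) VALUES (?, ?, ?)",
    [(315532800.0, 37.33, -122.03), (318124800.0, 40.71, -74.01)],
)

# The question everyone was asking: how many fixes, over what time span?
count, first, last = conn.execute(
    "SELECT COUNT(*), MIN(Timestamp), MAX(Timestamp) FROM CellLocation"
).fetchone()
print(count, first, last)
```

The timestamps in the real file were reportedly seconds since 2001-01-01 (Apple's reference epoch), which is why the raw values look odd next to Unix timestamps.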
oregonmac
Nov 29, 01:11 PM
see http://www.tunecore.com/
Universal is simply increasing the rate of their own demise. And why do they think artists find them necessary?
moebius
Mar 22, 08:36 PM
Probably someone mentioned before, but "a tablet for professionals" named PLAYbook?
I smell an identity crisis.
28monkeys
Apr 7, 10:32 PM
Obviously you know little about retail and accounting.
Obviously you know nothing about retail.
MacsRgr8
Jul 20, 02:28 PM
Have you ever owned a machine that hasn't been CPU bound? I know I haven't.
Probably Single CPU bound....
It will be gr8 being able to get 8 cores in a Mac, but if the software doesn't use it....
Someone already mentioned that it also gives you the possibility to use those cores by running many apps at once. This is true, but I wonder how often you will actually use all those cores at once.
Let's hope the "opposite of Hyperthreading" will come along (Leopard feature???).. So, instead of "emulating" a Dual Core / CPU config (like on later Pentium 4's), emulate a Single CPU on multiple cores. :cool:
Then, you get 8 * 3 GHz = 1 * 24 GHz...!!!
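The skepticism in that post is well placed: eight 3 GHz cores don't behave like one 24 GHz core, because any serial fraction of the work caps the speedup. Amdahl's law makes this concrete; a quick sketch:

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Upper bound on speedup when only part of a workload parallelizes."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Perfectly parallel work scales with core count...
print(amdahl_speedup(1.0, 8))   # 8.0
# ...but with just 25% of the work stuck serial, 8 cores give under 3x.
print(round(amdahl_speedup(0.75, 8), 2))  # 2.91
```

So even the hoped-for "reverse Hyper-Threading" would hit the same wall: fusing cores into one logical CPU can't get past the serial dependencies in a single instruction stream.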




