
Apple Event announced for October 30th: 'Scary Fast'

I liked this video. It has no benchmarks or thermal testing. It's by a professional photographer who has used his M1 Pro MBP every day for two years. He switched from Windows not too long ago, and this was his first Mac ever. Now he's just bought an M3 Max MBP, and, "spoiler alert," he plans on keeping his new M3 despite what the thumbnail says.

 
Can you say a bit more about the perfection of the argument?
I have better frame rates in the one game I play on AS than I did on my Intel Hackintosh or even on Windows. I have better frame rates in Twinmotion on AS than I did on the Hack or in Windows. And while the AS build of Unreal is a cut-down version of Unreal, it has all the tools that I need for what I use it for. With Game Mode added, the software performs even better. Keep in mind that Twinmotion/Unreal were Intel Mac apps running under Rosetta. Epic has released an AS version of Unreal, and hopefully there will be a Twinmotion version soon. With the addition of ray tracing in the M3, the performance should only improve.

All that said, my Windows laptop, which is admittedly a few years old now, would not be able to do this work for very long without being plugged in. The AS machine, as long as I turn the screen brightness down and don't run on ultra settings, gives me about 3 hours of work.

Could it just be a placebo effect? Sure, but the 15-20 FPS in Twinmotion and the 40 or so FPS in World of Warcraft are a pretty good placebo if you ask me.
 
Re Fstoppers, my reading of that review was that someone buys a device, then next year buys another one, then argues with himself in front of an online audience about whether he got his money's worth on either of them, in exchange for ad share... He can't lose with circular reasoning about "value".

As for year-over-year benchmark increases and marketing increments like M1, M2, M3 / 12th, 13th, 14th... this is a hype cycle built around a pipeline of progress.

I prefer to keep the long-term trends in mind. To repeat myself from a previous post: a 2023 iPhone can do what a 2013 top-of-the-line Mac Pro was designed to do. And it does it at 1/5th the price, while fitting in your pocket and running on a battery. And BTW, it's got a 4K camera with a range of optics built in, just in case you care. And you can pipe your video around the world from wherever you are when you're done. Oh, and it's a phone.

Correspondingly, if there's an app that doesn't realize obvious generational perf gains but does respond to pure compute scaling, such as what seems to be portrayed for Logic Pro, then questions emerge about the details of that app: the design, the toolchain, and the underlying machine. This makes that app interesting. I mean, isn't it good news that Logic Pro users continue to get leading performance even on older designs? If something is done as well as it can be, is that anything to complain about? Or should the Logic Pro customer feel dismayed that the developers are incapable of realizing performance gains on evolving HW? There's no obvious conclusion about the devices that run the app: it has to be studied.

I read a recent post at MacRumors by a developer who lamented that his mainline work is with Docker containers and everything seems limited to the E-cores, so maybe he should get a Mac with more E-cores. But his lament said nothing about the key trait of running containers, which is to limit the scope of the underlying HW allocated to the task. Maybe it's never occurred to him to read the documentation on controls for HW allocation. Idk. But I do know that the whole point of virtualization is to abstract away the underlying details, so why isn't he overjoyed?!
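
To make those "controls for HW allocation" concrete, here's a minimal sketch using the docker-py SDK; the image name and limits are illustrative, not from that post, and the same knobs exist on the CLI as `--cpus`, `--cpuset-cpus`, and `--memory`. One caveat: on macOS these limits apply inside Docker Desktop's Linux VM, and which physical P- or E-cores the VM's threads land on is ultimately up to the macOS scheduler.

```python
# Minimal sketch, assuming the docker-py SDK and a running Docker daemon.
import docker

client = docker.from_env()

# Cap the container at the equivalent of 4 CPUs (same as `docker run --cpus=4`),
# pin it to cores 0-3 of the VM/host, and cap its memory at 4 GB.
output = client.containers.run(
    "alpine:latest",            # illustrative image
    "nproc",                    # prints how many CPUs the container sees
    nano_cpus=4_000_000_000,    # 4e9 nano-CPUs = 4 CPUs' worth of time
    cpuset_cpus="0-3",          # restrict to specific cores
    mem_limit="4g",
    remove=True,
)
print(output.decode())          # expect "4" with the cpuset above
```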

These strange loops of thinking are everywhere. And everywhere appearances are becoming proxies for reality.

There's a recent trend in cinematography to shoot at super high resolution, relax the details of composition on set, and then frame later in post. Why such displacement of work is valuable is open to question, but the natural evolution of this will be to render the entire world inside a machine and then let an AI create the movie, maybe dropping some actors in there to cross the uncanny valley. Big steps have already been taken in this direction.

 
Btw, I use Docker on macOS, but honestly, Docker runs a lot better on Linux; on macOS it has to run every container inside a Linux VM anyway.
 
"Shot on iPhone" plus $10 million of other stuff.

Apple is so far off the mark on this PR...

They build a 4K broadcast studio into a pack of cigarettes, then show it off caged in a completely artificial world that could just as well have been shot in tri-camera Cinerama.

 
"Shot on iPhone" plus $10 million of other stuff.

Yes, but even with the "$10M of other stuff," it would have been impossible with an iPhone from just a few years ago.
 
These strange loops of thinking are everywhere. And everywhere appearances are becoming proxies for reality.

There's a recent trend in cinematography to shoot at super high resolution, relax the details of composition on set, and then frame later in post. Why such displacement of work is valuable is open to question, but the natural evolution of this will be to render the entire world inside a machine and then let an AI create the movie, maybe dropping some actors in there to cross the uncanny valley. Big steps have already been taken in this direction.

I am going to cherry-pick a bit here and just say this sentiment is exactly how they created The Mandalorian on Disney+, and I can't imagine they did it much differently for the other Star Wars series. Similarly, the Sphere in Las Vegas is designed to immerse you in the experience.
 