Is it possible to use Catalyst with Snow Leopard? (OS X 10.6)
It's a bit early for that question!!
As with all new OS releases, it is best to wait for the new software to settle down and for Apple to mend the things they have broken!
Don't even go there....
Apple only released the latest build of Mac OS X 10.6 (Snow Leopard), 10A314, to developers on the 1st of April (ominous!).
An exact release date for Snow Leopard has not been established, though it could be as late as September.
Stick with the OS you are currently using!
If you still use Tiger on 2007 Macs, stick with it - there's no significant reason to upgrade unless you 'really' want some Leopard features. Everything still seems to work fine with the latest Tiger & QuickTime software updates.
If you are using Leopard on 2008/9 Macs, I have not seen any problems with all the latest OS & QuickTime updates.
I would recommend disabling Spotlight by placing your hard drives in the 'Privacy' list of its preference pane.
Users may also want to consider disabling Dashboard, keyboard shortcuts, Exposé, etc.
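For anyone who prefers the Terminal, a minimal sketch of those two tweaks (the volume path is a hypothetical example, and the Dashboard preference key applies to Leopard/Snow Leopard):

```shell
# Exclude a content volume from Spotlight indexing (equivalent to adding
# it to the Privacy list); /Volumes/Media is an example path.
sudo mdutil -i off /Volumes/Media

# Disable Dashboard so its widgets never load, then restart the Dock
# so the change takes effect.
defaults write com.apple.dashboard mcx-disabled -boolean YES
killall Dock
```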
Simon
Well, I'm testing it now and it seems good - no problems yet, but I only use it at home, not for shows.
A critical issue which might arise with Snow Leopard is OpenCL's characteristic of taking FLOPS from the GPU to assist with CPU processing. I have been using the beta and have already discovered some concerns. According to the tech notes Apple has up, these issues will be resolved when OpenCL becomes better integrated.
Something else to think about is the rumor of Snow Leopard being entirely Cocoa. Right now, there are APIs and frameworks in the beta that still use Carbon. Basically, what this means is that applications running on the new system might need to be entirely reconfigured; Catalyst would most likely need some rewriting, depending on how radically different the interface is.
I can't speak for Richard here, but my own development observation is that while Snow Leopard will "look the same" to end users, it will be radically different inside, and Catalyst won't work quite as well (yet).
Hi All
I have been playing with Snow Leopard for a few days now. My initial testing of Catalyst on a base machine (i.e. no input cards or other third-party add-ons) seems very positive. Tested using OCZ SSDs and 4870 & 285 graphics cards.
As mentioned before, everyone needs to do a lot of testing on their own systems to check compatibility with input cards and other interfaces before this goes anywhere near a show!
Nev.
No - Catalyst should work without any changes at all in my code,
and so should all previous versions of Catalyst.
If it doesn't work, Apple has broken something that needs to be fixed.
The Carbon/Cocoa API divide is misunderstood by many people - and it's not relevant here.
Better watch out for this one....
They changed the default screen gamma from 1.8 to 2.2... ho ho ho...
http://support.apple.com/kb/HT3712
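For anyone wondering why content authored under the old default looks different: displayed luminance is roughly signal^gamma, so the same mid-grey value lands darker under 2.2 than under 1.8. A quick illustrative calculation (the 0.5 test value is arbitrary):

```python
# Approximate displayed luminance of a mid-grey signal under both gammas.
# Snow Leopard moved the default from 1.8 to 2.2 (see the Apple KB link above).
signal = 0.5  # normalized pixel value, arbitrary test point

lum_18 = signal ** 1.8  # old default
lum_22 = signal ** 2.2  # new default

print(f"gamma 1.8: {lum_18:.3f}")  # ~0.287
print(f"gamma 2.2: {lum_22:.3f}")  # ~0.218 -- same content looks darker
```

So content graded on a 1.8 display will look noticeably darker and more contrasty on a stock Snow Leopard machine unless the profile is changed back.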
Also - the standard QuickTime player in Snow Leopard has the edit/save features of QuickTime Pro built in, so there is no need to buy QuickTime Pro! As part of the Snow Leopard upgrade process, the installer moves QuickTime Player 7 to the Utilities folder, so it's still there if you want it.
Nev.
Hi All
for those who have asked - some figures from my test system consisting of the following:
8-core 2.26GHz Intel Xeon 5500 series processors
6GB RAM
250GB OCZ Vertex SSD
Tested with ATI Radeon HD 4870 & EVGA GTX 285
2 x 1280x1024 monitors
Test movies created in AIC (Apple Intermediate Codec) at 720x576, 25fps
OS 10.5: 9-10 layers
OS 10.6: 19-20 layers
Nev.
Have you tested HD content playback?
Were these results the same using both graphics cards mentioned?
I am running 10.6 on my hot backup machines. They are all outperforming the 10.5.8 machines.
I have been able to run 3 layers of 1920x1080 at 29.97fps with audio, with minimal frame drops.
8 core 2.66 processors
Mtron 7500 SSDs
8800 Graphics
Phoenix 1 lane 2 HD input
I can run 2 layers of the same, plus 2 layers of 720x480 at 29.97fps at the same time, with no frame loss.
I'm on tour and testing during down time. My last leg I didn't get as much testing time as I wanted.
I also have a slow SSD which I need to zero and defrag the content on. With 100GB of show content, that is going to take a while. Once I do that I can run tests.
I'll report more on results after the weekend.
Oh, and Apple's brilliant :confused: idea of changing the gamma was a bit annoying to correct.
Do not defrag an SSD. It will only wear down the flash memory!
http://www.tomshardware.co.uk/forum/...6283_14_0.html
It's not worth it...
I would agree with this -
but maybe Todd has heard something else?
It says here:
http://www.tomshardware.co.uk/OCZ-AP...ews-30096.html
Quote:
OCZ also warns on their info page that “Solid State Drives DO NOT require defragmentation. It may decrease the lifespan of the drive.”
This is nothing to actually be overly concerned about as the theoretical re-write limits for each sector in a Solid State Drive are going to outlive the use of the drive. It is just that defragmenting (although not necessary) creates an excessive amount of write cycles on any drive. Solid State Drives are designed so that data is written evenly to all sectors – this is what the industry refers to as “Wear Leveling.” So feel free to fill your drive full of random data just so you can see how fast it defrags for kicks; you will not harm anything, but do not do it on a regular basis unless you want to lower the MTBF of the drive down to mechanical HDD standards.
In brief, I will agree: defragmenting a content SSD is not a good idea. However, here is a better explanation:
I run Disk Utility to zero the disk.
I defrag the content on a separate HD and copy it back to the SSD.
Let me clarify how I will go about this. First, I have all the content copied and verified to a backup disk. On the backup disk I run a defragmenter and a series of tools so that the content is consolidated and orderly.
Once the content is backed up, I zero the data on the SSD using one pass. Then I copy the content back to the SSD.
With this method, I only conduct two write cycles on the SSD.
The reason defragging an SSD is a bad idea is that the defragging cycle can literally write hundreds of times to a specific zone or specific sectors on the disk during the cleaning cycle.
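That write-count asymmetry can be sketched with a toy model (purely illustrative: the block count, move count, and the defragmenter's "hot region" behaviour are made-up assumptions, not measurements of any real drive):

```python
import random

BLOCKS = 1000  # flash blocks on a hypothetical drive

def zero_and_recopy():
    # One zeroing pass plus one copy-back pass: every block sees 2 writes.
    return [2] * BLOCKS

def naive_defrag(moves=5000, hot_region=100, seed=1):
    # A defragmenter shuffling fragments can hammer the same small region;
    # this toy version concentrates all its moves on the first 100 blocks.
    rng = random.Random(seed)
    wear = [0] * BLOCKS
    for _ in range(moves):
        wear[rng.randrange(hot_region)] += 1
    return wear

print(max(zero_and_recopy()))  # worst block: 2 writes
print(max(naive_defrag()))     # worst block: dozens of writes
```

The point is only the ratio: a zero-and-recopy touches every block a fixed small number of times, while an in-place defrag can concentrate many writes on a few blocks.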
Another paradigm to consider is how the SSD is used in Catalyst. Ideally, we are not constantly reading and writing like other industries. We (hopefully) write once, or very few times, and then simply read back repeatedly.
Now, for a moment, consider my specific application. I have a show where the total combined content creation added up to about 800GB of show files. Of course we threw out about 80% of that, but we didn't know what was staying or going until the show was on the road.
If I had been able to, I would have justified an Xserve RAID during preproduction, which would have fed all the servers. Since I didn't get the budget I needed for content, I had to settle for writing to and deleting from the SSDs repeatedly.
I mention the Xserve RAID here for a reason. The show uses multiple servers, and a single fast, reliable repository for content means copying only once. As it turned out, the workflow I was subject to wasted tremendous amounts of time waiting for files to copy.
Anyone who's copied a large volume of data to an SSD knows exactly what I'm talking about.
Had I used an Xserve RAID, all the content creators on site could have been connected over fibre to a single repository, with that repository being the same one I use to run rehearsals. Instead, using the Gigabit network on copper meant that each time a file was updated it had to be copied successively from machine to machine, with my content machine as an intermediary.
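To put rough numbers on the waiting (the sustained throughput figure and the three-server count are assumptions for illustration, not measurements from this show):

```python
# Back-of-envelope copy times for the relay workflow described above.
content_gb = 800      # total show content, per the post
gbe_mb_per_s = 100    # assumed realistic sustained rate over copper GbE

seconds_per_hop = content_gb * 1000 / gbe_mb_per_s
print(f"{seconds_per_hop / 3600:.1f} hours per machine-to-machine hop")

# Relaying through an intermediary to, say, 3 servers means 3 full copies:
print(f"{3 * seconds_per_hop / 3600:.1f} hours to update every server")
```

A shared fibre-attached repository collapses those successive hops into a single copy, which is the whole argument for the RAID.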
Of course some of this is beside the point but it paints a picture of how the content can get so messy and require cleanup from disk utilities.
I'm using 128GB drives in all my machines, so I really only have about a 100GB safe limit before the disks get too full. (Following the 20% open-space rule.)
Having to change content, revise files, and add new ones requires a lot of copying and a lot of deleting. This means dozens of rewrites to and from each SSD.
I have theoretically moved more than 2TB of data across each disk for this show alone, so the machines are due for a cleanup.
By zeroing the disk and rewriting the content to it, I might shorten its life, yes, but not really in comparison to using the disk as a constant read/write/rewrite device. This show has two years in front of it, so it's not something I'll be doing again soon. Zeroing one time so that the disks do nothing but reads for the next 120 shows is not a bad trade-off.
There are several other articles which talk about SSDs - specifically, that with MLC disks, adjacent memory cells influence one another, which can cause the reader to misinterpret data on the first pass if the sectors have been rewritten repeatedly. By zeroing a disk, newly written data will not be as easily compromised by adjacent-cell degradation.
For more info, search for SSD pros and cons, keeping in mind that SLCs have a greater lifespan than MLCs.
Well, to get back on topic, my first test results on Snow Leopard:
Late 2008 8-core Mac Pro
8800 GT
OCZ SSD
Cat 167
Using 720p files, all around 150MB in size: Snow Leopard allows 1-2 more layers to be run... a total of 7.
I would say that so far it is a verifiable improvement... I will test LFG cards next week if nobody beats me to it.
Thanks Anthony
I'll let ya know when I get a chance
I have a few things on my touring systems which have limited playback (locking to interlaced and exact frame rates). However, I am successfully running my backup machines on Snow Leopard.
I tried to crash things today. I caused six crashes by running the FX generator on multiple servers from a lighting desk - obviously something I would never do during a real show. Each of the crashes was on the primary Leopard systems; there was never a single crash on the Snow Leopard backup machines.
Although I don't have many regular SD test files, I loaded some of the original HES stock and was able to run 13 layers without frame drop at locked 29.97fps. I haven't tested yet at 25fps and won't with this configuration.
I am back in my office on Monday and will use some of my test servers to produce better reports. I'll be testing on one each of a 2007 Pro (Mtron & X1900), a 2008 Pro (OCZ & 8800), and a 2009 Pro (OCZ & 4870). I might also try a Quadro FX 5600 to see if there's a difference, though this is trivial for most of our applications.
It would appear, through limited testing, that the LFG cards work fine in Snow Leopard. I haven't done anything crazy, but all 8 of my inputs seem to play nice. I did notice that some of the input FX had issues, like leaving trails etc., but this resolved itself after a reboot... not sure what to think. On a side note, turning on input FX did crash my Leopard OS on the same machine... I seem to remember reading something about that. What's the deal?
I have spoken to Active Silicon, who seem quite happy with their cards running on Snow Leopard - good to have some more reports.
Nev.
Has anyone had any issues with visual effects crashing in Snow Leopard?... especially ones that have the "i" menu?
That's the little extra menu that you can click on in the visual FX library page on certain FX.
I can't get 181-228 to edit when I choose them -
at least when I go to the libraries, choose visual FX, then scroll down and hit the "i" button to bring up the extra edit controls.
There definitely are more options... sometimes it works, sometimes it doesn't, on two separate machines. I haven't had time to compare with Leopard, but, for instance, the pixelize round and square often don't work, but sometimes do, as do the extra menu options, and sometimes dinking around in the visual FX crashes the entire program.
I thought the "i" button was for making user presets and other menu options... and manipulating these effects in .167 is causing crashes.
OK, so it looks like the "i" menu works for making user presets for colour FX... I assumed it would for visual FX as well. Is anyone else having considerable crashes in visual FX in Snow Leopard?