Primate Labs launches Geekbench AI benchmarking tool

Primate Labs has officially launched Geekbench AI, a benchmarking tool designed specifically for machine learning and AI-centric workloads.

The release of Geekbench AI 1.0 marks the culmination of years of development and collaboration with customers, partners, and the AI engineering community. The benchmark, previously known as Geekbench ML during its preview phase, has been rebranded to align with industry terminology and ensure clarity about its purpose.

Geekbench AI is now available for Windows, macOS, and Linux via the Primate Labs website, as well as on the Google Play Store and the Apple App Store for mobile devices.

Primate Labs’ latest benchmarking tool aims to provide a standardised method for measuring and comparing AI capabilities across different platforms and architectures. The benchmark takes a distinctive approach by providing three overall scores, reflecting the complexity and heterogeneity of AI workloads.

“Measuring performance is, put simply, really hard,” explained Primate Labs. “That’s not because it’s hard to run an arbitrary test, but because it’s hard to determine which tests are the most important for the performance you want to measure – especially across different platforms, and particularly when everyone is doing things in subtly different ways.”

The three-score system accounts for the varying precision levels and hardware optimisations found in modern AI implementations. This multi-dimensional approach allows developers, hardware vendors, and enthusiasts to gain deeper insights into a device’s AI performance across different scenarios.

A notable addition to Geekbench AI is the inclusion of accuracy measurements for every test. This feature recognises that AI performance isn’t solely about speed but also about the quality of results. By combining speed and accuracy metrics, Geekbench AI offers a more holistic view of AI capabilities, helping users understand the trade-offs between performance and precision.

Geekbench AI 1.0 introduces support for a range of AI frameworks, including OpenVINO on Linux and Windows, and vendor-specific TensorFlow Lite delegates such as Samsung ENN, ArmNN, and Qualcomm QNN on Android. This broad framework support ensures that the benchmark reflects the latest tools and methodologies used by AI developers.
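For context, the sketch below illustrates how a TensorFlow Lite workload can be pointed at a vendor-specific delegate of the kind named above; it is not Geekbench AI’s own code, and the model path and delegate library name are placeholders.

```python
# Illustrative sketch only: runs a TensorFlow Lite model through a vendor
# delegate when one is available, falling back to the default CPU kernels.
# MODEL_PATH and DELEGATE_LIB are placeholders, not Geekbench AI internals.
import numpy as np
import tensorflow as tf

MODEL_PATH = "model.tflite"               # placeholder model file
DELEGATE_LIB = "libQnnTFLiteDelegate.so"  # placeholder vendor delegate library

try:
    delegates = [tf.lite.experimental.load_delegate(DELEGATE_LIB)]
except (ValueError, OSError):
    delegates = []  # delegate not present on this machine

interpreter = tf.lite.Interpreter(model_path=MODEL_PATH,
                                  experimental_delegates=delegates)
interpreter.allocate_tensors()

# Feed a dummy input matching the model's expected shape and dtype.
inp = interpreter.get_input_details()[0]
interpreter.set_tensor(inp["index"], np.zeros(inp["shape"], dtype=inp["dtype"]))
interpreter.invoke()

out = interpreter.get_output_details()[0]
print(interpreter.get_tensor(out["index"]).shape)
```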

The benchmark also utilises more extensive and diverse datasets, which not only improve the accuracy evaluations but also better represent real-world AI use cases. All workloads in Geekbench AI 1.0 run for at least one second, allowing devices to reach their maximum performance levels during testing while still reflecting the bursty nature of real-world applications.

Primate Labs has published detailed technical descriptions of the workloads and models used in Geekbench AI 1.0, emphasising its commitment to transparency and industry-standard testing methodologies. The benchmark is integrated with the Geekbench Browser, facilitating easy cross-platform comparisons and result sharing.

The company anticipates regular updates to Geekbench AI to keep pace with market changes and emerging AI features. Nevertheless, Primate Labs believes that Geekbench AI has already reached a level of reliability that makes it suitable for integration into professional workflows, with major tech companies such as Samsung and Nvidia already utilising the benchmark.

(Image Credit: Primate Labs)

See also: xAI unveils Grok-2 to challenge the AI hierarchy

