Primate Labs launches Geekbench AI benchmarking software

Primate Labs has officially launched Geekbench AI, a benchmarking suite designed specifically for machine learning and AI-centric workloads.

The release of Geekbench AI 1.0 marks the culmination of years of development and collaboration with customers, partners, and the AI engineering community. The benchmark, previously known as Geekbench ML during its preview phase, has been rebranded to align with industry terminology and make its purpose clear.

Geekbench AI is now available for Windows, macOS, and Linux via the Primate Labs website, as well as on the Google Play Store and Apple App Store for mobile devices.

Primate Labs' latest benchmarking tool aims to provide a standardised methodology for measuring and comparing AI capabilities across different platforms and architectures. The benchmark takes a novel approach by providing three overall scores, reflecting the complexity and heterogeneity of AI workloads.

“Measuring performance is, put simply, really hard,” explained Primate Labs. “That’s not because it’s hard to run an arbitrary test, but because it’s hard to determine which tests are the most important for the performance you want to measure – especially across different platforms, and particularly when everyone is doing things in subtly different ways.”

The three-score system accounts for the varied precision levels and hardware optimisations found in modern AI implementations. This multi-dimensional approach allows developers, hardware vendors, and enthusiasts to gain deeper insight into a device’s AI performance across different scenarios.

A notable addition to Geekbench AI is the inclusion of accuracy measurements for each test. This feature acknowledges that AI performance isn’t solely about speed but also about the quality of results. By combining speed and accuracy metrics, Geekbench AI provides a more holistic view of AI capabilities, helping users understand the trade-offs between performance and precision.

Geekbench AI 1.0 introduces support for a wide range of AI frameworks, including OpenVINO on Linux and Windows, and vendor-specific TensorFlow Lite delegates such as Samsung ENN, ArmNN, and Qualcomm QNN on Android. This broad framework support ensures that the benchmark reflects the latest tools and methodologies used by AI developers.

The benchmark also uses more extensive and diverse datasets, which not only improve the accuracy evaluations but also better represent real-world AI use cases. All workloads in Geekbench AI 1.0 run for at least one second, allowing devices to reach their maximum performance levels during testing while still reflecting the bursty nature of real-world applications.

Primate Labs has published detailed technical descriptions of the workloads and models used in Geekbench AI 1.0, emphasising its commitment to transparency and industry-standard testing methodologies. The benchmark is integrated with the Geekbench Browser, facilitating easy cross-platform comparisons and result sharing.

The company anticipates regular updates to Geekbench AI to keep pace with market changes and emerging AI features. However, Primate Labs believes Geekbench AI has already reached a level of reliability suitable for integration into professional workflows, with major tech companies such as Samsung and Nvidia already using the benchmark.

(Image Credit: Primate Labs)
