Bandsaw Automated Portioning Saw System with Visual Feedback and Method of Use

Patent Number: US 12356997
Grant Date: 2025-07-15
Filing Date: 2023-01-17

Overview

This patent describes an automated bandsaw system that couples machine vision with motion control to classify end cuts and compute precise cutting depths for portioning whole loins. Filed on January 17, 2023 and granted on July 15, 2025, the claims cover a controller-driven device application that ingests images from multiple visual sensors to locate bone configurations, characterize meat-portion geometry, calculate uncut length and volume, and select cutting planes and depths according to operator preferences. Inventors based in Amarillo, TX worked with Midwest Machine, LLC to secure protection for an integrated arrangement of tray-and-pusher positioning, conveyor handling, and bandsaw actuation tied to on-device and networked computing. The drafting emphasizes a modular control architecture, image-based classification of T-bone versus porterhouse sections, and preference-driven thickness selection, elements that reduce manual sorting, raise throughput, and improve yield consistency. The specification also contemplates IR/thermal sensing, scale-based weight inputs, and cloud-connected data storage for recipe and analytics support.

Key Features

  • Dual visual sensors (overview + end-slice) for classification and measurement
  • Device application that computes cutting depth from image, weight, and geometry
  • Meat positioning assembly with tray and pusher for precise bandsaw engagement
  • Operator preference inputs and networked data/storage options

This patent underscores the firm’s technical depth in drafting claims that bridge mechanical design and computer-vision control. The invention advances automation in meat processing, improving repeatability and operational efficiency across commercial cutting operations and related food‑processing systems.

Invention Details

Abstract: An automated saw wherein: the automated saw comprises one or more visual sensors and a meat positioning assembly. The automated saw is configured to analyze an uncut meat and calculate one or more cutting depths for one or more cut portions from the uncut meat. The uncut meat comprises a first end and a second end. The first end of the uncut meat can be analyzed by the automated saw by: capturing a first slice image of the first end, locating a first bone configuration and a configuration of one or more meat portions in relation to the first bone configuration, and measuring portions of the one or more meat portions to categorize which among one or more cuts of meat is presented at the first end of the uncut meat. The automated saw calculates a first cutting depth for a first meat portion.
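The categorization step in the abstract, locating the bone and measuring the adjacent meat portions to decide which cut is presented at the end face, can be sketched roughly as follows. The patent does not publish its detection model or thresholds, so the measurement struct and the width cutoffs below are assumptions; the cutoffs follow the common short-loin convention that a porterhouse carries a noticeably wider tenderloin than a T-bone.

```python
from dataclasses import dataclass

# Hypothetical measurements assumed to come from an upstream vision step
# that has already located the bone and meat-portion configurations in
# the end-slice image (the patent leaves that detection unspecified).
@dataclass
class SliceMeasurements:
    bone_found: bool
    tenderloin_width_in: float  # widest point of the tenderloin muscle

def categorize_end_slice(m: SliceMeasurements) -> str:
    """Categorize the cut presented at the end face from its geometry.

    Thresholds (1.25 in / 0.5 in) are illustrative, borrowed from the
    usual porterhouse-vs-T-bone tenderloin-width convention, not taken
    from the patent itself.
    """
    if not m.bone_found:
        return "boneless"
    if m.tenderloin_width_in >= 1.25:
        return "porterhouse"
    if m.tenderloin_width_in >= 0.5:
        return "t-bone"
    return "bone-in-strip"  # tenderloin too small for either category

print(categorize_end_slice(SliceMeasurements(True, 1.6)))   # porterhouse
print(categorize_end_slice(SliceMeasurements(True, 0.8)))   # t-bone
```

After each cut, the newly exposed end face would be re-imaged and re-categorized, which is what lets the saw switch thickness preferences partway down the loin.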

Summary of the Invention: An automated saw configured to analyze an uncut meat and calculate one or more cutting depths for one or more cut portions from said uncut meat based on a current cut of meat wherein: said automated saw comprises a controller and one or more visual sensors. Said controller comprises an address space, a processor and a memory. Said controller further comprises a device application stored in said memory and configured to be executed in said memory.

Said uncut meat comprises a first end and a second end. Said device application is configured to analyze said uncut meat. Said first end of said uncut meat can be analyzed by said device application by: capturing a first slice image of said first end of said uncut meat using said one or more visual sensors, locating in said first slice image, one or more bone configurations and one or more meat portion configurations in relation to one another, and categorizing a current cut of meat categorization among one or more cuts of meat categorizations according to locations of said one or more bone configurations and said one or more meat portion configurations in said first slice image.

Said one or more visual sensors comprises a first visual sensor and a second visual sensor. Said first visual sensor has an overview perspective of said uncut meat. Said device application is configured to calculate an uncut meat length of said uncut meat using images from said first visual sensor. Said second visual sensor comprises a view of said first end of said uncut meat. Said automated saw captures one or more slice images of said uncut meat using said second visual sensor. Said device application is configured to calculate said one or more cutting depths for said one or more cuts of meat categorizations corresponding to a preference for each category of said one or more cuts of meat categorizations.
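The depth-selection logic in the summary, overview camera measures uncut length, end-slice camera categorizes the current cut, and each cutting depth is the operator's preferred thickness for that category, can be sketched as below. The function names and the idea of passing the per-cut category sequence as an input are illustrative assumptions; in the claimed system the category would come from re-imaging the end face after each cut.

```python
def plan_cuts(uncut_length_in: float,
              end_categories: list[str],
              preferred_thickness: dict[str, float]):
    """Walk down the loin, taking each portion at the thickness the
    operator prefers for the cut currently presented at the end face.

    end_categories stands in for the categorization the end-slice
    sensor would report before each cut (an assumed input here);
    preferred_thickness maps each category to the operator's preferred
    portion thickness in inches. Returns the planned (category, depth)
    cuts and the uncut length left over.
    """
    depths = []
    remaining = uncut_length_in
    for category in end_categories:
        depth = preferred_thickness[category]
        if depth > remaining:
            break  # not enough loin left for a full portion
        depths.append((category, depth))
        remaining -= depth
    return depths, remaining

# Example: a 4-inch section that presents two porterhouse faces, then a
# T-bone face, with thicker porterhouse portions preferred.
cuts, leftover = plan_cuts(
    4.0,
    ["porterhouse", "porterhouse", "t-bone"],
    {"porterhouse": 1.5, "t-bone": 1.0},
)
print(cuts)      # [('porterhouse', 1.5), ('porterhouse', 1.5), ('t-bone', 1.0)]
print(leftover)  # 0.0
```

This mirrors the claim structure: the uncut length bounds the plan, while the preference table ties each categorization to its own cutting depth.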

Patent Document

View Full Patent (PDF)


Related: Computer Science & Software | Mechanical Engineering | Robotics & Automation | Food Processing | Robotics | Software And Electronics | Amarillo, Texas | Shannon Warren, Patent Attorney