The capability to discriminate low‐magnitude earthquakes from low‐yield anthropogenic sources, both detectable only at local distances, is of increasing interest to the event monitoring community. We used a dataset of seismic events in Utah recorded during a 14‐day period (1–14 January 2011) by the University of Utah Seismic Stations network to perform a comparative study of event classification at local scale using amplitude ratio (AR) methods and a machine learning (ML) approach. The event catalog consists of 7377 events, with magnitudes ranging from very small events up to 5.8. Events were subdivided into six populations based on location and source type: tectonic earthquakes (TEs), mining‐induced events (MIEs), and mining blasts from four known mines (WMB, SMB, LMB, and CQB). The AR approach jointly exploits Pg‐to‐Sg phase ARs and Rg‐to‐Sg spectral ARs in multivariate quadratic discriminant functions and was able to classify 370 events with high signal quality from the three groups of sufficient size (TE, MIE, and SMB). For that subset of events, the AR method achieved success rates between about 80% and 90%. The ML approach used trained convolutional neural network (CNN) models to classify the populations. The CNN approach was able to classify the same subset of events with accuracies between about 91% and 98%. Because the neural network approach does not have a minimum signal quality requirement, we applied it to the entire event catalog, including the abundant extremely low‐magnitude events, and achieved accuracies of about 94%–100%. We compare the AR and ML methodologies using a broad set of criteria and conclude that a major advantage of ML methods is their robustness to low signal‐to‐noise ratio data, allowing them to classify significantly smaller events.