Existing counter-sniper systems use acoustic, visual, infra-red, or electromagnetic signals related
to gunfire to determine the bearing or the exact location of the shooter. Interestingly, the majority
of the successful systems are based on acoustic measurements. The distinctive observable sound
originates either from the muzzle blast or from the acoustic shock wave, the sonic boom produced
by a supersonic projectile. The main limiting factor in these systems is the requirement for line
of sight, which is a major impediment in urban environments. In fact, the performance of most
current acoustic systems degrades significantly in the concrete jungle, since some of the few
available sensor readings are typically corrupted by multipath effects.
PinPtr, our acoustic system, takes advantage of sensor network technology to eliminate this problem.
Instead of using a few expensive acoustic sensors, a low-cost ad-hoc acoustic sensor network measures
both the muzzle blast and shock wave to accurately determine the location of the shooter and the
trajectory of the bullet. The basic idea is simple: given the arrival times of the acoustic events
at different sensor locations, the shooter position can be computed from the speed of sound and the
known positions of the sensors.
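The arrival-time idea can be sketched as time-difference-of-arrival (TDOA) multilateration. The following is an illustrative reconstruction, not PinPtr's published sensor-fusion algorithm: the `locate_shooter` function, the fixed speed of sound, the Gauss-Newton solver, and the noise-free measurement model are all assumptions made for this sketch.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s at ~20 deg C; a fielded system would calibrate this

def locate_shooter(sensors, arrival_times, n_iter=50):
    """Estimate a source position from muzzle-blast arrival times.

    A Gauss-Newton sketch of TDOA multilateration. The earliest-hearing
    sensor is used as a reference so the unknown shot time cancels out.

    sensors:        (N, 3) array of sensor positions in meters.
    arrival_times:  (N,) arrival times in seconds on a common time base.
    """
    sensors = np.asarray(sensors, dtype=float)
    t = np.asarray(arrival_times, dtype=float)
    ref = np.argmin(t)                      # reference sensor index
    dd = SPEED_OF_SOUND * (t - t[ref])      # measured range differences
    x = sensors.mean(axis=0)                # initial guess: sensor centroid
    for _ in range(n_iter):
        r = np.linalg.norm(sensors - x, axis=1)       # ranges from guess
        resid = (r - r[ref]) - dd                      # residuals
        # Jacobian of the range-difference residuals w.r.t. x
        J = (x - sensors) / r[:, None] - (x - sensors[ref]) / r[ref]
        step, *_ = np.linalg.lstsq(J, -resid, rcond=None)
        x = x + step
        if np.linalg.norm(step) < 1e-6:
            break
    return x
```

With noise-free timestamps and reasonable sensor geometry, a handful of iterations recovers the source to sub-millimeter precision; real measurement noise and multipath are exactly why PinPtr benefits from many redundant sensors.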
The PinPtr prototype uses up to a hundred sensors that can be deployed either manually or by other means.
After deployment the sensors automatically establish an ad-hoc communication network, perform self-localization,
establish a common time-base, and the system is ready to use. Whenever an event is detected, the time of arrival
is measured and the information is propagated to the base station through the network using a specially
tailored data aggregation and routing service.
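The text does not detail the aggregation service, but one job it clearly must do is collapse duplicate detection reports as they converge toward the base station. The sketch below is hypothetical; the `Detection` record, the `aggregate` function, and the 2-second aggregation window are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Detection:
    sensor_id: int
    event_kind: str      # "muzzle_blast" or "shock_wave" (assumed labels)
    timestamp_us: int    # arrival time in microseconds on the common time base

def aggregate(local, incoming, window_us=2_000_000):
    """Merge detection reports at an intermediate routing node.

    Duplicate reports for the same (sensor, event kind) are collapsed to
    the earliest timestamp, and reports older than the aggregation window
    (i.e., from an earlier shot) are dropped before forwarding.
    """
    reports = list(local) + list(incoming)
    newest = max((d.timestamp_us for d in reports), default=0)
    merged = {}
    for d in reports:
        if newest - d.timestamp_us > window_us:
            continue  # stale: belongs to an earlier event
        key = (d.sensor_id, d.event_kind)
        if key not in merged or d.timestamp_us < merged[key].timestamp_us:
            merged[key] = d
    return sorted(merged.values(), key=lambda d: d.timestamp_us)
```

Merging en route, rather than flooding every raw report to the base station, keeps the per-shot traffic bounded even when many sensors hear the same event.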
PinPtr was built on top of the UC Berkeley MICA2 mote platform running TinyOS. While it is an excellent
hardware and software platform for sensor network applications, the severe resource constraints prohibit
the implementation of muzzle blast and shock wave detection on the mote itself. Therefore, a custom
acoustic sensor daughtercard was developed, on which the necessary signal processing algorithms for
signal detection and time-stamping are executed by an on-board FPGA. The mote itself runs the operating
system, middleware services such as time synchronization, data aggregation, and message routing, and
various application-specific routines.
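As a toy illustration of what the detection stage hands to the mote, the following amplitude-threshold onset detector converts a sample index into an arrival timestamp. This is a deliberately simplified stand-in: the real FPGA pipeline is more sophisticated, and `detect_onset` and its parameters are hypothetical.

```python
def detect_onset(samples, sample_rate_hz, threshold):
    """Return the arrival time in seconds of the first sample whose
    amplitude exceeds `threshold`, or None if nothing is detected.

    A toy threshold detector: only the timestamp (not the waveform)
    would need to travel over the radio, which is what makes in-network
    localization feasible on resource-constrained motes.
    """
    for i, s in enumerate(samples):
        if abs(s) >= threshold:
            return i / sample_rate_hz
    return None
```

The key point is the division of labor: the signal processing runs next to the microphone, and only a few bytes per event cross the network.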
The PinPtr system was demonstrated and evaluated in a US Army test facility providing a realistic urban
environment. Using 60 sensors in a 50x100 meter area, the accuracy of the system was better than 1 meter on
average in three dimensions; that is, PinPtr was able to determine the exact window from which a
particular shot was fired. The latency of the system was under 2 seconds.
Figure: The prototype user interface of PinPtr, showing an overhead view of the test area. The large
red circle marks the estimated shooter position, with the line indicating the direction of the shot.
The coordinates, including the elevation (7.13 m), are displayed in the top right corner. Green dots
show the sensor locations: small dots mark sensors with no or unused measurements, while medium-sized
circles denote the sensors whose data were used in the current position estimate.
Figure: The prototype 3D user interface showing the same shot. Bluish spheres indicate sensor positions.