Difference Between Time Utilization & Dollar Utilization
Businesses that rent out equipment to customers typically measure rental performance with two metrics: time utilization and dollar utilization. The first tells you how often equipment is out on rent. The second tells you how much return the company is getting on its investment in that equipment. Both metrics can be applied to an individual piece of equipment or to a company's entire inventory.
Time utilization tells you the percentage of "rentable" time that your equipment is actually being rented out. For example, if you rent cars by the day and have a fleet of 100 cars, then you have a total of 36,500 rentable days each year. If you have rentals for a total of 25,000 of those days, then your time utilization is 25,000 divided by 36,500, or about 68.5 percent.
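A minimal calculation of that example, using the fleet figures above (the variable names are just illustrative):

```python
# Hypothetical fleet figures matching the example in the text.
FLEET_SIZE = 100          # cars available to rent
DAYS_PER_YEAR = 365
RENTED_DAYS = 25_000      # total days cars were actually out on rent

rentable_days = FLEET_SIZE * DAYS_PER_YEAR          # 36,500 rentable days
time_utilization = RENTED_DAYS / rentable_days      # ~0.685

print(f"Time utilization: {time_utilization:.1%}")  # -> Time utilization: 68.5%
```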
To measure dollar utilization, divide your annual rental revenue by the cost of the equipment being rented. If the equipment in your rental inventory cost a total of, say, $300,000 and you had $165,000 in rental revenue, then your dollar utilization is 55 percent. Average figures vary by industry. For example, according to the trade publication "Rental," a 65 percent rate is considered acceptable for national equipment-rental chains, while stores that rent party equipment commonly see rates of 150 percent.
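The same approach works for dollar utilization, again using the illustrative figures from the example:

```python
# Illustrative figures from the example above.
equipment_cost = 300_000   # total cost of the rental inventory
annual_revenue = 165_000   # rental revenue for the year

dollar_utilization = annual_revenue / equipment_cost
print(f"Dollar utilization: {dollar_utilization:.0%}")  # -> Dollar utilization: 55%
```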
The goal of a rental company is to maximize dollar utilization, not time utilization. However, the two are inextricably linked: dollar utilization depends on rental revenue, which is driven by both time utilization and the rates charged, and those rates in turn influence how often equipment gets rented. Extremely high time utilization is not necessarily a good thing, for several reasons. It means the company may have to turn away customers because the equipment is out on rent all the time, it suggests that the rates it charges are too low, and it produces heavy wear and tear on the equipment. "Rental" suggests shooting for a "sweet spot" of 60 percent to 70 percent time utilization.
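The trade-off can be sketched numerically. The two hypothetical pricing scenarios below (all figures are assumptions, not from the article) show how a higher rate can lower time utilization yet raise dollar utilization:

```python
# Illustrative sketch of how rate and time utilization interact.
FLEET_COST = 1_000_000     # assumed cost of the fleet
RENTABLE_DAYS = 36_500     # assumed total rentable days per year

scenarios = [
    # (label, daily rate, rented days): a higher rate typically means fewer rented days
    ("low rate, near-full calendar", 20.0, 34_000),
    ("higher rate, ~65% calendar",   32.0, 24_000),
]

for label, daily_rate, rented_days in scenarios:
    time_util = rented_days / RENTABLE_DAYS
    dollar_util = (daily_rate * rented_days) / FLEET_COST
    print(f"{label}: time {time_util:.0%}, dollar {dollar_util:.0%}")
    # -> low rate, near-full calendar: time 93%, dollar 68%
    # -> higher rate, ~65% calendar: time 66%, dollar 77%
```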