Answer:
Heat detectors are the fire detection systems that use heat in their normal operation, responding when the ambient temperature reaches a set threshold. They can be tested by applying controlled heat to confirm that they trigger at the correct temperature.
Explanation:
Among the fire detection systems listed, heat detectors are the ones that use heat in their normal operation. They are engineered to respond when the ambient temperature reaches a predetermined threshold. Unlike smoke, flame, or gas detectors, which detect specific substances or types of radiation indicative of a fire, heat detectors rely on the direct effect of heat to trigger an alarm.
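To make the threshold behavior concrete, here is a minimal sketch in Python of a fixed-temperature heat detector model. The class name, the 57 °C rating, and the latching behavior are illustrative assumptions for this sketch, not any real device's interface or a standard specification.

```python
class FixedTemperatureHeatDetector:
    """Toy model of a fixed-temperature heat detector (illustrative only)."""

    def __init__(self, threshold_c: float = 57.0):
        # 57 °C is used here purely as an assumed fixed-temperature rating.
        self.threshold_c = threshold_c
        self.alarmed = False

    def read_temperature(self, ambient_c: float) -> bool:
        # The detector alarms once the ambient temperature reaches its
        # threshold and stays latched until reset, mirroring typical
        # alarm behavior in this simplified model.
        if ambient_c >= self.threshold_c:
            self.alarmed = True
        return self.alarmed

    def reset(self) -> None:
        # Clear the latched alarm state.
        self.alarmed = False
```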
This makes them particularly useful in areas where other detector types could be set off by non-fire-related phenomena: smoke detectors might alarm on dust or steam, flame detectors can be falsely triggered by sunlight, and gas detectors may alarm on non-hazardous chemicals in the air.
In the context of testing or maintenance, heat detectors can be checked by applying a controlled amount of heat to the device and confirming that it triggers at its rated temperature. Routine testing of this kind helps maintain the reliability and effectiveness of the fire detection system.
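As a rough illustration of such a functional test, the snippet below ramps a simulated ambient temperature upward and reports the point at which a fixed-threshold detector would first alarm. The threshold, starting temperature, and step size are assumptions for the sketch; a real maintenance test uses a calibrated heat source and verifies the alarm signal at the panel, not a software simulation.

```python
def simulate_controlled_heat_test(threshold_c: float = 57.0,
                                  start_c: float = 20.0,
                                  step_c: float = 5.0) -> float:
    """Ramp a simulated ambient temperature upward and return the
    temperature at which a fixed-threshold detector first alarms.
    All values are illustrative assumptions, not a standard procedure."""
    ambient_c = start_c
    while True:
        if ambient_c >= threshold_c:
            # The detector would trigger at this simulated temperature.
            return ambient_c
        ambient_c += step_c


if __name__ == "__main__":
    trigger_temp = simulate_controlled_heat_test()
    print(f"Simulated detector triggered at {trigger_temp:.1f} °C")
```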