The code testing whether a temperature should be emulated is not obvious. Add a comment explaining why this test is done.
Signed-off-by: Sascha Hauer <s.ha...@pengutronix.de>
Reviewed-by: Mikko Perttunen <mperttu...@nvidia.com>
---
 drivers/thermal/thermal_core.c | 5 +++++
 1 file changed, 5 insertions(+)

diff --git a/drivers/thermal/thermal_core.c b/drivers/thermal/thermal_core.c
index 3e0fe55..e204deb 100644
--- a/drivers/thermal/thermal_core.c
+++ b/drivers/thermal/thermal_core.c
@@ -435,6 +435,11 @@ int thermal_zone_get_temp(struct thermal_zone_device *tz, int *temp)
 			}
 		}
 
+		/*
+		 * Only allow emulating a temperature when the real temperature
+		 * is below the critical temperature so that the emulation code
+		 * cannot hide critical conditions.
+		 */
 		if (!ret && *temp < crit_temp)
 			*temp = tz->emul_temperature;
 	}
-- 
2.1.4
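For readers outside the thermal core, here is a minimal, self-contained user-space sketch of the guard the new comment documents. It is not the kernel implementation: struct fake_tz, get_temp() and the hard-coded millidegree values are invented for illustration; only the emul_temperature field, the comparison against crit_temp and the comment text come from the patch above.

#include <stdio.h>

/* Illustrative stand-in for struct thermal_zone_device; only the
 * field relevant to emulation is modeled here. */
struct fake_tz {
	int emul_temperature;	/* 0 means emulation disabled */
};

/* Mirrors the guarded assignment from the patch: the emulated value
 * replaces the real reading only while the hardware still reports a
 * temperature below the critical trip point. */
static int get_temp(struct fake_tz *tz, int real_temp, int crit_temp)
{
	int temp = real_temp;

	/*
	 * Only allow emulating a temperature when the real temperature
	 * is below the critical temperature so that the emulation code
	 * cannot hide critical conditions.
	 */
	if (tz->emul_temperature && temp < crit_temp)
		temp = tz->emul_temperature;

	return temp;
}

int main(void)
{
	struct fake_tz tz = { .emul_temperature = 45000 };

	/* Real reading below critical: the emulated value wins. */
	printf("%d\n", get_temp(&tz, 30000, 90000));	/* 45000 */

	/* Real reading at or above critical: the real value is kept,
	 * so an over-temperature condition cannot be masked. */
	printf("%d\n", get_temp(&tz, 95000, 90000));	/* 95000 */

	return 0;
}

Compiled with e.g. gcc -Wall, the first call reports the emulated 45000 while the second falls through to the real 95000, which is exactly the property the comment is meant to capture.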