In 1995, the Food and Agriculture Organization of the United Nations/World Health Organization (FAO/WHO) Expert Consultation agreed that risk assessment should be carried out for biological and chemical agents in food standards issues.1 Since then, risk assessment has evolved considerably over the past two decades as a science-based activity to inform regulatory public health policy decisions in the agri-food field.2,3 Food safety-related risk assessment provides a systematic framework to evaluate the probability of adverse health outcomes due to excess exposure to a hazardous agent or agents from food consumption.2 There have been important improvements in methodologies for data collection, evidence synthesis, and computation for risk assessments focusing on microbial and chemical hazards along food supply chains. However, risk assessment applications in nutrition (called nutritional risk assessment here) lie beyond the areas originally considered the realm of public health risk assessment.
Although not as advanced as microbial and chemical risk assessment at the international and/or national/regional levels, nutritional risk assessment has continued to develop and has become increasingly critical to public health protection. The growing use of functional foods, fortified foods, dietary supplements, and formulated foods in recent years has increased the intake of nutrient substances among populations around the world. This has generated growing interest in determining dietary standards for nutrients to prevent chronic adverse disease outcomes due to excess intake. For example, nutritional risk assessment has been widely accepted as a standard tool to identify evidence- and science-based Tolerable Upper Intake Levels (ULs) of nutrient substances.4
In general, nutritional risk assessment shares the same principles as risk assessment applied to non-nutrient chemicals in foods, and is divided into four basic components: hazard identification, hazard characterization, exposure assessment, and risk characterization.2 In the first step, ‘hazard identification’, evidence is collected, evaluated, and synthesized to identify the known and/or potential adverse effects related to the nutrient substances of interest. During ‘hazard characterization’, the association between the nature and extent of the adverse effects and the specific nutrient exposure is described. On a quantitative basis, a dose-response relationship is usually established that translates the level of nutrient intake through food consumption into the probability of experiencing different adverse effects. ‘Exposure assessment’ characterizes the intake distribution of the specific nutrient among members of the general population or subpopulations of interest. Finally, outputs from the previous three steps are integrated in ‘risk characterization’ to generate risk estimates.
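As a minimal sketch of how the last two steps combine quantitatively, the following code samples intakes from a hypothetical lognormal exposure distribution (exposure assessment) and passes them through a hypothetical logistic dose-response curve (hazard characterization) to produce a Monte Carlo population risk estimate (risk characterization). All parameter values here are illustrative placeholders, not values from any actual assessment.

```python
import math
import random

def dose_response(intake_mg, ed50=60.0, slope=4.0):
    """Hypothetical logistic dose-response curve: probability of the
    adverse effect at a given daily intake (mg). ed50 and slope are
    purely illustrative parameters."""
    return 1.0 / (1.0 + math.exp(-slope * (math.log(intake_mg) - math.log(ed50))))

def population_risk(n=100_000, mean_log=3.0, sd_log=0.5, seed=1):
    """Monte Carlo risk characterization: sample intakes from an assumed
    lognormal exposure distribution and average the dose-response
    probabilities across the simulated population."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        intake = math.exp(rng.gauss(mean_log, sd_log))  # simulated intake, mg/day
        total += dose_response(intake)
    return total / n

risk = population_risk()  # mean probability of the adverse effect in the population
```

In a real assessment, the intake distribution would come from dietary surveys and the dose-response curve from the evidence synthesized during hazard characterization; the structure of the calculation, however, follows this pattern.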
The major application of nutritional risk assessment is to inform guidance about diet and supplement intake, such as setting ULs to avoid excess nutrient intake and thereby prevent chronic diseases caused by nutrient toxicity. In the United States, the Food and Nutrition Board of the Institute of Medicine is the major group working in this field to establish the Dietary Reference Intakes (DRIs).4 In addition, an important aspect that distinguishes nutrients from non-nutrient hazards is that both high and low nutrient intakes are associated with risks.5 Therefore, a risk-risk tradeoff analysis or risk-benefit analysis is critical to describe the whole picture, taking into consideration both the beneficial effects of preventing adverse effects due to deficient intakes and the hazardous effects of promoting adverse effects due to excess intakes. A risk-benefit analysis mirrors the risk assessment approach, but is a distinct method for quantifying the balance between health benefits and risks imposed by specific food strategies on the population of interest. Hoekstra and colleagues6 illustrated the risk-benefit analysis method using mandatory fortification of folic acid in bread in the Netherlands as an example.
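The tradeoff described above can be sketched in code: each simulated individual faces two hypothetical risk curves, one decreasing with intake (deficiency) and one increasing with intake (toxicity), and a fortification strategy shifts the whole intake distribution upward. Comparing the combined risk with and without fortification is the core of the risk-benefit calculation. All curves, thresholds, and doses below are invented for illustration, not taken from the folic acid study cited in the text.

```python
import math
import random

def deficiency_risk(intake_mg, threshold=15.0, slope=3.0):
    """Hypothetical risk of a deficiency-related adverse effect;
    decreases as intake rises past an illustrative threshold."""
    return 1.0 / (1.0 + math.exp(slope * (math.log(intake_mg) - math.log(threshold))))

def excess_risk(intake_mg, threshold=80.0, slope=3.0):
    """Hypothetical risk of a toxicity-related adverse effect;
    increases as intake rises past an illustrative threshold."""
    return 1.0 / (1.0 + math.exp(-slope * (math.log(intake_mg) - math.log(threshold))))

def net_risk(fortification_mg, n=50_000, mean_log=2.8, sd_log=0.6, seed=7):
    """Average combined (deficiency + excess) risk after adding a fixed
    fortification dose to each sampled baseline intake."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        intake = math.exp(rng.gauss(mean_log, sd_log)) + fortification_mg
        total += deficiency_risk(intake) + excess_risk(intake)
    return total / n

baseline = net_risk(0.0)    # combined risk without fortification
fortified = net_risk(10.0)  # combined risk with a 10 mg/day fortification dose
```

With these illustrative parameters, fortification reduces the dominant deficiency risk more than it raises the toxicity risk, so the fortified scenario comes out ahead; a real risk-benefit analysis would additionally weight the two outcomes by severity, for instance using a common health metric.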
The current approaches to nutritional risk assessment have their limitations. Because they were developed from traditional chemical risk assessment, most existing nutritional risk assessments address the effect of a single or a few specific nutrient substances,7 which limits their ability to handle more complex questions. However, the nature of nutritional risks is complex, involving many endpoints, impact factors, and interactions. In addition, risk assessment cannot compensate for a lack of data and knowledge. In practice, ad hoc data are often used to parameterize nutritional risk assessments, reflecting the scarcity of data purposely designed for risk assessment. This can result in, for example, dose-response models developed from observational studies in the absence of randomized controlled trials, which can subsequently lead to misleading interpretations of estimated risks. Therefore, it is crucial for risk analysts to carefully evaluate the uncertainty introduced by the lack of high-quality data and by the lack of robust methods for establishing dose-response relationships.
In conclusion, although the application of risk assessment to nutrient-related public health questions is still at an early stage, the establishment of Dietary Reference Intakes (DRIs) and the body of risk-benefit analysis studies have demonstrated the value of the risk assessment process for nutrition-related food standard and food safety issues, and have indicated the potential for further improvement in data collection and methodological research in nutritional risk assessment.