-
1.
Energy Balance of Canadian Armed Forces Personnel during an Arctic-Like Field Training Exercise.
Ahmed, M, Mandic, I, Desilets, E, Smith, I, Sullivan-Kwantes, W, Jones, PJ, Goodman, L, Jacobs, I, L'Abbé, M
Nutrients. 2020;(6)
Abstract
Operating in temperature extremes frequently leads to a discrepancy in energy balance. The effects of operating in extreme cold on metabolic requirements have not been well described in Canadian Armed Forces (CAF) personnel. The objective was to accurately assess energy deficits using "gold standard" methodology for measuring energy intake (EI) and energy expenditure (EE). Nutritional intake of a convenience sample of 10 CAF Class A Reservists, completing a basic military qualification (land) course under winter weather conditions, was assessed using daily measured food intake/food waste collections. EE was measured by the doubly-labelled water method. Average EI was 2377 ± 1144 kcal/day, well below EE (4917 ± 693 kcal/day), despite ~5685 kcal being available in the field rations; this shortfall despite ample rations indicates voluntary anorexia. A significant body weight loss of 2.7% was associated with the average daily energy deficit of 2539 ± 1396 kcal. Such results may have important implications for impaired performance and health during longer-duration operations.
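The energy-balance arithmetic reported above can be reproduced directly from the abstract's mean values; this is a minimal sketch, not code from the study:

```python
# Energy-balance bookkeeping using the abstract's reported means.
# EI and EE are kcal/day; the deficit is EE minus EI.
energy_intake_kcal = 2377       # measured mean EI, kcal/day
energy_expenditure_kcal = 4917  # DLW-measured mean EE, kcal/day

deficit = energy_expenditure_kcal - energy_intake_kcal
print(deficit)  # 2540, matching the reported 2539 kcal/day deficit within rounding
```

The one-kilocalorie discrepancy with the reported 2539 kcal/day arises because the paper's deficit was averaged per participant rather than computed from group means.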
-
2.
Monitoring training and recovery responses with heart rate measures during standardized warm-up in elite badminton players.
Schneider, C, Wiewelhove, T, McLaren, SJ, Röleke, L, Käsbauer, H, Hecksteden, A, Kellmann, M, Pfeiffer, M, Ferrauti, A
PloS one. 2020;(12):e0244412
Abstract
PURPOSE To investigate short-term training and recovery-related effects on heart rate during a standardized submaximal running test. METHODS Ten elite badminton players (7 females and 3 males) were monitored during a 12-week training period in preparation for the World Championships. Exercise heart rate (HRex) and perceived exertion were measured in response to a 5-min submaximal shuttle-run test during the morning session warm-up. This test was repeatedly performed on Mondays after 1-2 days of pronounced recovery ('recovered' state; reference condition) and on Fridays following 4 consecutive days of training ('strained' state). In addition, the serum concentrations of creatine kinase and urea, perceived recovery-stress states, and jump performance were assessed before warm-up. RESULTS Creatine kinase increased in the strained compared to the recovered state, and perceived recovery decreased while perceived stress increased (range of average effect sizes: |d| = 0.93-2.90). The overall HRex was 173 bpm and the observed within-player variability (i.e., standard deviation expressed as a coefficient of variation [CV]) was 1.3% (90% confidence interval: 1.2% to 1.5%). A linear reduction of -1.4% (-3.0% to 0.3%) was observed in HRex over the 12-week observational period. HRex was -1.5% lower (-2.2% to -0.9%) in the strained compared to the recovered state, and the standard deviation (as a CV) representing interindividual variability in this response was 0.7% (-0.6% to 1.2%). CONCLUSIONS Our findings suggest that HRex measured during a standardized warm-up can be sensitive to short-term accumulation of training load, with HRex decreasing on average in response to consecutive days of training within repeated preparatory weekly microcycles. From a practical perspective, it seems advisable to determine intra-individual recovery-strain responses by repeated testing, as HRex responses may vary substantially between and within players.
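The within-player variability metric used above, the standard deviation expressed as a coefficient of variation relative to mean HRex, can be illustrated as follows; the heart-rate values are invented for the demonstration and are not data from the study:

```python
import statistics

# Hypothetical repeated submaximal-test HRex values (bpm) for one player.
hrex_bpm = [173, 171, 175, 172, 174]

# CV = sample standard deviation as a percentage of the mean.
mean_hr = statistics.mean(hrex_bpm)
cv_percent = statistics.stdev(hrex_bpm) / mean_hr * 100
print(round(cv_percent, 1))  # 0.9
```

A CV on this scale is what makes the reported -1.5% strained-vs-recovered difference interpretable: the signal is of similar magnitude to the test's typical variation, which is why the authors recommend repeated testing per player.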
-
3.
The effectiveness of military physical exercise on irisin concentrations and oxidative stress among male healthy volunteers.
Jawzal, KH, Alkass, SY, Hassan, AB, Abdulah, DM
Hormone molecular biology and clinical investigation. 2020;(3)
Abstract
Background Irisin, a newly discovered hormone, is secreted into the circulation from skeletal muscles in response to physical exercise. The biochemical parameters related to irisin secretion have not yet been sufficiently investigated. The aim of this study was to examine the effectiveness of exercise on the level of irisin and its correlation with biochemical and oxidative stress parameters. Materials and methods In this pre- and post-test observational study, 39 healthy male volunteers from a military training setting were followed up between September and November 2015. Participants were between 22 and 27 years old, with an average age of 24. Those with inflammatory disorders or chronic diseases such as diabetes mellitus were excluded from the study. The parameters were measured at baseline, at 4 weeks, and at 8 weeks of intervention. Results The study found that systolic and diastolic blood pressures substantially decreased after 8 weeks of intervention. The cholesterol-to-HDL ratio and glucose levels were significantly higher at baseline compared to 8 weeks. Total protein and albumin were significantly higher following 4 weeks (0.25 and 0.21 g/dL) and 8 weeks (0.32 and 0.16 g/dL), respectively. Meanwhile, total globulin and irisin increased following 8 weeks of the intervention by only 0.16 g/dL and 0.41 μg/mL, respectively. High-sensitivity C-reactive protein (hs-CRP) decreased following 8 weeks (-0.81 μg/mL). Protein carbonyl (PC) decreased following 4 weeks by only 0.34 nmol/L. Conclusions This study demonstrated that military training enhanced irisin secretion following 8 weeks of exercise.
-
4.
Training-induced changes in daily energy expenditure: Methodological evaluation using wrist-worn accelerometer, heart rate monitor, and doubly labeled water technique.
Kinnunen, H, Häkkinen, K, Schumann, M, Karavirta, L, Westerterp, KR, Kyröläinen, H
PloS one. 2019;(7):e0219563
Abstract
INTRODUCTION Wrist-mounted motion sensors can quantify the volume and intensity of physical activities, but little is known about their long-term validity. Our aim was to validate a wrist motion sensor in estimating daily energy expenditure, including any change induced by long-term participation in endurance and strength training. Supplemental heart rate monitoring during weekly exercise was also investigated. METHODS A 13-day doubly labeled water (DLW) measurement of total energy expenditure (TEE) was performed twice in healthy male subjects: during the last two weeks of a 12-week Control period (n = 15) and during the last two weeks of a 12-week combined strength and aerobic Training period (n = 13). Resting energy expenditure was estimated using two equations: one with body weight and age, and another with fat-free mass. TEE and activity-induced energy expenditure (AEE) were determined from the motion sensor alone, and from the motion sensor combined with a heart rate monitor, the latter being worn during exercise only. RESULTS When body weight and age were used in the calculation of resting energy expenditure, the motion sensor data alone explained 78% and 62% of the variation in TEE assessed by DLW at the end of the Control and Training periods, respectively, with a bias of +1.75 (p <.001) and +1.19 MJ/day (p = .002). When exercise heart rate data were added to the model, the combined wearable device approach explained 85% and 70% of the variation in TEE assessed by DLW, with a bias of +1.89 and +1.75 MJ/day (p <.001 for both). While significant increases in TEE and AEE were detected by all methods as a result of participation in regular training, the motion sensor approach underestimated the change measured by DLW: +1.13±0.66 by DLW, +0.59±0.69 (p = .004) by motion sensor, and +0.98±0.70 MJ/day by the combination of motion sensor and heart rate. Use of fat-free mass in the estimation of resting energy expenditure removed the biases between the wearable device estimations and the gold standard reference method of TEE and demonstrated a training-induced increase in resting energy expenditure of +0.18±0.13 MJ/day (p <.001). CONCLUSIONS A wrist motion sensor, combined with a heart rate monitor during exercise sessions, showed high agreement with the gold standard measurement of daily TEE and its change induced by participation in a long-term training protocol. The positive findings concerning validity, especially the ability to follow up the change associated with a lifestyle modification, can be considered significant because they partially determine the feasibility of wearable devices as quantifiers of health-related behavior.
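The decomposition of total energy expenditure implied above can be sketched under the common assumption, not stated in the paper, that diet-induced thermogenesis (DIT) accounts for roughly 10% of TEE, so that AEE = 0.9 × TEE − REE. The values below are illustrative only:

```python
# Energy-expenditure bookkeeping: TEE = REE + AEE + DIT.
# Assuming DIT ~ 10% of TEE (a conventional approximation, not the
# paper's stated method), AEE is what remains of TEE after REE and DIT.
def activity_energy_expenditure(tee_mj, ree_mj, dit_fraction=0.10):
    """Activity-induced EE (MJ/day) once resting EE and DIT are removed."""
    return tee_mj * (1.0 - dit_fraction) - ree_mj

# Illustrative numbers in MJ/day (not from the study).
aee = activity_energy_expenditure(tee_mj=13.0, ree_mj=7.0)
print(round(aee, 2))  # 4.7
```

This decomposition is why the choice of REE equation matters in the abstract: any bias in estimated REE propagates directly into AEE and into wearable-derived TEE.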
-
5.
Improvement in body composition following a supervised exercise-training program of adult patients with cystic fibrosis.
Prévotat, A, Godin, J, Bernard, H, Perez, T, Le Rouzic, O, Wallaert, B
Respiratory medicine and research. 2019;:5-9
Abstract
OBJECTIVES Maintenance of optimal nutritional status is a crucial issue for cystic fibrosis (CF) patients. Here, we evaluate the effects of an 8-week exercise training (ET) program on body composition in CF patients. METHODS This prospective pilot observational study was conducted in adult CF subjects in stable condition following their annual check-up. The ET program consisted of three sessions per week and included aerobic training (≥30min), muscle strengthening, circuit training, and relaxation. Exercise tolerance (6-minute walk test, 6MWT), pulmonary function, quadriceps isometric strength, and body composition (bioelectrical impedance analysis of fat-free mass [FFM], fat mass, and body cell mass) were analyzed before and immediately after the ET program. A control group of CF patients who preferred not to participate in the ET program received the same evaluations. RESULTS A total of 43 CF patients were enrolled and offered the ET program; 28 accepted (aged 28±5 years, forced expiratory volume in 1s [FEV1] 48.8±19% predicted) and 15 declined the ET program but agreed to be part of the control group (matched for age and CF severity: 30.8±9 years, FEV1 51.8±16.5%). Pulmonary function was unchanged at the end of the ET program, but significant improvements were observed in 6MWT distance (from 520±96m to 562±105m, P<0.001) and muscle strength (331±141N to 379±168N, P<0.001). Although mean body mass index did not change, the ET group showed significantly increased FFM (43.85±8kg to 44.5±9.2kg, P=0.03) and a trend towards increased body cell mass (21.4±6 to 22.1±6.6kg, P=0.06). All other parameters were unchanged by ET. There were no significant correlations between the increase in FFM and the improvements in either 6MWT distance or muscle strength. The CF control group exhibited no significant changes in any parameters between evaluations. CONCLUSIONS ET significantly improved FFM, but not body mass index, in CF patients. 
The results illustrate the superiority of bioelectrical impedance analysis over body mass index for detecting changes in body composition and reveal the importance of ET for improving not only exercise tolerance but also nutritional status in these patients.
-
6.
Sex differences in dietary intake in British Army recruits undergoing phase one training.
Chapman, S, Roberts, J, Smith, L, Rawcliffe, A, Izard, R
Journal of the International Society of Sports Nutrition. 2019;(1):59
Abstract
BACKGROUND British Army Phase One training exposes men and women to challenging daily distances of 13.5 km·d⁻¹ vs. 11.8 km·d⁻¹ and energy expenditures of ~4000 kcal·d⁻¹ and ~3000 kcal·d⁻¹, respectively. As such, it is essential that adequate nutrition is provided to support training demands. However, to date, there is a paucity of data on the habitual dietary intake of British Army recruits. The aims of this study were to: (i) compare habitual dietary intake in British Army recruits undergoing Phase One training to Military Dietary Reference Values (MDRVs), and (ii) establish whether there was a relative sex difference in dietary intake between men and women. METHOD Researcher-led weighed food records and food diaries were used to assess dietary intake in twenty-eight women (age 21.4 ± 3.0 yrs, height 163.7 ± 5.0 cm, body mass 65.0 ± 6.7 kg) and seventeen men (age 20.4 ± 2.3 yrs, height 178.0 ± 7.9 cm, body mass 74.6 ± 8.1 kg) at the Army Training Centre, Pirbright for 8 days in week ten of training. Macro- and micronutrient content were estimated using dietary analysis software (Nutritics, Dublin) and assessed via an independent-samples t-test to establish whether there was a sex difference in daily energy, macronutrient, or micronutrient intakes. RESULTS Estimated daily energy intake was less than the MDRV for both men and women, with men consuming a greater amount of energy than women (2846 ± 573 vs. 2207 ± 585 kcal·day⁻¹, p < 0.001). Both sexes under-consumed carbohydrate (CHO) when expressed relative to body mass, with men consuming a greater amount than women (4.8 ± 1.3 vs. 3.8 ± 1.4 g·kg⁻¹·day⁻¹, p = 0.025, ES = 0.74). Both sexes also failed to meet MDRVs for protein intake, with men consuming more than women (1.5 ± 0.3 vs. 1.3 ± 0.3 g·kg⁻¹·day⁻¹, p = 0.030, ES = 0.67). There were no differences in dietary fat intake between men and women (1.5 ± 0.2 vs. 1.5 ± 0.5 g·kg⁻¹·day⁻¹, p = 0.483, ES = 0.00).
CONCLUSIONS Daily energy intake in men and women in Phase One training does not meet MDRVs. Interventions to increase macronutrient intakes should be considered, along with research investigating the potential benefits of increasing different macronutrient intakes on training adaptations.
-
7.
Monitoring Blood Biomarkers and Training Load Throughout a Collegiate Soccer Season.
Huggins, RA, Fortunati, AR, Curtis, RM, Looney, DP, West, CA, Lee, EC, Fragala, MS, Hall, ML, Casa, DJ
Journal of strength and conditioning research. 2019;(11):3065-3077
Abstract
Huggins, RA, Fortunati, AR, Curtis, RM, Looney, DP, West, CA, Lee, EC, Fragala, MS, Hall, ML, and Casa, DJ. Monitoring blood biomarkers and training load throughout a collegiate soccer season. J Strength Cond Res 33(11): 3065-3077, 2019-This observational study aimed to characterize the responses of a comprehensive panel of biomarkers, observed ranges, training load (TL) metrics, and performance throughout the collegiate soccer season (August-November). Biomarkers (n = 92) were collected before the start of pre-season (PS) and at in-season weeks (W)1, W4, W8, and W12 in NCAA Division I male soccer players (n = 20, mean ± SD; age = 21 ± 1 years, height = 180 ± 6 cm, body mass = 78.19 ± 6.3 kg, body fat = 12.0 ± 2.6%, VO2max 51.5 ± 5.1 ml·kg⁻¹·min⁻¹). Fitness tests were measured at PS and W12, and TL was monitored daily. Changes in biomarkers and performance were calculated via separate repeated-measures analyses of variance. Despite similar fitness (p > 0.05), endocrine, muscle, inflammatory, and immune markers changed over time (p < 0.05). Total and free testosterone were lower at W1 vs. PS, whereas free cortisol remained unchanged at PS, W1, and W4 (>0.94 mg·dL⁻¹). Oxygen transport and iron metabolism markers remained unchanged except for HCT (W1 vs. PS) and total iron-binding capacity (W8-W12 vs. W1). Hepatic markers albumin, globulin, albumin:globulin, and total protein levels were elevated (p < 0.05) at W12 vs. W1, whereas aspartate aminotransferase and alanine aminotransferase levels were elevated at W1-W12 and W8-W12 vs. PS, respectively. Vitamin E, zinc, selenium, and calcium levels were elevated (p < 0.05) at W12 vs. W1, whereas Vitamin D was decreased (p < 0.05). Fatty acids and cardiovascular markers (omega-3 index, cholesterol:high-density lipoprotein [HDL], docosahexaenoic acid, low-density lipoprotein [LDL], direct LDL, non-HDL, ApoB) were reduced at W1 vs. PS (p ≤ 0.05). Immune, lipid, and muscle damage biomarkers were frequently outside clinical reference ranges.
Routine biomarker monitoring revealed subclinical and clinical changes, suggesting the need for soccer-specific reference ranges. Biomarker monitoring may help support positive adaptation and reduce injuries from the stressors incurred during soccer.
-
8.
Energy Balance and Diet Quality During the US Marine Corps Forces Special Operations Command Individual Training Course.
Sepowitz, JJ, Armstrong, NJ, Pasiakos, SM
Journal of special operations medicine : a peer reviewed journal for SOF medical professionals. 2017;(4):109-113
Abstract
METHODS This study characterized the total daily energy expenditure (TDEE), energy intake (EI), body weight, and diet quality (using the Healthy Eating Index-2010 [HEI]) of 20 male US Marines participating in the 9-month US Marine Corps Forces Special Operations Command Individual Training Course (ITC). RESULTS TDEE was highest (p < .05) during Raider Spirit (RS; 6,376 ± 712 kcal/d) compared with Survival, Evasion, Resistance, and Escape (SERE; 4,011 ± 475 kcal/d) School, Close-Quarters Battle (CQB; 4,189 ± 476 kcal/d), and Derna Bridge (DB; 3,754 ± 314 kcal/d). Body mass was lost (p < .05) during SERE, RS, and DB because EI was less than TDEE (SERE, -3,665 ± 475 kcal/d; RS, -3,966 ± 776 kcal/d; and DB, -1,027 ± 740 kcal/d; p < .05). However, body mass was restored before the start of each subsequent phase and was not different between the start (86.4 ± 9.8 kg) and end of ITC (86.7 ± 9.0 kg). HEI score declined during ITC (before, 65.6 ± 11.2 versus after, 60.9 ± 9.7; p < .05) because less greens or beans and more empty calories were consumed (p < .05). Dietary protein intake was lowest during RS (0.9 ± 0.4 g/kg) compared with all other phases, and carbohydrate intake during RS (3.6 ± 1.0 g/kg), CQB (3.6 ± 1.0 g/kg), and DB (3.7 ± 1.0 g/kg) was lower than during the academic phase of SERE (5.1 ± 1.0 g/kg; p < .05). CONCLUSION These data suggest that ITC students, on average, adequately restore body mass between intermittent periods of negative energy balance. Education regarding the importance of maintaining healthy eating patterns while in garrison, consuming more carbohydrate and protein, and better matching EI with TDEE during strenuous training exercises may be warranted.
-
9.
Strategies for injury prevention in Brazilian football: Perceptions of physiotherapists and practices of premier league teams.
Meurer, MC, Silva, MF, Baroni, BM
Physical therapy in sport : official journal of the Association of Chartered Physiotherapists in Sports Medicine. 2017;:1-8
Abstract
OBJECTIVES To describe physiotherapists' perceptions and the current practices for injury prevention in elite football (soccer) clubs in Brazil. DESIGN Cross-sectional study. SETTING Group of Science in Sports & Exercise, Federal University of Health Sciences of Porto Alegre (Brazil). PARTICIPANTS 16 of the 20 football clubs involved in the 2015 Brazilian premier league. MAIN OUTCOME MEASURES Physiotherapists answered a structured questionnaire. RESULTS Most physiotherapists (~88%) were active in the design, testing, and application of prevention programs. Previous injury, muscle imbalance, fatigue, hydration, fitness, diet, sleep/rest, and age were considered "very important" or "important" injury risk factors by all respondents. The methods most commonly used to detect athletes' injury risk were: monitoring of biochemical markers (100% of teams), isokinetic dynamometry (81%), questionnaires (75%), functional movement screen (56%), fleximetry (56%), and horizontal jump tests (50%). All clubs used strength training, functional training, core exercises, and balance/proprioception exercises in their injury prevention programs; the Nordic hamstring exercise and other eccentric exercises were used by 94% of clubs. The "FIFA 11+" prevention program was adapted by 88% of clubs. CONCLUSION Physiotherapists' perceptions and current practices of injury prevention within Brazilian elite football clubs were similar to those employed in developed countries. There remains a gap between clinical practice and scientific evidence in high-performance football.