Diet culture has long been pervasive in America, with its focus on losing weight and slimming down. Among its many trends, one of the most persistent ideas is that all carbs are bad for your body because they make you gain weight.