There is a famous English proverb: “Health is Wealth”. However, I now believe that “Health is more important than wealth”.
It's easier to regain “wealth” after losing it. However, it may not be so easy to regain health.
How much importance do we give to health in our lives?
Do we tell little children how important health is right from the beginning? Do we make them understand? Are there any special courses designed to explain the importance of health?
As an adult, I think I still don't understand the importance of health at times, or fail to realize or “accept” it. I get tempted to eat chocolates, brownies, and junk food, although I know they are not the healthiest choices.
It is important to consciously tell yourself what is healthy and what is not. It's not easy, but it's essential. I wish we all focused on health (physical and mental) and not wealth.
Imagine if parents told their kids that…
“Until the age of 25, you have to earn as much good health as you can.
From age 25 to 40, try to maintain good health levels…
From age 40 to 55, ensure that you take extra care…
Age 55 and beyond, try to remain as happy as you can…”
“Take out time to exercise throughout your life.”
Can “maintaining good health” become a culture? Would people start appreciating other people based on how healthy they are, and not how wealthy they are?
It is important for every individual to bring the discipline of health into his or her life.
Not next week, not tomorrow, but TODAY -> RIGHT NOW.