When enrolling at university, I faced a dilemma: should I pursue academic or practical studies? As a teenager, I believed that a university should not be a vocational training institution. It was a bit idealistic to suppose that I should study to absorb knowledge rather than to prepare for a job, but as a result, I chose not to major in practical fields like accounting.
Meanwhile, despite my interest in subjects such as philosophy and history, I reasoned that I could self-study them in my spare time rather than pay high tuition for a degree in them. Chemistry therefore became my first choice: the opportunity to experiment with costly equipment would make the money I spent on the institution worthwhile.
Nowadays, I am more mature and practical. Looking back, I don't believe I was mistaken, but I had to accept the reality that moving to a career unrelated to my major would not be a simple transition. After graduating with a bachelor's degree in science, I worked in a kitchen utensil testing laboratory, checking plastic container samples for dangerous chemical leakage. Although the job was related to my studies, I realized it was not a career path I wanted to pursue: it was monotonous, with few chances for advancement.
I took an interest in programming and eventually moved my career into information technology. I studied programming day and night to survive in this demanding, grown-up environment. As a junior software developer, I learned a lot, from basic data structures to complicated cloud architecture. The more I learned, the more I realized how much I didn't know before.
Imposter syndrome is a regular occurrence in the information technology business, and I'm no exception, especially when working as a consultant, which requires me to learn something new quickly and present myself as an expert in front of a client. I went through a phase of pretending to be someone I wasn't until, eventually, I became that person. The I.T. industry evolves so quickly that I have to keep up with new information constantly.
If I had done things differently at university, I would have learned how to prioritize. The school didn't teach me how to learn; instead, it focused on knowledge transmission. Academics, not skill-based development, were the school's focus. I'd rather have learned about the metadata of knowledge: how knowledge is organized and how it is best acquired.
My academic life would have been much easier had I learned how to learn, since I would have studied more quickly and had more time to devote to other important things, such as extracurricular activities. It would also have equipped me for lifelong study, because education does not end with graduation. Graduation was only the beginning of my need to self-study a variety of life skills, such as public speaking, writing, and networking, to advance in my job. I could have made better decisions if I had read more books and gained a greater understanding of the world.
After graduation, I never had the opportunity to apply second-order differential equations in my regular job. Instead, I believe that in an increasingly competitive world, the quality of my thinking will give me an advantage, whether it's an idea that opens new doors, a technique that solves problems, or an insight that makes sense of everything. The more I know, the quicker and sharper I can move. That is why I should shift my learning from the academic to the skill-based, like the most brilliant minds with their cutting-edge thinking and best learning approaches.
Learning is one of the most important things I can do for my brain's health. Even as I grow older, I can continue to expand my brain's capabilities: as long as I keep learning, I keep generating new pathways in my brain. Learning maintains the brain's flexibility and fluidity, allowing me to process new information in useful ways. This is especially true if I set real challenges for myself along my educational path.
I should have exercised more, because exercise improves the brain's memory and reasoning abilities. According to researchers, regular aerobic exercise enlarges the brain regions involved in verbal memory and learning.
To capitalize on the benefits of learning new things, we should cultivate a learning culture that allows people to fail. We should be encouraged to learn from our failures and see them as opportunities to improve, adapting to changing circumstances and making the most of our experiences. Being curious should not leave us exposed or embarrassed.
We humans are, at our core, social beings who learn best from and in the company of others. The power of stories and the flow of ideas between friends underscore the value of being a good role model.
Aside from that, I'd have to learn how to learn and how to practise in order to improve. Practice leads to perfection, and perfect practice leads to ideal learning. Constant learning improves my overall ability to learn, much as practising a specific skill improves performance in that activity. Knowledge alters the structure of the brain, enhancing my ability to learn even more. In other words, knowing how to learn is a skill in and of itself, and it may help me learn more effectively in a range of areas.
Staying in my comfort zone is a great way to prepare for the present, but a terrible way to prepare for the future. To sustain my achievements, I need to acquire learning agility: rapid, continuous learning from experience. Agile learners can connect experiences and let go of ideas or techniques that are no longer relevant; in other words, they can unlearn when new solutions are necessary. People with this perspective are more likely to pursue learning objectives and be open to new experiences. I should go through the cycle of experiencing, seeking feedback, and reflecting systematically.
Developing a learning culture requires acclimating to change. Understanding and accepting that change is inevitable, necessary, and beneficial is essential for success in today's workplace. We must view change as an ongoing opportunity rather than a threat.