What do you think they should teach at school these days? I think schools need more lessons about the things life is actually going to throw at you, whether that's careers, money, relationships, dealing with peer pressure, or anything else. People these days are so focused on whatever will get them ahead in the workplace that they forget you need a life outside your job to really live well. People often lose jobs because of outside influences, and I think humanity in general would benefit from this sort of teaching. I know I probably would have. Luckily I'm doing okay regardless, but yeah.