One of the most annoying arguments I hear (besides the whole socialization issue) is that parents need an education to teach. Teachers have to be certified, so why aren’t parents? I don’t know about you, but I wasn’t given a degree to teach my kid how to walk, talk, ride a bike, tie his shoes, learn his ABCs, eat with a spoon, or do any of the myriad other things we teach our young children. So why is it that, at some point, we magically shouldn’t be allowed to teach our kids anymore? Why do we suddenly need a degree? Why do we need to be taught to teach our own kids?
Teachers earn a degree so they can learn how to manage a classroom full of different kids with different needs. They are taught classroom organization, policy and procedure, and all the other things they need to survive in the school environment. Parents don’t need to be taught how to organize their own kids; we do that on a daily basis. And we have the time to sit down one-on-one with them and find out how they LEARN!
This argument bugs me on so many levels. When did we, as parents, decide to give up our right to teach our kids?