And if you get the reference in the above quote, you are my new favorite person, cause that movie was HORRIBLE.
I’m such a perfectionist, y’all, that I hate being beholden to anyone for a grade. I’ve never liked group work. In fact, when discussing with my department head what types of assessments we wanted to set up for our brand-spanky-new Honors course in the fall, I adamantly opposed any kind of group work, even though group work is part of the honors program. “If these kids are anything like me,” I told her, “then they won’t like their grade being in someone else’s hands.” So no group work for those kids. And a lot of times, the reason for that dislike is that there’s no differentiation between the roles–nobody specializes in one thing over another, the way people do in a real-world group project. One person specializes in this section, another specializes in that section. Each person in a group has a valuable job–in the real world. In the classroom? Everyone’s on the same page.
Which made the group project for 520 a bit unique. Each person in the group had a unique function (though, in our case, we all did everything because one person was on vacation for a week, then I ran a Tough Mudder another weekend, and another member was away for a chunk of time–so our group isn’t really…a good indicator of group work. That, and we’re all perfectionists, but I digress), so each person contributed something different to the final product.
THAT is what good group work is all about–splitting the work evenly between people who are specifically suited to the job. If we could turn classroom group work into a more realistic scenario that translates into actual real world experience, then maybe I wouldn’t hate it so much.
….no. I really, REALLY hate having to depend on anyone else for my grades.
Anyway. If y’all are interested, you can find my group’s final Wiki page project here:
It’s that time again, y’all! So I spend a lot of time in this blog talking about what I’ve learned in my class and connecting it to what I do as a historian and as a teacher. This week was an awful lot of video presentations, and how to make things pretty and engaging with PowerPoint. First thing–I finally learned how to record my slides in PowerPoint and turn them into a video! You’ll see that below in a minute. What I really wanted to do for my video blog this week was film myself talking to y’all. But…well…I have anxiety about screwing it up, so I didn’t. Instead, you get a PowerPoint recorded by moi talking about digital history–namely, how we utilize various multimedia in a historical setting.
I realized, as I sat waiting for my PowerPoint to turn into a movie, that one of the pieces of media I included in a slide might not work within the movie itself, so that video is below.
Here’s the list of the examples I was telling you guys about. I highly recommend Liz Covart’s podcast. Also? She’s pretty prominent on Twitter–@lizcovart. Y’all should check her out.
National Archives: http://www.archives.gov/index.html
Library of Congress American Memory Project: http://memory.loc.gov/ammem/index.html
Ben Franklin’s World podcast: http://www.benfranklinsworld.com/
Doing History: A Podcast Series about How Historians Work: http://blog.oieahc.wm.edu/doing-history/
If you’re actually wondering what the answer to the question is within my blog’s title for this week, then get out. No, really, #byefelicia. Privacy as a whole is such a HUGE issue across the board–not just digitally, though that’s mostly what I’m writing about this week (don’t even get me started on the political necessities of privacy–we’d be here for DAYS).
For those of you who are still here, you obviously know (or should, anyway) that we ALL need privacy, to a certain extent. We all have thoughts and feelings that, likely, are not fit for public consumption. “But Erica,” you might be saying right now, “of course the people want to know everything I think and say and have for breakfast. After all, that’s what social media is for!”
No, it is not. We, as a society, share FAAAARR too much on social media. Nobody can hold any secrets anymore. Nobody can keep their own thoughts and opinions to themselves. We have become a culture of oversharing, and I am just as guilty of doing so as everybody else is.
My mother always taught me not to discuss three things in mixed company–religion, money, and politics. Funny how Facebook, for example, is the very definition of “mixed company” and yet we have no problems spouting our opinions on Clinton versus Sanders, or Trump, or how God has touched our lives (or not). We have no problem whining about how we don’t make enough money, even when we friend our bosses on Facebook and they can see the worst sides of us.
How the hell did we get here? I took an online quiz put on by the ACLU (the link is in the caption of the screenshot)–do you know what it told me? Here’s a screenshot of my result:
We share SO MUCH INFORMATION. And we have no control over any of it. We have this illusion of privacy, as though we think what we say “in private” on Facebook won’t get around. Ha. The best explanation of why this is wrong comes from Victor Dorff (2016) in an article for the Huffington Post: “Like a teenager who is horrified to learn that the little lock on her diary was insufficient to protect her secrets from a prying sibling, Americans have been surprised over and over again to learn that nothing they say or do is necessarily a sacred secret” (para. 7).
So why do we do it? I can’t answer that question. It only seems to bring strife, for the most part. Story time: my mother and I got into it last week over–what else–disparate political views. We can’t seem to agree to disagree there, and when we were talking (read: arguing), she mentioned that she wasn’t allowed to post what she really thought about certain things because she’d get in trouble or judged. I argued back that she was allowed to, certainly; she just chose not to. And therein lies the fundamental difference between my mother and me–I choose to enter the fray. I choose to vocalize my opinions and beliefs. I choose to share the fun parts of my day on Facebook, just as I choose to share when I’m irritated about something. I am an open book, as I’ve often said, and sometimes that book comes with certain strong political views because something something FIRST AMENDMENT ‘MURICA YEAH!!!
But in all seriousness, digital privacy is starting to become less of a personal issue and more of a political one. I’m sure you all know about the iPhone belonging to the San Bernardino shooter that the FBI couldn’t get into. They wanted Apple to build a code for a back door–one that could then ostensibly be used any time the FBI needed information that they couldn’t get otherwise. There are significant privacy concerns there, according to Apple, and I can’t say I blame them. If they wrote a code to get into THIS iPhone, what’s to stop the FBI from getting into any iPhone they chose, regardless of whether someone committed a crime or not? And what’s worse is that less than 37% of Republicans at the time sided with Apple (Waddell, 2016). What possible reasons could the Republicans have had for not supporting Apple and the right to privacy? I don’t know, personally, but at a guess, I think it keeps us scared. So long as we’re scared, we’re malleable. When we’re scared, we give away rights left, right, and center. While I will own that the following quote is taken somewhat out of context, as Benjamin Franklin was discussing a tax dispute between the Penn family and the Pennsylvania General Assembly in 1755, this one always comes to mind:
Not for nothing, but if a Founding Father saw that giving up rights for safety was stupid, why can’t we?
So yes, I might be presenting myself (at least on Facebook) in a manner unbecoming of an academic, but my answer to that criticism is as follows: First, my Facebook is my own personal soapbox. I am allowed to speak and behave in a certain manner there because, at the end of the day, my Facebook is where I am most comfortable. It’s where I can have debates with family and friends over various topics, and where we can share information and knowledge. If anybody doesn’t like that I utilize my Facebook in this way, then my only response is this: don’t let the door hit you on the way out. Second, I have certain social media platforms that I use in specific ways. Twitter, for example, is my academic side. I very rarely post anything personal on there (anymore–I will own that; I’m transforming my Twitter), and I’m utilizing it as more of a place where I go as a historian and as an educator to share ideas and to pass on the passion for history that I have. I use Twitter as a teaching platform. I use Facebook as a personal one. I see no reason why this should change, regardless of anyone’s opinions of my opinions. They’re mine, and I have the right to speak them, just as someone else has a right to speak theirs. In the same vein, people have the right to be offended by or angry at my opinions. Therein lies the strength (or weakness, depending on whom you talk to) of the First Amendment. So my personal Digital Citizenship statement? I have a First Amendment right to say what I think, but others have a right to not like it, which means I need to be careful of what I say, where I say it, and how I say it.
Hell, maybe my mother has it right. Maybe it’s best to just keep my mouth shut. As Aaron Burr said in that great musical of our time (Hamilton, in case you missed the last blog post), “ev’ry proclamation guarantees free ammunition for your enemies.” Maybe it is best to “talk less, smile more.”
Barrett, B. (2016). Update your Pokemon Go app now to fix that privacy mess. Wired. Retrieved from: https://www.wired.com/2016/07/update-pokemon-go-app-now-fix-privacy-mess/
Dorff, V. (2016). Privacy: a failed experiment. The Huffington Post. Retrieved from: http://www.huffingtonpost.com/victor-dorff/privacy-a-failed-experiment_b_9730878.html
Franklin, B. (1755). Pennsylvania Assembly: Reply to the Governor. Votes and Proceedings of the House of Representatives, 1755-1756 (Philadelphia, 1756), pp. 19-21. Retrieved from: http://franklinpapers.org/franklin/framedVolumes.jsp?vol=6&page=238a
Waddell, K. (2016). Is digital privacy becoming a partisan issue? The Atlantic. Retrieved from: http://www.theatlantic.com/technology/archive/2016/03/is-digital-privacy-becoming-a-partisan-issue/472449/
After a short break for a class that didn’t require blogging (which, by the way, statistics and I are decidedly NOT friends), here’s another round of “Erica blogging about school!”
My course this module is “Digitally Mediated Teaching and Learning.” “But Erica,” you’re probably asking, “what does that mean?” The answer? I….haven’t the foggiest idea as yet. Mostly so far we’ve been talking about our online identity and setting up our PLE (personal learning environment). That makes it fun for me because half of my work was already done! Because I started here with a graduate certificate in online teaching, I had to set up my PLE last year and use it for the classes I’ve taken thus far. Apparently, they’ve changed the recommended order in which M.Ed. students take their classes. I took mine in the original order, but now they’ve adjusted things so that 520 is the second class M.Ed. students take.
This is my eighth. So that’s making life interesting. You know what else is making my life interesting? That my poor professor has to deal with me texting her at 8 PM on a Saturday night freaking out about an assignment.
Anyway. It never occurred to me to be concerned about my digital citizenship–which is just a fancy way of saying “follow Wheaton’s Law.”
Digital citizenship can be defined as the “norms of responsible, appropriate internet use” (Ribble, 2016, para. 1). There are nine elements to this, including digital access (can you get to it?), digital commerce (do you know how to keep your information safe when shopping online?), digital etiquette (see Wheaton’s Law above), and digital literacy (do you know how to use it?), to name a few (Ribble, 2016). There are so many resources out there regarding how to teach digital citizenship to elementary students–which we absolutely should do, because then maybe we’ll see fewer stories about cyberbullying and just general Internet meanness. But I digress.
Part of digital citizenship is how we’re perceived online, or our online identity. Part of my work for the past couple of weeks was to, essentially, research two prominent people in my field and study their online identity. How do they put themselves out there? I could’ve used prominent educators–in fact, my professor gave our class a bunch of names of people that we should know about anyway. But, at the end of the day, I’m a historian. Yes, I’m an educator, but I’m a historian at my core. So I went to find a couple of awesome historians.
By the way, if you’re unfamiliar with the musical Hamilton, be prepared to be educated, son.
One of the historians I chose for my project was Joanne Freeman. Some of you might have seen her in random documentaries on the History Channel or PBS. Maybe you’ve checked out Yale’s Open Courses, in which she teaches an entire course on the American Revolution. She wrote a book on duels a bunch of years back that really got into the nitty-gritty of the duel between Alexander Hamilton and Aaron Burr, and she’s worked with the National Park Service on the reconstruction of the Hamilton Grange National Memorial. She also was privileged enough to record a plenary session with Hamilton creator and actor Lin-Manuel Miranda. I’m excited about this because that plenary session is for next weekend’s Society for Historians of the Early American Republic (SHEAR) conference–which I’m attending. But I digress.
For those of you who don’t know, I’m an early American historian. I could sit and talk about the Founding Fathers for DAYS. And I love me some musical theater, so putting together a Founding Father and a musical? I’m hooked.
…my husband just called me a hipster, because I was into the American Revolution before it was cool. Huh.
Anyway, Dr. Freeman is so prolific on Twitter it’s ridiculous. AND Lin-Manuel Miranda follows her on Twitter…probably because, aside from Ron Chernow (who wrote the book the musical is based on), she’s THE Hamiltonian historian. Being able to study her work, and knowing just the kind of role model she is, is really inspiring as a woman historian. American revolutionary history is coming into a sort of renaissance because of Hamilton, which is interesting, considering that when I went to another annual meeting of historians, I was informed that the scholarship is moving away from the political aspect of the Revolution and more into the social aspect of it…and yet, you can’t talk about Hamilton without talking about his politics.
“But Erica,” I’m sure you’re asking at this point, “what does this have to do with online identity?” Maybe it doesn’t, in a direct way, but indirectly, Dr. Freeman shows us how to take history and make it relevant to the next generation of students. And it’s not just Hamilton–it’s history in general. She does such an amazing job of taking random historical…stuff, tweeting about it, and starting conversations that may not have happened otherwise. There was a tweet…somewhere…on her page (forgive me, as I can’t remember when it was, and she’s so prolific that I’d have to go searching)–she brought an original revolutionary pamphlet to Lin-Manuel to read, and he tweeted a picture of it. Someone responded to that picture with a comment about how she’s inspiring a new generation of historians. Given our current political climate, that’s absolutely necessary.
Look, I don’t talk politics as a general rule, for three reasons. One, my mother always taught me that there were three things you don’t discuss in mixed company–politics, money, and religion–and the Internet is, most definitely, mixed company. Two, my mother and I have fundamentally different opinions on the political landscape of America, and I’m not stupid enough to engage in a political argument with my mother. Three, generally speaking, most people who talk about politics have no comprehension of how this country was set up, how it was founded, or why it was founded the way it was. I can count on one hand the number of friends I have who can tell me what the three branches of government are, much less what’s in the Constitution (and, more importantly, what ISN’T in the Constitution). History is, at its core, a study of what not to do, and when we don’t study it or pay attention to it, we do so at our peril. We need to educate the next generation in the history of our country, in the founding of it. We can’t understand where we’re going, or why this year’s election is so scary, if we don’t know where we come from.
But we can’t educate the next generation of students if we aren’t setting ourselves up as solid role models for them, if we aren’t putting forth a solid online identity for them to listen to and follow and learn from. And I firmly believe that identity needs to be digital. The next generation of students is going to be solidly online (she says as her daughter moves from one digital device to the other constantly), which means they might not be listening to us anymore if we’re not online–if we’re not relevant. And that’s where Hamilton comes into play.
In bringing Hamilton to the stage, Lin-Manuel Miranda brought this Founding Father to life. He has caught the attention of the students who will make up the next generation of historians. He’s made the American Revolution and the building of the Constitution relevant again. And to be able to connect to those students–to that generation–is something that all historians should aspire to do. Because these students are going to be the ones writing our history in the future. They should at least know where we came from.
Miranda, L. [lin_manuel]. (2016, July 9). An FB status update from 5 years ago today. [Tweet]. Retrieved from: https://twitter.com/lin_manuel?lang=en
Ribble, M. (2016). Digital Citizenship. Retrieved from: http://www.digitalcitizenship.net/
Ribble, M. (2016). Nine Elements of Digital Citizenship. Digital Citizenship. Retrieved from: http://www.digitalcitizenship.net/
To answer my own question, nothing. 🙂 That’s not true.
Because this class was so challenging for me, it meant that I wasn’t really focusing on all of the things I was apparently supposed to learn; rather, I focused on the things that my brain (yay cognition!) apparently decided I needed to know to further both my own educational goals as well as my goals within the course that I teach. How do you decide what’s important to learn? Short answer–your brain does that for you. And thus, mine did.
One of the biggest things I took out of this course was the lessons I learned from the Perkins book. Learning is not about me, really, but about how I can better help my students take the knowledge I’m trying to give them and apply it. I teach history, so maybe they won’t be able to apply it to their own lives as a general rule, but if they’re better able to argue for or against the Second Amendment because they took my class, awesome. If they’re able to better argue why one presidential candidate is scarier than another because of expansion of presidential powers; if they’re able to discuss competently why the current issue of blocking a Supreme Court nominee is detrimental to society as a whole; if they’re able to argue why Apple absolutely should not build a backdoor code to the iPhone–I’ve done my job.
But for all that I’m relatively good at my job (though it’s still a work in progress–don’t judge; I’m still new at this), there’s always something new that I can learn to help them learn. Perkins’ book was probably the most useful thing, next to a book about online learning pedagogy from another class I took, that I’ve ever found, because it really taught me how to view things differently. I waxed poetic about the Perkins book in my last blog post–you should take a look, and then go find the book and read it because educational reasons.
Cognitive science is, by definition, the science of learning; cognition is how you learn. I’m always telling my students to be cognizant of the posting requirements, or of the questions being asked, or of their spelling/grammatical errors (because none of them proofread, apparently. Ever.). But what does that mean to them? I’m realizing that I’m phrasing things wrong. I’m not teaching smarter–I’m teaching harder. I’m trying to get them to be aware of their mistakes so they’ll fix them. This is Perkins’ hearts-and-minds theory that fails so spectacularly across the board. My students can be aware of their mistakes all day every day, but unless I’m teaching them how to fix them, unless I’m giving them a better understanding of why what they’re doing is inaccurate, nothing is going to change. That mistake is on me.
If you were to tell me seven weeks ago that I would have learned a different way of teaching my students because I understand how their brains work a little better, I’d’ve called you a liar. Learning styles and all that are great, and yeah, we know all about those, so what else would I have needed to know? Yeah, about that…
I needed to know that there are better ways of teaching the same information–ways that let students build the connections they need in order to learn what I’m teaching them, but also learn how to apply those skills to other things. Do I want them to be able to argue historical points? Absolutely. But more to the point, I want them to be able to take a source–any source, be it a book or a newspaper article or something else–and work through it to analyze it for what it doesn’t say. I want them to be able to support any arguments they make with well-thought-out evidence, because it means they’ve learned how to do research, and not blindly believe what other people tell them is the truth. I want them to question authority, because authority is not perfect. And if they don’t know how to do that, then I’ve failed as a teacher. My job isn’t just to drill information into their heads–they may or may not learn it that way, because the brain is screwy and cognition can screw with us. (I think I included that CrashCourse video in another blog post, but since I love those guys so much, here it is again.)
We as instructors live with an illusion when it comes to teaching our students. We believe the illusion that we’re better than this, that our students just don’t get it, and that it’s not our fault that they don’t get it. That’s our perception–and I believe it’s faulty. Because so many people equate cognition to, essentially, human computing, it’s easy to assume that learning is also static. However, behavior and performance are not static (Booth, 2012). It’s far too easy to put people into little boxes because you don’t understand how they work, but therein lies the beauty of the brain and the various cognitive processes—it’s for this very reason that AI is so far behind the human brain. The illusion that learning is static makes it very difficult for us to think outside that box in regards to instruction. However, if we realize that this idea of a static mental representation is simply an illusion—a subjective perception—it’s easier to see past it to the reality beyond. And that reality is that maybe we’re doing something wrong. Maybe something we’re doing isn’t working. So rather than just blaming the students, or the resources, or even ourselves, why don’t we fix it? Think outside the box and find a way to look past the illusion to the truth beyond. What harm could it do?
Have you ever been in the middle of doing something, and something happens and breaks your concentration? Like, I’ll be in the middle of doing homework, and my husband will do something really cool in his video game, and I’ll lose my train of thought, because video games are easier to pay attention to than my homework (especially for this class, and that’s literally happening right this second). Or I’ll be reading something important, and my kid will call me, and I will completely forget the thing I just read, so when I go back to reading it, I’ll have to read it all over again and hope like hell this time it clicks. Sometimes–and this is the really fun part–I’ll have the ability to sit down and just hammer out my work, and I’ll open my computer, read what I have to do for the week, try it a bunch of times and break down in tears because I just. Don’t. Get. It. So trying to pick that topic back up is ridiculously hard.
Apparently, all of these things are researched aspects of cognitive science. The idea of emotions screwing up your work? There was an entire study done on the relationships among cognition, emotion, and motivation in psychopathology–essentially, that it doesn’t matter how motivated you are, and it doesn’t matter whether that motivation is internal or external; sometimes emotions just screw you up. Motivation is important to getting things done. Let me open up to you here. But first, here’s a video about motivation (I love these guys!).
I suffer from clinical depression and generalized anxiety disorder–things I was diagnosed with very early on in my college experience (I think I was about 17?). I have always been an extremely motivated individual–I’ve never needed my parents to bribe me or otherwise motivate me to do well; I do that all on my own. And generally speaking, I’m pretty good when things are challenging, because it means I’m learning something. This class, on the other hand, is ridiculously difficult for me. Like, tear-inducing, contemplating-quitting-the-program-because-I’m-obviously-not-smart-enough-for-this difficult. I don’t understand…probably half of what we learn about the first time we read it, and the other half not at all. It doesn’t matter how motivated I am; this class causes a significant amount of anxiety and defeated feelings, which in turn reduces my motivation by an order of magnitude (i.e., I start crying and telling my husband I can’t do this). Crocker et al. (2013) posited that depression and anxiety cause dysfunction in cognitive processing, which then leads to the breakdown of motivation in regards to achieving an end goal; specifically, that “deficits in specific EFs [executive functions] are at least partly responsible for key cognitive, emotion, and motivation features…including cognitive biases [and] motivation-related dysfunction” (Crocker et al., 2013, p. 5). Or, in plain terms: the higher my anxiety gets, the less my brain works, the stupider I feel, the less motivated I am.
This whole idea of internal versus external motivation comes into play a lot in online learning. I see it inside myself, and I see it with my own students–many of whom are in school for the very first time, trying to balance school with work and the full time obligations of taking care of a family. Some of them are working two jobs–many of them are in the military, with all the stress that goes along with that. It doesn’t matter how motivated you are; when life gets in the way, it’s hard to pay attention to the schoolwork you should be doing at that moment. Attention is, essentially, defined as the ability to selectively process information in an environment (Fougnie, 2008). It’s expected that students will be able to pay attention to their work when it’s an online environment.
Realistically, though, as online educators, we need to be aware that life happens, and that will, by default, cause students to pay attention to something else entirely. The goal at that point is to hope that the students’ working memory is strong enough so that when they come back to the work, they remember what they’d learned beforehand. What is working memory? Essentially, working memory is the ability to take what you’ve just read about/done and keep it as something that you’ve learned, something you can refer back to. You can also call this short-term memory, I guess, but it’s not…quite. But, whatever works. There was going to be a picture here, but this video is so much better. (Seriously, the CrashCourse guys are AMAZING).
It’s at this point that you need to consider that students’ working memory is…less than stellar. As a general rule, people only tend to remember things for no more than 20 seconds unless they apply the information in some way (Doolittle, 2013). So, if I’m reading something about attention and memory, and my child calls me away to check her own homework, I have about 20 seconds to apply what I’ve just read so I can recall that information later. Simpler things are, obviously, easier to keep. Harder concepts (like pretty much everything we learn about in this class) are significantly more difficult to apply or keep in mind when jumping from one thing that needs attention back to schoolwork.
The connection between attention and working memory is obvious to me–mostly because my working memory is atrocious. Which might explain why every module I think that my students will absolutely get it this time, and then I’m surprised and somewhat disappointed when they don’t. There’s an idea in Making Learning Whole that really clicks for me. Perkins (2009) talks about “near transfer” and “far transfer” in regards to how students make connections between information they already know and something I want them to learn. The concept of “bridging,” in which students make deliberate, thoughtful connections between two separate concepts, really spoke to me, and caused me to think about how I can connect things that my students already know, or have already learned about, or have read about in current events, with past events (Perkins, 2009). There are also a few insights into how to combat presentism (the habit of looking at historical events through the lens of today’s attitudes and knowledge) that will absolutely help me teach smarter, not harder, and not blame the online program or the curriculum for why my students just aren’t getting it.
As a historian, I understand the need to take myself out of 21st century mindsets and attitudes and think about how the events played out for 16th and 17th century people. My students have a very hard time with this, and I’ve been trying to teach them harder. I will own that I sometimes blame the program, as there are places I think it can be improved to help me out with this, but honestly? It’s on me to ask better questions of my students to get them to think outside of a very narrow box that they’ve been living in for quite some time.
I wish I could show you the image in the Perkins book that connects how to teach the trouble spots, differentiating between blame, focus, and explain (essentially teaching the same way, teaching harder, or teaching smarter). If you’re an educator reading this, and you haven’t read Making Learning Whole by David Perkins, then you’re missing out. It gives you ideas on how to get your students to perform better in a very personable, easy-to-read way. I highly, highly recommend it no matter who you are. Maybe you’ll get some ideas too.
Crocker, L. D., Heller, W., Warren, S. L., O’Hare, A. J., Infantolino, Z. P., & Miller, G. A. (2013). Relationships among cognition, emotion, and motivation: implications for intervention and neuroplasticity in psychopathology. Frontiers in Human Neuroscience, 1-19. doi:10.3389/fnhum.2013.00261
The past few weeks of my cognitive science course have been…challenging, to say the least. I’m not a sciency-type person, and while I’m not learning hard core neurobiology or anything like that, my brain does not grasp some of the concepts sometimes, so it’s taken a bit to really figure out what’s going on. That being said, there are a few things I’ve learned so far.
Thing number one: Cognition and cognitive science are really, really hard. So in keeping with the trend I utilize in my own course of finding easy-to-understand videos, here’s a video by Crash Course explaining cognition, and why we can be really, really stupid sometimes.
Thing number two: People really, really disagree on whether or not there’s such a thing as a learning style or a multiple intelligence. Dan Willingham put out a video disproving the whole idea of multiple intelligences and learning styles. Titled “Learning Styles Don’t Exist” (I know, it’s *really* vague on what the topic is, huh?), Willingham (2008) argues that people only believe learning styles exist because part of the theory is true. Well, if part of the theory is true, wouldn’t it stand to reason that the rest of it might also be true, and you’re just being obstinate or testing it wrong? I mean, maybe I’m wrong there, but I know for certain that I am much better at some things than others. I’ve mentioned Gardner’s theory of multiple intelligences in previous blog posts. One of those intelligences is logical-mathematical. I cannot math. I’m remarkably bad at it. I can add, subtract, multiply, and, in small numbers, divide. Long division? Nope—and that’s something my 11-year-old can do. Complex math? Troublesome. Imaginary numbers? Calculus? NOPE.
That’s part of what makes teaching and learning so difficult—you may not learn the way I teach, and I teach the way I learn. Which leads to thing number three: Children learn and solve problems in ways that are drastically different from adults, even when allowing for multiple intelligences and learning styles. Adults don’t really think outside the box; they’re more rigid and, in general, will latch onto a solution (even if it’s the wrong one) and run with it. An example: I teach U.S. History I to adults. These adults, in general, have very set thoughts, ideas, and beliefs about why the American Revolution happened. Many of those ideas are wrong, but it’s not their fault—it’s just how we teach history to children. What that means for me as an instructor is that my students are very “inside the box” when it comes to thinking critically about the Revolution as a whole and what the catalysts for independence were. Rather than think about what’s actually being asked, they insist on making the evidence fit their beliefs. It’s called historical bias (or bad history, if you’re my thesis advisor), and almost everybody who isn’t a historian does it.
But if I try to teach that same concept to my 11-year-old, the lightbulb immediately goes off, and she gets it. Her cognitive processes are more flexible because, at 11, she hasn’t gone through the experiences that build the preconceived notions adults are hindered by. She may not understand the politics and the nuances of the Revolution (she’s mature, but still 11), but she’ll talk about the basics of the concept and understand it. Those rules, that logic we’re so attached to as adults—that’s what kills adult learners. More than learning style, more than anything else—trying to rework the neural pathways of adults is like trying to tear down the Golden Gate Bridge and rebuild it in a day. It just isn’t going to happen. Adults, as a general rule, follow logic and rules more than anything else. “If I read this paper, then I’ll learn that topic.” “If I learn this topic, then I’ll pass the class.” It’s very linear, very structured. Children, on the other hand, are so much more abstract in their thinking, which means they solve problems in very different ways than adults do.
And this leads to thing number four (which might seem like it comes out of nowhere): We are absolutely screwed if artificial intelligence ever kicks off. Skynet (or the Matrix) will happen, and we will no longer be necessary. As it stands right now, AI is limited by the same rules and logic that adults are, and since it will be adults programming it, the AI won’t learn that there are exceptions to every rule. And then the Terminator will happen, and it’ll be all over.
A lot has been made of education lately, and of what, exactly, learning is. Trying to put together a mind map of what learning looks like is…difficult, to say the least, especially when you work in higher education. But I managed to do it. This is what learning is:
I do a lot at Post University. I’ve hit the trifecta, if you will. In my time there, I’ve seen a lot that I think can change, and I have some ideas on how to do that. Below is my Future Vision of Education for Post University.
We spent a lot of time these past couple of weeks talking about economic and demographic trends in education and, amusingly enough, it was something my mother talked about (albeit in a roundabout way) shortly thereafter. We were talking politics (something I try REALLY HARD not to discuss with my mother, as we’re on two opposite ends of the political spectrum), and the topic of education and “pulling yourself up by your bootstraps” (a phrase I LOATHE) came up. As always happens in an Italian household on major holidays, this caused an argument. There are things I won’t bring up with my mother, both out of respect for her and out of a severe dislike for confrontation. However, since it connected to our work these past few weeks, I don’t mind talking about it here.
My mother went back to school when I was four to be a nurse. She graduated when I was 8. She was lucky–she was, technically, a single mom that entire time, working part time trying to make ends meet–but my father and my grandparents happily took me whenever she needed to study, go to work or class, or later, when she was doing her clinicals. That last year? She had me two days a week, with my grandparents and my father splitting the other five among themselves. And my mother managed to make it work. She graduated with honors from Quinnipiac with her associate’s in nursing, and in more recent years has gone back for her RN to BSN and later her master’s in nursing as well.
Why does this matter, I’m sure you’re asking right about now. It matters because my mother was able to find reliable, safe, flexible child care whenever she needed it, without worrying about money. My mother was able to be a single parent and still work and go to school. Were we poor? Apparently, though between all of them I never knew it until I was an adult. Was it difficult for her? Probably, but not nearly as difficult as it is for the millions of single, low-income parents out there who want to better themselves but can’t, because they don’t have reliable, safe, affordable child care. They don’t have the ability to get to class consistently. They might not have the time to study or get homework done in silence, because they’re too busy going to work, coming home, helping their kids with their homework, making dinner, cleaning up, and putting their kids to bed. That’s a lot of work–work that (while I have all faith that my mother would’ve figured it out, because she’s nothing if not tenacious) I’m not entirely certain my mother would’ve been able to do on her own without all the help she had.
Which is why our conversation bothered me. It’s difficult to realize that your parents are so diametrically opposed to your views, even when you feel like they should be on the same side, given their struggles. My mother was that low-income single parent–the difference (and what has apparently made all the difference) is that my mother is white. At Post, the impression I get as an admissions counselor is that most of the students are low-income. It’s a fact that well over half of enrolled students at Post are minorities, and it’s a fact that less than 50% of minority students finish a bachelor’s degree in six years, versus almost 70% of white students (Carey, 2008). That’s…a ridiculous number, and it speaks to the immense racial disparity that exists in the United States right now.
I know that I’m generally more humorous in my postings, but this is too big of an issue to joke about. I know that this is a point I’ve harped on in my class, but it’s a point that bears repeating, especially as my mother’s generation gets further out of touch with their pasts. Lower-income students need more support from their chosen universities. They need more support from us–period.
Maybe that support is through support services within the university. Maybe that support is from the government by way of social services. Maybe that support is some combination of both. But the fact remains that our low-income students need us–and we’re failing them. And THAT is unacceptable.
^THIS^ is how it’s done.
If you’re looking for scholarships for single mothers, here is a great list to help you narrow down the search. You CAN do it–I believe in you.