I’ve been silent this past week, partly because I got sick, fell behind, prepared the house for weekend guests, and planned my soon-to-be four-year-old’s birthday party, and partly because, while I had a whole list of planned posts, I couldn’t concentrate on writing them. No, I was distracted by trying to come up with a way to write the following posts without impacting my own ethos as a writer and a teacher in higher education.
My post on the standardization of higher education from earlier this week was a hit, so to speak, driving traffic and stimulating some interesting discussions on Twitter. I’ve decided to address some of these concerns and continue venting about what I think is going to be the undoing of higher education in this country.
I received two tweets (one from @qui_oui and another from @rwpickard) about how a certain degree of standardization is necessary for transfer and the like. Look, I’m all for standards. We should all have a clear idea of what a 100, 200, 300, or 400 level class should contain within a discipline (how much to read, write, and the level of ideas/concepts expressed). I also understand that in other disciplines, you need to know a certain set of skills or concepts before moving on to the next level; I completely understand that Cal I has to come before Cal II, and that there have to be some standards in order for a student to make progress in their education. But these standards would seem to grow organically from disciplinary requirements. Sometimes they are imposed by professional organizations, but often in the name of safety; I’m glad that my nurse has a standard set of skills that are required of her before being accredited.
It’s when we get into the “softer” disciplines, like English, where I live, that things get dicey. I have written already about my experience teaching an upper-division Modern Literature course. I appreciated the fact that, within a set of clear guidelines (400-level class on English literature written during what is known as the Modernist period), I had the freedom to teach the texts that I wanted to, using the approaches that I thought would work best. I was able to “create” arcs, comparisons, contrasts, and evolutions with the works we studied. Modern literature is a huge field (much like any field in English), and each professor will teach the course differently, according to their biases and expertise, but also based on the make-up of the student body and institutional culture. What works in a Modern Literature course at Yale won’t necessarily work in a Modern Literature course at Regional State U. But we can safely assume that, given the guidelines and descriptions, a student coming out of an upper-division Modern Literature course should be able to do a certain set of things, from identifying the major authors and features of the movement to writing a lengthy, in-depth research essay on a work from that period. How we get there will vary wildly.
And it should. Some may point to my characterization of the class as a disaster as a reason why we need more, not less, standardization. The argument goes that I was not to be trusted with coming up with the class, and instead I should have been given the syllabus and reading list to teach in a prescribed way (hey, just give me the script while you’re at it). I say that my failure is an indication that the institution needs to invest in professors, not temp workers, to teach its classes. If the administration continues to undermine and devalue what goes on in the classroom, no amount of standardization and accountability measures are going to improve student learning. Saying that we should teach all students the same things in the same way, all in the name of accessibility, is not the answer.
Which brings me to the next point of contention: faculty, then, should take it upon themselves to develop the accountability measures. We do already; it’s called the syllabus and grading. Apparently, that’s not good enough anymore. But is that the faculty’s fault or the fault of an administration that continually undermines the classroom experience (and the professor’s authority)? I just came across this essay about how we, the faculty, are increasingly pressured to let learning slide in the name of “customer service”:
This is what I am talking about when I say that the administration often doesn’t support what professors and instructors are trying to do in the classroom, but then blames us when learning doesn’t happen. Students are seen as tuition machines, and we are told they are to be retained, at all costs. When a student isn’t happy, we hear about it and need to adapt to keep the customer satisfied.
I say, get our backs, get out of our way, and let’s see what happens.
Money is being invested everywhere on campus except in front of the classroom, illustrated by increasing class sizes, the increase in online education, and the over-use of adjunct faculty. Students get the message: the professors (and learning) are the least important components on campus.
And even when faculty are involved in developing the accountability measures, it is usually because they are being required to do so and have to follow narrow guidelines demanding very prescriptive (and arbitrary) outcomes, in order to feed the data machine. Yes, cosmetically, faculty came up with the measures, but our hands are tied, which impacts the results. Rather than having measures and standards that are organic to a given discipline, we have data-driven measures that give us stats, but little else.
Time and resources are also a factor. Often, it is an already over-worked tenure-track faculty member (or committee of tenure-track faculty members) who is tasked with coming up with the measures. Those measures are then imposed on even more precariously positioned instructors and adjuncts, who are already burdened with the demands of teaching intensive introductory courses to larger and larger numbers of students. But none of that comes into the minds of the administrators requiring the extra work from their instructional staff (tenure-track and contingent). There’s no course release, no reduction in class sizes, nothing. Something has to give, and it is either dropping other elements from the syllabus or devising the “easiest” measures to implement.
There’s a win-win situation for student learning outcomes.
I was at an institutionally-mandated get-together for those instructors who taught the various developmental classes (math, reading, writing) at our institution a few weeks ago. We were hearing about the educational technology the math department was using to get students up to college readiness when the instructor presenting told us a disturbing little anecdote about how she caught a cheater last semester. “It was just like Big Brother!” she exclaimed excitedly. Ugh.
Now, I’ve already voiced my thoughts about our over-reliance on ed tech as the savior of education, but this statement made me think about one of the unintended (or intended) consequences of the move to standardize higher education, heavily facilitated by educational technology: the constant monitoring of all activity of both instructor and student. If we can standardize and record every instance of learning in a student’s academic career, then we can certainly pinpoint where learning failed, exactly which teacher or advisor is responsible for derailing a student’s career.
The more we standardize, the more we continue to infantilize our students and undermine our faculty. We are basically telling students that they aren’t responsible enough to learn and professors can’t be trusted to teach. Think about that for a second. Students can’t learn, and we can’t teach, so you need to be constantly monitored to make sure that these things happen.
How does this move towards standardization and assessment actually help students? What happens when institutions and accrediting boards rigidly dictate when and where learning happens in higher education? When, instead of facilitating “informal” moments of learning, the university is required to produce (and requires) rigid reporting and return-on-investment data on campus talks, meeting spaces, and optional (but really mandatory) activities? Or when students (and eventually instructors/professors) measure success exclusively through test scores?
How do we teach and learn through experience, experiment, trial and error, and failures when Big Brother is always watching us? Does $44 billion really buy the Federal government the right to dictate to us how and what we teach, or how and when students can learn? As I put in the comments of Mary Churchill’s post “Can We Afford to Play,”
As we discover with young kids, we can spend all the money we want, but at the end of the day, all they want to play with is the empty cardboard box. I think the same thing goes for higher education, especially on the side of the professors. If professors didn’t have to worry as much about constant accountability measures, measurable outcomes, and reporting, we might be more likely to relax along with the students. If more people in front of the classroom had job security and more time, they may be more invested in the students outside of the classroom. If it didn’t feel like Big Brother was constantly monitoring all of us, we might relax, let loose, and really, really, learn.
At a certain point, the institution needs to get out of the way and just let learning happen. I have been critical of the type of “leisure” that takes place on (or rather off) campus, but is this behavior a result of the high-stakes, high-pressure environment we’ve created on campus? Most faculty and students can’t wait to get off campus at the end of the day; why is that? Universities have invested billions in creating “spaces” for students, faculty, and sometimes even community. Some have been very successful, but I wonder how many of them developed organically, and how many of them were responses to accreditation board requirements (having gone through two re-accreditations at two different universities, I know this is an important component of the process)?
We may end up passing whatever tests they put in front of us, delivering more mandated content in increasingly rigid ways, but at the end of the day, we have failed.
In Fahrenheit 451, one of the characters describes what school is like in the near future:
But I don’t think it’s social to get a bunch of people together and then not let them talk, do you? An hour of TV class, an hour of basketball or baseball or running, another hour of transcription or painting pictures, and more sports, but do you know, we never ask questions, or at least most don’t; they just run the answers at you, bing, bing, bing, and us sitting there for four more hours of film teacher. That’s not social to me at all. It’s a lot of funnels and a lot of water poured down the spout and out the bottom, and then telling us it’s wine when it’s not.
Now, read a Tweet from a teacher in LA:
f2f is going 2 end up being security aka paras 2 make sure kids dont get on facebook in jr college f2f will disappear.
If Sir Ken Robinson (and many others) is right that the way schools are set up now was to prepare workers for factories, what are we preparing our kids for now, as we increasingly rely on computers to teach them? How to follow orders from a machine?
This is, of course, a dystopic view of the future, fueled in part by the fact that I am currently teaching Fahrenheit 451. But I can’t help but wonder, are we really helping our most vulnerable students when we increasingly rely on technology rather than more traditional face-to-face instruction? Where are the mentorships, the relationships, the systems of support, of learning how to “think with others“? Certainly, we need to prepare students for a world that is increasingly interconnected through technology, but when do we say, enough, and start valuing, really valuing, personal interactions, rather than seeing them as an unnecessary cost, a budget line that is easy to eliminate?
Apparently, technology and online education are the real disruptive influence in education, allowing us to offer degrees for less than $10k. Having written about this very issue for the University of Venus recently, I remain skeptical. In the comments, the author of the post on creating a degree that costs less than $10k addresses my concern about teachers needing to eat with a response of only wanting teachers who are truly passionate about teaching. Great. More about how teachers are supposed to sacrifice everything for the greater good of “education.” I am all for a more entrepreneurial approach to education, but I think we keep trying to think bigger, when the true disruption may come from going smaller. If anything, money is being spent in the wrong place: in infrastructure instead of people.
I’m starting to see the movement in education as analogous to industrial farming; we all embraced farming technologies because food got cheaper, safer, more plentiful, and easier to grow (ok, education hasn’t gotten any cheaper, but isn’t that the goal of increasingly using technology?). But we now see that while it might be cheaper, it isn’t any healthier (and in many cases is less healthy), it is more devastating to the over-all environment, and it is only economically beneficial to a handful of massive multi-nationals. Is this really the kind of education we want to offer our children, particularly our poorest and most vulnerable? In poor neighborhoods, there’ll be fast food and private online edu.
The disruptive innovation in farming and food isn’t in technology; it’s in scaling down, finding balance, quality, and over-all sustainability. Organic farmers and growers, animal ranchers, urban farmers, and others are changing the way we think about food. We might see disruption coming from similar sources in education. Take, for example, a movement in England where people have taken over abandoned buildings and turned them into schools; curious people, some smartphones, and voila, learning. No bells, no whistles, no nothing. That’s disruptive. Not standardized, pre-packaged education offered online by underqualified individuals with little to no support. Government, school boards, and universities need to reinvest their money in the people who teach and create knowledge; the rest can clearly fall away and not impact education. In fact, it may facilitate it.
Next fall, I will be integrating a lot more technology in my classroom, in part because of forced standardization and accountability. But part of it is trying to make my class more effective. My job is to teach, but it is also to coach my students, particularly my developmental students. It’s to disrupt their worlds in order to encourage critical thinking or knowledge creation. A computer program might be able to award a student a “badge” (again, what is that preparing students for in their professional futures?), but a computer program can’t look a student in the eyes and tell them that they can do it, they can write, that they truly did a good job; ask them the right questions to get to the heart of whatever problem they’re having; care enough to keep asking; or even express sincere disappointment when they let you down.
There’s a reason why the children of professors overwhelmingly go to small liberal arts colleges. There’s a reason why rich and middle-class parents fight to send their kids to good schools with small class sizes and good teachers, and will continue to do so, no matter how expensive it becomes. Technology is a tool, not a replacement, nor a silver bullet, especially for our most vulnerable students.
It’s been a year since I started blogging. It seems like as good a time as any to look back over the year and reflect on how blogging has changed me.
Yes, you read that right, it has changed me. I am more engaged, more reflective, and, perhaps, more militant, in my own small way. I don’t just read about issues on higher education, I think about them in order to write about them here. When I teach (or, more accurately, after I teach), I am forced to reflect a little more carefully about what I am doing and why, because I need something to write about.
I am more connected to the larger community of academics. I write, people read, share, and respond. I know I have not only an audience, but a community of people who read and who I read. We have conversations, and maybe one day will meet face-to-face. Until then, I know more people than I ever did as a traditional academic.
And I know I am having an impact. I figure that between the four institutions I have taught at, I have reached approximately 1100 students (keep in mind, while I was doing my PhD, I only had one class; my other experiences were closer to full-time, but with writing-intensive classes with lower caps). At least that many people have read my top post, How Higher Ed Makes Most Things Meaningless, especially considering that it was featured on both Inside Higher Education and Ed Leader News. Imagine my delight to find out that no less a figure than Henry Adams, of The Academic Bait and Switch fame on the Chronicle, had linked to my post in the comments of another Chronicle piece (which I can’t find right now). More people have read that one post than I have ever taught. More than have ever seen me speak at a conference. More than have read any of my academic essays.
But the best part has been all of the people I have met outside of academia: those who are passionate about these topics, who reject the status quo of education at all levels, and who care deeply about meaningful change. For me, blogging has opened my eyes to the world outside of academia. Does that sound like a sheltered academic statement? Indeed, it is. There is a degree of willful ignorance that an academic needs to have in order to survive the demands of living the academic life in higher education. The best thing that has ever happened to me is that I was unemployed for a time; I was forced to see things differently and to do things differently. I saw others letting go and being successful, and it has empowered me to let go.
Blogging has also, admittedly, fueled the more negative aspects of my personality, manifesting itself specifically as an obsession with my blog’s stats. Lurking deep beneath my desire to be an academic is a need for validation, and the stats are one way that I can feel that sense of validation now that I am off the tenure-track. I see sites that do better than I do; College Misery gets the same amount of traffic a week as I do a month, if I’m lucky. Then again, misery loves company, and I’m not sure what thoughtful writing on the current state of higher education, as well as teaching, attracts. Fewer hits, apparently. Which is also depressing.
Wait, I’m celebrating here. I’m not perfect, and I still have some things I need to work on.
I’d really like to thank a few people: Mary Churchill who has been so supportive and inspiring me with her great work at University of Venus and Old School/New School; @ToughLoveForX who I have no idea how I “met”, but I am amazed at how connected this retired printer is, especially in the world of education; @comPOSTIONblog for founding #FYCchat with me; Worst Prof Ever for just generally kicking ass and doing and saying all the things I’m still not quite ready to; and all of the people who have come here, read my posts, commented, followed me on Twitter, shared my writing, and encouraged me to keep writing.
My goal for the next year? Get big enough to attract trolls. 🙂 I’m only half-joking.
One of my fellow writing instructors and bloggers, Laura at Red Lips and Academics, recently wrote about the challenges of teaching students in our culture of over-share. I’ve written previously about why I actually don’t mind assigning a narrative essay, even if it does reinforce some of their more narcissistic impulses. But the post, my own brush with wordlessness, and being in the middle of grading papers, made me think about what, exactly, our students are saying.
Juxtaposed with my brief brush with wordlessness is my son’s language explosion. He has just turned two, and the language center in his brain has finally awakened. All he wants to do is point to things and have us name them for him, then show off all of the words he probably didn’t even know he had locked away in his noggin. His excitement is palpable; he always wants us to read to him so he can point out all of the pictures he recognizes. He’s starting to sing songs.
As I mentioned in my last post, last week I ended up in the hospital for what we feared was a stroke. The symptom? I was no longer able to speak coherently. All of a sudden, what I meant to say and what I actually said no longer matched up. I was playing with the kids at the preschool, and suddenly, nothing I was saying to them made any sense. It wasn’t gibberish, but it wasn’t related to what we were doing or talking about. Thankfully, kids are more accepting of silliness, so they were easily dissuaded from asking too much about what was wrong, and I was wearing sunglasses so no one could see the abject terror in my eyes. My head had been hurting, and so I had previously texted my husband to come and pick us all up. By the time he got there, all I could manage to (haltingly) say was: can’t talk. He promptly took us home, scared one of his colleagues into coming over and babysitting, and we were off to emergency.
Sundown Friday saw the start of National Day of Unplugging. I didn’t know anything about that when I decided, on Thursday, to unplug as much as possible over the entire weekend, starting at about noon on Friday. Events this past week have left me…unmoored, and I needed time to think about what happened and what I want to do with the information. Some of it I will write about here. Other things will be referred to vaguely, much later, for fear of my job.
My grandmother used to clip and save everything; it wasn’t a successful reading session if she hadn’t marked off at least two pictures she wanted to eventually paint and clipped an article that she thought one of her daughters, grandchildren, or friends would be interested in reading. When I went away to university, I used to get letters from her that contained articles that mentioned my old high school, my old swim team, or future job possibilities, among other things. I always loved getting those letters.
I also have very clear memories of my grandmother wanting to show me an article or picture she had found and being completely unable to find it among the piles and piles of magazines and newspapers. She was in no way “drowning” in her magazines and papers; she recycled out what she didn’t need or want every week. And once she had shown you what she wanted you to see, out it would go. But my grandmother used to get so frustrated when she knew exactly what she was looking for but could not for the life of her find it.
I wonder sometimes how my grandmother would be in this more digital age; would she be emailing me links, bookmarking page upon page in Delicious? Would she still get overwhelmed, even without the physical piles and pages, and lose what it is she is looking for? I’m not very good at bookmarking links, marking tweets as favorites, or starring emails; I tend to get overwhelmed and purge frequently. I also figure that if I need it, I can google it. And then I, like my grandmother, couldn’t find an article I knew existed. I knew what site it was from (nas.org), and I knew what it was about (the university of the future), but I didn’t have the right keywords to find it (I kept searching “university” and “future,” rather than “Academic things to come”).
Thank goodness for Twitter.
An article about teaching students how much the internet remembers about them and the value of erasing parts of ourselves from the net got me thinking about how much is gained and lost, remembered and forgotten, in this digital age. I’ve worked with archives for my dissertation research, and the idea that these letters and manuscripts could be more readily and easily available both excites and dismays me. I’m excited because, hey, we all like easy access, and dismayed because I loved being able to hold the letters in my hand and read not just what I needed but also what was there. Having things easily indexed and searchable may be faster, but sometimes the joy is in the journey. What could be lost is something extraordinary that you weren’t necessarily looking for.
We have also, for a time, lost the ability to see the evolution of a piece of writing; unless you purposefully saved versions of the same draft, or the version with the feedback/Track Changes, then all we have left much of the time is the final version. Part of my research involved watching how a translation came to be, looking at the various drafts and edits the translator made and the feedback they received. Google documents could allow us to watch a document be shaped and evolve, but unless we consciously save the steps, the process will be lost.
Digitally, I’ve lost my wedding pictures when my husband’s computer’s hard drive was replaced without them first asking if he wanted a back-up of the old one. I lost all of my poetry from a period of five years because I accidentally left my diskette (yes, it was that long ago) behind in the computer lab; I don’t actually have a complete hard copy of them all, and, at the time, I didn’t have my own computer to back them up on. We have learned the hard way that ebooks can be taken away quite quickly and easily, making it hard to predict when our notes and annotations could be unceremoniously ripped from us.
Then again, I’ve had my “office” broken into when I was a PhD student (just before my final comprehensive exam) and all of my books stolen; pictures and documents can just as easily be lost in a fire, flood, or other disaster; and an irresponsible, careless, or oblivious person can just as easily throw out a physical letter as they could delete an email. My own research has gaping holes because a flood wiped out almost all of the personal papers of the author I was studying. And I also know first hand how fantastic it is to physically find something you might not have been looking for, simply because you had to search through everything.
As academics, whether we are digital humanists or not, we need to pay attention and rethink how and what it is we keep, and what might be lost.