Tuesday, June 10, 2014

 

The dim light of dawn is finally breaking through.

As a college counselor and a member of the faculty, I am also a member of a union, an irony that does not escape me. I'm also active in the union; since they defended me when the controversy surrounding Anya was threatening my job and career, I feel it's the least I can do.

With union membership come union meetings, and we are no exception. Lunch is served, and we discuss whatever is most pressing: contract negotiations, grievances, and the like. Recently, we've been focusing on the nature of higher education as a career, and the existential threats the profession (and our livelihoods) face. Someone in the audience went so far as to state that higher education is at risk of becoming a dying profession.

Financial aid faces its own threats, and I've cataloged them on this blog more than a few times. Teaching faces different threats (or so it seems, he whispers forebodingly...), and right now, massive open online courses (MOOCs), the prevalence of adjuncts, and centrally programmed curricula and syllabi seem the most immediate risks.

Now, in a vacuum, these three developments do represent a clear and present danger to the existence of full-time teaching positions, as well as a general threat to academic freedom. Nevertheless, I feel the teaching faculty is living out the parable of the blind men and the elephant: each person grasps one piece of the problem and mistakes it for the whole. In other words, they can't see the forest for the trees.

Let's begin with a rhetorical question: why do organizations hire someone? For that person to add value and increase the bottom line, i.e., make the company money. Either the new hire is replacing someone who left or was fired, or they're filling a new position. That explains why people get their jobs, and for some, that's enough. There may be no career ladder, or the nature of the job doesn't change very much, or the job is good enough that you don't want to move up. Although this group is usually pictured as laborers and skilled blue-collar workers, anyone who relies primarily on his or her technical skills could be included. Though it sounds strange, lawyers, doctors, and dentists are included. If you want more autonomy or more money, you can open your own practice, but you're still practicing law, medicine, or dentistry.

As a personal example, I would fall under this category. I've done this job since 1999, and the role I play in the organization has barely changed since I started working at my present job in 2001. My knowledge base has increased, and the regulations have changed and become more onerous, but the position itself is still reviewing tax returns, counseling students, visiting branch campuses or high schools, and tracking down missing paperwork. Anything I do above and beyond my duties may merit a raise in the future, but there's no new position on the horizon, nor am I looking for one. Not all financial aid offices are organized this way; in fact, most aren't. I'm happier with this structure.

The flip side of this is a position where someone is hired with the hope and expectation that they will move up the ranks, into new jobs, responsibilities, etc., that are materially different from the job for which they were hired. Someone is hired as much for their potential as for their present skills and abilities. A production assistant in Hollywood is looking to become a producer, and studios won't even look at you if they don't feel you have that potential. This carrot dangling in front of some poor 23-year-old film grad is what gives him the strength not to kill the actor who's screaming that his soy latte is two degrees too warm. Enterprise Rent-A-Car is another example of this. I interviewed with them a few times right after college, and the recruiter extolled how the company promotes from within, how everyone is a manager, etc. I suppressed a smile as she said this. My knowledge of HR and labor law was microscopic at the time, but I knew that my being a manager meant I wasn't eligible for overtime. They didn't hire me, which was fine. At least when I worked extra hours waiting tables, I made extra money.

Either camp requires the employer to invest in its employees, albeit for different reasons. I need occasional training in new regulations, while when I worked at Friendly's (my first job, some 26 years ago), I was cross-trained on every back-of-the-house job imaginable. I could grill, wash dishes, and make a Jim Dandy with the best of them. I could have been a manager if I'd stuck around. Ironically, I had more upward mobility at 17 than I do now.

The financial aid training I receive can be a webinar or a group workshop at a campus or hotel. The latter are more fun, with lunch and a few branded items to take home, like pens and stress balls. They're a chance to get out of the office, chat with counselors from other colleges, drink gallons of coffee, etc. The webinars are deadly dull, and I tune out after 20 minutes or so, but they can be done in our director's office, which makes them much cheaper. So why would the college allow the outing over the webinar? For one, it helps morale to get out of the office and shake up our routine, but the real reason is that the outcome is seemingly better in the classroom than from a webinar. That's the prevailing theory, but it raises the question: what if it isn't?

I'm not asking purely on the basis of automation (though it's a factor), or the singularity, or any other sci-fi tropes. I ask because private employers are asking the same thing. Is the reason you get a call center in India or the Philippines when you need customer service that they will do a better job than someone in Iowa? Probably not. Someone fluent in understandable American English who is also versed in American culture will do a much better job assisting me than someone from another country. It's gotten so bad that I once spoke to someone whose English was a) so accented I couldn't understand it, and b) riddled with grammatical errors. I demanded to know where in the world I was calling, and eventually pried out that he was in Costa Rica. I exploded and immediately demanded a native English speaker, who got so much attitude from me that she almost hung up. So why do companies do this? To save money. But wouldn't that potentially cause a business to lose customers? In the Costa Rican example, probably. I have called American Airlines and Citibank many times since then, with only one call going to a non-English-speaking nation: I was routed to the Czech Republic, and again I demanded an American agent. Agents in Mumbai or Manila, though, are proficient enough in English that I'll at least try to get through the call.

In other words, these private employers are running a cost-benefit analysis and believe they can save more money than they would lose by delivering an aggravating, though marginally competent, customer service experience. As someone who has worked in a call center, I'm a bit pickier than most, and I have limited tolerance for endless questions and garbled translations anyway. To keep my business, a company would need a few American or Canadian reps to help me and others like me. I can eventually get to someone Stateside, but I have to run a gauntlet of pissy operators to get there.

Restaurants are getting in on this as well. The Chili's in the Palisades Mall now has tablets at every table that allow you to order your meal and pay without interacting with a server. Naturally, I'm deeply offended by this development. Moreover, I don't see the benefit. Waiters cost almost nothing in salary, and a good one will give customers an excellent dining experience while up-selling appetizers, desserts, drinks, etc. Besides, I like talking to waiters and waitresses, and those jobs were a vital part of my returning to New York. I won't be eating at Chili's any time soon, but I wouldn't eat there anyway: I don't drink, and the food is lousy at best. Still, the hope is that the increased efficiency and table turnover will make up for any lost customers. I'd want to know how this affects tips, but Chili's can otherwise get fucked.

So why is this happening? Part of the answer is that while the results are not great, they weren't expected to be. The results only need to be good enough to keep customers from rioting online or taking their business elsewhere. The "New Rich" call this the 80/20 split, the belief that 20% of the effort gets you 80% of the results: 20% of your customers deliver 80% of the profits, and so forth. The other 80% is often not worth the trouble. Using my school as an example, well more than 80% of our students are in two majors. Is it really worth it to have the other programs? (I say yes, but that's another post.)

Companies cutting costs to the bone to help the bottom line is nothing new, but I sense something deeper at work. I'm beginning to believe that employers no longer feel that the potential future contribution of most employees is worth the effort needed to cultivate it. In short, just good enough is all that is required, and that means a lot less money and effort on the part of employers.

Now, those with a humanist bent would decry this as ignoring the potential within all people to grow, and the way their contributions benefit society. I agree somewhat (wait for it...), but a surface scan of the trend seems to contradict that. In reality, most people don't add that much value to a situation, and getting to that value may not offer a high enough initial return on investment to seem worth the trouble. The problem with that attitude is that you don't know who is capable of what, and this just-good-enough-to-get-by mindset causes errors to accumulate. It's similar to the logic behind the terrible mortgages handed out prior to the housing crash. Home buyers were given the opportunity to take possession of homes they couldn't hope to afford, and the brokers chopped the loans into microscopic cubes to be sold on the bond market. The brokers didn't hold the loans, so who cared if they were repaid? The realtors got their cut already, so if a home was foreclosed, that just gave them a chance to resell it and collect another commission. The examples are myriad, but you get the point.

That was the logic, anyway. The real-world results were a little different for the housing market. Those of us in higher ed recognize that this attitude, in addition to diminishing our contributions as teachers, builds structural errors into the system, errors that will devalue education and make degrees suspect.

Nevertheless, the writing is on the wall, and it reads that these changes are coming, despite the negative evidence surrounding online education. Only certain liberal arts courses adapt successfully to online learning; some other subjects (like business courses) can as well, depending on the specific class. Others, like science, math, and art, do not translate to the digital realm at all, though at some point in the future they might, if the technology allows. Also, and this is the main argument against MOOCs, the investment students have in an online course is generally lower than in an in-person class. That dynamic is worthy of a post of its own, so I'll research the subject more thoroughly.

Still, academic success is relative. What do you call the woman who graduated last in her medical school class? Doctor. Nurses have an expression, "nursing algebra," which states that C+ = RN. The 80/20 split is already at work in higher ed, but teachers and counselors at my school are just now seeing its macro effects. If a mass-produced class teaches most of the subject to some of the people for a much lower price than a classroom course, is it an acceptable trade-off?

The answer is maybe. The metric you want in this case is not the outcome of a regular online course or a MOOC (and for MOOCs, the outcomes are dismal), but the outcome of the more expensive class on campus. If the answer is yes, these classes, designed by the individual professor, are more effective at teaching, and students learn significantly more, retain more, etc., than in an online class or a pre-packaged one from Albany, then we as teachers can show why we need academic autonomy, why we need smaller classes, and so on. If the answer is no (and the outcomes can be closer than teachers care to admit), then our clamoring for money, autonomy, whatever, is misguided at best and simple greed at worst.

I'll go out on a limb and say that teacher-designed classroom courses are best, and for certain subjects, an absolute necessity. They're also required for certain students who need structure and help that aren't available online. For specific classes, such as writing- and reading-intensive courses, online may actually be better. MOOCs are a waste of time; no further discussion needed.

So how do we defend our position? That's easy: parents. More on that next time.






