Wednesday, November 2, 2011

The College Degree Has Been Oversold; the College Education Has Been Misunderstood

Alex Tabarrok has a provocative post at Marginal Revolution arguing that college has been oversold. I think he misses the point: the college degree has been oversold because, like so many things in the past twenty years, it has lost its signaling validity. Ironically, this was driven by the flood of cheap money available for an education. As the supply of money increased, the cost and distribution of college degrees increased as well. This is exactly what happened in housing, where the oversupply of mortgage money led to an increase in the number of homes bought. The flood of housing funds led to more houses; the flood of education money led to more degrees. But since housing and degrees are non-fungible goods, the range of degree quality widened as well. A degree's signaling value diminished, while the brand-name degree's signaling value increased because it was the only reliable indicator. Look at Michael Lewis: what did he study at Princeton? Art history! Now, a single anecdote does not refute Tabarrok's point, but it does raise the larger question of what is meant by a college education.

In the past, the expectation was that students could do badly in college, because graduating was itself a sufficient signal of ability: the ability to reason and learn, regardless of your major. How else could Lewis go from art history to economics at the London School of Economics? Somehow the market focus shifted from the process of education to the product of education, which is the transcript, and students behaved accordingly. Want evidence? At my alma mater, Cornell, publishing median grades on transcripts, instead of encouraging students to take harder classes, provided hard information on which classes were the "guts." And students optimized for efficiency. No surprise in hindsight, and probably in foresight as well.

My background is unique in that I studied a STEM curriculum at Cornell (Computer Science and Biochemistry), but I switched from Engineering to the College of Arts and Sciences while maintaining my majors (a quirk of Cornell). Part of my motivation was to graduate in as close to four years as possible; another was that I realized I had to controvert some stereotypes. The clearest reason, though, was that I enjoyed a broader education. My conclusion, twenty years after graduation, is that my liberal arts classes were probably more important to my career than my engineering classes, BUT the classes in history, philosophy and literature were accelerants to my technical education. They probably were not sufficient in themselves.

The insight of my education was that every discipline has its own established models of thinking. And the question I was taught uniformly at Cornell was: how do you evaluate those models, and under what conditions do you modify them or accept new ones? This was uniform from my literature classes (how do you critique a piece?), to history (great men vs. the turn of events), to physics (classical vs. quantum vs. relativistic), to computer science (how do you determine a good solution?). In computer science we never had a class on "C" or "Pascal" or whatever, and we never argued about operating systems except to analyze what design decisions were made. What unified these classes was not the facts but the mode of thinking. The other thing that tied them together was that they were HARD. This was not fact-based education, and as such it was labor-intensive for both the students and the professors. That process, for me, was the education: not the facts, and not tangible skills such as running a gel electrophoresis or programming in a particular language.

I left college with an understanding that the key questions in approaching any problem are "What is the problem?", "What is the outcome?", "What do I know now?", "What don't I know to get to the outcome?" and lastly "How do I remove what I don't know?" And how did I reach this conclusion? It was drilled repeatedly until it became second nature. Mr. Miyagi was right: "wax on, wax off."

So in this discursive blog post, how did we get here? The truth is that fact-based classes are easier for all parties involved: they are objective, they are quantifiable and they are easy to implement. Students just have to do pattern matching to solve a problem; they don't have to understand it. Students will gravitate to those classes since the probability of a higher grade is greater. To get a "real" college education, there is heavy labor involved. For my non-technical classes there would be multiple drafts that focused on refining the argument, not just the grammar. This sucks for both students and professors. I was lucky that I had involved professors. What sucked for me was discovering, through the process, that I am really not that smart, and I have the grades to prove it. But I can say the process did improve me, and in the end I developed enough competency to be a semi-functioning member of society.

The STEM curriculum is not as amenable to being made less rigorous (though it can be done; fact-based questions such as "F = ?" really are no different from "Who won the 1997 World Series?"). The non-STEM curriculum, however, can be made perfunctory: student writes paper, professor grades it, end of story.

Despite accreditation bodies and other efforts at standardization, a true college education is a messy process, and employers don't like messy processes. Students want simple checkpoints, but in the end all we have is a piece of paper with no accurate signaling value. Just as today, if you see someone with a nice car, do they make a lot of money, or do they merely have the ability to get a loan? A college graduate has a degree, but do they have an education?

A college degree has not been oversold; it's just been rendered meaningless through devaluation, chasing the available money. But a college education? It's priceless. It's too bad I can't use a college degree to find out whether someone has a college education.

Sunday, October 30, 2011

A perfunctory world....

Today I spent much of my day recovering from the neglect of my personal life. This included dealing with the reams of mail that have piled up unattended during my travels. One of the items I reviewed was the set of summary reports from the different 401(k) plans that I participate in through different employers. If you have ever gotten one of these letters, you know that they are a joke. They give you a few numbers with very little context, explain that the full report has been filed with the government, and note that you are entitled under the law to obtain a full copy if you wish. One of my plans wants to charge me $5.00 for this information.

I realized that this whole exercise illustrates what has happened to our society. We are so inundated with obligations, come-ons, pitches and requirements that we all act perfunctorily. We go through the motions to meet the requirements without caring about, or even understanding, what we are being asked for. So I get this letter from my employer (or former employer) informing me that under the law they need to do this. So here you go: we are going to meet this obligation in the minimum fashion needed to satisfy the requirement. But what do I get from it? Not much, except that I can now request the full report should I want it. So basically this is a wasted piece of mail. I don't even know if anyone in the government actually reads what my plans submit; for all I know they could be submitting a novel.

I do get a few numbers, but lacking any context, all I can figure out is that my co-workers aren't saving a lot. What I don't know is how my plan compares to others. Are the funds liquid, should I need to roll over? What conflicts of interest might my management company have, and how were they evaluated?

What I don't get is a sense that anyone is acting in my best interest; I have to look after my best interests myself. That is the only right I have. For all the requirements and regulations, no one seems to actually read what is being registered; no one is trying to make sense of it. It is as if merely meeting a chain of requirements means that all will be good. And when things blow up, we create more rules and regulations, barely met, to avoid the next disaster. But they don't.

The philosopher John Searle came up with a thought experiment called the Chinese Room, in which you imagine you are inside a room, handed Chinese text through a slot along with a set of instructions on how to decode the inputs and how to respond. To the outside world, you appear to understand Chinese. Is that enough? Would you actually understand Chinese?
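Searle's setup can be sketched in a few lines of code (a toy illustration of my own, not from Searle; the phrases and the rule table are invented): the room mechanically maps inputs to outputs and passes as a Chinese speaker while understanding nothing.

```python
# A toy "Chinese Room": the room follows a rule book (here, a plain lookup
# table) and emits fluent-looking replies with no understanding at all.
# The phrases and rules below are invented purely for illustration.

RULE_BOOK = {
    "你好": "你好！",            # "Hello" -> "Hello!"
    "你懂中文吗？": "当然懂。",    # "Do you understand Chinese?" -> "Of course."
}

def chinese_room(note: str) -> str:
    """Slide a note through the slot; a scripted reply comes back out."""
    # No comprehension happens here, only symbol matching.
    return RULE_BOOK.get(note, "请再说一遍。")  # fallback: "Please repeat that."

print(chinese_room("你懂中文吗？"))  # the room *claims* to understand
```

The point of the sketch is that every rule is satisfied perfectly, and yet nothing inside the room knows what any of the symbols mean.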

Do all these rules lead to understanding? I realize that this is where we are today: so many of our work actions are done perfunctorily, and we get things done. But are they REALLY done? This is what is happening to our products and our companies: they are optimizing to meet the requirements, but meeting the requirements doesn't satisfy the objectives. We have test scores to measure quality, but we don't have students who learn. We have products that are bought, but they are not used.

I had a professor who used to make mistakes during lecture on purpose. He wanted to make sure you were understanding the lecture, not merely copying it down. Someone would almost always catch the mistake. His goal was not to give the lecture; his goal was to communicate understanding.

How much of what you do day to day misses the point? What are you doing to change this?

Sunday, September 11, 2011

Phoenix effect....

It's been a long time since I've written about product management and technology. I guess after dealing with it all day, I just don't have the interest or energy to write about the increasingly banal world that is technology, or more accurately, technology reporting. I have often remarked that Silicon Valley is just like Hollywood, except not as good looking. TechCrunch is our Daily Variety, and we talk in terms of sequels or hybrids of past successes. It's like "Dances with Wolves" crossed with "Smurfs"; it's like Loopt combined with flickr meets twitter. You get the idea.

So when Farhad Manjoo mused "Can Yahoo Be Saved?", I couldn't resist, as a former Yahoo and also because it is one of the more cogent explorations: light on melodrama, heavy on perspective. It makes me proud that he spent time on the Cornell Daily Sun, though I get nostalgic in a different way. But I digress.

Manjoo explains that the company is profitable (which surprises many people), that millions of people still visit (also surprising in the valley) and that there is great brand affection. But he says that despite this, Yahoo has lost its past glory: in the past you had many reasons to visit every day, and now less so. That is to be expected given the time Yahoo came into existence, when the internet was very much like the era of the big three/four television networks. Yahoo was the biggest place to go, and for all intents and purposes the only place you could go. And as with cable and satellite, higher quality channels proliferated. Remember when public access and C-SPAN were a major part of the lineup of your cable channel choices? Remember how bad (and then how good) ESPN was? It made its chops on America's Cup sailing. Have you watched sailing on TV? (But also remember that Keith Olbermann started there.) As the cable channels got better, and since we still only have 24 hours in a day, Yahoo as a media property was inevitably going to lose viewership. It really is a zero-sum game. Yahoo is still relevant in media in that it still does editorial well, and that's why people still visit.

Yahoo also cannibalized itself with its own subproperties: sports, finance, Shine, Flickr, etc. The goal was really like a traditional magazine group's: something for everyone, but not everything for everyone. The problem is these special interests don't get big; that's why they are called special interests. As in traditional media, the assets are in the economies of scale: shared printing presses, and so on. Yahoo was also relevant in that it was a one-stop shop for mail, search and instant messaging. In search it lost its way, since it failed to realize that search was a technology play, not a media play. And if Google or another company had not come along with orders-of-magnitude better search results, it is most likely that search would have remained fragmented, with the same lousy results of the past spread among multiple players. AltaVista, HotBot and Lycos really were all interchangeable, and the name of the game then was index size, not search result quality. Others thought that search was a commodity back then; that was the prevailing wisdom. Unless you bought Google at $95 and held on, you are most likely engaging in some revisionist history with your memory.

So Yahoo continued to focus on the media game, and these media assumptions cloud Yahoo's ability to reimagine itself. Think about asking a fish, "How's the water?"

The first crippling assumption is that Yahoo is a brand, and that the umbrella brand matters. People in Yahoo relate to Yahoo Mail, Yahoo Messenger, etc. But think hard of any media conglomerate where the umbrella brand matters. Conde Nast? I read the New Yorker, Wired, Vanity Fair and Ars Technica, but I never think of the Conde Nast brand. The only time I think about Conde Nast is when I want to eat in their gorgeous cafeteria filled with gorgeous people. The brands under Conde Nast are independent; they don't reinforce each other. People want products and will judge them independently, and in this case the brand association may be a hindrance. Think about the struggles with placing the Flickr brand in the Yahoo universe. Shine should have just been Shine, and you could log in with your Yahoo account. That's all. Note that this happens with most consumer products. Think Unilever: would you conflate "Ben and Jerry's" with "Dove" soap? What about Google, you may ask? Think about all the products getting sunsetted there.

The second assumption is that you can plan innovation. Rarely do you invent the next great thing in house. You usually see it elsewhere and acquire it. Apple may be the exception, but even the initial iPod design came from platform components from PortalPlayer and Pixo, though Steve Jobs's perfectionism was critical to the experience. Where did paid search come from? Google? No, Overture. Google Maps? The acquisition of Where 2 Technologies from Australia. Critics will counter that you need to innovate, and that that implies invention from within, when in reality it will be about finding something off the beaten path. An aggressive venture arm or an incubator program is about obtaining a portfolio of "call" options on the industry. Yahoo does have profits and does have money in the bank. It may want to outsource its venture arm, since...

The third assumption is that being the cool kid matters. Yes and no. Yes, in that it helps your corporate mojo, which makes it easier to obtain and retain talent. No, in that the tabloid business press follows the philosopher Henley: "Kick 'em when they're up, kick 'em when they're down." The things featured in the cool kids' press often aren't relevant. Companies such as Color, Loopt and Instagram won't move the needle for Yahoo. These products get lots of press, but it is disproportionate to their impact; at best they will be niche products unless...

The fourth assumption is that Yahoo needs horizontal network plays to regain "relevance." Only if the new kid brings a graph, protocols, APIs, or insurmountable amounts of data: anything that can lead to network effects or defensible data moats. To get a piece of these early start-ups, Yahoo money probably won't have the cachet. But this is where the action is for the long term. Think transactions, payment systems, relevancy engines, configuration: anything where there is a network effect or an accumulation of history. People don't leave flickr once their portfolio is there. (An aside about flickr: photo sharing is a media property, not a technology. flickr is professional-grade photography; Picasa is a geek's shoebox of photos. Different aims, and there is no global photo magazine.) If a play does come about, don't bring it into the Yahoo fold too quickly. Think Conde Nast.

Lastly, the fifth assumption holds true and is still a viable advantage: Yahoo can deliver audience. A link off the Yahoo homepage is the mother of all referrals. Everyone knows this. There is pressure to refer traffic strictly to Yahoo properties; instead, Yahoo should use its editorial advantage to link to whatever the new new service is. If an outside product really demonstrates value (one which, hopefully, Yahoo has an investment stake in), Yahoo's audience can be a kicker to promote a product that deserves it, not just one where there is a financial interest present.

I've been asked for my thoughts on Yahoo, and they come from the best analogy of a rebirth that I can think of, and it comes from China. Sina.com was very much the Yahoo of China, stagnating while trying to figure out its future, and it has been made relevant again with its Weibo (or microblogging) product. Think of a Twitter clone brought to life by Sina's ability to deliver audience to a product with network effects (usernames and followers). There are many phoenixes in the tech world; Apple, IBM and Xerox come to mind. To paraphrase the saying, "your ideas often go further if you don't insist on going with them." Yahoo can go further if it doesn't insist that Yahoo comes along.

Friday, July 15, 2011

What is intuitive? Know your metaphors!

As a product manager, one of the hardest tasks after determining a feature set is to work with an interaction or industrial designer to figure out how to implement each feature so an end customer can use it. The general consensus is that a product should be "intuitive," but there isn't a consensus definition of what "intuitive" means that doesn't use the word itself.

The widely accepted definition of intuitive (I will skip the quotes from now on, but you should imagine them every time you see the word) is that one should be able to discern a feature's operation without explicit instruction or description of how it operates. Embedded in that definition is an assumption: that the implementation of the feature will map to some established metaphor or model already in your user's mind.

The original Xerox PARC Alto workstation used the metaphor of a desktop, since the Alto was meant to be a document automation device. It was meant to replace your table top, and hence the interface was represented as an office environment. It is profoundly ironic that the word computer itself implies a metaphor, that of a computation device; yet aside from legions of Excel jockeys, few people actually do any conscious computing. It is telling that in Europe, computer science programs are often known as information sciences departments. Again, a different metaphor.

Most user design now focuses on two sources of metaphors for interface designs. The first is competitive products that have established the dominant metaphor or expectation. For newer products, designers will choose a metaphor of what they are trying to duplicate: a desktop, a camera, a chair, etc. So choosing the right metaphor is key to making something intuitive. But as our products go more global, the larger question to ask is: do all users share the same metaphors? Let me refine the question further: are there metaphors that are universal?

Before I answer that question, take a look at the following video on YouTube.



As you watch the video, consider the following: first, the window is a real physical object; second, it is spinning in only one direction. No change in the direction of rotation is occurring. This optical illusion is known as the Ames Window, and I'll let you look at the Wikipedia entry to understand how it works.

I first encountered this illusion in James Maas's intro psych class at Cornell. After having my mind blown, it was blown again when I was told that some people see through this optical illusion right away, and that those people were Native Americans. The reason: they lived in a world where windows were circular, so their minds did not try to coerce what they saw in front of them into their mental models, and hence they did not suffer any dissonance.

The same thing happens when product managers or designers have to specify a feature or its implementation. When choosing your design, do you assume that your users see windows as rectangular objects, or is it just that you and your co-workers do, because you spend all your time in the domain?

It is commonly remarked that children pick up new technology more quickly than adults. It really is a slight toward adults that children are so fast, when in truth they simply come to new technology without any established expectations. They see an Ames Window as it is, not as what they expect; they don't know what a window is. Adult users, despite their best efforts, cannot see through their assumptions, just as you probably did not see through the Ames Window. Another example: many products still represent the save operation with a floppy disk icon, but there is clearly a generation of users who have never seen a floppy disk. They know it is the save button, but they don't know its origins.

So be cognizant of which metaphors you are unconsciously assuming. This becomes really clear when you compare websites across countries, in particular between Asia and the West. Also be conscious of which other products have become the metaphoric model. Are you breaking too much with established expectations? Can you set new ones?

A good way of discovering the prevailing metaphors of your space is how you and others describe your product. Google is search, Facebook is social, LinkedIn is business social networking. The "is" is a dead giveaway of the dominant metaphor. People talk about Bing as a Google competitor, Google Plus as a Facebook competitor, Android as a more inexpensive iPhone. Those are the metaphors that will govern user expectations. In every case, redefinition is hard; it's much better to create a new category. An iPod is an iPod, not an MP3 player. An iPad is an iPad, not a tablet. The goal is to understand the assumed metaphors while defining a new one.

As John Ostrander once wrote in the comic book series "GrimJack": "you can hide from the truth, but you have to know it first." You can break from the established metaphor, but you have to know it first.

Sunday, February 14, 2010

Focus on Mastery over Multiplication

There are two jobs that I would never want as a product manager: being the PM for Microsoft Word or for Intuit's Quicken. These products are so well established that any additional feature will statistically make the product worse. They are so bloated and feature-rich that these inordinately complex products will only get more complex, as if that were even possible.

Sadly, as feature-rich as they are, they are far from perfect. However, the way shrink-wrap software works, any new release begs the question of what it can do that's new. The truth is that the assumption that a release has to do something new is what led to this untenable situation. It's the wrong question to ask, because it limits the notion of improvement to the realm of functionality. Many things can be improved simply by reworking what's been done in the past in a new way.

Artists get this. Often each new painting doesn't focus on creating something new, but on creating something better than it was before. That's why you see the similarities in the great artists' paintings. How many haystacks and lily pads did Monet create? He was trying to get it right. Think of Mondrian and his classic design, varied again and again to explore shape and form.

Think about the most ordinary of paintings, the Mona Lisa. It's just the portrait of a woman, but it gets each part right, in the right combination, to create something magical. So look to the masters: get the basics right, then make them better. Don't focus on new styles or subjects.

Customers buy your products to do something they couldn't do before. If they can do what they did before far more easily and far more quickly, that's something different without being new. I'd pay for that.

Sunday, October 18, 2009

It all starts with a powerpoint....

Ideas are great.
Ideas are cheap.
Ideas are power.
Ideas are the engine of innovation.

Ideas spread.
Ideas are documented.
Ideas must be sold.
Ideas end up in powerpoint slides.

You're done.

If it were only so easy. In my career, in various capacities across the product creation life cycle, I have seen thousands of PowerPoint presentations, and I have come to the conclusion that PowerPoints are the business equivalent of New Year's resolutions. Product managers create lavish decks detailing all the possibilities and reasons why the next product will change the world. Designers put up amazing mock-ups of the coolest UIs and whiz-bang interaction diagrams that will make a product so sweet you'll want to lick it.

At this point, the world is all yours; the PowerPoint deck represents all the possibilities of a great product. The only problem is that the product is only an idea: despite all that work, it's vaporware. In software, the product has to be crafted into code to actually be a product, and ultimately that code has to be compiled or turned into something that will actually run on a computer.

Wouldn't it be great if we could create a machine that takes the ideas captured in a PowerPoint presentation and converts them into code, where our imagination is instantly realized? But there isn't one, so when I work on a product, I always remind myself:

"Powerpoints don't Compile"

So until they do, you'll have to figure out how to convert powerpoints into code. That's what this blog is all about: how do you take the ideas in those powerpoints and turn them into code?