My Brilliant Amazon Career, By The Numbers

What was working at AWS and, later, Amazon’s mothership in Seattle like, you ask? As with any stint at a huge corporation, especially one for which you relocated yourself and your family*, it’s complicated, but if you’re looking for pure metrics, which would be very Amazonian of you, have at it. Presented without (much) further comment, here’s a selection of my personal statistics – no seekrit project information or team details, obviously – over my not-quite 4 years:

  • 3 teams
  • 3 roles/5 job titles**
  • 7 managers
  • 18 direct reports
  • 3 laptops
  • 4 buildings
  • 9 desk locations
  • 121 PhoneTool icons (for the uninitiated, it’s A Thing)
  • 83.55% Old Fart (an internal metric tracking the percentage of employees hired after me – so I was in the most-tenured 16.45%)
  • 3 other offices visited, 1 in the US, 2 in other countries
  • 12+ master global taxonomies managed, with millions of terms in each and in each local marketplace variation: Prime Video, Kindle, US Books, US DVD, US CDs, Digital Music, Audible, Handmade, Interests, some other oddities
  • SO MANY papers written: 1-pagers, 6-pagers, etc.

I’m excited about my next challenge to be named shortly, but am very much enjoying my needed break before jumping in – and finally having the chance to get back to beer blogging!

* This is something I can speak openly about – the relocation package and the team were great, even with our cats
** I was actually recruited initially as ‘Content Product Manager’ though that got superseded by more Amazon-specific internal titles over time – but I really liked that one, I admit

Beer, Tech & Institutional Memory

Tech and brewing have the same problem. No, it's not sexism or a general lack of diversity (though there's plenty of both, in both) - it's the near-total absence of historic and digital preservation. As an ex-archivist, I've long been irked by this, and it's striking how similar many of the barriers are in two (theoretically) quite divergent fields. Brian Alberts wrote a lovely piece calling on modern craft breweries to make a start at preservation, and he noted his (all-too-common) experience as a researcher: you know there are gaps, and sometimes you even know why, but you have to work with what you have. Let's delve a little further into why those gaps exist, and we'll review a bit about archival practice along the way.

First, let's imagine that a brewery (or fancy tech startup) has managed to consistently save some portion of its records - again, lots of credit to Brian for some suggestions on how to get started without investing a lot of time and money. What happens when those records get to the archivist?

For this fun thought experiment, we're presuming that 1) records, both physical and digital, exist, and 2) there's a professional archivist being paid to process and describe the collection. Processing begins with 'appraisal' - simply figuring out what's there and what should be kept; it (typically) has little to do with the monetary value - or lack thereof - of the records. No archivist wants to keep everything: it doesn't add anything to the historical record to keep 400 copies of invitations to an annual party, but it does add a huge burden to the administrative and storage costs associated with the collection. Once the collection has been appraised, the real processing begins. This is where things are arranged, described and re-housed: decisions will be made about keeping the original order or moving to something that may be more logical, paperclips and staples may (or may not) be removed, out will come the carefully-labeled, acid-free folders and boxes and Mylar sleeves for photos, and the collection will be described at whatever level (e.g. collection-level, folder-level, item-level) is desirable and/or possible. Many assume that 'everything' gets digitized at this point - and archivists will laugh and laugh at this, because it's enormously expensive and time-consuming. That doesn't mean there's a lack of interest or will to do the work - digitization projects are awesome - it's just rare that they get funded, even in organizations with relatively deep pockets.

You'll notice we're primarily talking about paper and photographic records here - digital records, larger objects and other formats require even more work - and again, we're assuming that 1) someone has kept it and 2) someone is being paid to organize and provide access and (additional degree of difficulty alert!) 3) there are short and long-term preservation plans in place. Long-term digital preservation is incredibly complex, expensive and difficult, so we won't get too much into it here, but even short-term efforts require considerable thought and effort; every archivist has a story about getting a box of old Palm Pilots or 8-inch floppy disks that contain largely unrecoverable data.

And there are other allied efforts that could be taking place - an oral history program, for example, would typically try to cover a range of experiences, interviewing staff in different roles, at different levels and with different backgrounds to capture their stories. All too often, we only hear the perspectives of the company's founder or the head brewer; we rarely value the stories of the junior coder, the middle manager or those on the packaging line.

Swinging back to tech for a moment, let's consider how deep those pockets are - and how none of that cash is going to preserve any aspect of most companies' histories. Of the larger companies, both tech and non-tech, that I've worked for (and I’m obviously excluding museums, libraries and archives here, since that is All They Do), I can count two that ever had any sort of formal archival program, and in both cases (HP and GSK) that support has waxed and waned over the years: some of HP's historic records were destroyed in the Santa Rosa fires last fall, and when I left GSK, the formal management of the archives was a part-time effort. Many, if not most, of today's most successful and influential tech companies have no formal or informal program to capture their histories; in fact, they often have policies that actively undermine that goal - here's how.

When it comes to what archivists call born-digital materials - emails, Word documents, PowerPoints and so on - companies often actively delete and destroy those materials. There are a number of reasons for this, but the two biggest are legal and financial: the legal department is happy if there's a lot of regular deletion, since that means there's less to turn over in discovery in the event of a lawsuit, and those writing the checks don't want to pay for ever-increasing, expensive digital storage (cloud or otherwise). As a consequence, many companies have official records management policies that ensure email is deleted every x days if not explicitly stored elsewhere (though this is different if you work in fields that operate under a lot of preservation orders, like pharma - some things you'll need to keep forever). And not infrequently, records management policies are drawn up by the legal department with no input from any other stakeholders; historic preservation, either short- or long-term, is rarely on anyone's mind.

And knowing how quickly things change in tech, it's incredibly difficult even for those of us who work for global tech companies to trace the path of a decision from a month ago, much less several years (or even decades) - it all becomes tribal knowledge and corporate lore. I could describe for you the cascade of poor decisions that the dot-com I worked for in the late 1990s made before finally failing in the dot-com crash, but it would be almost entirely folkloric; there is no trace of the emails, and I suspect I may have one of the few copies of one of the annual reports. Interestingly, a lot of my old code is still floating around on the Internet Archive, preserved purely by chance (and hey, you can still see some of our old job postings).

While it's a huge loss for future (or even modern) researchers, it's also a potential financial loss for companies. The use case most people readily understand is finding material for advertising or PR campaigns, but the records that document important business decisions can also be key evidence in trademark disputes and other less-pleasant aspects of the business world. For tech companies, this means a lot of intellectual property, whether code or copy, would be preserved; for breweries, there's the obvious ability to find old recipes and packaging artwork, as well as beer names and labels for the now-inevitable trademark spat.

So, with all that said, is anyone doing it right?

It's true that even larger breweries with long histories infrequently employ professional archivists - while Anheuser-Busch and Guinness have (or perhaps had, in the case of A-B?) formal programs, Guinness only got into the historic preservation game in 1998. Fuller's does have a wealth of material, a fraction of which it displays in a well-curated collection within the brewery, but the ongoing processing, arrangement and preservation strategies are overseen by one of the senior executives as something of a side project. If there is a model to follow, Carlsberg is one of the best: they have a team of professional archivists who manage the usual work you find in that sort of setting - arrangement, description, reference and digitization (where feasible). Carlsberg and Guinness both have the depth of history and wealth of materials that allow them to essentially run beer theme parks built on primary sources. A visit to the Guinness Storehouse is entirely unlike any brewery tour, but it does a fabulous job of highlighting historic advertising and key company documents, while the ongoing work of processing and preserving the collection goes on behind closed (but accessible by appointment) doors. The Carlsberg tour experience is certainly more like a brewery tour - indeed, their Jacobsen line is brewed in a small section of the historic brewery complex, while the modern brewery for their flagship lager is off-limits - but they do a very good job of blending the history with the modern experience. An entire portion of the historic physical plant is set up as a museum, and it does a wonderful job of pairing 19th and early 20th century advertising with company documents that tell the story of Carlsberg, the Jacobsen family, the growth of brewing science and the globalization of the industry.

In short, there are a few models for brewers to look to when thinking about how to begin saving and organizing to tell their stories in the long term; what about tech? In truth, the examples are just as few and far between, if not more so. HP (another of my former employers, although as a contractor I was, ironically, safe from the decimation happening all around me during Carly Fiorina’s reign of terror) and IBM have both made some efforts (and, if memory serves, both had full-time corporate archivists at one time, though I believe that is no longer the case for either), but those efforts seem rather piecemeal; indeed, the majority of HP's historic preservation now seems to be done by volunteers, not unlike the breweriana community. While fans and enthusiasts may do a great job of collecting and documenting certain things, they can't save the depth and quality of material that should be in a real archival facility.

And there are certainly ways to ensure records and artifacts are preserved without launching a brand-new department: companies can partner with their local museum or university archives to get going (this also means they should fund some permanent headcount and ensure there is physical space for their collections) - the expertise they need is already available locally. Cisco has taken this approach to good effect, and the Oregon Hops and Brewing Archive is a natural partnership opportunity for Pacific Northwest breweries, though again, some financial help could move them from a 'community archiving project' into a fully-funded program with more scope to collect, preserve and share brewing history.

And partnering with other organizations doesn’t mean giving up control over who sees what and when; there are plenty of archival collections out there with restrictions on what can be accessed or published, often with a specific timescale built in (e.g. ‘not for public access until 2050’ or similar requirements; you can put all sorts of complications in your deed of gift documentation if you so desire), so worries that a competitor may steal a recipe or other intellectual property can be relatively easily managed.

Although it's likely not feasible for every brewery to reach the world-class standard of the Walt Disney Archives, there is no real excuse for most deep-pocketed tech companies not to make at least a gesture in that direction (and here's a handy case study that can start the wheels turning). While corporations cannot control how future historians view them, by passively (or actively) limiting the available records, they limit the stories that can be told; by ceding the narrative to chance, they abdicate any opportunity to select which stories might be told.

Finally, a quick postscript: the white glove thing. In most archival facilities, white cotton gloves are only used for handling photographs; you are more likely to damage paper - especially brittle, highly-acidic, 19th and 20th century paper, which is typically in much worse condition than earlier, rag-based paper - with gloved hands than you are with clean hands. Your television lied to you.

This piece also appears on Medium

A Tech/Humanities Peanut Butter Cup

“Hey, you got your peanut butter in my chocolate!”
“You got your chocolate in my peanut butter!”

What does a vintage candy commercial have to do with tech employment? Plenty.

A recent Forbes article described the hot new trend of tech companies hiring newly-minted holders of liberal arts degrees. In my 20+ years of experience, this is neither new nor hot - I've worked on amazing dev teams full of people with multiple advanced degrees in the humanities who felt like earning an actual salary - but it's worth talking about. While it's absolutely true that there is a vast shortage of people with STEM skills in the US, and plenty of well-paying jobs sitting vacant at tech companies large and small, the notion that you 'need' a STEM degree to land one of these jobs is damaging, both to jobseekers and to companies. At the same time, there is an extremely tired idea that studying the arts or humanities is a waste of time because it doesn't 'prepare you for the workforce' - and that's simply untrue as well. Both sets of skills are necessary in the modern workplace - and getting beyond that initial entry-level engineering job may be easier for those with a liberal arts background, as we’ll discuss in a moment.

But the basic premise of the article maintains a strict tech/non-tech divide: it briefly profiled a new Slack employee with an arts background, but emphasized that she was useful precisely because she was non-technical:

She’s been at the company for barely a month but she’s already helped a construction company assimilate Slack’s software to keep track of things as varied as plaster shipments and building regulations via employee smartphones. Lee says she’s in awe of her technical colleagues who write Slack’s code. They, in turn, respect her because of her untechnical ability to “connect with end users and figure out what they want.”

And this is the point that is often misunderstood: you can absolutely succeed in a technical role with a humanities or liberal arts background, as long as you've also got the technical chops. And even if you are in a purely non-technical role, remaining 'in awe' of your technical colleagues isn't particularly helpful - you should have at least some understanding of what goes into their work, and know that it's hard work, not magic. Many new non-engineering graduates gained solid technical skills while they studied Proust or philosophy (which does, to be fair, get a mention in the article), but it’s not always a given.

On the flip side, moving into a management or leadership role with a purely technical background is a different sort of challenge. For those looking to brush up on their technical skills, there is a burgeoning industry of boot camps and self-directed learning. But if you're an engineer who needs to learn to write, present and influence decision-makers in a new role - even on a strict principal-engineer track - the path forward is rather murkier. Good code isn't enough to get you there. The more theoretical aspects of an engineering degree program (which in itself is not exactly 'vocational' education, though that's something that could be much more highly valued in tech) are fabulous for helping you develop ways to approach a technical problem; learning to lead a team and explain to your leadership why you've chosen a particular path forward isn't as straightforward.

The Forbes article also included this leftover from Stereotype Salad:

People with balanced strengths in social and math skills earn about 10% more than their counterparts who are strong in only one area. In fact, socially inept math whizzes fare no better than go-getters who struggle with numbers.

While I'd be more than happy to introduce you to some equally-introverted historians (they'd totally hate that, of course), there is a useful point buried here: a basic understanding of both technology and the liberal arts gives you adaptability; fluency in both can give you career superpowers. Understanding how to wrangle data is important. Being able to contextualize and tell a story with that data, to multiple audiences, is equally critical. And having the ability to pivot to an entirely new role, vertical or industry is more realistic if you simply have more tools in your toolbox; being able to switch back and forth between technical and non-technical positions as business or life conditions change gives you options you might not have otherwise.

This is not to say that specialization is a bad thing, or that all engineers lack writing and management skills - far from it. But developing expertise in one or more areas is what happens on the job, as you gain more experience, and technical degrees become 'stale' far more quickly than those in the humanities: a programming language you spent several months, or even perhaps a few years, learning as an undergraduate is most likely almost useless ten years down the line - if you're still in the field, you've learned new languages and skills through work. But the ability to research, synthesize and present arguments, whether those are about the Corn Laws or stylistic pottery variations at Mohenjo-Daro, is still valuable when differently employed. The subject may be far removed, but the skills around critical thinking, thoughtful skepticism and time management are vital.

And arts/humanities graduates have another leg up when it comes to tech job descriptions: 'comfort with ambiguity.' You'll see a similar phrase in nearly every job description from a tech company, in both tech and non-tech roles, and yes, it's an extremely useful quality to have in this (and many other) fields. Fortunately, if you've spent several years gathering data, writing research papers and debating complex issues that don't have a clearly-identified 'solution,' congratulations - you've got the right mental training for this career. I've seen some young engineers struggle with just this aspect of the field - you can't always engineer your way out of the problem (well, often you can build something, but it leaves significant technical debt that you - or someone else - will need to deal with eventually), and there may be multiple paths forward. Experience in consulting historical precedent goes a long way.

In my own tech career, I've never had to reproduce any of my shaky college algebra (turns out it wasn't even useful early on as a front-end and back-end web engineer), but I write research papers, give presentations and analyze strategies and processes; these are things I was quite well-prepared to do as both an undergraduate and a graduate student of archaeology - and that's especially true for the data analysis skills I learned there, though the technologies and techniques are now quite different.

So, where do we go from here? I was fortunate to be in the right place at the right time as a self-taught techie; early in the dot-com era, the skills were the important thing - it didn't matter where you'd acquired them. To a certain extent that's becoming true again: boot camps and coding challenges are offering other paths into the profession. But there's a fundamental disconnect in the way we approach teaching both technology and the humanities, at least at the high school and college levels (there seems to be a little more room to experiment in the elementary years, though that is largely driven by the STEM-only crowd). A newly-minted engineer, at either the undergraduate or graduate level, needs coursework and experience in writing. New English or art history grads may have had some exposure to technology through text mining or other digital humanities projects, but ensuring solid exposure to 'real' coding is just as important for them. Internships would also ideally include both coding and writing experience - and many more are starting to do just that.

We also need to do a better job as a profession helping people from purely technical backgrounds move into senior roles - a few hours of 'management' or 'business writing' training isn't especially impactful in most cases, and there aren't equivalent writing 'boot camps' to help hone those skills. Having a foundation as a matter of course, even if it wasn't the key focus of a degree program, would go a long way toward setting people up for success - testing out of English 101 isn't the same thing.

While many larger tech companies have figured out that an ever-broader population has tech skills as well as what we might term 'business' advantages, startups and smaller companies aren't always aware that they should cast a wider net in tech recruitment. Librarians have often been forced to become software development managers, just by the nature of modern work in the field. PhD historians often outpace new data science grads - many of those skills are part and parcel of modern academia; they just pay very poorly in that setting.

There is an artificial barrier between these two broad skillsets that needs to disappear; having a foundation in both is critical for success in tech, and in many other businesses.  Putting the two together brings out the best of both, just like the commercials said.

Eat up!

This post also appears on Medium.

Stonehenge, Shoes & Shared Workplace Experiences

I recently had the good fortune to geek out on corporate culture with the wonderful people of Zappos (full disclosure: we are 'cousins' within the Amazon ecosystem, though I include my usual 'Not Speaking for AWS' disclaimer here). While they had a full spectrum of fascinating, positive things about their culture to latch onto, what struck me most was the role that shared experiences played in shaping their unique approach to work - and how the thoughtful, intentional creation of shared workplace experiences is often overlooked as a tool to drive a positive corporate culture.

I am certainly not unique in having worked for a variety of companies, large and small, that miss the mark when it comes to helping you learn how to navigate and thrive in their specific cultures. Back in Silicon Valley during the dot-com boom and bust, I experienced both little startups - I was employee 18 (or so) at a dot-com, pre-crash/burn - and, subsequently, a few huge, global tech companies. While those organizations were very different from each other in almost every way, they shared a total lack of structure around onboarding. That's expected (though not really excusable) at a startup, but even at Big Tech Company No. 2, no one helped me figure out how to get paid until about 3 months in. There was no training, formal or informal, on in-house tools, norms or expectations. I don't think I saw a company mission statement or had a specific new-hire or role-based orientation program until about a decade into my career.

And then I have experienced the other side of that coin - training and process overkill. Another nameless company I worked for was insistent on transmitting everything to do with its goals, values, compliance, and culture via time-consuming, mandatory e-learning. While there is certainly a time and place for asynchronous training, especially when you have a global workforce, I argue that if you are looking to foster long-term business relationships and a strong, healthy company culture, e-learning and classroom training aren’t magic bullets. Live, shared experiences are the key, and that brings me back to Zappos.

Everyone who joins Zappos, regardless of role or level, joins a cohort of new hires who have four weeks of training - they learn the customer service role inside and out, they work the phones and speak directly to customers in the call center; no one gets to opt out to attend a 'more important' meeting. Their training is capped off by a real-life graduation ceremony, and many of the people I met, in a variety of roles, fondly recalled their training; it gave them a firm grounding in the company culture and expectations, and set them up for success at building relationships across departments and roles. I'm sure those relationships are a major factor in why there are so many long-term Zapponians - people whose tenure often exceeds a decade. From a tech perspective (including my own, which, again, is not unique - I've seldom stayed at any one company more than 2-3 years), that's astounding.

This is not to suggest that every company should go out and bolt a four-week immersion experience onto their hiring process; it's certainly not cheap, and for a globally-dispersed team, small or large, it's simply not always feasible or even desirable. But even fully-remote companies realize that technology alone can't create and develop culture; Automattic's approach of an annual meetup for the full company, plus smaller team get-togethers, creates regular opportunities for their employees to share experiences. Other companies have town halls or all-hands meetings that serve similar functions; the cyclical, almost ritual repetition of these kinds of meetings (and, not infrequently, the trip to the 'libation chamber' - er, bar - afterward) lets employees build organic relationships and memories - 'remember the all-hands where X spoke or Y performed?' That's important.

Shared experiences drive shared purpose. As humans, we seek out cyclical, seemingly ritual experiences - is an annual trip to Disneyland substantially different from a theoretical 'pilgrimage' to Avebury or Stonehenge undertaken by their builders (and, quite probably, their plus-ones)? We have good evidence that the 'users' of Stonehenge (to put it in vaguely techie terms) liked a good annual party; the motivations behind it may not have been terribly different from those behind a modern company picnic or offsite: do something different from your regular workday, with your colleagues (and possibly your family as well), then consume food and beverages. There would have been other commonalities with our era - everyone would recall the colleague who got horribly drunk one summer, or the time someone's dog tried to attack the fire-eater (you may recognize the voice of experience here). While the terms we use to talk about prehistoric gatherings tend toward the mystical or mysterious, that's largely a function of the paucity of evidence and/or our tendency to make something we don't immediately understand more meaningful; annual or seasonally-occurring events in the distant past may have been quite similar to ours - a working meeting with a party afterward.

In the workplace, we create rituals whether we mean to do so or not. A standing happy hour, a semi-organized run at lunch, a yearly offsite or even our more formal business mechanisms like annual reviews or daily standups drive our culture. How we create and evolve those experiences for employees says a lot about that culture - going back to Zappos, they ensure that everyone has the opportunity to attend their all hands meeting; it's such a priority that the call center is shut down for the occasion, as it is - briefly - for some other seasonal events. Creating an environment in which all employees have consistent, shared experiences builds personal connections and deeper engagement - provided those are good experiences. Yes, it’s hard to do globally, at scale, but it’s worth trying.

A few simple guidelines:

  • Be intentional. What do you want to create, and why? How will you evolve it?
  • Be consistent. Create a regular cadence and stick to it.
  • Be inclusive. If your site or event doesn't welcome everyone (and there may well be certain team- or role-specific events), what are you telling current and prospective employees?
  • Have fun. You may not see a direct ROI on every event, but if your employees want to be there for the long term, you're doing something right by giving them something to remember that isn't just their meeting schedule.

Finally, think long term. Everything you do is adding to your company’s history, whether that will eventually be long or short – what kind of story do you want your employees to tell their future grandchildren or robot overlords?

This post also appears on Medium.

Why Your Tech Company Needs an Archaeologist to Fix Your Corporate Culture

It's been difficult to miss stories of tech and startup culture fails of late, whether it's Uber or Thinx, and there have been many excellent suggestions on how to improve diversity and the employee experience, but I'll throw another one into the mix: hire an archaeologist*.

No, it's not a joke, though I fully admit it may be a head-scratcher at first, but hear me out: I've been working in technology for 20+ years, and while I'm emphatically not speaking about my current role at AWS, where I'm the Culture Lead (yes, we're secretive, but you knew that, and no, I’m not claiming we’ve ‘solved’ everything culture-wise), I can assure you that my two archaeology degrees have been incredibly useful in this field - though never more so than in my present position. Allow me to explain –

I fell into technology while working on my MA in archaeology at University College London in the 1990s; I began my tech career as a coder and moved (kinda/sorta) swiftly into people and technology management in Silicon Valley, NYC and elsewhere - I'm now happily situated in Seattle, where I get to do all sorts of Secret Things I Can't Tell You About Right Now. Along the way, I've seen some pretty bizarre things from a company culture perspective (terrible brand rallies! awful 'culture fit' excuses in hiring! team and product names that are totally offensive to colleagues in other regions!), but I've also been lucky enough to see the good as well. After a few general culture protips, we'll discuss how having an archaeological viewpoint can be a huge benefit - for real.

First, though, a few notes on What You Should Do: your company culture, like any other aspect of business, can't be left to good intentions - it needs structure and mechanisms to reinforce it and to help it evolve in a positive direction. Whether you are a tiny startup or a huge multinational, you need mechanisms that will scale with your organization's growth and can be consistently applied wherever your people are. You may need to modify them to work in some regions or for remote people or teams, but they should still be scalable and repeatable.

Your culture is modeled by your leadership, and that's at every level, from the C-suite to brand-new dev managers. While it seems that every company has 'values' or 'principles' that were drawn up early on, in my experience the uptake on these ranges from absolutely embedded and referenced on a daily basis to openly mocked and derided, with most places falling somewhere in between. When they work, they are a valuable tool and a core driver of your business - they dictate hiring, promotions and offer direction on key decisions. When they don't work, there's usually an obvious reason:

  1. They were developed by outside consultants to 'sound good'
  2. They are meaningless platitudes that simply take up time during the onboarding process
  3. They are actively terrible, and are used as an excuse to avoid diversity

I won't dig too deeply (see what I did there?) into the third point, simply because it needs to be its own discussion (as it is here), but I'll pivot to why they work when they work:

  1. They are thoughtfully, and intentionally, developed in-house, taking into account a wide range of viewpoints
  2. They are flexible and can be specifically applied to daily work, but aren't 'rules' that must be obeyed
  3. They are regularly reviewed and updated as the company grows
  4. They are an expected, and hence unremarkable, part of daily worklife

If your company's mechanisms for people management don't reflect whatever your company's stated values are - or if they overindex on a specific one or two points - you'll very quickly get drift away from the good intentions that went into their creation. Having repeatable, measurable processes around your business life cycle and the people who make it happen is the key to a healthy culture, and this is where the archaeologists come in.

The popular view of archaeologists falls into one of two main camps: we're either Indiana Jones or scruffy bearded people with a fondness for drink who wish they looked a bit more like Indiana Jones. I surely don't need to point out that both of those impressions skew almost entirely male (feel free to insert a Tolkien joke about dwarf wives and their beards), but there's a lot more going on than just drinking, digging and/or punching Nazis. While I won't get too deeply into describing different approaches to archaeology (for example, did you know that theoretical archaeologists mainly argue about French social theory, and rarely, if ever, go outside, much less dig? Did you know that post-processual archaeology is real? Mostly true facts!), there are some commonalities that give archaeologists an edge in mapping and shaping company culture.

Everyone 'knows' that archaeologists can take an artifact (or, more typically, an assemblage of artifacts) and use clues from that artifact to tell us more about the people who created it, traded it, used it or who perhaps just thought it looked cool. At work, we create 'artifacts' every day without thinking twice about it - documents, wikis, websites, apps, you name it. And when we're speaking about those internally-created artifacts that are used to hire and manage people - interview notes, performance reviews, presentations and so on - it's easy to forget that the mechanisms that generated those artifacts were designed with specific long- or short-term goals in mind. Indeed, there may have been considerable 'cultural drift' between a mechanism's original purpose and its current usage; for example, it may have once been the case that 'big ideas' went through a presentation-heavy gating process to get executive buy-in, but now it seems that absolutely every decision goes through some version of that. That's not to say that processes and mechanisms like that can't work, but that the rationale behind them needs to be understood, and that they need to be regularly reviewed to ensure they are still fit for purpose. Not infrequently, the employees who actually need to follow these processes have little-to-no information about why they were created, or what the unwritten rules are - it's purely tribal knowledge.

And that's another way archaeologists 'get' how to dig (har) into corporate culture: when they don't know why something was created or can't pin down an obvious purpose, there's a default answer - ritual! (In all seriousness, this is a thing. It’s practically reflexive). But so much of what happens day-to-day at work falls into this bucket as well; as mentioned, the people who designed (or inherited) a process have left, or have long since forgotten its origin, and it has become almost entirely ritualistic - we do it 'just because.' Sure, we'd like to fix that broken process or mechanism, but it's like that For A Reason, we assume - and thus are corporate sacred cows born. This is just as true looking at archaeological sites; while some pretty weird things do, indeed, fall under the 'ritual' heading (at least without further evidence), it's also clear that people in the past not infrequently did things just because they were fun or looked cool - they aren't so different from us.

Throwing an archaeologist at your company processes and mechanisms can turn up all sorts of unexpected things about your company's culture; simply having a complete audit of all the 'things' you're doing, how they came about, whom they affect, how and where they are implemented is quite illuminating. Turning an archaeological lens on this adds further value; as mentioned above, people rarely know precisely why they created something or how it evolved, so having a background in making educated guesses in that regard, based on data, is quite useful.

With this information in hand, you can begin to make better data-driven decisions that drive your company culture - did you discover a gap in your onboarding process in a specific region? Perhaps there is no policy to handle difficult employee situations, or you may simply have not had time to develop a codified, shared value system for your organization. Knowing where you have a potential problem and what resources you need to allocate is job one - you can thank an archaeologist when they help you unearth these clues.

Finally, a closing thought for the archaeologists out there: want to come work in tech? You have great skills in data analysis, project management, research and writing (to name just a few), and many of you have excellent coding skills - while we don't get to spend much time studying the past over here, we have the opportunity to help our organizations be thoughtful about how we build the future. Bonuses: excellent pay and benefits (actual excellent pay and benefits, not what most rescue digs or academia can afford), opportunities to work remotely and/or travel, and a work culture that still enjoys a drink or three - though that's certainly not a requirement. Beards are entirely optional.

*Other types of social scientists are also available, but I don’t know if they are as much fun.

This post also appears on Medium.

#GHC16, Avoiding Gatekeeping and Expanding Opportunities for Women in Tech

At GHC16
After years of following along on Twitter, not to mention 20 years simply existing as a woman in tech, I finally made it to my first Grace Hopper Celebration of Women in Computing in Houston (#GHC16 for you Twitter nerds) this year. And on the whole, it was a fabulous event — great keynote speakers, especially Dr. Latanya Sweeney of Harvard and Ginni Rometty of IBM, and so many opportunities to share experiences with other women in the field. It seemed that the vast majority of the attendees were computer science students looking for internships (and more power to them); they were poised, well-prepared and passionate about what we do — I wish I had been that clear about career paths when I was in my early 20s, and I was thrilled to chat with them — it was a splendid chance to offer advice and, of course, try to recruit them. Hiring is a lot harder now than it was in the 1990s, though more on that in a moment.

But I did notice a creeping undercurrent about who 'counts' as a woman in tech — not, I hasten to add, coming from any of the sessions I attended, merely snatches of conversation I overheard while walking the conference floor or lining up to get into a heavily-oversubscribed talk or two. 'She's just the recruiter' or 'I think she's in marketing, not a software engineer' or even 'she's not a CS major, she's just looking to find a job with a good salary.' And I admit that earlier in my career, I also had similar divisions in my mind — the women (and we only ever remarked upon the women, never the men — unconscious bias is a bitch) in marketing didn't 'get' what 'we' the developers did, they were a different breed. Never mind that back then, few of 'us' had actually studied computer science; we had fallen into the profession through various routes — perhaps coding on the side as a hobby, or taking an interesting tech elective, or even being 'drafted' into a long-open role by having the ability to fog a mirror. But we worked with code. We were techies. Different. Special. Highly in demand.

But having racked up a lot more work and life experience since then, I realize now that it's just as easy to be the person on the other side of the 'othering.' A decade-plus into my career, when a CS degree was becoming the more standard route into tech (and the number of women I worked with dropped off quickly around that point), not having one suddenly became a bit suspect. Was I still a 'real' techie when I became ever-further-removed from hands-on coding? Sometimes my matrixed reports didn't think so — and were on occasion surprised to know that I understood what they were talking about and could call them out on sloppy development work. Were my project managers still techies? Maybe. What about tech writers, editors and designers? Sometimes — especially if they were men.

The current mania for 'STEM education' at the expense of the arts and humanities, especially at the undergraduate level, makes the tech/non-tech division seem natural and 'correct' — when, in fact, you cannot build good tech products and programs without a diverse mix of skills and backgrounds. Yes, we need more women (and people of many other underrepresented backgrounds) in technology, but we cannot let an undergrad CS degree and a great internship become the only path in, nor should we let people become so focused on writing great code that they cannot develop in other ways. I want to meet great engineers who can also write well, give a kick-ass presentation and become go-to mentors for others — and those so-called 'soft skills' are just as vital, and need nurturing from the start. Outside interests are just as important; you can be passionate about what you do without it being the only thing you do.

I digress to make the point that we're all in this together; whether you are a woman working in HR at a tech company or a female software engineer just getting started at a non-profit, you're both women in tech. Even if your current team has an ideal gender balance (and I've been on quite a few), it's unlikely you'll always be that lucky in your future career; being able to advocate for each other, instead of only those who are Just Like Us (and Just Like Us doesn't have to be based on gender or background — it applies just as much when we define ourselves, in whole or in part, by our roles at work) is hugely important. There are no Fake Tech Women any more than there are Fake Geek Girls. Women who want to transition into a tech career from another field, perhaps with decades of non-technical experience under their belts, should not feel unwelcome. Given how incredibly difficult it is to hire people with the right skills, we need to stop gatekeeping, even when it's unintentional, and help build other solid paths in. Coding boot camps, especially those with industry support that include internships for so-called non-traditional candidates, are a good start, but coding is just one important element of a successful tech career. Code should not be the sole defining feature of what a tech career looks like, any more than being a white dude under 30 is what a tech worker 'looks like.' We need to focus on our commonalities and drive positive change; creating artificial barriers is no help to anyone, not even the bottom line.

And that leads me to my next topic — where are the senior women in tech? The metrics presented at #GHC16 showed an uptick in early career tech women, but still what looks like a sheer cliff in mid-career and senior executive positions. The guidance offered was that formal leadership development programs are the key, and it certainly sounds like a useful path forward; I've been fortunate enough to participate in some useful coaching programs in previous roles, but they tended to focus on developing capabilities for individual projects or programs, rather than looking at how to move to the next level — that just 'happened' along the way. And I am very much aware of the fact that most of the other women I worked with in my early career are gone — they've left the field entirely.

But I took great inspiration from walking the #GHC16 conference floor and watching companies work hard to impress potential interns, entry-level and early career folk — imagine if we had the same opportunities as Old People to be, as Lerner and Loewe once wrote, 'worshiped and competed for' at conferences that focused on sharing roles at those levels. Yes, we get random calls from recruiters, but it's not the same as having the opportunity to see a fuller picture of what's out there and what we might work toward, nor does that offer the same chance to do in-person networking and story-telling. Luckily, there were some of 'us' there, and while we may not have been explicitly catered to by the hiring companies — not really an issue since most of us were there to hire for our own teams — it was nice to have some representation. Your tech career doesn't have to end when you switch careers at 35 or take some time out to travel or have a family, and it's important to see people who are visible reminders of that, just as it's important to see real-life examples of women of color in tech, transwomen in tech, disabled women in tech and so forth.

I've written before about how the media tends to portray 'successful' women in tech as those who made the C-suite before 40 (or 30, or 25, or hey, why not 12?), or as young company founders blazing new trails. But a mature field allows for a wide variety of career paths, and incremental success is just as valid as headline-friendly overnight success. Sure, I'd like to have retired wealthy by 40 and had the opportunity to become a world-traveling philanthropist, funding rare book libraries and specialist archives all along the way, but I do really love my current position — I'm still moving onward and upward in my career (which affords me a ludicrous level of freedom and privilege compared to most), and I have the opportunity to mentor others. Whether that means we need to have more conferences aimed specifically at mid- and senior-career women in tech I do not know, but I do know that representation matters, and there was a lot of it at #GHC16. Hopefully there is more to come.

My other takeaway was that people will stand in line for a very long time for a freshly screen-printed t-shirt, but I have yet to wrap my head around that one — though that said, it created an ideal bottleneck for career conversations, so all in all, a win. 🙂

Now, if I can just find (or kick off) one of those formal leadership development programs, I'll be set for my next act

This post also appears on Medium.

DAMNY 2016: All the Thoughts

damny
Another DAMNY is in the books, and once again, there were more thought-provoking sessions than one could attend without bilocating, but to my mind, that is the sign of a healthy conference agenda and a maturing field. While there were still discussions on choosing the right DAM and making the vendor-vs-roll-your-own decision – and very important and useful those are for those new to the field – it was encouraging to see more panels looking to the future - indeed, some were beginning to address the gaps I see in the DAM world. I continually wonder when DAM, content strategy and knowledge management will all coalesce (or, barring that, make their boundaries clear in solutions that play nicely together), and this year's conference confirmed that I'm not the only person asking those questions.

Sometimes a DAM is implemented without giving much thought to the foundational content strategy: in these cases, simply 'getting a DAM' is expected to solve any and all problems related to the digital supply chain, content marketing, audio and video encoding, web content management, rights management, digital preservation and content delivery, all in one fell swoop. A tool built to manage what we might now call 'traditional' digital assets - images, audio and video - may be tasked with being the single source of truth for copy and translations, contracts and filesharing; in short, handling and delivering structured and unstructured data of all stripes to varying degrees of success.

And perhaps that is indeed where we are going, albeit more thoughtfully - if the DAM is truly to be the core of the digital ecosystem, the end users may not need to know what it can and cannot do under the hood, as long as ancillary systems are seamlessly doing what the user needs, thanks to some deftly-designed data models, well-described asset relationships and friendly APIs. But without DAM leaders, both those at DAM vendors and expert DAM managers, developing these use cases and solutions for them, and demanding some firm industry standards, it will take some time to get to that ideal state. A case in point that came up in several sessions was that of the explosion in video resolution and formats - while that (exciting) problem will not apply to every organization, the approach to potential solutions will most likely affect the direction DAM vendors begin to move.

Similarly, the opportunities presented by linked data and well-described semantic relationships must be embraced; the digital humanities field was quite rightly called out for being at the forefront of this wave, having been surfing it long before business or even most technology companies thought to dip a toe in the water (just take a look at any THATCamp writeup). Indeed, it's another example of how librarians have been key to the development of DAM over the past decade; not only can they (we) whip up a snazzy taxonomy and run your DAM better than anyone else, but they (we) can be amazing futurists - defining a roadmap for a product before the vendor thought to do so, or simply building a homegrown solution.

And that brings me to a slight worry; I noted (though I was far from the only one to do so) that a few of the technology-specific panels fit the dreaded all-male panel stereotype. This has not been my general experience at previous DAMNYs, and I did see that at least one of them had not been designed that way, but DAM managers and end users - frequently librarians and, nowadays, marketers – and DAM product managers and developers sometimes give the appearance of dividing along gender lines. I've previously raised the concern about how this could affect salaries (tl;dr - as a technical profession, or any other sort, becomes more 'feminized,' salaries shrink), but I would hope that as a small, though growing, profession, we can all be mindful of that pitfall and work together to avoid a needless binary, where (at least superficially) men develop the software and serve in senior executive roles, but women do the day-to-day work. I will certainly grant that as a woman with 20 years of experience in technology, my Spidey sense is more sharply attuned to look for this than it might be otherwise, but here's how you can all make me feel better - take this year's DAM Foundation Salary Survey and let the data speak.

But there is another way a rising tide can lift all ships in this field - we can be more proactive about creating mentoring opportunities, both for those looking to get into the field, as well as for those looking to get to that next career step. The DAM Guru program does an excellent job of matching people with those looking for advice on a particular solution or for those who are just starting out, but we have no formal mechanism as DAM practitioners to take that next step for mid- and senior-level folk. As someone who has been 'doing this' a long time, and in different types of companies, I'd be more than happy to mentor those coming up, but I'd equally love to spend some time with some of those very senior executives who are driving the shape of DAMs to come. To borrow a phrase, I want to be in the room where it happens, and I'd like to help other people who want to get there find their own paths.

My biggest takeaway from this year's DAMNY is that we're at an exciting point in DAM's maturity, and for those of us who are lucky enough to have found our way into this field, often by fairly circuitous routes, it's always nice to re-convene to be among 'our people' - but let's take lessons learned from other tech specialties and ensure that the DAM community's diversity continues to grow, rather than contract. As we develop systems with ever-broader capabilities, the field as a whole can only benefit from a wide range of backgrounds and experience - let's aim to keep adding new lifeblood.

I should probably propose a DAM career development workshop for next year

A Many Years Ago, When I Was Young and Charming…

Way back when...
Time Out

Twenty years ago this month, I landed my first tech job, quite by chance - and fell headfirst into a career I neither planned for nor expected, yet here I am, two decades later, enjoying my standing desk in a gleaming tower. The setting for this serendipitous accident was London, and London in January of 1996 was an exciting place to be. Britpop was in full force (even if many of the bands lumped into that category did not embrace the tag, often quite rightly), amazing comedy was all over television and live clubs, and the theatre was in fantastic shape, from the RSC to tiny pub venues. Keeping track of the wealth of culture on offer was the purview of Time Out, and even as a relatively poor grad student, especially one who was thrilled to discover that student discounts on theatre tickets ran much deeper in the UK than in the US, I happily paid for a copy of the magazine each week to plan my leisure time - more on that in a moment.

Of course, I should not have had such extensive free time; I was busy studying for my MA at the Institute of Archaeology, with plans to go on for a PhD, and then to become a clubby and chummy academic in the JRR Tolkien or MR James mold - obviously, I fell at the first hurdle by never learning to use initials, rather than my first name, or possibly by having two X chromosomes and not being born in the 19th century. Instead, I seemed to find ample opportunity to hang out at the British Museum (that totally counted as work, right?), see bands like David Devant and his Spirit Wife, catch Iain Glen and Judi Dench onstage, hit regular comedy nights and, just for fun, I learned to build websites.

My coding hobby began as a way to organize websites I liked for easy access - enormous shared desktop computers in a lab did not make bookmarking useful, but having my own hotlist (hotlists were a thing) gave me some portability and, oddly, kudos among my less-technical peers. Even in that now-distant era before web comments became an archive of discontent, I soon realized that my free webspace let me share my interests - and gave me a platform to complain about things. I believe the Spice Girls came in for a good deal of online umbrage from me in those early, pre-irony days, but as a cool indiekid, my online persona had to take against them. But I later turned this opportunity in a more positive direction by building sites for bands I liked - official versions were still some way in the future. There was also the instant gratification element missing from academic research - if I wanted to spin up a new webpage, it only took a few minutes to knock together some code, find an appropriately-’90s background image, and play around with fonts. A brief aside - I once had a turquoise and neon yellow tiled background that perfectly matched a cheap shirtdress I bought at C&A, or possibly Topshop - it is possible that I was cosplaying my own website before anyone discovered something so ridiculously meta was possible.

Then I realized you could get paid to do this.

One day while poking around on Time Out's website - one of a very few covering London in any meaningful way at that point - I saw an ad for a web assistant. If memory serves (and it may not be as accurate as I believe it to be), it sounded slightly mournful - the site was getting bigger, but no one else had the requisite HTML skills to keep it updated. Could someone please apply and perhaps they would train them to do the work? 'But I can do that right now,' I thought - and I duly emailed off a copy of my resume and links to the pages I had built. I got a speedy reply and an invitation for an interview - the notion of attaching a resume as well as links to previous 'work' seemed to have been rather more than any other candidates had managed. Within a few days, I presented myself at Universal House, just a short walk down Tottenham Court Road from my UCL stomping grounds, and was hired immediately.

I discovered that in addition to the princely sum of £75/day (yes, really), I'd also be receiving a free copy of Time Out each week - two if I wanted them! Never having had a real job before, such an unexpected perk was especially welcome - my days of getting terrible free corporate art, snacks, software release t-shirts and on-site massages were still some way in the future. I'd get to hear about upcoming gigs in advance as I dropped them onto the website, and if something was missing, I could add a plug for a band I liked, as long as it matched the writing style of the rest of the site. I learned about an exciting new comedy group called the League of Gentlemen, who had yet to make their way to television. I got press kits from bands like My Life Story, and invitations to alcohol-soaked book launches. I discovered that there was a free drinks trolley that went around the office on certain afternoons. In short, there was not a better job for an overeducated 20-year-old with no real responsibilities.

But it wasn’t all just fun and games – I also got the chance to build on my skills. When my boss (the only full-time employee on the website for a very long time indeed) went out of town, I got to field all the questions about what we did, and generally run the show; when I came back after a week away, I was excited to learn that he'd tweaked the site to improve the layout with 'a new thing - tables in HTML.' With our nested tables (frames came later) and many, many carefully-sliced gifs, we could almost, but not entirely, get rid of imagemaps for the 'graphics-heavy' version of the site that was offered to people with faster dial-up connections. A second brief aside here: while I never liked the sound of a connecting modem, I do miss the Eudora 'new email' tone, which was an exciting thing to hear at the time. The office sounds fundamentally different today.
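For readers who never had the pleasure, the transition from imagemaps to sliced-gif tables looked roughly like this - a sketch from memory in period HTML, with the filenames, link targets and coordinates invented for illustration (and emphatically not how anyone should build a page today):

```html
<!-- Before: one big banner gif with a client-side imagemap defining
     the clickable regions (coordinates here are made up) -->
<img src="banner.gif" usemap="#nav" border="0">
<map name="nav">
  <area shape="rect" coords="0,0,100,40" href="music.html">
  <area shape="rect" coords="100,0,200,40" href="theatre.html">
</map>

<!-- After: the same banner carefully sliced into separate gifs, each
     dropped into a cell of a borderless table so every slice can be
     its own plain link - no imagemap required -->
<table border="0" cellspacing="0" cellpadding="0">
  <tr>
    <td><a href="music.html"><img src="banner-left.gif" border="0"></a></td>
    <td><a href="theatre.html"><img src="banner-right.gif" border="0"></a></td>
  </tr>
</table>
```

The zeroed-out border, cellspacing and cellpadding attributes were what let the slices butt up against each other seamlessly - get one wrong (or nest the tables one level too deep) and the whole layout fell apart, which kept us all gainfully employed.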

In many ways, that first job set the template for my career; if I wanted to try something novel on the site - Javascript, ASP or another 'new' technology - I was encouraged to experiment. If it worked, great, and if not, well, it was worth giving it a go, and it was never bad to add another technical string to one's bow on company time; continuous learning was considered standard practice. I could dress as I liked, and my usual t-shirt-jeans-and-Doc Martens wardrobe was utterly unremarkable. Another plus: occasional-to-frequent free booze. That structure has served me well in the diverse directions my career has taken me since then - to Silicon Valley before the dot-com crash, where I worked at (an experience not unlike a triple-decker novel in many ways), Juniper Networks and Hewlett-Packard, to New York as a techie-in-non-tech companies (and ditto in Philadelphia), and back to the west coast, where I'm now an Amazonian in Seattle.

In those twenty years, I've only ever had to 'dress up' for work for the non-techie organizations (interestingly, it's also only outside of tech-specific companies that I've experienced any overt sexism, though that's another story) - it was delightful to donate all my 'grownup' work clothes when we moved back to the left coast, where I can wear my nerdy t-shirts, hoodies and DMs to work again without a second glance. Also back: occasional free booze, though as the tired parent of a tween and a toddler, I'm rarely out late - I need my sleep, so the 'occasional' aspect is really by choice these days.

If I have any work wisdom to impart as a 'veteran' tech nerd lady, it's this: hire smart people, with diverse backgrounds and skillsets, and let them get on with solving tricky problems as a team in their own way - but set high expectations. Keep learning about new technology, languages and tools, even if you accept you can't be an expert in everything; it's especially important if your career evolution has taken you out of day-to-day development and into a leadership position. Volunteer for things – the non-profit world desperately needs your skills and experience, and you never know when your passionate hobby project may become your full-time concern. But most importantly, ensure that the ladders you used to find your way still exist - or build new ones if they do not. There is no single path into the tech world, but people from 'outside' are not always aware how transferable, and ultimately useful, their experiences might be for a technical team. A little coding knowledge on top of solid writing, communication and management skills can go a very long way, especially if you give someone the time and space to learn by doing. Beer helps, too.

And if there is a larger moral to my narrative, it is that procrastination can pay off in ways you never expected - just call it 'learning' and it becomes a virtue, rather than a vice!

This post also appears on Medium.

Excavating (My Own) Websites Past

I fondly recall my very first URL - it wasn't a GeoCities site, though that would follow along in due course - but just the few KB (indeed) of web space every postgrad student was allotted by the Institute of Archaeology, University College London. Unfortunately, there's no trace of the content now, though the URL lives on as a 'not found' snapshot in the Wayback Machine. It's a shame, because while I don't recall falling prey to blink tags or other early web missteps, it did have a very vivid teal-and-yellow tiled background that coincidentally matched a dress I'd bought at Topshop (more on them below), and I wouldn't mind seeing either one again. So, while my first foray into web development doesn't exist anymore (a bit ironic, given that archaeologists love preservation, digital and otherwise), at least I still remember it.

But thanks to the Internet Archive's drive to save GeoCities - and, of course, a vast galaxy of sites beyond - some of my early work, both professional and otherwise, does live on; so many websites captured from the Time Before CMS and DAM. After running out of space on my UCL account, I set up shop on GeoCities with a 'hotlist' related to my MA dissertation - those were a big deal circa '95-'96, since search engines weren't especially powerful, and even the site that would become Yahoo, Jerry and David’s Guide to the World Wide Web, was human-curated back then. I also built a GeoCities site for one of my favorite bands, David Devant and his Spirit Wife, and employed what seemed like a pretty cutting-edge Java applet, though alas, the applet hasn't survived the freezing process. And I nearly forgot until the recent 20th anniversary that I used to help out on The Craggy Island Examiner, a Father Ted fansite. The site was powered by basic HTML, visible tables and not a few pints at a pub near Waterloo where we held 'editorial meetings,' and once a mini-Tedcon, circa 1996. But that bit of volunteer work did help lead to my first actual web job, at Time Out in January of 1996.

Time Out's website, circa 1996

The site was a one-man operation when I started, so it was perhaps noteworthy that the web team immediately reached gender parity when I joined (though we did get occasional help from another gentleman, a former member of Hawkwind, later on). I believe one reason I got the job was simply that I emailed my resume and links to my 'experience' in response to the job posting; it was mentioned in the interview that no one else had taken that radical step. Time Out was a fantastic place to work in the mid-1990s - I got a free copy of the magazine each week, invitations to book launch parties, occasional press passes and the inside scoop on some of my favorite bands. All I had to do was update the site each week - all the global sites (such as they were then, imagemaps and all) were run from London. And when I saw it again, I actually remembered dropping in that note about Budapest. Midway through my tenure at Time Out, we brought in a more structured layout with 'complex' tables - though still no sign of a CMS or anything approaching one.

I moved on to work for an agency that built sites for clients like Christie's, Condé Nast and the Evans Group (retail clothing shops like Dorothy Perkins, Evans, Topshop and so on), where heavily-sliced images, complex tables and frames - and getting them to line up in competing browsers - became the bane of my existence. But I do fondly recall the spinning 'D' on the Debenhams site; that was also quite exciting back then. And this particular Dorothy Perkins page was a nightmare to build - so I'm glad it still exists. Unfortunately the early Topshop pages seem to be long gone, though it was fun working on something for which you were the target audience.

Dorothy Perkins website
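For readers who never had to fight them, a 'sliced' layout of that era looked something like this minimal sketch (the file names and dimensions here are invented for illustration); a single stray space or line break between `<img>` tags was enough to break the illusion in one browser or another:

```html
<!-- A sliced banner reassembled in a zero-gap table: border, spacing
     and padding all had to be zeroed out, or the image slices would
     drift apart differently in Netscape and Internet Explorer -->
<table border="0" cellspacing="0" cellpadding="0" width="400">
  <tr>
    <td><img src="banner_top.gif" width="400" height="60" border="0"></td>
  </tr>
  <tr>
    <!-- Two slices side by side: no whitespace allowed between the
         tags, or the browser renders a visible gap -->
    <td><img src="banner_left.gif" width="120" height="90" border="0"><img
      src="banner_right.gif" width="280" height="90" border="0"></td>
  </tr>
</table>
```

Multiply that by dozens of slices per page, then by every browser version you had to support, and the 'bane of my existence' comment above starts to make sense.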

But the real mother lode (as it were) of my early web work comes from the Internet Archive's snapshots of my career at Women.com in Silicon Valley. As the web nerd in charge of the homepage - both for Women.com itself and for many of its affiliated sites, like WomensWire, Prevention and more - there's a great deal more preserved. I moved back to the US in late 1998 (when the site looked like this), and, having turned down a wildly underpaid job at Yahoo (yes, there were stock options, but you couldn't have paid rent in the meantime), I commenced work at Women.com. It was an exciting time to be there, and at first, there was a lot of 'smart content' aimed at women - not in the modern sense of 'smart content,' of course, but there was a lot of information on careers, finances and health. It wasn't quite Bust Magazine territory, but it wasn't as far off as it would be later. I was tasked with building the redesigned site in 1999 - now everything was yellow - but what's most interesting to see is what remains of the content: features like the Bloomberg/Women.com 30 Index, tracking the success of woman-led companies on Wall Street; the 'first ever online presidential primary for women' (spoiler alert: Al Gore won); and the Men of Silicon Valley ('high-tech's hottest bachelors!'). So yes, that was a Thing That Happened.

Women.com, early 1999

Men of Silicon Valley

There's much more to dig up and record where that came from; I was at Women.com until 2001 when, with the writing on the wall for pure content sites, I moved on to Juniper Networks, where 'no layoffs' were promised. When that turned out not to be true, I went to Hewlett-Packard, where Carly Fiorina was on what seemed to be a mission to destroy the entire company, largely from the recording studio next to my desk - but that's a story for another time.

This post also appears on Medium.

Will the 'Librarification' of DAM Demographics Affect Salaries?

DAM Wonka

This year's always-fascinating and very valuable DAM Foundation Salary Survey came out in February, and there were some interesting - though also possibly worrying - trends to analyze. First, though, the positives: DAM jobs are becoming ever more global as companies begin to realize the value of their digital assets (or, perhaps more accurately, as they discover that disorganized or missing digital assets are a huge money pit). This is an encouraging trend, and one I hope continues to grow. And the influx of those with MS-LIS and other library degrees suggests that the value of accurate metadata is being recognized - though I'll explore a concern that raises in a moment.

Mapping job titles to skillsets and salaries was noted as a continued area of confusion, and one I have certainly seen borne out in my own career and among my peers; while that's to be expected in a still-somewhat-nascent profession, it can be a source of frustration, not only for the postholder but for potential recruiters and managers. It may seem a minor point, but given the volume of confused recruiter calls I receive, I think it's worth digging into for a moment, given this background from the survey analysis:

“Those with the term “Director” in their title tended to make the highest salaries, and those with the term “Archivist” or “Archives” tended to have lower incomes. There were no other clear correlations between title and salary. One listing that included the word “Supervisor” in the title made as much as other “Director”s; many with the title “Specialist” showed no appreciable difference than those listed as a “Manager”. This suggests that when reviewing the resumes of experienced DAM workers, an analysis of their actual daily duties, tasks, and projects may be more of an indicator of skill level than job title.”

Indeed, I've had to explain on numerous occasions that my current title, Content Librarian, isn't 'just' a content management role; I'm fairly senior in the hierarchy, and my tasks include crafting policies, setting standards and analyzing IT solutions. Likening it to a position like 'the' University Librarian, rather than 'a' librarian who happens to work for a university, only makes sense to those coming from academia. When speaking with those from a straight-IT background, I explain that it's a bit like a product or program management role with a lot of taxonomy bolted on, though any DAM professional knows that's still only a portion of 'what we do.' Having worked in traditional library and archival settings as well as in IT-focused environments brings me to my chief concern: will having more (very useful) library skills drive down DAM salaries over time, simply through assumptions employers make about title and background?

I've experienced the disparity between IT and library-land salaries first-hand. I began my career in IT, building websites and managing content back when it had to be done by hand, before DAM and CMS solutions existed. Even as software to help corral and catalog content and digital assets came into being, my salary working with those tools remained quite comfortable. Then I went back to library school, with a view toward using my IT background, augmented by my new taxonomy and knowledge management skills, in the heritage/academic sector - libraries, museums and archives. Despite having additional skills and experience, moving into that world reduced my pay by more than 50%; at the time, it was a manageable reduction, and I had a fantastic work environment and great colleagues, but it wasn't sustainable in the long term. I returned to IT, and immediately more than doubled my salary - using the same skills, but with a different job title and cost center. While part of that jump was down to non-profit vs. corporate budgets, even in the for-profit world, I know other DAM 'librarians' and 'archivists' who have found that a change in job title made a vast difference to them in terms of salary. It's anecdotal, to be sure, but it seems that those whose titles are more 'techie,' and less 'librarian-y,' often have higher incomes, albeit for the same sort of work - and good luck figuring out who is more junior or senior, if job title is your guide! Clearly, we have some work to do.

As more librarians - and more women - come into the DAM field, there is a danger that salaries may become depressed; we already know that the youngest cohort in the survey results has lower salaries and is overwhelmingly female, even though its members hold more library degrees. Having said that, the survey quite rightly notes that their youth and relative lack of experience are likely the key drivers behind their lower pay. But historically, the 'feminization' of a profession (think teaching, or, going back much further, textile production) has never had a positive impact on salaries; quite the reverse. It would be nice to think that we can ignore historical precedent and that we've moved beyond that - and I've written elsewhere about what it's like to be a mid-career woman in technology facing those issues - but given the existing salary gender gap in DAM, it's something we should continue to be vigilant about. Let's make sure that gap is truly a historical blip, and that it doesn't become wider.

I am a firm believer in the value of a library background in the DAM world - combined with solid IT and management skills, it's an ideal, broad-based skillset for an evolving field. And I completely understand someone coming from years in 'traditional' library settings jumping at the first salary offered in a DAM role; given the lack of funding in academia and public libraries, it's (sadly) likely to be a big bump, regardless of how 'low' it might be for an IT or marketing position. But it's been well-documented that failing to negotiate salary leads to lifelong repercussions, and as we see more highly-skilled, and likely previously-underpaid, people coming into DAM roles, we should continue to share salary surveys and job title information as we build toward a better-understood profession. Likewise, as hiring managers, we should do our best to keep salaries fair, and to help our recruiters and HR departments understand that a great DAM professional might not be obvious from their last job title or training.

My longer-term hope is that by highlighting the value of librarianship in digital asset management, we can help enhance information work all around, making the wider world realize that it's a useful route into a technical profession - one that deserves to be better known, better appreciated, and paid on a par with other IT jobs. An MBA may be one ticket to a 'good' salary in DAM, but we need to demonstrate that it isn't the only one, and that men and women have an equal shot at long-term advancement in the field.

Consider this a call to action to make an impact before the next salary survey!