Moving their cheese: a case study – WPCampus 2018 – WordPress in Higher Education

– [Elaine] Hey, everybody, thank you for joining me here today. I’m really glad to be back at WPCampus. Today, I’m here to tell you the tale of two major website overhauls, one that moved St. Mary’s
University onto WordPress, and another in which we rebuilt WordPress into something more sustainable for us. So, along the way, you’re gonna learn about some of the pitfalls we faced, some of the affordable
tools we came across that you can use to
check your own website. You’ll also learn how
to smooth things over with those masses, so they
don’t all come to you yelling, you moved my cheese,
as soon as you move it. So, just to get a quick
feel for people in the room, how many of you would
say you’re developers? Good smattering. How about designers? And it’s totally fine if you
raise your hand multiple times, ’cause I know some of
us wear so many hats. Managers? Marketers? We’ve got everybody here today. And do we have anybody who is kind of an all around webmaster? All right, I’m one of y’all, ’cause I do a little bit
of everything myself. I’ve tried to wrap up a
little bit of all these different disciplines into this today. So, you can all take away something that inspires you to take action. I do want to note that all the quotes that you’ll see in the slide deck are from Spencer Johnson’s famous
book, Who Moved My Cheese? I really find it interesting,
personally, how people can think that moving the digital
documents is a bad thing, because to me, unlike a printed book, where the pages are static, a website is really a
living breathing collection of information, where things
are constantly updated. I know that many, many of
you have been in situations where those changes are not
necessarily so well received. So, some of you who have been in higher ed for a while have probably seen this diagram before. It’s a Venn diagram with things
on a university homepage, things the visitors are looking for, and that little narrow spot in the middle where they overlap. Some of the things listed
as being on a homepage are a campus photo slideshow, press releases, and a letter from the president. From the circle of things people go to the site for, we have the campus address, application forms, parking information. And in that middle, where
both circles overlap, the only thing that’s
shown in both categories is the full name of the school. Internal stakeholders
and your website visitors often have very different ideas about what should be on a
website and how it should work. And of course, it’s not just the homepage, it’s every section of your website has a different group of stakeholders with different ideas
and probably different levels of technological expertise. Hopefully you’re all here because you want your institution’s website to focus on that sweet spot in the middle where you’re meeting
your institution’s goals and your visitors’ goals too. I’m here to tell you it is possible. You’ll probably never have a perfect site, I’m not sure that one even exists but you can cut through
a lot of the clutter and give everyone a
better user experience. And you can gather the
data you need to prove that you’re improving the experience and win people over to your side. One of the biggest keys
is to make progressive enhancements so instead
of these major redesigns every five years you make
smaller changes every few months. Another key is to test
and measure your changes so you can really say without a doubt that it’s the changes that led to success and not some vague
external factor that nobody can really identify. So from the book, the quicker
you let go of old cheese the sooner you find new cheese. In 2011 St. Mary’s had what I like to call a proprietary quasi
CMS which was basically just a PHP file that
interpreted query strings to figure out what other PHP files to include and maybe a
little bit from a database. It sorta got the job
done, but then one day the web developer up and left. So it became clear to even
the nontechnical folks that the site needed to move
to a more standard platform. So based on the fact
that the previous site had been made in PHP, it was neck and neck between Drupal and
WordPress, and in the end, it was WordPress that won out. This was a really, really
painful transition. Since the old system had no concept of hierarchy whatsoever,
just endless query strings, the tools for the first overhaul were really simple: paper, red pens, and
weeks and months of time. So my office literally printed out a list of every URL on the site and divided that list up into stacks. Each person went through their stack, often having to go to the website and type in the address to
and divided that list up into stacks. Each person went through their stack, often having to go to the website and type in the address to
see what was on that page. And then they decided what was current what would stay and what would go. When they were ready to load
content onto the new site, there was no such thing as exporting. They manually copied and pasted
from the existing website. And they had to do double
work making changes where they could on the old site and the new one until it was ready. There was a vendor that set up WordPress on a fresh, shiny new server and
they developed a custom theme and installed plugins
while the St. Mary’s team was loading content so
during this, there were fits and starts of the site crashing in the middle of content loading. But eventually the day came
and the new website launched. Now this was before my
time, but I have heard a couple first hand
accounts of the fallout and it really wasn’t pretty. I’m hoping that some of
you have campus communities that embrace change but in our case, the faculty and staff at that time, really did not embrace this change. As our next quote from the book says, the more important your cheese is to you, the more you want to hold onto it. Now because communications
had done the hard work of trimming the fat, they didn’t let everyone on campus back in as content editors. See, this wasn’t just a
migration to a new platform, it was also the beginning of a workflow, so that we could keep the site more current and the branding and writing styles could be more consistent. So as you can imagine,
when these stakeholders whose cheese had just been moved, realized they couldn’t move it back again, they didn’t respond with thank you notes and congratulations. Of course, you can
debate the pros and cons of having more content editors. On the one hand, allowing people to make their own updates
makes it easier for them. So if you have engaged
faculty and staff members they can bring you fresher content. But on the other hand,
allowing people direct access to make all their own
updates without moderation, makes for a lot of content that
comes from different voices. And if you have editors
who try to tightly control formatting by adding
lots of inline styles, it can make the aesthetics
pretty inconsistent, not to mention the whole
other topic of accessibility. When you have content editors who are in the system every day, it’s really easy for them to forget things like adding alt text and
adding web-friendly images. Some of these issues are really a lot easier to solve than others. So for example, we use the Imsanity plugin, which resizes every upload automatically, so if they’re uploading their CMYK 4,000-pixel-wide image it’s not a problem, it
automatically fixes it in the background for them. And there are also
plugins to check for basic accessibility standards,
like just making sure you have alt text on everything. But there’s still no
substitute for someone who’s experienced with WCAG standards manually reviewing the page
to make sure it’s up to par. So since communications and this vendor were the ones doing the migration, we gained control of our
information architecture. It took some time to smooth things over but eventually most people learned to appreciate the changes
or at least live with them. Some stakeholders had learned that if you push hard enough for long
enough, you can eventually wear down some of the
resistance and get your own way. So over time, several things happened. One school broke off and built
their own separate website. Except that in order
to give certain offices just one website to log
into to make changes, about 20% or so of that school’s content stayed over on the institutional website. So visitors had to click back and forth back and forth from site to site. Admissions then liked that microsite idea so they built their own. And since the focus of the
main institutional website was on admissions, a lot of
content got duplicated there. Meanwhile Google Analytics
was set up so that it was just quietly collecting
data in the background because no one really knew much about it. Another school pushed hard
enough for long enough that instead of just having one list of their academic programs, they got two. One by department, which they were certain everybody would be looking for and one under a list of
all the school’s programs. Plot twist: hardly anyone
used either of those paths. Most visitors came straight
from the university’s list of all the programs,
not really caring what school or department
a program was in. So over time, content was published in two or even three different places, all to make sure someone
could navigate to it using a specific linear path that somebody on campus really preferred. So it wasn’t quite the clean, fresh start that everyone had envisioned. So fast forward to the part of the story where I come into the picture. It’s 2015, three years
after WordPress launched, there’s already been a complete redesign and a number of enhancements. They’ve added various custom post types, cron jobs, taxonomies, you name it. Up until now, St. Mary’s
has been using the vendor who did the initial migration. So for hosting, updates,
enhancements, everything. And they found that just core and plugin updates take almost all the monthly retainer hours. So they decide to create an in-house position. I’ve just come in to fill that position and after browsing through the website from the outside and getting some initial recommendations like making
sure core is current, removing the second slider
plugin, things like that, I think we’re in pretty good shape. As soon as I’m given access into WP admin my mind is blown with the amount of custom post types
and pages, and content. The cheese is not as
fresh as I thought it was. So I slowly dig through each post type, about a third of which are
not actually being used. They were just developed
for a proof of concept and they’re sitting on the live site because there’s no dev site, there’s no staging, there’s just prod. One of the CPTs is built
just to spit out HTML to paste into Mailchimp, yet it’s still accessible through permalinks and sitemaps, so it’s competing with
our original news posts. There are thousands of URLs from what really should just be hundreds of pieces of content. So I start to map out the actual hierarchy of what’s used, what’s not, and what’s duplicated and triplicated. Even more fun, they have
set up that individual school on its own site, mostly like the institutional website, but not quite: its theme, even though it’s named the same, is about 10% different, just enough to make me have to comb through its code separately. As I spoke with the
vendor and poked my way deeper into this code base, I learned some scary new things. Like the site had been loading too slowly, so the vendor had created a custom table to hold a copy of page navigation and they used that
instead of wp_list_pages(). So those of you who raised your hands as developers, I hope
you’re either cringing or just not quite processing what I just said because it makes no sense. Later on I found out they’d
been using a combination of plugins that stuffed
the options table so full that the little database server was too overloaded to even
run wp_list_pages(). Which also caused some really fun downtime not long after I arrived. So it took a fair amount of time to slowly trim off the old things that weren’t used and the weird things that shouldn’t have been used. I felt a little like a contractor who has come into this monstrous building that looks clean and
simple on the outside, but inside the electrical wiring is about to catch fire,
the plumbing is lead and it’s routed so oddly,
you really don’t know what you’re gonna have to tear up to figure out where it goes,
and there are 17 different colors of paint on the wall. You just have to roll up your sleeves and start using some elbow grease on whatever looks like it might pose the most imminent danger
and ignore the moss green shag carpet until you’re sure the house isn’t going to collapse around you. So I had to remind
myself as the book says, see what you’re doing wrong, laugh at it, change and do better. A few months in we left the vendor and their hosting behind, and I’ll give an unsolicited
shout out to Pagely for helping us migrate
everything including our really weird complicated cron jobs and settings that a lot of
hosts might have overlooked. Now that the inside of the house, the code behind was somewhat cleaner, it was time to work on
the user experience. So having heard how well the changes went over the last time, I started to arm myself with data. It’s easy to demand the
navigation structure you like, but it’s harder to defend it when the numbers prove it doesn’t work. So once I mapped out all the content and checked our analytics
for basic page views, I found out 37% of our pages
were getting less than one visit every five months,
including pages that the people on campus demanding this navigation structure had insisted we keep in place. So now that the site wasn’t
crashing at random intervals, and we knew a lot of our
content was gathering dust, it seemed like the next logical steps were to trim the content
and the navigation. Again, because of the
previous negative response to major changes, my team
decided it would be safest to trim down navigation before we actually moved any pages. So in preparation, I consolidated our Google Analytics and added some filters to better handle the multiple subdomains that were feeding into the one account. And I set up Crazy Egg on
our most popular pages. Now Crazy Egg tracks pages one by one but we wanted to track
navigational clicks sitewide. So I set up a little javascript which added a class and a unique id to all the header and footer links. Every time one of those
things was clicked or tapped, a PHP script inserted click
data into a custom table. We recorded what page the person was on when they clicked,
which specific link they clicked or tapped, the link text so it was a little more human readable and their ip address so we could tell whether they were on campus or off. Nowadays with the GDPR
we would just set an on- or off-campus flag, so we wouldn’t be collecting the IP address itself. This screenshot on the
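The sitewide tracking described above can be sketched roughly like this. To be clear, this is an illustration, not the actual St. Mary’s code: the class name, id scheme, and the `/track-click.php` endpoint are all hypothetical, and the IP lookup (or, post-GDPR, just an on/off-campus flag) would happen in the server-side script.

```javascript
// Illustrative sketch of sitewide nav-click tracking.
// Names like 'nav-track' and '/track-click.php' are made up for the example.

// Tag every header and footer link with a class and a unique id.
function tagNavLinks(doc) {
  const links = doc.querySelectorAll('header a, footer a');
  links.forEach((link, i) => {
    link.classList.add('nav-track');
    if (!link.id) link.id = 'nav-link-' + i;
  });
  return links.length;
}

// Build the record the talk describes: the page the visitor was on,
// which specific link was clicked, and its human-readable text.
function buildClickRecord(link, pageUrl) {
  return {
    page: pageUrl,
    linkId: link.id,
    linkText: link.textContent.trim(),
  };
}

// Browser wiring (commented out so the sketch stays self-contained):
// document.addEventListener('click', (e) => {
//   const link = e.target.closest('a.nav-track');
//   if (!link) return;
//   navigator.sendBeacon('/track-click.php',
//     JSON.stringify(buildClickRecord(link, location.pathname)));
// });
```

A `sendBeacon` call like the one sketched in the comment survives the page navigating away, which matters when the click itself is loading a new page.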
slide is something I manually put together, so
it more visually displayed the data, since Crazy Egg
only shows a screenshot of what’s already visible on the page, not things like the drop down
navigation or segue elements. And it ended up being worth every minute I had to spend pasting and
nudging these numbers around, because it was so easy for everybody to understand exactly what
each little number meant. So as I suspected a lot of these links were not used or very seldom used and this was not over
summer or Christmas break, it was peak time. So based on this information, we reduced the number of links in
the header and footer from over 200 to under 80. This was gonna have a major
impact on user experience and SEO, all without
actually moving the cheese. Most people on campus
actually didn’t even notice. But the few who mentioned
it, mostly to ask for their link to be
restored, were able to live with the change for two reasons. One was, we had the data to show them only a small percentage
of on campus visitors were even using a particular link. The other reason people
accepted the change was we explained our plans for the future. This was not the final navigation, no news, nothing hard, but
this was an interim step. My team was just on the brink of launching a new intranet, which
would make it possible to gather all of our internal
documents on one site, meaning the institutional site could hone in on just prospective students and public audiences. We also had a plan to conduct a lot more usability tests and continue monitoring analytics to help guide our decisions. So with all the new data
we would be better able to bring the most sought
after content to the surface and make these stakeholders’ most important pages easier to find. Who wouldn’t want that? Another major change I made
early on was to fix site search. Anybody who has used
WordPress’s built-in search knows it’s lacking. And anyone who has a big
institutional website or multiple sites that
just can’t be searched because they’re separated, knows how painful this topic can be. If you have a single WordPress site or multi-site and that’s
all you need to search, there are plugins
available like Relevanssi. But in higher ed where every department in every school and office
has a separate budget, and a different platform
on a different subdomain, it just wasn’t going to be possible to use a WordPress plugin
to solve this problem. So after searching for a while, I found a company called
Swiftype that solved the problem. They have a crawler that will index as many domains and subdomains as you want with whatever restrictions you want. Obviously we wanted the
institutional website and the rogue school fully indexed. But then we had sites like Athletics where we only needed the homepage so the game results
didn’t overwhelm people searching on our main site. And then there were the other sites like the academic catalog
where we wanted a few key pages indexed, but not the whole
thing because visitors typically want to see a program page and not just a boring catalog description. The other big benefit with Swiftype is you can customize your search results. They have this drag and drop dashboard so you can set up specific pages as the top results for certain queries. You can also set up synonyms,
so things like dorms and residence halls will still bring up the same results. There are just tons of
customizations you can do. They’ll also send you weekly reports on your top keywords as well as keywords that have no results
so you can continually be improving your results. That was the first major website change that already received compliments so I felt like we set
a good precedent there. So by 2016 the header and
footer had been tamed. Old unused post types had been weeded out, the theme had been recoded,
you could actually search the website and find things, and we had a solid host to prevent embarrassing downtime. The critical cleanup
was done, so what next? My goal was to improve both the SEO and the user experience,
so I went back to the data. I researched how people were navigating to each major section of the website. Often there were two
or three different ways, they were all being used,
and often these paths were not the very linear drill down paths that we had created by going to a school, and then a department, then
the level, then the program. A lot of times people would drill down from academics to programs
to a specific program or even start off browsing an event and then search for a program that related to something that resonated with them. So it wasn’t a case of content being in a wrong place, different visitors just navigate differently and that’s okay. The problem was our
structures were all silos. So knowing that our
hierarchy was hurting us, we secured budget dollars to set out on our second big overhaul. Rebuilding WordPress from the ground up. It was time to completely
replace the theme and all the post types. So we could tackle the
actual content cleanup that we’d sidestepped when we just updated the header and footer. We found a new vendor and
zoomed out to a 10,000-foot view of the site. Not just at that point in time, but where we wanted it to go
in the next five to ten years. And during the course of several
multi-day planning sessions we mapped out the different
types of content we had and what we might want to build
in the foreseeable future. Since we also knew the
intranet was launching soon, we took that into account and we were able to really work toward our vision of making this main institutional site primarily focused on prospects and their parents, with just enough additional information to take care of the rest
of the general public. We scoured analytics to
find out how our visitors were connecting the dots. That was a crucial step: looking
through our site search logs to see what people were not
finding in the navigation as well as what pages
people tended to view after looking at a certain page. This all helped us understand
the thought process of the people we most
want to use our website, the prospective students. It helped that we had
Google Analytics Views set up specifically for
on and off campus traffic so we could focus specifically on the external visitors. We knew that nesting pages under pages under pages and maintaining
two or three copies of these pages under different paths was what was hurting us. So instead of thinking of every page as reachable through its tree, we had a lightbulb moment when we realized we needed to map out the
relationships between our content. So for one example we
determined we needed a faculty post type that can be assigned
to a department post type which is assigned to a school taxonomy. And the relationships
could then automatically build those navigational
pathways we needed without having to create
copies of anything. The new theme template for a department would automatically list out any faculty and programs associated
with that department so we would no longer have content editors manually managing these links. The templates would always
take care of it for us keeping the site fresher with less effort. It was the same with our new
post types and taxonomies. Everything was very carefully crafted to give us the structure we had needed from the beginning and take
advantage of automation. We had our plan for new content types but there was still this
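The actual build used WordPress custom post types and taxonomies in PHP, but the idea of relationship-driven listings can be sketched in a few lines of plain JavaScript. The sample data below is invented for illustration; the point is that each item stores its relationship once, and every listing is derived rather than hand-copied.

```javascript
// Invented sample data: faculty and programs each point at a department,
// and each department points at a school taxonomy term.
const faculty = [
  { name: 'Dr. Adams', department: 'biology' },
  { name: 'Dr. Baker', department: 'biology' },
  { name: 'Dr. Cole',  department: 'history' },
];
const programs = [
  { title: 'B.S. Biology', department: 'biology' },
  { title: 'B.A. History', department: 'history' },
];
const departments = [
  { slug: 'biology', school: 'sciences' },
  { slug: 'history', school: 'humanities' },
];

// What a department template would render: everything assigned to it,
// with no content editor maintaining the list by hand.
function departmentListing(slug) {
  return {
    faculty: faculty.filter((f) => f.department === slug).map((f) => f.name),
    programs: programs.filter((p) => p.department === slug).map((p) => p.title),
  };
}

// What a school landing page would render: programs from any of its
// departments, found by following the relationships.
function schoolPrograms(school) {
  const slugs = departments.filter((d) => d.school === school).map((d) => d.slug);
  return programs.filter((p) => slugs.includes(p.department)).map((p) => p.title);
}
```

Because nothing is duplicated, moving a program to a different department is a one-field change, and every listing that touches it updates automatically.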
labyrinth of content that didn’t quite fit into clear buckets. There were still pages
that had not been seen in months or even years, how do you go about cleaning up that mess? Well we did not use stacks
of printouts and red pens; instead, I wanted to
sort the pages digitally in some way that would let
our whole communications team help with the work. So I tried out different site map services but none of them ended up
being quite right for us, mostly because our hierarchy
was just so sprawling, it was hard to capture on a single screen. So I did something unusual: I combined Google Sheets, card sorting and Balsamiq. And I’ll walk you through what
we did one step at a time. First I pasted every
single URL of our websites, the institutional site
and the rogue school into Google Sheets. I then spent what felt like an eternity adding analytics for each
page so we could judge what to keep and what to
trim not only by opinion but by numbers. It took about a week to pull all this data and double check it for accuracy. I played around with the
API which will allow you to pull it automatically,
but because we were dealing with so different pages,
we maxed out the number of possible API calls and it
just didn’t work smoothly. So I stuck with the manual process to make sure we got data
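The talk doesn’t show the actual API code, but the quota problem it describes is the kind of thing you would normally work around by batching requests. Here is a rough, hypothetical sketch: `fetchBatch` is a stand-in for whatever analytics call you are using, and the batch size would depend on your API’s limits.

```javascript
// Split a long list into fixed-size batches.
function chunk(items, size) {
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Pull pageview numbers for every URL, one batch at a time, so a single
// oversized request never blows through a per-call quota. fetchBatch is a
// hypothetical stand-in that accepts several page paths per request and
// resolves to an object of { path: pageviews }.
async function pullAllPageviews(urls, fetchBatch, batchSize = 10) {
  const results = {};
  for (const batch of chunk(urls, batchSize)) {
    // Sequential on purpose: one in-flight request at a time.
    Object.assign(results, await fetchBatch(batch));
  }
  return results;
}
```

Even with batching, daily quotas can still bite on a large enough site, which is why falling back to a manual pull, as described above, can end up being the pragmatic choice.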
for every single page. Then it was a matter of going through line by line to decide what could move to the new intranet, what
could be completely removed and what could be combined. So here’s our spreadsheet
showing each page title, its analytics and the old URL. And I believe this was about six months’ worth of unique page views
in that number column. Another column that didn’t
quite fit into the screenshot here showed the new URL
where it was going to live. And we also tracked all
the printed vanity URLs we knew about, so our spreadsheet came to over 1,700 lines, which I know is small for some of you, but it was huge for us. So we cut some of our key URL structures, but in the end we went from over 1,200 pages plus our news articles on the main site down to 450 plus news, so there were a lot of redirects to set up. The only way to make
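One way to keep that many redirects from falling through the cracks is a small script, hypothetical here rather than from the talk, that turns rows exported from the tracking spreadsheet into Apache-style redirect rules and flags any page that still lacks a destination:

```javascript
// Hypothetical helper: each row comes from the tracking spreadsheet and
// holds a page's old URL plus the new URL where its content now lives.
function buildRedirects(rows) {
  const rules = [];
  const missing = [];
  for (const { oldUrl, newUrl } of rows) {
    if (!newUrl) {
      missing.push(oldUrl);        // no destination decided yet: flag it
    } else if (oldUrl !== newUrl) {
      rules.push(`Redirect 301 ${oldUrl} ${newUrl}`); // Apache-style rule
    }
    // unchanged URLs need no rule at all
  }
  return { rules, missing };
}
```

Running something like this before launch gives you both the redirect file and a to-do list of pages whose fate is still undecided.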
sure we didn’t miss any was to track everything. That was what Google Sheets let us do, and it allowed our whole team to go in and edit together. Now a giant spreadsheet
like that is enough to make anyone go
cross-eyed, let alone someone who thinks visually like
me and the designers here in the room. So I went back to my UX toolkit. Card sorting is a way to rethink
categories and structures. If you haven’t heard of it,
it’s basically a process that asks different people
to group information whatever way it makes sense to them. You can do it with post it notes but with that many pages, I think we would have had to take
over the whole arena. So I found a free card sorting program and we did an open sort, where we started with our lower level pages and let people group them into buckets. We named the buckets later
and those then became our top level and second level navigation. The other option you have with
card sorting is a closed sort where you already know
your top level categories and you just have people
group the cards into them in whatever way feels logical to them. Going back to the wisdom
of Who Moved My Cheese? He knew sometimes some fear can be good. When you’re afraid things
are gonna get worse if you don’t do something, it
can prompt you into action. But it’s not good to be so afraid that it keeps you from doing anything. So as you can imagine, sorting 1,200 pages into buckets was a pretty
long and painstaking process and we went through several iterations. But in the end, just like a jigsaw puzzle, things started to fall into place and eventually we ended up
with a manageable number of groups. Then I took the groups back to the team and we chose labels for them together and we made sure that
overall, we were not missing any major buckets. The goal of this stage was
not to make sure everything was perfectly categorized but
just to identify top level and second level categories,
everything basically that would eventually go
into our top navigation. So now we have this giant spreadsheet to track all the things
and we had all the content loosely sorted into buckets. As a next step, we needed a way
to visualize the site map. Not long before we started on this project I had set up a new website for our alumni so I still had access
to a Balsamiq membership which had worked great for wireframing. It suddenly occurred to me that I could use their little text boxes and arrows to build a
nice simple site map. You can also do this
in Visio or Illustrator and in hindsight, one of
those might have been a little easier, because you
can control page sizes, and we have a few people in our office who really need that hard
copy to review anything. But Balsamiq’s workspace
expands just as big as you need it to be; there’s no limit on your artwork size. So it wasn’t constrained, which is why it worked so much better for me in the end than any of the site mapping tools I had looked into. And snapping the arrows to points, plus the alignment tools, helped my slightly OCD personality make sure everything was spaced out evenly and snapped to the right points. So I started with those buckets we identified in card sorting
and then went one by one through the spreadsheet checking each page to see how much content it had and adding a new square to Balsamiq each time it looked like
we needed a new page. Every square contained the new page title and all the different spreadsheet lines where content was coming from; all those little numbers are different lines in the spreadsheet, pointing to the original pages that were going to be combined. Many of these new pages combined content that had previously been spread across four or five separate pages and again, a lot of this was
not about linear hierarchy, it was just grouping and
adding the right relationships. Once I had a rough draft I
printed it out on 11 by 17 pages and met with our team to review and adjust, and we iterated until we felt we’d arrived at the best structure. I should mention this was not a completely linear process from step one
to step two to step three. For the hierarchy project
we did sit down first and review analytics and
predictions to determine the overall content types
and the relationships. That way our vendor was
able to start building out those custom post types, taxonomies, all the relationships while
we sorted through the content. Some of the content structures were more straightforward; for example, we knew we were still going to have a campus life section, so some of that type of known content we loaded earlier into the new site. And there were other pages where we weren’t quite sure yet where in the structure they would land. So although this may sound like a clear, logical progression in hindsight, in reality it was a little bit messier and we still faced the same problem as the first migration. We couldn’t really export the
old site and just import it, because the structure
was so very different. And we had about a month’s worth of time when everyone had to enter the changes into both the old live site
and the new staging site. But with the visual site map to guide us and the spreadsheet to
track all the content so we wouldn’t lose
anything in the transition, we at least knew what
percentage of the content had been loaded, and
therefore roughly how much time was left before we could
switch over to the new site. While we were wrestling with the content, the vendor was mocking up various theme templates to make sure that our fresh new site would also get a facelift. When you’re changing
your entire information architecture, there’s really no progressive enhancement or A/B testing of little changes
where you can send people to two different versions
and determine a winner. So to measure the success of this project, we looked more to the page views and the time on page for the content that we most needed our
visitors to look at. Like the list of all of
our academic programs and the main admissions page. And while the vendor mocked up pages we did some usability
testing to make sure visitors’ perceptions of the new designs were positive. These tests gave us some really good insights
designs were right on track but there were a few tweaks
that would make the overall experience a lot better. We made a number of those
tweaks before we ever launched. Our whole office pitched
in to copy and paste from the various source pages
into the new staging site. And then previewed it to
make sure that the finished pages made sense visually and not just on that big old spreadsheet. If anybody here can build a tool that can magically crunch all these numbers and build the new site, I’m sure I can find budget for it somewhere. But comparing the data
before the restructure and after, it’s a night
and day difference. And even on a shoestring budget, you really can make an
impact if you’re willing to put in the elbow
grease and track results. So here are some of the
things we’ve learned from these two big overhauls. Lesson number one: make smaller
changes whenever possible. Focus on one section of your website or something sitewide like button styling. So instead of waking up
to a whole new website, stakeholders can see small
changes happening over time. Lesson number two: collect as much data as you can to support your changes. Do usability testing, do A/B testing, even pull a report from Google Analytics. Anything that helps justify the reasoning behind the changes you’re making. Lesson three: keep your
stakeholders in the loop. They won’t wanna hear
about every change you make, but some of the folks
like your admissions teams and deans are gonna wanna
know if you change the layout of all the pages about
majors and graduate degrees. Give them a little
summary, maybe a preview on the staging site before
the change goes live. And again, give them
some background and data to back up your choices. Lesson four is offer options. Whenever certain stakeholders
are losing power, give them a concrete
plan so they don’t feel like you’re completely
cutting them out of the loop. In our second overhaul,
everyone who was not in our communications office lost editing permissions on the main website. That was hard to swallow for some folks who had frequent updates. The key there was to
think through the process. So we already had a new request form built when we talked to those offices. That way we could reassure them that their updates would get posted in a timely manner. We’d actually thought
through how this would impact them and how we could make things better. In a couple of cases the
offices ended up happier because we can actually
make updates a lot faster than they can and that’s one less thing on their to do list. And when it’s not a case of power loss, but maybe you’re simplifying and losing some functionality, offer options. So instead of just telling
people this is the new way, we’re rolling with it,
bring them in halfway through the process and explain why their preferred ways are not working. Help them see the bigger pros and cons and make suggestions of their own so they’re helping you
decide how to move forward and meet all your goals together. Last but not least, I want to leave you with a couple more tools to
make cheese moving less painful. I can’t recommend A/B
testing highly enough for the small changes,
anything where you’re not overhauling your whole
information architecture, which should be a very rare occurrence. We’ve used a number of testing tools over the past few years,
and the two that have worked best for us are Visual Website Optimizer and Google Optimize. Visual Website Optimizer is
exactly what its name says. You set up an A/B test where you’ll have a control group and
at least one variation. So for example one of
our tests was to change the above the fold content
on our main about page, to see if we could funnel people to our most important pages faster. Half of our website visitors
saw the old about page with lots and lots of paragraphs and half of them saw the new version which split up some of our key differentiators in a little list and added
some little images as well. VWO let us set this test
up by just adding a line of JavaScript to the one page, and then we used its WYSIWYG editor to
create a new second layout so I didn’t have to
touch the sitewide styles or anything like that. The new version blew the old one away with visitors spending
more time on the page and clicking through the
new links to our top pages. So we were able to test this change and confirm that it was
these content changes rather than any outside
factors that contributed to visitors staying longer
to consume more content and going straight to
our top converting pages. Google Optimize is really
similar though I found I had to get pretty creative
with the measurements there. One test we ran that was really successful was when we changed our
sitewide fonts this spring. We were tired of the all caps heading font we had loved during the hierarchy project and we wanted to see how different fonts would affect engagement. So we initially picked eight fonts that coordinated with
our printed collateral. I set up our staging
sites so the about page could render each font
based on a query string. So everyone in our
office could see exactly what they were going to look
like with beautiful content. From there we narrowed down the list to the three that we liked the most. To make this test work
sitewide, I had to load all those fonts plus our old
fonts in the head of the site, and I added a line of CSS as well, so that if the body had a specific class, it’d force all the fonts to render in one of these new variations. Then in Google Optimize I
created the four variations. The original plus the three new fonts and I just added that body
class in the variations to make the different fonts appear all across the whole website. It was a little hard
figuring out what variable to measure as the goal,
since Google Optimize’s goals are fairly primitive
compared to some of the paid options. But I ended up
hypothesizing that if people found it easier to read the website, they should stay on our website longer and perhaps explore a little more. So I set page views as the goal. I was really surprised
at the huge difference the fonts made. Our control with our original fonts averaged 2.7 pages per session. One of our test fonts
matched that almost exactly. Another test font came in
at 2.58 pages per session, and the winner raised the
bar to 2.9 pages per session, which is a 7% increase
in the average number of pages people look at
before they leave our site. Keeping people on the
site just by changing the fonts was a huge win. So I hope I got you thinking
about some experiments you wanna try and ways
to help your stakeholders become more comfortable when
their cheese gets moved. One last reminder here: if you have questions, we’re doing them, I
think, mostly in Slack. Is the website accepting- – [Facilitator] Podcast channel is open and for those of you
here, you can ask out loud or post stuff in the
discuss channel on Slack. – [Elaine] Don’t forget
to submit feedback. Does anybody have questions in the room? – [Student] What was the
name of the A/B tester other than Google Optimize that- – [Elaine] Visual Website Optimizer. It goes by VWO for short. It’s a great service,
it’s really affordable. (audience talks over one another) – [Facilitator] Can
you repeat the question he just asked you? – [Elaine] So the question is,
if we had to do anything to- If we had to do it again,
would we do things differently? I know for the first one,
they definitely would have tried to find a new PHP
expert to actually export content instead of doing
the manual parsing. For the main hierarchy project, I think the biggest thing I would change would have
been to have two different environments to work in so that the vendor could have been working
in a dev environment and we could be loading in staging without all those fluctuations of the site suddenly breaking as
they wrote new code. That’s about all I would
change on that project. – [Student] What did
you use for an intranet? – [Elaine] For intranet we ended up using, oh gosh, I think it’s an
Ellucian product I wanna say. I will have to get back
to you on Slack with that. – [Student] What’s next? – [Elaine] Right now we are working on redesigning just the homepage because we still have
that same static homepage we’ve had since the
hierarchy project in 2016 and we wanna kind of
feed it more dynamically, so our news posts and different things will change automatically without somebody having to make changes all the time and our events will be higher up. So smaller projects like
that are the focus now. – [Student] So with your redesign, how big was your team and your budget? – [Elaine] How big was the team and budget for the redesign? We ended up with a $25,000
budget for our vendor, which I think was pretty fair to them. As for our team, I’m the only developer, SEO, and analytics person, but on
our communications team we have anywhere from about 12 to 15 different content editors in the system, so that’s the size of our team. – [Facilitator] If
anybody has anything else you can ask Elaine, otherwise,
I think it’s lunchtime so enjoy. (applause)

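The sitewide font test Elaine ran through Google Optimize, where every candidate font is loaded in the head and each variation adds one class to the body that a CSS rule keys off of, could be sketched like this. The class names and font stacks are hypothetical:

```javascript
// Hypothetical sketch of the body-class font override. A sitewide
// stylesheet carries one rule per variation class; the A/B tool's only
// job is to add the class to <body>.
const VARIANT_FONTS = new Map([
  ["font-variant-b", "'Candidate Sans B', sans-serif"],
  ["font-variant-c", "'Candidate Slab C', serif"],
]);

// Build the override rule a sitewide stylesheet would carry for one
// variation class; unknown classes produce no rule.
function overrideRule(variantClass) {
  const stack = VARIANT_FONTS.get(variantClass);
  if (!stack) return "";
  return `body.${variantClass}, body.${variantClass} * { font-family: ${stack} !important; }`;
}

// Inside a variation, the injected code might then be as simple as:
//   document.body.classList.add("font-variant-b");
```

Because the heavy lifting lives in the stylesheet, each variation stays a one-liner and the original (no class) renders the existing fonts untouched.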
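The pages-per-session lift Elaine reports can be double-checked with a quick calculation using the numbers from the talk (2.7 pages for the control, 2.9 for the winning font):

```javascript
// Percent lift of a variant over its control.
function percentLift(control, variant) {
  return ((variant - control) / control) * 100;
}

const lift = percentLift(2.7, 2.9);
// lift is roughly 7.4, which the talk rounds to "a 7% increase"
```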