Monitoring Mentorship @ Hackathons

How HackMIT (and others) can use HELPq to gain insight on mentorship


What is HELPq?

HELPq is a real-time help queue and mentor management application, targeted at hackathons and classrooms where issues need to be claimed and resolved within minutes. It includes a simple interface for requesting and claiming tickets, and a powerful dashboard for administering users/mentors and examining metrics.

HELPq was originally built for HackMIT, but has been used at hackathons like Blueprint, Meteor Summer Hackathon 2015, WHACK, MakeMIT and WildHacks (among others!).

It is open-source and super easy to extend and customize! There's plenty of documentation and more gets added all the time. You only need to edit a few config files.

This queue has been used at several hackathons now - and we've seen some really fantastic numbers for mentor ratings and response time.

At HackMIT 2015, 425 users submitted 502 tickets through our system, with an average response time of 9:37 and a median response time of 2:48 (minutes:seconds). Our data shows that, the majority of the time, a hacker could request help and receive world-class mentorship in person, at their table, within five minutes - at a 1000+ person hackathon.
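As a sketch of how those figures are derived - assuming each ticket records a submission and a claim timestamp (the field layout here is hypothetical, not HELPq's actual schema) - the average and median response times can be computed like this:

```python
from datetime import datetime
from statistics import mean, median

# Hypothetical ticket records: (submitted_at, claimed_at) pairs.
tickets = [
    (datetime(2015, 9, 19, 14, 0, 0), datetime(2015, 9, 19, 14, 2, 30)),
    (datetime(2015, 9, 19, 14, 5, 0), datetime(2015, 9, 19, 14, 8, 0)),
    (datetime(2015, 9, 19, 23, 0, 0), datetime(2015, 9, 19, 23, 30, 0)),
]

# Response time in seconds for each claimed ticket.
response_secs = [(claimed - submitted).total_seconds()
                 for submitted, claimed in tickets]

def fmt(secs):
    """Format seconds as M:SS, like the figures quoted above."""
    return f"{int(secs // 60)}:{int(secs % 60):02d}"

print(fmt(mean(response_secs)))    # average -> 11:50
print(fmt(median(response_secs)))  # median  -> 3:00
```

Note how a single slow late-night ticket drags the average far above the median - the same effect we saw in the real numbers.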

You can read more about it (and set one up yourself) on GitHub! Don't forget to star ;)

Woo, Data!

While the queue has its own built-in metrics panel, it's not (yet) extremely comprehensive. So, I've decided to spend my time (at a hackathon, of course!) looking back on past data, aggregated across hackathons, to see if there was anything interesting we could get out of it...

The first thing that I wanted to check was mentorship over time. Considering hackathons tend to be overnight events, hackers will often need assistance at odd hours of the night. Whether there will be a mentor available, however, is another question.

Percentage of Tickets Submitted By Hour by Hackathon

[Chart: ticket distribution by hour for each of Blueprint, Meteor Summer Hackathon, HackMIT, and WHACK]

Above are the percentages of total tickets submitted per hour, by hackathon. Notably, Blueprint is an outlier here - its distribution is concentrated in the midday because it was not an overnight hackathon.

From these graphs you can roughly estimate things like hackathon start time and intensity. For Blueprint, the peak time for assistance was the 12pm-1pm slot, tapering off as hacking ended at 5pm. The Meteor hackathon showed pronounced peaks in the evening and (possibly) moments of panic late at night before people eventually went to sleep. HackMIT and WHACK had 'flatter' request distributions, where help was always something someone needed.
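The per-hour distributions above boil down to a simple bucket-and-normalize step. A minimal sketch in Python, using made-up submission timestamps rather than real HELPq data:

```python
from collections import Counter
from datetime import datetime

# Hypothetical submission timestamps for one hackathon.
submissions = [
    datetime(2015, 9, 19, 12, 10),
    datetime(2015, 9, 19, 12, 45),
    datetime(2015, 9, 19, 13, 5),
    datetime(2015, 9, 19, 22, 30),
]

# Count tickets per hour-of-day, then normalize to percentages.
by_hour = Counter(ts.hour for ts in submissions)
pct_by_hour = {hour: 100 * count / len(submissions)
               for hour, count in sorted(by_hour.items())}
print(pct_by_hour)  # {12: 50.0, 13: 25.0, 22: 25.0}
```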

Percentage of Tickets Submitted By Hour Overall

This excludes data from Blueprint, which was not an overnight hackathon.

Every hackathon except Blueprint was included in the overall aggregate, as they all tended to start some time in the morning and end ~24-36 hours later.

The most notable thing here is that the majority of tickets are submitted between the hours of 3-11pm, before eventually dropping dramatically in the late hours - but never to zero.

Here we can see the ramp-up of hackers' projects: hackers solidify what they actually want to do and try to build it out at the beginning, before eventually hitting technical difficulties later in the evening. For seasoned hackathon attendees, this insight really isn't terribly surprising. Problems always start piling up as it gets later :)

Still, it's nice to see that we have data showing this to be typical hackathon behavior.

Median Response Time By Hour (Overall)

Response time in minutes. Lower is better!

People are submitting tickets, but are there mentors there to claim them?

Lower median response time is better - the faster we can get people to help, the better.

The median stays quite low - well under 5 minutes for the majority of the afternoon - until about 7pm, at which point the response time takes a dramatic hit. We can attribute this behavior to company mentors being reasonable adults and heading home (or to their hotels) to sleep before returning the next morning.

A sizeable number of tickets are still submitted during this time, and the few mentors who remain do their best to satisfy them until the rest wake up and return to the hackathon.
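The hourly medians in this chart can be reproduced with a group-by over (hour, response time) pairs - the data below is illustrative, not from the real queue:

```python
from collections import defaultdict
from statistics import median

# Hypothetical (hour_submitted, response_minutes) pairs.
responses = [(14, 2.0), (14, 4.0), (19, 10.0), (19, 30.0), (19, 50.0)]

# Group response times by the hour the ticket was submitted.
by_hour = defaultdict(list)
for hour, minutes in responses:
    by_hour[hour].append(minutes)

median_by_hour = {hour: median(mins) for hour, mins in sorted(by_hour.items())}
print(median_by_hour)  # {14: 3.0, 19: 30.0}
```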

Quality over Time (Overall)

Average mentor ticket rating out of 5, for each hour.

Does the quality of mentorship change over the course of the hackathon?

Mentor ratings, as given by the hackers receiving help, average in the upper 4-5 star range - mentors do an excellent job!

Interestingly, mentor ratings dip from 2-3:59am, but rise again!

We can form a few theories about this. Perhaps it's late at night and hackers are more frustrated? Or perhaps a lack of mentors with technology-specific knowledge means that, while the mentors on hand try their best, they may not necessarily have the answers - but they're all hackers have got!

The hours of 4-6am can be seen as a time of desperation, when any help is appreciated with a grin and five stars, and sleep-deprived hackers begin to "just get it done enough for a demo."
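The hourly rating averages behind this chart follow the same group-by pattern - again with hypothetical (hour, rating) pairs standing in for real feedback data:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical (hour, rating) pairs from ticket feedback (1-5 stars).
ratings = [(1, 5), (1, 4), (3, 3), (3, 4), (5, 5)]

# Group star ratings by the hour the feedback was given.
by_hour = defaultdict(list)
for hour, stars in ratings:
    by_hour[hour].append(stars)

avg_rating = {h: round(mean(r), 2) for h, r in sorted(by_hour.items())}
print(avg_rating)
```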

Topics and response times

Which topics were answered the fastest? The slowest?

Users must submit a topic for their help ticket - but how does the topic presented affect how long you'll have to wait for help?

20 Most Common Topics

Ordered By Response Times

20 of the most common words from ticket topics were selected.

Right off the bat, we see that people really need a lot of help with Android. We can also see what kinds of technology people are interested in using. General APIs are, of course, king, while specific technologies like Parse, Meteor, and Swift expose some interesting trends in choosing web apps versus mobile - the people who need help the most are probably building an Android or web app.

Interestingly, the topics people need the most help with are also the topics with the longest wait times. Android and Parse help is in high demand but also comes with some of the longest waits!

From this, it seems that while mentors are quick to accept tickets with general topics like 'api' or 'python', there is either a lack of mentors who specialize in mobile, a hesitance to debug newer technology, or a combination of both.
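For the curious, the topic ranking comes from tokenizing each ticket's free-text topic and counting words - a sketch with invented topics:

```python
from collections import Counter

# Hypothetical free-text ticket topics.
topics = [
    "android studio crash",
    "parse push notifications on android",
    "python flask api",
    "meteor deployment",
]

# Tokenize each topic into lowercase words and count across all tickets.
words = Counter(word for topic in topics for word in topic.lower().split())
print(words.most_common(1))  # [('android', 2)]
```

A real pass would also strip stopwords like "on" before ranking.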

What about sets of topics?

Taking the 100 most common topics, we can build sets of words where people need help with multiple topics in the same ticket.
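One way to build those sets is to count unordered word pairs that co-occur in a single ticket's topic. A small sketch (with made-up, pre-tokenized topics; the real analysis would first filter to the 100 most common words):

```python
from collections import Counter
from itertools import combinations

# Hypothetical tokenized topics, one set of words per ticket.
topics = [
    {"node", "express", "api"},
    {"node", "api"},
    {"flask", "api"},
]

# Count unordered word pairs co-occurring within a single ticket topic.
# Sorting makes ('api', 'node') and ('node', 'api') the same pair.
pairs = Counter(
    pair
    for words in topics
    for pair in combinations(sorted(words), 2)
)
print(pairs[("api", "node")])  # 2
```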

20 Most Common Topic Sets

Ordered By Response Times

Once again, 20 of the most common sets were selected from ticket topics.

This is super cool, because now we have even more evidence of what people are interested in and what mentors are hesitant to answer. Once again, we see high demand for topics with a low supply of mentors. Ruby on Rails seems to be a much less popular framework than it was just a few years ago, as Node and Flask take the lead in mentor knowledge and demand.

Plenty of people are also trying to make use of push notifications, calendars, and the Uber API.

Abandonment :(

tickets were abandoned.

That is, they either expired or were never claimed.

Top 10 abandoned topics

Android unfortunately seems to make the list of most requested and subsequently abandoned ticket topics, along with Unity and Azure.

Unity undoubtedly points to an Oculus VR hack, which is often frustrating because finding adequate help from someone knowledgeable about Oculus can be very difficult.
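Identifying abandoned tickets in the raw data is a one-line filter - a ticket counts as abandoned if it expired or was never claimed. The dict fields here are hypothetical stand-ins for HELPq's actual schema:

```python
# Hypothetical ticket records.
tickets = [
    {"topic": "android", "claimed": True,  "expired": False},
    {"topic": "unity",   "claimed": False, "expired": False},
    {"topic": "azure",   "claimed": False, "expired": True},
]

# Abandoned = expired, or never claimed by any mentor.
abandoned = [t for t in tickets if t["expired"] or not t["claimed"]]
print(len(abandoned))  # 2
```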

Mentors vs Power Mentors

Mean response time by number of tickets completed per mentor

A lot of mentors will answer one or two tickets, but far fewer go beyond that, let alone reach double digits over their 24-hour period.

In the graph above, we see only a few who answer 5-10 tickets, and far fewer who answer more than 10.

However, mentors who answer many tickets tend to have particularly low response times - indicating that there are some mentors who effectively wait on the queue and are always ready to go. That's awesome!
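The per-mentor statistics behind this graph - tickets completed and mean response time - can be sketched as a group-by over completion records (mentor names and times invented for illustration):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical (mentor_id, response_minutes) per completed ticket.
completions = [
    ("alice", 2), ("alice", 3), ("alice", 1), ("alice", 2),
    ("bob", 20),
]

# Group response times by mentor.
by_mentor = defaultdict(list)
for mentor, minutes in completions:
    by_mentor[mentor].append(minutes)

# For each mentor: (tickets completed, mean response time in minutes).
stats = {mentor: (len(times), mean(times))
         for mentor, times in by_mentor.items()}
print(stats)
```

Here the "power mentor" completes four tickets with a far lower mean response time than the one-ticket mentor - the pattern the graph shows at scale.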

Things to Consider

Note - because the aggregate includes HackMIT 2015, its sheer size can introduce a significant bias in the data. However, the hackathon structure, timescale, and emphasis on queue usage were about the same in each.

Looking retrospectively at data lets us identify trends. We can always use this information to try and make mentorship better.

Incorporating more ex-hackers and students could be a great way of improving mentorship at hackathons, especially since those with hacking backgrounds have more experience with the "newest, freshest" APIs. Improvement could also come through incentives for mentors to stay longer, or encouragement for hackers to actually sleep!

There's a lot of potential to make this element of hackathons even better, and having technology to facilitate it and the data to back it up will help.

Hackathons are great for learning, and an equally great place to teach. They're not just one of the best places to build something real - they're also a place where you can help shape the minds of potential new computer scientists and watch people get excited about learning.

Hackathons helped me get started. And while I don't hack nearly as much anymore, I still love going to help mentor.

Once again, you can read more about the queue (and set one up yourself) on GitHub! Don't forget to star ;)

You can also contact me via email at ehzhang@mit.edu or Twitter @ehzhang!



Edwin Zhang
I do things sometimes! MIT '16