
CrowdSolving 101: How Long Does a Humanitarian Challenge Need to Get Results?

Approval for any innovation project requires satisfactory answers to three key questions:

  1. How likely is a viable solution to come out of the project?

  2. How much will the project cost, in out-of-pocket expenses and internal labor?

  3. How long will the project take?

This blog entry tackles the third question and shares lessons from four recently completed open-innovation challenges where SeaFreight Labs served as Project Advisor. These challenges were run by Habitat for Humanity and World Vision to solve long-standing humanitarian problems related to the work of each organization [see Note 1 below]. They ran from late 2020 into 2021 on the InnoCentive platform and with the InnoCentive crowd.


Future blog entries will tackle the other two questions.


Figure 1 (below) documents the elapsed time spent in each major phase of these four humanitarian crowd-solving projects. On average, the challenges took slightly more than a year to run: challenge design took about 100 days, the challenges were open to the public for about 90 days, and judging took another 190 days to decide on a winner. In the cases where there was a public announcement of the winners, that step took an additional 50 days, on average.

Figure 1.
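For readers who want to check the arithmetic behind the "slightly more than a year" figure, here is a minimal sketch in Python that simply tallies the rounded per-phase averages quoted above (the numbers are the same approximations used in the text, not additional data):

```python
# Rounded average elapsed days per phase, as quoted in the text above
phase_days = {
    "challenge design": 100,
    "public solicitation": 90,
    "judging (round 1 through final)": 190,
}
announcement_days = 50  # applies only to challenges that made a public announcement

core_total = sum(phase_days.values())
print(f"Design through winner selection: {core_total} days (~{core_total / 365:.2f} years)")
print(f"With a public announcement: {core_total + announcement_days} days")
```

This works out to roughly 380 days from design through winner selection, and roughly 430 days when a public announcement follows.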


My first caveat is that a commercial organization would very likely not require so much elapsed time to run a successful challenge. A humanitarian organization's need to involve people beyond the initial project team in both challenge design and judging contributed significantly to the elapsed time consumed in both of those phases. An organization that could make all necessary decisions internally would dramatically reduce elapsed project time.


A second caveat is that these challenges could easily have taken quite a bit longer than they did if the project leadership at the Seeker, InnoCentive and SeaFreight Labs had not made it a consistent priority to push the project forward in search of the hoped-for solutions.


So, where did all the elapsed time go?


Phase 1: Challenge Design (= 100 days, on average)

  • Challenge design was significantly affected by the fact that no one on any of the project teams had prior experience with open-innovation crowd-solving. The entire team at each organization needed to be trained on how to select problems well suited for crowd-solving and how to recognize which problems to reject.

  • Challenge design was also lengthened by the fact that these projects were added mid-year to the project leaders' already full plates. Challenge design competed with existing job responsibilities, and this tension sometimes led to delays in turning around a draft of a challenge statement. Each published challenge statement went through an average of six drafts as the team refined, and further refined, the problem statement to its minimal essence.

  • Designs for later challenges run by the same teams were completed in as few as 61 days.

Phase 2: Solution Solicitation (= 90 days, on average)

  • All four challenges were designed to run for 90 days in order to allow enough time to attract enough solvers to give each challenge the best chance of receiving valuable submissions. Figure 2 shows that the 3-month run for each challenge provided the time to recruit the number of solvers that ultimately led to successful outcomes.

Figure 2.

  • I believe a shorter public solicitation period would have been counterproductive to the objectives of the challenges.

Phase 3: Round 1 of Judging (= 40 days, on average)

  • This round of judging is used by the project team to eliminate weaker submissions and create a ‘short-list’ of semifinalists. In this round, it took judges an average of 10-15 minutes to read and evaluate each submission. With round-1 submission counts ranging from 22 to 71, there was a wide range in the amount of a judge’s time required to complete the work in this round (4-15 dedicated work-hours per judge; see the short sketch after these bullets).

  • Different project teams made different decisions about how many people should be involved in this phase. The number of round-1 judges ranged from 3 to 7.
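The workload range above follows directly from the per-submission estimate and the submission counts. Here is a minimal Python sketch of that arithmetic, assuming the midpoint of the 10-15 minute estimate as the average reading time:

```python
# Assumed inputs drawn from the ranges quoted in the bullets above
avg_minutes_per_submission = 12.5   # midpoint of the 10-15 minute estimate
submission_counts = (22, 71)        # smallest and largest round-1 pools

low_hours = submission_counts[0] * avg_minutes_per_submission / 60
high_hours = submission_counts[1] * avg_minutes_per_submission / 60
print(f"Per-judge round-1 workload: {low_hours:.1f} to {high_hours:.1f} hours")
```

This prints roughly 4.6 to 14.8 hours, which is consistent with the 4-15 dedicated work-hours per judge quoted above.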

Phase 4: Round 2 of Judging (= 40 days, on average)

  • This round of judging is used by the project team to select a set of finalists, who are then either reviewed by a larger, external group (2 challenges) or invited to participate in a field test that provides objective comparisons between the finalists (2 challenges).

  • Habitat-India did not need a round 2 because it went straight to external judging in the next round.

  • Habitat-Philippines used external judges for this round, so it needed time to recruit the judges and educate them on the challenge. This added 30-40 days to the phase.

Phase 5: Final Judging (= 110 days, on average, with wide variation depending on field testing)

  • This round of judging is used by the project team to decide on the challenge winners, if any. Happily, all four challenges awarded the full challenge prize amount.

  • COVID-19 had a major impact on two of the challenges.

a. The Habitat-India team was significantly delayed in its final judging by its use of external judges and by the impact of COVID-19 in India during the judging. These external factors likely added 45-75 days to the project.

b. The World Vision Chlorine Monitor challenge was also affected by COVID-19. The project lost 2-4 finalists because they could not get into their labs to build the requested prototypes. The finalists that did complete a field test needed an extra 60-90 days to obtain parts and the necessary lab time.

  • Two of the projects requested field testing from the finalists (Habitat-Philippines and World Vision Chlorine Monitor). Solvers were asked to build out their designs and submit videos showing the process and/or results of the prototypes. Since each solver was doing all of this work with only the hope of winning the challenge prize, and with no other remuneration, this process took 90-150 days.

Phase 6: Public Announcement (= 50 days, on average, but only for some of the challenges)

  • This phase is used to create the marketing message about the winners and the solution. It is tied into the broader marketing strategy of the humanitarian organization.

  • In the case of Habitat-Philippines, this was a 1-hour virtual awards ceremony attended by over 200 people. This led to a number of articles in the local press.

  • In the case of World Vision’s sanitation challenge, this was a press release.

  • The other two challenges have not yet made public announcements.


It is important to remember that the completion of a challenge is only an intermediate milestone. Selecting a winning solver and winning solution does not achieve anything UNLESS the organization subsequently applies its new intellectual property against the initial target problem and eventually makes that problem a thing of its past. Achieving this outcome requires more investment of resources, capital and elapsed time. The ultimate success of each of these crowd-solving projects is totally dependent on these follow-on projects, and I wish each team great success in its efforts.

[Note 1] Habitat for Humanity’s challenges were “Increasing Resilience to Earthquakes and Typhoons for Homes with No Foundations” which launched as an RTP (“Reduction-To-Practice”) challenge on 7 October 2020 and closed on 5 January 2021, and “Improved Construction and Demolition Waste Management” which launched as an ideation challenge on 26 October 2020 and closed on 25 January 2021. World Vision’s challenges were “Affordable Rural Single Family Sanitation Solutions” which launched as an ideation challenge on 14 October 2020 and closed on 12 January 2021, and “Low-Cost Chlorine Monitoring for Rural Piped Water Systems” which launched as an RTP challenge on 4 November 2020 and closed on 4 February 2021. See www.seafreightlabs.com/our-challenges for more details.
