Words matter: Customize to configure

Let's look at some definitions when it comes to software development and the nuances each one carries:

Customize: to modify or supplement a system with internally developed code to match end-user requests; these changes may not be preserved during an upgrade. This is analogous to hacking a game like Pokemon Go to give end users the ability to spoof their locations and obtain region-exclusive pocket monsters.

Tailoring: modifying or supplementing a system without code so that it fits into an environment. Analogous to downloading Pokemon Go from the Google Play store or the Apple App Store, where the right version of the app is delivered to the right environment.

Personalization: meeting the customers' needs effectively and efficiently. This is achieved by analyzing customer data and using predictive analytics. A great example is using the Active Sync tool to encourage players of Pokemon Go to be more active, while recognizing that there are three tiers of active players and personalizing the rewards based on the level achieved. Personalization can also be seen in character customizations, clothing, poses, and buddy Pokemon.

Configure: the process of setting up options and features to meet the implementation of business requirements. In Pokemon Go, some people want to complete the Pokedex, some like the gym system, some like 1:1 battles, 1:1 trades, side quests, beating the villains, etc. You can configure your goals in the game by pursuing one or all of these, to whatever degree you want, meeting your own requirements for satisfaction in playing the game.
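To make the contrast concrete in software terms, here is a minimal, hypothetical sketch (the class, function, and option names are all invented for illustration): configuring and personalizing use options the vendor already exposes and survive upgrades, while customizing overrides vendor code and may not.

```python
def read_gps():
    # Stub standing in for a real GPS reading (illustrative only).
    return (40.7, -74.0)

class GameClient:
    """Hypothetical vendor-shipped game client."""
    def __init__(self, goals=None, buddy=None):
        # Configuration/personalization: supported options that survive upgrades.
        self.goals = goals or ["complete_pokedex"]
        self.buddy = buddy

    def location(self):
        # Vendor-supplied behavior: report the real location.
        return read_gps()

# Customization: internally developed code that overrides vendor behavior
# (location spoofing). Unsupported, and an upgrade may break or erase it.
class CustomizedClient(GameClient):
    def location(self):
        return (35.0, 139.0)  # spoofed coordinates
```

The cost difference on the continuum below comes from that last class: custom code has to be re-tested, and possibly rewritten, every time the vendor ships an upgrade.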

Now if we want to think of these concepts on a continuum:

Customize <——- Tailoring ——- Personalization ——-> Configuring

where, moving from right to left (from configuring toward customizing), cost and complexity increase, growth becomes more constrained, and profit margins shrink.

The question now becomes: is the additional complexity on this spectrum worth the extra cost incurred?

That is for you to decide.

 

 

Decluttering & Recycling

Last year I mentioned that I am a minimalist, though I do not subscribe to the 100-item challenge.  However, there is value in disposing of items that no longer provide any value in your life.  Rather than trashing them, why not recycle them for cash?  Here are a few places that accept gently used (and sometimes roughly used) items, in an effort to create a more sustainable economy and planet.  For really old devices, they extract the precious metals to be used in new devices.

Note: Shop around all these sites and programs to get the most money for your product. One site or store may not take an item, but another might, so keep shopping around. Also, if you are getting store credit, make sure it's at a store you will actually use.

Note: This is not a comprehensive list.  Comment down below if you know of any other places or apps that have worked really well for you.  Some apps work best in the city versus the suburbs.

  1. Amazon.com Trade-In: They will give you an Amazon gift card for Kindle e-readers, tablets, streaming media players, Bluetooth speakers, Amazon Echos, textbooks, phones, and video games.
  2. Best Buy: Will buy your iPhones, iPads, gaming systems, laptops, Samsung mobile devices, Microsoft Surface devices, video games, and smartwatches for Best Buy gift cards.
  3. GameStop (one of my favorites): Will take your video games, gaming systems, most obscure phones, tablets, iPods, etc. and give you cash back.
  4. Staples: Smartphones, tablets, and laptops can be sold here for store credit.
  5. Target: Phones, tablets, gaming systems, smartwatches, and voice speakers for a Target gift card.
  6. Walmart: Phones, tablets, gaming systems, and voice speakers can be cashed in for Walmart gift cards.
  7. Letgo app: A great way to sell almost anything.  Just make sure you meet in a public place to make the exchange, like a mall or in front of a police station. Your safety is more important than any item you were willing to part with in the first place.
  8. Facebook.com Marketplace: Another great way to sell almost anything. The same warning is attached here as in Letgo.
  9. Decluttr.com: They pay you back via check, PayPal, or direct deposit.
  10. Gazelle: They will reward you with PayPal, check or Amazon gift cards.
  11. Raise: This is for those gift cards you know you won't use.  You can sell them for up to 85% of their value, via PayPal, direct deposit, or check.
  12. SecondSpin: This is for those CDs, DVDs, and Blu-rays, and you can earn money via store credit, check, or PayPal.
  13. Patagonia: For outdoor gear and it is mostly for store credit.
  14. thredUp: This is for your clothes. Once they are sold via the app you can receive cash or credit.
  15. Plato’s Closet: Shoes, Clothes, and bags can be turned in for cash.  Though they take mostly current trendy items.
  16. Half Price Books: Books, textbooks, audiobooks, music, CDs, LPs, Movies, E-readers, phones, tablets, video games, and gaming systems for cash.
  17. Powells.com: For your books and you can get paid via PayPal or credit in your account.

My advice: I try to sell to a retailer first, because they will always be there, it's their job, it's safer, you can do it on your own schedule, and you will get what they promise you.  There is no hassle of no-shows, no fear of meeting a stranger, no being bargained down further when the buyer conveniently forgets to bring the full amount, and no one arriving way late.

Another piece of advice is to hold on to at least one old phone (usually the most recent one), for two reasons: (1) if your current phone breaks, you can use it as an interim phone; (2) for international travel, if the phone is unlocked.

Subsequent advice is to make sure you wipe all your old data from electronic devices before handing them over.  The last thing you want is to have your data compromised while doing something positive for the earth.

Also, look for consignment shops and local book stores, and ask around; you never know who you may be able to sell stuff to.  At a consignment shop, you deposit your items there, and if they sell, you get a part of the earnings. When all else fails, recycle what you cannot sell by donating it to Goodwill, Habitat for Humanity, etc.

Futuring & Innovation: Compelling Topics

  • There are forces that may help facilitate or reduce the likelihood of success of innovation, such as technological, cultural, economic, legal, ethical, temporal, social, societal, global, national, and local.
  • TED talks are videos that address innovations related to Technology, Entertainment, and Design, and they can be found on the TED website.
  • Sociotechnical Systems: the interplay, impact, and mutual influence when technology is introduced into a social system, i.e., a workplace, school, home, etc. (Encyclopedia.com, n.d.; Sociotechnical theory, n.d.). The social system comprises people at all levels of knowledge, skills, attitudes, values, and needs (Sociotechnical theory, n.d.).
  • Think tanks are groups of people that review the literature, discuss it, think about ideas, do tons of research, write, provide ideas, legitimize ideas, advocate, lobby, and argue, all to address a problem or problems (Mendizabal, 2011; TBS, 2015; Whittenhauer, n.d.). In short, they are idea factories: creating, producing, and sharing (Whittenhauer, n.d.). The balance between research, consultancy, and advocacy, along with the source of their arguments/ideas (applied, empirical, synthesis, theoretical, or academic research), helps shape what type of think tank they are (Mendizabal, 2011). Finally, there are two think tank models: the one-roof model, where everyone gathers in one physical place to meet face-to-face, and the without-walls model, where members communicate only through technological means (Whittenhauer, n.d.).
  • The Nominal Group Technique (NGT) is a decision-making tool that can be used to identify elements of a problem, identify and rank goals by priority, identify experts, and involve people from all levels to promote buy-in of the results (Deip, Thesen, Motiwalla, & Seshardi, 1977; Hashim et al., 2016; Pulat, 2014). Pulat (2014) describes the process as listing and prioritizing a list of options created through a normal brainstorming session, where the list of ideas is generated without criticism or evaluation.  Deip et al. (1977), in turn, describe the process as one that taps into the experiences of all people by asking each of them to state an idea for the list, with no discussion permitted until all ideas are listed; after that, each item on the list is discussed and ranking of the ideas can begin. Finally, Hashim et al. (2016) stated that the method is best used to help a small team reach consensus by gathering ideas from everyone and soliciting buy-in on those ideas.
  • Dalkey and Helmer (1963) described the Delphi project as a way to use expert opinion, with the hope of getting the strongest consensus from a group of experts. Pulat (2014) states that ideas are listed and prioritized by a weighted point system to help reduce the number of possible solutions, with no communication between the experts or sharing of results during the process until the very end.  Dalkey and Helmer (1963), however, described the process as repeatedly interviewing or questioning individual experts while avoiding confrontation between them.  Questions center on some central problem, and between rounds of questioning, available data requested by one expert is shown to all experts, along with any new information considered potentially relevant by an expert (Dalkey & Helmer, 1963; Pulat, 2014).  The solution from this technique improves when experts with a range of experiences are solicited (Okoli & Pawlowski, 2004; Pulat, 2014).
  • Serendipitous innovations: discovering what makes one thing special and applying it elsewhere, like Velcro.
  • Exaptation innovations: Never giving up, finding secondary uses for the same product, and not being afraid to pivot when needed, like Play-Doh.
  • Erroneous innovations: Creating something by accident in the pursuit of something else, like Saccharin (C7H5NO3S) the artificial sweetener.
  • Kodak is a great example of a good plan that went wrong because of circumstances beyond the company's control.
  • A traditional forecast essentially extrapolates where you were and where you are now into the future, and the end of this extrapolated line is called "the most likely scenario" (Wade, 2012; Wade, 2014). Mathematical formulations and extrapolations are the mechanical basis for traditional forecasting (Wade, 2012). At some point, these forecasts add ±5-10% to their projections and call that "the best and worst case scenario" (Wade, 2012; Wade, 2014).  This ± difference is a narrow range of possibilities out of an actual 360° spherical solution space (Wade, 2014). There are both mathematical and mental forms of extrapolation, and both are quite dangerous because they assume that the world doesn't change much (Wade, 2012).
  • Scenario planning can be done with 9-30 participants (Wade, 2012). A key requirement of scenario planning is for everyone to understand that knowing the future is impossible, and yet people want to know where the future could go (Wade, 2014).  It is important to note that scenarios are not predictions; scenarios only illuminate different ways the future may unfold (Wade, 2012)! This tool, a creative yet methodological approach that helps spell out some of the future scenarios that could happen, has ten steps (Wade, 2012; Wade, 2014); a small illustrative sketch follows the list:
    1. Framing the challenge
    2. Gathering information
    3. Identifying driving forces
    4. Defining the future’s critical “either/or” uncertainties
    5. Generating the scenarios
    6. Fleshing them out and creating story lines
    7. Validating the scenarios and identifying future research needs
    8. Assessing their implications and defining possible responses
    9. Identifying signposts
    10. Monitoring and updating the scenarios as time goes on
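As a rough illustration of steps 3-5 (identifying driving forces, defining the critical "either/or" uncertainties, and generating the scenarios), the sketch below crosses two uncertainties into a four-quadrant scenario matrix. The force names and outcomes are invented for illustration and are not taken from Wade.

```python
from itertools import product

# Step 4: two critical "either/or" uncertainties (hypothetical examples).
uncertainties = {
    "regulation": ("restrictive", "permissive"),
    "consumer_adoption": ("slow", "rapid"),
}

# Step 5: generate one scenario per combination of outcomes (a 2x2 matrix).
scenarios = []
for outcomes in product(*uncertainties.values()):
    name = " / ".join(f"{force}: {outcome}"
                      for force, outcome in zip(uncertainties, outcomes))
    scenarios.append({"name": name, "storyline": "", "signposts": []})

for s in scenarios:
    print(s["name"])
# Each scenario would then be fleshed out with story lines, signposts,
# and responses (steps 6-10).
```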

Resources:

  • Dalkey, N., & Helmer, O. (1963). An experimental application of the Delphi method to the use of experts. Management Science, 9(3), 458-467.
  • Deip, P., Thesen, A., Motiwalla, J., & Seshardi, N. (1977). Nominal group technique.
  • Encyclopedia.com. (n.d.). Socio-technical system. A Dictionary of Sociology. Retrieved from Encyclopedia.com: http://www.encyclopedia.com/social-sciences/dictionaries-thesauruses-pictures-and-press-releases/socio-technical-system
  • Hashim, A. T., Ariffin, A., Razalli, A. R., Shukor, A. A., NizamNasrifan, M., Ariffin, A. K., … & Yusof, N. A. A. (2016). Nominal group technique: A brainstorming tool for identifying learning activities using musical instruments to enhance creativity and imagination of young children. International Advisory Board, 23, 80.
  • Mendizabal, E. (2011). Different ways to define and describe think tanks. On Think Tanks. Retrieved from https://onthinktanks.org/articles/different-ways-to-define-and-describe-think-tanks/
  • Okoli, C., & Pawlowski, S. D. (2004). The Delphi method as a research tool: An example, design considerations and applications. Information & Management, 42(1), 15-29.
  • Pulat, B. (2014) Lean/six sigma black belt certification workshop: body of knowledge. Creative Insights, LLC.
  • Socio-Technical Theory (n.d.) Brigham Young University. Retrieved from http://istheory.byu.edu/wiki/Socio-technical_theory
  • Wade, W. (2012) Scenario Planning: A Field Guide to the Future. John Wiley & Sons P&T. VitalSource Bookshelf Online.
  • Wade, W. (2014). Scenario planning: Thinking differently about future innovation. Globis. Retrieved from http://e.globis.jp/article/343

  • Whittenhauer, K. (n.d.). Effective think tank methods. eHow. Retrieved from http://www.ehow.com/way_5728092_effective-think-tank-methods.html

Sociotechnology plan for Getting People Out to Vote

Abstract:

According to the US Census Bureau (2016), there are approximately 227 million eligible voters in the U.S. However, the Bipartisan Policy Center stated that in 2012 the voter turnout was 57.5%. This helps establish a need for Get Out to Vote (GOTV) efforts. Regardless of any party's political views, ideologies, and platforms, each party should improve its GOTV initiatives, which help convert citizens into active voters on election day (Garecht, 2006). Fortunately, technology and big data could be used to leverage GOTV strategies to allow for mass social contact that is tailored to the voter and yet still cheaper than door-to-door canvassing. The purpose of this sociotechnical plan for GOTV is to build a bipartisan mobile application that serves the needs of citizens and politicians, increasing poll attendance rates and ensuring a more democratic process.

Introduction:

Democracy in any nation is at its best when everyone participates.  Regardless of any party's political views, ideologies, and platforms, each party should improve its Get Out to Vote (GOTV) initiatives, which help convert citizens into active voters on election day (Garecht, 2006). GOTV initiatives are meant to get voters who don't usually vote to get out and vote on election day, and to get those who intend to vote to follow through (Bash, 2016; Stanford Business, 2012).  According to the Institution for Social and Policy Studies (ISPS) (2016), a large number of studies have found that personalized methods, like door-to-door canvassing, are the best and most robust GOTV methods currently out there. Mass email, mailers, and robocalls are not, because they lack dynamic and authentic interaction.  Nearing the last few days of the election, voters and would-be voters have already picked whom they will vote for and which way to vote on certain initiatives (Bash, 2016).  So it is not a matter of convincing people, but of achieving a high voter turnout.

A good goal for any political party's GOTV initiative is to obtain 10% of the voters needed to win the election (Garecht, 2006). Door-to-door canvassing is cost prohibitive but effective, whereas mass social contact is cheaper but not as effective (Gerber, Green, & Larimer, 2008; ISPS, 2016). Gerber et al. (2008) stated that door-to-door canvassing costs approximately 10x more than mass social contact per vote.  Even though the costs of door-to-door canvassing are huge, the Republican National Committee projected knocking on 17 million doors for its 2016 efforts, compared to 11.5 million in the 2012 elections (Bash, 2016).  Fortunately, technology and big data could be used to leverage GOTV strategies to allow for mass social contact that is tailored to the voter and yet still cheaper than door-to-door canvassing.  Currently, social media, email, online ads, and websites are used for GOTV (Fuller, 2012).

Scope: 

The current and next generations of voters will be highly social and technologically advanced, leveraging social media and other digital tools to learn about the issues from each candidate and becoming social media influencers (Horizons Report, 2016c). Therefore, as a feature, social media could be used to develop personal learning networks and personal learning on the issues and initiatives (Horizons Report, 2016c; Horizons Report, 2016e). Twitter has been used by students to discover information, publish, and share ideas, while at the same time exploring different perspectives on the same topic to promote cultural and scientific progress (Horizons Report, 2016a).

Walk book, an app used by the Republican National Committee to aid its GOTV efforts, shows voters' locations, their party affiliation, and how reliable they are as voters (Bash, 2016). The mobile app also allows door-to-door canvassers to have dynamic discussions through dynamic scripting, which handles a ton of different scenarios and responses to their questions.  Data is then collected and returned to the Republican National Committee for further analysis and future updates.

Another feature of social media technologies is that, as they continue to evolve beyond 2016, these tools can be used for crowdsourcing, establishing an identity, networking, etc. (Horizons Report, 2016b; Horizons Report, 2016e).  Establishing identities can work for the political campaign, but leveraging voters' established social media identities could help create responses tailored to their values and to what is at stake in the election.  The limitation comes from joining all these data sources, containing huge amounts of unstructured data, into one system, to decipher not only a voter's propensity to vote but also their political leaning (Fuller, 2012).

Purpose: 

According to the US Census Bureau (2016), there are approximately 227 million eligible voters in the U.S. However, the Bipartisan Policy Center stated that in 2012 the voter turnout was 57.5%. This helps establish a need for GOTV efforts. When more people go out to vote, their voices get heard.  Even in states that are heavily Democratic or Republican, a higher turnout can push political candidates to be more centrist in their policies in order to remain elected in the future (ThinkTank, 2016; vlogbrothers, 2014). The lower the voter turnout, the higher the chance that the actual voice of the people is not heard.  The vlogbrothers said it best: "If you aren't voting, no one hears your voice, so they have no reason to represent you!" Also, elections are not usually only about the top-ticket vote, but about all the down-ballot items as well, like local city, county, and state-level public offices and ballot initiatives (ThinkTank, 2016). The purpose of this sociotechnical plan for GOTV is to build a bipartisan mobile application that serves the needs of citizens and politicians, increasing poll attendance rates and ensuring a more democratic process.

Supporting forces: 

Social: Scripts that state voter turnout will be high help increase turnout, because people start identifying voting as something they must do as part of their identity when others are doing it as well (Stanford Business, 2012). Also, in today's world, social media has become a way for people to be connected to their social network at all times, and there is a real fear of missing out (FOMO) if they are not responsive to their social media tools (Horizons Report, 2016d). Other social aspects have been derived from behavioral science, like adding a simple set of questions such as:

  • “What time are you voting?”
  • “Where are you voting?”
  • If there is early voting: "What day will you be voting?"
  • “How will you get there?”
  • “Where would you be coming from?”

Asking these questions has been shown to double voter commitment and turnout when focused on people who don't organically think of them (Stanford Business, 2012). Helping voters determine the answers helps with their personal logistics and shows how easy it is for them to vote.  This was one of the key deciding factors between Barack Obama's and Hillary Clinton's GOTV campaigns in the 2008 Democratic primary (Rigoglioso, 2012).  Also, if there is a way to have those logistics answers posted on voters' social media platforms, they become social champions for voting and are also socially held accountable to vote.  Finally, VotingBecause.com is a social platform for voters to share why they are voting in the election, making them more socially accountable to vote (Sutton, 2016).

Technological: Currently, platforms like YouTube are using their resources and their particular platform to help their users get out to vote (Alcorn, 2016). The USA Today Network has launched a one-stop shop, VotingBecause.com, which helps voters easily read about the issues and candidates in the election and even register to vote (Sutton, 2016).

Economical:  According to a Pew survey (2015), 68% of US adults own a smartphone, while 86% of young adults (ages 18-20) own a smartphone. A different Pew survey (2015a) showed that 65% of adults use social media, up from 7% in 2005. Therefore, a huge voting bloc has a social media account and a smartphone (Levitan, 2015b), for which technology can be leveraged at a cheaper cost than door-to-door canvassing.

Challenging forces: 

Social: Unfortunately, this FOMO leads to people feeling burnt out or exhausted; therefore, users of social media need to balance their time on it (Horizons Report, 2016d). Facts and rumors both pose as information on social media, and deciphering which is which is exhausting (Levitan, 2015). Therefore, for this innovation to become a reality, any information shared via social media should come from an independent and trusted source.

Legal: In most states, it is illegal to take a photograph inside a polling place, which would make it hard for people to show their civic-duty accomplishment on social platforms without getting into legal issues (Fallen, 2016). This may decrease people's desire to share and feel connected, which could in turn reduce personal and social accountability to vote.

Technological: A comprehensive database of typical low-propensity voters must be built so that a campaign can create personal messages and personal conversations with those voters (Fuller, 2012). A good database would include phone number, street address, email address, voting propensity, voting record, issues of importance, etc. (Fuller, 2012). Security is also an issue: the one-stop-shop mobile application must take into account each person's right to access certain types of data, to ensure the anonymity of citizens' voting records.
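As a minimal sketch of what such a voter record and a turnout-propensity score might look like, consider the following; the field names, sample values, and scoring rule are hypothetical illustrations, and a real campaign system would instead train a predictive model on historical voter-file data.

```python
from dataclasses import dataclass, field

@dataclass
class VoterRecord:
    # Fields suggested by Fuller (2012); values here are placeholders.
    name: str
    phone: str
    street_address: str
    email: str
    voting_record: list = field(default_factory=list)   # e.g., election years voted
    issues_of_importance: list = field(default_factory=list)

def turnout_propensity(voter: VoterRecord, past_elections: int = 4) -> float:
    """Crude illustrative score: share of recent elections actually voted in."""
    return min(len(voter.voting_record) / past_elections, 1.0)

# Low-propensity voters are the GOTV targets.
voter = VoterRecord("Jane Doe", "555-0100", "1 Main St", "jane@example.com",
                    voting_record=[2012])
if turnout_propensity(voter) < 0.5:
    print("Add to GOTV contact list")
```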

Ethics: Collecting huge amounts of data from social media and tying it to personal yet public voting records could cause harm, given that political beliefs are a private matter.  If the data is not used primarily for GOTV initiatives to get everyone's voice heard in the political system, then it shouldn't be collected.  The data could also be used unethically by some to suppress the vote. Therefore, this effort must be conducted by an independent (non-partisan) group.

Methods:

Social media is constantly evolving; thus, the Delphi technique, drawing on a political think tank, political scientists, sociologists, social behaviorists, and actual GOTV managers, would be needed on an ongoing basis (Horizons Report, 2016b; Horizons Report, 2016e; Stanford Business, 2012). Dalkey and Helmer (1963) described the Delphi project as a way to use expert opinion, with the hope of getting the strongest consensus from a group of experts.  Pulat (2014) states that ideas are listed and prioritized by a weighted point system to help reduce the number of possible solutions, with no communication between the experts or sharing of results during the process until the very end.  Dalkey and Helmer (1963), however, described the process as repeatedly interviewing or questioning individual experts while avoiding confrontation between them.  Experts must be drawn from different points on the research spectrum, from theoretical to applied, as well as from different academic fields, to help build the best consensus on the methodology for leveraging social media in GOTV efforts. Finally, one could consider conducting the Delphi technique either in a one-roof model, where everyone gathers in one physical place to meet face-to-face, or in a without-walls model, where members communicate only through technological means (Whittenhauer, n.d.).
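A minimal sketch of the weighted-point prioritization that Pulat (2014) describes, run over anonymous Delphi-style rounds, might look like the following; the candidate GOTV methods, point allocations, and stopping rule are assumptions made purely for illustration.

```python
# Each round, every expert anonymously distributes points across candidate
# GOTV methods; rounds repeat (with shared summaries) until rankings stabilize.
options = ["social media ads", "text reminders", "door-to-door", "robocalls"]

round_1 = [  # one dict of points per expert
    {"social media ads": 5, "text reminders": 3, "door-to-door": 2, "robocalls": 0},
    {"social media ads": 4, "text reminders": 4, "door-to-door": 2, "robocalls": 0},
]
round_2 = [
    {"social media ads": 5, "text reminders": 4, "door-to-door": 1, "robocalls": 0},
    {"social media ads": 5, "text reminders": 3, "door-to-door": 2, "robocalls": 0},
]

def rank(responses):
    totals = {o: sum(r[o] for r in responses) for o in options}
    return sorted(totals, key=totals.get, reverse=True)

# Stop when the ranking no longer changes between rounds (a simple consensus proxy).
if rank(round_1) == rank(round_2):
    print("Consensus ranking:", rank(round_2))
```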

Models:

To build this socio-technical system to leverage unstructured social media data with voter registration data in GOTV efforts, one must consider the different levels in designing a socio-technical system, as seen in Figure 1 (Sommerville, 2013).  Each of the levels plays an important role in facilitating the entire socio-technical plan and must be heavily detailed, with backup systems.  Backup systems and a disaster recovery plan are needed to avoid the same fate as Mitt Romney's 2012 GOTV program ORCA, where thousands of Romney volunteers were left with a data blackout (Haberman & Burns, 2012). It is important to note that a good socio-technical GOTV plan includes all the different levels, because the levels in the socio-technical system feed into each other and overlap in certain domains (Sommerville, 2013).


Figure 1. The socio-technical architecture with seven levels. (Source: Adapted from Sommerville, 2013.)

But elections are bound to fixed points in time, and they must come to an end.  This allows the socio-technical GOTV plan to have a work breakdown structure.  The resulting work breakdown structure could be multiplied or divided based on the lead time or the importance of the election candidacy or initiative (Garecht, 2002; Progressive Technology Project, 2006):

  • As soon as possible:
    • Create a GOTV plan, strategy, methodology, based on the methodologies created through the Delphi method, with a wide range of experts.
    • Assign one person as a chairman for GOTV.
    • Sign up volunteers for GOTV efforts, remembering that they are not there to convince anyone who to vote for, just to get them to vote.
  • 90-60 days before the election:
    • Gather all data from all the data sources and create one system that ties it together.
    • Identify phase: Have predictive data analytics begin running algorithms to decipher which voter has a lower than average propensity to vote on election day and allow the algorithm to triage voters.
      • Add people who attend political events, staff members, and volunteers; they should already have a higher propensity to vote.
    • When applicable have GOTV staff file absentee ballots.
  • 30 days before the election:
    • Begin implementation of the GOTV Plan.
    • Updating databases, registration information, voter addresses, etc.
    • Identify phase: Keep rerunning the predictive data analytics model.
    • Motivation Phase: Getting people to vote, by making it easy for them to vote, and establishing social accountability via their social media accounts.
    • When applicable have GOTV staff file absentee ballots.
  • Ten days to 1 week before the election:
    • New volunteers come in at this time to help; the GOTV plan should include training and roles for them so that they can be of most use.
    • Identification + Motivation Phase: Contact each person on that list to remind them of their civic duty, motivate them, and remind them where their polling place is and what time they stated would be best for them to vote.
    • Motivation Phase: Using social media advertising tools to drop ads on people located on these low propensity voter lists as derived from predictive data analytics. Even sending text and email reminders of their polling places and times of operation, would help make the voting process easier for these voters.
    • When applicable have GOTV staff file absentee ballots.
  • Election day:
    • Have voters log into a system to say that they have voted, or scan their social media to see who has voted, to cross out their names from the list of those who have yet to vote that day. The aim is a 100% conversion rate from inactive voter to active voter.

Understanding that this work breakdown structure deals with the intersection of technology and people is key to making it work effectively.
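One way to keep that intersection manageable is to encode the work breakdown structure above as data, so the task lists can be scaled by lead time or by the importance of the race, as suggested; the structure below is an illustrative assumption rather than a prescribed format.

```python
# Days-before-election -> GOTV tasks (condensed from the outline above).
gotv_wbs = {
    "ASAP": ["create GOTV plan via Delphi", "assign a GOTV chair", "sign up volunteers"],
    90: ["merge all data sources", "run the propensity model (identify phase)",
         "file absentee ballots where applicable"],
    30: ["begin the GOTV plan", "update databases", "rerun the propensity model",
         "start the motivation phase on social media"],
    10: ["train late volunteers", "contact the low-propensity list",
         "send social media ads, texts, and email reminders"],
    0: ["track who has voted", "follow up with remaining low-propensity voters"],
}

for days, tasks in gotv_wbs.items():
    label = days if days == "ASAP" else f"{days} days out"
    print(f"{label}: {', '.join(tasks)}")
```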

Analytical Plan:

The aim of this socio-technical GOTV plan is a 100% conversion rate of inactive to active voters.  However, this is quite impossible for larger campaigns and big elections (Progressive Technology Project, 2006).  To analyze the effectiveness of the socio-technical GOTV plan, cross-reference the low-propensity voter list created by the predictive data analytics, the voters reached through the various technological or social media means, and the voters who actually voted and match the GOTV list.
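A minimal sketch of that cross-referencing step, treating the three lists as sets of voter identifiers, is shown below; the IDs are hypothetical, and a real system would join on voter-file identifiers.

```python
# Hypothetical voter IDs in each list.
low_propensity_targets = {"v001", "v002", "v003", "v004"}   # from predictive analytics
contacted = {"v001", "v002", "v004"}                        # reached via tech/social media
voted = {"v001", "v004", "v999"}                            # from post-election voter file

converted = low_propensity_targets & contacted & voted
conversion_rate = len(converted) / len(low_propensity_targets)
print(f"Converted {len(converted)} of {len(low_propensity_targets)} "
      f"targets ({conversion_rate:.0%})")   # here: 2 of 4 (50%)
```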

Another way to evaluate the effectiveness of the socio-technical GOTV plan is to see how closely the real results matched the milestones identified in the work breakdown structure.  Daily figures should be captured, along with narratives to supplement the numerical data, to create lessons learned that are eventually fed back to the experts who devised the methodology for further future development.  The Delphi technique can be reiterated with the new data to build a better socio-technical GOTV plan in the future.

Anticipated Results:

As voter turnout increases, no matter which political party wins, the views will be more centrist rather than polarizing, because each voter's voice was heard (ThinkTank, 2016; vlogbrothers, 2014).  Another result of this socio-technical GOTV plan is that the voter is now empowered to make a data-driven decision from the national level down the ballot, due to the information presented in these GOTV plans (Alcorn, 2016; Sutton, 2016). This, in turn, will help create a positive use of social media as a tool to enhance learning and develop personal learning networks (Horizons Report, 2016c; Horizons Report, 2016e). Finally, an unintended social impact of this GOTV plan is creating more civically minded citizens who are active in politics at all levels and willing to be influencers: discovering, creating, publishing, and sharing ideas (Horizons Report, 2016a; Horizons Report, 2016c; ThinkTank, 2016).

Conclusion:

According to the US Census Bureau (2016), there are approximately 227 million eligible voters in the U.S. However, the Bipartisan Policy Center stated that in 2012 the voter turnout was 57.5%. This voter turnout rate is horrible, and the vlogbrothers said it best: "If you aren't voting, no one hears your voice, so they have no reason to represent you!" This helps establish a need for GOTV efforts. When more people go out to vote, their voices get heard.  Also, elections are not usually only about the top-ticket vote, but about all the down-ballot items as well, like local city, county, and state-level public offices and ballot initiatives (ThinkTank, 2016).

According to a Pew survey (2015), 68% of US adults own a smartphone, while 86% of young adults (ages 18-20) own a smartphone. A different Pew survey (2015a) showed that 65% of adults use social media, up from 7% in 2005. Therefore, a huge voting bloc has a social media account and a smartphone (Levitan, 2015b), for which technology can be leveraged at a cheaper cost than door-to-door canvassing. Gerber et al. (2008) stated that door-to-door canvassing costs approximately 10x more than mass social contact per vote.  Currently, social media, email, online ads, and websites are used for GOTV (Fuller, 2012). Fortunately, technology and big data could be used to leverage GOTV strategies to allow for mass social contact that is tailored to the voter and yet still cheaper than door-to-door canvassing.

The Diffusion of Innovation (DOI) theory is concerned with the why, what, how, and rate of innovation dissemination and adoption between entities, carried out through different communication channels over a period of time (Bass, 1969; Robertson, 1967; Rogers, 1962; Rogers, 2010). Entities need to make a binary decision, which can fluctuate in time, between whether or not to adopt an innovation (Herrera, Armelini, & Salvaj, 2015). Rogers (1962) first proposed that the timing of innovation adoption between entities follows a normal frequency distribution: innovators (2.5%), early adopters (13.5%), early majority (34%), late majority (34%), and laggards (16%). The cumulative frequency distribution mimics an S-curve (Bass, 1969; Robertson, 1967; Rogers, 1962). However, Bass (1969) claimed that Rogers' frequency distribution was arbitrarily assigned, and he therefore reclassified innovators and early adopters together as innovators and everyone else as imitators, to create a simplified numerical DOI model.
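A minimal numerical sketch of the Bass (1969) model described above is shown below, where p is the coefficient of innovation and q the coefficient of imitation; the parameter values are illustrative assumptions and are not fitted to any GOTV or voter-turnout data.

```python
def bass_adoption(m=1_000_000, p=0.03, q=0.38, periods=15):
    """Cumulative adopters over time under the Bass (1969) model.

    m: market potential, p: innovation coefficient, q: imitation coefficient.
    New adopters each period: n(t) = (p + q * N/m) * (m - N).
    """
    cumulative, N = [], 0.0
    for _ in range(periods):
        new = (p + q * N / m) * (m - N)
        N += new
        cumulative.append(N)
    return cumulative  # plotting this series gives the familiar S-curve

curve = bass_adoption()
print(f"Adoption after 5 periods: {curve[4] / 1_000_000:.0%} of potential adopters")
```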

Imitators are more deliberate, conservative, and traditional learners who learn from innovators, whereas innovators are more venturesome (Bass, 1969; Rogers, 2010). Innovators have a lower threshold of resistance to adopting an innovation than imitators, since adoption rates are driven by the adopters' perception of the innovation's usefulness (Rogers, 2010; Sang-Gun, Trimi, & Kim, 2013).  Therefore, innovators will adopt this socio-technical GOTV plan first and record their findings; only when a certain amount of data has been collected and the innovation has been implemented many times over will it finally be adopted by the imitators.  Once a significant number of imitators have adopted these measures, we can then begin to see U.S. voter turnout reach the 80% or higher mark.

Areas of Future Research:

To further improve GOTV low-propensity voter prediction, more accurate predictive data analytics algorithms, and thus more research, are needed.  Different types of predictive data analytics can produce different results, depending on whether the algorithm uses supervised or unsupervised machine learning techniques. Another future effort is preprocessing the unstructured social media data and connecting it to voter registration data.

References

A Kodak Moment

Kodak: a good plan that went wrong because of circumstances beyond their control

In 1884, patents for photographic film were granted, and eight years later the company to be known as Kodak was founded by high school dropout George Eastman (Elliott, 2012; Sparks, 2012).  In 1900, the "Brownie" camera was introduced at $1 each with film at $0.15, and it allowed the general public to have access to a camera (Anthony, 2011; Elliott, 2012; Sparks, 2012). Quality was its major differentiator (Cohan, 2011). With time, cameras got better, easier to use, and smaller, to make the end-user experience as simple and welcoming as possible (Elliott, 2012). In the 1930s, Kodak joined the Dow Jones Industrial Average, and in 1969 the film used to capture the Apollo 11 mission was from Kodak (Elliott, 2012; Sparks, 2012). In 1975, Kodak introduced the digital camera, which stored pictures on cassette tape (Sparks, 2012).

By 2004-2009, Kodak stopped selling film cameras and tape recorders to meet the new shift in the market toward digital cameras (Elliott, 2012; Sparks, 2012).  As this shift was occurring, 13 manufacturing plants and 130 processing labs closed, cutting over 47K jobs starting in 2003 (Strydom, 2012).  Kodak started losing market share in 2011 and kept losing it, filing for Chapter 11 in 2012 in an attempt to transform the company into a pioneer in digital cameras (Sparks, 2012; Strydom, 2012). Today, it struggles to keep pensions and other benefits for its retired workers while leaning out its processes and restructuring its costs (Strydom, 2012). To try to stay afloat, Kodak has sued Apple and BlackBerry for stealing its patented technology (Anthony, 2011). The only thing Kodak can do now is sell its more than 1,000 patents in cameras and video (Anthony, 2011; Elliott, 2012).

Forces that adversely affected Kodak

Competition: From the 1900s, Kodak dominated market share in consumer photography (Elliott, 2012). However, competition is fierce: Sony, Canon, Apple, HP, and Fuji all began to specialize and develop faster than Kodak, chipping away at its market share or creating new markets as first movers (Anthony, 2011; Cohan, 2011; Elliott, 2012; Sparks, 2012).

Economical:  Fuji specialized in film and was able to underprice Kodak; therefore, Fujifilm was able to quickly take away a cash-cow product from Kodak, such that Kodak had to cut 20K jobs just to offset the loss (Cohan, 2011).  Fuji essentially gained control of the film segment's market share.

Ethical: In 1948, Polaroid came out with instant photography, and Kodak copied that technology; Kodak lost a $909M suit for stealing the technology it used from 1976 to 1986 (Cohan, 2011). To try to stay afloat, Kodak has fruitlessly sued Apple and BlackBerry for stealing its patented technology (Anthony, 2011).

Technological: As Kodak moved away from film, it tried to take some dominance in the digital camera market, but it couldn't penetrate that market enough to be sustainable (Elliott, 2012). By the time it tried to gain market share in the digital camera realm, it was late; the timing of Kodak's entry had allowed Sony and Canon to become the first movers and establish market control (Anthony, 2011). Kodak tried to work on digital camera personal printers, but HP was the first mover in that market and had a strong hold on its market share (Cohan, 2011).

Why it is relevant

There are many big companies today that have become comfortable and are now seeing an introduction of competition, which threatens to take away their market dominance and reduce their market share.  Though competition is great for consumers, because it allows pricing wars to exist, it is terrible for a company that is trying to drive down costs enough to remain competitive while still turning a profit.

The story of Kodak is not different from that of other companies, and it does show that empires do fall.  It starts with competition.  Fuji specialized in film and was able to underprice Kodak (Cohan, 2011). Then came Apple, which started to try to produce digital cameras, but it was Canon and Sony that specialized in this new area and developed the technology so much faster than Kodak that Kodak just couldn't catch up (Cohan, 2011; Elliott, 2012; Sparks, 2012). Finally, when Kodak tried to dominate a space where it could gain a reasonable amount of market share by working on tangential technology, digital camera printers, it was too late; HP became the first mover in that technology space and held onto its dominance in the market (Cohan, 2011). Therefore, the story of Kodak should provide a cautionary tale to other companies.

References

Sony Walkman and Scenario Planning

The Sony Walkman: Scenario-type planning

Sony didn't do the proper scenario-type planning and relied only on standard forecasting, which is why its market share fell behind Apple's.  A key requirement of scenario planning is for everyone in the planning session to understand that knowing the future is impossible, and yet people want to know where the future could go (Wade, 2014).  However, it is important to note that scenarios are not predictions; scenarios only illuminate different ways the future may unfold (Wade, 2012)! Sony should have held a brainstorming session to identify as many of the driving forces or trends that could have an impact on the Sony Walkman as possible (Wade, 2014).  Thus, Sony should have thought of any trend or force (direct, indirect, or very indirect) that could have any effect, in any way and of any magnitude, on the problem.

The Sony Walkman Story

Before the introduction of the Sony Walkman, cassette player technology existed in the 1960s, but households preferred to listen to vinyl records instead (Haire, 2009). The Walkman, a device that merged a lightweight, portable cassette player with lightweight headphones, was introduced to the Japanese market in 1979, where it sold out in three months at $150 per device (Adner, 2012; Franzen, 2014).  The device even had two headphone jacks so that two people could listen to the same music or recording at the same time (Haire, 2009). In the 1980s, the Walkman commanded about 50% of the market share in both Japan and the U.S., selling over 200 million devices over 30 years (Adner, 2012; Haire, 2009). The iPod, made by Apple, sold 160 million units from 2001 to 2009 (Haire, 2009).

Then, in the 1990s, CDs and digital music files like the mp3 came into wide use (Adner, 2012; Franzen, 2014).  CDs and mp3s provided better quality and integrity than cassettes, which started to drive the cassette player's market share toward zero.  Cassettes rely on magnetic tape, which tends to degrade with time, whereas digital files don't.  The first mp3 player was from Saehan Information Systems, in 1998 (Adner, 2012).  Sony quickly adapted to these new formats as well and created the CD version of the Walkman and eventually the mp3 version of the Walkman, but it still stuck to its proprietary music format, ATRAC (Franzen, 2014; Haire, 2009). Also, the industry saw this change from cassettes to CDs to mp3s happening and was trying to figure out which mp3 player would eventually dominate the market like the Walkman did (Adner, 2012).

In 2001, the iPod came onto the scene and took over the market, even though the market was already saturated with about 50 different types of mp3 players (Adner, 2012).  Steve Jobs learned that, on its own, the iPod was useless, but with broadband download speeds and an mp3 marketplace, the market was ready for an easy-to-use device at $399 with 5 GB of storage (Adner, 2012; Apple, n.d.). What made the iPod so successful was the analysis of the challenging forces that made mp3 players a hard market to sell in, and addressing them by providing seamless integration with mp3s through the iTunes music management software, introduced in its first iteration in 2001, and a re-imagined storefront called the iTunes Music Store in 2003 (Adner, 2012).

By 2008, Apple had claimed 48% of the market share in mp3 devices, which was similar to the Walkman's level in the 1980s (Adner, 2012). In 2010, the cassette version of the Walkman device line came to an end (Franzen, 2014). In 2015, the newest mp3 Walkman device, the ZX2, was $1200 with 128 GB of storage and an expandable microSD card slot, reflecting Sony's new aim for higher-quality audio devices (Miller, 2015). Unfortunately, the ZX1 and ZX2 don't have smartphone features like apps (Franzen, 2014; Miller, 2015).

Challenging forces to move from the Walkman to mp3 players:

Legally: In 1998-2001 there was no storefront to download mp3 music legally (Adner, 2012).

Technology: Even if music was obtained legally from CDs, people had to use third-party software to convert the files, which with the computing power of those days took hours (Adner, 2012).

Therefore, who cares if you are first to market (Saehan Information Systems) if there is no way to download mp3 music easily and legally?

Supporting forces to move from mp3 players to the iPod:

Legally: The iTunes Music Store allowed songs to be downloaded legally at a modest price of $0.99, from which Apple was able to take a 10% commission (Adner, 2012).

Technology: The seamless integration of the iPod with the music storefront made the device easy to use, which helped increase its market share in the mp3 market. By 2009, over 8 billion songs had been sold, totaling $800 million in revenue (Adner, 2012).  The iTunes storefront became a cash cow for Apple, while the iPod underwent further innovation into the iPhone and iPod Touch product lines (Apple, n.d.).

Example Scenario Planning four quadrants for the Sony Walkman case based on the forces listed above:

[Figure not shown: four-quadrant scenario matrix built from the legal and technology forces above.]

Conclusion

Sony didn't do proper scenario-type planning and relied only on standard forecasting, which is why its market share fell behind Apple's.  However, the lesson learned from this case study is that a company doesn't need to be the first mover to make it big in the market. Proper scenario planning could be the key to succeeding when entering a saturated market.  Apple was three years late to the party, yet its patience, its learning about the supporting and challenging forces for mp3 player dominance, and its willingness to let the key market players align for its product were the keys to Apple's success.

It is easy to do a scenario planning exercise from past events to today's events (Wade, 2014). It is harder to do scenario planning moving into the future. Also, scenario plans should never remain static; the future is always evolving.  Thus, the scenario plan should change to reflect the new landscape, and the benefit is that this planning allows the mind to be more flexible and receptive, able to pivot quickly (Wade, 2014).  Scenario planning can take into account any force, not just the two listed above: Political/Legal, Economic, Environmental, Societal, Technological, etc. (Wade, 2012; Wade, 2014).

References

Different Types of Innovation

Serendipitous innovations: discovering what makes one thing special and applying it elsewhere

In 1941, Georges de Mestral went out to walk his dog in the woods and noticed how burrs clung to him and his dog (Bellis, 2016; Suddath, 2010). De Mestral was curious enough to study these burrs under a microscope, and from that he wanted to recreate them (Bellis, 2016). It took eight years of trial and error to create a synthetic burr with tiny hooks that would grip a cloth full of tiny loops; the names of those two fabrics, "velvet" and "crochet," were combined to form "Velcro" (Bellis, 2016; Suddath, 2010).  Velcro was made to rival the zipper (Bellis, 2016). Velcro had its big break when it was used by NASA in the 1960s Apollo missions; then hospitals began to use it, then the military, and now it's used on planes, cars, shoes, home décor, etc. (Suddath, 2010).

Exaptation innovations: Never giving up, finding secondary uses for the same product, and not being afraid to pivot when needed

The mixture of flour, water, salt, boric acid, and mineral oil was originally used as a reusable cleaning product to help clean wallpaper, made by the Kutol company (Biddle, 2012; Hiskey, 2015; Wonderopolis, n.d.). Hiskey (2015) chronicles that in 1933 Noah McVicker and Cleo McVicker created the doughy substance because at that time wallpaper couldn't get wet.  However, its lack of toxic chemicals made it ideal to become the toy it is today (Hiskey, 2015; Wonderopolis, n.d.).  The pivot from wallpaper cleaner to toy occurred when teachers began to use the product as a molding compound for craft projects in school (Hiskey, 2015; The Strong, n.d.; Wonderopolis, n.d.).  The inventor's nephew, Joe McVicker, eventually came into the Kutol company, noticed this secondary use of the product, renamed it "Play-Doh," and marketed it to schools (Biddle, 2012; The Strong, n.d.; Wonderopolis, n.d.).

Erroneous innovations: Creating something by accident in the pursuit of something else

In 1879, two chemists were working in a lab at Johns Hopkins University when one of them got hungry and forgot to wash his hands (Hicks, 2010; Smallwood, 2014).  Constantin Fahlberg didn't die from this, which could have happened, but noticed that the chemical saccharin (C7H5NO3S), which he and his peer had created, made his food taste sweet (Hicks, 2010).  He had created the artificial sweetener now used in the "Sweet'n Low" pink packets, which is 300x sweeter than cane sugar and cheaper to produce (Hicks, 2010; Smallwood, 2014).  In 1884, Constantin patented the chemical saccharin without his co-inventor and set up a production shop in New York City (Hicks, 2010). In the 1970s, a saccharin scare arose, claiming it was empty calories and harmful to the health of the consumer; the first part of the claim was substantiated, but the second claim has never been vetted with evidence, and in 2000 saccharin was removed from the U.S. National Toxicology Program's list of carcinogenic chemicals (Smallwood, 2014).  From this erroneous innovation, aspartame (1965), a chemical 200x sweeter than sugar, and sucralose (1976), which is 600x sweeter than sugar, were later created (Hicks, 2010).

References

Case Study: Sociotechnical system in education

Definition of key terms

  • Sociotechnical Systems: the interplay, impact, and mutual influence when technology is introduced into a social system, i.e., a workplace, school, home, etc. (Encyclopedia.com, n.d.; Sociotechnical theory, n.d.). The social system comprises people at all levels of knowledge, skills, attitudes, values, and needs (Sociotechnical theory, n.d.).
  • Formal Learning: scholastic learning in schools (Hayashi & Baranauskas, 2013)
  • Non-formal Learning: scholastic learning outside of schools (Hayashi & Baranauskas, 2013)
  • Informal Learning: other learning that occurs outside of schools (Hayashi & Baranauskas, 2013)

Case Study Description (Hayashi & Baranauskas, 2013)

This qualitative study introduced 520 donated laptops among the students (ages 6-14) and teachers of a public school, Padre Emilio Miotti School, in Campinas, Brazil, with the goal of providing a detailed description of the results in order to inspire (transfer knowledge) rather than to generalize the results to other schools and scholastic socio-technical systems.  The sociotechnical system is defined by cultural conventions, where the participants in the study can be classified under the formal, informal, and technical levels of a Semiotic Onion (Figure 1).

Figure 1. The Semiotic Onion. (Source: Adopted directly from Hayashi and Baranauskas, 2013.)

Therefore, the goal of this qualitative study was to understand how to insert the technological artifacts (the laptops) into the scholastic curriculum in a way that makes sense to the end users (the scholastic community: teachers, students, etc.) and achieves a meaningful integration across all aspects of the Semiotic Onion.  Data collection for this qualitative study was done through interviews and discussion in the Semio-participatory Workshops in 2009, as well as through the authors being participant observers in the scholastic activities over a one-year period.

There were four opportunities that should be considered (supporting forces for adoption):

  • Transforming homework assignments: Allowed teachers to bring some homework into classwork and let students conduct the searches normally done at home at school. Teachers could now observe their students' emotional flux evolve while they complete the assignments; this evolution of the emotional flux during homework used to be observed only by parents.
  • Integrating the school in Interdisciplinary Activities: In a collaborative fashion, teachers were able to create assignments using the laptop cameras to capture everyday objects or events of the students to help show them how to eat healthier, different animals and their behaviors, save on the electric bill, teach them about calories, watts, electricity, animals, etc. This creates a path of data to information to knowledge that helps motivate the students to learn more.
  • Laptops inside and outside the school walls: Students have more pride in using their own devices and were willing to showcase and educate the public about their technology and its effectiveness. This has far reaching results that were not explored in this study.
  • Student Volunteers: The use of older students to help troubleshoot younger students' laptop problems, which taught some students patience and other skills across the Semiotic Onion. The students learned about responsibility, empathy, and other vital social skills.

There were issues across the Semiotic Onion that were also enumerated (challenging forces for adoption):

  • Technological: The internet connection was slow and intermittent, even though broadband internet and wireless routers were available
  • Technological: How to recharge 30 laptops at a time with only two wall sockets
  • Technological: How to transport laptops back and forth from storage rooms to classrooms
  • Technological: Laptop response times at certain periods of time were slow at best
  • Technological: Demand for technological support increased dramatically
  • Formal: The fear of laptops being stolen from the students on their way to or from school
  • Formal: Teachers worried about whether they could find or create technological assignments that fit their lesson plans
  • Informal: Teachers were not comfortable teaching with technology they were not familiar with themselves
  • Informal: Most parents didn't and couldn't use the students' laptops to assist them

This study concludes by saying that the introduction of technology into the education system, in the scenarios of this case study, had a positive response, and that the key lessons learned and assignments could be duplicated and studied in other scenarios.  Therefore, the authors emphasized the transferability of the study rather than the generalizability of its results.

Evaluation of this case study

This study was a case study of the socio-technological scholastic system when donated laptops were introduced into a Brazilian school.  The paper presented the socio-technological plan and its analysis.  The authors were thorough, listing all the opportunities (supporting forces for adoption) and issues (challenging forces for adoption) of technological inclusion into the scholastic system and evaluating them from the perspective of the Semiotic Onion.  Therefore, this was a thorough study of this positive introduction of technology into the scholastic social system.  The only drawback is that the researchers failed to examine how the laptops affected the world outside of the school walls and family homes.

This paper is a well-designed qualitative study that uses surveys, interviews, etc. to obtain its primary results; to improve the study's credibility, the researchers also became participant observers for one year, videotaping and taking field notes to supplement their analysis.  They mention that case studies are done to foster transferability of ideas across similar situations rather than to generalize the results.  Therefore, the authors stated the limitations of this study and how they mitigated issues that would arise about the study's credibility.

References:

Higgs Boson: Case Study on an infamous prediction that came true

Definitions:

  • Forecasting (business context): relies on empirical relationships created from observations, theory, and consistent patterns, which can have assumptions and limitations that are either known or unknown, to give the future state of a certain event (Seeman, 2002). For instance, forecasting income from a simple income statement could help provide key data on how a company is operating, but the assumptions and limitations of this method can wipe out a business (Garrett, 2013).
  • Predictions (business context): a more general term; a statement about a future state of a certain event, which can be based on empirical relationships, strategic foresight, or even scenario planning (Seeman, 2002; Ogilvy, 2015).
  • Scenarios: alternate futures that change with time as supportive and challenging forces unfold, usually containing enough data like the likelihood of success or failure, the story of the landscape, innovative opportunities, challenges to be faced, signals, etc. (Ogilvy, 2015; Wade, 2012).

Case Study: A famous prediction that came true

The Higgs Boson helps explain the origin of mass in the universe (World Science Festival, 2013). Mass is the resistance of an object to being pushed or pulled by other objects or forces in the universe, and an object’s mass is made up of the masses of its constituent particles (Greene, 2013; PBS Space-Time, 2015; World Science Festival, 2013).  The question is: where does the mass of the particles that give an object its mass come from?  The universe is filled with an invisible Higgs field; particles are swimming in this field and experience a form of resistance when they speed up or slow down, and this resistance in the Higgs field is the mass of the particles (Greene, 2013; World Science Festival, 2013).  Certain particles have mass (electrons) and others don’t (photons) because only certain particles interact with the invisible Higgs field (PBS Space-Time, 2015). Scientists use the Large Hadron Collider to speed up particles so that, when they collide in just the right way (roughly a 1-in-1,000,000,000 chance), the collision can clump a bit of the Higgs field into a Higgs particle that lasts for about 10⁻²² seconds (Greene, 2013; PBS Space-Time, 2015; World Science Festival, 2013). Therefore, finding the Higgs particle is a direct link to proving the existence of the Higgs field (PBS Space-Time, 2015).

The importance of proving this prediction correct (World Science Festival, 2013):

  • Understanding where mass comes from
  • The Higgs particle is a new form of particle that doesn’t spin
  • Shows that mathematics can lead the way to discovering something about our reality

This was a prediction that waited to be confirmed through observation for over 50 years; it has its roots in the scientific and mathematical foundations of quantum physics and was put forward by Higgs in 1964 (Greene, 2013; PBS Space-Time, 2015; World Science Festival, 2013).

Supporting Forces for the prediction:

  • Technological: the development, over the course of 50 years, of technology capable of testing the mathematics helped facilitate the confirmation of this prediction (Greene, 2013; World Science Festival, 2013). The actual technology used is the ATLAS detector attached to the Large Hadron Collider (Greene, 2013).
  • Financial: through international collaboration among thousands of scientists and over a dozen countries, they were able to amass the financial capital to build the roughly $10 billion Large Hadron Collider.

References:

Traditional Forecasting Vs. Scenario Planning

Traditional Forecasting

Traditional forecasting is essentially extrapolating from where you were and where you are now into the future, and the end of this extrapolated line is “the most likely scenario” (Wade, 2012; Wade, 2014).  Mathematical formulations and extrapolations are the mechanical basis for traditional forecasting (Wade, 2012). At some point, these forecasts add ±5-10% to their projections and call the result “the best and worst case scenario” (Wade, 2012; Wade, 2014).  This ± band is only a narrow range of possibilities out of what is actually a 360° spherical solution space (Wade, 2014). There are both mathematical and mental forms of extrapolation, and both are quite dangerous because they assume that the world doesn’t change much (Wade, 2012).  However, disruptions like new political situations, new management ideas, new economic situations, new regulations, new technological developments, a new competitor, new customer behavior, new societal attitudes, and new geopolitical tensions could move this forecast in either direction, such that it is no longer accurate (Wade, 2014). We shouldn’t just forecast the future via extrapolation; we should start to anticipate it through scenario analysis (Wade, 2012).
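
As a rough illustration of how mechanical this kind of forecast is, the sketch below fits a straight line through past observations, extrapolates it forward, and wraps the point estimate in ±10% “best and worst case” bands. The helper name and the income figures are hypothetical and not taken from Wade’s material:

```python
# A minimal sketch of a traditional forecast: linear extrapolation plus
# +/-10% "best and worst case" bands. Illustrative only.

def traditional_forecast(history, periods_ahead, band=0.10):
    """Extrapolate `history` linearly and attach a +/- `band` range."""
    n = len(history)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(history) / n
    # Ordinary least-squares slope and intercept through the history.
    slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, history)) / \
            sum((x - x_mean) ** 2 for x in xs)
    intercept = y_mean - slope * x_mean

    forecasts = []
    for step in range(1, periods_ahead + 1):
        point = intercept + slope * (n - 1 + step)   # "the most likely scenario"
        forecasts.append({
            "period": n - 1 + step,
            "most_likely": point,
            "best_case": point * (1 + band),         # +10%
            "worst_case": point * (1 - band),        # -10%
        })
    return forecasts

# Hypothetical example: quarterly income extrapolated two quarters ahead.
print(traditional_forecast([100, 104, 109, 115], periods_ahead=2))
```

Notice that nothing in this sketch can react to a disruption: the line simply keeps going, which is exactly the danger described above.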

Advantages (Wade, 2012; Wade, 2014):

+ Simple to understand personally: only three outcomes, with one of them being “the most likely scenario.”

+ Simple for management to understand and move forward on

Disadvantages (Wade, 2012; Wade, 2014):

– Considered persistence forecasting, which is the least accurate in the long term

– Fails to take into account disruptions that may impact the scenario that is being analyzed

– Leads to a false sense of security that could be fatal in some situations

– A rigid technique that doesn’t allow for flexibility.

Scenario Planning

Scenario planning can be done with 9-30 participants (Wade, 2012).  A key requirement of scenario planning is for everyone to understand that knowing the future is impossible, yet people still want to know where the future could go (Wade, 2014).  However, it is important to note that scenarios are not predictions; scenarios only illuminate different ways the future may unfold (Wade, 2012)!

This tool, therefore, offers a creative yet methodical approach to spelling out some of the future scenarios that could happen, in ten steps (Wade, 2012; Wade, 2014):

  • Framing the challenge
  • Gathering information
  • Identifying driving forces
  • Defining the future’s critical “either/or” uncertainties
  • Generating the scenarios
  • Fleshing them out and creating story lines
  • Validating the scenarios and identifying future research needs
  • Assessing their implications and defining possible responses
  • Identifying signposts
  • Monitoring and updating the scenarios as time goes on

However, in a talk, Wade (2014) distilled his 10-step process down to the core steps of scenario planning:

  • Hold a brainstorming session to identify as many driving forces or trends as possible that could have an impact on the problem at hand. Think of any trend or force (direct, indirect, or very indirect) that would have any effect, in any way and at any magnitude, on the problem; these forces typically fall under the following categories:
    • Political
    • Economic
    • Environmental
    • Societal
    • Technological
  • Next, the group must narrow the overwhelming list down to the critical uncertainties about the future. There are three types of forces:
    • Forces with a very low impact, however much they vary in uncertainty, are called secondary elements.
    • Forces with a very high impact but low uncertainty are called predetermined elements.
    • Forces with a very high impact and high uncertainty are called critical uncertainties.
  • Subsequently, select the top two critical uncertainties and model the most extreme cases of each outcome as an “either … or …”. The two outcomes must be contrasting landscapes. Place one critical uncertainty’s either/or on one axis, and the other’s on the other axis.
  • Finally, the group should describe the four resulting scenarios. What key challenges and issues would be faced in each of these four different scenarios? What should the responses look like?  What opportunities and challenges will be faced? This helps the group plan strategically and find ways to potentially innovate in each landscape, in order to outthink their competitors (Wade, 2014). (A sketch of the resulting 2×2 matrix follows this list.)
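
The sketch below illustrates the heart of this distilled process: score the brainstormed forces by impact and uncertainty, then cross the two most critical “either/or” uncertainties into a 2×2 matrix of four contrasting landscapes. The function names, scores, and example uncertainties are hypothetical and not from Wade’s material:

```python
# A minimal sketch of the distilled scenario-planning steps: classify forces
# by impact and uncertainty, then cross the two critical "either/or"
# uncertainties into four landscapes. Illustrative only.

from itertools import product

def classify_force(impact, uncertainty, threshold=0.5):
    """Impact and uncertainty are rough 0-1 scores from the group."""
    if impact < threshold:
        return "secondary element"        # low impact
    if uncertainty < threshold:
        return "predetermined element"    # high impact, low uncertainty
    return "critical uncertainty"         # high impact, high uncertainty

def scenario_matrix(uncertainty_a, uncertainty_b):
    """Each argument is (name, (extreme_1, extreme_2)); returns four scenarios."""
    name_a, extremes_a = uncertainty_a
    name_b, extremes_b = uncertainty_b
    scenarios = []
    for outcome_a, outcome_b in product(extremes_a, extremes_b):
        scenarios.append({
            "landscape": f"{outcome_a} / {outcome_b}",
            name_a: outcome_a,
            name_b: outcome_b,
            # The group fleshes these out for each quadrant:
            "challenges": [],
            "responses": [],
            "opportunities": [],
        })
    return scenarios

# Hypothetical example: two critical uncertainties crossed into four landscapes.
matrix = scenario_matrix(
    ("regulation", ("tightens", "loosens")),
    ("customer behavior", ("shifts online", "stays in-store")),
)
for scenario in matrix:
    print(scenario["landscape"])
```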

Advantages (Frum, 2013; Wade, 2012; Wade, 2014):

+ Focuses on the top two most critical uncertainties to drive simplicity

+ Helps define the extremes of the four different landscapes and their unique challenges, responses, and opportunities to innovate, creating a portfolio of future scenarios

+ An analytical planning method helping to discover the Strengths, Weaknesses, Opportunities, and Threats affecting each scenario

+ Helps you focus on the players in each landscape: competitors, customers, suppliers, employees, key stakeholders, etc.

Disadvantages (Wade, 2012; Wade, 2014):

– No one has a crystal ball

– More time consuming than traditional forecasting

– Only focuses on the two most critical uncertainties; in the real world, there are more critical uncertainties that need analysis.

References