Category Archives: Mentoring & Other Business

How to Survey, Part 2 (Best Practices)

In my 16 April blog entry How to Survey, I presented 3 sections: Key Questions, Tools and Services, and Reading. In this entry, I present some Best Practices based on my experience and the advice of two wise and capable women with whom I had the honor to work: Dr. Robin Jeffries and Dr. Kornelija Zgonc. All errors may be attributed to my misunderstanding, not their teaching!

The most recent survey completed by my department here in the Chief Technologist’s Organization at Sun Microsystems was the SEED mentoring program quarterly report for April 2008. See Mentoring Success Metrics (April 30, 2008) for details. SEED (Sun Engineering Enrichment and Development) has been collecting quarterly feedback from a web-based survey since 2002, so this is a mature example of a cyclic survey. The SEED survey is not anonymous. Most of the practices below are also appropriate for one-time surveys and for anonymous surveys.

Characteristics of a Good Web-based Survey (with examples from SEED):

  • It is Short. The SEED survey consists of 14 questions. One way to shorten surveys: don’t ask for information that can easily be mined from another source.
  • It is Easy to Use and Understand. Use pull-down menus wherever possible to provide clear options. When a range of answers is possible, offer the same one-to-seven range, with “1” being low, “4” neutral, and “7” being high. State questions as simply as possible and test for clarity (if it is possible to misunderstand, someone will). Avoid jargon, abbreviations, and local slang.
  • It is Easy to Analyze the Responses. Use very few open text fields. Use a seven-point range so that there is a clear low, neutral, and high (more on this below). “Does Not Apply” and “No Response” are always options. “No Response” is the default option (that is, the respondent must make an active change to answer). A minimal sketch of a question built this way follows this list.
  • For Cyclic Surveys – Prior and Future Versions are Comparable. Questions do not change much over time.
  • It is Trustworthy. Send a survey copy immediately in email to the respondent. Make survey analysis results available to respondents promptly. Actively protect private and anonymous information. Say in the survey introduction what will happen with the results (then, do what you say). Remember Robin Jeffries’ First Law of Surveys: “Don’t ask questions unless you are prepared to act on the results!”
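
To make these practices concrete, here is a minimal sketch in Python of how one such question might be represented: a one-to-seven scale plus “Does Not Apply”, with “No Response” as the default that the respondent must actively change. This is an illustration only, not the actual SEED survey code; the class name and question text are made up.

    # Illustration only (not the actual SEED implementation): one survey
    # question represented so that it follows the practices above.
    from dataclasses import dataclass

    @dataclass
    class ScaleQuestion:
        text: str
        scale: tuple = tuple(range(1, 8))           # 1 = low, 4 = neutral, 7 = high
        extra_options: tuple = ("Does Not Apply",)  # always offered
        default: str = "No Response"                # respondent must actively change this

        def options(self):
            """All choices, in the order a pull-down menu would present them."""
            return (self.default,) + self.scale + self.extra_options

    # Hypothetical question text, for illustration only.
    q = ScaleQuestion("Overall, how worthwhile were your meetings with your mentor?")
    print(q.options())
    # ('No Response', 1, 2, 3, 4, 5, 6, 7, 'Does Not Apply')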

The following Attributes of Poor Surveys list is material developed by Kornelija Zgonc, former Sun Chief Master Black Belt and my Six Sigma mentor:

What’s Wrong, and Why It’s a Problem:

  • Survey goals unclear → take-aways unclear
  • No forethought about your processes → don’t know how to implement changes
  • Lots of yes/no questions → limited analytics; need big sample sizes
  • Lots of written questions → unclear or unfocused questions
  • Focus on symptoms → get more questions, not answers!

The following Attributes of Great Surveys list is also material developed by Kornelija Zgonc:

  • Goals, processes, and possible cause/effect relationships are analyzed up front
  • Widely-scaled numerical questions allow lots of analytics and keep sample sizes low
  • Only need a few written questions to address unforeseen situations or problems
  • Survey has action-oriented focus to generate solutions, not more questions

Why a 1 to 7 Range?
Multiple-choice options make it easier to statistically analyze survey results. One of the common and energetic “discussions” among those who design surveys is what range to allow for numerical questions. Simply put: how many number choices should the respondent be offered? Too short a range (like: 1=bad, 2=neutral, 3=good) may not capture real subtlety of opinion. However, too many options can give a false confidence in the value and gradation of the answer. Don’t ask for more precision than your users are likely to know!

A range of seven is the best choice. When seven or more numbers are offered in a scale (like: 1=strongly disagree, 2=disagree, 3=somewhat disagree, 4=neutral, 5=somewhat agree, 6=agree, 7=strongly agree), the data collected behaves and can be analyzed like continuous variables. (Data are discrete if there are a limited number of values possible. Examples: number of legs on a cat, number of letter grades possible on a test. Data are continuous when the measurements can have any value. Examples: time, weight.) This allows tremendous analysis flexibility because there are many more statistical tools for continuous data analysis than for discrete data analysis.
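
As a small, hypothetical illustration of that flexibility (in Python, assuming the widely used SciPy library is available; the numbers are made up and are not SEED data), a seven-point scale lets you summarize and compare results with ordinary continuous-data tools:

    # Hypothetical 1-7 answers to the same question in two quarters (not SEED data).
    from statistics import mean, stdev
    from scipy import stats

    this_quarter = [7, 6, 5, 6, 7, 4, 6, 5, 7, 6]
    last_quarter = [5, 4, 6, 5, 4, 5, 6, 4, 5, 5]

    print(f"this quarter: mean {mean(this_quarter):.2f}, sd {stdev(this_quarter):.2f}")
    print(f"last quarter: mean {mean(last_quarter):.2f}, sd {stdev(last_quarter):.2f}")

    # A two-sample t-test, a continuous-data tool, asks whether the change in
    # the mean between quarters is larger than chance alone would explain.
    t, p = stats.ttest_ind(this_quarter, last_quarter)
    print(f"t = {t:.2f}, p = {p:.3f}")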

Why Statistics Don’t Matter (sometimes)

With all deference to my colleagues who are statisticians and Six Sigma Master Black Belts, sometimes statistics don’t matter.

  • The survey itself is a form of communication, regardless of whether it is answered, analyzed, or acted on. The survey may change the nature of the audience’s awareness.
  • If you don’t ask the right audience or collect enough responses, the answer does not matter.
  • Some people will never give a top or bottom score under any circumstances.
  • Refine, reduce, remove:
    • Too many surveys make people hate or ignore you.
    • Too many questions will cause your audience to abandon the survey part way through.
  • If your questions are too personal or respondents are embarrassed to tell the truth (for example: admitting they don’t know the answer), answers will be worthless.

Links and formatting on this post refreshed 11 October 2017


Mentoring Success Metrics

Every quarter, the SEED (and PreSEED) mentoring program announces a
web-based survey for current participants (mentees), mentors, and managers.
Tanya Jankot just finished her analysis of the April 2008 reports.
The results and comments are very similar to previous quarters. Satisfaction remains high, and the most frequent request from participants is for more opportunity (and financial support) for face-to-face contact with their mentor and other participants. Once again, there was no significant difference in satisfaction between participants co-located with their mentor and those working at-a-distance.

The purpose of SEED’s quarterly report is to measure the success of the program. It also gives participants, and their mentors and managers, a chance to voice their opinion of the program and share their thoughts and experiences with fellow participants and the SEED program team. These reports are published with the full knowledge of the participants; we encourage participants to submit more private comments in a separate email. We measure the program’s success through participants’ reported satisfaction and learning, plus the more objective annual measures of promotions, retention, performance rating, etc. The success of the individual participant is due to their own capabilities and hard work (plus available opportunities and good management!). Increased success of the participants as a group may be attributable in part to the SEED program.

Here are some of the report highlights:

    • This quarterly report was for 4 terms (3 SEED terms, plus 1 PreSEED term).
    • 92 people responded to the survey: 53 participants, 29 mentors, and 10 managers. There were 149 eligible participants (a 36% response rate among participants).
    • Participant respondents report that participation in the program positively influenced the following:
      • Better career direction: 58% of respondents
      • Greater understanding of Sun’s overall architecture, strategy, or business direction: 55% of respondents
      • Broader network of contacts (peer or executive): 45% of respondents
      • Increased visibility, within or outside work group: 42% of respondents
    • Participant satisfaction with the program:
      • 92% reported being satisfied
      • 98% thought that the meetings with their mentor were worthwhile
    • Mentor satisfaction with the program:
      • 79% believe their Mentee’s participation has made them more valuable to Sun
      • 89% would want to be a mentor again in the SEED program in the future
      • Several Mentors noted that their partnership had just begun and they were not yet able to assess the program’s impact.
    • As with past quarterly reports, analysis does not show a significant difference in responses to “Q15 Overall Worth of Meetings with Mentor” and “Q24 Overall Satisfaction with Program” between participants at-a-distance from their mentor and those co-located with their mentor. A full 76% of participants who responded to this quarterly report were at-a-distance from their mentor. This is a positive indication that SEED mentoring partnerships are beneficial to participants whether or not the mentoring pair is able to meet in person. (One way such a group comparison can be run is sketched just after this list.)
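
As a rough sketch of how such a group comparison can be run (in Python with SciPy; the scores below are made up, and this is not necessarily the exact method used for the SEED analysis):

    # Illustration only, with made-up 1-7 scores: one way to check whether
    # co-located and at-a-distance participants answered Q24 differently.
    from scipy.stats import mannwhitneyu

    co_located  = [6, 7, 5, 6, 7, 6, 5]                 # hypothetical Q24 scores
    at_distance = [6, 5, 7, 6, 6, 7, 5, 6, 7, 6, 5, 6]  # hypothetical Q24 scores

    # The Mann-Whitney U test makes no normality assumption, which suits
    # ordinal 1-7 data and small, unequal group sizes.
    stat, p = mannwhitneyu(co_located, at_distance, alternative="two-sided")
    print(f"U = {stat:.1f}, p = {p:.3f}")  # a large p value gives no evidence of a difference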

For information on some of SEED’s survey techniques, see my 16 April 2008 blog How to Survey and my 1 May 2008 blog How to Survey, Part 2 (Best Practices).

PreSEED is a pilot of the SEED worldwide Engineering mentoring program. More information on SEED is available at http://research.sun.com/SEED/


52 PreSEED Participants Selected

On 25 April, we selected the 52 participants in the PreSEED Engineering mentoring program for the June – December 2008 term. There were 65 applicants to the term from Sun’s Software Division worldwide, but not all applicants were eligible, due to incomplete applications or a mismatch with the scope of the program.

Tanya Jankot and I are now preparing for next month’s announcement of
the 2008-2009 Recent Hire and Established Staff terms of the SEED mentoring
program. Because of PreSEED’s success, we are redesigning the
scope of all three mentoring groups (PreSEED, SEED Recent Hires, and SEED
Established Staff) to fit together better. We have a flow chart already
and will be announcing the new scope’s details soon.

The next steps for new PreSEED participants are:

    1. Participants will create their 10-name Mentor Wish Lists, due 5 May 2008 (9 a.m. Pacific time), by way of the internal web site.
    2. Participants will work with Tanya Jankot to personalize the
      participant web pages she will create.

About the New Participants

  • Applicants: 65
  • PreSEED Participants Selected: 52
  • Work Locations: China, Czech Republic, Germany, India, Ireland, Japan, Russia, USA
  • Division: 100% Sun Software Group
  • Gender of Participants: 8 female (15%), 44 male (85%)
  • Grade Level: all Members of the Technical Staff, levels two to four (MTS 2-4)
  • Previously Applied to SEED: 14 (27%)
  • Countries of origin this term include: Austria, China, Czech Republic, El Salvador, Estonia, Germany, India, Iran, Ireland, Japan, Korea, Russia, Slovakia, UK, USA, Viet Nam

Software Chief Technologists Bob Brewin (Distinguished Engineer and Vice President) and Tim Marsland (Fellow and Vice President) are PreSEED’s pilot term Champions. Greg Papadopoulos (Chief Technology Officer and Executive Vice President of Research and Development) is the SEED program executive sponsor.

PreSEED is a pilot of the SEED worldwide Engineering mentoring program. More information on SEED is available at http://research.sun.com/SEED/


65 PreSEED-2 Mentoring Applications

The PreSEED-2 application web pages were open between 14 April and
noon today. We received 65 submissions by today’s deadline, 57 of which
are complete.

Once the application period was closed, the program staff started
evaluating which applications are complete and meet the selection
criteria: these are the eligible applications. Part of this evaluation
is verification by Sun Human Resources (HR) of each applicant’s recent
performance ratings, hire date, etc. Applications which are substantially
incomplete or are found to contain deliberate misrepresentations are eliminated
from consideration. Another part of this evaluation is whether the
applicant’s manager strongly supports the application. Verification
takes time and can’t start until after the deadline. In a regular SEED term, 15%
to 20% of applicants are disqualified for one reason or another.

The PreSEED-2 pilot mentoring term for Sun Software Members of the
Technical Staff will accept up to 50 participants; it will run from
June-December 2008. We will not know until after the verification review which
submissions will be accepted. I will announce the participants accepted
into PreSEED-2 on or before 25 April.

On 3 April, we announced PreSEED-2, the second pilot mentoring term, aimed at helping Sun Engineering staff who have received almost all “Sun Standard” (2 or Standard-level) performance ratings find a path that may lead them to higher engagement. The first PreSEED pilot term is currently under way, running from March-September 2008. The PreSEED-1 metrics and feedback so far are good, the same as or better than the metrics of a regular SEED worldwide mentoring term. We are now collecting the first formal feedback from PreSEED-1 mentees, managers, and mentors.

Software Chief Technologists Bob Brewin (Distinguished Engineer and Vice President) and Tim Marsland (Fellow and Vice President) are PreSEED’s pilot term Champions. Greg Papadopoulos (Chief Technology Officer and Executive Vice President of Research and Development) is the SEED program executive sponsor.

PreSEED is a pilot of the SEED worldwide Engineering mentoring program. More information on SEED is available at http://research.sun.com/SEED/


46 PreSEED-2 Applications

Since the application web pages opened for use on 14 April, we have
received 46 submissions, 28 of which are complete. The due date for
application submission is 21 April.
Preference is given to applications
which are completed earlier.

The PreSEED-2 pilot mentoring term for Sun Software Members of the Technical Staff will accept up to 50 participants; it will run from June-December 2008. We will not know until after the 21 April deadline which submissions will be completed by the applicant and then verified by Sun Human Resources (HR). The initial applicant group is remarkably diverse geographically. We have submissions so far from China, Czech Republic, Germany, India, Ireland, Israel, Russia, Switzerland, and the USA. Only 40% of the applicants so far are working in the USA.

On 3 April, we announced PreSEED-2, the second pilot mentoring term, aimed at helping Sun Engineering staff who have received almost all “Sun Standard” (2 or Standard-level) performance ratings find a path that may lead them to higher engagement. The first PreSEED pilot term is currently under way, running from March-September 2008. The PreSEED-1 metrics and feedback so far are good, the same as or better than the metrics of a regular SEED worldwide mentoring term. We are now collecting the first formal feedback from PreSEED-1 mentees, managers, and mentors.

Software Chief Technologists Bob Brewin (Distinguished Engineer and Vice President) and Tim Marsland (Fellow and Vice President) are PreSEED’s pilot term Champions. Greg Papadopoulos (Chief Technology Officer and Executive Vice President of Research and Development) is the SEED program executive sponsor.

PreSEED is a pilot of the SEED worldwide Engineering mentoring program. More information on SEED is available at http://research.sun.com/SEED/


How to Survey

Introduction

This is a revised version of a 2008 web page (first created in 2003 for Sun Microsystems) that brings together in one location key information and resources for how to conduct surveys. The initial audience for this information was the Sun Sigma (Six Sigma) professional community.

How do I know about surveys and data collection? I was certified as a Sun Sigma Black Belt in 2002 and served as a Master Black Belt 2002-2010. I was also in one of the last classes that Dr. Deming taught on statistical management methods, in 1993.

Key Questions

Four questions to ask yourself before starting to create a survey:

1. Why survey?

A survey is one of many good ways to collect information from customers. It may or may not be the best way for your situation. Have you considered other options such as field studies, baseline research, interviews, and focus groups? Many customer groups get surveyed over and over and get very tired of questions: what do you know about the data already collected from the target group?

2. What are the rules?

Information protection, security, and privacy are some policy and legal areas to consider before developing your survey. Local laws about sweepstakes and contests also need to be considered for some survey incentives. If you work for a company, there may be different policies for internal corporate surveys and external customer surveys.

3. What questions?

Developing survey content is as much an art as it is a science. How do you form questions so that the results can be usefully analyzed? What will you do with the answer to each question? How many questions will your target audience answer before abandoning the survey? Is the way you ask the question clear to people from other contexts and countries? An excellent survey takes time and testing to perfect. If you are new to surveying, consider asking a consultant expert for support.

4. What tool?

If you are not having someone else create your survey, there are a number of tools available to you; details are listed below under Tools and Services.

Tools and Services

1. Sample Size Calculator, Creative Research Systems web site tool (the basic arithmetic behind such a calculator is sketched after this list)

    “This Sample Size Calculator is presented as a public service of Creative Research Systems. You can use it to determine how many people you need to interview in order to get results that reflect the target population as precisely as needed. You can also find the level of precision you have in an existing sample.”

2. Three-way Percent Calculators

    “Precision and accuracy. Please be aware that there are certain limitations to all web-based calculators. The arithmetic used can lead to errors in some calculations when the numbers get very big or very small. If your work depends on being absolutely, positively accurate to the last decimal place – use a real calculator!”

3. Web-based Surveys

    Web-based surveys are a best practice. You can create a custom web-based survey using HTML, Perl, and CGI scripts. For those to whom these are not easily available, survey tool companies provide an easy alternative. Before starting to use any third-party survey tool (especially a “free” version), be sure to consider who owns your survey data, how the privacy of your data is protected, and whether the tool company charges for larger numbers of responses.
    • Zoomerang “Create custom web-based surveys and get rapid results. Start using the #1 online survey tool today!”
    • VTSurvey “A web-based tool which enables end users to autonomously create and run online surveys, feedback or registration forms.” (last update: 2005)
    • SurveyMonkey “Intelligent survey software for primates of all species. SurveyMonkey has a single purpose: to enable anyone to create professional online surveys quickly and easily.”
    • SurveyGizmo “SurveyGizmo is an exceptionally powerful survey tool designed to make even the most advanced survey projects fun, easy & affordable.”
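
For readers who want to see what a sample size calculator (item 1 above) actually computes, here is a back-of-the-envelope Python version using the standard formula for estimating a proportion, with a finite-population correction. The Creative Research Systems tool may differ in its details, and the 149-person example simply borrows the eligible-participant count from the SEED report above.

    # Back-of-the-envelope sample size for estimating a proportion, with a
    # finite-population correction. Real calculators may differ in details.
    import math

    def sample_size(population, margin_of_error=0.05, confidence=0.95, p=0.5):
        """Respondents needed to estimate a proportion within +/- margin_of_error."""
        z = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}[confidence]  # normal critical values
        n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2      # infinite-population size
        return math.ceil(n0 / (1 + (n0 - 1) / population))      # finite-population correction

    # Example: responses needed from 149 eligible people for +/- 5% at 95% confidence.
    print(sample_size(149))  # roughly 108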

Reading

  • “Ask Them Yourself” – How to survey your customers on the cheap, By Ellyn Spragins, FORTUNE – Small Business – Innovation, From the Dec. 2005 Issue of FSB
  • “Keep Online Surveys Short” by (former Sun Distinguished Engineer) Jakob Nielsen – Alertbox, February 2, 2004
      “To ensure high response rates and avoid misleading survey results, keep your surveys short and ensure that your questions are well written and easy to answer.”
  • “Raising Your Return on Innovation Investment” By Alexander Kandybin and Martin Kihn, Booz Allen Hamilton, 2004 (free – web site registration required)
      “There is also a flaw in the methods by which most companies go about developing new products. Focus groups and surveys elicit consumer opinions, but people can’t know what they don’t know.”
  • “Listening to the Voice of the Customer” by Mark Federman, Chief Strategist, McLuhan Management Studies, McLuhan Program in Culture and Technology, University of Toronto – November 28, 2001 (9 pages, PDF format)
  • “Getting the truth into workplace surveys” by Palmer Morrell-Samuels, Harvard Business Review, February 2002 – Reprint R0202K
  • How to Conduct Your Own Survey by Priscilla Salant, Don A. Dillman. John Wiley & Sons (1994) ISBN: 0471012734
  • An alternative to the survey:
    • “Field Studies: The Best Tool to Discover User Needs” by Jared M. Spool, Originally published: March 13, 2007
        “While techniques, such as focus groups, usability tests, and surveys, can lead to valuable insights, the most powerful tool in the toolbox is the ‘field study’. Field studies get the team immersed in the environment of their users and allow them to observe critical details for which there is no other way of discovering.”
    • “Risks of Quantitative Studies” by (former Sun Microsystems Distinguished Engineer) Jakob Nielsen, Nielsen Norman Group: Alertbox, March 1, 2004. Follow up article: Accuracy vs. Insights in Quantitative Usability, Nielsen Norman Group: Alertbox, 21 November 2011
        “Number fetishism leads usability studies astray by focusing on statistical analyses that are often false, biased, misleading, or overly narrow. Better to emphasize insights and qualitative research.”

See Katysblog 1 May 2008 blog entry How to Survey, Part 2 (Best Practices) for more.
“How to Survey” was refreshed 26 November 2014.


PreSEED-2 Application Status

Since the application web pages opened for use on 14 April, we have
received 25 submissions, 19 of which are complete. The PreSEED-2
pilot mentoring term for Sun Software Members of the Technical Staff
will accept up to 50 participants; it will run from June-December 2008.
The initial applicant group is remarkably diverse geographically.
We have submissions so far from China, Czech Republic, Germany, India,
Ireland, Israel, Russia, and the USA. The due date for application
submission is 21 April.

On 3 April, we announced PreSEED-2, the second pilot mentoring term, aimed at helping Sun Engineering staff who have received almost all “Sun Standard” (2 or Standard-level) performance ratings find a path that may lead them to higher engagement. The first PreSEED pilot term is currently under way, running from March-September 2008. The PreSEED-1 metrics and feedback so far are good, the same as or better than the metrics of a regular SEED worldwide mentoring term. We are now collecting the first formal feedback from PreSEED-1 mentees, managers, and mentors.

Software Chief Technologists Bob Brewin (Distinguished Engineer and Vice President) and Tim Marsland (Fellow and Vice President) are PreSEED’s pilot term Champions. Greg Papadopoulos (Chief Technology Officer and Executive Vice President of Research and Development) is the SEED program executive sponsor.

PreSEED is a pilot of the SEED worldwide Engineering mentoring program. More information on SEED is available at http://research.sun.com/SEED/
