Let's do a community survey (just like Eclipse (pdf))! But what questions should we ask?
Given the work below, I've drafted a set of questions. Please take the time to review them - you can add any comments here.
Please consider:
Update, SeanKennedy - 22 Feb 2012: About the draft survey, here is the breakdown of questions (to give an idea of survey size):
SeanKennedy - 15 Feb 2012: I wonder if we should try to direct some questions towards the IT staff who have to implement, deploy, and maintain the software systems and the integrations.
SeanKennedy - 16 Feb 2012: A comment I've received by email: some of the initial questions can move to the back (i.e., things like organization size, role in the organization, etc.).
LeeReamsnyder - 16 Feb 2012: A few comments:
SeanKennedy - 22 Feb 2012: Some more input arrived by direct email:
Overall, I really liked the survey - all the questions seemed very relevant. Nearly all the questions and answers were concise and clear. The format and design of the survey, from a UI perspective, were very consistent and clean.
Just some of my thoughts... note that some of them really are just preferences, so disregard them or take anything useful.
Page 1
* (Just a nitpick, and just a preference if anything: you have 'OSLC 2012 Community Survey' twice at the top - I know one is the master border and the inner border changes.)
Q3
* Not sure I would put 'Don't know' as an option, only because I'd want them to make an effort to describe their industry by using 'Other' at the bottom.
Q5
* (Just a nitpick, and maybe just me.) The options read 'Tool… something', and 'tool' can have many contexts, especially different ones for a utilities company versus a high-end tech software company. In fact, someone with the usual common sense would probably understand the context of 'tool' here, so I see this as only a very small potential point of ambiguity. I'm not sure whether 'software tool integrator', or a full sentence for each option, would help me identify better. If you feel everyone understands the context of 'tool', then I prefer the short and concise format you have in place.
Q8
In contrast to my suggestion for Q3, I would be in favor of keeping the 'Don't know' here. The reason being: I could see a user not remembering where they heard about OSLC, and I would rather they say 'Don't know' than be funneled falsely into another bucket.
Q14
This isn't really a criticism, but one of the choices, "Scenario elucidation", made me think of something funny: I imagined all these people going to dictionary.com to look up that word. My ignorance has been revealed!
Q17
I'm not sure if Q18 is intended specifically to capture the purpose I'm about to describe, but I think a good follow-up question to Q17 would be a comment box that allows people to explain why certain tools (forum, direct email, etc.) were not useful to them, or to share thoughts on how a tool could be more useful.
Q25/Q26
I'm not sure how long you're going to keep the community survey up, nor do I know how frequently the different specs evolve into their next versions, but putting an 'Other' box here may provide a buffer for when the survey isn't updated. Unless, of course, there is no 'Other' intentionally so that you can programmatically tally something in the database, or the reality is that the form will be updated before anyone has a chance to implement a new version of the spec.
Q29
I see the potential for an 'Other' box that lets users specify things like the OSLC workshop, friends who have implemented it, or webcasts - a catch-all to let us know if there's another resource or trend that we aren't considering but that has been impactful.
Another email:
The draft is a good start, and here are the suggestions I have to improve it so that we gather more valid results (sorry, I don't have the bandwidth to comb through every question).
Page 1 - Get rid of the text inside the parentheses, "and the survey won't take very long, your answers will disqualify...". This sounds bad, and people easily get confused by it :-)
Q2 - I am not sure what you are going after with the phrase "professional experience". Are we talking about programming experience, or simply any type of work experience? And how does general work experience (e.g., someone who has been a doctor for 10 years) help us in the survey? I think we should be more specific about the type of experience we want users to tell us about.
Q7 - Mixing scales in the answer choices could lead to poor results. I suggest we either use years (2008, 2009, ...) for all the answer choices, or use 0-3 months, 3-6 months, 6-12 months, etc.
Q8 - The "Don't know" choice doesn't seem to make sense here :-)
Q9 and all other rating questions - The 1-to-10 scale is a poor choice; we should use a 5-point or 7-point Likert scale for all rating questions.
Q11 - Similar to Q7, we are mixing scales in the answers.
SeanKennedy - 23 Feb 2012: More email responses today:
First:
Here are some editorial comments:
Question 2
- You could change "None" to "not applicable"
- If it fits with the kind of analysis you want to do with the data, consider merging some of the year categories: less than 2, 2-5, 6-15, more than 15 (because 15 years is a long time in software)
Question 6
Do you want to add a category for independent consultants/sole proprietors?
Question 11
Change the last selection to "I haven't participated in an OSLC workgroup."
New after Question 11
Consider adding a question about whether there is interest in participating in workgroups in the future.
Question 15
Consider rephrasing to use this format:
- Change the question to:
", how well do you feel you were able to contribute in the following cases?"
- Then, the list on the far left changes to read:
During scenario elucidation
During scenario selection
etc.
It's more readable.
If you do this, apply the same changes to Questions 20, 22, and 28.
For example, for Question 22, change it to say:
"Considering your experiences participating in workgroups, how likely are the following actions for you?"
Recommend to your peers that they join a workgroup.
Continue your participation in workgroups.
Join a new workgroup covering a topic relevant to you.
Question 28
Replace ":" with "?" in the question.
Initial-cap the items in the far left column.
Question 29
Same comments as for Question 28
Question 30
Initial-cap the choices to be consistent with usage elsewhere.
Question 31
Initial-cap the choices.
Question 35
Change to "how would you rank them with respect to the following software qualities?"
Second:
Common questions
Shown to all participants.
What are some of the biggest challenges you face with OSLC?
<Comment> We should also try to ask what they did when they ran into issues or faced challenges.
For example:
(internal customers) did they reach out to teams that have done this before?
(external customers) did they open a service request with their software provider?
did they reach out to their sales rep? their lab advocate? or
did they post questions somewhere (the jazz.net forum? or the OSLC community site)?
We want to identify gaps - whether in documentation, in the services team, in the community, in support, etc.
What are some of the biggest successes you have had with OSLC?
<Comment> This sounds a little vague - but I am not sure how to make it clearer.
For example, did they write their own application against the Rational products that provide OSLC services? (If you need it, I can send you a customer presentation on the value realized with the CQ OSLC CM services.) Did they implement an OSLC provider for their own products?
What were the things they did (read documentation, follow a tutorial, buy services, etc.) that contributed to their success?
SeanKennedy - 23 Feb 2012: Notes from Comms WG meeting:
SteveSpeicher - suggested question:
SteveSpeicher - suggested question (perhaps belongs in other sections)
Only shown to those who indicated some participation.
Only shown to those who indicated some implementation experience.
Only shown to those who indicated that their company uses software integrated via OSLC.
Shown to all participants.
| Attachment | Size | Date | Who | Comment |
|---|---|---|---|---|
| OSLC_Survey-RFC1.zip | 266.7 K | 15 Feb 2012 - 21:32 | SeanKennedy | Contained: a PDF with an illustration of how the survey will look on surveymonkey.com, and the spreadsheet used to populate that survey. |