Community Call Notes and Video - 12 September 2013

P2PU Community Call
12 September 2013

Attendees:

Dirk
Bekka
Vanessa
Jane
Erika

List of Projects (delete those that have no news):
MOOC Maker (Dirk)
Courses UX (Dirk)
P2PU Internal Tech (Dirk)
Boston Workshop (Bekka)
Lab Reports (Bekka)
Communications (Bekka)
Offline MOOC (Karen)
Assessment Framework (Vanessa)
Badges platform (Vanessa)
Data Explorer Mission 2 (Vanessa)
P2PU Strategy 2013 (Philipp)
Python MOOC (???)
School of Open (Jane)
School of Ed (Karen)
School of Data (Question about how to list here - managed independently)
Music MOOC (TBD)
New Projects (Philipp)

Agenda
Progress (what we’ve been working on)
Music MOOC
Tools, modules, badges identified
http://pad.p2pu.org/p/music-mooc
Badges
DML final report (in process)
UX testing for dashboard complete: 71 responses, 262 views http://verifyapp.com/u/42162
Assessment blog posts: 2/3 complete
Lightweight profiles for MOOC
Priorities (focus next week)
Assessment: Finish assessment blog post
Research: MRI announcements on the 17th
Badges: DML report complete
Data Explorer Mission: get a consistent sync
Setting up for OKCon 16 - 18 Sept
MOOC maker
Problems (walls we ran into on the way)
Process (org stuff)
Vanessa traveling Sept 16
Out of office 10/2-morning of 10/3
Ideas (on the horizon)
HASTAC DIY Badge System–Vanessa is on their call on 9/24

Discussion
Relevant blog post: “Are MOOCs Counterproductive” http://acreelman.blogspot.se/2013/09/the-silent-majority-why-are-mooc-forums.html

  • Forums scare people off
  • Experts are intimidating
  • 25% of folks are the most active

Course Feedback & Suggestions (monthly check-in): https://trello.com/b/AxNhy4Ey/feedback-and-suggestions

  • Uploading/embedding images in course request (Karen)
  • The nice thing about not doing it ourselves is that there is a bunch of
    stuff we don’t need to do - e.g. upload the images, host them, handle
    people asking us to take down images, etc.
  • But it may be worthwhile looking into an API to upload images to some
    other service, or offering tips for where to upload and how to link.
  • It is frustrating to have to go out and upload an image elsewhere and
    come back and link it.
    Action item: Dirk will look at different options and prioritize.
  • Marking old versions of facilitated courses as ARCHIVED (Jessica and Delia)
    Will a big archive sign actually help?
    Action item: as a first step, add the sign and see if it helps. If it
    doesn’t, we may have to disable comments on archived courses.
  • New SOO landing page (in progress w/ Dirk and Jane)

London meetup planning update (BK & JP): http://etherpad.creativecommons.org/p/SOO_London_meetup

  • Creating a human timeline of the open education space - shaped by the
    people who turn up; people tell personal stories to set the parameters
    of the timeline
  • This will be added to the OKFN’s Open Ed Handbook
  • Bekka has been working with OKFN on the handbook

June’s Paper!
http://jolt.merlot.org/vol9no2/ahn_0613.htm
Top down models (Coursera/EdX) versus emergent models (P2PU)
Learner engagement (reaching out) versus participation (engaging each other)
We gave June all our data: he and his team came up with a framework to understand the interactions
Of the total, there were 368 projects (18.09%) that went live and were not deleted from the P2PU platform. Of these projects 159 were challenges, 132 were study groups, and 77 were courses. About 85% of these projects were in English and 12% in Spanish (with less than 3% in other languages).
It was found that 6,483 members returned to P2PU at least once after account creation (approximately 16% of all registered members). This implies that the majority of members (approximately 84%) created accounts and never again engaged with the community.
Of the 6,483 members of P2PU who logged in at least once more after creating an account, 4,730 (73%) have signed up to be an organizer, participant, or follower of at least one learning project. This suggests that of P2PU members who do return to the site, a substantial number attempt to become active members in projects by organizing a project or signing up to be a participant.
The majority of these learning groups were small. The median number of participants in study groups was three members and the median in courses was five members.
The detailed logging of tasks provides some nuanced ways to conceptualize and observe persistence and engagement. For example, challenges in P2PU had an average of 4.67 learning tasks that members were asked to complete (Table 2). On average, members reported completing roughly half of the available tasks (2.24 average number of tasks completed per adopter).
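The counts quoted above can be cross-checked with a few lines of arithmetic. This is just a sanity check on the notes (the counts are taken directly from the paragraphs above; the percentages are recomputed), not part of June’s paper:

```python
# Cross-check the project and membership figures quoted from June's paper.
challenges, study_groups, courses = 159, 132, 77
total_live = challenges + study_groups + courses
assert total_live == 368  # matches the "368 projects" figure

returned = 6483
share_returned = 0.16  # "approximately 16% of all registered members"
# If ~16% of members returned at least once, ~84% never came back.
assert round((1 - share_returned) * 100) == 84

signed_up = 4730  # organizers, participants, or followers of >= 1 project
assert round(signed_up / returned * 100) == 73  # matches the "73%" figure
```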
We thought that very few of the social features (like "follow") were being used, but in actual fact these features were very well used.
People are scared of identity theft regarding assignments - interesting!

Assessment research - (VMG)
RPA Journal
Core issues:
Identity fraud
Scalability

Meyer: Cheating and MOOCs
http://www.rpajournal.com/dev/wp-content/uploads/2013/05/SF3.pdf
Problem: MOOCs are ripe for cheating, and so instructors don’t trust them.
Proposed Solution: have a large bank of test questions and use item response theory to estimate how hard each one is, so that difficulty can be equalized for adaptive learning.
P2PU: They seem to think cheating is a big problem–if you look at students as numbers, it might be. Change the learning activity so that assessing is learning, and cheating evaporates as a problem.
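For reference, the item-response-theory idea Meyer proposes can be sketched in a few lines. This is a toy illustration (not from the paper or the call), assuming the simplest one-parameter (Rasch) logistic model, fit by stochastic gradient ascent on a small made-up response matrix:

```python
# Toy 1PL (Rasch) fit: given a binary response matrix (learners x items),
# estimate each item's difficulty. Under the Rasch model,
# P(correct) = sigmoid(ability - difficulty).
import math

def fit_rasch(responses, n_iters=500, lr=0.05):
    n_learners = len(responses)
    n_items = len(responses[0])
    ability = [0.0] * n_learners     # learner ability theta_j
    difficulty = [0.0] * n_items     # item difficulty b_i
    for _ in range(n_iters):
        for j in range(n_learners):
            for i in range(n_items):
                p = 1.0 / (1.0 + math.exp(-(ability[j] - difficulty[i])))
                err = responses[j][i] - p          # observed minus predicted
                ability[j] += lr * err             # gradient ascent step
                difficulty[i] -= lr * err
        # The model only depends on (ability - difficulty), so anchor the
        # scale by shifting both so that difficulties are zero-mean.
        mean_b = sum(difficulty) / n_items
        difficulty = [b - mean_b for b in difficulty]
        ability = [a - mean_b for a in ability]
    return difficulty

# Item 0 is answered correctly far more often than item 1, so its
# estimated difficulty should come out lower.
responses = [[1, 0], [1, 0], [1, 1], [1, 0], [0, 0], [1, 0]]
b = fit_rasch(responses)
assert b[0] < b[1]
```

Once item difficulties are calibrated this way, an adaptive system can serve each learner items matched to their estimated ability, which is the equalization Meyer has in mind.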

Balfour: Collaborative Peer Review
http://www.rpajournal.com/dev/wp-content/uploads/2013/05/SF4.pdf
Problem: MOOCs at scale pose more material to be assessed than a team of instructors have bandwidth to assess.
Solution: Coursera’s version of Calibrated Peer Review: “For Koller and Ng, “calibrated peer review” is a specific form of peer review in which students are trained on a particular scoring rubric for an assignment using practice essays before they begin the peer review process.”
"Lastly, a few studies suggest that structured, computer-regulated peer evaluation in specific situations may be more beneficial to students than just feedback on their writing (Heise, Palmer-Judson, & Su, 2002; Likkel, 2012). "
P2PU:
Students don’t usually trust peer review–it takes some convincing that peer review is relevant. We can frame it as "not a test–relevant to your life".
We can reframe the discussion about scale–learning has a distinctly affective/emotive dimension. Learners must develop relationships in order to trust each other enough to assess–they must know each other’s online presence and respect their point of view. Only if we build upon relationships will peer learning scale, in a networked way. CPR does this by training students to give feedback first, to calibrate.

Sandeen, Assessment in the New MOOC world
http://www.rpajournal.com/dev/wp-content/uploads/2013/05/SF1.pdf
Problem: MOOCs put the issue of assessment front and center, no longer an add-on in course design
Shift from inputs (curriculum, seat time, quality instructors) to outcomes-based (competency) evaluation–an opportunity for assessment
Solution: assessment experimentation & ways to secure authentication/identity
Opportunity in the humanities: “The majority of MOOCs offered for credit are in STEM disciplines. It will be interesting to see new developments in large scale online assessments for classes in the humanities and the arts where multiple choice exam questions are not always the most effective or accepted assessment method.”

WHAT TO DO ABOUT RESEARCH AT P2PU

  • Research going on at P2PU
  • Research guidelines for P2PU
  • Data Policy
  • Research that backs up our projects
  • Research in this space

Parking lot for next week: what is the future of info.p2pu.org?

Sites we like: http://www.hastac.org/collections/badges-learning-research

And the video: