Mobile Learning 09 Conference, Day 2, Project K-Nect Presentations


A series of three presentations covered Project K-Nect during Mobile Learning 09:

Break-Out Session: Suzette Kliewer, Dr. Kathy Spencer (Onslow Co. Schools), & Crystal Wong (SOTI, Inc.)

The project is a process of educating children, not a thing.

K-Nect consists of a teacher portal (content, assessment, monitoring) and a student system on the phones (eContent, IM, blog, assessment, virtual hard drive). It runs in a closed, secure community and only works with people within the program. There is a student web portal as well (website, blog, etc.).

Teachers can monitor everything that students do. (How does this impact student use of the phones? Does all the monitoring turn kids away from using the device? How does it affect kids’ use of tech in general? Is there still a division between school and life?)

Math problem sets are real-world applications of the math taught in school and include multimedia. There is an eContent repository for posting links to other helpful sites. IM is used for communication among students and between students and teachers (a big part of the project). Blogs are used for posting solutions to problems, questions, videos, etc., as well as for commenting, including video solutions. Assessment covers Algebra I. A virtual hard drive stores video, pictures, etc.

Kids know more about tech than the teachers (I have heard this many times before).

To be added to the system: teachers will be able to create a video and broadcast it out to all students’ phones.

Examples of actual problems were shown: a multimedia intro, and students have to explain their answers. There is a help function as well.

 

Keynote 1: Research and Assessment of Project K-Nect (Dr. Scott Perkins, ACU)

Assessment and Research on Mobile Learning in Education: An Initial Agenda

Assessment principles

  • Multiple methods (objective data and self-report)
  • Multiple perspectives (student, teacher, institution)
  • Develop formal (standardized) scales

Increasing student involvement:

  • Who should we ask, how, and when?
  • Survey and self-report only part of the picture

What we know is promising; the seven “good practices” are a great fit.

Empirical demonstrations of impact needed (how do we do this?)

  • Impact on learning
  • Parameters of learning
    • Student reports on effort, time, and competence
    • Objective evidence of achievement and mastery
  • Student and teacher perspectives
  • What we need to do next
    • Fund and routinely utilize formal research methods

Research Strategies

  • Outgrow self-report and attitude measures
  • Commit to routine assessment
  • Innovative and long-term research
  • Train and equip researchers
  • Foster discipline-specific applications
  • Organize collaborations and sharing of results
    (I think some of this is already happening)

 

Keynote 2: Research and Assessment of Project K-Nect: Findings (Stacie Hudgens, PsyMes Consulting)

What is the impact of Project K-Nect on algebra skills? A level-of-implementation question was added, and level of implementation makes a difference!

Quasi-experimental, one-group pre-post test design (no control group!).
Data sources:

  • Student surveys
  • Student assessment
  • NC Algebra I end-of-course (EOC) exam
  • Onsite interviews and focus groups
  • Device data
    • Usage data
    • Problem set info and achievement
    • Quiz performance

Key findings:
Students: n = 89 (51 finished the required assessments and surveys). A small data set, spread across four schools (10, 15, 16, and 10).

Four constructs to measure implementation success (low, middle, high):

  • Building level tech support
  • Admin involvement 
  • Consistent communication with program director
  • Program-level implementation within the classroom

Schools A and B: mid-level implementation (3 of 4 constructs met)
School C: high implementation (4 of 4)
School D: low implementation (2 of 4; last two constructs only)
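
A minimal sketch of how the level labels appear to map to the count of constructs met; this is my own reading of the schools above, not a rubric from the report:

```python
# A minimal sketch (my own reading, not from the report): derive an
# implementation level from how many of the four constructs listed above
# (tech support, admin involvement, communication with the program
# director, classroom-level implementation) a school met.
def implementation_level(constructs_met: int) -> str:
    """Map a count of constructs met (out of 4) to a level label."""
    if constructs_met >= 4:
        return "high"
    if constructs_met == 3:
        return "mid"
    return "low"

# Schools as described above: A and B met 3 of 4, C met all 4, D met 2 of 4.
schools = {"A": 3, "B": 3, "C": 4, "D": 2}
for name, met in schools.items():
    print(f"School {name}: {implementation_level(met)} implementation")
```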

Overall pre-to-post change on the Algebra I assessment was 5.4% (ES = .40), a moderate change.

Achievement was also examined by implementation level, to control for natural growth, etc.:
High implementation: ES = 1.5
Medium implementation: ES = .43 and .22
Low implementation: ES = (didn’t get this one, need to check the full report when it is released).
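
For reference, here is a minimal sketch of how a pre-post effect size (Cohen’s d) is commonly computed. The scores are made up for illustration and are not the project’s data, and the report may use a different convention:

```python
# A minimal sketch of a pre-post effect size (Cohen's d): mean gain divided
# by a pooled standard deviation of the pre and post scores. The scores
# below are made up for illustration; they are NOT Project K-Nect data, and
# the report may use a different formula (e.g. dividing by the pre-test SD).
import math
import statistics

def pre_post_effect_size(pre: list[float], post: list[float]) -> float:
    """Return (mean post - mean pre) / pooled SD of pre and post scores."""
    mean_gain = statistics.mean(post) - statistics.mean(pre)
    pooled_sd = math.sqrt(
        (statistics.stdev(pre) ** 2 + statistics.stdev(post) ** 2) / 2
    )
    return mean_gain / pooled_sd

# Hypothetical percent-correct scores for six students, pre and post.
pre_scores = [52.0, 60.0, 47.0, 55.0, 63.0, 58.0]
post_scores = [58.0, 64.0, 55.0, 59.0, 70.0, 61.0]
print(f"ES = {pre_post_effect_size(pre_scores, post_scores):.2f}")
```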

Summary of findings

  • Positive correlation between implementation level and student achievement
  • Increased student engagement, time spent on algebra, confidence with device and math, utilization, and parental involvement
  • Increased communication between teachers and students

Limitations

  • Small sample size
  • Limited duration
  • Block scheduling

Validated instrumentation and methodology from the preliminary year are needed to obtain longitudinal findings.

Elliot Soloway commented that it’s important to go beyond efficacy research; we need to know under what conditions the technology has an impact (i.e., fidelity research).

Also, when we say we should be doing longitudinal research, we should be following the same kids over time, not just researching a different set of kids each year.

Image Credit: “Capitol Butter”, my camera phone  :)
