Hello, Demo Slams: the perfect place to share ideas.
For some years, I have been using Demo Slam to end learning conferences and summits. It’s a relaxed and fun way of sharing ideas.
Demo Slams are a great way to build community and collaboration. They are so supportive of risk-takers, embracing the IB learner profile and walking the walk.
What it looks like:
Any comments about Demo Slams? As always, I'd love to hear from you.
Twitter is a fantastic source of information and, for me, a tremendous source of learning. Running #PYPConnectEd is incredible and so rewarding: making new connections, keeping up with colleagues and making friends. The #PYPConnectEd PLN is terrific! Our community always contributes inspiring ideas and incredible resources on all of the topics that are covered: the Exhibition, Agency, Well-Being and Play. PYPConnectEd has you covered.
Preparing for PYPConnectEd meant the following:
Running a PYPConnectEd chat:
Joining a Chat
On the day:
As always, drop me a comment and let me know how you get on.
I have spent most of today deep in discussion around personalized learning and adaptive learning, two of education's hot potatoes.
First, let's take personalized learning: here, we target individuals, addressing their needs, context and individual goals, and providing the right content, tools and experiences to help them learn and grow. Adaptive learning, by contrast, does everything that personalized learning does, but 'live' and in the moment: using algorithms, adaptive learning technologies detect a user's behaviour and provide personalized recommendations.
In this post, I'd like to consider the following questions concerning personalized and adaptive learning:
Is adaptive learning the ideal and personalized just a step along the way?
I can see how we might use adaptive teaching to enhance differentiation and meet students' needs. Let's use it to slow the pace of learning down; I mean, can learning ever really "be late"? Using algorithms and technologies to reduce cognitive load gets my vote! Doing this through gamified mathematics is a great idea. Have you tried Reflex Maths? Let me know your thoughts and ideas.
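To make the adaptive idea concrete, here is a minimal sketch of how a system might raise or lower difficulty based on a learner's recent answers, slowing the pace down when someone struggles. This is my own illustration, not how any specific product (such as Reflex Maths) works; the `AdaptiveTutor` class and its thresholds are invented for the example.

```python
from collections import deque

# Hypothetical sketch of an adaptive-learning loop: difficulty moves
# up or down based on the learner's recent accuracy, so the pace can
# slow down rather than pushing every student at the same speed.

class AdaptiveTutor:
    def __init__(self, levels=5, window=5):
        self.level = 1                       # start at the easiest level
        self.levels = levels                 # number of difficulty levels
        self.recent = deque(maxlen=window)   # rolling record of answers

    def record_answer(self, correct: bool):
        self.recent.append(correct)
        accuracy = sum(self.recent) / len(self.recent)
        if accuracy >= 0.8 and self.level < self.levels:
            self.level += 1                  # comfortable: raise difficulty
        elif accuracy <= 0.4 and self.level > 1:
            self.level -= 1                  # struggling: slow the pace down

    def next_question_level(self) -> int:
        return self.level

tutor = AdaptiveTutor()
for answer in [True, True, True, True, True]:
    tutor.record_answer(answer)
print(tutor.next_question_level())  # difficulty rises after a strong run
```

Real adaptive systems model far more than a rolling accuracy, of course, but the core loop of detect behaviour, then adjust, is the same shape.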
Feel free to leave your answers in the comment box. As always, I'd love to hear what you all have to say.
As a Google for Education Certified Trainer, I use an activity app / tracker for my training sessions. It is great: it allows me to know exactly what I am doing and for how long, because if you asked me day-to-day or at the end of the week, I'd probably say... a lot, or, what didn't I do...
This week, I have been introduced to Tech Request, a book by Emily L. Davis and Brad Currie. The book is part of a Leveraging Technology class with Dr Susan Patterson of Lesley University.
In Chapter 5, "How's It Going?", Emily and Brad discuss gathering and sharing data on the impact of coaching. Recognizing that there are so many kinds of data available to coaches, and that it is difficult to know what to look for and what to do with it, they explore an impact continuum, strategies for analyzing and utilizing the data, and approaches to sharing data with stakeholders in ways that will promote longevity and sustainability.
What should we collect?
While I am gathering data on how much time I spend coaching, running professional development sessions and preparing model lessons, I want to know about the implementation and impact of programs, namely Seesaw. Emily and Brad are firm in their belief that "there is no expectation" that tech coaches should begin with a "full-blown" data collection and analysis; in fact, they advise that it is done carefully over a long period of time.
As a coach, I am always asking myself, "is it working?" Now, with some of the metrics provided in Chapter 5, it will hopefully be a little (or a lot) easier to give a more qualified and quantified answer.
At the start of this post, I wrote how I use an activity tracker to log my hours. In this chapter, this is referred to as counting metrics, and while it "can only give a 30,000-foot view of a coaching program," there is no doubt in my mind that this is an incredibly useful place to start.
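As a rough illustration of what a counting metric looks like in practice, here is a minimal sketch of totalling logged hours by activity. The log entries and category names are invented for the example; they are not from the book or my actual tracker.

```python
from collections import Counter

# Hypothetical counting metric: total hours per coaching activity.
# A tally like this gives the "30,000-foot view" of how time is
# spent, but not yet whether the coaching is making a difference.

log = [
    ("coaching", 1.5),
    ("professional development", 2.0),
    ("model lesson", 1.0),
    ("coaching", 0.5),
]

hours = Counter()
for activity, duration in log:
    hours[activity] += duration

for activity, total in hours.most_common():
    print(f"{activity}: {total} h")
```

Even a simple tally like this makes the "what and how much" questions answerable before moving on to the deeper "how well" ones.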
The next metric described is a program quality metric. This type of data captures an understanding of how well aligned the practice is to the goals. To dig a little deeper into the program, we must begin to ask "what" and "how well" questions. What types of technology applications are coaches helping to implement to enhance learning experiences?
Know Thy Impact
Here, I am now asking, "is what I am doing making a difference?" Measuring this and gaining a firmer understanding of success means digging a little deeper.
What do we do with what we have collected?
As mentioned above, data is useful to a coach in two ways:
Putting it all together
Gathering, analyzing and using data as a coach is a critical skill, one that is used to evaluate the program's sustainability and highlight areas for improvement. Chapter 5 looks at frameworks to determine the type of data to collect, shares tools for data collection and offers strategies for how to share the data internally and externally.
On reflection, using the data from my activity app and tracking system, I now need to start asking the why and how questions: why is this happening, and how can I change things? Something I need to become more cognizant of is: how much of my time is reactive versus proactive?
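One small way I could start answering the reactive-versus-proactive question is to tag each tracker entry and compute the split. This is a hypothetical sketch: the entries, tags and numbers below are invented for illustration, not real tracker data.

```python
# Hypothetical extension of an activity log: tag each entry as
# reactive (responding to a request) or proactive (planned coaching
# work) and compute what share of total time each kind takes.

log = [
    ("fix a Seesaw login issue", "reactive", 0.5),
    ("planned coaching cycle meeting", "proactive", 1.0),
    ("drop-in tech question", "reactive", 0.25),
    ("model lesson preparation", "proactive", 1.25),
]

total = sum(hours for _, _, hours in log)
reactive = sum(hours for _, kind, hours in log if kind == "reactive")
share = reactive / total * 100
print(f"Reactive time: {share:.0f}% of {total} hours")
```

A single tag per entry is crude, but it turns a vague feeling ("I spend all day firefighting") into a number I can watch change over time.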
In short, this is an excellent chapter for any tech coach or technology leader interested in data!