Archive for the ‘Competence Development’ Category

Coding the future

March 8th, 2012 by Graham Attwell

The debate over computer science, digital literacies and the like in the UK is still continuing. And the success of the Raspberry Pi computer – selling out of its first 70,000 production run in under a week – shows the demand for and interest in coding and computers in general.

One driver of the debate is that employers are unhappy with the competence and knowledge of potential employees. But this is not new. Employers have always moaned that job applicants do not have the right skills, aptitudes, attitudes – whatever. And it is always the fault of the schools or universities. Maybe it is time that employers started thinking about their own role and responsibilities in training a future workforce. And that includes the IT industry. Of course curricula need updating. Learning how computers work is probably more of a democratic necessity than an economic one. There is a danger that we evolve into a society of consumers essentially controlled by the technology of a few major corporations. You know who they are!

But just tweaking the school curriculum or weeding out production-fodder university courses will not solve the problem. The real issue is how we view learning – how we create learning environments outside the classroom and how we value learning that takes place outside the formal education sector.

I like the following thoughtful comments from Chris Applegate in his blog post ‘Why it’s not just about teaching kids to code’:

Secondly, there’s a spectrum of challenges, but there’s also a spectrum of solutions. It’s not just schools and universities that need to bear the burden. As I said, coding is a practice. There’s only so much that can be taught; an incredible amount of my knowledge comes from experience. Practical projects and exercises in school or university are essential, but from my experience, none of that can beat having to do it for real. Whether it’s for a living, or in your spare time (coding your own site, or taking part in an Open Source project), the moment your code is being used in the real world and real people are bitching about it or praising it, you get a better appreciation of what the task involves.

So it’s not just universities and schools that need to improve their schooling if we want to produce better coders. Employers should take a more open-minded approach to training staff to code – those that are keen and capable – even if it’s not part of their core competence. Technology providers should make it easier to code on their computers and operating systems out-of-the-box. Geeks need to be more open-minded and accommodating to interested beginners, and to build more approachable tools like Codecademy. Culturally, we need to treat coding less like some dark art or the preserve of a select few.

 

Algorithms and Embedded Ethics

February 21st, 2012 by Graham Attwell


This is a critical issue. In this short nine-minute video, Eli Pariser says “Your filter bubble is your own personal, unique universe of information that you live in online. What’s in your filter bubble depends on who you are, and it depends on what you do. But you don’t decide what gets in — and more importantly, you don’t see what gets edited out.”

This also applies to attempts to develop algorithm-based systems for learning. We have to make sure that people are encouraged to challenge ideas, rather than just following the pathway of least resistance (which is yet another reason why I worry about simple taxonomy-driven systems).
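To make that worry concrete, here is a minimal, hypothetical sketch (my own illustration, not anything from Pariser or any real system) of how a naive recommendation loop produces exactly that pathway of least resistance: resources matching what the learner has already engaged with always outrank anything that might challenge them. All names and data are invented.

```python
# A naive "more of the same" recommender: the filter-bubble problem in miniature.
from collections import Counter

def recommend(history, catalogue, k=2):
    """Rank catalogue items purely by overlap with tags the learner already liked."""
    liked = Counter(tag for item in history for tag in item)
    return sorted(catalogue, key=lambda item: sum(liked[t] for t in item["tags"]),
                  reverse=True)[:k]

history = [{"python", "beginner"}, {"python", "web"}]
catalogue = [
    {"title": "More Python basics", "tags": {"python", "beginner"}},
    {"title": "Python for the web", "tags": {"python", "web"}},
    {"title": "Critical pedagogy",  "tags": {"education", "theory"}},  # never surfaces
]

print([item["title"] for item in recommend(history, catalogue)])
```

Nothing in the loop ever exposes the learner to the third resource, however valuable it might be – the editing-out happens silently, which is precisely Pariser’s point.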

Open Learning Analytics or Architectures for Open Curricula?

February 12th, 2012 by Graham Attwell

George Siemens’s latest post, based on his talk at TEDxEdmonton, makes for interesting reading.

George says:

Classrooms were a wonderful technological invention. They enabled learning to scale so that education was not only the domain of society’s elites. Classrooms made it (economically) possible to educate all citizens. And it is a model that worked quite well.

(Un)fortunately things change. Technological advancement, coupled with rapid growth of information, global connectedness, and new opportunities for people to self-organize without a mediating organization, reveals the fatal flaw of classrooms: slow-developing knowledge can be captured and rendered as curriculum, then be taught, and then be assessed. Things break down when knowledge growth is explosive. Rapidly developing knowledge and context requires equally adaptive knowledge institutions. Today’s educational institutions serve a context that no longer exists and its (the institution’s) legacy is restricting innovation.

George calls for the development of an open learning analytics architecture based on the idea that: “Knowing how schools and universities are spinning the dials and levers of content and learning – an activity that ripples decades into the future – is an ethical and moral imperative for educators, parents, and students.”

I am not opposed to what he is saying, although I note Frances Bell’s comment about privacy of personal data. But I am unsure that such an architecture really would improve teaching and learning – and especially learning.

As George himself notes, the driving force behind the changes in teaching and learning that we are seeing today is the access afforded by new technology to learning outside the institution. Such access has largely rendered irrelevant the old distinctions between formal, non-formal and informal learning. OK – there is still an issue in that accreditation is largely controlled by institutions, which naturally place much emphasis on learning which takes place within their (controlled and sanctioned) domain. Yet even this is being challenged by developments such as Mozilla’s Open Badges project.

Educational technology has played only a limited role in extending learning. In reality we have provided access to educational technology to those already within the system. But the adoption of social and business software for learning – as recognised in the idea of the Personal Learning Environment – and the similar adaptation of these technologies for teaching and learning through Massive Open Online Courses (MOOCs) – have moved us beyond the practice of merely replicating traditional classroom architectures and processes in technology.

However there remain a series of problematic issues. Perhaps foremost is the failure to develop open curricula – or, better put, to rethink the role of curricula for self-organized learning.

For better or worse, curricula traditionally played a role in scaffolding learning – guiding learners through a series of activities to develop skills and knowledge. These activities were graded, building on previously acquired knowledge in developing a personal knowledge base which could link constituent parts, determining how the parts relate to one another and to an overall structure or purpose.

As Peter Pappas points out in his blog on ‘A Taxonomy of Reflection’, this in turn allows the development of what Bloom calls ‘Higher Order Reflection’ – enabling learners to combine or reorganize elements into a new pattern or structure.

Vygotsky recognised the importance of a ‘More Knowledgeable Other’ in supporting reflection in learning through the Zone of Proximal Development. Such an idea is reflected in the development of Personal Learning Networks, often utilising social software.

Yet the curricula issue remains – and especially the issue of how we combine and reorganise elements of learning into new patterns and structures without the support of formal curricula. This is all the more so since traditional subject boundaries are breaking down. Present technology support for this process is very limited. Traditional hierarchical folder structures have been supplemented by keywords, and with some effort learners may be able to develop their own taxonomies based on metadata. But the process remains difficult.
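By way of illustration, here is a rough sketch (invented for this post, not a real tool) of the metadata approach described above: the same learning resources can be regrouped on demand by combining tags, instead of sitting in one fixed folder hierarchy. The resource titles and tags are made up.

```python
# Tag-based index: the same resources can be recombined into different "patterns".
from collections import defaultdict

resources = [
    ("Intro to statistics",   {"maths", "research-methods"}),
    ("Interview techniques",  {"research-methods", "communication"}),
    ("Writing for the web",   {"communication", "digital-literacy"}),
]

index = defaultdict(set)
for title, tags in resources:
    for tag in tags:
        index[tag].add(title)

def regroup(*tags):
    """Return resources carrying all of the given tags – one possible new grouping."""
    sets = [index[t] for t in tags]
    return set.intersection(*sets) if sets else set()

print(regroup("research-methods"))                   # one grouping
print(regroup("research-methods", "communication"))  # a different recombination
```

Even in this toy form, the burden of inventing and maintaining the tags falls on the learner, which is exactly why the process remains difficult without some curricular scaffolding.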

So – if we are to go down the path of developing new open architectures – my priority would be an open architecture of curricula. Such curricula would play a dual role: supporting self-organised learning for individuals while at the same time supporting emergent rhizomatic curricula at a social level.

 

Training and learning

December 21st, 2011 by Graham Attwell

This time of the year things are supposed to be quiet. Christmas parties and that kind of stuff. However at Pontydysgu it’s not like that this year – though I dare say we may stop for the odd mince pie and glass of mulled wine in the next few days.

We have been completing project reports and writing new proposals. And I have been travelling for the last five weeks. So there is plenty to update on this blog.

The week before last I was in Bucharest for the final conference of the PREZENT! project, which aims to increase participation in continuing training for those at risk in the labour market. The project has taken a series of actions to provide access to information about, and raise awareness of, opportunities for continuing and lifelong learning in Romania.

And it turned out to be a very interesting event. The conference organisers had produced a draft strategy on training in Romania and used the event for consultation prior to submitting the strategy to the education ministry. Although I was struggling to follow the debate (my Romanian being non-existent), the strategy certainly seemed to have sparked off a considerable discussion.

Yet many of the issues were hardly new, or indeed unique to Romania. Delegates were concerned about business models and how training should be financed. Indeed, there seemed to be much support for the idea of a training levy on enterprises. Delegates were concerned about the quality and regulation of training. And delegates were concerned about professional development for trainers, particularly the use of technology in training.

Personally I felt they were over-optimistic about the potential impact of legislative change, or even about getting legislation right. However, this might reflect different cultures, and certainly in the past there has been some evidence that Romanian governments have taken more interest in training than the UK (although that is not difficult!).

My contribution to the conference was mostly based on the use of technology to support informal learning. And although everyone was very polite and said how much they had enjoyed the presentation I am not sure they got it. Learning remains inextricably bound to formal training programmes usually linked to classroom or workshop delivery. Whilst there might be acknowledgement of the importance of informal learning it goes no further than that.

Possibly it is because trainers see no role for themselves in informal learning. However, I have long held that informal learning does not happen by accident. Informal learning depends on rich learning environments, be they in school or in the workplace. And informal learning depends on the ability to use that learning in work or in everyday life. For many, their job does not provide that richness either in activities or in the learning environment. For many the workplace is just a source of drudgery. And this could be the vital role trainers could take – in designing and developing rich learning environments. But I think for that we would require new ways of recognising learning based on learning processes rather than merely accrediting outcomes. And whilst education and training remains dominated by a discourse around competences that doesn’t seem likely to happen.

Work process knowledge, practice and mobile devices

November 28th, 2011 by Graham Attwell

Last week I took part in a seminar on mobile learning – called SOMOBNET – organised at the Institute of Education in London and supported by the EU Stellar network.

A few things from the seminar have kept me pondering in the days since. Firstly, it seems that although there is a lot of anecdotal evidence as to the widespread use of mobile devices in the workplace – and I think we could speculate that such usage includes learning, if only in the form of ‘ring a friend’ – we have few if any studies of such informal use. Furthermore, the present frameworks and theories of mobile learning are very much based on the use of technology for learning within formal education settings and are of limited relevance to the ways we are using mobile devices today.

To develop such a theory I think we need to look more closely at the nature of practice.

I included two slides on practice in my presentation at the seminar (click on slides below to see full size versions).

Yishay Mor tweeted something like ‘Attwell is proposing practice as an alternative to competence’. I had not realised I was doing that, but thinking further on Yishay’s tweet it makes some sense. Competence as a construct is clearly alienated from the reality of work practice. Yet we have needed such constructs just because we have been unable to directly capture practice as it happens. Furthermore, learning and knowledge development have also largely been seen as happening at a distance from practice, through formal curricula and in training centres. The ability to use mobile devices directly in the work process, and to capture those work processes through new media, removes the need to mediate through externally and often expert-derived competence constructs. More on this to come.

In the summary discussion chaired by Sonia Livingstone, I once more reiterated my opinion that mobile devices were most interesting for learning in the context of vocational education and training and occupational practice. Sonia threw me a little when she asked me if this was because I despaired of the school systems. I am not a great fan of secondary schooling systems, which I think are largely dysfunctional. But that is not the reason why I am so interested in the potential of mobile devices for learning at work. I see the ability to use such devices as extending access to learning to the many people who are outside the formal education sector. And I tend to feel that both research and practice in the use of mobile devices will be held back whilst it remains the preserve of educational researchers working from a schooling paradigm.

The debate over the future of education gets public

October 24th, 2011 by Graham Attwell

The debate over the future of Higher Education is continuing. There were two interesting newspaper articles in the past few days, in the Guardian and the New York Times.

The Guardian reports that the first set of statistics on applications to university next year, published by the Universities and Colleges Admissions Service (UCAS), reveals that 52,321 applicants have applied from within the UK, compared with 59,413 this time last year. This is a fall of some 12 per cent, perhaps unsurprising given the steep rise in university tuition fees.

But the main interest is in the detail. The fall in applications is by no means even across universities and subjects, or by geographical region or age of applicant. The Guardian reports: “The figures suggest more women than men have been put off from applying to university. Some 10.5% fewer women have applied this year, and 7% fewer men.

Mature students appear to have been particularly deterred by the higher fees, the figures show. The number of applicants aged 40 or older has fallen by 27.8%, and among those aged between 30 and 39 the number has dropped by 22.7%.”

In terms of regions, the “numbers of applicants from the east Midlands (down 20%), Yorkshire (17.3%) and the north-east (14.7%) have fallen furthest, the figures show. London (down 9.1%) and the south-east (8.1%) have been less affected.”

And in terms of subjects, “applications to education degrees have fallen by 30%, and those to business studies by 26.1%, the figures show.”

There are some pretty clear patterns here. Although there is no data on the socio-economic backgrounds of applicants, the fall in applicants is greatest from working class areas. And in deterring mature students from applying, the new fees will have a disproportionate effect on education degrees, which have in the past been an attractive second career. The reduction in applications for business studies is more puzzling. Once more this may be an effect of fewer applications from mature students. Or it could be a general disillusionment with business as a whole. Or it may be that students are turning towards more vocational degrees and fear business studies offers little chance of post-university employment.

It is also interesting to note that the fall in applications is uneven across institutions. The elite universities – like Oxford and Cambridge – are little affected, with the biggest reductions seemingly hitting the old polytechnics.

Once more this can be seen as a class factor, with elite universities always having had a disproportionate number of applications from higher income social groups.

All in all, the figures appear to confirm those critics who pointed to the UK university system becoming more elitist, with working class students afraid of the high debt levels the new fees structure will result in.

The New York Times published an “Opinion” article by Michael Ellsberg entitled “Will Dropouts Save America?”

Although somewhat whimsical, Ellsberg points out that most job creation comes from business start-ups. He goes on to say: “Start-ups are a creative endeavor by definition. Yet our current classrooms, geared toward tests on narrowly defined academic subjects, stifle creativity. If a young person happens to retain enough creative spirit to start a business upon graduation, she does so in spite of her schooling, not because of it.”

But Ellsberg’s solution is hardly progressive. He thinks schools and universities should teach people how to buy and sell things as the bedrock of business start-up. And in general he thinks young people are better off not going to university. Ellsberg ignores the importance of access to capital for those seeking to set up new businesses. But I would agree with several things he says. He points out that there is a dual job market in the USA – and, I would contend, in the UK as well. He points to an informal job market with employment being based on networking and contacts. “In this informal job market, the academic requirements listed in job ads tend to be highly negotiable, and far less important than real-world results and the enthusiasm of the personal referral.”

And he says “Employers could alter this landscape if they explicitly offered routes to employment for those who didn’t get a degree because they were out building businesses.”

Such employment routes used to be called apprenticeships. A revival of apprenticeship training could offer a high skills alternative to university education and provide the job adaptability skills needed to succeed in today’s highly unstable employment market. But such apprenticeships cannot be left to employers alone. In the UK the government has taken to calling almost any course an apprenticeship, regardless of skills level or length. Apprenticeship requires development and regulation to ensure the quality of the learning experience. But apprenticeship can offer an alternative route of education to the failed model of mass university education.

Where is European educational research heading?

September 25th, 2011 by Graham Attwell

My promised post on the European Conference on Educational Research, held earlier this month at the Freie Universitat, Berlin.

The conference attracted some 2,200 delegates, with hundreds of presentations spanning the different networks which comprise the European Educational Research Association. The Pontydysgu team were supporting ECER in amplifying the conference through the use of different social media and through producing a series of video interviews with network conveners. On the one hand this meant my attendance at conference sessions was very limited; on the other hand the interviews with eleven different network conveners gave us perhaps a unique overview of where European educational research is heading.

A number of common themes emerged.

First was that the networks themselves seem to be evolving into quite strong communities of practice, embracing not just conference attendees but extended networks sometimes involving hundreds of members. And although some networks are stronger in one or another country, these networks tend to suggest a European community is emerging within educational research. Indeed, this may be seen as the major outcome of European funding and programmes for education. A number of network conveners suggested that the search to develop common meaning between different educational and cultural traditions was itself a driving force in developing innovation and new ideas.

Secondly, many of the networks were particularly focused on the development of research methodologies. One of the main issues here appeared to be the development of cross-domain research and how such research could be nurtured and sustained. This also applied to those considering submitting proposals to future conferences (next year’s conference is in Seville), with many of the conveners emphasizing they were keen to encourage submissions from researchers from different areas and domains, and emphasizing the importance of describing both the research methodology and the outcomes of the research in abstract submissions.

There was also an awareness of the need to bring research and practice closer together, with a seeming move towards more practitioner researchers in education.

The question of the relation between research and policy was more complex. Despite a formal commitment by many educational authorities to research-driven policy, some network conveners felt the reverse was true in reality, especially given the financial crisis, with researchers being forced to ‘follow the money’ and thus tailor their research to follow policy agendas. This was compromising the independence of research institutions and practice.

I asked each of the interviewees to briefly outline what they considered were the major trends in educational research. A surprising number pointed to a contradictory development. On the one hand, policy makers are increasingly obsessed by targets and by quantitative outcomes, be it numbers of students, qualification levels or cost per student. The PISA exercise is one example of such a development. Whilst no-one was opposed to collecting such data, there was a general scepticism of its value, on its own, in developing education policy. Such policies were also seen as part of a trend towards centralising education policy making.

On the other hand, network conveners pointed to a growing bottom-up backlash against this reductionist approach, with researchers, parents and students concerned that education is not merely an economic function and that quality cannot be measured by targets and number crunching alone. This movement is being expressed in different ways, with small-scale local movements looking at alternative forms of learning, a movement also facilitated by the use of new technologies for teaching and learning.

Open Badges, assessment and Open Education

August 25th, 2011 by Graham Attwell

I have spent some time this morning thinking about the Mozilla Open Badges and assessment project, spurred on by the study group set up by Doug Belshaw to think about the potential of the scheme. And the more I think about it, the more I am convinced of its potential as perhaps one of the most significant developments in the move towards Open Education. First though a brief recap for those of you who have not already heard about the project.

The Open Badges framework, say the project developers, is designed to allow any learner to collect badges from multiple sites, tied to a single identity, and then share them out across various sites — from their personal blog or web site to social networking profiles. The infrastructure needs to be open to allow anyone to issue badges, and for each learner to carry the badges with them across the web and other contexts.
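To give a feel for what that infrastructure implies, here is a hedged sketch of the idea behind a badge assertion: a small, portable record tying an earned badge to a learner’s identity and to evidence, so that any site can display it and anyone can issue one. The field names and URLs below are illustrative assumptions of mine and may not match the actual Open Badges specification.

```python
# Illustrative only: the shape of a portable badge record, expressed as plain data.
import json

badge_assertion = {
    "recipient": "learner@example.org",           # the identity the badge is tied to
    "issued_on": "2011-08-25",
    "evidence": "http://example.org/my-project",  # hypothetical URL showing the work
    "badge": {
        "name": "OpenStreetMapper",
        "description": "Can edit OpenStreetMap using Potlatch 2",
        "criteria": "http://example.org/badge-criteria",
        "issuer": {
            "name": "School of Webcraft",
            "origin": "http://example.org",
        },
    },
}

# Because the assertion is just data, a blog, portfolio or social profile can fetch
# and display it – the openness the project developers describe.
print(json.dumps(badge_assertion, indent=2))
```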

Now some of the issues. I am still concerned about attempts to establish taxonomies, be they hierarchies of award structures or taxonomies of different forms of ability / competence / skill (pick your own terminology). Such undertakings have bedevilled attempts to introduce new forms of recognition, and I worry that those coming more from the educational technology world may not realise the pitfalls of taxonomies and levels.

Secondly, there is the issue of credibility. There is a twofold danger here. One is that the badges will only be adopted for achievements in areas / subjects / domains presently outside ‘official’ accreditation schemes and thus will be marginalised. There is also a danger that, in the desire to gain recognition, badges will be effectively benchmarked against present accreditation programmes (e.g. university modules / degrees) and thus become subject to all the existing restrictions of such accreditation.

And thirdly, as the project rolls towards a full release, there may be pressures to restrict badge issuers to existing accreditation bodies, and to concentrate on the technological infrastructure rather than rethinking practices in assessment.

Let’s look at some of the characteristics of any assessment system:

  • Reliability

Reliability is a measure of consistency. A robust assessment system should be reliable; that is, it should yield the same results irrespective of who is conducting it or the environmental conditions under which it is taking place. Intra-tester reliability simply means that if the same assessor is looking at your work, his or her judgement should be consistent and not influenced by, for example, another assessment they might have undertaken. Inter-tester reliability means that if two different assessors were given exactly the same evidence, their conclusions should also be the same. Extra-tester reliability means that the assessor’s conclusions should not be influenced by extraneous circumstances which have no bearing on the evidence.

  • Validity

Validity is a measure of ‘appropriateness’ or ‘fitness for purpose’. There are three sorts of validity. Face validity implies a match between what is being evaluated or tested and how that is being done. For example, if you are evaluating how well someone can bake a cake or drive a car, then you would probably want them to actually do it rather than write an essay about it! Content validity means that what you are testing is actually relevant, meaningful and appropriate and there is a match between what the learner is setting out to do and what is being assessed. If an assessment system has predictive validity it means that the results are still likely to hold true even under conditions that are different from the test conditions. For example, performance evaluation of airline pilots who are trained to cope with emergency situations on a simulator must be very high on predictive validity.

  • Replicability

Ideally an assessment should be carried out and documented in a way which is transparent and which allows the assessment to be replicated by others to achieve the same outcomes. Some ‘subjectivist’ approaches to evaluation would disagree, however.

  • Transferability

Although each assessment is looking at a particular set of outcomes, a good assessment system is one that could be adapted for similar outcomes or could be extended easily to new learning. Transferability is about the shelf-life of the assessment and also about maximising its usefulness.

  • Credibility

People actually have to believe in the assessment! It needs to be authentic, honest, transparent and ethical. If people question the rigour of the assessment process, doubt the results or challenge the validity of the conclusions, the assessment loses credibility and is not worth doing.

  • Practicality

This means simply that however sophisticated and technically sound the assessment is, if it takes too much of people’s time or costs too much or is cumbersome to use or the products are inappropriate then it is not a good evaluation!

Pretty obviously there is going to be a trade-off between different factors. It is possible to design extremely sophisticated assessments which have a high degree of validity. However, such assessments may be extremely time consuming and thus not practical. Multiple choice tests delivered through e-learning platforms are cheap and easy to produce. However, they often lack face validity, especially for vocational skills and work-based learning.

Let’s try to make this discussion more concrete by focusing on one of the Learning Badges pilot assessments at the School of Webcraft.

OpenStreetMapper Badge Challenge

Description: The OpenStreetMapper badge recognizes the ability of the user to edit OpenStreetMap wherever satellite imagery is available in Potlatch 2.

Assessment Type: PEER – any peer can review the work and vote. The badge will be issued with 3 YES votes.

Assessment Details:

OpenStreetMap.org is essentially a Wikipedia site for maps. OpenStreetMap benefits from real-time collaboration from thousands of global volunteers, and it is easy to join. Satellite images are available in most parts of the world.

P2PU has a basic overview of what OpenStreetMap is, and how to make edits in Potlatch 2 (Flash required). This isn’t the default editor, so please read “An OpenStreetMap How-To”:

Your core tasks are:

  1. Register with OpenStreetMap and create a username. On your user page, accessible at this link, change your editor to Potlatch 2.
  2. On OpenStreetMap.org, search and find a place near you. Find an area where a restaurant, school, or gas station is unmapped, or could use more information. Click ‘Edit’ on the top of the map. You can click one of the icons, drag it onto the map, and release to make it stick.
  3. To create a new road, park, or other 2D shape, simply click to add points. Click other points on the map where there are intersections. Use the Escape key to finish editing.
  4. To verify your work, go to edit your point of interest, click Advanced at the bottom of the editor to add custom tags to this point, and add the tag ‘p2pu’. Make its value be your P2PU username so we can connect the account posting on this page to the one posting on OpenStreetMap.
  5. Submit a link to your OpenStreetMap edit history. Fill in the blank in the following link with your OpenStreetMap username http://www.openstreetmap.org/user/____/edits

You can also apply for the Humanitarian Mapper badge: http://badges.p2pu.org/questions/132/humanitarian-mapper-badge-challenge

Assessment Rubric:

  1. Created OpenStreetMap username
  2. Performed point-of-interest edit
  3. Edited a road, park, or other way
  4. Added the tag p2pu and the value [username] to the point-of-interest edit
  5. Submitted link to OpenStreetMap edit history or user page to show what edits were made

NOTE for those assessing the submitted work. Please compare the work to the rubric above and vote YES if the submitted work meets the requirements (and leave a comment to justify your vote) or NO if the submitted work does not meet the rubric requirements (and leave a comment of constructive feedback on how to improve the work)

CC-BY-SA JavaScript Basic Badge used as template.
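The peer-assessment rule above is simple enough to express in a few lines of code. The following is a toy sketch of my own (not P2PU code): any peer can review the submission against the rubric and vote, and the badge is issued once three YES votes have been recorded. The usernames and URLs are invented.

```python
# Toy model of the challenge's peer-review rule: three YES votes issue the badge.
from dataclasses import dataclass, field

YES_VOTES_REQUIRED = 3  # the threshold stated in the challenge

@dataclass
class Submission:
    username: str
    edit_history_url: str
    votes: list = field(default_factory=list)  # (reviewer, vote_yes, comment) tuples

    def add_vote(self, reviewer, vote_yes, comment):
        # every vote should carry a justification or constructive feedback
        self.votes.append((reviewer, vote_yes, comment))

    def badge_issued(self):
        return sum(1 for _, vote_yes, _ in self.votes if vote_yes) >= YES_VOTES_REQUIRED

s = Submission("mapper42", "http://www.openstreetmap.org/user/mapper42/edits")
s.add_vote("peer1", True, "Rubric items 1-5 all present")
s.add_vote("peer2", True, "POI and way edits check out")
print(s.badge_issued())  # False – still needs a third YES
s.add_vote("peer3", True, "Tag p2pu=mapper42 verified")
print(s.badge_issued())  # True – badge can be issued
```

The point of writing it out is to show how lightweight the mechanism is: the rigour lies in the rubric and in the reviewers’ comments, not in the issuing machinery.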

Pretty clearly this assessment scores well on validity and also looks to be reliable. The template could easily be transferred, as indeed it has been in the pilot. It is also very practical. However, much of this is due to the nature of the subject being assessed – it is much easier to use computers for assessing practical tasks which involve the use of computers than it is for tasks which do not!

This leaves the issue of credibility. I have to admit I know nothing about the School of Webcraft, nor do I know who the assessors were for this pilot. But it would seem that instead of relying on external bodies in the form of examination boards and assessment agencies to provide credibility (deserved or otherwise), if the assessment process is integrated within communities of practice – and indeed assessment tasks such as the one given above could become a shared artefact of that community – then the badge could gain credibility. And this seems a much better way of building credibility than trying to negotiate complicated arrangements whereby n number of badges at n level would be recognised as equivalent to a degree or other ‘traditional’ qualification.

But let’s return to some of the general issues around assessment again.

So far most of the discussions about the Badges project seem to be focused on summative assessment. But there is considerable research evidence that formative assessment is critical for learning. Formative assessment can be seen as

“all those activities undertaken by teachers, and by their students in assessing themselves, which provide information to be used as feedback to modify the teaching and learning activities in which they are engaged. Such assessment becomes ‘formative assessment’ when the evidence is actually used to adapt the teaching work to meet the needs.”

Black and Wiliam (1998)

And that is where the Badges project could come of age. One of the major problems with Personal Learning Environments is the difficulty learners have in scaffolding their own learning. The development of formative assessment to provide (online) feedback to learners could help them develop their personal learning plans and facilitate or mediate community involvement in that learning. Furthermore, a series of task-based assessments could guide learners through what Vygotsky called the Zone of Proximal Development (and incidentally, in Vygotsky’s terms, assessors would act as More Knowledgeable Others).

In these terms the badges project has the potential not only to support learning taking place outside the classroom but to build a significant infrastructure or ecology to support learning that takes place anywhere, regardless of enrollment on traditional (face to face or distance) educational programmes.

In a second article in the next few days I will provide an example of how this could work.

Pedagogic Approaches to using Technology for Learning – Literature Review

May 31st, 2011 by Graham Attwell

The proliferation of new technologies and internet tools is fundamentally changing the way we live and work. The lifelong learning sector is no exception with technology having a major impact on teaching and learning. This in turn is affecting the skills needs of the learning delivery workforce.

Last September, together with Jenny Hughes, I undertook a literature review on new pedagogical approaches to the use of technologies for teaching and learning. You can access the full (86-page) document below.

The research was commissioned by LLUK to feed into the review then being undertaken of teaching qualifications in the Lifelong Learning sector in the UK. The review was designed to ensure the qualifications are up to date and will support the development of the skills needed by the modern teacher, tutor or trainer.

However, we recognised that the gap in technology-related skills required by teaching and learning professionals cannot be bridged by qualifications or initial training alone; a programme of opportunities for continuing professional development (CPD) is also needed to enable people to remain up to date.

The literature review is intended to

  • identify new and emerging pedagogies;
  • determine what constitutes effective use of technology in teaching and learning;
  • look at new developments in teacher training qualifications to ensure that they are at the cutting edge of learning theory and classroom practice;
  • make suggestions as to how teachers can continually update their skills.

Pedagogical Appraches for Using Technology Literature Review January 11 FINAL 1

Technology and Competence

March 30th, 2011 by Graham Attwell

All software is a beta. And we are forever messing with the structure of the Pontydysgu web site. So here is a new innovation. We are going to use the front page right-hand column for short news items and announcements (please feel free to send in anything you would like to be posted there). And that frees up this news column. For what? For an editorial column, I think. Or an excuse for a rant.

And here goes rant number one. I am ever more dismayed by projects claiming to use technology to measure competence. Why? Because, firstly, I think we should be using software to develop imagination, to let people play, to encourage creativity, not to restrict the idea of what is or is not legitimate learning or achievement. And secondly, simply because I don’t think we can measure competence through software. Inevitably such attempts become just lists of tasks or formal knowledge which can be ticked off – with or without the help of evidencing. Such checkbox lists tell little of what people can really do and next to nothing about their ability to use skills and knowledge in real world situations. It was that approach which led to the near demise of the first wave of e-Portfolio development. And it is time we learned from that lesson.

Here endeth the first rant 🙂
