E-Agriculture

2. How to analyze the socio-economic impact in rural areas?

Karl Jonas
Fraunhofer Gesellschaft, FOKUS Institute for Open Communication Systems, Germany

A humble engineer's comment, ready to take a bashing from the experts:

It is transparency. It is crowdsourcing.

Ask the people. They know it all. Make information about the projects available. To everybody. What did it cost? What was planned? If you bring ICT to a school, ask the parents (two years after the end of the project!) whether it had an impact. Ask the kiosk owners whether new services had an impact.

A blog on the internet. For every project. Known to everybody. Open to everybody.

karl

Jenny Aker
Tufts University, United States of America

Karl,

I agree, but with transparency and crowdsourcing comes the need for verification. When Ushahidi was used to report violence in Kenya, it required independent observers to confirm those reports; the same was true for swine flu in Mexico and the food riots in Mozambique. Crowdsourcing is a great opportunity for sharing information, but it also carries the potential for misinformation.

Jenny

Anja Kiefer
GTZ, Germany

The method of assessing impact should be related to the objective of the analysis.

If we want to know whether farmers experience a benefit, we should ask the farmers. If we want to know whether young people enjoy the communication possibilities of mobile phones, we should conduct a survey among youth.

But problems start when we wish to show rigorous scientific proof of the impact of ICT on certain factors, e.g. income generation. I don't think that it is enough to ask farmers if their income has increased. Even if that were the case, many other factors influence changing incomes - world market prices, changes in supply and demand, weather conditions...

If we need to prove that ICT has had a direct impact, these factors need to be taken into account. Probably, methods other than surveys or questionnaires have to be used as well to get scientifically valid results.
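As a rough sketch of what "taking these factors into account" could look like in practice, one standard option is a regression that estimates the ICT effect while controlling for the confounders; the example below is only an illustration with invented numbers and hypothetical variable names, not a real evaluation.

```python
import numpy as np

# Illustration only: all data are simulated, the variable names are hypothetical.
rng = np.random.default_rng(0)
n = 200
ict_use  = rng.integers(0, 2, n)        # 1 = farmer used the ICT service
rainfall = rng.normal(600, 50, n)       # mm per season (confounder)
price    = rng.normal(1.0, 0.1, n)      # world market price index (confounder)
income   = 50 + 8 * ict_use + 0.05 * rainfall + 30 * price + rng.normal(0, 5, n)

# Design matrix: intercept, ICT use, and the confounding factors.
# The coefficient on ict_use is the impact estimate *after* accounting
# for rainfall and prices, rather than a raw before/after comparison.
X = np.column_stack([np.ones(n), ict_use, rainfall, price])
coef, *_ = np.linalg.lstsq(X, income, rcond=None)
print(f"estimated ICT effect on income, controls included: {coef[1]:.1f}")
```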

Jenny is right that crowdsourcing requires verification, but the very nature of crowdsourcing implies a degree of self-verification. If 40 out of 50 messages say the same thing, then one would assume that the information is correct.
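A minimal sketch of that self-verification idea, using the hypothetical 40-out-of-50 figure above (the report texts and the 80% threshold are made up for illustration):

```python
from collections import Counter

# Hypothetical crowdsourced reports about the same event (40 of 50 agree).
reports = ["price went up"] * 40 + ["price went down"] * 7 + ["no change"] * 3

claim, votes = Counter(reports).most_common(1)[0]
agreement = votes / len(reports)

# Accept the majority claim only above some agreement threshold;
# 0.8 is an arbitrary choice here, not a recommendation.
THRESHOLD = 0.8
status = "tentatively accepted" if agreement >= THRESHOLD else "needs independent verification"
print(f"'{claim}': {agreement:.0%} agreement -> {status}")
```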

Jenny Aker
Tufts University, United States of America

While I agree that the "strength in numbers" might give us an indication of accuracy, I think that this really depends upon the context and how crowdsourcing is being used. It might work really well for certain (easily observable) topics, like riots (Mozambique), electoral violations (Kenya, Mozambique) and earthquake survivors (Haiti). But it might work less well for topics in the medical field -- like using Ushahidi to report cases of swine flu in Mexico (where a fever can be swine flu or something else).

I don't think that this minimizes or negates the power of crowdsourcing -- it is a powerful tool and one that allows individuals and groups (who might not otherwise have had a voice) to participate. This cannot and should not be overlooked. But at the same time, I think that it can and should be used alongside other data collection tools.

Stephen Kimole
Kenya Institute of Organic Farming, Kenya

I can’t agree with Karl more; the beneficiaries have the best experience: “he who wears the shoe knows where it pinches, not the manufacturer”. I remember last year I was conducting a survey on the productivity of some tillage techniques, and one of the respondents took me to his store to see the amount of maize he had harvested from his farm - and there was my answer.

Rami Eid-Sabbagh
Hasso-Plattner-Institute, Germany

I agree with you, Karl. This is a good way to get an indication of the impact. However, as Jenny said, a second round of verification might be desirable. Doing the measurements while the project runs would also help to steer the project towards the intended impact.
If we look at blogs or Wikipedia, it is often only a few people driving them (on Wikipedia perhaps 5% of users are contributors, which is still a lot, and of course all the others verify), so maybe we get only one view.
Nonetheless, as you said, it should be participatory and transparent.

Olaf Erz
IICD, Netherlands

One of the M&E components that IICD has put in place is designed to promote learning within and between projects at various levels. It allows IICD and its partners to receive feedback on the outcomes of project implementation during focus group discussions. These discussions also provide opportunities for the stakeholders to reflect on the progress and achievements of the project, as well as to take appropriate action, where necessary, to ensure that the project interventions achieve the desired results. All active projects undergo at least one evaluation every year. In an effort to increase ownership at the project and participant levels, IICD and its M&E partners are gradually transferring responsibility for organising the project-level focus group discussions to the implementing partners, so that they can take ownership of the process and build their internal capacity over time for the generation and use of evaluation information.

Olaf Erz
IICD, Netherlands

In addition to project focus group discussions, IICD and its partners hold annual national focus groups in which partners review the general outcomes of all projects evaluated, discuss themes and issues that have emerged from the evaluation, and draw lessons learned as a means of cross-fertilising the activities of the individual projects in the coming years. Outcomes and commitments are incorporated into an annual learning report. The commitments serve as the basis for the partners' action plans for the subsequent years of the project. The national focus group discussion provides an opportunity for IICD and partners to review general trends and challenges confronting the use of ICT as an instrument for development. Participants collectively find ways in which success stories can be mainstreamed across different projects and sectors, and discuss and suggest ways in which common challenges can be tackled, especially through the adoption of joint strategies.

Use participatory approaches. Ask the end users to analyse the merits and demerits of ICT. Take the feedback. Study and identify the gaps. Ask for solutions from the people's side. Do a SWOT analysis. Focus on the parameters that enhance their knowledge, skills and access while improving their incomes. Take suggestions from the people. Conduct mid-term assessments. Note short-term and long-term impacts. Make the entire process simple. Focus on tangible and intangible benefits. Try to make mid-course corrections. In my opinion, rural areas have to be empowered by ICT, as information and access to information are key factors for development.

Analysis may be based on these four important criteria:

Affordability

Access (connectivity, in this case) to information

Quality of the information

Utility of the information in improving incomes

Janaki