The internal dissemination and impact of in-house research in Social Services Departments

Marian Barnes

Development Officer (Research), Hounslow Social Services Department

Tom Wilson

Department of Information Studies, University of Sheffield

Abstract
This article presents some of the findings from an empirical study of modes of dissemination used by in-house researchers in social services departments. It suggests that too much emphasis may be placed on the production of formal reports and too little attention may be given to techniques of oral communication. For research to have an impact on policy it is important that careful thought should be given to how it is disseminated.

Introduction

The impetus for this paper came from two sources: first, the authors presented a paper at the 1984 SSRG Annual Workshop which discussed the range of dissemination activities which social services researchers might use (Barnes and Wilson, 1984). The authors were conscious that the paper lacked an empirical base and the work reported here was, in part, an attempt to obtain such a base.

Second, research by Bowl and Fuller (1982) explored aspects of research and their relationship to policy which have relevance for the methods of dissemination which might be most profitably employed. However, dissemination was not examined explicitly. The work reported here is an attempt to go some way towards filling the gap.

Aims of the research

The aims of this research were:

  • to gather as large a collection as possible of cases of research projects which had an impact on the work of departments and to identify the kinds of impacts made;
  • to discover how the research had been conducted and what kinds of data were gathered;
  • to discover what methods of dissemination were employed and the role played by the researcher or research section in dissemination;
  • to discover how research fitted into the policy and planning systems employed by departments;
  • where possible, to identify modes and models of dissemination which appear to be successful and which could act as guides for other researchers.

Methodology

A questionnaire was devised and, following a limited pilot test, sent to 127 SSDs in England, Wales and Scotland. Respondents were asked to report on the latest piece of research defined by the researcher as 'successful' in its impact on departmental policy or practice. Responses were received from 62 departments (48.8 per cent), but 15 respondents reported that they had no research section, or did not carry out work which they would describe as research, giving 47 useable responses. Three departments refused to participate. If those departments which did not respond at all are assumed to be divided in the same proportions as those from which some response was obtained, almost 25 per cent of social services departments may lack research sections.
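The response arithmetic above can be verified with a short calculation. This is a sketch using only the figures reported in the text:

```python
# Survey response figures as reported in the text.
sent = 127          # questionnaires sent to SSDs
responded = 62      # departments returning some response
no_research = 15    # respondents reporting no research section or no 'research'

useable = responded - no_research           # 47 useable responses
response_rate = responded / sent            # 48.8 per cent

# Assuming non-respondents split in the same proportions as respondents,
# the share of all departments lacking a research section is roughly:
no_research_share = no_research / responded

print(f"response rate: {response_rate:.1%}")          # -> 48.8%
print(f"useable responses: {useable}")                # -> 47
print(f"estimated share lacking research: {no_research_share:.1%}")  # -> 24.2%
```

The 15/62 proportion (about 24 per cent) is the basis for the 'almost 25 per cent' estimate in the text.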

When statistical results are reported below they represent the distribution of the cases received and not the population of all current research projects in social services departments. The work by Bowl and Fuller (1982) is much more representative of the general state of research.

Characteristics of the reported research projects

The subject of research: The categories used in Table 1 are almost identical to those used by Bowl and Fuller but direct comparison is not possible because the data are reported differently. The Spearman rank correlation coefficient (Siegel, 1956) for the two rankings is 0.817, which is significant at the 0.01 level, suggesting that the cases reported are a subset of the projects reported by Bowl and Fuller, at least in respect of the range of subjects covered. As such they may be considered as representative of the type of research carried out in SSDs.

Table 1: Subject categories of reported projects
Subject              No. of projects    %    Rank   B & F rank
Elderly                     17        35.4     1        2
Children                    10        20.8     2        1
Organisation                 8        16.7     3        6
Mental illness               2         4.2     6        5
Mental handicap              3         6.2     5        3
Physical handicap            4         8.3     4        4
Political issues             1         2.1     7        6
Clients in general           1         2.1     7        6
Homeless                     1         2.1     7        6
Social analysis              1         2.1     7        6
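The rank comparison in Table 1 uses Spearman's coefficient, cited from Siegel (1956). As a sketch, the simple textbook formula can be implemented as follows; note that this form does not include the correction for tied ranks (which Table 1 contains), so applying it naively to the table will not exactly reproduce the published 0.817:

```python
def spearman_rho(ranks_a, ranks_b):
    """Spearman's rank correlation via the simple formula
    rho = 1 - 6 * sum(d^2) / (n * (n^2 - 1)).
    This form assumes no tied ranks; ties require a correction term."""
    n = len(ranks_a)
    d_squared = sum((a - b) ** 2 for a, b in zip(ranks_a, ranks_b))
    return 1 - 6 * d_squared / (n * (n ** 2 - 1))

# Sanity checks: perfect agreement and perfect reversal.
print(spearman_rho([1, 2, 3, 4], [1, 2, 3, 4]))  # -> 1.0
print(spearman_rho([1, 2, 3, 4], [4, 3, 2, 1]))  # -> -1.0
```

A coefficient of 0.817 for ten subject categories indicates strong agreement between the two rankings, consistent with the significance claim in the text.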

The research workers: As job titles which include in-house research functions vary, titles such as Development Officer, Planning Officer etc. were included within the category 'SSD researcher/section'. In half of the reported projects staff other than in-house researchers were involved in carrying out the work, often in association with R and D staff. In such situations the potential for informal dissemination by word-of-mouth is increased. It is not clear if this happened, but in departments where oral communication among colleagues is the normal and preferred means of communication it can reasonably be suggested that informal discussions during the progress of the research may have prepared the ground for dissemination of the final results.

Style and method of research: quantitative research was the predominant style (63 per cent of responses), with interviewing the principal method of data collection (63 per cent). Almost a third of respondents claimed that action research was the style of research, usually in association with quantitative or qualitative research. Neither the rest of the answers, nor the reports themselves, reveal exactly how these action research projects came to be assigned to that category.

The combination both of research styles and methods reported indicates the pluralistic approach necessary to take account of the multiplicity of interests and influences in applied social research. Smith and Cantley's (1984) assertion that evaluative research needs to consider the different notions of success held by different interest groups, and thus must embody the principles of methodological triangulation, can be applied to other forms of research intended to have an impact on social services policy or practice.

Research which has considered the various perceptions of a situation which may exist may have more chance of being received as useful than that which has adopted a one-dimensional approach. In designing a research project the researcher must have an eye not only on who takes decisions arising from it, but who will be affected by those decisions. This implies that a range of methods may be needed to study the problem.

The role of previous research: responses showed that previous research guided the conduct or design of the reported projects. In particular, research in other SSDs contributed to more than half of the projects (60.4 per cent). The internal evidence for the usefulness of prior research in reports provided with the responses is scanty: there are few citations or other indications in the text of reports.

A final report is, essentially, an internal document and does not need citations, but their absence does not mean that previous ideas have been ignored. Whilst the lack of citations may not be important from a departmental point of view, it must inhibit the development of research in SSDs which is set in the context of earlier, associated work. A bibliography in a research report is a useful source of ideas for other researchers and could promote the wider use of local, in-house research.

The range of types of data collected is further evidence of the pluralistic approach adopted in the research projects reported (See Table 2).

One type of data is rarely sufficient to provide evidence for use in policy-making or decision-making relating to practice. For example, the following types of data were collected in a project on Home Care Services which was designed to have an impact on professional practice, organisation structure, resource allocation and other policy/ planning processes:

  1. measures of service outcome/effectiveness
  2. data on client needs
  3. consumer viewpoints
  4. views of staff (management)
  5. views of staff (service providers)

An evaluation of a service which attempted to measure outcome without taking note of consumers' experience of service delivery or the views of staff on problems in the organisation or provision of services would be less effective and could meet with resistance from staff whose views had not been taken into account. It is encouraging that in nearly half of the projects the views of consumers were sought. This contrasts with practice in the dissemination of results reported below.

Table 2: Types of data collected
Data collected               N     %
Client needs                32   66.7
Usage, occupancy rates      25   52.1
Service provider views      22   45.8
Consumer viewpoints         21   43.7
Management views            21   43.7
Client trends, etc.         21   43.7
Outcome of services         18   37.5
Organisational efficiency   16   33.3
Other                       15   31.2
Community needs             12   25.0
Staffing data               11   22.9
Community background         9   18.7
Costs/cost effectiveness     8   16.7

The relationship of research to planning and policy-making

We assume that research is commissioned so that the results may contribute in some way to the improvement of the activities of SSDs (leaving aside other possible motivations such as the need for delaying mechanisms or organisational game-playing). Consequently, one would expect some explicit and recognised relationships between research and planning and policy-making.

To the question, 'Is there a generally understood system for planning/policy-making in your Department?', 75 per cent responded yes and 25 per cent said no. This finding is disturbing. Research can only be completely effective in assisting planning and/or policy-making if its role is understood generally in the organisation and if it can be seen to relate to the overall policies and plans of the Department. Where either of these aspects of the whole is fuzzy there will be difficulty in matching research ideas to policies and plans in a meaningful way. The difficulties of implementing this rational model of planning and policy-making are well understood, but there is no harm in using a model as a target at which to aim.

The senior staff of departments are likely to be the sources of most research ideas, with practitioners playing a very small part, except when involved in working groups. The majority of cases reported had their origins at the Directorate or Senior Management Team level. In the majority of cases (62.5 per cent), the source of the idea was not the target for the results of the project.

Table 3: Intended areas of impact of projects
Impact area                  N     %
Resource allocation         33   71.7
Organisational structure    30   65.2
Professional practice       23   50.0
Other policy aspects        15   32.6
General policy               9   19.6
R and D development          7   15.2
Public relations             4    8.7
Cooperation                  1    2.2

Table 3 shows that the chief area of the intended impact of the research was resource allocation, which, in times of financial stress, is not surprising. Similarly, political and other pressures are leading to a reconsideration of the organisation of services and this is reflected in the second rank position of organisational structure. However, respondents assigned an average of 2.5 categories to their cases, suggesting that research is pluralistic in impact as well as in approach.
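The average of 2.5 categories per case can be checked roughly against Table 3. This sketch assumes the 48 cases used elsewhere in the study as the denominator (the Table 3 percentages themselves imply a slightly smaller base of about 46):

```python
# Intended-impact counts taken from Table 3.
impact_counts = {
    "Resource allocation": 33,
    "Organisational structure": 30,
    "Professional practice": 23,
    "Other policy aspects": 15,
    "General policy": 9,
    "R and D development": 7,
    "Public relations": 4,
    "Cooperation": 1,
}

total_assignments = sum(impact_counts.values())  # 122 category assignments
cases = 48  # assumed number of cases (see the discussion of useable responses)

print(round(total_assignments / cases, 1))  # -> 2.5
```

Because respondents could tick several categories, the column of counts sums to far more than the number of cases, which is the sense in which impact, like approach, is pluralistic.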

Dissemination of research results

Typically, a discussion of research dissemination will focus on how and where journal articles can be published, and how one can obtain copies of reports and papers through the library system or other information services. In-house researchers are exhorted to make their work available to the wider social services audience. The editors of journals such as Research, Policy and Planning and Social Services Research know how much effort is needed to encourage in-house researchers to contribute.

Behind the effort and the exhortation lies the unexpressed assumption that internal dissemination is happening and is non-problematic. This view cannot be taken for granted given the way in which information is communicated in large, hierarchical, dispersed organisations. In-house researchers need to be aware that their work could have implications for other departments, but before that they need to have suitable methods to ensure that their work is known within their own organisations.

Table 4: Modes of dissemination
Mode                         No.    %
Final report                 42   87.5
Summary report               25   52.1
Policy document              21   43.7
Newsletter item               4    8.3
Training                      2    4.2
Organization development      7   14.6
Talks                        26   54.2
Other                         1    2.1

On average, the cases reported here were disseminated by two methods. A rather higher average had been expected and some modes of dissemination may have gone unreported. For example, elsewhere in responses it was noted that reports were presented orally to committee or to other groups, although this was not recorded under 'talks'. This is an important point: in departments where oral communication is the typical and preferred means of transmitting information too great a reliance on documentary communication is likely to be inappropriate. Oral presentation is an important activity and needs to be recognised as such.

Three points arising from Table 4 are worthy of comment. First, little more than half of the projects resulted in a summary report and, unless the final report is a brief document, a summary is crucial if it is to be read by busy staff members whose working day is often interrupted. Wilson and Streatfield (1980) report that almost 75 per cent of all communication events, including reading, lasted only five minutes or less and 36 per cent lasted only one minute or less.

Second, the few occasions on which newsletter items were prepared raises a similar point. A brief newsletter report is more likely to be read (and by more people) than is a lengthy research document. Some departments responding to the questionnaire may not have an internal newsletter: if so they will have problems in informing all staff of departmental developments.

Third, dissemination through training is mainly of relevance to those whose practices are more likely to be directly affected by research findings. Only two projects reported use of training which suggests that either it is rarely seen as relevant, that links between research and training are inadequately developed, or that pressure of other work does not allow its more widespread use. Of course, researchers may feel more comfortable carrying on the work of research and writing reports but, since 23 projects were intended to affect professional practice, it would seem that there is scope for improving the links between research and training.

In the cases reported, the individual researcher or research unit generally had responsibility for disseminating results (71 per cent). However, dissemination is not a unidimensional activity and the extent of such responsibility was likely to vary with the intended audience and the circumstances within which dissemination was carried on.

Responses on the nature of involvement in dissemination describe a wide range of activities. In approximately half of the cases, dissemination was described as typical, but in over a quarter of cases it was felt to be atypical. Not only does there appear to be great variation among departments in terms of what comprises internal dissemination but there is also evidence of variation within departments, presumably reflecting the range of activities undertaken as in-house research.

Some responses indicated purely documentary modes of dissemination. Tasks included duplication and distribution of reports, preparation of action summaries and press releases and attempts to get items on committee agendas. The researcher is usually prominent in such activities and is involved in communication with different audiences - managers and practitioners within the department, committee clerks and press officers within the authority, political sponsors of research and professional colleagues outside the local authority.

Oral dissemination of findings involves both formal presentations to groups of staff and members and informal, but often regular, discussions with individuals such as advisers, senior management, and casework staff.

In some circumstances dissemination could be an activity which continued over a considerable period of time. Reports and presentations would go first to one group, then to another, either in the same form with comments attached or revised as a result of discussion and, possibly, further analysis.

The conclusions drawn from research may be redefined and further analyses may be called for as results are fed into the decision-making process. The final product of the research may have shifting boundaries as discussion of implications leads to further analysis and interpretation. The research is owned by the department and the need to involve other staff in determining the implications for change and how findings can best be applied means that the intellectual ownership of the work is shared.

However, it is chiefly researchers who ensure that reports and summaries reach those who need to know. They have to negotiate for space on management meeting agendas and to take a leading role in organising seminars and talks. In the next phase the initiative needs to shift to those with developmental and managerial responsibilities if the process is not to stop there.

In ten of the projects reported the researcher was not directly involved in dissemination. In one case the work had been undertaken by temporary staff and dissemination was carried out by the department's permanent research and planning officer. In other cases managers, specialist development officers, or advisers carried out dissemination, illustrating the perceived importance of the link between research and operational developments. However, it is also useful for the researcher to be involved in dissemination to ensure an accurate presentation of what the research involved, and to ensure a continuing dialogue between those involved in operational activities and those concerned primarily with analysis and policy development.

Responses emphasise the complexity of disseminating findings and implementing research-based change in large, hierarchical, complex organisations. Some responses illustrate the effect of political involvement in the generation and conduct of research, while others focused on the way the involvement in research of other departmental staff can aid dissemination. Some research was generated by working groups involving a range of staff which had explicit briefs to review services or devise a strategy for development. Here the personal commitment to the importance of the project by staff other than the researcher ensured wider dissemination than might otherwise have been the case.

As noted earlier, it is encouraging that research often includes collection of data relating to consumers' views of services but service users are rarely included in dissemination. There are obvious problems: if research has evaluated a day centre or residential unit there may be comparatively straightforward ways of bringing the relevant clients together to discuss findings, but survey research or an evaluation of, say, the provision of home help services, presents more difficulties. Given the time involved in bringing findings to the attention of decision-makers, there may be little energy left to explore ways of letting consumers know about results. However, this may not be the only reason. Managers may be unwilling for research that identifies resource gaps to be brought to the attention of those who may suffer as a result of such gaps.

Implementation of research findings

Space does not allow us to present detailed results on this aspect of the study and we hope to publish a further paper on the subject. Here we shall deal with the generalities of implementation rather than with the detail of changes made as a consequence of research.

In our previous paper (Barnes and Wilson, 1984) we pointed to one of the fundamental differences between academic researchers and in-house research workers. Although we acknowledge the concern and efforts of academics to ensure their research is both relevant and used, for academic researchers in general the goal of dissemination may be publication, so that research findings can become part of accumulated knowledge on a particular topic. They can anticipate that their contribution will form part of a continuing series of investigations into particular problems. In-house researchers cannot assume continuing attention to their issues, nor are they employed primarily to add to the sum of knowledge in a particular field. In broad terms, they are employed to contribute to the process of change and improvement in the organisations in which they work. The main aim of in-house research, therefore, is not dissemination and integration into a body of knowledge but implementation of the consequences of research.

The job titles of researchers working in SSDs and their organisational location suggest that this is often recognised. Whether or not effective systems exist for research findings to make a useful contribution to decision-making about policy and practice is a different matter. Bowl and Fuller (1982) recorded the fact that it is often up to individual researchers to establish their own rules. In some situations researchers must continually renegotiate their role vis-à-vis managerial and operational staff so that a contribution can be made to developments without seeming to usurp managerial responsibilities. In-house researchers, close to operations and policy-making processes, have an advantage over external researchers in this respect. Such a situation may still be problematic; and in departments where priorities relate primarily (and legitimately) to service provision, how research can relate constructively to departmental concerns is not always clearly thought out.

In the previous section it was suggested that active dissemination of research findings shades into a less conscious process of diffusion. Where dissemination becomes part of implementation is similarly unclear, and in some cases may not be recognised or formalised. One project reported was thought to have resulted in changes in the practice of home-care organisers, home helps or area managers, but no one had explicit responsibility for taking decisions about what changes should result. The impact of research in raising consciousness about issues and contributing to a general educational process for staff is one that is very difficult to measure, but its importance was evident in a number of the projects. The following resumé illustrates this:

'In the absence of any clear responsibility for dissemination, the researcher took this on herself. The final report was circulated to team, area and divisional managers and a summary was circulated to IT workers and anyone else who requested it. Then, the researcher became involved in working parties examining IT policy and the results of the survey were fed into these discussions. New IT managers appointed as a result of a departmental re-organisation made use of findings in formulating policy and when a Planning Group was set up the researcher also became involved in this and results of the survey were fed into planning documents...'

This shows how the researcher came to be regarded as an expert on the topic of the research and was called upon to go beyond the formal written or oral dissemination of results to participate in policy-making groups. This point is worthy of note, given the arguments sometimes put forward in favour of external rather than in-house researchers. External researchers may be regarded as experts in the subject they research and, consequently, their findings may be given more weight than those of the in-house researcher who may be regarded as a generalist. The example given above, and others reported to this project, indicate that in-house researchers can make useful contributions to substantive issues beyond the dissemination of results.

In other projects there was a more clear-cut division of responsibilities between research dissemination and implementation. In a project on alarm systems the researcher's responsibility for dissemination was limited to internal circulation of documents and discussion of press releases. The specialist officer for domiciliary services took on the responsibility for explaining the changes which the research indicated and for decisions about implementation.

Coming somewhere between these two examples are occasions when the researcher has no formally-recognised role within the process of implementation but, because of a commitment to the issue developed in the course of the research, acts as a prompter of those with formal responsibilities to ensure that things happen. One project was seen by the researcher as a way of getting the Committee 'off the back' of the department in relation to meals-on-wheels, a low-priority service in the department. Responsibility for implementation was delegated to the Assistant Director (Administration) who had no previous direct involvement in the project. It was reported that implementation depended on the 'continual prompting' of the researcher. This was not typical for the department but is an example of the problems of trying to impose a project from above which has little apparent commitment from other staff. The more typical model in this department was reported to be one in which research was carried out in association with a working group of practitioners committed to the project and able to reject findings or accept and implement them.

In contrast, another project reported had the active involvement of an Assistant Director in undertaking the research, in disseminating findings, and in feeding the results into the policy/planning process. She was responsible for compiling position statements on provision, chairing a working party on the strategy of care for elderly people and for aspects of budgetary planning. Implementation was delayed for two years awaiting the availability of resources, but this was felt to add to, rather than detract from, what was described as the 'enormous' impact of the project. The continuing active support of a senior member of staff with the necessary power to implement changes is obviously an advantage.

So, who did take decisions about the implementation of research findings? Responses reflect those on policy-making systems in the departments. The level at which decision-making is allowed to take place varies considerably and is likely to be a function of the type of decision required in response to the research findings, but the variations also appear to indicate a difference in the extent to which decisions are delegated. Decisions involving capital expenditure have to be made, or at least formally ratified, by committee. Decisions on professional practice, however, are more likely to be made by operational, management or advisory staff. This has implications for how the researcher ought to present conclusions. In some cases one person or one small management group will need to be convinced of the importance of research findings. In others, specialist staff working in one area of social services will have decision-making powers delegated to them, and in yet another situation lay councillors with interests relating primarily to their constituents and their party group will be the most important people to convince.

Procedure for implementing research findings may be even less clear-cut than systems for policy-making, but we have cases of in-house research which have contributed to substantial change. There is a degree of similarity in the topics reported which could suggest unnecessary repetition of very similar research projects in different departments. However, the fact of increased attention to a research issue appears to increase the probability of change. This may not be the case if departments simply try to import research findings from elsewhere. While there may well be occasions when departments should utilise research undertaken elsewhere, a more productive strategy may be to increase the opportunities for collaborative research undertaken in a number of departments simultaneously. The effective management of such projects could provide results that are useful to each department, give a basis for comparison, and provide a sounder base for representations to councils, government or other agencies.

Conclusions

The in-house researcher fills a specialist role in the organisation: one devoted to the discovery of facts about its operations and purposes and to the dissemination of those facts throughout the system and beyond. The significance of this role for a social activity which is based upon knowledge cannot be overestimated. Many organisations outside the social services carry on research which may be of interest to SSDs but other work (Wilson and Streatfield, 1980) has shown that this work is perceived to be of little significance and that very few departments have effective means for the dissemination of such information to their workers.

In such a climate, therefore, the value of an individual or section devoted to the generation and transmission of knowledge on issues agreed as significant to the organisation is obvious. However, if the management of research and the subsequent dissemination of results is thought of as a purely technical task the full benefits will not be gained. The dissemination of information is part of the process of organisational learning and in this process the importance of oral communication in a predominantly oral culture cannot be over-stated. The in-house researcher is uniquely placed to disseminate the findings of research through talks to groups of staff, or to committees, and through training, but the necessary structures must exist to allow this to happen. From the evidence presented above it is clear that insufficient attention is given to these modes of dissemination at present although their importance is recognised.

At present there is a heavy emphasis upon documentary modes of dissemination, with the final report as the principal means. We recognise the need for researchers to bring all aspects of their work together as part of the process of seeking an understanding of the results and of their organisational meaning. The final report may be less effective, however, in reaching others. On the other hand, if more effort is put into non-documentary modes of dissemination, how might this affect the extent to which the results of research are made known to a wider audience? If researchers prepared material for training purposes or for talks to committees and working parties they might have ready-to-hand material which could be transformed into a journal article more easily than is the case with a final report.

Clearly, we are in danger of being accused of putting forward a counsel of perfection in respect of dissemination. We realise, however, that time is a scarce resource and that, very often, crisis drives the research process. The lack of a clear distinction between research and monitoring is part of the problem - when data are lacking for a particular decision the researcher is required to set up the mechanism for collecting them and for producing a report on their meaning. Unless formal monitoring and reporting systems are established to cope with this need the result is ad hoc data collection with little chance of continuity.

The only escape from this situation is to separate research from monitoring, ensure that monitoring systems are established in such a way that they can be operated with a low level of support, and to key research and dissemination to organisational priorities and objectives. To achieve this the researcher must be given access to the decision-making individuals and groups in a department. There is little sense in expecting researchers to adopt a pro-active stance towards organisational problems if they are not in a position to acquire the necessary organisational intelligence available to those with decision-making responsibility.

Our evidence suggests that departments get value from money invested in research. Whether it would always be possible to get the same value from commissioned research done by external consultants is debatable. We would argue that external researchers are unlikely to have:

  • the same degree of commitment to the organisation as the best of in-house researchers;
  • the same access to all staff of the department with the same high degree of informality in contact;
  • the same degree of credibility with staff of the department (and the opportunity to develop that credibility over time);
  • the same availability to management when crucial issues arise at crisis points;
  • the same opportunities to participate in the implementation of change - often over very lengthy periods of time.

Finally, we see implications for training researchers in these results. Researchers come to SSDs with a variety of backgrounds, usually embodying research skills acquired through academic study or research. That background is not all that a researcher needs: it is equally important to understand the need for monitoring and to have the ability to learn how to devise reporting systems for day-to-day management decision-making. It is also necessary to understand the particular culture of SSDs - the impact of professionalism on the social worker's attitude to the job, the impact of geographical dispersion on access to information, the oral character of information transfer in departments, the need to transform research data into training materials, and the need for a variety of modes of dissemination if research is to have an effect upon practice.

Acknowledgements

We would like to thank all members of the SSRG and other members of local authority SSDs who contributed their time and descriptions of cases to this work. We would also like to thank Ric Bowl and Roger Fuller for permission to use and quote from their report.

Note

Dr. Marian Barnes is now Professor of Social Research at the University of Birmingham. Professor Wilson is now Professor Emeritus, University of Sheffield.

References

  • Barnes, M. and Wilson, T.D. (1984). Dissemination of in-house research in LASSDs, Research, Policy and Planning, 2, 19-24.
  • Bowl, R. and Fuller, R. (1982). A study of research in Social Services and Social Work Departments: revised report of the First Stage. Birmingham: University of Birmingham, Social Services Unit. (unpublished report).
  • Siegel, S. (1956). Nonparametric statistics for the behavioral sciences. New York, NY: McGraw-Hill.
  • Smith, G. and Cantley, C. (1984). Pluralistic evaluation. In J. Lishman (Ed.), Evaluation. Aberdeen: University of Aberdeen, Department of Social Work. (Research Highlights, No. 8).
  • Wilson, T.D. and Streatfield, D.R. (1980). You can observe a lot...: a study of information use in Local Authority Social Services Departments. Sheffield: University of Sheffield, Postgraduate School of Librarianship and Information Science.

Originally published in Research Policy and Planning, 4(1), 1986, 19-24