Briefing Paper 16

Observatory PASCAL
Place Management, Social Capital and Learning Regions

PASCAL UNIVERSITIES REGIONAL ENGAGEMENT PROJECT (PURE)
PURE Briefing Paper No. 16

BENCHMARKING UNIVERSITIES IN REGIONS
Some Thoughts Arising from Benchmarking in Melbourne
November 2009
Introduction

From the outset, the benchmarking of universities' engagement in their regional context has been an important part of the methodology adopted for the PASCAL Universities and Regional Engagement (PURE) project. Two dimensions of the benchmarking were proposed:

i.   each university was to benchmark its existing engagement, using an instrument derived from experience in the United Kingdom; and
ii.  regional authorities were asked to rate their progress towards achieving key outcomes in regional development.
There were a number of levels at which the university benchmarking exercises were intended to be useful:

i.   as a means of self-assessment, engaging a variety of stakeholders in various parts of each university to reflect about engagement, leading to new arrangements and initiatives within universities;
ii.  as a means of identifying relative strengths and weaknesses across the universities in the Melbourne region;
iii. as a means of identifying opportunities for a university to draw on another's experience, where one appears to have performed relatively well; and
iv.  as a means of identifying relative strengths and weaknesses across the universities in the different regions involved in the PURE project.
Similarly, the regional benchmarking instrument was intended to:

i.   enable key regional stakeholders in Melbourne to be reflective about the region's performance in key areas of strategic and policy priority;
ii.  map perceptions by key stakeholders of performance against regional priorities, and commence dialogue where differences arise;
iii. assist universities to see where Melbourne's regional stakeholders identify weaknesses in progress towards their key aspirations, and identify areas where universities might focus efforts to assist regional development; and
iv.  provide a foundation for dialogue about regional aspirations across the PURE regions.
Both instruments are designed as learning resources, to facilitate reflection and discussion both within institutions and comparatively about their current aspirations and current arrangements, and how they might be enhanced. In this sense, the conversations are more important than the ratings themselves.
PURE BP No. 16    http://www.obs-pascal.com/    Page | 1
The benchmarking exercise was anticipated to be conducted twice: once at the outset of the project, and a second run, about 12-15 months later, once further project work had been undertaken. The work within the first phase of Melbourne PURE has reached a point where it supports a useful discussion about both the benchmarking instruments themselves and the implications of the data for the PURE project itself in Melbourne. There has not yet been any opportunity to compare the local benchmarking data with other regions.

Benchmarking Instruments

Two discrete instruments were provided by PASCAL. The universities' instrument was prepared by Professor David Charles at the University of Newcastle upon Tyne, following earlier work that he and Paul Benneworth had done for the Higher Education Funding Council for England. Copyright of the instrument is retained at the University of Newcastle upon Tyne.

University Benchmarking

The instrument focuses on a higher education institution's (HEI) contribution to various aspects of regional development. It is organised around a series of 'practice' (processes) and 'performance' (past achievements) indicators and sub-indicators, with a rationale provided for each indicator. The instrument also includes a brief account of 'good practice' and seeks the university's ranking of itself on a scale of 1-5. Each sub-indicator has a few words which attempt to indicate the circumstances which would warrant a 1, 3 or 5 rating. For example, for the 'practice' sub-indicator 'University participation in provision of public transport or other services', the following advice is provided:

Level 1: No support or investment from the university. Complete reliance on the public or private sector to provide services used by staff and students, or else services are restricted to university users only.

Level 3: University gets involved in the provision of services and tacitly allows the community to make use of services.

Level 5: University engages in a strategic dialogue with the local community over the demand and provision of services and takes community demands into account in the planning of university investment and provision.
Overall, there are eight key indicators, seven of which are derived from a theory of regional competitiveness. The eighth relates specifically to the engagement processes within universities themselves: 1) Enhancing regional infrastructure – supporting the regional infrastructure, regulatory frameworks and underlying quality of environment and lifestyles. This includes the HEI helping the region to identify where improvements can be made, or providing direct input to the quality of the local environment.
2) Human capital development processes – supporting the development of human capital through education and training both within the HEI and in other organisations. The emphasis here is on how the HEI adds to the stock of human capital by facilitating the development of people in the region, and retains both local and non-local graduates. (The education of people from outside the region who then leave it does not add to the stock of human capital in the region, and therefore is not relevant for this process. However it may be important at national level, and it does add to regional GDP.)

3) Business development processes – the creation and attraction of new firms, as well as support for developing new products, processes and markets for existing firms.

4) Interactive learning and social capital development processes – encouraging co-operation between firms and other institutions to generate technological, commercial and social benefits. Regional collaboration and learning between organisations are important in regional success. HEIs can promote the application of knowledge through regional partnerships, and encourage networking and the building of trust.

5) Community development processes – ensuring that the benefits of enhanced business competitiveness are widely shared within the community, and that the health and welfare of the population are maximised.

6) Cultural development – the creation, enhancement and reproduction of regional cultures, underpinning the other processes above, and interpreting culture both as activities that enrich the quality of life and as patterns of social conventions, norms and values that constitute regional identities.
7) Promoting sustainability – long-term regional development must be underpinned by processes seeking to improve sustainability, even though some of these objectives may appear to conflict with business development objectives (from the Introduction to the instrument, Benchmarking the Regional Contribution of Universities).

Regional Benchmarking

The second instrument was designed to support thinking amongst regional authorities/stakeholders that would complement the benchmarking by universities. It is a new instrument, developed by Professor Charles specifically for use in PURE, and hence its initial use is very much in the spirit of 'road-testing' its value. It is intended as a means of gaining an overview of the strengths and weaknesses of the region's development, and of identifying the challenges facing a region, so as to help a university that wanted to have an impact in its region to focus its efforts. This was differentiated, in planning PURE, from an exercise in which the regional authorities would assess the current contribution of the universities. The instrument acknowledges the rather extensive list of quantitative indicators used by the European Union, national governments and the OECD to compare regional performance. However, it also includes a qualitative component which proposes a similar process to that offered to universities. It sets out a series of indicators and sub-indicators, with options for ranking the perceived status of regional development.
The groups of indicators are:

1) Understanding the region;
2) Framework (or 'infrastructure') conditions;
3) Human capital development;
4) Business development processes;
5) Interactive learning and social capital;
6) Cultural development; and
7) Sustainability.
Under each of these headings, there is a set of sub-indicators with options for coding responses from 1-5. For example, for sub-indicator 7 in the 'Framework Conditions' section, 'Effectiveness of regional strategic planning', the following options are offered:

Code 1: Non-existent.

Code 2: Emerging regional planning framework – elements in place but poor integration.

Code 3: Regional strategic planning framework, but static and unresponsive to competitiveness agenda.

Code 4: Planning framework is responsive to competitiveness strategy but tends to be reactive.

Code 5: Planning is integral to competitiveness framework, and interactive.
To date, four regional stakeholders have been invited to complete the instrument.

Universities' Benchmarking

All of the Victorian universities have now completed the ratings. All bar one of these universities have their base in metropolitan Melbourne. Three also have a strong provincial presence, while two have some rural activity. All have multiple campuses. Each university has completed the ratings against the indicators as a single institution, meaning that in some cases judgements will have been made about the overall balance of activity across quite different local settings.

Each university adopted a different process for completing the instrument, adapted to its circumstances and capacity at the time. The universities undertook the ratings at different times over a period of several months, meaning that some have had longer to reflect on their circumstances than others.

In the early stages, there was considerable concern about the instrument. It is a substantial document (80 pages), and a thorough implementation of the process could be a very substantial exercise. However, as it came to be recognised principally as a resource which universities can use for their own benefit, the universities approached the task to suit their own circumstances, gaining benefit from the process used as well as completing the actual ratings.

There was some feedback that the instrument itself needed refinement. It still reflects its British origins, and some parts seemed not to be relevant to some of the universities. Amongst the Melbourne universities, some obviously have a strong sciences research focus, whereas others emphasise community development. Detailed feedback on these matters was given to David Charles and the PURE coordinating team.

As the Introduction to the instrument indicates, it is not expected that all universities would rate equally across all indicators. This is very much a matter of individual universities' own strategic priorities, and how they can learn from their ratings of current practice and performance. After the initial concern at the scale of the exercise, the response to the instrument has been generally favourable. Interesting diversity has appeared, and some institutions have already adapted their internal processes and priorities in consequence of the learning from the initial processes.

The next stage of analysis leads us to consider whether one or more institutions consider themselves to be very strong in relation to one or more of the key dimensions evaluated through the instrument, such that other institutions might learn from them should they consider this to be important to their strategic priorities. Secondly, the data, viewed as a whole, enables the OKC and the universities together to consider whether there is an aspect of regional development where there is a serious gap in university engagement.

Universities' Benchmarking Results

1) The first group of indicators is 'Enhancing regional infrastructure', which encompasses the following sub-indicators:

Benchmark 1.1  Engagement in regional infrastructure planning and assessment
Benchmark 1.2  Using university demand as lever to upgrade infrastructure
Benchmark 1.3  Investment in a high quality campus
Benchmark 1.4  University involvement in multi-partner knowledge precincts
Benchmark 1.5  University participation in provision of public transport or other services
Benchmark 1.6  University provision of core public services

The mean of the eight responses was 3.3, with a range from 2.5 to 3.8.
2) The second group of indicators is 'Human capital development processes', which encompasses the following sub-indicators:

Benchmark 2.1  Access for students from disadvantaged groups
Benchmark 2.2  Retention of graduates in the region
Benchmark 2.3  Involvement in regional skills strategies
Benchmark 2.4  Responsiveness to regional labour market demands
Benchmark 2.5  Involvement of employers in developing the curriculum
Benchmark 2.6  Course provision for employers and employees
Benchmark 2.7  Supportive relationships with local schools
Benchmark 2.8  Tailored training programmes for local policy organisations

The mean of the eight responses was 3.7, with a range from 3.4 to 4.0.
3) The third group of indicators is 'Business development processes', which encompasses the following sub-indicators:

Benchmark 3.1  Strategic plan for business support
Benchmark 3.2  Creation of spin-off firms
Benchmark 3.3  Engagement in investment attraction
Benchmark 3.4  Promoting graduate entrepreneurship
Benchmark 3.5  Graduate start-ups arising from university programmes
Benchmark 3.6  Availability of entrepreneurship modules
Benchmark 3.7  Student placements with local employers
Benchmark 3.8  Incentives for staff to engage with business

The mean of the eight responses was 3.0, with a range from 1.6 (the next lowest was 2.8) to 3.8.

4) The fourth group of indicators is 'Interactive learning and social capital development processes', which encompasses the following sub-indicators:

Benchmark 4.1  Involvement in regional governance
Benchmark 4.2  Contribution to regional economic analysis
Benchmark 4.3  Analysis of regional futures
Benchmark 4.4  Staff exchanges
Benchmark 4.5  Participation in learning region strategies
Benchmark 4.6  Hosting policy seminars and workshops with local partners
Benchmark 4.7  Connecting regional partners to international networks
Benchmark 4.8  Supporting collective leadership of regional learning culture

The mean of the eight responses was 3.4, with a range from 2.6 to 3.9.
5) The fifth group of indicators is 'Community development processes', which encompasses the following sub-indicators:

Benchmark 5.1  Contributing to healthy cities and health promotion
Benchmark 5.2  Support for community-based regeneration
Benchmark 5.3  Student community action
Benchmark 5.4  Opening up university facilities to the community
Benchmark 5.5  Organising and hosting events and festivals for the community
Benchmark 5.6  Co-production of community-relevant research with community partners
Benchmark 5.7  Supporting community and social development through the curriculum
Benchmark 5.8  Leading debates around the university/society compact

The mean of the eight responses was 3.7, with a range from 3.2 to 4.3.
6) The sixth group of indicators is 'Cultural development', which encompasses the following sub-indicators:

Benchmark 6.1  Cultural strategy
Benchmark 6.2  Provision of cultural facilities
Benchmark 6.3  Impact on local tourism
Benchmark 6.4  Levels of participation by the community
Benchmark 6.5  Fostering regional cultural identities
Benchmark 6.6  University spin-offs to the cultural sector

The mean of the eight responses was 3.0, with a range from 1.7 to 3.7.

7) The seventh group of indicators is 'Promoting sustainability', which encompasses the following sub-indicators:

Benchmark 7.1  Universities leading societal responses to the challenges of sustainability
Benchmark 7.2  Sustainability at the heart of university governance
Benchmark 7.3  Universities managing research to focus on core societal challenges
Benchmark 7.4  Universities creating new models for sustainable societies
Benchmark 7.5  Promoting sustainability through the curriculum
Benchmark 7.6  Promoting education for sustainable development
Benchmark 7.7  Performance against environmental management systems

The mean of the eight responses was 3.2, with a range from 1.7 to 4.1.

8) The eighth group of indicators is 'Promoting engagement within the university', which encompasses the following sub-indicators:

Benchmark 8.1  Engagement embedded in university vision and mission
Benchmark 8.2  Strategic plan for engagement
Benchmark 8.3  Developing staff skills for engagement
Benchmark 8.4  Rewarding and valuing engagement
Benchmark 8.5  Resources for engagement
Benchmark 8.6  Community involvement in governance of the university

The mean of the eight responses was 3.8, with a range from 3.0 to 4.7.
Overall, the mean ratings ranged from 3.0 to 3.8. The strongest ratings were given to the work of promoting engagement within the universities, and lesser scores to the other dimensions of contribution to regional development.
The relative ratings were:

- Promoting engagement within the university (3.8);
- Human capital development processes (3.7);
- Community development processes (3.7);
- Interactive learning and social capital development (3.4);
- Enhancing regional infrastructure (3.3);
- Promoting sustainability (3.2);
- Business development processes (3.0); and
- Cultural development (3.0).
Whilst allowing for all of the difficulties which the universities had in completing the instrument, and the differences in perception that might follow, this is still an interesting set of data (assuming that the relativities within a university's own ratings of itself are valid). It indicates, not surprisingly, the relative priority placed on learning and teaching and the provision of graduates, and that decisions about these priorities are framed with an eye to regional needs. It suggests, on the other hand, that relatively limited priority is given to issues of regional infrastructure, business development and cultural development. This raises questions about whether these are low priorities for the universities, or whether they are areas in which their engagement and performance are relatively poor. Ironically, more attention is given to processes within the university to promote engagement than appears to be delivered in actual engagement activities; perhaps this is a function of timing.

Secondly, for most groups of indicators there is considerable diversity between the highest and the lowest ratings which universities have given. This means that where an area is of high priority to a particular university, yet it has rated itself at the lower end, there are other universities from which it might seek some understanding of the processes and policies they have implemented. The diversity is even greater if the range of responses on each sub-indicator is considered. While this level of analysis is probably more detailed than is warranted by the reliability of the data, a review of the Attachment to this report indicates that there is a gap of at least three points in the ratings (1-4, 2-5) on half of the sub-indicators. It has been suggested that it might be at this level that good practice or lessons to be learnt are captured.
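The screening described here — ranking indicator groups by mean and flagging sub-indicators where the spread across institutions is three points or more — is simple arithmetic on the ratings matrix. A minimal sketch follows; the ratings shown are purely illustrative, not the actual Melbourne responses, and the three-point threshold is the one suggested in the text.

```python
# Hypothetical ratings: {sub-indicator: ratings given by each university}.
# These figures are invented for illustration only.
ratings = {
    "2.7 Supportive relationships with local schools": [3.0, 4.0, 4.0, 3.5],
    "3.5 Graduate start-ups from university programmes": [1.0, 2.0, 3.5, 2.0],
    "1.5 Participation in public transport provision": [1.0, 4.0, 2.0, 3.0],
}

def summarise(scores):
    """Return the mean rating and the spread (highest minus lowest)."""
    mean = sum(scores) / len(scores)
    spread = max(scores) - min(scores)
    return round(mean, 1), spread

# Sub-indicators with a spread of three points or more are candidates for
# peer learning: a low-rated university can approach a high-rated one.
for name, scores in ratings.items():
    mean, spread = summarise(scores)
    note = "  <- candidate for peer learning" if spread >= 3 else ""
    print(f"{name}: mean {mean}, spread {spread}{note}")
```

The point of the sketch is only that the 'lessons to be learnt' signal lives in the spread, not the mean: two sub-indicators can share a mean while one shows consensus and the other a wide gap.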
For example, none of the universities rate themselves particularly highly in terms of graduate start-ups, but nearly all rank themselves highly in terms of positive relationships with local schools.

Regional Benchmarking Results

Similarly, the draft Regional Benchmarking instrument has been tested with DIIRD (for Victoria), the City of Melbourne, the Committee for Melbourne, and the Victorian Employers' Chamber of Commerce and Industry (VECCI). While DIIRD was able to provide quantitative data, the qualitative process raised questions about how the Government could participate in this kind of process, given the complexity of government and the pressures it faces. Detailed ratings were received from both the City of Melbourne and the Committee for Melbourne, which indicated that the process has been useful in generating some fresh thinking. While issues of confidentiality have been important (as they were for the universities), the instrument has proved relatively straightforward (perhaps more so at the city level), provided the people with the appropriate expertise can be brought into the process. A third response is awaited from an employer organisation.
From the two sets of ratings received to date, the following summary provides an interesting picture.

1) The first group of indicators was 'Understanding the region':

- Is there a clear understanding of the boundaries of the region and a sense of regional identity?
- Is there a regional partnership that exerts leadership and creates vision, and does this include wide representation of social partners?
- Is there a clear vision of the regional strategy, aims and objectives?
- Foresight and scenario planning
- Economic research capacity
- Consultation on regional priorities

The mean of the two responses was 3.5, with the individual responses being 2.7 and 4.3.

2) The second group of indicators was 'Framework conditions (infrastructure)':

- Landscape quality
- Public transport quality and extent
- Connectedness – air
- Connectedness – road
- Connectedness – rail
- Freight handling facilities
- Effectiveness of regional strategic planning
- Integration of economic, land-use and transport planning

The mean of the two responses was 3.2, with the individual responses being 2.9 and 3.5.

3) The third group of indicators was 'Human capital development':

- Strategy for enhancement of skills level in the workforce
- In-migration and attractiveness of region
- Quality of vocational training for young people not in permanent education
- Graduate retention
- Scale and social inclusiveness of higher education

The mean of the two responses was 4.3, with the individual responses being 4.2 and 4.4.

4) The fourth group of indicators was 'Business development processes':

- Regional cluster strategies
- Success of regional clusters
- Provision of finance for existing firms
- Coherence of regional business support partnership
- Quality of management in the region
- Export orientation of firms in the region
- Existence of a comprehensive support network for start-ups
- Culture of acceptance of business failure
- Quality of foreign direct investment – high value added and services
- Fit of FDI with existing regional clusters
- Change in level of FDI in recent years

The mean of the two responses was 3.6, with the individual responses being 3.7 and 3.5.
5) The fifth group of indicators was 'Interactive learning and social capital':

- Commitment to a learning region strategy
- General culture of trust
- Association formation and activity
- Business involvement in social responsibility
- Workplace democracy
The mean of the two responses was 3.9, with the individual responses being 3.5 and 4.2.

6) The sixth group of indicators was 'Cultural development':

- Cultural assets (museums, galleries, theatres, etc.)
- Distinctive regional cultural offering
- Support for grassroots arts activities

The mean of the two responses was 4.9, with the individual responses being 4.8 and 5.0.

7) The seventh group of indicators was 'Sustainability':

- Recycling activities
- Energy and water use in the home
- Change in biodiversity
- Environmental engagement of companies
- Recycling of industrial and commercial waste
- Business involvement in community environmental enhancement
- Level of derelict land – rate of change

The mean of the two responses was 3.5, with the individual responses being 3.8 and 3.3.

Overall, the mean ratings ranged from 3.2 to 4.9. On a couple of indicators, the ratings diverged significantly, raising interesting questions about the specific criteria which each group was using to determine current performance. The strongest ratings were given to cultural development, whilst 'Framework conditions' received the lowest rating. The relative ratings were:

- Cultural development (4.9);
- Human capital development processes (4.3);
- Interactive learning and social capital development (3.9);
- Business development processes (3.6);
- Understanding the region (3.5);
- Promoting sustainability (3.5); and
- Framework conditions (3.2).
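With only two respondents per group, 'significant divergence' reduces to the gap between the two ratings. As an illustrative sketch using the group-level figures reported above, one might flag the groups whose responses differ by more than some threshold (the one-point cut-off here is an assumption, not part of the instrument) as the places where a conversation about rating criteria is most needed:

```python
# Group-level ratings from the two detailed regional responses reported
# above, as (response_a, response_b) pairs.
responses = {
    "Understanding the region": (2.7, 4.3),
    "Framework conditions": (2.9, 3.5),
    "Human capital development": (4.2, 4.4),
    "Business development processes": (3.7, 3.5),
    "Interactive learning and social capital": (3.5, 4.2),
    "Cultural development": (4.8, 5.0),
    "Sustainability": (3.8, 3.3),
}

def divergent(pairs, threshold=1.0):
    """Groups where the two raters differ by more than `threshold` -
    the places where dialogue about criteria is most needed."""
    return [g for g, (a, b) in pairs.items() if abs(a - b) > threshold]

print(divergent(responses))  # only the 1.6-point gap on 'Understanding the region'
```

Lowering the threshold widens the conversation: at 0.5 points, 'Framework conditions' and 'Interactive learning and social capital' would also be flagged, which matches the document's observation that a couple of indicators diverged noticeably.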
On the face of these ratings, the contrast between the regional stakeholders and the universities in rating 'cultural development' makes sense; this is an area where significant contribution from the universities is not required. Interestingly, many of the universities do have significant cultural facilities of one kind or another, but it might be that their purpose and orientation is more towards their own students than to community benefit. This is a good example of the way in which the instrument can prompt further reflection and learning. On the other hand, the very positive rating given by both to human capital development might reflect a shared judgement about the priority which this has for regional development. According to this approach, both 'framework conditions' (infrastructure) and sustainability are areas where there is considerable scope for a more strategic and collaborative approach to the achievement of regional priorities.

The initial feedback has been that the instrument is useful, both in prompting reflection and in highlighting areas for further attention. It is at a relatively simple stage of development, and will benefit from greater clarity about the distinctions amongst the various options.
UNIVERSITIES BENCHMARKING SUMMARY

Number  Field                                                       Mean  Std Dev  Lowest  Highest
1       Enhancing regional infrastructure
1.1     Engagement in planning & assessment                          3.0     1.4     2.0     4.0
1.2     Using university demand as a lever                           3.0     1.4     2.0     4.0
1.3     Investment in high quality campus                            4.0     1.4     3.0     5.0
1.4     University involvement in precincts                          3.5     2.1     2.0     5.0
1.5     University participation in public transport etc             2.5     2.1     1.0     4.0
1.6     University provision of core public services                 3.5     2.1     2.0     5.0
        Mean                                                         3.3     1.8     2.0     4.5
2       Human capital development processes
2.1     Access for students from disadvantaged groups                3.5     2.1     2.0     5.0
2.2     Retention of graduates in region                             4.0     1.4     3.0     5.0
2.3     Involvement in regional skills strategies                    4.0     1.4     3.0     5.0
2.4     Responsiveness to regional labour market demands             4.0     1.4     3.0     5.0
2.5     Involvement of employers in developing curriculum            4.0     1.4     3.0     5.0
2.6     Course provision for employers and employees                 3.5     2.1     2.0     5.0
2.7     Supportive relationships with local schools                  3.5     0.7     3.0     4.0
2.8     Tailored training programs for local policy organizations    3.5     2.1     2.0     5.0
        Mean                                                         3.8     1.6     2.6     4.8
3       Business development processes
3.1     Strategic plan for business support                          2.5     2.1     1.0     4.0
3.2     Creation of spin-off firms                                   3.0     2.8     1.0     5.0
3.3     Engagement in investment attraction                          3.0     2.8     1.0     5.0
3.4     Promoting graduate entrepreneurship                          2.5     2.1     1.0     4.0
3.5     Graduate start-ups from university programs                  2.3     1.8     1.0     3.5
3.6     Availability of entrepreneurship modules                     2.5     2.1     1.0     4.0
3.7     Student placements with local employers                      4.0     1.4     3.0     5.0
3.8     Incentives for staff to engage with business                 3.5     2.1     2.0     5.0
        Mean                                                         2.9     2.1     1.4     4.3
4       Interactive learning and social capital development
4.1     Involvement in regional governance                           3.5     0.7     3.0     4.0
4.2     Contribution to regional economic analysis                   3.5     2.1     2.0     5.0
4.3     Analysis of regional futures                                 3.5     2.1     2.0     5.0
4.4     Staff exchanges                                              3.5     2.1     2.0     5.0
4.5     Participation in learning region strategies                  3.0     1.4     2.0     4.0
4.6     Hosting policy seminars & workshops with local partners      4.0     1.4     3.0     5.0
4.7     Connecting regional partners to international networks       3.5     0.7     3.0     4.0
4.8     Supporting collective leadership of regional learning        3.5     2.1     2.0     5.0
        Mean                                                         3.5     1.5     2.4     4.5
5       Community development processes
5.1     Contributing to healthy cities and health promotion          4.0     1.4     3.0     5.0
5.2     Support for community-based regeneration                     2.8     2.5     1.0     4.5
5.3     Student community action                                     3.0     2.8     1.0     5.0
5.4     Opening up university facilities to the community            3.8     1.8     2.5     5.0
5.5     Organising and hosting events and festivals                  4.0     1.4     3.0     5.0
5.6     Co-production of community relevant research                 4.0     1.4     3.0     5.0
5.7     Supporting community & social development                    3.0     1.4     2.0     4.0
5.8     Leading debates around university/society compact            4.0     1.4     3.0     5.0
        Mean                                                         3.6     1.8     2.3     4.8
6       Cultural development
6.1     Cultural strategy                                            3.0     1.4     2.0     4.0
6.2     Provision of cultural facilities                             3.5     2.1     2.0     5.0
6.3     Impact on local tourism                                      2.5     2.1     1.0     4.0
6.4     Levels of participation by the community                     3.3     2.5     1.5     5.0
6.5     Fostering regional cultural identities                       3.0     1.4     2.0     4.0
6.6     University spin-offs to the cultural sector                  3.0     2.8     1.0     5.0
        Mean                                                         3.0     2.1     1.5     4.5
7       Promoting sustainability
7.1     Leading societal responses to challenges                     3.5     2.1     2.0     5.0
7.2     Sustainability at heart of university governance             3.0     1.4     2.0     4.0
7.3     Managing research to focus on core societal challenges       3.0     2.8     1.0     5.0
7.4     Creating new models for sustainable societies                2.5     2.1     1.0     4.0
7.5     Promoting sustainability through curriculum                  2.5     2.1     1.0     4.0
7.6     Promoting education for sustainable development              2.8     2.5     1.0     4.5
7.7     Performance against environmental management systems         3.0     1.4     2.0     4.0
        Mean                                                         2.9     2.1     1.5     4.4
8       Promoting engagement within the university
8.1     Engagement embedded in university vision and mission         4.0     1.4     3.0     5.0
8.2     Strategic plan for engagement                                4.0     1.4     3.0     5.0
8.3     Developing staff skills for engagement                       3.5     0.7     3.0     4.0
8.4     Rewarding and valuing engagement                             3.5     2.1     2.0     5.0
8.5     Resources for engagement                                     3.5     2.1     2.0     5.0
8.6     Community involvement in governance of the university        4.0     1.4     3.0     5.0
        Mean                                                         3.8     1.5     2.7     4.8
Published under a Creative Commons License By attribution, non-commercial, non-derivative