The scheduled five-year review of the TARA should consider new evidence that has emerged since 2017. This includes insights and evidence relating to bushfires, timeframes for climate change risks, and other evidence arising from new or ongoing research, monitoring, cultural knowledge or other formal documentation relating to the marine estate. Future MEMS initiatives, outcomes, management actions and projects should reflect any revised priority threats and risks.
The NSW Government should provide longer-term funding for delivery of the MEMS program. This would allow more efficient planning and implementation, and reduce the risk of not achieving intermediate and long-term outcomes for the marine estate environment and the communities and industries it benefits.
Processes for decision-making, approvals and communication should be reviewed and refined. Decision-making and approvals processes must be made clear, and decisions communicated to staff in a transparent and timely manner. To ensure that collaborative processes add value to the project or outcome, input should only be sought from those with relevant expertise or accountabilities.
Program management reporting requirements and frequency should be reviewed to ensure reporting is fit-for-purpose for staff and decision-makers. MEMS governance groups and decision-makers should communicate their responses to reported risks, issues or recommendations to initiative staff. External reporting of content captured through program management reporting should also be reviewed to ensure the information and its frequency are fit-for-purpose for stakeholders.
Enhance centralised communications and engagement by:
- Ensuring MEMS staff are aware of the existing centralised communications and engagement strategy, and their responsibilities for undertaking coordinated and consistent engagement, where appropriate.
- Providing external audiences with clear and simple summaries of the MEMS scope and the MEMS agencies' roles and responsibilities, to assist in managing stakeholder expectations. This should include clarifying interfaces with other related legislation and agencies.
- Providing consistent messaging on shared and common topics, accompanied by guidance for staff to develop initiative-specific messages and communication where required (including co-branded communications, where appropriate).
Clearly communicate the process and requirements for adaptively managing actions and projects, where needed, to MEMS staff.
Initiative leads, with input from MIMP staff, should review specific outcomes and indicators identified through this evaluation to improve their appropriateness and practicality. Refinements must be consistent with the definitions of outcomes and indicators set out in the Monitoring and Evaluation Framework. Once refinements to outcomes and indicators have been agreed, initiative leads must complete indicator specifications, where required, to guide data collection.
Initiative leads and MIMP staff, with input from MASC and the MIMP Steering Committee (MIMP SC), should review, agree and clearly communicate roles, responsibilities and accountabilities for outcomes and monitoring. In doing so:
- Roles, responsibilities and accountabilities for outcomes, indicators and data collection should be assigned to specific individual roles, in alignment with existing governance and delivery roles across the MEMS and MIMP. Roles and responsibilities for indicators and data collection should be assigned to initiative leads or their delegates.
- Accountabilities for achieving specific outcomes should be allocated to relevant senior executive roles within the MEMA agencies, reflecting alignment with existing areas of responsibility.
- Timeframes for fulfilling these responsibilities should be included.
Initiative leads should budget for MIMP implementation with respect to their initiative. MASC and MIMP SC should ensure funding and resources are allocated to effectively implement the MIMP.
Initiative leads should identify any short-term outcomes that require ongoing effort to maintain achievements to date, or where there is a desire for further improvement. Initiative and project leads should continue to consider these outcomes in planning and delivery during Stage 2.
Initiative leads, with support from MIMP staff, should embed systems or processes within initiatives for collating data against outcomes, indicators and measures. This should reflect agreed roles and responsibilities and data collection frequencies identified in the Monitoring and Evaluation Framework. Where possible, these should align with existing processes for collating and documenting initiative-level data and information.
Periodic reporting on outcomes and indicators should be embedded into existing initiative and program reporting processes, in line with the frequency of data collection in the Monitoring and Evaluation Framework.
Establish an approach for external reporting against outcomes and indicators. The approach should outline format, frequency and content requirements; be targeted to audience needs and interests; and drive accountability. The reporting frequency may be longer than the data collection frequencies specified in the Monitoring and Evaluation Framework.
MASC and the MIMP SC should confirm responsibilities and set timeframes for executing accepted recommendations. They should also allocate responsibilities to agencies and roles where these are not identified in the recommendations.
This is the first of three evaluations planned under the Marine Integrated Monitoring Program (MIMP) to measure the Strategy’s progress. A mid-term evaluation is due to commence in 2023, and the final evaluation is expected to commence in 2027.
Download copies of the report documents using the links below. To request these documents in an alternative format, send an email.