Research & Insights — DVSupport.Network
Domestic violence research summaries, academic partnerships, and data insights for institutions and government agencies.
Purpose and Scope
This page outlines shared research themes, collaboration options with academic partners, approaches to non-sensitive open data, and pathways for organizations to participate in research and insights activities across the domestic violence response ecosystem.
Core Research Themes
Research and insights activities generally focus on system-level patterns rather than individual cases. Common themes include:
- Service Utilization Trends
  - Volumes and characteristics of referrals across agencies
  - Patterns in shelter requests versus available capacity
  - Use of legal advocacy, civil protection order support, and related services
  - Cross-system touchpoints (e.g., social services, health, justice, housing)
- Service Gaps and Mismatches
  - Geographic areas with limited or no access to specific services
  - Waitlists, declined referrals, and unfilled program slots
  - Eligibility criteria that may unintentionally exclude potential clients
  - Under-resourced specializations (e.g., language access, disability-related supports)
- Access, Navigation, and Coordination
  - Points in the referral pathway where requests are delayed or discontinued
  - Cross-agency communication patterns and bottlenecks
  - Impact of coordination tools, shared protocols, or warm handoff models
- Capacity, Workforce, and Infrastructure
  - Staffing and training needs related to coordinated response
  - Technology readiness and interoperability across organizations
  - Administrative burdens connected to reporting and compliance
- Equity and Access Disparities
  - Service use by demographic groups, where lawfully collected and appropriately de-identified
  - Language and cultural access gaps at system level
  - Structural barriers to coordination for smaller or under-resourced organizations
Collaboration with Universities and Research Institutions
Partnerships with universities and research institutions can strengthen the rigor and utility of system-level insights. Common collaboration models include:
- Data Partnership Agreements
  - Structured MOUs specifying data types, de-identification standards, storage expectations, and retention periods
  - Role definitions for data stewards, principal investigators, and agency liaisons
  - Pre-approved categories of analysis and reporting use cases
- Evaluation of Coordinated Programs
  - Independent evaluation of coordinated response models, pilots, or network-wide initiatives
  - Mixed-method approaches to examine implementation fidelity and operational outcomes
  - Feedback loops to adjust protocols or partnership models based on findings
- Graduate and Practicum Projects
  - Placement of graduate students to support data cleaning, analysis, and visualization
  - Capstone projects that map service gaps or test coordination models
  - Joint supervision arrangements between agencies and academic advisors
- Joint Funding Applications
  - Co-developed proposals for research grants focused on coordinated service delivery
  - Shared workplans with clear deliverables and decision-making structures
  - Agreed data access pathways for funder-required reporting
Considerations for Academic Partnerships
When establishing research collaborations with universities, organizations often:
- Define governance structures (steering committees, data access committees, or advisory groups)
- Clarify ownership and usage rights for data, tools, and resulting publications
- Align research timelines with operational and funding cycles
- Plan for knowledge translation, such as briefings, dashboards, and practitioner-oriented summaries
Open Data Insights (Non-Sensitive)
Open data insights focus on high-level, non-sensitive, and de-identified information that can inform policy, planning, and system design. These materials are not intended to contain personally identifiable information or detailed incident descriptions.
Typical Open Data Domains
- Aggregate Service Volumes
  - Counts of referrals, intakes, and program enrollments by time period
  - Aggregate shelter occupancy and turn-away figures
  - Numbers of coordination meetings, case conferences, or joint activities
- System-Level Capacity Indicators
  - Number and type of programs active in a region
  - Availability of specific modalities (e.g., legal clinics, advocacy, housing navigation)
  - Agency participation rates in coordinated initiatives
- Network and Referral Patterns
  - Aggregate flows between service types (e.g., helpline to shelter, shelter to legal supports); see the sketch after this list
  - Typical referral pathways and cross-system touchpoints
  - High-level mapping of coordination clusters, without identifying individual agencies that have opted out
- Public-Facing Dashboards and Briefs
  - Regional fact sheets on capacity, coordination structures, and service gaps
  - Non-technical summaries of key trends for policy and planning audiences
  - Visualizations that show change over time without exposing sensitive details
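As one illustration of the aggregate flow counts mentioned above, the following sketch rolls individual referral records up into quarterly origin-to-destination counts before any publication step. The column names (`referral_date`, `from_service`, `to_service`) are placeholders for whatever the network's shared data dictionary defines, not an agreed standard.

```python
# Minimal sketch: aggregating referral flows between service types by quarter.
# Column names (referral_date, from_service, to_service) are illustrative and
# would be replaced by the fields defined in the network's data dictionary.
import pandas as pd

def aggregate_referral_flows(referrals: pd.DataFrame) -> pd.DataFrame:
    """Count referral flows (origin service -> destination service) per quarter."""
    df = referrals.copy()
    df["quarter"] = pd.to_datetime(df["referral_date"]).dt.to_period("Q").astype(str)
    flows = (
        df.groupby(["quarter", "from_service", "to_service"])
          .size()
          .reset_index(name="referral_count")
    )
    return flows

# Example usage with synthetic records only (no real client data):
example = pd.DataFrame({
    "referral_date": ["2024-01-15", "2024-02-03", "2024-02-20", "2024-04-11"],
    "from_service": ["helpline", "helpline", "shelter", "helpline"],
    "to_service": ["shelter", "shelter", "legal_support", "legal_support"],
})
print(aggregate_referral_flows(example))
```

Because only counts by quarter and service type leave the aggregation step, individual records never appear in the published output; the resulting table would still pass through the disclosure-control checks described in the next subsection.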
Data Protection and Thresholds
To minimize risk of identification or inference, open data practices often include:
- Use of minimum cell size thresholds for published tables and charts (illustrated in the sketch after this list)
- Aggregation across time or geography when working with small counts
- Suppression, rounding, or banding approaches for rare combinations of variables
- Removal of direct identifiers, as well as obvious indirect identifiers
- Internal review processes before public release of analyses or dashboards
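A minimal sketch of the minimum cell size and suppression practices listed above, assuming a threshold of five and a simple aggregate table. The threshold value, suppression marker, and column names are assumptions for illustration; actual values would follow the network's disclosure-control policy.

```python
# Minimal sketch: applying a minimum cell size threshold before publication.
# The threshold (5), the "<5" marker, and the column names are assumptions;
# real thresholds are set by the network's disclosure-control policy.
import pandas as pd

MIN_CELL_SIZE = 5  # counts below this are suppressed rather than published

def suppress_small_cells(table: pd.DataFrame, count_col: str = "referral_count") -> pd.DataFrame:
    """Replace counts below the minimum cell size with a suppression marker."""
    published = table.copy()
    small = published[count_col] < MIN_CELL_SIZE
    published[count_col] = published[count_col].astype(object)
    published.loc[small, count_col] = "<5"  # banded/suppressed value
    return published

# Example: a quarterly aggregate table with one small cell
aggregate = pd.DataFrame({
    "quarter": ["2024Q1", "2024Q1", "2024Q2"],
    "service_type": ["shelter", "legal_support", "shelter"],
    "referral_count": [42, 3, 57],
})
print(suppress_small_cells(aggregate))
```

Where small counts recur, aggregating across time periods or geography (as noted above) is often preferable to suppressing many cells, since heavily suppressed tables lose most of their planning value.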
Participation Options for Organizations
Organizations can engage with research and insights activities through multiple pathways, depending on capacity, governance, and priorities.
1. Contributing De-Identified Operational Data
Many networks use structured mechanisms for agencies to contribute non-sensitive, aggregate, or de-identified data. Typical steps include:
- Agreeing on common data definitions and metrics
- Establishing a point of contact for data coordination within each organization
- Setting a reasonable data submission schedule (e.g., quarterly or semi-annual)
- Using standard templates or secure transfer channels, consistent with local expectations (a template-check sketch follows this list)
- Receiving feedback or data quality reports to support internal improvements
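The sketch below shows one way an agency's data coordinator might check an aggregate quarterly file against an agreed template before transfer. The field list, file format, and checks are hypothetical examples of "common data definitions", not a published standard.

```python
# Minimal sketch: checking an aggregate quarterly submission against a shared
# template before transfer. Field names and checks are hypothetical examples
# of common data definitions, not a published standard.
import csv

EXPECTED_FIELDS = ["agency_id", "quarter", "service_type", "referral_count"]

def validate_submission(path: str) -> list[str]:
    """Return a list of data-quality issues found in a CSV submission."""
    issues = []
    with open(path, newline="", encoding="utf-8") as handle:
        reader = csv.DictReader(handle)
        if reader.fieldnames != EXPECTED_FIELDS:
            issues.append(f"Unexpected columns: {reader.fieldnames}")
            return issues
        for line_no, row in enumerate(reader, start=2):
            if not (row["referral_count"] or "").isdigit():
                issues.append(f"Line {line_no}: referral_count is not a whole number")
            if not row["quarter"]:
                issues.append(f"Line {line_no}: missing quarter")
    return issues

# Example usage (file name is illustrative):
# problems = validate_submission("q1_submission.csv")
# print(problems or "Submission matches the shared template.")
```

Running a check like this locally, before anything leaves the agency, keeps data-quality feedback loops short and ensures that only aggregate, template-conformant figures are transferred.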
2. Participating in Research Governance
Organizations can help shape research agendas and guardrails by:
- Joining steering or advisory groups for research and data initiatives
- Contributing to the development of shared research priorities and protocols
- Reviewing proposed analyses for alignment with partner expectations
- Co-developing criteria for public dissemination of findings
3. Co-Designing Research Questions
Frontline, administrative, and leadership teams often identify operational questions that benefit from systematic analysis. Examples include:
- Understanding drivers of waitlists for specific service types
- Evaluating the impact of new referral pathways or intake processes (see the sketch after this list)
- Assessing coordination outcomes of shared case conferencing models
- Analyzing barriers faced by smaller or rural organizations in multi-agency initiatives
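As a sketch of how one of these questions might translate into a simple analysis, the example below compares median referral-to-intake intervals before and after a hypothetical pathway change. The column names, dates, and cut-over point are assumptions for illustration, not actual programme data.

```python
# Minimal sketch: comparing median days from referral to intake before and
# after a new referral pathway was introduced. Dates, column names, and the
# cut-over date are illustrative assumptions, not actual programme data.
import pandas as pd

PATHWAY_CHANGE_DATE = pd.Timestamp("2024-07-01")  # assumed go-live date

def median_days_to_intake(records: pd.DataFrame) -> pd.Series:
    """Median referral-to-intake interval (days) before vs. after the pathway change."""
    df = records.copy()
    df["referral_date"] = pd.to_datetime(df["referral_date"])
    df["intake_date"] = pd.to_datetime(df["intake_date"])
    df["days_to_intake"] = (df["intake_date"] - df["referral_date"]).dt.days
    df["period"] = df["referral_date"].apply(
        lambda d: "after_change" if d >= PATHWAY_CHANGE_DATE else "before_change"
    )
    return df.groupby("period")["days_to_intake"].median()

# Example with synthetic rows only:
sample = pd.DataFrame({
    "referral_date": ["2024-05-02", "2024-06-10", "2024-08-01", "2024-09-15"],
    "intake_date": ["2024-05-12", "2024-06-18", "2024-08-05", "2024-09-18"],
})
print(median_days_to_intake(sample))
```

A simple before-and-after comparison like this is only a starting point; academic partners would typically layer on more rigorous designs (for example, controlling for seasonality or caseload mix) before drawing conclusions.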
4. Hosting or Supporting Pilot Projects
Organizations may opt to host pilot projects aligned with research and insights activities, such as:
- Testing new data collection forms or digital tools
- Implementing time-limited coordination protocols for evaluation
- Participating in training or capacity-building interventions that are studied over time
- Providing feedback on usability, feasibility, and resource requirements
5. Applying Findings to Practice
To close the loop between research and operations, organizations can:
- Integrate findings into strategic planning, funding proposals, and partnership discussions
- Update referral agreements, MOUs, or protocols based on documented trends
- Use dashboards or reports to monitor system-level performance indicators
- Share internal summaries with boards, coalitions, and key stakeholders