Open Science Practices Chain AI Prompt

Use it when you want to begin reproducibility and open science work without writing the first draft from scratch.
Copy this prompt template, run it in your AI tool, and use related prompts to continue the workflow.
Step 1: Preregistration — write and submit a complete preregistration before data collection begins. Include: research question, hypotheses, design, measures, sample size justification, primary analysis plan, secondary analyses, assumption checks, missing data plan, and exclusion criteria. Timestamp it.

Step 2: Registered Report submission (if applicable) — if the target journal offers Registered Reports, format the Stage 1 submission. Submit before data collection for an In-Principle Acceptance.

Step 3: Research compendium setup — initialize the project directory structure with separate raw data, processed data, code, and output folders. Set up version control (Git). Record the computing environment (renv, requirements.txt). Write the README.

Step 4: Data collection and contemporaneous documentation — document all protocol deviations, unexpected events, and unplanned decisions in a study log as they occur. Do not rely on memory after the fact.

Step 5: Analysis — run the pre-specified analyses exactly as registered. Any deviation from the plan must be explicitly noted with a reason. Additional exploratory analyses may be conducted but must be clearly labeled as unregistered.

Step 6: Open materials, data, and code — prepare all study materials for sharing. De-identify the data. Finalize the analysis code so it runs from raw data to paper tables and figures with a single command. Deposit to a repository with a DOI.

Step 7: Transparent reporting — write the paper with transparent reporting: report all pre-registered outcomes (not just significant ones), label exploratory analyses, include the preregistration DOI, materials DOI, and data DOI. Complete the relevant reporting checklist (CONSORT, STROBE, etc.).
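The compendium setup in Step 3 can be sketched as a small script. The folder names and file contents below are a common convention, not a fixed standard, and the project name `my-study` is a hypothetical placeholder:

```python
from pathlib import Path

# Illustrative compendium layout: raw and processed data kept apart from code and output
FOLDERS = ["data/raw", "data/processed", "code", "output"]

def init_compendium(root: str) -> Path:
    """Create a basic research compendium skeleton under `root`."""
    base = Path(root)
    for folder in FOLDERS:
        (base / folder).mkdir(parents=True, exist_ok=True)
    # README documents the contract: raw data is never edited by hand
    (base / "README.md").write_text(
        "# Project\n\n"
        "- data/raw: original data, never edited\n"
        "- data/processed: derived data, regenerated by code\n"
        "- code: analysis scripts\n"
        "- output: tables and figures\n"
    )
    # Placeholder for the computing environment record (fill from pip freeze or renv)
    (base / "requirements.txt").write_text("# pin exact package versions here\n")
    return base

init_compendium("my-study")
```

After running the script, `git init` and an initial commit would complete the version-control part of Step 3.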
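Step 6's "single command from raw data to outputs" requirement can be sketched as a driver script that chains the pipeline stages in order. The stage functions, file names, and the demo data here are hypothetical stand-ins for a real analysis:

```python
from pathlib import Path

def clean_data(raw: Path, processed: Path) -> None:
    # Stand-in for real cleaning logic: strip trailing whitespace
    processed.write_text(raw.read_text().strip() + "\n")

def run_analysis(processed: Path, output_dir: Path) -> None:
    # Stand-in for real analysis: count data rows (excluding the header)
    n_rows = len(processed.read_text().splitlines()) - 1
    (output_dir / "table1.txt").write_text(f"rows analysed: {n_rows}\n")

def main() -> None:
    raw = Path("data/raw/survey.csv")                 # hypothetical input file
    processed = Path("data/processed/survey_clean.csv")
    output = Path("output")
    for p in (raw.parent, processed.parent, output):
        p.mkdir(parents=True, exist_ok=True)
    if not raw.exists():                              # demo data so the sketch runs end to end
        raw.write_text("id,score\n1,3\n2,5\n")
    clean_data(raw, processed)
    run_analysis(processed, output)

main()
```

The point of the pattern is that every table and figure is regenerated by one entry point, so a reviewer can rerun the whole analysis without guessing at the order of steps.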
When to use this prompt
Use it when you want a more consistent structure for AI output across projects or datasets.
Use it when you want prompt-driven work to turn into a reusable notebook or repeatable workflow later.
Use it when you want a clear next step into adjacent prompts in Reproducibility and Open Science or the wider Research Scientist library.
What the AI should return
The AI should return a structured result that is directly usable in a reproducibility and open science workflow: explicit outputs, readable formatting, and enough detail to support the next step of the work.
How to use this prompt
Open your data context
Load your dataset, notebook, or working environment so the AI can operate on the actual project context.
Copy the prompt text
Use the copy button above and paste the prompt into the AI assistant or prompt input area.
Review the output critically
Check whether the result matches your data, assumptions, and desired format before moving on.
Chain into the next prompt
Once you have the first result, continue deeper with related prompts in Reproducibility and Open Science.
Frequently asked questions
What does the Open Science Practices Chain prompt do?
It gives you a structured starting point for reproducibility and open science work, so you can move faster without starting from a blank page.
Who is this prompt for?
It is designed for research scientist workflows and is marked as advanced, so it works best as a guided starting point for researchers at that level of experience.
What type of prompt is this?
Open Science Practices Chain is a chain. You can copy it as-is, adapt it, or use it as one step inside a larger workflow.
Can I use this outside MLJAR Studio?
Yes. The prompt text works in other AI tools too, but MLJAR Studio is the best fit when you want local execution, visible Python code, and reusable notebooks.
What should I open next?
Natural next steps from here are Code Review for Reproducibility, Data Sharing Plan, and Meta-Analysis Readiness.