<?xml version="1.0"?>
<response><xml version="1.0" encoding="UTF-8"><resource xmlns="http://datacite.org/schema/kernel-4" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://datacite.org/schema/kernel-4 http://schema.datacite.org/meta/kernel-4/metadata.xsd"><identifier identifierType="DOI">10.60964/bndu-r97n-bg26</identifier><creators><creator><creatorName nameType="Personal">Gava GP</creatorName><givenName>Giuseppe P</givenName><familyName>Gava</familyName><nameIdentifier nameIdentifierScheme="ORCID" schemeURI="https://orcid.org">https://orcid.org/0000-0002-0462-916X</nameIdentifier></creator><creator><creatorName nameType="Personal">Lef&#xE8;vre L</creatorName><givenName>Laura</givenName><familyName>Lef&#xE8;vre</familyName><nameIdentifier nameIdentifierScheme="ORCID" schemeURI="https://orcid.org">https://orcid.org/0009-0004-7713-1401</nameIdentifier></creator></creators><titles><title xml:lang="en">Analysis of hippocampal ensembles during Contextual Feeding and cNOR tasks</title></titles><resourceType resourceTypeGeneral="Dataset">Analysis of hippocampal ensembles during Contextual Feeding and cNOR tasks</resourceType><publisher>University of Oxford</publisher><publicationYear>2024</publicationYear><dates><date dateType="Issued">2024</date></dates><language>en</language><relatedIdentifiers><relatedIdentifier relatedIdentifierType="DOI" relationType="IsCitedBy">10.1126/science.adk9611</relatedIdentifier></relatedIdentifiers><rightsList><rights rightsURI="https://creativecommons.org/licenses/by-sa/4.0/legalcode">Creative Commons Attribution Share Alike 4.0 International</rights></rightsList><descriptions><description xml:lang="en" descriptionType="TechnicalInfo">This dataset contains two Jupyter notebooks and Python scripts that run exemplar analyses from the article 'Organising the coactivity structure of the hippocampus from robust to flexible memory'.&#xD;
&#xD;
The Jupyter notebooks provided are:&#xD;
- 'cNOR_task.ipynb' computes the object/location coding analyses shown in Figures 1 and 4.&#xD;
- 'coactivity_cond+cnor.ipynb' reproduces some of the coactivity analyses shown in Figures 1 and 4.&#xD;
&#xD;
The 'results' folder stores the processed data that are loaded throughout the notebooks.&#xD;
&#xD;
The Python script 'makeGraphbatch.py' computes coactivity graphs during active exploration times (theta-informed) from the spiking data. After downloading, ensure the spiking data folder is named 'data' and placed inside the root folder 'orgCoactHippo'. See the script for more information.&#xD;
&#xD;
Inside the 'recordings' folder, there are text files that list the recording days belonging to each task. That is: 'cond_ll145' and 'cond_ll149' list the food-context conditioning days for each animal, while 'cnor_x' and 'cnor_y' list the cNOR days in the two contexts, regardless of the animal's identity.&#xD;
&#xD;
The Python library 'util_func.py' contains the data loading and processing functions used by the Python script and notebooks. See the script for more information.&#xD;
'difference_estimation_plot.py' is a Python library that produces estimation plots as in the article.&#xD;
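Taken together, the items described above suggest the following layout of the root folder (a sketch; only files and folders mentioned in this description are shown, and the exact arrangement inside the deposit may differ):

```
orgCoactHippo/
├── data/                          # downloaded spiking data (rename the folder to 'data')
├── results/                       # processed data loaded by the notebooks
├── recordings/                    # text files listing the recording days per task
├── cNOR_task.ipynb
├── coactivity_cond+cnor.ipynb
├── makeGraphbatch.py
├── util_func.py
└── difference_estimation_plot.py
```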
&#xD;
For the analysis in the original paper, this code was run in Python version 3.10. Execution of the code requires the following libraries:&#xD;
&#xD;
matplotlib 3.7.1&#xD;
matplotlib-inline 0.1.6&#xD;
networkx 2.8.4&#xD;
numpy 1.24.3&#xD;
pandas 1.5.3&#xD;
pandas-ods-reader 0.1.4&#xD;
scikit-learn 1.2.2&#xD;
scipy 1.10.1&#xD;
seaborn 0.12.2&#xD;
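One way to reproduce this environment (a sketch, not part of the deposit; it assumes a fresh Python 3.10 virtual environment and pins the versions listed above):

```shell
# Create and activate a Python 3.10 virtual environment, then install pinned versions
python3.10 -m venv venv
source venv/bin/activate
pip install matplotlib==3.7.1 matplotlib-inline==0.1.6 networkx==2.8.4 \
    numpy==1.24.3 pandas==1.5.3 pandas-ods-reader==0.1.4 \
    scikit-learn==1.2.2 scipy==1.10.1 seaborn==0.12.2
```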
</description></descriptions><fundingReferences><fundingReference><funderName>Biotechnology and Biological Sciences Research Council, UKRI</funderName><awardNumber>BB/S007741/1</awardNumber></fundingReference><fundingReference><funderName>Biotechnology and Biological Sciences Research Council, UKRI</funderName><awardNumber>BB/N002547/1</awardNumber></fundingReference><fundingReference><funderName>Medical Research Council, UKRI</funderName><awardNumber>MC_UU_00003/4</awardNumber></fundingReference><fundingReference><funderName>Medical Research Council, UKRI</funderName><awardNumber>MR/W004860/1</awardNumber></fundingReference></fundingReferences></resource></xml></response>
