NTrepo / BIDS tools · Merge request !1: new laura folder
Merged · Laura Rævsbæk Birch requested to merge au620747-main-patch-86964 into main, 1 year ago
Overview 1 · Commits 14 · Pipelines 0 · Changes 1
Merge request reports
Viewing commit 23e66b7c
1 file · +35 −0
Upload New File · 23e66b7c · Laura Rævsbæk Birch authored 1 year ago
laura/ntlab_create_dataset_description_json.py · 0 → 100644 · +36 −0
import json
from collections import OrderedDict


def create_dataset_description_json(path, name, bids_version="1.7.0", hed_version=None,
                                     dataset_type='raw', dataset_links=None, data_license=None,
                                     authors=None, acknowledgements=None, how_to_acknowledge=None,
                                     funding=None, ethics_approvals=None, references_and_links=None,
                                     doi=None, generated_by=None, source_datasets=None):
    # Prepare dataset_description.json
    description = OrderedDict([
        ('Name', name),
        ('BIDSVersion', bids_version),
        ('HEDVersion', hed_version),
        ('DatasetType', dataset_type),
        ('DatasetLinks', dataset_links),
        ('License', data_license),
        ('Authors', authors),
        ('Acknowledgements', acknowledgements),
        ('HowToAcknowledge', how_to_acknowledge),
        ('Funding', funding),
        ('EthicsApprovals', ethics_approvals),
        ('ReferencesAndLinks', references_and_links),
        ('DatasetDOI', doi),
        ('GeneratedBy', generated_by),
        ('SourceDatasets', source_datasets)])
    # Remove None values
    pop_keys = [key for key, val in description.items() if val is None]
    for key in pop_keys:
        description.pop(key)
    # Only write data that is not None
    with open(path, 'w') as outfile:
        json.dump(description, outfile, indent=4)
\ No newline at end of file
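
A minimal usage sketch of the new helper. The import path assumes laura/ is importable (e.g. the caller runs from the repository root); the output path and metadata values are illustrative and not taken from the merge request:

from laura.ntlab_create_dataset_description_json import create_dataset_description_json

# Hypothetical call: write a minimal dataset_description.json at the dataset root.
create_dataset_description_json(
    path="dataset_description.json",
    name="Example NT Lab dataset",        # illustrative dataset name
    authors=["Laura Rævsbæk Birch"],
    data_license="CC0",
)
# Because None-valued keys are removed before writing, the resulting file
# contains only Name, BIDSVersion ("1.7.0"), DatasetType ("raw"),
# License, and Authors.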