List of use cases: batch importing and all variations of importing

FOLIO roadmap features related to batch importing (each feature below lists its description, notes from MM-SIG, and related system-wide UX features):

Feature 1: Import data: Batch import of metadata records
  Description: Allows for the batch upload and import of rich metadata and holdings records. Holdings data may be embedded in the bibliographic records or supplied in separate files. Ability to match an existing bibliographic, holdings, and/or item record in order to overlay it.
  Notes from MM-SIG: (LW@NCSU - Batch edit is critical; I'm adding a feature at row 31 for further discussion.) RE: overlay. Ability to batch edit multiple bib/holdings/item records: should it be done within the system? What would be needed to enable that? Should it be done outside the system (e.g. MarcEdit)? In order to do so, we need to be able to reliably: 1. identify the records to be exported; 2. export the records; 3. edit the records; 4. import the records, including safely overlaying without impacting fields that are out of scope for the change. NC@Cornell: +1. JS@Duke: We would like to be able to import bibliographic data (both single-record and multiple-record data) from outside sources such as OCLC's Connexion client. There should be an integration so that one-by-one and/or multiple-record overlay is also possible, such as is currently available using OCLC's Connexion client.
  Related system-wide UX features: Automation; Integrations; Workflows; To-do; Notes; Change tracker

Feature 2: Batch editing functionality
  Description: Ability to pull together a set of records based on user-defined criteria and make global edits to those records. This includes all records that exist in a FOLIO system: orders, patrons, etc.
  Notes from MM-SIG: RE: overlay. Ability to batch edit multiple bib/holdings/item records: should it be done within the system? What would be needed to enable that? Should it be done outside the system (e.g. MarcEdit)? In order to do so, we need to be able to reliably: 1. identify the records to be exported; 2. export the records; 3. edit the records; 4. import the records, including safely overlaying without impacting fields that are out of scope for the change (see the sketch below).
  Related system-wide UX features: Batch edit; Automation; Workflows; To-do
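
The four-step round trip described above (identify, export, edit, re-import with a safe overlay) can be illustrated with a minimal sketch. Everything below is an assumption made for the example: records are reduced to plain dictionaries keyed by MARC tag, and the protected-field list and the overlay() helper are invented for illustration, not part of FOLIO.

    # Illustrative sketch only: records reduced to plain dicts keyed by MARC tag.
    # The protected-field list and the merge rule are assumptions, not FOLIO behaviour.

    PROTECTED_TAGS = {"856", "905"}  # hypothetical: local fields the overlay must not touch

    def overlay(existing: dict, incoming: dict) -> dict:
        """Replace the existing record with the incoming one, except that
        protected tags keep their existing values."""
        merged = dict(incoming)
        for tag in PROTECTED_TAGS:
            if tag in existing:
                merged[tag] = existing[tag]  # out-of-scope field: keep as-is
        return merged

    # 1. identify + 2. export: pretend these fields were pulled from the system
    existing = {"245": "Old title / editor X.", "856": "https://example.org/local-link"}
    # 3. edit (e.g. in MarcEdit): the batch edit changed the 245 and dropped the 856
    edited = {"245": "New title / editor X."}

    # 4. import with safe overlay: the local 856 survives the round trip
    print(overlay(existing, edited))
    # {'245': 'New title / editor X.', '856': 'https://example.org/local-link'}

The same pattern extends to whole files of records; the point is only that step 4 needs an explicit list of fields that are out of scope for the change.
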
Feature 3: Logistics around importing batch files (not so much the file itself, but the support processes for dealing with imported MARC files)
  Description:
  • Scheduling/automatic import vs. triggering import on command
  • How files are moved between the source and FOLIO, e.g. FTP, download from a website, other
  • Is there a staging area before files are actually loaded to FOLIO?
  • Where are they loaded? Does it differ depending on what they are supposed to do? (e.g. load to the acquisitions app if they are being used to bring in order details, but load to the Inventory app if being used to add/update metadata/holdings/item records)
  • Matching (e.g. matchpoints, cascade of matchpoints, variation by type of file or source of file?); see the sketch after this feature
  • Overlay (e.g. complete overlay, no overlay of existing bib data (only adding new data), protecting some existing bib data)
  • Templates/defaults/scripts for automatic edits when loading data
  Related system-wide UX features: Batch edit; FTP; Scheduler; Workflows; Acq orders; Inventory bib data/holdings/items; Templates
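
The "cascade of matchpoints" bullet above can be pictured as an ordered series of lookups that stops at the first hit, falling back to creating a new record when nothing matches. The sketch below is a minimal illustration; the matchpoint order (OCLC number, then ISBN, then title plus date) and the in-memory indexes are assumptions chosen for the example, not a FOLIO specification.

    # Minimal sketch of a matchpoint cascade: try each matchpoint in order and
    # stop at the first existing record that matches. All data here is invented.
    from typing import Optional

    # Hypothetical lookup indexes over existing Inventory records
    BY_OCLC = {"(OCoLC)123456": "inst-0001"}
    BY_ISBN = {"9780000000002": "inst-0002"}
    BY_TITLE_DATE = {("some title", "2019"): "inst-0003"}

    def match(incoming: dict) -> Optional[str]:
        """Return the id of the record to overlay, or None to create a new one."""
        cascade = [
            ("oclc number", lambda r: BY_OCLC.get(r.get("oclc_number"))),
            ("isbn", lambda r: BY_ISBN.get(r.get("isbn"))),
            ("title+date", lambda r: BY_TITLE_DATE.get((r.get("title"), r.get("date")))),
        ]
        for name, lookup in cascade:
            hit = lookup(incoming)
            if hit:
                print(f"matched on {name} -> {hit}")
                return hit
        return None  # no match: load as a new record

    match({"isbn": "9780000000002", "title": "some title", "date": "2019"})
    # matched on isbn -> inst-0002

In practice the cascade, and which matchpoints are tried at all, would vary by type of file or source of file, as the bullet notes.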


List of use cases:

  1. Shelf-ready approval plans (where the library loads a MARC record to build the bib, order, and item all at the same time, followed by an EDIFACT invoice; see the sketch after this list)
  2. Individual purchases (physical or electronic) and their catalog records (initial records to create bibs and orders, then subsequent records to overlay with final cataloging/URL details, and perhaps to create holdings and items)
  3. Batches of PDA/DDA records: discovery records for new titles added to the program, then perhaps subsequent records to trigger automatic order record creation and deliver final cataloging metadata for titles triggered for automatic purchase
  4. NCSU Single/Batch load of MARC records (bib records via YBP)
  5. Import process in the GBV
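
Use case 1 can be read as a single pass over the vendor's MARC file in which each record drives the creation of a bib, an order line, and an item, with the EDIFACT invoice reconciled in a separate later step. The sketch below is purely illustrative: the create_* helpers, the field names, and the sample values are placeholders invented for the example, not FOLIO APIs.

    # Illustrative sketch of a shelf-ready approval-plan load: each incoming
    # record produces a bib, an order line, and an item in one pass.
    # Every function and field name here is a placeholder, not a FOLIO API.

    def create_bib(rec):
        return {"title": rec["title"], "isbn": rec["isbn"]}

    def create_order_line(rec):
        return {"isbn": rec["isbn"], "fund": rec["fund"], "price": rec["price"]}

    def create_item(rec):
        return {"barcode": rec["barcode"], "call_number": rec["call_number"]}

    incoming_file = [  # stand-in for a parsed shelf-ready MARC file from the vendor
        {"title": "Example title", "isbn": "9780000000019", "fund": "APPR",
         "price": 45.00, "barcode": "31234000000017", "call_number": "QA76 .E93 2024"},
    ]

    for rec in incoming_file:
        print(create_bib(rec), create_order_line(rec), create_item(rec))

    # The matching EDIFACT invoice arrives later and is reconciled against the
    # order lines created above (e.g. by ISBN or vendor reference number).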