


Discussion items


Progress Check - Definition of Done 

  • Common DoD items to be included in each individual team's DoD. Quick update: release issues consumed the available time, so there is no progress to report. Check back next week.

Progress Check - Code of Conduct 

The group met once and is meeting again tomorrow; this is a work in progress. A team of six is coordinating. We'll follow up on progress towards the end of October.

See Discuss post for more details: Seeking input on a FOLIO Community Code of Conduct

Performance / Longevity Testing - DEBT-6


  • What's needed?
  • Who will do it?
  • Desired outcome: 

    A set of goals to achieve by the end of Q4 supported by a list of stories

Test Design is the biggest and toughest piece of this. Key activities that would be included:

  • Circulation
  • Batch loading of data
  • Acquisitions
  • Other?

Tests would likely include:

  • API tests
  • Browser-based testing on realistic workflows
  • Longevity testing
  • Potentially, security-specific tests
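As a rough illustration of the API-testing bucket above, here is a minimal latency-summary harness. This is a sketch only: `fake_checkout` is a stand-in for a real request (e.g., a circulation check-out call), not a FOLIO API.

```python
import statistics
import time

def timed_call(fn):
    """Run fn once and return its wall-clock duration in seconds."""
    start = time.perf_counter()
    fn()
    return time.perf_counter() - start

def run_load_test(fn, iterations=20):
    """Call fn repeatedly and summarize the observed latencies."""
    latencies = sorted(timed_call(fn) for _ in range(iterations))
    return {
        "p50": statistics.median(latencies),
        "p95": latencies[max(0, int(0.95 * len(latencies)) - 1)],
        "max": latencies[-1],
    }

# Stand-in for a real API call; replace with an actual HTTP request.
def fake_checkout():
    time.sleep(0.001)

summary = run_load_test(fake_checkout)
```

A real suite would drive concurrent requests (JMeter's thread groups do this out of the box), but the same percentile-style summary is what we would report.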

Could we leverage the work done on Integration tests?

Test Designs are coupled with the test data (dependencies, etc.)

May need to instrument the applications to provide data so we can assess performance
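One lightweight form such instrumentation could take (names here are illustrative, not existing FOLIO code) is a timing decorator that logs per-call durations:

```python
import functools
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("perf")

def timed(fn):
    """Log the wall-clock duration of each call to fn."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            elapsed_ms = (time.perf_counter() - start) * 1000
            log.info("%s took %.2f ms", fn.__name__, elapsed_ms)
    return wrapper

@timed
def lookup_item(item_id):
    # Stand-in for a real service call.
    return {"id": item_id}

result = lookup_item(42)
```

Emitting durations this way (or via a profiler such as JProfiler) gives the data needed to assess performance without changing application behavior.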

Don't overthink this: what can we leverage now to get good feedback as soon as possible?

Deliverables might include a package that can be used to run these tests in your own environment (similar to the current API and integration tests) and/or to create the environment (AWS) from code

Adding to the suite of tests should be in the Definition of Done for new features/modules

Who can lead this effort?

  • Zak is the key maintainer of integration tests
  • Someone else could also hop in and leverage the integration tests, etc.
  • Not sure if Anton has background/experience in load/performance testing
  • Someone would have to evaluate tools that can be used 

How can we break this problem up?

  1. Environment - Core Platform
  2. Defining the test scenarios (which tests, how many of each, what data is needed, how big a dataset, etc.) ← Likely community product owner-type
  3. Building the tests themselves - Core Functional ( ? ). Some teams have created sets of JMeter tests; these may be useful too. It would be helpful to leverage all teams to build these tests
  4. Collect and/or create data to be used - Mike and Tod to query Sys-Ops; we may need to augment and/or curate additional data. Harry K might have a standard set of users
  5. Identifying which tools can be used to profile the application so that we can assess the results

What needs/can be done in Q4?

  1. Browser-based testing should be the focus
  2. Get a small set of test scenarios/transactions (5) that are circulation based that we can build tests around
  3. Stretch goal, have those happening while batch updates are also happening
  4. Not concerning ourselves with loan histories for now

Tod to reach out to implementer's group to get input into the Circulation and batch loading scenarios.

Note we have licenses/permission to use JProfiler for this project.

Tech Debt List
  • Resume from where we left off

Action items

For next time: discuss the security issue and access to JIRA tickets.