", # this will be used as the instance's uuid if provided
+ "variant": "A", # an additional indicator for signal subtypes
+ "hostIdentifier": "host1",
+ "calendarTime": "2022-10-19T10:35:01Z",
+ "time": 1666175701,
+ "columns": {
+ "pid": 888,
+ "path": "/usr/bin/security",
+ "cmdline": "/usr/bin/security dump-keychain",
+ "state": "running",
+ "parent": 555,
+ "created_at": 1918698901,
+ "updated_at": 2118698901
+ }
+ }
+}
+
+session = requests.Session()
+r = session.post(
+ API_ENDPOINT,
+ json=signal_instance,
+ headers={
+ "Content-Type": "application/json",
+ "Authorization": f"Bearer {API_TOKEN}",
+ },
+)
+```
+
+:::note
+
+You can view the full API documentation for the `/instances` endpoint in the [API Reference](https://netflix.github.io/dispatch/docs/api#tag/signals/operation/create_signal_instance__organization__signals_instances_post).
+
+:::
+
+## Creating a Signal Definition
+
+At a high level, you'll need to configure some basic metadata about your `signal` such as the name, description, and variant.
+You can also configure whether a `signal` should create a case or not. Choosing not to create a case can be useful for contextual signals that
+don't necessarily warrant triage and investigation on their own. However, you can still use these signals to enrich other signals and cases.
+
+
+
+**Variant**: The same signal can have multiple variants with different definitions.
+
+**Owner**: Typically the team or owner that produces the Signal.
+
+**External ID**: This ID will be used to correctly associate incoming signals to this definition. This ID should be unique across all Signal Definitions.
+
+**Conversation Target**: Defines the channel name where the Case for the Signal will be created.
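Assembled into a payload, a definition might look like the following hypothetical sketch. The field names below are inferred from the descriptions above and are not the exact API schema; consult the API Reference for the authoritative shape.

```json
{
  "name": "Keychain Dump",
  "description": "Detects dumping of the macOS keychain.",
  "variant": "A",
  "owner": "detection-team@example.com",
  "external_id": "osquery-keychain-dump",
  "conversation_target": "#case-keychain-dump",
  "create_case": true
}
```
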
diff --git a/docs/docs/administration/settings/signal/engagement-filter.mdx b/docs/docs/administration/settings/signal/engagement-filter.mdx
new file mode 100644
index 000000000000..ee5d7ac95993
--- /dev/null
+++ b/docs/docs/administration/settings/signal/engagement-filter.mdx
@@ -0,0 +1,12 @@
+# Engagement Filter
+
+Engagement Filters are used to automate the process of reaching out to a user involved in a specific Signal Instance. Engagement Filters make use of Entity Types that match email addresses for users in your environment and a custom message you configure to be sent to those users. Engagement Filters also support multi-factor authentication when validating suspicious behavior. This feature is useful when you want to be confident that the user you engaged is actually who they say they are and not a malicious actor.
+
+### Creating an Engagement Filter
+
+To create an Engagement Filter, follow these steps:
+
+1. Navigate to a Signal Definition edit page.
+2. Click on the '+' icon adjacent to the 'Engagement Filter(s)' dropdown menu.
+
+
diff --git a/docs/docs/administration/settings/signal/entity-type.mdx b/docs/docs/administration/settings/signal/entity-type.mdx
new file mode 100644
index 000000000000..485297f421df
--- /dev/null
+++ b/docs/docs/administration/settings/signal/entity-type.mdx
@@ -0,0 +1,20 @@
+# Entity Type
+
+Entity Types enable the extraction of entities from the raw Signal Instance data. Once extracted, these entities are stored for further use, enabling features like automatic correlations, engagement filters, and signal filters.
+
+### Creating an Entity Type
+
+To create an Entity Type, follow these steps:
+
+1. Navigate to a Signal Definition edit page.
+2. Click on the '+' icon adjacent to the 'Entity Type(s)' dropdown menu.
+
+Upon clicking, the Entity Type playground will be launched in a modal window. This playground is an interactive tool designed to aid you in creating and validating your Entity Type. Entity Types can be constructed using either regular expressions or the JSON Path format.
+
+
+
+In the illustrated example, a new Entity Type is defined using the JSON Path format. The JSON Path expression, `columns.cmdline`, is used to extract the value of the `cmdline` column from the raw Signal Instance data. The playground editor highlights the extracted value within the raw Signal Instance data, validating a successful match with the desired value.
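Dispatch performs this extraction for you, but the idea can be sketched in a few lines of Python. This is a simplified illustration only: it walks plain dot-separated paths rather than full JSON Path, and the sample data is abbreviated from the ingestion example above.

```python
import re

# Abbreviated raw Signal Instance data from the ingestion example.
raw = {"columns": {"pid": 888, "cmdline": "/usr/bin/security dump-keychain"}}

def resolve_path(data, path):
    """Walk a dot-separated path such as 'columns.cmdline'."""
    for part in path.split("."):
        data = data[part]
    return data

# JSON Path style extraction.
print(resolve_path(raw, "columns.cmdline"))  # /usr/bin/security dump-keychain

# A regular-expression Entity Type matches over the raw text instead.
print(re.search(r"/usr/bin/\S+", str(raw)).group(0))  # /usr/bin/security
```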
+
+Once you are satisfied with your Entity Type, proceed through the playground to the naming and description stage. Here, you'll provide a suitable name and a brief description for the Entity Type. By clicking 'Save', the Entity Type gets successfully created and is then linked with the corresponding Signal Definition.
+
+
diff --git a/docs/docs/administration/settings/signal/filter.mdx b/docs/docs/administration/settings/signal/filter.mdx
new file mode 100644
index 000000000000..2722d85aa8f8
--- /dev/null
+++ b/docs/docs/administration/settings/signal/filter.mdx
@@ -0,0 +1,46 @@
+# Filter
+
+Signal Filters are used to define how Signals should be grouped together or when they should be snoozed.
+
+:::note
+To create a Signal Filter, you must first define the Entity Types that will be used to deduplicate or snooze Signals.
+:::
+
+## Snooze Filter
+
+Snooze filters make use of entities extracted from signals to define when a signal should be snoozed. This feature is useful when an influx of signals is expected for a given period of time (e.g. known administrative activity) and you want to temporarily stop cases from being created. Even when a signal is snoozed, it will still be processed, and associated entities will be created.
+
+For example, you can create a `Snooze Filter` that will snooze all incoming signals that contain a specific JA3 hash.
+
+:::info
+You also have the option to create a `Snooze Filter` without specifying any entities, which will snooze all incoming signals matching that filter.
+:::
+
+### Creating a Snooze Filter
+
+To create a Snooze Filter, follow these steps:
+
+1. Navigate to a Signal Definition edit page.
+2. Click on the '+' icon adjacent to the 'Signal Filter(s)' dropdown menu.
+3. Select the `Snooze` radio button under the `BASIC` tab.
+
+
+
+
+## Deduplication Filter
+
+In order to perform signal deduplication, a deduplication filter must be created. Deduplication filters leverage extracted signal entity types and a sliding time window to determine whether a signal should be marked as a duplicate. If a match is found, the current signal is marked as a duplicate and is associated with the existing case.
+
+:::info
+By default, all Signals are deduplicated over a one-hour window unless a custom Deduplication Filter is defined.
+:::
+
+### Creating a Deduplication Filter
+
+To create a Deduplication Filter, follow these steps:
+
+1. Navigate to a Signal Definition edit page.
+2. Click on the '+' icon adjacent to the 'Signal Filter(s)' dropdown menu.
+3. Select the `Deduplication` radio button under the `BASIC` tab.
+
+
diff --git a/docs/docs/administration/settings/signal/index.mdx b/docs/docs/administration/settings/signal/index.mdx
new file mode 100644
index 000000000000..25054048bd46
--- /dev/null
+++ b/docs/docs/administration/settings/signal/index.mdx
@@ -0,0 +1,7 @@
+---
+sidebar_position: 4
+---
+
+# Signal
+
+Signals are a way to define different events that you ingest into Dispatch and raise to the level of cases.
diff --git a/docs/docs/administration/settings/templates.mdx b/docs/docs/administration/settings/templates.mdx
new file mode 100644
index 000000000000..68e6034fa055
--- /dev/null
+++ b/docs/docs/administration/settings/templates.mdx
@@ -0,0 +1,115 @@
+# Templates
+
+Templates are used by Dispatch to create case- or incident-specific documentation. These templates are copied and filled out to the best of Dispatch's abilities. After their creation, they are normal documents that are associated with your case or incident and can be used for collaborating and capturing facts and findings.
+
+There are several types of templates that Dispatch supports:
+
+- Case
+- Incident
+- Executive
+- Review
+- Tracking
+
+
+
+
+
+
+**Name:** Name of the template.
+
+**Description:** Description of the template.
+
+**Weblink:** The weblink to the template.
+
+**External Id:** External identifier for the document template. Used for API integration (e.g. Google doc file id). Typically, the unique id in the weblink.
+
+Enabling evergreen for a template instructs Dispatch to send an email reminder to the template owner informing them that they should review the template to ensure that the template is up to date.
+
+**Evergreen Owner:** The email address representing the owner of this document template.
+
+**Evergreen Reminder Interval:** Number of days that should elapse between reminders sent to the document template owner.
+
+### Case Template
+
+A copy of this template is created for each new case on case creation. It contains the current state of the case and is used by the case owner/assignee to capture facts and findings.
+
+- [Example Case Document](https://docs.google.com/document/d/1g1cl9liXG8US0eBnrZYRaeWa7Ek_hoZJ5PPadas44vI)
+
+#### Template Variables
+
+The following is a list of available variables that Dispatch will attempt to resolve on document creation. Note: we do not currently re-resolve these.
+
+NOTE: All variables must be enclosed in `{{}}`
+
+- `case_name` - The case's name
+- `case_title` - The case's title
+- `case_description` - The case's description
+- `case_resolution` - The case's resolution
+- `case_owner` - The case's owner
+- `case_type` - The case's type
+- `case_severity` - The case's severity
+- `case_priority` - The case's priority
+- `case_status` - The case's status
+- `case_storage_weblink` - Link to the storage resource
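Variable resolution is essentially one-time string substitution at document creation. A minimal, illustrative sketch (not Dispatch's actual implementation):

```python
def resolve_variables(template_text, values):
    """Replace each {{variable}} with its value; done once, at creation time."""
    for name, value in values.items():
        template_text = template_text.replace("{{" + name + "}}", str(value))
    return template_text

doc = "Case {{case_name}} ({{case_priority}}): {{case_title}}"
print(resolve_variables(doc, {
    "case_name": "case-123",
    "case_priority": "High",
    "case_title": "Keychain dump on host1",
}))  # Case case-123 (High): Keychain dump on host1
```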
+
+### Incident Template
+
+A copy of this template is created for each new incident on incident creation. It contains the current state of the incident and is used by incident participants to share knowledge about the incident.
+
+- [Example Incident Document](https://docs.google.com/document/d/1fv--CrGpWJJ4nyPR0N0hq4JchHJPuqsXN4azE9CGQiE)
+
+#### Template Variables
+
+The following is a list of available variables that Dispatch will attempt to resolve on document creation. Note: we do not currently re-resolve these.
+
+NOTE: All variables must be enclosed in `{{}}`
+
+- `name` - The name of the incident
+- `title` - The incident's title
+- `description` - The incident's description
+- `resolution` - The incident's resolution
+- `commander_fullname` - The current commander's name
+- `type` - The incident's type
+- `priority` - The incident's priority
+- `status` - The incident's status
+- `conversation_weblink` - Link to the conversation resource (if any)
+- `conference_weblink` - Link to the conference resource (if any)
+- `storage_weblink` - Link to the storage resource (if any)
+- `document_weblink` - Link to the incident document (if any)
+- `ticket_weblink` - Link to the incident ticket (if any)
+
+### Executive Template
+
+Often during an incident an executive report needs to be drafted that provides a high-level overview of the incident and the current actions that are being carried out. A copy of this template will be created, filled, and stored in the incident storage every time a new executive report is drafted.
+
+- [Example Executive Report](https://docs.google.com/document/d/1dab6k14p5ageo5B_d1YlB_zS9hMGHDMXy9RUbIZous4)
+
+### Review Template
+
+A copy of this template is automatically created when an incident is marked as stable. It is used by the incident commander and participants for reconciling any incident learnings, discussions, or post-incident tasks.
+
+- [Example Incident Review Document](https://docs.google.com/document/d/1MkCTyheZRtKzMxOBhLgh3PrvarERA9Bwo0joM7D9tmg)
+
+### Tracking Template
+
+Some incidents require the tracking of multiple assets, this template is a simple spreadsheet that allows incident participants to collaborate on tabular data.
+
+- [Example Incident Tracking Sheet](https://docs.google.com/spreadsheets/d/1Odk4KlL7uMF_yd7OvTOCaPWmtTA_WzFBIA4lMeU5cGY)
+
+### Template Association
+
+Case and incident templates can be associated to their corresponding types. This allows our templates to closely match a given case or incident type and provide additional context and direction for those given types.
+
+Additionally, templates can be associated with multiple case or incident types, if for example, you only want to use one template.
+
+
+
+
+
+
+
+
+
+
+
+
diff --git a/docs/docs/administration/upgrading.mdx b/docs/docs/administration/upgrading.mdx
new file mode 100644
index 000000000000..bb5c9e065af5
--- /dev/null
+++ b/docs/docs/administration/upgrading.mdx
@@ -0,0 +1,45 @@
+---
+sidebar_position: 2
+description: Staying up to date.
+---
+
+# Upgrading
+
+If you're upgrading to a new major release, you should generate a new configuration file using the latest Dispatch version. Doing so ensures that any new settings are visible and configured if required.
+
+Beyond that, upgrades are as simple as bumping the version of Dispatch \(which will cause any changed dependencies to upgrade\), running data migrations, and restarting all related services.
+
+:::info
+In some cases, you may want to stop services before running the upgrade process to avoid intermittent errors.
+:::
+
+## Upgrading Dispatch
+
+### Upgrading the package
+
+The easiest way to upgrade the Dispatch package is with `uv pip`:
+
+```bash
+uv pip install --upgrade dispatch
+```
+
+You may prefer to install a fixed version rather than the latest, as it will allow you to control changes.
+
+If you're installing from source code, you may have additional unfulfilled requirements, so take the necessary precautions of testing your environment before committing to the upgrade.
+
+### Running Migrations
+
+Just as during the initial setup, migrations are applied with the upgrade command.
+
+```bash
+dispatch database upgrade
+```
+
+### Restarting services
+
+You'll need to ensure that _all_ of Dispatch's services are restarted after an upgrade. Restarting these services is required because Python loads modules in memory, and code changes will not be reflected until they are restarted.
+
+These services include:
+
+- server -- `dispatch server start`
+- scheduler -- `dispatch scheduler start`
diff --git a/docs/docs/administration/user.mdx b/docs/docs/administration/user.mdx
new file mode 100644
index 000000000000..ca5e132e64a0
--- /dev/null
+++ b/docs/docs/administration/user.mdx
@@ -0,0 +1,24 @@
+---
+sidebar_position: 3
+---
+
+# User Management
+
+Users or organizational members represent users of the Dispatch UI and are different from individual contacts or incident participants. These user accounts are used to control access to the Dispatch UI only. We do not currently support the creation or removal of users via the Dispatch UI, except in the case of self-registration.
+
+
+
+
+
+
+
+**Role:** Dispatch uses role-based access control (RBAC) for its UI. Currently, this is only used to protect sensitive incidents whose visibility is set to restricted. We do not currently have any controls surrounding Dispatch configuration and settings. There are four roles defined by Dispatch:
+
+- Member: Can access everything except restricted incidents unless they are a direct participant.
+- Admin: Allows full access to the Dispatch UI and all incidents, whether their visibility is open or restricted.
+- Manager: Currently the same as Admin.
+- Owner: Allows full access to the Dispatch UI and to manage organizations.
+
+**Settings:**
+
+- Default Projects: List of projects that Dispatch will use to apply filtering across the UI (e.g. case or incident tables and dashboards).
diff --git a/docs/docs/changelog.mdx b/docs/docs/changelog.mdx
new file mode 100644
index 000000000000..bde126885e76
--- /dev/null
+++ b/docs/docs/changelog.mdx
@@ -0,0 +1,13 @@
+---
+description: Short description of changes.
+---
+
+# Changelog
+
+:::info
+Dispatch uses the [calver](https://calver.org/) version schema.
+:::
+
+See the "Releases" page on the application repository for more information about changes:
+
+[Releases](https://github.com/Netflix/dispatch/releases)
diff --git a/docs/license.md b/docs/docs/license.mdx
similarity index 100%
rename from docs/license.md
rename to docs/docs/license.mdx
diff --git a/docs/docs/support.mdx b/docs/docs/support.mdx
new file mode 100644
index 000000000000..09b5f860a4fb
--- /dev/null
+++ b/docs/docs/support.mdx
@@ -0,0 +1,48 @@
+---
+description: >-
+ We take the security of Dispatch seriously. The following are a set of
+ policies we have adopted to ensure that security issues are addressed in a
+ timely fashion.
+---
+
+# Support
+
+## General Support
+
+We provide support via our GitHub issue [tracker](https://github.com/Netflix/dispatch/issues). We do our best to respond in a timely manner, but please keep in mind that open source is not a core responsibility of our job, and we provide the code and support on a best-effort basis.
+
+## Reporting a Security Issue
+
+We ask that you do not report a security issue to our standard GitHub issue tracker.
+
+If you believe you've identified a security issue with `Dispatch`, please report it via our public Netflix bug bounty program at [https://hackerone.com/netflix](https://hackerone.com/netflix).
+
+Once you've submitted the issue, it will be handled by our triage team, typically within 48 hours.
+
+> **Note:** Any secrets in the codebase, for example in the `docker-compose.yml` file, are used for demonstration purposes and not meant for production systems.
+
+## Support Versions
+
+At any given time, we will provide security support for the `main` branch and the two most recent releases.
+
+## Disclosure Process
+
+Our process for taking a security issue from private discussion to public disclosure involves multiple steps.
+
+Approximately one week before full public disclosure, we will send advance notification of the issue to a list of people and organizations, primarily composed of known users of `Dispatch`. This notification will consist of an email message containing:
+
+- A full description of the issue and the affected versions of `Dispatch`.
+- The steps we will be taking to remedy the issue.
+- The patches, if any, that will be applied to `Dispatch`.
+- The date on which the `dispatch` team will apply these patches, issue new releases, and publicly disclose the issue.
+
+Simultaneously, the reporter of the issue will receive notification of the date we plan to make the issue public.
+
+On the day of disclosure, we will take the following steps:
+
+- Apply the relevant patches to the `Dispatch` repository. The commit messages for these patches will indicate that they are for security issues but will not describe the issue in any detail; instead, they will warn of upcoming disclosure.
+- Issue the relevant releases.
+
+If a reported issue is particularly time-sensitive (due to a known exploit in the wild, for example), the time between advance notification and public disclosure may be shortened considerably.
+
+The list of people and organizations who receive the advance notification of security issues is not, and will not be, made public. This list generally consists of high-profile downstream users and is entirely at the discretion of the `Dispatch` team.
diff --git a/docs/docs/user-guide/cases/index.mdx b/docs/docs/user-guide/cases/index.mdx
new file mode 100644
index 000000000000..1c474683a3cf
--- /dev/null
+++ b/docs/docs/user-guide/cases/index.mdx
@@ -0,0 +1,3 @@
+# Cases
+
+Most participants will never have to use the Dispatch Case UI. But for commanders and power users, this view provides a way to search, filter, and interact with cases even if they are closed.
diff --git a/docs/docs/user-guide/dashboard/incident.mdx b/docs/docs/user-guide/dashboard/incident.mdx
new file mode 100644
index 000000000000..a1e14a740c94
--- /dev/null
+++ b/docs/docs/user-guide/dashboard/incident.mdx
@@ -0,0 +1,48 @@
+---
+description: How to get useful incident metrics.
+sidebar_position: 1
+---
+
+# Incidents
+
+## Aggregated top-line metrics
+
+
+
+
+
+
+
+These metrics are aggregated across all currently filtered incidents.
+
+## Breakdown on key incident facets
+
+### By Incident Type
+
+
+
+
+
+
+
+### By Incident Priority
+
+
+
+
+
+
+
+## Forecasting
+
+Dispatch has the ability to do some _simple_ forecasting. It looks at prior incident history and applies [Exponential Smoothing](https://machinelearningmastery.com/exponential-smoothing-for-time-series-forecasting-in-python/#:~:text=Exponential%20smoothing%20is%20a%20time%20series%20forecasting%20method%20for%20univariate%20data.&text=Exponential%20smoothing%20forecasting%20methods%20are,decreasing%20weight%20for%20past%20observations) to guess how many incidents will be encountered in the future.
+
+This works okay for small incident loads but becomes better with more incidents. If there isn't enough data to make a reasonable forecast, one will not be displayed in the UI.
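For intuition, simple exponential smoothing computes each new forecast as a weighted blend of the latest observation and the previous forecast. This is a sketch of the basic method only; Dispatch's actual model may use a seasonal or trend-aware variant.

```python
def exponential_smoothing_forecast(counts, alpha=0.5):
    """Forecast next period's incident count from a history of counts.

    alpha close to 1 weights recent observations heavily; alpha close
    to 0 weights the long-run history heavily.
    """
    forecast = counts[0]
    for observed in counts[1:]:
        forecast = alpha * observed + (1 - alpha) * forecast
    return forecast

# Monthly incident counts; the next-month forecast smooths the history.
print(exponential_smoothing_forecast([4, 6, 5, 7]))  # 6.0
```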
+
+An example forecast:
+
+
+
+
+
+
diff --git a/docs/docs/user-guide/dashboard/index.mdx b/docs/docs/user-guide/dashboard/index.mdx
new file mode 100644
index 000000000000..550862aa8251
--- /dev/null
+++ b/docs/docs/user-guide/dashboard/index.mdx
@@ -0,0 +1,10 @@
+---
+description: How to get useful metrics.
+sidebar_position: 1
+---
+
+# Dashboards
+
+Dispatch provides basic dashboarding and reporting functionality that allows users to understand how incidents impact their organization. It comes preconfigured with useful aggregations like the number of incidents by type and priority.
+
+Additionally, all of the dashboard graphs are dynamic. This dynamism allows us to identify interesting subsets of data by filtering by different incident facets.
diff --git a/docs/docs/user-guide/data.mdx b/docs/docs/user-guide/data.mdx
new file mode 100644
index 000000000000..246cafdf973e
--- /dev/null
+++ b/docs/docs/user-guide/data.mdx
@@ -0,0 +1,3 @@
+# Data
+
+The data view is how one accesses Dispatch's internal data catalog. It provides the ability to store metadata about data sources likely to be used to resolve incidents.
diff --git a/docs/docs/user-guide/incidents/commander.mdx b/docs/docs/user-guide/incidents/commander.mdx
new file mode 100644
index 000000000000..72c0fbd8181a
--- /dev/null
+++ b/docs/docs/user-guide/incidents/commander.mdx
@@ -0,0 +1,226 @@
+---
+description: What to expect as an incident commander.
+---
+
+# Commander
+
+## Reporting
+
+Within Dispatch, Incident Commanders \(ICs\) are also participants and will receive all of the participant messaging. When resolved as the Incident Commander, you are assigned that Dispatch role, and your identity is propagated.
+
+All Slack commands are listed below, or you may view _groups_ of commands relating to [People](#people), [Communications](#communications), [Tasks](#tasks), and [Incident Resources & Metadata](#incident-resources-and-metadata).
+
+## All Slack commands
+
+- [`/dispatch-add-timeline-event`](#%2Fdispatch-add-timeline-event)
+- [`/dispatch-assign-role`](#%2Fdispatch-assign-role)
+- [`/dispatch-engage-oncall`](#%2Fdispatch-engage-oncall)
+- [`/dispatch-list-my-tasks`](#%2Fdispatch-list-my-tasks)
+- [`/dispatch-list-participants`](#%2Fdispatch-list-participants)
+- [`/dispatch-list-tasks`](#%2Fdispatch-list-tasks)
+- [`/dispatch-list-workflows`](#%2Fdispatch-list-workflows)
+- [`/dispatch-list-incidents`](#%2Fdispatch-list-incidents)
+- [`/dispatch-notifications-group`](#%2Fdispatch-notifications-group)
+- [`/dispatch-report-executive`](#%2Fdispatch-report-executive)
+- [`/dispatch-report-incident`](#%2Fdispatch-report-incident)
+- [`/dispatch-report-tactical`](#%2Fdispatch-report-tactical)
+- [`/dispatch-update-incident`](#%2Fdispatch-update-incident)
+- [`/dispatch-update-participant`](#%2Fdispatch-update-participant)
+- [`/dispatch-run-workflow`](#%2Fdispatch-run-workflow)
+- [`/dispatch-create-task`](#%2Fdispatch-create-task)
+- [`/dispatch-create-case`](#%2Fdispatch-create-case)
+
+## People
+
+These commands help manage the people helping resolve the incident.
+
+### /dispatch-assign-role
+
+Anyone helping run an incident may play various roles. For example, you may have a scribe or an executive liaison, or you may hand off the incident to a new Incident Commander. At any of these times, use `/dispatch-assign-role` to quickly assign a role to any individual.
+
+It's essential to use this command when handing off responsibility for incident leadership. Doing so will help avoid any confusion about the identity of the current Incident Commander.
+
+
+
+
+
+
+
+### /dispatch-engage-oncall
+
+You'll need the help of various teams to resolve an incident. To quickly engage an on-call member of another team, use `/dispatch-engage-oncall` to determine their identity and optionally page them.
+
+
+
+
+
+
+
+### /dispatch-list-participants
+
+Use this command to determine which teams and individuals are engaged in the incident. The output looks like this:
+
+
+
+
+
+
+
+### /dispatch-update-participant
+
+Participants in an incident, or the Incident Commander, may want to know a participant's area of expertise or their expected contribution to resolving an incident. Use `/dispatch-update-participant` to update the reason a participant was added. The dialog appears like this:
+
+
+
+
+
+
+
+## Communications
+
+These commands help manage incident communications.
+
+### /dispatch-notifications-group
+
+An incident notifications group consists of individuals or distribution lists. Manage this group by using `/dispatch-notifications-group`.
+
+
+
+
+
+
+
+### /dispatch-report-executive
+
+Some stakeholders are invested in an incident's progress but aren't expected to be directly involved with the incident. For example, your Chief Financial Officer may want to know of an ongoing security incident regarding financial data but will likely not be directing participants or their actions. To keep external stakeholders such as these informed, use `/dispatch-report-executive` to build and distribute a high-level report.
+
+
+
+
+
+
+
+### /dispatch-report-incident
+
+Use `/dispatch-report-incident` to report a new incident.
+
+
+
+
+
+
+
+### /dispatch-report-tactical
+
+Regular tactical reports, such as using the Conditions, Actions, and Needs (CAN) format, are critical to keeping your participants well-informed. Use `/dispatch-report-tactical` to easily create these.
+
+The report form will appear like this:
+
+
+
+
+
+
+
+The output in the Slack channel will appear like this:
+
+
+
+
+
+
+
+## Tasks
+
+Dispatch provides a lightweight bridge between Google Docs comments assigned as tasks and your Slack incident channel.
+
+It looks like this, in the Incident Document:
+
+
+
+
+
+
+
+The following commands help manage these tasks associated with an incident.
+
+### /dispatch-list-my-tasks
+
+Any individual who issues the `/dispatch-list-my-tasks` command will see a list of tasks created by or assigned to them.
+
+
+
+
+
+
+
+### /dispatch-list-tasks
+
+Use `/dispatch-list-tasks` to display a temporary message listing all tasks associated with the incident.
+
+
+
+
+
+
+
+## Incident resources and metadata
+
+These commands help manage incident resources and metadata (data about the incident).
+
+### /dispatch-update-incident
+
+This command allows the IC to modify several aspects of the incident without ever leaving the conversation interface.
+
+
+
+
+
+
+
+### /dispatch-add-timeline-event
+
+This command helps you add an event to the incident timeline. You may use local time (derived from your Slack profile) or Coordinated Universal Time (UTC).
+
+
+
+
+
+
+
+### /dispatch-list-workflows
+
+This command will list all workflows associated with the current incident.
+
+
+
+
+
+
+
+### /dispatch-run-workflow
+
+This command will run a pre-configured workflow and associate its artifacts with the current incident.
+
+
+
+
+
+
+
+### /dispatch-create-task
+
+This command will create a task for the current incident.
+
+
+{/* TODO(averyl): replace this image */}
+
+
+
+
+### /dispatch-create-case
+
+This command will create a case for the current incident.
+
+
+
+
diff --git a/docs/docs/user-guide/incidents/feedback.mdx b/docs/docs/user-guide/incidents/feedback.mdx
new file mode 100644
index 000000000000..141d7b018f76
--- /dev/null
+++ b/docs/docs/user-guide/incidents/feedback.mdx
@@ -0,0 +1,27 @@
+# Feedback
+
+Dispatch sends a direct message on incident close to all incident participants, asking them to rate their satisfaction with how the incident was handled and to provide feedback. Feedback submitted by incident participants is stored and displayed in the feedback section of the UI and shared with the incident's commander via email on a daily basis.
+
+#### Direct Message
+
+
+
+
+
+
+
+#### Feedback Modal
+
+
+
+
+
+
+
+#### Feedback UI Table
+
+
+
+
+
+
diff --git a/docs/docs/user-guide/incidents/forms.mdx b/docs/docs/user-guide/incidents/forms.mdx
new file mode 100644
index 000000000000..0cea0455cb41
--- /dev/null
+++ b/docs/docs/user-guide/incidents/forms.mdx
@@ -0,0 +1,80 @@
+---
+description: How to create custom incident forms
+---
+
+# Forms
+
+## Creating a custom form
+
+Within Dispatch, admins can create a custom form under Settings -> (choose Project) -> Incident / Form Types. This brings up a table of existing form types (if any). Click on the NEW button to create a new form type.
+
+### Form schema
+
+The form schema takes a JSON array of form objects of the following type:
+
+| Attribute | Possible values | Notes |
+| --------- | --------------------------- | --------------------------------- |
+| type | boolean, select, text, date | |
+| title | | this is the question |
+| if | | conditional (see below) |
+| name | | unique identifier string |
+| multiple | true, false | only for select |
+| options | | list of options / only for select |
+| hint | | shown as small text (optional) |
+
+The following fields are required for each form object: `type`, `title`, and `name`. For `select` types, there must be a corresponding `options` attribute.
+
+Note: be sure to set the form type to "Enabled" so that it will appear in the forms tab in the incident.
+
+#### Conditionals
+
+The `if` attribute can be a complex JavaScript boolean expression. Refer to other form items using the format `$name`, where `name` is the item's unique name identifier.
+
+##### Example
+
+```
+ [
+ { "type": "boolean", "title": "Is this a good form?", "name": "good_form", "hint": "Check if you like"},
+ { "type": "select", "if": "$good_form", "title": "How good?", "options": [ "Very much", "A lot", "It's ok" ], "multiple": false, "name": "like_level"},
+ { "type": "text", "if": "$good_form && $like_level && $like_level.includes('A lot')", "title": "Provide more feedback", "name": "feedback"}
+ ]
+```
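To see how a conditional resolves, here is a rough Python approximation of the evaluation. The real evaluation happens client-side in JavaScript; this sketch only handles the `$name`, `&&`, `||`, and `.includes(...)` constructs shown in the example above.

```python
import re

def evaluate_condition(expression, answers):
    """Substitute $name references with form answers and evaluate the result."""
    expr = re.sub(r"\$(\w+)", lambda m: repr(answers.get(m.group(1))), expression)
    expr = expr.replace("&&", " and ").replace("||", " or ")
    expr = re.sub(r"\.includes\('([^']*)'\)", r".__contains__('\1')", expr)
    return bool(eval(expr))  # fine for a sketch; never eval untrusted input

answers = {"good_form": True, "like_level": ["A lot"]}
print(evaluate_condition("$good_form && $like_level.includes('A lot')", answers))  # True
```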
+
+## Fill out a form in an incident
+
+After an incident is opened, go to the View/Edit panel and select the Forms tab at the top. This view will list all of the forms that have been filled out so far. They can either be in the Draft or Completed state. Users can either edit an existing form or create a new one based on any of the enabled form types created as above.
+
+While a form is being completed, the user can **Cancel** to discard any changes, **Save as Draft** to save the filled in information and set as _Draft_, or **Submit** to save and set as _Completed_.
+
+## Attorney review
+
+A new "Forms" tab on the left lists all of the _Draft_ and _Completed_ forms for leadership and attorney review. For each form, users can view/edit, delete, or use a special **Attorney Review** option. This option shows relevant incident details and the values filled out in the form. It also provides an attorney status dropdown and two new fields for attorney notes and open questions.
+
+## Additional attorney questions
+
+If additional questions are required for the attorney team to assess the information on the form, each form type has the option of asking additional attorney questions. This schema follows the same format as the main form schema. The additional attorney questions will appear only on the Attorney Section when performing the Attorney Review from the Forms page.
+
+
+
+## Scoring schema
+
+A risk score can be calculated based on the form responses. The scoring schema is a JSON array of objects of the following type:
+
+| Attribute | Type | Notes |
+| --------- | ---- | -------------------------------------- |
+| var | text | the corresponding variable on the form |
+| includes | list | a JSON list of chosen options |
+| score | int | the score to add |
+
+If the form data corresponding to `var` contains any element of the list `includes`, then the value indicated by `score` is added to the Risk Score. Note that scoring is currently not cumulative: if two elements of the list appear in the form data, the score is still added only once. For example, using the example form schema above and the scoring schema below, if the user selects "Very much", the risk score would be 10.
+
+```json
+ [
+ { "var": "like_level", "includes": [ "Very much", "A lot" ], "score": 10},
+ { "var": "like_level", "includes": [ "It's ok" ], "score": 1}
+ ]
+```
+
+The risk score shows up on the Forms tab in the Risk Score column as well as on the attorney summary section on the Attorney Review card.
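The scoring behavior described above can be sketched in a few lines of Python. This is an illustrative sketch only, not Dispatch's actual implementation; the function name `compute_risk_score` and the shape of `form_data` are assumptions for demonstration.

```python
# Illustrative sketch of the scoring rules described above; this is not
# Dispatch's actual code, and compute_risk_score/form_data are hypothetical.
def compute_risk_score(scoring_schema, form_data):
    total = 0
    for rule in scoring_schema:
        answer = form_data.get(rule["var"]) or []
        if isinstance(answer, str):  # single-select answers arrive as a string
            answer = [answer]
        # A rule's score is added at most once, even if several of its
        # "includes" options were selected (scoring is not cumulative).
        if any(option in answer for option in rule["includes"]):
            total += rule["score"]
    return total

schema = [
    {"var": "like_level", "includes": ["Very much", "A lot"], "score": 10},
    {"var": "like_level", "includes": ["It's ok"], "score": 1},
]
print(compute_risk_score(schema, {"like_level": ["Very much"]}))  # 10
```

Selecting both "Very much" and "A lot" would still yield 10, since a rule's score is added only once.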
+
+
diff --git a/docs/docs/user-guide/incidents/index.mdx b/docs/docs/user-guide/incidents/index.mdx
new file mode 100644
index 000000000000..1b948de7782b
--- /dev/null
+++ b/docs/docs/user-guide/incidents/index.mdx
@@ -0,0 +1,13 @@
+---
+sidebar_position: 2
+---
+
+# Incidents
+
+Most participants will never have to use the Dispatch Incident UI. But for commanders and power users, this view provides a way to search, filter, and interact with incidents even if they are closed.
+
+
+
+
+
+
diff --git a/docs/docs/user-guide/incidents/participant.mdx b/docs/docs/user-guide/incidents/participant.mdx
new file mode 100644
index 000000000000..50e88bf867e3
--- /dev/null
+++ b/docs/docs/user-guide/incidents/participant.mdx
@@ -0,0 +1,83 @@
+---
+description: What to expect as an incident participant.
+---
+
+# Participant
+
+## Reporting
+
+Dispatch attempts to make reporting incidents as easy as possible. Dispatch provides a dedicated incident report form that users throughout the organization can submit to engage incident-related resources.
+
+Located at: `https://<your-dispatch-domain>/default/incidents/report`
+
+
+
+
+
+
+
+Once submitted, the user is presented with all of the incident resources they need to start managing the incident.
+
+
+
+
+
+
+
+
+
+
+
+
+
+## During
+
+After an incident is created, Dispatch will engage new participants automatically. Which participants are engaged is determined by rules defined in the Dispatch Admin UI.
+
+Each new participant receives a welcome message (Email + Slack) providing resources and information to orient them to the incident.
+
+
+
+
+
+
+
+
+
+
+
+
+
+Throughout the incident, Dispatch manages the resources necessary to run your investigation, while also providing reminders and notifications.
+
+## After
+
+After an incident is marked stable, Dispatch continues to help with incident management by creating additional resources, such as Post Incident Review (PIR) documents.
+
+## Notifications
+
+In addition to Dispatch engaging individuals that will be directly responsible for managing the incident, it provides notifications for general awareness throughout the organization.
+
+:::info
+The new incident notification message includes a "Join" button if "Self-Join" is enabled on the project; this allows individuals to add themselves to the incident (and its resources) without involvement from the incident commander.
+:::
+
+## Self-service engagement
+
+Often, participants want to "self-subscribe" to incidents matching a set of parameters. Dispatch allows individuals to be automatically engaged based on those parameters.
+
+To set up an individual's engagement, navigate to `Contact > Individual` and either edit an existing individual or create a new one.
+
+Next, modify the individual's engagement by selecting or adding the terms or phrases that should trigger their engagement; when a match is found in an incident's attributes, the user is invited to the incident.
+
+For more documentation on incident engagement, see [here](administration/settings/contact/index.mdx).
+
+### How it works
+
+For any given set of parameters (incident type, incident priority, title, description, etc.), Dispatch will attempt to engage any individual associated with those parameters. Currently, this is an "OR" association between terms, meaning that if any term matches, the individual is pulled into the incident.
+
+As the incident evolves, new information is uncovered. Dispatch will re-evaluate these associations any time those parameters change, adding additional individuals if necessary.
+
+As an example, take an incident that is reported as a "Credential Leak". Dispatch will engage any individual that has associated the terms "Credential", "Leak", and "Credential Leak" (case and punctuation are ignored).
+
+Now, if we find out during the investigation that the incident is really a "System Compromise" and we change the description and title appropriately, Dispatch will then pull in individuals associated with the terms "System", "Compromise", and "System Compromise".
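The term matching described above can be sketched roughly as follows. This is a hedged illustration, not Dispatch's actual engagement code: it extracts normalized (lowercased, punctuation-stripped) word n-grams from incident text and engages any individual whose terms match, using the "OR" association. The helper names `extract_terms` and `individuals_to_engage` are hypothetical.

```python
import re

# Illustrative sketch only -- not Dispatch's actual engagement logic.
def extract_terms(text, max_words=3):
    """Return lowercased, punctuation-free word n-grams from incident text."""
    words = re.sub(r"[^\w\s]", "", text.lower()).split()
    return {
        " ".join(words[i:i + n])
        for n in range(1, max_words + 1)
        for i in range(len(words) - n + 1)
    }

def individuals_to_engage(incident_text, individual_terms):
    """OR semantics: any single matching term pulls the individual in."""
    found = extract_terms(incident_text)
    return [
        name
        for name, terms in individual_terms.items()
        if any(term.lower() in found for term in terms)
    ]

print(sorted(extract_terms("Credential Leak")))
# ['credential', 'credential leak', 'leak']
print(individuals_to_engage(
    "Reported: Credential Leak",
    {"alice": ["Credential Leak"], "bob": ["System Compromise"]},
))
# ['alice']
```

Changing the title to "System Compromise" and re-running the match would then pull in `bob`, mirroring the re-evaluation behavior described above.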
diff --git a/docs/docs/user-guide/incidents/tasks.mdx b/docs/docs/user-guide/incidents/tasks.mdx
new file mode 100644
index 000000000000..485a4b780e74
--- /dev/null
+++ b/docs/docs/user-guide/incidents/tasks.mdx
@@ -0,0 +1,9 @@
+# Tasks
+
+As with the incident view, incident tasks are primarily managed in the incident channel itself. But when you want to view all incident tasks across the organization, this view gives you that ability.
+
+
+
+
+
+
diff --git a/docs/docs/user-guide/index.mdx b/docs/docs/user-guide/index.mdx
new file mode 100644
index 000000000000..434ec58577c4
--- /dev/null
+++ b/docs/docs/user-guide/index.mdx
@@ -0,0 +1,14 @@
+---
+title: Introduction
+sidebar_position: 0
+---
+
+# User Guide
+
+There are two _main_ personas within Dispatch: an incident commander and an incident participant.
+
+The incident commander is responsible for driving an incident to resolution. They act as a tie-breaking vote on contentious decisions and are responsible for understanding the _entire_ incident context. Depending on the team and scale of the incident, they are also responsible for delegating tasks and asking questions of other incident participants.
+
+Incident participants are subject matter experts (SMEs) brought into the incident to help resolve tasks or answer the incident commander's questions.
+
+Next, you will find more information about how each of these personas interacts with Dispatch.
diff --git a/docs/docusaurus.config.js b/docs/docusaurus.config.js
new file mode 100644
index 000000000000..f7c4bf327da0
--- /dev/null
+++ b/docs/docusaurus.config.js
@@ -0,0 +1,116 @@
+// @ts-check
+// Note: type annotations allow type checking and IDEs autocompletion
+
+const lightCodeTheme = require("prism-react-renderer/themes/github")
+const darkCodeTheme = require("prism-react-renderer/themes/dracula")
+
+/** @type {import('@docusaurus/types').Config} */
+const config = {
+ title: "Dispatch - Documentation",
+ tagline: "Incident Management for Everyone",
+ favicon: "img/favicon.ico",
+
+ // Set the production url of your site here
+ url: "https://netflix.github.io/",
+ // Set the // pathname under which your site is served
+ // For GitHub pages deployment, it is often '//'
+ baseUrl: "/dispatch",
+
+ // GitHub pages deployment config.
+ // If you aren't using GitHub pages, you don't need these.
+ organizationName: "netflix", // Usually your GitHub org/user name.
+ projectName: "dispatch", // Usually your repo name.
+ trailingSlash: false,
+
+ onBrokenLinks: "throw",
+ onBrokenMarkdownLinks: "warn",
+
+ // Even if you don't use internalization, you can use this field to set useful
+ // metadata like html lang. For example, if your site is Chinese, you may want
+ // to replace "en" with "zh-Hans".
+ i18n: {
+ defaultLocale: "en",
+ locales: ["en"],
+ },
+ plugins: [
+ [
+ require.resolve("@cmfcmf/docusaurus-search-local"),
+ {
+ indexPages: true,
+ style: undefined,
+ },
+ ],
+ ],
+
+ presets: [
+ [
+ "classic",
+ /** @type {import('@docusaurus/preset-classic').Options} */
+ ({
+ docs: {
+ sidebarPath: require.resolve("./sidebars.js"),
+ editUrl: ({ docPath }) =>
+ `https://github.com/netflix/dispatch/edit/main/docs/docs/${docPath}`,
+ },
+ theme: {
+ customCss: require.resolve("./src/css/custom.css"),
+ },
+ }),
+ ],
+ [
+ "redocusaurus",
+ {
+ // Plugin Options for loading OpenAPI files
+ specs: [
+ {
+ spec: "scripts/openapi.yaml",
+ route: "/docs/api/",
+ },
+ ],
+ // Theme Options for modifying how redoc renders them
+ theme: {
+ // Change with your site colors
+ primaryColor: "#E50914",
+ },
+ },
+ ],
+ ],
+
+ themeConfig:
+ /** @type {import('@docusaurus/preset-classic').ThemeConfig} */
+ ({
+ // Replace with your project's social card
+ image: "img/docusaurus-social-card.jpg",
+ navbar: {
+ title: "Dispatch",
+ items: [
+ { to: "/docs/user-guide", label: "User Guide", position: "left" },
+ { to: "/docs/administration", label: "Administration", position: "left" },
+ { to: "/docs/api", label: "API", position: "left" },
+ {
+ to: "/docs/support",
+ label: "Support",
+ position: "left",
+ },
+
+ {
+ href: "https://github.com/Netflix/dispatch",
+ position: "right",
+ className: "header-github-link",
+ "aria-label": "GitHub repository",
+ },
+ ],
+ },
+ footer: {
+ style: "dark",
+ links: [],
+ copyright: `Copyright © ${new Date().getFullYear()} Dispatch Documentation. Built with Docusaurus.`,
+ },
+ prism: {
+ theme: lightCodeTheme,
+ darkTheme: darkCodeTheme,
+ },
+ }),
+}
+
+module.exports = config
diff --git a/docs/faq.md b/docs/faq.md
deleted file mode 100644
index f89f88e4e493..000000000000
--- a/docs/faq.md
+++ /dev/null
@@ -1,16 +0,0 @@
-# FAQ
-
-## Running âdispatch database upgradeâ seems stuck.
-
-Most likely, the upgrade is stuck because an existing query on the database is holding onto a lock that the migration needs.
-
-To resolve, login to your lemur database and run:
-
-> SELECT \* FROM pg\_locks l INNER JOIN pg\_stat\_activity s ON \(l.pid = s.pid\) WHERE waiting AND NOT granted;
-
-This will give you a list of queries that are currently waiting to be executed. From there attempt to idenity the PID of the query blocking the migration. Once found execute:
-
-> select pg\_terminate\_backend\(<blocking-pid>\);
-
-See [http://stackoverflow.com/questions/22896496/alembic-migration-stuck-with-postgresql](http://stackoverflow.com/questions/22896496/alembic-migration-stuck-with-postgresql) for more.
-
diff --git a/docs/installation.md b/docs/installation.md
deleted file mode 100644
index 5445b556ba9e..000000000000
--- a/docs/installation.md
+++ /dev/null
@@ -1,34 +0,0 @@
-# Installation
-
-Dispatch relies on multiple services to work, which are all orchestrated by `Docker Compose`.
-
-### Requirements
-
-* [Docker](https://www.docker.com/) 17.05.0+
-* [Docker Compose](https://docs.docker.com/compose/) 1.19.0+
-* A dedicated \(sub\)domain to host Dispatch on \(for example, dispatch.yourcompany.com\).
-* At least 2400MB memory
-* 2 CPU Cores
-
-## Installing Dispatch Server
-
-We strongly recommend using Docker, for installing Dispatch and all it's services. If you need to to something custom, you can use this repository as the basis of your setup. If you do not wish to use the Docker images we provide, you can still find Dispatch on PyPI. However, we don't recommend that method. You'll need to work your way back from the main Dispatch image. It is not too hard, but you are likely to spend a lot more time and hit some bumps.
-
-To install Dispatch from the repository, clone the repository locally:
-
-```bash
-git clone https://github.com/Netflix/dispatch-docker.git
-```
-
-Before starting installation, we strongly recommend you check out [how to configure your Dispatch instance](configuration/) as you'd need to rebuild your images \(`docker-compose build`\) if you want to change your configuration settings. You may copy and edit the example configs provided in the repository. If none exists, the install script will use these examples as actual configurations.
-
-{% hint style="info" %}
-Note: Dispatch will not start without at least a few required configuration variables, see the example [env](https://github.com/Netflix/dispatch/blob/develop/docker/.env.example).
-{% endhint %}
-
-To start, run the install script:
-
-```bash
-./install.sh
-```
-
diff --git a/docs/package-lock.json b/docs/package-lock.json
new file mode 100644
index 000000000000..23f374933cd6
--- /dev/null
+++ b/docs/package-lock.json
@@ -0,0 +1,13707 @@
+{
+ "name": "docs",
+ "version": "0.0.0",
+ "lockfileVersion": 3,
+ "requires": true,
+ "packages": {
+ "": {
+ "version": "0.0.0",
+ "dependencies": {
+ "@cmfcmf/docusaurus-search-local": "^1.0.0",
+ "@docusaurus/core": "2.4.0",
+ "@docusaurus/preset-classic": "2.4.0",
+ "@mdx-js/react": "^1.6.22",
+ "clsx": "^1.2.1",
+ "prism-react-renderer": "^1.3.5",
+ "react": "^17.0.2",
+ "react-dom": "^17.0.2",
+ "redocusaurus": "^1.6.1"
+ },
+ "devDependencies": {
+ "@docusaurus/module-type-aliases": "2.4.0"
+ },
+ "engines": {
+ "node": ">=16.14"
+ }
+ },
+ "node_modules/@algolia/autocomplete-core": {
+ "version": "1.7.4",
+ "resolved": "https://registry.npmjs.org/@algolia/autocomplete-core/-/autocomplete-core-1.7.4.tgz",
+ "integrity": "sha512-daoLpQ3ps/VTMRZDEBfU8ixXd+amZcNJ4QSP3IERGyzqnL5Ch8uSRFt/4G8pUvW9c3o6GA4vtVv4I4lmnkdXyg==",
+ "dependencies": {
+ "@algolia/autocomplete-shared": "1.7.4"
+ }
+ },
+ "node_modules/@algolia/autocomplete-js": {
+ "version": "1.8.3",
+ "resolved": "https://registry.npmjs.org/@algolia/autocomplete-js/-/autocomplete-js-1.8.3.tgz",
+ "integrity": "sha512-h5v/qp8CwmCUOCaNkUa+vaybnIpIoJGEfwE2Ks/84KAqIHYCBgcylwn92PkIL3gbQCok2sc6JoSIlUo0eAgPsQ==",
+ "dependencies": {
+ "@algolia/autocomplete-core": "1.8.3",
+ "@algolia/autocomplete-preset-algolia": "1.8.3",
+ "@algolia/autocomplete-shared": "1.8.3",
+ "htm": "^3.1.1",
+ "preact": "^10.0.0"
+ },
+ "peerDependencies": {
+ "@algolia/client-search": ">= 4.5.1 < 6",
+ "algoliasearch": ">= 4.9.1 < 6"
+ }
+ },
+ "node_modules/@algolia/autocomplete-js/node_modules/@algolia/autocomplete-core": {
+ "version": "1.8.3",
+ "resolved": "https://registry.npmjs.org/@algolia/autocomplete-core/-/autocomplete-core-1.8.3.tgz",
+ "integrity": "sha512-DpNL4PZTes+6pg2ysJQzZZBQUvHSYP1q8IkiJA7UoNqFMf0pdq2bSIehuiMTxNegpMjSszaB7G+o5UgxavKhWA==",
+ "dependencies": {
+ "@algolia/autocomplete-shared": "1.8.3"
+ }
+ },
+ "node_modules/@algolia/autocomplete-js/node_modules/@algolia/autocomplete-preset-algolia": {
+ "version": "1.8.3",
+ "resolved": "https://registry.npmjs.org/@algolia/autocomplete-preset-algolia/-/autocomplete-preset-algolia-1.8.3.tgz",
+ "integrity": "sha512-M5B9VZtMtBFS8KSIzv8m0gtwVYtFBBjCvr8boBi+orbQUqzdoj5f70CqhQxUtnNcFGizHUaShUDV571F33/m7g==",
+ "dependencies": {
+ "@algolia/autocomplete-shared": "1.8.3"
+ },
+ "peerDependencies": {
+ "@algolia/client-search": ">= 4.9.1 < 6",
+ "algoliasearch": ">= 4.9.1 < 6"
+ }
+ },
+ "node_modules/@algolia/autocomplete-js/node_modules/@algolia/autocomplete-shared": {
+ "version": "1.8.3",
+ "resolved": "https://registry.npmjs.org/@algolia/autocomplete-shared/-/autocomplete-shared-1.8.3.tgz",
+ "integrity": "sha512-llwPEemKzVhOjL9AsoZPejkaTTAsCB/2HHBQapC8LgQ2E/ipD5M1kTT6oSJskSVO5zI0YbBOCxAigZhgpPJ3eA=="
+ },
+ "node_modules/@algolia/autocomplete-preset-algolia": {
+ "version": "1.7.4",
+ "resolved": "https://registry.npmjs.org/@algolia/autocomplete-preset-algolia/-/autocomplete-preset-algolia-1.7.4.tgz",
+ "integrity": "sha512-s37hrvLEIfcmKY8VU9LsAXgm2yfmkdHT3DnA3SgHaY93yjZ2qL57wzb5QweVkYuEBZkT2PIREvRoLXC2sxTbpQ==",
+ "dependencies": {
+ "@algolia/autocomplete-shared": "1.7.4"
+ },
+ "peerDependencies": {
+ "@algolia/client-search": ">= 4.9.1 < 6",
+ "algoliasearch": ">= 4.9.1 < 6"
+ }
+ },
+ "node_modules/@algolia/autocomplete-shared": {
+ "version": "1.7.4",
+ "resolved": "https://registry.npmjs.org/@algolia/autocomplete-shared/-/autocomplete-shared-1.7.4.tgz",
+ "integrity": "sha512-2VGCk7I9tA9Ge73Km99+Qg87w0wzW4tgUruvWAn/gfey1ZXgmxZtyIRBebk35R1O8TbK77wujVtCnpsGpRy1kg=="
+ },
+ "node_modules/@algolia/autocomplete-theme-classic": {
+ "version": "1.8.3",
+ "resolved": "https://registry.npmjs.org/@algolia/autocomplete-theme-classic/-/autocomplete-theme-classic-1.8.3.tgz",
+ "integrity": "sha512-sZt8uyBp5bwPTbqM+2cn/T7/OX8y8neEEtX10wWBgD7gakacn3//VEIdD1/+Yu1TJ2frWoP+SwuEbloe/zsMDg=="
+ },
+ "node_modules/@algolia/cache-browser-local-storage": {
+ "version": "4.16.0",
+ "resolved": "https://registry.npmjs.org/@algolia/cache-browser-local-storage/-/cache-browser-local-storage-4.16.0.tgz",
+ "integrity": "sha512-jVrk0YB3tjOhD5/lhBtYCVCeLjZmVpf2kdi4puApofytf/R0scjWz0GdozlW4HhU+Prxmt/c9ge4QFjtv5OAzQ==",
+ "dependencies": {
+ "@algolia/cache-common": "4.16.0"
+ }
+ },
+ "node_modules/@algolia/cache-common": {
+ "version": "4.16.0",
+ "resolved": "https://registry.npmjs.org/@algolia/cache-common/-/cache-common-4.16.0.tgz",
+ "integrity": "sha512-4iHjkSYQYw46pITrNQgXXhvUmcekI8INz1m+SzmqLX8jexSSy4Ky4zfGhZzhhhLHXUP3+x/PK/c0qPjxEvRwKQ=="
+ },
+ "node_modules/@algolia/cache-in-memory": {
+ "version": "4.16.0",
+ "resolved": "https://registry.npmjs.org/@algolia/cache-in-memory/-/cache-in-memory-4.16.0.tgz",
+ "integrity": "sha512-p7RYykvA6Ip6QENxrh99nOD77otVh1sJRivcgcVpnjoZb5sIN3t33eUY1DpB9QSBizcrW+qk19rNkdnZ43a+PQ==",
+ "dependencies": {
+ "@algolia/cache-common": "4.16.0"
+ }
+ },
+ "node_modules/@algolia/client-account": {
+ "version": "4.16.0",
+ "resolved": "https://registry.npmjs.org/@algolia/client-account/-/client-account-4.16.0.tgz",
+ "integrity": "sha512-eydcfpdIyuWoKgUSz5iZ/L0wE/Wl7958kACkvTHLDNXvK/b8Z1zypoJavh6/km1ZNQmFpeYS2jrmq0kUSFn02w==",
+ "dependencies": {
+ "@algolia/client-common": "4.16.0",
+ "@algolia/client-search": "4.16.0",
+ "@algolia/transporter": "4.16.0"
+ }
+ },
+ "node_modules/@algolia/client-analytics": {
+ "version": "4.16.0",
+ "resolved": "https://registry.npmjs.org/@algolia/client-analytics/-/client-analytics-4.16.0.tgz",
+ "integrity": "sha512-cONWXH3BfilgdlCofUm492bJRWtpBLVW/hsUlfoFtiX1u05xoBP7qeiDwh9RR+4pSLHLodYkHAf5U4honQ55Qg==",
+ "dependencies": {
+ "@algolia/client-common": "4.16.0",
+ "@algolia/client-search": "4.16.0",
+ "@algolia/requester-common": "4.16.0",
+ "@algolia/transporter": "4.16.0"
+ }
+ },
+ "node_modules/@algolia/client-common": {
+ "version": "4.16.0",
+ "resolved": "https://registry.npmjs.org/@algolia/client-common/-/client-common-4.16.0.tgz",
+ "integrity": "sha512-QVdR4019ukBH6f5lFr27W60trRxQF1SfS1qo0IP6gjsKhXhUVJuHxOCA6ArF87jrNkeuHEoRoDU+GlvaecNo8g==",
+ "dependencies": {
+ "@algolia/requester-common": "4.16.0",
+ "@algolia/transporter": "4.16.0"
+ }
+ },
+ "node_modules/@algolia/client-personalization": {
+ "version": "4.16.0",
+ "resolved": "https://registry.npmjs.org/@algolia/client-personalization/-/client-personalization-4.16.0.tgz",
+ "integrity": "sha512-irtLafssDGPuhYqIwxqOxiWlVYvrsBD+EMA1P9VJtkKi3vSNBxiWeQ0f0Tn53cUNdSRNEssfoEH84JL97SV2SQ==",
+ "dependencies": {
+ "@algolia/client-common": "4.16.0",
+ "@algolia/requester-common": "4.16.0",
+ "@algolia/transporter": "4.16.0"
+ }
+ },
+ "node_modules/@algolia/client-search": {
+ "version": "4.16.0",
+ "resolved": "https://registry.npmjs.org/@algolia/client-search/-/client-search-4.16.0.tgz",
+ "integrity": "sha512-xsfrAE1jO/JDh1wFrRz+alVyW+aA6qnkzmbWWWZWEgVF3EaFqzIf9r1l/aDtDdBtNTNhX9H3Lg31+BRtd5izQA==",
+ "dependencies": {
+ "@algolia/client-common": "4.16.0",
+ "@algolia/requester-common": "4.16.0",
+ "@algolia/transporter": "4.16.0"
+ }
+ },
+ "node_modules/@algolia/events": {
+ "version": "4.0.1",
+ "resolved": "https://registry.npmjs.org/@algolia/events/-/events-4.0.1.tgz",
+ "integrity": "sha512-FQzvOCgoFXAbf5Y6mYozw2aj5KCJoA3m4heImceldzPSMbdyS4atVjJzXKMsfX3wnZTFYwkkt8/z8UesLHlSBQ=="
+ },
+ "node_modules/@algolia/logger-common": {
+ "version": "4.16.0",
+ "resolved": "https://registry.npmjs.org/@algolia/logger-common/-/logger-common-4.16.0.tgz",
+ "integrity": "sha512-U9H8uCzSDuePJmbnjjTX21aPDRU6x74Tdq3dJmdYu2+pISx02UeBJm4kSgc9RW5jcR5j35G9gnjHY9Q3ngWbyQ=="
+ },
+ "node_modules/@algolia/logger-console": {
+ "version": "4.16.0",
+ "resolved": "https://registry.npmjs.org/@algolia/logger-console/-/logger-console-4.16.0.tgz",
+ "integrity": "sha512-+qymusiM+lPZKrkf0tDjCQA158eEJO2IU+Nr/sJ9TFyI/xkFPjNPzw/Qbc8Iy/xcOXGlc6eMgmyjtVQqAWq6UA==",
+ "dependencies": {
+ "@algolia/logger-common": "4.16.0"
+ }
+ },
+ "node_modules/@algolia/requester-browser-xhr": {
+ "version": "4.16.0",
+ "resolved": "https://registry.npmjs.org/@algolia/requester-browser-xhr/-/requester-browser-xhr-4.16.0.tgz",
+ "integrity": "sha512-gK+kvs6LHl/PaOJfDuwjkopNbG1djzFLsVBklGBsSU6h6VjFkxIpo6Qq80IK14p9cplYZfhfaL12va6Q9p3KVQ==",
+ "dependencies": {
+ "@algolia/requester-common": "4.16.0"
+ }
+ },
+ "node_modules/@algolia/requester-common": {
+ "version": "4.16.0",
+ "resolved": "https://registry.npmjs.org/@algolia/requester-common/-/requester-common-4.16.0.tgz",
+ "integrity": "sha512-3Zmcs/iMubcm4zqZ3vZG6Zum8t+hMWxGMzo0/uY2BD8o9q5vMxIYI0c4ocdgQjkXcix189WtZNkgjSOBzSbkdw=="
+ },
+ "node_modules/@algolia/requester-node-http": {
+ "version": "4.16.0",
+ "resolved": "https://registry.npmjs.org/@algolia/requester-node-http/-/requester-node-http-4.16.0.tgz",
+ "integrity": "sha512-L8JxM2VwZzh8LJ1Zb8TFS6G3icYsCKZsdWW+ahcEs1rGWmyk9SybsOe1MLnjonGBaqPWJkn9NjS7mRdjEmBtKA==",
+ "dependencies": {
+ "@algolia/requester-common": "4.16.0"
+ }
+ },
+ "node_modules/@algolia/transporter": {
+ "version": "4.16.0",
+ "resolved": "https://registry.npmjs.org/@algolia/transporter/-/transporter-4.16.0.tgz",
+ "integrity": "sha512-H9BVB2EAjT65w7XGBNf5drpsW39x2aSZ942j4boSAAJPPlLmjtj5IpAP7UAtsV8g9Beslonh0bLa1XGmE/P0BA==",
+ "dependencies": {
+ "@algolia/cache-common": "4.16.0",
+ "@algolia/logger-common": "4.16.0",
+ "@algolia/requester-common": "4.16.0"
+ }
+ },
+ "node_modules/@ampproject/remapping": {
+ "version": "2.2.0",
+ "resolved": "https://registry.npmjs.org/@ampproject/remapping/-/remapping-2.2.0.tgz",
+ "integrity": "sha512-qRmjj8nj9qmLTQXXmaR1cck3UXSRMPrbsLJAasZpF+t3riI71BXed5ebIOYwQntykeZuhjsdweEc9BxH5Jc26w==",
+ "dependencies": {
+ "@jridgewell/gen-mapping": "^0.1.0",
+ "@jridgewell/trace-mapping": "^0.3.9"
+ },
+ "engines": {
+ "node": ">=6.0.0"
+ }
+ },
+ "node_modules/@babel/code-frame": {
+ "version": "7.26.2",
+ "resolved": "https://registry.npmjs.org/@babel/code-frame/-/code-frame-7.26.2.tgz",
+ "integrity": "sha512-RJlIHRueQgwWitWgF8OdFYGZX328Ax5BCemNGlqHfplnRT9ESi8JkFlvaVYbS+UubVY6dpv87Fs2u5M29iNFVQ==",
+ "license": "MIT",
+ "dependencies": {
+ "@babel/helper-validator-identifier": "^7.25.9",
+ "js-tokens": "^4.0.0",
+ "picocolors": "^1.0.0"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/compat-data": {
+ "version": "7.21.0",
+ "resolved": "https://registry.npmjs.org/@babel/compat-data/-/compat-data-7.21.0.tgz",
+ "integrity": "sha512-gMuZsmsgxk/ENC3O/fRw5QY8A9/uxQbbCEypnLIiYYc/qVJtEV7ouxC3EllIIwNzMqAQee5tanFabWsUOutS7g==",
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/core": {
+ "version": "7.21.3",
+ "resolved": "https://registry.npmjs.org/@babel/core/-/core-7.21.3.tgz",
+ "integrity": "sha512-qIJONzoa/qiHghnm0l1n4i/6IIziDpzqc36FBs4pzMhDUraHqponwJLiAKm1hGLP3OSB/TVNz6rMwVGpwxxySw==",
+ "dependencies": {
+ "@ampproject/remapping": "^2.2.0",
+ "@babel/code-frame": "^7.18.6",
+ "@babel/generator": "^7.21.3",
+ "@babel/helper-compilation-targets": "^7.20.7",
+ "@babel/helper-module-transforms": "^7.21.2",
+ "@babel/helpers": "^7.21.0",
+ "@babel/parser": "^7.21.3",
+ "@babel/template": "^7.20.7",
+ "@babel/traverse": "^7.21.3",
+ "@babel/types": "^7.21.3",
+ "convert-source-map": "^1.7.0",
+ "debug": "^4.1.0",
+ "gensync": "^1.0.0-beta.2",
+ "json5": "^2.2.2",
+ "semver": "^6.3.0"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/babel"
+ }
+ },
+ "node_modules/@babel/core/node_modules/semver": {
+ "version": "6.3.1",
+ "resolved": "https://registry.npmjs.org/semver/-/semver-6.3.1.tgz",
+ "integrity": "sha512-BR7VvDCVHO+q2xBEWskxS6DJE1qRnb7DxzUrogb71CWoSficBxYsiAGd+Kl0mmq/MprG9yArRkyrQxTO6XjMzA==",
+ "bin": {
+ "semver": "bin/semver.js"
+ }
+ },
+ "node_modules/@babel/generator": {
+ "version": "7.23.0",
+ "resolved": "https://registry.npmjs.org/@babel/generator/-/generator-7.23.0.tgz",
+ "integrity": "sha512-lN85QRR+5IbYrMWM6Y4pE/noaQtg4pNiqeNGX60eqOfo6gtEj6uw/JagelB8vVztSd7R6M5n1+PQkDbHbBRU4g==",
+ "dependencies": {
+ "@babel/types": "^7.23.0",
+ "@jridgewell/gen-mapping": "^0.3.2",
+ "@jridgewell/trace-mapping": "^0.3.17",
+ "jsesc": "^2.5.1"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/generator/node_modules/@jridgewell/gen-mapping": {
+ "version": "0.3.2",
+ "resolved": "https://registry.npmjs.org/@jridgewell/gen-mapping/-/gen-mapping-0.3.2.tgz",
+ "integrity": "sha512-mh65xKQAzI6iBcFzwv28KVWSmCkdRBWoOh+bYQGW3+6OZvbbN3TqMGo5hqYxQniRcH9F2VZIoJCm4pa3BPDK/A==",
+ "dependencies": {
+ "@jridgewell/set-array": "^1.0.1",
+ "@jridgewell/sourcemap-codec": "^1.4.10",
+ "@jridgewell/trace-mapping": "^0.3.9"
+ },
+ "engines": {
+ "node": ">=6.0.0"
+ }
+ },
+ "node_modules/@babel/helper-annotate-as-pure": {
+ "version": "7.18.6",
+ "resolved": "https://registry.npmjs.org/@babel/helper-annotate-as-pure/-/helper-annotate-as-pure-7.18.6.tgz",
+ "integrity": "sha512-duORpUiYrEpzKIop6iNbjnwKLAKnJ47csTyRACyEmWj0QdUrm5aqNJGHSSEQSUAvNW0ojX0dOmK9dZduvkfeXA==",
+ "dependencies": {
+ "@babel/types": "^7.18.6"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/helper-builder-binary-assignment-operator-visitor": {
+ "version": "7.18.9",
+ "resolved": "https://registry.npmjs.org/@babel/helper-builder-binary-assignment-operator-visitor/-/helper-builder-binary-assignment-operator-visitor-7.18.9.tgz",
+ "integrity": "sha512-yFQ0YCHoIqarl8BCRwBL8ulYUaZpz3bNsA7oFepAzee+8/+ImtADXNOmO5vJvsPff3qi+hvpkY/NYBTrBQgdNw==",
+ "dependencies": {
+ "@babel/helper-explode-assignable-expression": "^7.18.6",
+ "@babel/types": "^7.18.9"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/helper-compilation-targets": {
+ "version": "7.20.7",
+ "resolved": "https://registry.npmjs.org/@babel/helper-compilation-targets/-/helper-compilation-targets-7.20.7.tgz",
+ "integrity": "sha512-4tGORmfQcrc+bvrjb5y3dG9Mx1IOZjsHqQVUz7XCNHO+iTmqxWnVg3KRygjGmpRLJGdQSKuvFinbIb0CnZwHAQ==",
+ "dependencies": {
+ "@babel/compat-data": "^7.20.5",
+ "@babel/helper-validator-option": "^7.18.6",
+ "browserslist": "^4.21.3",
+ "lru-cache": "^5.1.1",
+ "semver": "^6.3.0"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0"
+ }
+ },
+ "node_modules/@babel/helper-compilation-targets/node_modules/semver": {
+ "version": "6.3.1",
+ "resolved": "https://registry.npmjs.org/semver/-/semver-6.3.1.tgz",
+ "integrity": "sha512-BR7VvDCVHO+q2xBEWskxS6DJE1qRnb7DxzUrogb71CWoSficBxYsiAGd+Kl0mmq/MprG9yArRkyrQxTO6XjMzA==",
+ "bin": {
+ "semver": "bin/semver.js"
+ }
+ },
+ "node_modules/@babel/helper-create-class-features-plugin": {
+ "version": "7.21.0",
+ "resolved": "https://registry.npmjs.org/@babel/helper-create-class-features-plugin/-/helper-create-class-features-plugin-7.21.0.tgz",
+ "integrity": "sha512-Q8wNiMIdwsv5la5SPxNYzzkPnjgC0Sy0i7jLkVOCdllu/xcVNkr3TeZzbHBJrj+XXRqzX5uCyCoV9eu6xUG7KQ==",
+ "dependencies": {
+ "@babel/helper-annotate-as-pure": "^7.18.6",
+ "@babel/helper-environment-visitor": "^7.18.9",
+ "@babel/helper-function-name": "^7.21.0",
+ "@babel/helper-member-expression-to-functions": "^7.21.0",
+ "@babel/helper-optimise-call-expression": "^7.18.6",
+ "@babel/helper-replace-supers": "^7.20.7",
+ "@babel/helper-skip-transparent-expression-wrappers": "^7.20.0",
+ "@babel/helper-split-export-declaration": "^7.18.6"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0"
+ }
+ },
+ "node_modules/@babel/helper-create-regexp-features-plugin": {
+ "version": "7.21.0",
+ "resolved": "https://registry.npmjs.org/@babel/helper-create-regexp-features-plugin/-/helper-create-regexp-features-plugin-7.21.0.tgz",
+ "integrity": "sha512-N+LaFW/auRSWdx7SHD/HiARwXQju1vXTW4fKr4u5SgBUTm51OKEjKgj+cs00ggW3kEvNqwErnlwuq7Y3xBe4eg==",
+ "dependencies": {
+ "@babel/helper-annotate-as-pure": "^7.18.6",
+ "regexpu-core": "^5.3.1"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0"
+ }
+ },
+ "node_modules/@babel/helper-define-polyfill-provider": {
+ "version": "0.3.3",
+ "resolved": "https://registry.npmjs.org/@babel/helper-define-polyfill-provider/-/helper-define-polyfill-provider-0.3.3.tgz",
+ "integrity": "sha512-z5aQKU4IzbqCC1XH0nAqfsFLMVSo22SBKUc0BxGrLkolTdPTructy0ToNnlO2zA4j9Q/7pjMZf0DSY+DSTYzww==",
+ "dependencies": {
+ "@babel/helper-compilation-targets": "^7.17.7",
+ "@babel/helper-plugin-utils": "^7.16.7",
+ "debug": "^4.1.1",
+ "lodash.debounce": "^4.0.8",
+ "resolve": "^1.14.2",
+ "semver": "^6.1.2"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.4.0-0"
+ }
+ },
+ "node_modules/@babel/helper-define-polyfill-provider/node_modules/semver": {
+ "version": "6.3.1",
+ "resolved": "https://registry.npmjs.org/semver/-/semver-6.3.1.tgz",
+ "integrity": "sha512-BR7VvDCVHO+q2xBEWskxS6DJE1qRnb7DxzUrogb71CWoSficBxYsiAGd+Kl0mmq/MprG9yArRkyrQxTO6XjMzA==",
+ "bin": {
+ "semver": "bin/semver.js"
+ }
+ },
+ "node_modules/@babel/helper-environment-visitor": {
+ "version": "7.22.20",
+ "resolved": "https://registry.npmjs.org/@babel/helper-environment-visitor/-/helper-environment-visitor-7.22.20.tgz",
+ "integrity": "sha512-zfedSIzFhat/gFhWfHtgWvlec0nqB9YEIVrpuwjruLlXfUSnA8cJB0miHKwqDnQ7d32aKo2xt88/xZptwxbfhA==",
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/helper-explode-assignable-expression": {
+ "version": "7.18.6",
+ "resolved": "https://registry.npmjs.org/@babel/helper-explode-assignable-expression/-/helper-explode-assignable-expression-7.18.6.tgz",
+ "integrity": "sha512-eyAYAsQmB80jNfg4baAtLeWAQHfHFiR483rzFK+BhETlGZaQC9bsfrugfXDCbRHLQbIA7U5NxhhOxN7p/dWIcg==",
+ "dependencies": {
+ "@babel/types": "^7.18.6"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/helper-function-name": {
+ "version": "7.23.0",
+ "resolved": "https://registry.npmjs.org/@babel/helper-function-name/-/helper-function-name-7.23.0.tgz",
+ "integrity": "sha512-OErEqsrxjZTJciZ4Oo+eoZqeW9UIiOcuYKRJA4ZAgV9myA+pOXhhmpfNCKjEH/auVfEYVFJ6y1Tc4r0eIApqiw==",
+ "dependencies": {
+ "@babel/template": "^7.22.15",
+ "@babel/types": "^7.23.0"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/helper-hoist-variables": {
+ "version": "7.22.5",
+ "resolved": "https://registry.npmjs.org/@babel/helper-hoist-variables/-/helper-hoist-variables-7.22.5.tgz",
+ "integrity": "sha512-wGjk9QZVzvknA6yKIUURb8zY3grXCcOZt+/7Wcy8O2uctxhplmUPkOdlgoNhmdVee2c92JXbf1xpMtVNbfoxRw==",
+ "dependencies": {
+ "@babel/types": "^7.22.5"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/helper-member-expression-to-functions": {
+ "version": "7.21.0",
+ "resolved": "https://registry.npmjs.org/@babel/helper-member-expression-to-functions/-/helper-member-expression-to-functions-7.21.0.tgz",
+ "integrity": "sha512-Muu8cdZwNN6mRRNG6lAYErJ5X3bRevgYR2O8wN0yn7jJSnGDu6eG59RfT29JHxGUovyfrh6Pj0XzmR7drNVL3Q==",
+ "dependencies": {
+ "@babel/types": "^7.21.0"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/helper-module-imports": {
+ "version": "7.18.6",
+ "resolved": "https://registry.npmjs.org/@babel/helper-module-imports/-/helper-module-imports-7.18.6.tgz",
+ "integrity": "sha512-0NFvs3VkuSYbFi1x2Vd6tKrywq+z/cLeYC/RJNFrIX/30Bf5aiGYbtvGXolEktzJH8o5E5KJ3tT+nkxuuZFVlA==",
+ "dependencies": {
+ "@babel/types": "^7.18.6"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/helper-module-transforms": {
+ "version": "7.21.2",
+ "resolved": "https://registry.npmjs.org/@babel/helper-module-transforms/-/helper-module-transforms-7.21.2.tgz",
+ "integrity": "sha512-79yj2AR4U/Oqq/WOV7Lx6hUjau1Zfo4cI+JLAVYeMV5XIlbOhmjEk5ulbTc9fMpmlojzZHkUUxAiK+UKn+hNQQ==",
+ "dependencies": {
+ "@babel/helper-environment-visitor": "^7.18.9",
+ "@babel/helper-module-imports": "^7.18.6",
+ "@babel/helper-simple-access": "^7.20.2",
+ "@babel/helper-split-export-declaration": "^7.18.6",
+ "@babel/helper-validator-identifier": "^7.19.1",
+ "@babel/template": "^7.20.7",
+ "@babel/traverse": "^7.21.2",
+ "@babel/types": "^7.21.2"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/helper-optimise-call-expression": {
+ "version": "7.18.6",
+ "resolved": "https://registry.npmjs.org/@babel/helper-optimise-call-expression/-/helper-optimise-call-expression-7.18.6.tgz",
+ "integrity": "sha512-HP59oD9/fEHQkdcbgFCnbmgH5vIQTJbxh2yf+CdM89/glUNnuzr87Q8GIjGEnOktTROemO0Pe0iPAYbqZuOUiA==",
+ "dependencies": {
+ "@babel/types": "^7.18.6"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/helper-plugin-utils": {
+ "version": "7.20.2",
+ "resolved": "https://registry.npmjs.org/@babel/helper-plugin-utils/-/helper-plugin-utils-7.20.2.tgz",
+ "integrity": "sha512-8RvlJG2mj4huQ4pZ+rU9lqKi9ZKiRmuvGuM2HlWmkmgOhbs6zEAw6IEiJ5cQqGbDzGZOhwuOQNtZMi/ENLjZoQ==",
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/helper-remap-async-to-generator": {
+ "version": "7.18.9",
+ "resolved": "https://registry.npmjs.org/@babel/helper-remap-async-to-generator/-/helper-remap-async-to-generator-7.18.9.tgz",
+ "integrity": "sha512-dI7q50YKd8BAv3VEfgg7PS7yD3Rtbi2J1XMXaalXO0W0164hYLnh8zpjRS0mte9MfVp/tltvr/cfdXPvJr1opA==",
+ "dependencies": {
+ "@babel/helper-annotate-as-pure": "^7.18.6",
+ "@babel/helper-environment-visitor": "^7.18.9",
+ "@babel/helper-wrap-function": "^7.18.9",
+ "@babel/types": "^7.18.9"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0"
+ }
+ },
+ "node_modules/@babel/helper-replace-supers": {
+ "version": "7.20.7",
+ "resolved": "https://registry.npmjs.org/@babel/helper-replace-supers/-/helper-replace-supers-7.20.7.tgz",
+ "integrity": "sha512-vujDMtB6LVfNW13jhlCrp48QNslK6JXi7lQG736HVbHz/mbf4Dc7tIRh1Xf5C0rF7BP8iiSxGMCmY6Ci1ven3A==",
+ "dependencies": {
+ "@babel/helper-environment-visitor": "^7.18.9",
+ "@babel/helper-member-expression-to-functions": "^7.20.7",
+ "@babel/helper-optimise-call-expression": "^7.18.6",
+ "@babel/template": "^7.20.7",
+ "@babel/traverse": "^7.20.7",
+ "@babel/types": "^7.20.7"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/helper-simple-access": {
+ "version": "7.20.2",
+ "resolved": "https://registry.npmjs.org/@babel/helper-simple-access/-/helper-simple-access-7.20.2.tgz",
+ "integrity": "sha512-+0woI/WPq59IrqDYbVGfshjT5Dmk/nnbdpcF8SnMhhXObpTq2KNBdLFRFrkVdbDOyUmHBCxzm5FHV1rACIkIbA==",
+ "dependencies": {
+ "@babel/types": "^7.20.2"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/helper-skip-transparent-expression-wrappers": {
+ "version": "7.20.0",
+ "resolved": "https://registry.npmjs.org/@babel/helper-skip-transparent-expression-wrappers/-/helper-skip-transparent-expression-wrappers-7.20.0.tgz",
+ "integrity": "sha512-5y1JYeNKfvnT8sZcK9DVRtpTbGiomYIHviSP3OQWmDPU3DeH4a1ZlT/N2lyQ5P8egjcRaT/Y9aNqUxK0WsnIIg==",
+ "dependencies": {
+ "@babel/types": "^7.20.0"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/helper-split-export-declaration": {
+ "version": "7.22.6",
+ "resolved": "https://registry.npmjs.org/@babel/helper-split-export-declaration/-/helper-split-export-declaration-7.22.6.tgz",
+ "integrity": "sha512-AsUnxuLhRYsisFiaJwvp1QF+I3KjD5FOxut14q/GzovUe6orHLesW2C7d754kRm53h5gqrz6sFl6sxc4BVtE/g==",
+ "dependencies": {
+ "@babel/types": "^7.22.5"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/helper-string-parser": {
+ "version": "7.25.9",
+ "resolved": "https://registry.npmjs.org/@babel/helper-string-parser/-/helper-string-parser-7.25.9.tgz",
+ "integrity": "sha512-4A/SCr/2KLd5jrtOMFzaKjVtAei3+2r/NChoBNoZ3EyP/+GlhoaEGoWOZUmFmoITP7zOJyHIMm+DYRd8o3PvHA==",
+ "license": "MIT",
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/helper-validator-identifier": {
+ "version": "7.25.9",
+ "resolved": "https://registry.npmjs.org/@babel/helper-validator-identifier/-/helper-validator-identifier-7.25.9.tgz",
+ "integrity": "sha512-Ed61U6XJc3CVRfkERJWDz4dJwKe7iLmmJsbOGu9wSloNSFttHV0I8g6UAgb7qnK5ly5bGLPd4oXZlxCdANBOWQ==",
+ "license": "MIT",
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/helper-validator-option": {
+ "version": "7.21.0",
+ "resolved": "https://registry.npmjs.org/@babel/helper-validator-option/-/helper-validator-option-7.21.0.tgz",
+ "integrity": "sha512-rmL/B8/f0mKS2baE9ZpyTcTavvEuWhTTW8amjzXNvYG4AwBsqTLikfXsEofsJEfKHf+HQVQbFOHy6o+4cnC/fQ==",
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/helper-wrap-function": {
+ "version": "7.20.5",
+ "resolved": "https://registry.npmjs.org/@babel/helper-wrap-function/-/helper-wrap-function-7.20.5.tgz",
+ "integrity": "sha512-bYMxIWK5mh+TgXGVqAtnu5Yn1un+v8DDZtqyzKRLUzrh70Eal2O3aZ7aPYiMADO4uKlkzOiRiZ6GX5q3qxvW9Q==",
+ "dependencies": {
+ "@babel/helper-function-name": "^7.19.0",
+ "@babel/template": "^7.18.10",
+ "@babel/traverse": "^7.20.5",
+ "@babel/types": "^7.20.5"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/helpers": {
+ "version": "7.26.10",
+ "resolved": "https://registry.npmjs.org/@babel/helpers/-/helpers-7.26.10.tgz",
+ "integrity": "sha512-UPYc3SauzZ3JGgj87GgZ89JVdC5dj0AoetR5Bw6wj4niittNyFh6+eOGonYvJ1ao6B8lEa3Q3klS7ADZ53bc5g==",
+ "license": "MIT",
+ "dependencies": {
+ "@babel/template": "^7.26.9",
+ "@babel/types": "^7.26.10"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/parser": {
+ "version": "7.26.10",
+ "resolved": "https://registry.npmjs.org/@babel/parser/-/parser-7.26.10.tgz",
+ "integrity": "sha512-6aQR2zGE/QFi8JpDLjUZEPYOs7+mhKXm86VaKFiLP35JQwQb6bwUE+XbvkH0EptsYhbNBSUGaUBLKqxH1xSgsA==",
+ "license": "MIT",
+ "dependencies": {
+ "@babel/types": "^7.26.10"
+ },
+ "bin": {
+ "parser": "bin/babel-parser.js"
+ },
+ "engines": {
+ "node": ">=6.0.0"
+ }
+ },
+ "node_modules/@babel/plugin-bugfix-safari-id-destructuring-collision-in-function-expression": {
+ "version": "7.18.6",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-bugfix-safari-id-destructuring-collision-in-function-expression/-/plugin-bugfix-safari-id-destructuring-collision-in-function-expression-7.18.6.tgz",
+ "integrity": "sha512-Dgxsyg54Fx1d4Nge8UnvTrED63vrwOdPmyvPzlNN/boaliRP54pm3pGzZD1SJUwrBA+Cs/xdG8kXX6Mn/RfISQ==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.18.6"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0"
+ }
+ },
+ "node_modules/@babel/plugin-bugfix-v8-spread-parameters-in-optional-chaining": {
+ "version": "7.20.7",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-bugfix-v8-spread-parameters-in-optional-chaining/-/plugin-bugfix-v8-spread-parameters-in-optional-chaining-7.20.7.tgz",
+ "integrity": "sha512-sbr9+wNE5aXMBBFBICk01tt7sBf2Oc9ikRFEcem/ZORup9IMUdNhW7/wVLEbbtlWOsEubJet46mHAL2C8+2jKQ==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.20.2",
+ "@babel/helper-skip-transparent-expression-wrappers": "^7.20.0",
+ "@babel/plugin-proposal-optional-chaining": "^7.20.7"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.13.0"
+ }
+ },
+ "node_modules/@babel/plugin-proposal-async-generator-functions": {
+ "version": "7.20.7",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-proposal-async-generator-functions/-/plugin-proposal-async-generator-functions-7.20.7.tgz",
+ "integrity": "sha512-xMbiLsn/8RK7Wq7VeVytytS2L6qE69bXPB10YCmMdDZbKF4okCqY74pI/jJQ/8U0b/F6NrT2+14b8/P9/3AMGA==",
+ "dependencies": {
+ "@babel/helper-environment-visitor": "^7.18.9",
+ "@babel/helper-plugin-utils": "^7.20.2",
+ "@babel/helper-remap-async-to-generator": "^7.18.9",
+ "@babel/plugin-syntax-async-generators": "^7.8.4"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-proposal-class-properties": {
+ "version": "7.18.6",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-proposal-class-properties/-/plugin-proposal-class-properties-7.18.6.tgz",
+ "integrity": "sha512-cumfXOF0+nzZrrN8Rf0t7M+tF6sZc7vhQwYQck9q1/5w2OExlD+b4v4RpMJFaV1Z7WcDRgO6FqvxqxGlwo+RHQ==",
+ "dependencies": {
+ "@babel/helper-create-class-features-plugin": "^7.18.6",
+ "@babel/helper-plugin-utils": "^7.18.6"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-proposal-class-static-block": {
+ "version": "7.21.0",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-proposal-class-static-block/-/plugin-proposal-class-static-block-7.21.0.tgz",
+ "integrity": "sha512-XP5G9MWNUskFuP30IfFSEFB0Z6HzLIUcjYM4bYOPHXl7eiJ9HFv8tWj6TXTN5QODiEhDZAeI4hLok2iHFFV4hw==",
+ "dependencies": {
+ "@babel/helper-create-class-features-plugin": "^7.21.0",
+ "@babel/helper-plugin-utils": "^7.20.2",
+ "@babel/plugin-syntax-class-static-block": "^7.14.5"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.12.0"
+ }
+ },
+ "node_modules/@babel/plugin-proposal-dynamic-import": {
+ "version": "7.18.6",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-proposal-dynamic-import/-/plugin-proposal-dynamic-import-7.18.6.tgz",
+ "integrity": "sha512-1auuwmK+Rz13SJj36R+jqFPMJWyKEDd7lLSdOj4oJK0UTgGueSAtkrCvz9ewmgyU/P941Rv2fQwZJN8s6QruXw==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.18.6",
+ "@babel/plugin-syntax-dynamic-import": "^7.8.3"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-proposal-export-namespace-from": {
+ "version": "7.18.9",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-proposal-export-namespace-from/-/plugin-proposal-export-namespace-from-7.18.9.tgz",
+ "integrity": "sha512-k1NtHyOMvlDDFeb9G5PhUXuGj8m/wiwojgQVEhJ/fsVsMCpLyOP4h0uGEjYJKrRI+EVPlb5Jk+Gt9P97lOGwtA==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.18.9",
+ "@babel/plugin-syntax-export-namespace-from": "^7.8.3"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-proposal-json-strings": {
+ "version": "7.18.6",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-proposal-json-strings/-/plugin-proposal-json-strings-7.18.6.tgz",
+ "integrity": "sha512-lr1peyn9kOdbYc0xr0OdHTZ5FMqS6Di+H0Fz2I/JwMzGmzJETNeOFq2pBySw6X/KFL5EWDjlJuMsUGRFb8fQgQ==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.18.6",
+ "@babel/plugin-syntax-json-strings": "^7.8.3"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-proposal-logical-assignment-operators": {
+ "version": "7.20.7",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-proposal-logical-assignment-operators/-/plugin-proposal-logical-assignment-operators-7.20.7.tgz",
+ "integrity": "sha512-y7C7cZgpMIjWlKE5T7eJwp+tnRYM89HmRvWM5EQuB5BoHEONjmQ8lSNmBUwOyy/GFRsohJED51YBF79hE1djug==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.20.2",
+ "@babel/plugin-syntax-logical-assignment-operators": "^7.10.4"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-proposal-nullish-coalescing-operator": {
+ "version": "7.18.6",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-proposal-nullish-coalescing-operator/-/plugin-proposal-nullish-coalescing-operator-7.18.6.tgz",
+ "integrity": "sha512-wQxQzxYeJqHcfppzBDnm1yAY0jSRkUXR2z8RePZYrKwMKgMlE8+Z6LUno+bd6LvbGh8Gltvy74+9pIYkr+XkKA==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.18.6",
+ "@babel/plugin-syntax-nullish-coalescing-operator": "^7.8.3"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-proposal-numeric-separator": {
+ "version": "7.18.6",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-proposal-numeric-separator/-/plugin-proposal-numeric-separator-7.18.6.tgz",
+ "integrity": "sha512-ozlZFogPqoLm8WBr5Z8UckIoE4YQ5KESVcNudyXOR8uqIkliTEgJ3RoketfG6pmzLdeZF0H/wjE9/cCEitBl7Q==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.18.6",
+ "@babel/plugin-syntax-numeric-separator": "^7.10.4"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-proposal-object-rest-spread": {
+ "version": "7.20.7",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-proposal-object-rest-spread/-/plugin-proposal-object-rest-spread-7.20.7.tgz",
+ "integrity": "sha512-d2S98yCiLxDVmBmE8UjGcfPvNEUbA1U5q5WxaWFUGRzJSVAZqm5W6MbPct0jxnegUZ0niLeNX+IOzEs7wYg9Dg==",
+ "dependencies": {
+ "@babel/compat-data": "^7.20.5",
+ "@babel/helper-compilation-targets": "^7.20.7",
+ "@babel/helper-plugin-utils": "^7.20.2",
+ "@babel/plugin-syntax-object-rest-spread": "^7.8.3",
+ "@babel/plugin-transform-parameters": "^7.20.7"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-proposal-optional-catch-binding": {
+ "version": "7.18.6",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-proposal-optional-catch-binding/-/plugin-proposal-optional-catch-binding-7.18.6.tgz",
+ "integrity": "sha512-Q40HEhs9DJQyaZfUjjn6vE8Cv4GmMHCYuMGIWUnlxH6400VGxOuwWsPt4FxXxJkC/5eOzgn0z21M9gMT4MOhbw==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.18.6",
+ "@babel/plugin-syntax-optional-catch-binding": "^7.8.3"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-proposal-optional-chaining": {
+ "version": "7.21.0",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-proposal-optional-chaining/-/plugin-proposal-optional-chaining-7.21.0.tgz",
+ "integrity": "sha512-p4zeefM72gpmEe2fkUr/OnOXpWEf8nAgk7ZYVqqfFiyIG7oFfVZcCrU64hWn5xp4tQ9LkV4bTIa5rD0KANpKNA==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.20.2",
+ "@babel/helper-skip-transparent-expression-wrappers": "^7.20.0",
+ "@babel/plugin-syntax-optional-chaining": "^7.8.3"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-proposal-private-methods": {
+ "version": "7.18.6",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-proposal-private-methods/-/plugin-proposal-private-methods-7.18.6.tgz",
+ "integrity": "sha512-nutsvktDItsNn4rpGItSNV2sz1XwS+nfU0Rg8aCx3W3NOKVzdMjJRu0O5OkgDp3ZGICSTbgRpxZoWsxoKRvbeA==",
+ "dependencies": {
+ "@babel/helper-create-class-features-plugin": "^7.18.6",
+ "@babel/helper-plugin-utils": "^7.18.6"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-proposal-private-property-in-object": {
+ "version": "7.21.0",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-proposal-private-property-in-object/-/plugin-proposal-private-property-in-object-7.21.0.tgz",
+ "integrity": "sha512-ha4zfehbJjc5MmXBlHec1igel5TJXXLDDRbuJ4+XT2TJcyD9/V1919BA8gMvsdHcNMBy4WBUBiRb3nw/EQUtBw==",
+ "dependencies": {
+ "@babel/helper-annotate-as-pure": "^7.18.6",
+ "@babel/helper-create-class-features-plugin": "^7.21.0",
+ "@babel/helper-plugin-utils": "^7.20.2",
+ "@babel/plugin-syntax-private-property-in-object": "^7.14.5"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-proposal-unicode-property-regex": {
+ "version": "7.18.6",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-proposal-unicode-property-regex/-/plugin-proposal-unicode-property-regex-7.18.6.tgz",
+ "integrity": "sha512-2BShG/d5yoZyXZfVePH91urL5wTG6ASZU9M4o03lKK8u8UW1y08OMttBSOADTcJrnPMpvDXRG3G8fyLh4ovs8w==",
+ "dependencies": {
+ "@babel/helper-create-regexp-features-plugin": "^7.18.6",
+ "@babel/helper-plugin-utils": "^7.18.6"
+ },
+ "engines": {
+ "node": ">=4"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-syntax-async-generators": {
+ "version": "7.8.4",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-syntax-async-generators/-/plugin-syntax-async-generators-7.8.4.tgz",
+ "integrity": "sha512-tycmZxkGfZaxhMRbXlPXuVFpdWlXpir2W4AMhSJgRKzk/eDlIXOhb2LHWoLpDF7TEHylV5zNhykX6KAgHJmTNw==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.8.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-syntax-class-properties": {
+ "version": "7.12.13",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-syntax-class-properties/-/plugin-syntax-class-properties-7.12.13.tgz",
+ "integrity": "sha512-fm4idjKla0YahUNgFNLCB0qySdsoPiZP3iQE3rky0mBUtMZ23yDJ9SJdg6dXTSDnulOVqiF3Hgr9nbXvXTQZYA==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.12.13"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-syntax-class-static-block": {
+ "version": "7.14.5",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-syntax-class-static-block/-/plugin-syntax-class-static-block-7.14.5.tgz",
+ "integrity": "sha512-b+YyPmr6ldyNnM6sqYeMWE+bgJcJpO6yS4QD7ymxgH34GBPNDM/THBh8iunyvKIZztiwLH4CJZ0RxTk9emgpjw==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.14.5"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-syntax-dynamic-import": {
+ "version": "7.8.3",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-syntax-dynamic-import/-/plugin-syntax-dynamic-import-7.8.3.tgz",
+ "integrity": "sha512-5gdGbFon+PszYzqs83S3E5mpi7/y/8M9eC90MRTZfduQOYW76ig6SOSPNe41IG5LoP3FGBn2N0RjVDSQiS94kQ==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.8.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-syntax-export-namespace-from": {
+ "version": "7.8.3",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-syntax-export-namespace-from/-/plugin-syntax-export-namespace-from-7.8.3.tgz",
+ "integrity": "sha512-MXf5laXo6c1IbEbegDmzGPwGNTsHZmEy6QGznu5Sh2UCWvueywb2ee+CCE4zQiZstxU9BMoQO9i6zUFSY0Kj0Q==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.8.3"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-syntax-import-assertions": {
+ "version": "7.20.0",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-syntax-import-assertions/-/plugin-syntax-import-assertions-7.20.0.tgz",
+ "integrity": "sha512-IUh1vakzNoWalR8ch/areW7qFopR2AEw03JlG7BbrDqmQ4X3q9uuipQwSGrUn7oGiemKjtSLDhNtQHzMHr1JdQ==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.19.0"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-syntax-json-strings": {
+ "version": "7.8.3",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-syntax-json-strings/-/plugin-syntax-json-strings-7.8.3.tgz",
+ "integrity": "sha512-lY6kdGpWHvjoe2vk4WrAapEuBR69EMxZl+RoGRhrFGNYVK8mOPAW8VfbT/ZgrFbXlDNiiaxQnAtgVCZ6jv30EA==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.8.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-syntax-jsx": {
+ "version": "7.18.6",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-syntax-jsx/-/plugin-syntax-jsx-7.18.6.tgz",
+ "integrity": "sha512-6mmljtAedFGTWu2p/8WIORGwy+61PLgOMPOdazc7YoJ9ZCWUyFy3A6CpPkRKLKD1ToAesxX8KGEViAiLo9N+7Q==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.18.6"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-syntax-logical-assignment-operators": {
+ "version": "7.10.4",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-syntax-logical-assignment-operators/-/plugin-syntax-logical-assignment-operators-7.10.4.tgz",
+ "integrity": "sha512-d8waShlpFDinQ5MtvGU9xDAOzKH47+FFoney2baFIoMr952hKOLp1HR7VszoZvOsV/4+RRszNY7D17ba0te0ig==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.10.4"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-syntax-nullish-coalescing-operator": {
+ "version": "7.8.3",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-syntax-nullish-coalescing-operator/-/plugin-syntax-nullish-coalescing-operator-7.8.3.tgz",
+ "integrity": "sha512-aSff4zPII1u2QD7y+F8oDsz19ew4IGEJg9SVW+bqwpwtfFleiQDMdzA/R+UlWDzfnHFCxxleFT0PMIrR36XLNQ==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.8.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-syntax-numeric-separator": {
+ "version": "7.10.4",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-syntax-numeric-separator/-/plugin-syntax-numeric-separator-7.10.4.tgz",
+ "integrity": "sha512-9H6YdfkcK/uOnY/K7/aA2xpzaAgkQn37yzWUMRK7OaPOqOpGS1+n0H5hxT9AUw9EsSjPW8SVyMJwYRtWs3X3ug==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.10.4"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-syntax-object-rest-spread": {
+ "version": "7.8.3",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-syntax-object-rest-spread/-/plugin-syntax-object-rest-spread-7.8.3.tgz",
+ "integrity": "sha512-XoqMijGZb9y3y2XskN+P1wUGiVwWZ5JmoDRwx5+3GmEplNyVM2s2Dg8ILFQm8rWM48orGy5YpI5Bl8U1y7ydlA==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.8.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-syntax-optional-catch-binding": {
+ "version": "7.8.3",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-syntax-optional-catch-binding/-/plugin-syntax-optional-catch-binding-7.8.3.tgz",
+ "integrity": "sha512-6VPD0Pc1lpTqw0aKoeRTMiB+kWhAoT24PA+ksWSBrFtl5SIRVpZlwN3NNPQjehA2E/91FV3RjLWoVTglWcSV3Q==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.8.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-syntax-optional-chaining": {
+ "version": "7.8.3",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-syntax-optional-chaining/-/plugin-syntax-optional-chaining-7.8.3.tgz",
+ "integrity": "sha512-KoK9ErH1MBlCPxV0VANkXW2/dw4vlbGDrFgz8bmUsBGYkFRcbRwMh6cIJubdPrkxRwuGdtCk0v/wPTKbQgBjkg==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.8.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-syntax-private-property-in-object": {
+ "version": "7.14.5",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-syntax-private-property-in-object/-/plugin-syntax-private-property-in-object-7.14.5.tgz",
+ "integrity": "sha512-0wVnp9dxJ72ZUJDV27ZfbSj6iHLoytYZmh3rFcxNnvsJF3ktkzLDZPy/mA17HGsaQT3/DQsWYX1f1QGWkCoVUg==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.14.5"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-syntax-top-level-await": {
+ "version": "7.14.5",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-syntax-top-level-await/-/plugin-syntax-top-level-await-7.14.5.tgz",
+ "integrity": "sha512-hx++upLv5U1rgYfwe1xBQUhRmU41NEvpUvrp8jkrSCdvGSnM5/qdRMtylJ6PG5OFkBaHkbTAKTnd3/YyESRHFw==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.14.5"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-syntax-typescript": {
+ "version": "7.20.0",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-syntax-typescript/-/plugin-syntax-typescript-7.20.0.tgz",
+ "integrity": "sha512-rd9TkG+u1CExzS4SM1BlMEhMXwFLKVjOAFFCDx9PbX5ycJWDoWMcwdJH9RhkPu1dOgn5TrxLot/Gx6lWFuAUNQ==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.19.0"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-arrow-functions": {
+ "version": "7.20.7",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-arrow-functions/-/plugin-transform-arrow-functions-7.20.7.tgz",
+ "integrity": "sha512-3poA5E7dzDomxj9WXWwuD6A5F3kc7VXwIJO+E+J8qtDtS+pXPAhrgEyh+9GBwBgPq1Z+bB+/JD60lp5jsN7JPQ==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.20.2"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-async-to-generator": {
+ "version": "7.20.7",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-async-to-generator/-/plugin-transform-async-to-generator-7.20.7.tgz",
+ "integrity": "sha512-Uo5gwHPT9vgnSXQxqGtpdufUiWp96gk7yiP4Mp5bm1QMkEmLXBO7PAGYbKoJ6DhAwiNkcHFBol/x5zZZkL/t0Q==",
+ "dependencies": {
+ "@babel/helper-module-imports": "^7.18.6",
+ "@babel/helper-plugin-utils": "^7.20.2",
+ "@babel/helper-remap-async-to-generator": "^7.18.9"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-block-scoped-functions": {
+ "version": "7.18.6",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-block-scoped-functions/-/plugin-transform-block-scoped-functions-7.18.6.tgz",
+ "integrity": "sha512-ExUcOqpPWnliRcPqves5HJcJOvHvIIWfuS4sroBUenPuMdmW+SMHDakmtS7qOo13sVppmUijqeTv7qqGsvURpQ==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.18.6"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-block-scoping": {
+ "version": "7.21.0",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-block-scoping/-/plugin-transform-block-scoping-7.21.0.tgz",
+ "integrity": "sha512-Mdrbunoh9SxwFZapeHVrwFmri16+oYotcZysSzhNIVDwIAb1UV+kvnxULSYq9J3/q5MDG+4X6w8QVgD1zhBXNQ==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.20.2"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-classes": {
+ "version": "7.21.0",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-classes/-/plugin-transform-classes-7.21.0.tgz",
+ "integrity": "sha512-RZhbYTCEUAe6ntPehC4hlslPWosNHDox+vAs4On/mCLRLfoDVHf6hVEd7kuxr1RnHwJmxFfUM3cZiZRmPxJPXQ==",
+ "dependencies": {
+ "@babel/helper-annotate-as-pure": "^7.18.6",
+ "@babel/helper-compilation-targets": "^7.20.7",
+ "@babel/helper-environment-visitor": "^7.18.9",
+ "@babel/helper-function-name": "^7.21.0",
+ "@babel/helper-optimise-call-expression": "^7.18.6",
+ "@babel/helper-plugin-utils": "^7.20.2",
+ "@babel/helper-replace-supers": "^7.20.7",
+ "@babel/helper-split-export-declaration": "^7.18.6",
+ "globals": "^11.1.0"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-computed-properties": {
+ "version": "7.20.7",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-computed-properties/-/plugin-transform-computed-properties-7.20.7.tgz",
+ "integrity": "sha512-Lz7MvBK6DTjElHAmfu6bfANzKcxpyNPeYBGEafyA6E5HtRpjpZwU+u7Qrgz/2OR0z+5TvKYbPdphfSaAcZBrYQ==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.20.2",
+ "@babel/template": "^7.20.7"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-destructuring": {
+ "version": "7.21.3",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-destructuring/-/plugin-transform-destructuring-7.21.3.tgz",
+ "integrity": "sha512-bp6hwMFzuiE4HqYEyoGJ/V2LeIWn+hLVKc4pnj++E5XQptwhtcGmSayM029d/j2X1bPKGTlsyPwAubuU22KhMA==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.20.2"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-dotall-regex": {
+ "version": "7.18.6",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-dotall-regex/-/plugin-transform-dotall-regex-7.18.6.tgz",
+ "integrity": "sha512-6S3jpun1eEbAxq7TdjLotAsl4WpQI9DxfkycRcKrjhQYzU87qpXdknpBg/e+TdcMehqGnLFi7tnFUBR02Vq6wg==",
+ "dependencies": {
+ "@babel/helper-create-regexp-features-plugin": "^7.18.6",
+ "@babel/helper-plugin-utils": "^7.18.6"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-duplicate-keys": {
+ "version": "7.18.9",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-duplicate-keys/-/plugin-transform-duplicate-keys-7.18.9.tgz",
+ "integrity": "sha512-d2bmXCtZXYc59/0SanQKbiWINadaJXqtvIQIzd4+hNwkWBgyCd5F/2t1kXoUdvPMrxzPvhK6EMQRROxsue+mfw==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.18.9"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-exponentiation-operator": {
+ "version": "7.18.6",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-exponentiation-operator/-/plugin-transform-exponentiation-operator-7.18.6.tgz",
+ "integrity": "sha512-wzEtc0+2c88FVR34aQmiz56dxEkxr2g8DQb/KfaFa1JYXOFVsbhvAonFN6PwVWj++fKmku8NP80plJ5Et4wqHw==",
+ "dependencies": {
+ "@babel/helper-builder-binary-assignment-operator-visitor": "^7.18.6",
+ "@babel/helper-plugin-utils": "^7.18.6"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-for-of": {
+ "version": "7.21.0",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-for-of/-/plugin-transform-for-of-7.21.0.tgz",
+ "integrity": "sha512-LlUYlydgDkKpIY7mcBWvyPPmMcOphEyYA27Ef4xpbh1IiDNLr0kZsos2nf92vz3IccvJI25QUwp86Eo5s6HmBQ==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.20.2"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-function-name": {
+ "version": "7.18.9",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-function-name/-/plugin-transform-function-name-7.18.9.tgz",
+ "integrity": "sha512-WvIBoRPaJQ5yVHzcnJFor7oS5Ls0PYixlTYE63lCj2RtdQEl15M68FXQlxnG6wdraJIXRdR7KI+hQ7q/9QjrCQ==",
+ "dependencies": {
+ "@babel/helper-compilation-targets": "^7.18.9",
+ "@babel/helper-function-name": "^7.18.9",
+ "@babel/helper-plugin-utils": "^7.18.9"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-literals": {
+ "version": "7.18.9",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-literals/-/plugin-transform-literals-7.18.9.tgz",
+ "integrity": "sha512-IFQDSRoTPnrAIrI5zoZv73IFeZu2dhu6irxQjY9rNjTT53VmKg9fenjvoiOWOkJ6mm4jKVPtdMzBY98Fp4Z4cg==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.18.9"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-member-expression-literals": {
+ "version": "7.18.6",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-member-expression-literals/-/plugin-transform-member-expression-literals-7.18.6.tgz",
+ "integrity": "sha512-qSF1ihLGO3q+/g48k85tUjD033C29TNTVB2paCwZPVmOsjn9pClvYYrM2VeJpBY2bcNkuny0YUyTNRyRxJ54KA==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.18.6"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-modules-amd": {
+ "version": "7.20.11",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-modules-amd/-/plugin-transform-modules-amd-7.20.11.tgz",
+ "integrity": "sha512-NuzCt5IIYOW0O30UvqktzHYR2ud5bOWbY0yaxWZ6G+aFzOMJvrs5YHNikrbdaT15+KNO31nPOy5Fim3ku6Zb5g==",
+ "dependencies": {
+ "@babel/helper-module-transforms": "^7.20.11",
+ "@babel/helper-plugin-utils": "^7.20.2"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-modules-commonjs": {
+ "version": "7.21.2",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-modules-commonjs/-/plugin-transform-modules-commonjs-7.21.2.tgz",
+ "integrity": "sha512-Cln+Yy04Gxua7iPdj6nOV96smLGjpElir5YwzF0LBPKoPlLDNJePNlrGGaybAJkd0zKRnOVXOgizSqPYMNYkzA==",
+ "dependencies": {
+ "@babel/helper-module-transforms": "^7.21.2",
+ "@babel/helper-plugin-utils": "^7.20.2",
+ "@babel/helper-simple-access": "^7.20.2"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-modules-systemjs": {
+ "version": "7.20.11",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-modules-systemjs/-/plugin-transform-modules-systemjs-7.20.11.tgz",
+ "integrity": "sha512-vVu5g9BPQKSFEmvt2TA4Da5N+QVS66EX21d8uoOihC+OCpUoGvzVsXeqFdtAEfVa5BILAeFt+U7yVmLbQnAJmw==",
+ "dependencies": {
+ "@babel/helper-hoist-variables": "^7.18.6",
+ "@babel/helper-module-transforms": "^7.20.11",
+ "@babel/helper-plugin-utils": "^7.20.2",
+ "@babel/helper-validator-identifier": "^7.19.1"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-modules-umd": {
+ "version": "7.18.6",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-modules-umd/-/plugin-transform-modules-umd-7.18.6.tgz",
+ "integrity": "sha512-dcegErExVeXcRqNtkRU/z8WlBLnvD4MRnHgNs3MytRO1Mn1sHRyhbcpYbVMGclAqOjdW+9cfkdZno9dFdfKLfQ==",
+ "dependencies": {
+ "@babel/helper-module-transforms": "^7.18.6",
+ "@babel/helper-plugin-utils": "^7.18.6"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-named-capturing-groups-regex": {
+ "version": "7.20.5",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-named-capturing-groups-regex/-/plugin-transform-named-capturing-groups-regex-7.20.5.tgz",
+ "integrity": "sha512-mOW4tTzi5iTLnw+78iEq3gr8Aoq4WNRGpmSlrogqaiCBoR1HFhpU4JkpQFOHfeYx3ReVIFWOQJS4aZBRvuZ6mA==",
+ "dependencies": {
+ "@babel/helper-create-regexp-features-plugin": "^7.20.5",
+ "@babel/helper-plugin-utils": "^7.20.2"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-new-target": {
+ "version": "7.18.6",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-new-target/-/plugin-transform-new-target-7.18.6.tgz",
+ "integrity": "sha512-DjwFA/9Iu3Z+vrAn+8pBUGcjhxKguSMlsFqeCKbhb9BAV756v0krzVK04CRDi/4aqmk8BsHb4a/gFcaA5joXRw==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.18.6"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-object-super": {
+ "version": "7.18.6",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-object-super/-/plugin-transform-object-super-7.18.6.tgz",
+ "integrity": "sha512-uvGz6zk+pZoS1aTZrOvrbj6Pp/kK2mp45t2B+bTDre2UgsZZ8EZLSJtUg7m/no0zOJUWgFONpB7Zv9W2tSaFlA==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.18.6",
+ "@babel/helper-replace-supers": "^7.18.6"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-parameters": {
+ "version": "7.21.3",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-parameters/-/plugin-transform-parameters-7.21.3.tgz",
+ "integrity": "sha512-Wxc+TvppQG9xWFYatvCGPvZ6+SIUxQ2ZdiBP+PHYMIjnPXD+uThCshaz4NZOnODAtBjjcVQQ/3OKs9LW28purQ==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.20.2"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-property-literals": {
+ "version": "7.18.6",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-property-literals/-/plugin-transform-property-literals-7.18.6.tgz",
+ "integrity": "sha512-cYcs6qlgafTud3PAzrrRNbQtfpQ8+y/+M5tKmksS9+M1ckbH6kzY8MrexEM9mcA6JDsukE19iIRvAyYl463sMg==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.18.6"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-react-constant-elements": {
+ "version": "7.21.3",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-react-constant-elements/-/plugin-transform-react-constant-elements-7.21.3.tgz",
+ "integrity": "sha512-4DVcFeWe/yDYBLp0kBmOGFJ6N2UYg7coGid1gdxb4co62dy/xISDMaYBXBVXEDhfgMk7qkbcYiGtwd5Q/hwDDQ==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.20.2"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-react-display-name": {
+ "version": "7.18.6",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-react-display-name/-/plugin-transform-react-display-name-7.18.6.tgz",
+ "integrity": "sha512-TV4sQ+T013n61uMoygyMRm+xf04Bd5oqFpv2jAEQwSZ8NwQA7zeRPg1LMVg2PWi3zWBz+CLKD+v5bcpZ/BS0aA==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.18.6"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-react-jsx": {
+ "version": "7.21.0",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-react-jsx/-/plugin-transform-react-jsx-7.21.0.tgz",
+ "integrity": "sha512-6OAWljMvQrZjR2DaNhVfRz6dkCAVV+ymcLUmaf8bccGOHn2v5rHJK3tTpij0BuhdYWP4LLaqj5lwcdlpAAPuvg==",
+ "dependencies": {
+ "@babel/helper-annotate-as-pure": "^7.18.6",
+ "@babel/helper-module-imports": "^7.18.6",
+ "@babel/helper-plugin-utils": "^7.20.2",
+ "@babel/plugin-syntax-jsx": "^7.18.6",
+ "@babel/types": "^7.21.0"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-react-jsx-development": {
+ "version": "7.18.6",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-react-jsx-development/-/plugin-transform-react-jsx-development-7.18.6.tgz",
+ "integrity": "sha512-SA6HEjwYFKF7WDjWcMcMGUimmw/nhNRDWxr+KaLSCrkD/LMDBvWRmHAYgE1HDeF8KUuI8OAu+RT6EOtKxSW2qA==",
+ "dependencies": {
+ "@babel/plugin-transform-react-jsx": "^7.18.6"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-react-pure-annotations": {
+ "version": "7.18.6",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-react-pure-annotations/-/plugin-transform-react-pure-annotations-7.18.6.tgz",
+ "integrity": "sha512-I8VfEPg9r2TRDdvnHgPepTKvuRomzA8+u+nhY7qSI1fR2hRNebasZEETLyM5mAUr0Ku56OkXJ0I7NHJnO6cJiQ==",
+ "dependencies": {
+ "@babel/helper-annotate-as-pure": "^7.18.6",
+ "@babel/helper-plugin-utils": "^7.18.6"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-regenerator": {
+ "version": "7.20.5",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-regenerator/-/plugin-transform-regenerator-7.20.5.tgz",
+ "integrity": "sha512-kW/oO7HPBtntbsahzQ0qSE3tFvkFwnbozz3NWFhLGqH75vLEg+sCGngLlhVkePlCs3Jv0dBBHDzCHxNiFAQKCQ==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.20.2",
+ "regenerator-transform": "^0.15.1"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-reserved-words": {
+ "version": "7.18.6",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-reserved-words/-/plugin-transform-reserved-words-7.18.6.tgz",
+ "integrity": "sha512-oX/4MyMoypzHjFrT1CdivfKZ+XvIPMFXwwxHp/r0Ddy2Vuomt4HDFGmft1TAY2yiTKiNSsh3kjBAzcM8kSdsjA==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.18.6"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-runtime": {
+ "version": "7.21.0",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-runtime/-/plugin-transform-runtime-7.21.0.tgz",
+ "integrity": "sha512-ReY6pxwSzEU0b3r2/T/VhqMKg/AkceBT19X0UptA3/tYi5Pe2eXgEUH+NNMC5nok6c6XQz5tyVTUpuezRfSMSg==",
+ "dependencies": {
+ "@babel/helper-module-imports": "^7.18.6",
+ "@babel/helper-plugin-utils": "^7.20.2",
+ "babel-plugin-polyfill-corejs2": "^0.3.3",
+ "babel-plugin-polyfill-corejs3": "^0.6.0",
+ "babel-plugin-polyfill-regenerator": "^0.4.1",
+ "semver": "^6.3.0"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-runtime/node_modules/semver": {
+ "version": "6.3.1",
+ "resolved": "https://registry.npmjs.org/semver/-/semver-6.3.1.tgz",
+ "integrity": "sha512-BR7VvDCVHO+q2xBEWskxS6DJE1qRnb7DxzUrogb71CWoSficBxYsiAGd+Kl0mmq/MprG9yArRkyrQxTO6XjMzA==",
+ "bin": {
+ "semver": "bin/semver.js"
+ }
+ },
+ "node_modules/@babel/plugin-transform-shorthand-properties": {
+ "version": "7.18.6",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-shorthand-properties/-/plugin-transform-shorthand-properties-7.18.6.tgz",
+ "integrity": "sha512-eCLXXJqv8okzg86ywZJbRn19YJHU4XUa55oz2wbHhaQVn/MM+XhukiT7SYqp/7o00dg52Rj51Ny+Ecw4oyoygw==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.18.6"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-spread": {
+ "version": "7.20.7",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-spread/-/plugin-transform-spread-7.20.7.tgz",
+ "integrity": "sha512-ewBbHQ+1U/VnH1fxltbJqDeWBU1oNLG8Dj11uIv3xVf7nrQu0bPGe5Rf716r7K5Qz+SqtAOVswoVunoiBtGhxw==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.20.2",
+ "@babel/helper-skip-transparent-expression-wrappers": "^7.20.0"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-sticky-regex": {
+ "version": "7.18.6",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-sticky-regex/-/plugin-transform-sticky-regex-7.18.6.tgz",
+ "integrity": "sha512-kfiDrDQ+PBsQDO85yj1icueWMfGfJFKN1KCkndygtu/C9+XUfydLC8Iv5UYJqRwy4zk8EcplRxEOeLyjq1gm6Q==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.18.6"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-template-literals": {
+ "version": "7.18.9",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-template-literals/-/plugin-transform-template-literals-7.18.9.tgz",
+ "integrity": "sha512-S8cOWfT82gTezpYOiVaGHrCbhlHgKhQt8XH5ES46P2XWmX92yisoZywf5km75wv5sYcXDUCLMmMxOLCtthDgMA==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.18.9"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-typeof-symbol": {
+ "version": "7.18.9",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-typeof-symbol/-/plugin-transform-typeof-symbol-7.18.9.tgz",
+ "integrity": "sha512-SRfwTtF11G2aemAZWivL7PD+C9z52v9EvMqH9BuYbabyPuKUvSWks3oCg6041pT925L4zVFqaVBeECwsmlguEw==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.18.9"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-typescript": {
+ "version": "7.21.3",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-typescript/-/plugin-transform-typescript-7.21.3.tgz",
+ "integrity": "sha512-RQxPz6Iqt8T0uw/WsJNReuBpWpBqs/n7mNo18sKLoTbMp+UrEekhH+pKSVC7gWz+DNjo9gryfV8YzCiT45RgMw==",
+ "dependencies": {
+ "@babel/helper-annotate-as-pure": "^7.18.6",
+ "@babel/helper-create-class-features-plugin": "^7.21.0",
+ "@babel/helper-plugin-utils": "^7.20.2",
+ "@babel/plugin-syntax-typescript": "^7.20.0"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-unicode-escapes": {
+ "version": "7.18.10",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-unicode-escapes/-/plugin-transform-unicode-escapes-7.18.10.tgz",
+ "integrity": "sha512-kKAdAI+YzPgGY/ftStBFXTI1LZFju38rYThnfMykS+IXy8BVx+res7s2fxf1l8I35DV2T97ezo6+SGrXz6B3iQ==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.18.9"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/plugin-transform-unicode-regex": {
+ "version": "7.18.6",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-transform-unicode-regex/-/plugin-transform-unicode-regex-7.18.6.tgz",
+ "integrity": "sha512-gE7A6Lt7YLnNOL3Pb9BNeZvi+d8l7tcRrG4+pwJjK9hD2xX4mEvjlQW60G9EEmfXVYRPv9VRQcyegIVHCql/AA==",
+ "dependencies": {
+ "@babel/helper-create-regexp-features-plugin": "^7.18.6",
+ "@babel/helper-plugin-utils": "^7.18.6"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/preset-env": {
+ "version": "7.20.2",
+ "resolved": "https://registry.npmjs.org/@babel/preset-env/-/preset-env-7.20.2.tgz",
+ "integrity": "sha512-1G0efQEWR1EHkKvKHqbG+IN/QdgwfByUpM5V5QroDzGV2t3S/WXNQd693cHiHTlCFMpr9B6FkPFXDA2lQcKoDg==",
+ "dependencies": {
+ "@babel/compat-data": "^7.20.1",
+ "@babel/helper-compilation-targets": "^7.20.0",
+ "@babel/helper-plugin-utils": "^7.20.2",
+ "@babel/helper-validator-option": "^7.18.6",
+ "@babel/plugin-bugfix-safari-id-destructuring-collision-in-function-expression": "^7.18.6",
+ "@babel/plugin-bugfix-v8-spread-parameters-in-optional-chaining": "^7.18.9",
+ "@babel/plugin-proposal-async-generator-functions": "^7.20.1",
+ "@babel/plugin-proposal-class-properties": "^7.18.6",
+ "@babel/plugin-proposal-class-static-block": "^7.18.6",
+ "@babel/plugin-proposal-dynamic-import": "^7.18.6",
+ "@babel/plugin-proposal-export-namespace-from": "^7.18.9",
+ "@babel/plugin-proposal-json-strings": "^7.18.6",
+ "@babel/plugin-proposal-logical-assignment-operators": "^7.18.9",
+ "@babel/plugin-proposal-nullish-coalescing-operator": "^7.18.6",
+ "@babel/plugin-proposal-numeric-separator": "^7.18.6",
+ "@babel/plugin-proposal-object-rest-spread": "^7.20.2",
+ "@babel/plugin-proposal-optional-catch-binding": "^7.18.6",
+ "@babel/plugin-proposal-optional-chaining": "^7.18.9",
+ "@babel/plugin-proposal-private-methods": "^7.18.6",
+ "@babel/plugin-proposal-private-property-in-object": "^7.18.6",
+ "@babel/plugin-proposal-unicode-property-regex": "^7.18.6",
+ "@babel/plugin-syntax-async-generators": "^7.8.4",
+ "@babel/plugin-syntax-class-properties": "^7.12.13",
+ "@babel/plugin-syntax-class-static-block": "^7.14.5",
+ "@babel/plugin-syntax-dynamic-import": "^7.8.3",
+ "@babel/plugin-syntax-export-namespace-from": "^7.8.3",
+ "@babel/plugin-syntax-import-assertions": "^7.20.0",
+ "@babel/plugin-syntax-json-strings": "^7.8.3",
+ "@babel/plugin-syntax-logical-assignment-operators": "^7.10.4",
+ "@babel/plugin-syntax-nullish-coalescing-operator": "^7.8.3",
+ "@babel/plugin-syntax-numeric-separator": "^7.10.4",
+ "@babel/plugin-syntax-object-rest-spread": "^7.8.3",
+ "@babel/plugin-syntax-optional-catch-binding": "^7.8.3",
+ "@babel/plugin-syntax-optional-chaining": "^7.8.3",
+ "@babel/plugin-syntax-private-property-in-object": "^7.14.5",
+ "@babel/plugin-syntax-top-level-await": "^7.14.5",
+ "@babel/plugin-transform-arrow-functions": "^7.18.6",
+ "@babel/plugin-transform-async-to-generator": "^7.18.6",
+ "@babel/plugin-transform-block-scoped-functions": "^7.18.6",
+ "@babel/plugin-transform-block-scoping": "^7.20.2",
+ "@babel/plugin-transform-classes": "^7.20.2",
+ "@babel/plugin-transform-computed-properties": "^7.18.9",
+ "@babel/plugin-transform-destructuring": "^7.20.2",
+ "@babel/plugin-transform-dotall-regex": "^7.18.6",
+ "@babel/plugin-transform-duplicate-keys": "^7.18.9",
+ "@babel/plugin-transform-exponentiation-operator": "^7.18.6",
+ "@babel/plugin-transform-for-of": "^7.18.8",
+ "@babel/plugin-transform-function-name": "^7.18.9",
+ "@babel/plugin-transform-literals": "^7.18.9",
+ "@babel/plugin-transform-member-expression-literals": "^7.18.6",
+ "@babel/plugin-transform-modules-amd": "^7.19.6",
+ "@babel/plugin-transform-modules-commonjs": "^7.19.6",
+ "@babel/plugin-transform-modules-systemjs": "^7.19.6",
+ "@babel/plugin-transform-modules-umd": "^7.18.6",
+ "@babel/plugin-transform-named-capturing-groups-regex": "^7.19.1",
+ "@babel/plugin-transform-new-target": "^7.18.6",
+ "@babel/plugin-transform-object-super": "^7.18.6",
+ "@babel/plugin-transform-parameters": "^7.20.1",
+ "@babel/plugin-transform-property-literals": "^7.18.6",
+ "@babel/plugin-transform-regenerator": "^7.18.6",
+ "@babel/plugin-transform-reserved-words": "^7.18.6",
+ "@babel/plugin-transform-shorthand-properties": "^7.18.6",
+ "@babel/plugin-transform-spread": "^7.19.0",
+ "@babel/plugin-transform-sticky-regex": "^7.18.6",
+ "@babel/plugin-transform-template-literals": "^7.18.9",
+ "@babel/plugin-transform-typeof-symbol": "^7.18.9",
+ "@babel/plugin-transform-unicode-escapes": "^7.18.10",
+ "@babel/plugin-transform-unicode-regex": "^7.18.6",
+ "@babel/preset-modules": "^0.1.5",
+ "@babel/types": "^7.20.2",
+ "babel-plugin-polyfill-corejs2": "^0.3.3",
+ "babel-plugin-polyfill-corejs3": "^0.6.0",
+ "babel-plugin-polyfill-regenerator": "^0.4.1",
+ "core-js-compat": "^3.25.1",
+ "semver": "^6.3.0"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/preset-env/node_modules/semver": {
+ "version": "6.3.1",
+ "resolved": "https://registry.npmjs.org/semver/-/semver-6.3.1.tgz",
+ "integrity": "sha512-BR7VvDCVHO+q2xBEWskxS6DJE1qRnb7DxzUrogb71CWoSficBxYsiAGd+Kl0mmq/MprG9yArRkyrQxTO6XjMzA==",
+ "bin": {
+ "semver": "bin/semver.js"
+ }
+ },
+ "node_modules/@babel/preset-modules": {
+ "version": "0.1.5",
+ "resolved": "https://registry.npmjs.org/@babel/preset-modules/-/preset-modules-0.1.5.tgz",
+ "integrity": "sha512-A57th6YRG7oR3cq/yt/Y84MvGgE0eJG2F1JLhKuyG+jFxEgrd/HAMJatiFtmOiZurz+0DkrvbheCLaV5f2JfjA==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.0.0",
+ "@babel/plugin-proposal-unicode-property-regex": "^7.4.4",
+ "@babel/plugin-transform-dotall-regex": "^7.4.4",
+ "@babel/types": "^7.4.4",
+ "esutils": "^2.0.2"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/preset-react": {
+ "version": "7.18.6",
+ "resolved": "https://registry.npmjs.org/@babel/preset-react/-/preset-react-7.18.6.tgz",
+ "integrity": "sha512-zXr6atUmyYdiWRVLOZahakYmOBHtWc2WGCkP8PYTgZi0iJXDY2CN180TdrIW4OGOAdLc7TifzDIvtx6izaRIzg==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.18.6",
+ "@babel/helper-validator-option": "^7.18.6",
+ "@babel/plugin-transform-react-display-name": "^7.18.6",
+ "@babel/plugin-transform-react-jsx": "^7.18.6",
+ "@babel/plugin-transform-react-jsx-development": "^7.18.6",
+ "@babel/plugin-transform-react-pure-annotations": "^7.18.6"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/preset-typescript": {
+ "version": "7.21.0",
+ "resolved": "https://registry.npmjs.org/@babel/preset-typescript/-/preset-typescript-7.21.0.tgz",
+ "integrity": "sha512-myc9mpoVA5m1rF8K8DgLEatOYFDpwC+RkMkjZ0Du6uI62YvDe8uxIEYVs/VCdSJ097nlALiU/yBC7//3nI+hNg==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.20.2",
+ "@babel/helper-validator-option": "^7.21.0",
+ "@babel/plugin-transform-typescript": "^7.21.0"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@babel/regjsgen": {
+ "version": "0.8.0",
+ "resolved": "https://registry.npmjs.org/@babel/regjsgen/-/regjsgen-0.8.0.tgz",
+ "integrity": "sha512-x/rqGMdzj+fWZvCOYForTghzbtqPDZ5gPwaoNGHdgDfF2QA/XZbCBp4Moo5scrkAMPhB7z26XM/AaHuIJdgauA=="
+ },
+ "node_modules/@babel/runtime": {
+ "version": "7.27.0",
+ "resolved": "https://registry.npmjs.org/@babel/runtime/-/runtime-7.27.0.tgz",
+ "integrity": "sha512-VtPOkrdPHZsKc/clNqyi9WUA8TINkZ4cGk63UUE3u4pmB2k+ZMQRDuIOagv8UVd6j7k0T3+RRIb7beKTebNbcw==",
+ "license": "MIT",
+ "dependencies": {
+ "regenerator-runtime": "^0.14.0"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/runtime-corejs3": {
+ "version": "7.27.0",
+ "resolved": "https://registry.npmjs.org/@babel/runtime-corejs3/-/runtime-corejs3-7.27.0.tgz",
+ "integrity": "sha512-UWjX6t+v+0ckwZ50Y5ShZLnlk95pP5MyW/pon9tiYzl3+18pkTHTFNTKr7rQbfRXPkowt2QAn30o1b6oswszew==",
+ "license": "MIT",
+ "dependencies": {
+ "core-js-pure": "^3.30.2",
+ "regenerator-runtime": "^0.14.0"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/template": {
+ "version": "7.26.9",
+ "resolved": "https://registry.npmjs.org/@babel/template/-/template-7.26.9.tgz",
+ "integrity": "sha512-qyRplbeIpNZhmzOysF/wFMuP9sctmh2cFzRAZOn1YapxBsE1i9bJIY586R/WBLfLcmcBlM8ROBiQURnnNy+zfA==",
+ "license": "MIT",
+ "dependencies": {
+ "@babel/code-frame": "^7.26.2",
+ "@babel/parser": "^7.26.9",
+ "@babel/types": "^7.26.9"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/traverse": {
+ "version": "7.23.2",
+ "resolved": "https://registry.npmjs.org/@babel/traverse/-/traverse-7.23.2.tgz",
+ "integrity": "sha512-azpe59SQ48qG6nu2CzcMLbxUudtN+dOM9kDbUqGq3HXUJRlo7i8fvPoxQUzYgLZ4cMVmuZgm8vvBpNeRhd6XSw==",
+ "dependencies": {
+ "@babel/code-frame": "^7.22.13",
+ "@babel/generator": "^7.23.0",
+ "@babel/helper-environment-visitor": "^7.22.20",
+ "@babel/helper-function-name": "^7.23.0",
+ "@babel/helper-hoist-variables": "^7.22.5",
+ "@babel/helper-split-export-declaration": "^7.22.6",
+ "@babel/parser": "^7.23.0",
+ "@babel/types": "^7.23.0",
+ "debug": "^4.1.0",
+ "globals": "^11.1.0"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@babel/types": {
+ "version": "7.26.10",
+ "resolved": "https://registry.npmjs.org/@babel/types/-/types-7.26.10.tgz",
+ "integrity": "sha512-emqcG3vHrpxUKTrxcblR36dcrcoRDvKmnL/dCL6ZsHaShW80qxCAcNhzQZrpeM765VzEos+xOi4s+r4IXzTwdQ==",
+ "license": "MIT",
+ "dependencies": {
+ "@babel/helper-string-parser": "^7.25.9",
+ "@babel/helper-validator-identifier": "^7.25.9"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/@cmfcmf/docusaurus-search-local": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmjs.org/@cmfcmf/docusaurus-search-local/-/docusaurus-search-local-1.0.0.tgz",
+ "integrity": "sha512-glwxlOP3mo+sBxr1NchIZtrb7CXSAVl5vQpiDYj503W4X1H21zx0icXqSDckuK78j6JNMdCzo2VLI5XQRGh4pA==",
+ "dependencies": {
+ "@algolia/autocomplete-js": "^1.8.2",
+ "@algolia/autocomplete-theme-classic": "^1.8.2",
+ "@algolia/client-search": "^4.12.0",
+ "algoliasearch": "^4.12.0",
+ "cheerio": "^1.0.0-rc.9",
+ "clsx": "^1.1.1",
+ "lunr-languages": "^1.4.0",
+ "mark.js": "^8.11.1"
+ },
+ "peerDependencies": {
+ "@docusaurus/core": "^2.0.0",
+ "nodejieba": "^2.5.0"
+ },
+ "peerDependenciesMeta": {
+ "nodejieba": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/@colors/colors": {
+ "version": "1.5.0",
+ "resolved": "https://registry.npmjs.org/@colors/colors/-/colors-1.5.0.tgz",
+ "integrity": "sha512-ooWCrlZP11i8GImSjTHYHLkvFDP48nS4+204nGb1RiX/WXYHmJA2III9/e2DWVabCESdW7hBAEzHRqUn9OUVvQ==",
+ "optional": true,
+ "engines": {
+ "node": ">=0.1.90"
+ }
+ },
+ "node_modules/@discoveryjs/json-ext": {
+ "version": "0.5.7",
+ "resolved": "https://registry.npmjs.org/@discoveryjs/json-ext/-/json-ext-0.5.7.tgz",
+ "integrity": "sha512-dBVuXR082gk3jsFp7Rd/JI4kytwGHecnCoTtXFb7DB6CNHp4rg5k1bhg0nWdLGLnOV71lmDzGQaLMy8iPLY0pw==",
+ "engines": {
+ "node": ">=10.0.0"
+ }
+ },
+ "node_modules/@docsearch/css": {
+ "version": "3.3.3",
+ "resolved": "https://registry.npmjs.org/@docsearch/css/-/css-3.3.3.tgz",
+ "integrity": "sha512-6SCwI7P8ao+se1TUsdZ7B4XzL+gqeQZnBc+2EONZlcVa0dVrk0NjETxozFKgMv0eEGH8QzP1fkN+A1rH61l4eg=="
+ },
+ "node_modules/@docsearch/react": {
+ "version": "3.3.3",
+ "resolved": "https://registry.npmjs.org/@docsearch/react/-/react-3.3.3.tgz",
+ "integrity": "sha512-pLa0cxnl+G0FuIDuYlW+EBK6Rw2jwLw9B1RHIeS4N4s2VhsfJ/wzeCi3CWcs5yVfxLd5ZK50t//TMA5e79YT7Q==",
+ "dependencies": {
+ "@algolia/autocomplete-core": "1.7.4",
+ "@algolia/autocomplete-preset-algolia": "1.7.4",
+ "@docsearch/css": "3.3.3",
+ "algoliasearch": "^4.0.0"
+ },
+ "peerDependencies": {
+ "@types/react": ">= 16.8.0 < 19.0.0",
+ "react": ">= 16.8.0 < 19.0.0",
+ "react-dom": ">= 16.8.0 < 19.0.0"
+ },
+ "peerDependenciesMeta": {
+ "@types/react": {
+ "optional": true
+ },
+ "react": {
+ "optional": true
+ },
+ "react-dom": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/@docusaurus/core": {
+ "version": "2.4.0",
+ "resolved": "https://registry.npmjs.org/@docusaurus/core/-/core-2.4.0.tgz",
+ "integrity": "sha512-J55/WEoIpRcLf3afO5POHPguVZosKmJEQWKBL+K7TAnfuE7i+Y0NPLlkKtnWCehagGsgTqClfQEexH/UT4kELA==",
+ "dependencies": {
+ "@babel/core": "^7.18.6",
+ "@babel/generator": "^7.18.7",
+ "@babel/plugin-syntax-dynamic-import": "^7.8.3",
+ "@babel/plugin-transform-runtime": "^7.18.6",
+ "@babel/preset-env": "^7.18.6",
+ "@babel/preset-react": "^7.18.6",
+ "@babel/preset-typescript": "^7.18.6",
+ "@babel/runtime": "^7.18.6",
+ "@babel/runtime-corejs3": "^7.18.6",
+ "@babel/traverse": "^7.18.8",
+ "@docusaurus/cssnano-preset": "2.4.0",
+ "@docusaurus/logger": "2.4.0",
+ "@docusaurus/mdx-loader": "2.4.0",
+ "@docusaurus/react-loadable": "5.5.2",
+ "@docusaurus/utils": "2.4.0",
+ "@docusaurus/utils-common": "2.4.0",
+ "@docusaurus/utils-validation": "2.4.0",
+ "@slorber/static-site-generator-webpack-plugin": "^4.0.7",
+ "@svgr/webpack": "^6.2.1",
+ "autoprefixer": "^10.4.7",
+ "babel-loader": "^8.2.5",
+ "babel-plugin-dynamic-import-node": "^2.3.3",
+ "boxen": "^6.2.1",
+ "chalk": "^4.1.2",
+ "chokidar": "^3.5.3",
+ "clean-css": "^5.3.0",
+ "cli-table3": "^0.6.2",
+ "combine-promises": "^1.1.0",
+ "commander": "^5.1.0",
+ "copy-webpack-plugin": "^11.0.0",
+ "core-js": "^3.23.3",
+ "css-loader": "^6.7.1",
+ "css-minimizer-webpack-plugin": "^4.0.0",
+ "cssnano": "^5.1.12",
+ "del": "^6.1.1",
+ "detect-port": "^1.3.0",
+ "escape-html": "^1.0.3",
+ "eta": "^2.0.0",
+ "file-loader": "^6.2.0",
+ "fs-extra": "^10.1.0",
+ "html-minifier-terser": "^6.1.0",
+ "html-tags": "^3.2.0",
+ "html-webpack-plugin": "^5.5.0",
+ "import-fresh": "^3.3.0",
+ "leven": "^3.1.0",
+ "lodash": "^4.17.21",
+ "mini-css-extract-plugin": "^2.6.1",
+ "postcss": "^8.4.14",
+ "postcss-loader": "^7.0.0",
+ "prompts": "^2.4.2",
+ "react-dev-utils": "^12.0.1",
+ "react-helmet-async": "^1.3.0",
+ "react-loadable": "npm:@docusaurus/react-loadable@5.5.2",
+ "react-loadable-ssr-addon-v5-slorber": "^1.0.1",
+ "react-router": "^5.3.3",
+ "react-router-config": "^5.1.1",
+ "react-router-dom": "^5.3.3",
+ "rtl-detect": "^1.0.4",
+ "semver": "^7.3.7",
+ "serve-handler": "^6.1.3",
+ "shelljs": "^0.8.5",
+ "terser-webpack-plugin": "^5.3.3",
+ "tslib": "^2.4.0",
+ "update-notifier": "^5.1.0",
+ "url-loader": "^4.1.1",
+ "wait-on": "^6.0.1",
+ "webpack": "^5.73.0",
+ "webpack-bundle-analyzer": "^4.5.0",
+ "webpack-dev-server": "^4.9.3",
+ "webpack-merge": "^5.8.0",
+ "webpackbar": "^5.0.2"
+ },
+ "bin": {
+ "docusaurus": "bin/docusaurus.mjs"
+ },
+ "engines": {
+ "node": ">=16.14"
+ },
+ "peerDependencies": {
+ "react": "^16.8.4 || ^17.0.0",
+ "react-dom": "^16.8.4 || ^17.0.0"
+ }
+ },
+ "node_modules/@docusaurus/cssnano-preset": {
+ "version": "2.4.0",
+ "resolved": "https://registry.npmjs.org/@docusaurus/cssnano-preset/-/cssnano-preset-2.4.0.tgz",
+ "integrity": "sha512-RmdiA3IpsLgZGXRzqnmTbGv43W4OD44PCo+6Q/aYjEM2V57vKCVqNzuafE94jv0z/PjHoXUrjr69SaRymBKYYw==",
+ "dependencies": {
+ "cssnano-preset-advanced": "^5.3.8",
+ "postcss": "^8.4.14",
+ "postcss-sort-media-queries": "^4.2.1",
+ "tslib": "^2.4.0"
+ },
+ "engines": {
+ "node": ">=16.14"
+ }
+ },
+ "node_modules/@docusaurus/logger": {
+ "version": "2.4.0",
+ "resolved": "https://registry.npmjs.org/@docusaurus/logger/-/logger-2.4.0.tgz",
+ "integrity": "sha512-T8+qR4APN+MjcC9yL2Es+xPJ2923S9hpzDmMtdsOcUGLqpCGBbU1vp3AAqDwXtVgFkq+NsEk7sHdVsfLWR/AXw==",
+ "dependencies": {
+ "chalk": "^4.1.2",
+ "tslib": "^2.4.0"
+ },
+ "engines": {
+ "node": ">=16.14"
+ }
+ },
+ "node_modules/@docusaurus/mdx-loader": {
+ "version": "2.4.0",
+ "resolved": "https://registry.npmjs.org/@docusaurus/mdx-loader/-/mdx-loader-2.4.0.tgz",
+ "integrity": "sha512-GWoH4izZKOmFoC+gbI2/y8deH/xKLvzz/T5BsEexBye8EHQlwsA7FMrVa48N063bJBH4FUOiRRXxk5rq9cC36g==",
+ "dependencies": {
+ "@babel/parser": "^7.18.8",
+ "@babel/traverse": "^7.18.8",
+ "@docusaurus/logger": "2.4.0",
+ "@docusaurus/utils": "2.4.0",
+ "@mdx-js/mdx": "^1.6.22",
+ "escape-html": "^1.0.3",
+ "file-loader": "^6.2.0",
+ "fs-extra": "^10.1.0",
+ "image-size": "^1.0.1",
+ "mdast-util-to-string": "^2.0.0",
+ "remark-emoji": "^2.2.0",
+ "stringify-object": "^3.3.0",
+ "tslib": "^2.4.0",
+ "unified": "^9.2.2",
+ "unist-util-visit": "^2.0.3",
+ "url-loader": "^4.1.1",
+ "webpack": "^5.73.0"
+ },
+ "engines": {
+ "node": ">=16.14"
+ },
+ "peerDependencies": {
+ "react": "^16.8.4 || ^17.0.0",
+ "react-dom": "^16.8.4 || ^17.0.0"
+ }
+ },
+ "node_modules/@docusaurus/module-type-aliases": {
+ "version": "2.4.0",
+ "resolved": "https://registry.npmjs.org/@docusaurus/module-type-aliases/-/module-type-aliases-2.4.0.tgz",
+ "integrity": "sha512-YEQO2D3UXs72qCn8Cr+RlycSQXVGN9iEUyuHwTuK4/uL/HFomB2FHSU0vSDM23oLd+X/KibQ3Ez6nGjQLqXcHg==",
+ "dependencies": {
+ "@docusaurus/react-loadable": "5.5.2",
+ "@docusaurus/types": "2.4.0",
+ "@types/history": "^4.7.11",
+ "@types/react": "*",
+ "@types/react-router-config": "*",
+ "@types/react-router-dom": "*",
+ "react-helmet-async": "*",
+ "react-loadable": "npm:@docusaurus/react-loadable@5.5.2"
+ },
+ "peerDependencies": {
+ "react": "*",
+ "react-dom": "*"
+ }
+ },
+ "node_modules/@docusaurus/plugin-content-blog": {
+ "version": "2.4.0",
+ "resolved": "https://registry.npmjs.org/@docusaurus/plugin-content-blog/-/plugin-content-blog-2.4.0.tgz",
+ "integrity": "sha512-YwkAkVUxtxoBAIj/MCb4ohN0SCtHBs4AS75jMhPpf67qf3j+U/4n33cELq7567hwyZ6fMz2GPJcVmctzlGGThQ==",
+ "dependencies": {
+ "@docusaurus/core": "2.4.0",
+ "@docusaurus/logger": "2.4.0",
+ "@docusaurus/mdx-loader": "2.4.0",
+ "@docusaurus/types": "2.4.0",
+ "@docusaurus/utils": "2.4.0",
+ "@docusaurus/utils-common": "2.4.0",
+ "@docusaurus/utils-validation": "2.4.0",
+ "cheerio": "^1.0.0-rc.12",
+ "feed": "^4.2.2",
+ "fs-extra": "^10.1.0",
+ "lodash": "^4.17.21",
+ "reading-time": "^1.5.0",
+ "tslib": "^2.4.0",
+ "unist-util-visit": "^2.0.3",
+ "utility-types": "^3.10.0",
+ "webpack": "^5.73.0"
+ },
+ "engines": {
+ "node": ">=16.14"
+ },
+ "peerDependencies": {
+ "react": "^16.8.4 || ^17.0.0",
+ "react-dom": "^16.8.4 || ^17.0.0"
+ }
+ },
+ "node_modules/@docusaurus/plugin-content-docs": {
+ "version": "2.4.0",
+ "resolved": "https://registry.npmjs.org/@docusaurus/plugin-content-docs/-/plugin-content-docs-2.4.0.tgz",
+ "integrity": "sha512-ic/Z/ZN5Rk/RQo+Io6rUGpToOtNbtPloMR2JcGwC1xT2riMu6zzfSwmBi9tHJgdXH6CB5jG+0dOZZO8QS5tmDg==",
+ "dependencies": {
+ "@docusaurus/core": "2.4.0",
+ "@docusaurus/logger": "2.4.0",
+ "@docusaurus/mdx-loader": "2.4.0",
+ "@docusaurus/module-type-aliases": "2.4.0",
+ "@docusaurus/types": "2.4.0",
+ "@docusaurus/utils": "2.4.0",
+ "@docusaurus/utils-validation": "2.4.0",
+ "@types/react-router-config": "^5.0.6",
+ "combine-promises": "^1.1.0",
+ "fs-extra": "^10.1.0",
+ "import-fresh": "^3.3.0",
+ "js-yaml": "^4.1.0",
+ "lodash": "^4.17.21",
+ "tslib": "^2.4.0",
+ "utility-types": "^3.10.0",
+ "webpack": "^5.73.0"
+ },
+ "engines": {
+ "node": ">=16.14"
+ },
+ "peerDependencies": {
+ "react": "^16.8.4 || ^17.0.0",
+ "react-dom": "^16.8.4 || ^17.0.0"
+ }
+ },
+ "node_modules/@docusaurus/plugin-content-pages": {
+ "version": "2.4.0",
+ "resolved": "https://registry.npmjs.org/@docusaurus/plugin-content-pages/-/plugin-content-pages-2.4.0.tgz",
+ "integrity": "sha512-Pk2pOeOxk8MeU3mrTU0XLIgP9NZixbdcJmJ7RUFrZp1Aj42nd0RhIT14BGvXXyqb8yTQlk4DmYGAzqOfBsFyGw==",
+ "dependencies": {
+ "@docusaurus/core": "2.4.0",
+ "@docusaurus/mdx-loader": "2.4.0",
+ "@docusaurus/types": "2.4.0",
+ "@docusaurus/utils": "2.4.0",
+ "@docusaurus/utils-validation": "2.4.0",
+ "fs-extra": "^10.1.0",
+ "tslib": "^2.4.0",
+ "webpack": "^5.73.0"
+ },
+ "engines": {
+ "node": ">=16.14"
+ },
+ "peerDependencies": {
+ "react": "^16.8.4 || ^17.0.0",
+ "react-dom": "^16.8.4 || ^17.0.0"
+ }
+ },
+ "node_modules/@docusaurus/plugin-debug": {
+ "version": "2.4.0",
+ "resolved": "https://registry.npmjs.org/@docusaurus/plugin-debug/-/plugin-debug-2.4.0.tgz",
+ "integrity": "sha512-KC56DdYjYT7Txyux71vXHXGYZuP6yYtqwClvYpjKreWIHWus5Zt6VNi23rMZv3/QKhOCrN64zplUbdfQMvddBQ==",
+ "dependencies": {
+ "@docusaurus/core": "2.4.0",
+ "@docusaurus/types": "2.4.0",
+ "@docusaurus/utils": "2.4.0",
+ "fs-extra": "^10.1.0",
+ "react-json-view": "^1.21.3",
+ "tslib": "^2.4.0"
+ },
+ "engines": {
+ "node": ">=16.14"
+ },
+ "peerDependencies": {
+ "react": "^16.8.4 || ^17.0.0",
+ "react-dom": "^16.8.4 || ^17.0.0"
+ }
+ },
+ "node_modules/@docusaurus/plugin-google-analytics": {
+ "version": "2.4.0",
+ "resolved": "https://registry.npmjs.org/@docusaurus/plugin-google-analytics/-/plugin-google-analytics-2.4.0.tgz",
+ "integrity": "sha512-uGUzX67DOAIglygdNrmMOvEp8qG03X20jMWadeqVQktS6nADvozpSLGx4J0xbkblhJkUzN21WiilsP9iVP+zkw==",
+ "dependencies": {
+ "@docusaurus/core": "2.4.0",
+ "@docusaurus/types": "2.4.0",
+ "@docusaurus/utils-validation": "2.4.0",
+ "tslib": "^2.4.0"
+ },
+ "engines": {
+ "node": ">=16.14"
+ },
+ "peerDependencies": {
+ "react": "^16.8.4 || ^17.0.0",
+ "react-dom": "^16.8.4 || ^17.0.0"
+ }
+ },
+ "node_modules/@docusaurus/plugin-google-gtag": {
+ "version": "2.4.0",
+ "resolved": "https://registry.npmjs.org/@docusaurus/plugin-google-gtag/-/plugin-google-gtag-2.4.0.tgz",
+ "integrity": "sha512-adj/70DANaQs2+TF/nRdMezDXFAV/O/pjAbUgmKBlyOTq5qoMe0Tk4muvQIwWUmiUQxFJe+sKlZGM771ownyOg==",
+ "dependencies": {
+ "@docusaurus/core": "2.4.0",
+ "@docusaurus/types": "2.4.0",
+ "@docusaurus/utils-validation": "2.4.0",
+ "tslib": "^2.4.0"
+ },
+ "engines": {
+ "node": ">=16.14"
+ },
+ "peerDependencies": {
+ "react": "^16.8.4 || ^17.0.0",
+ "react-dom": "^16.8.4 || ^17.0.0"
+ }
+ },
+ "node_modules/@docusaurus/plugin-google-tag-manager": {
+ "version": "2.4.0",
+ "resolved": "https://registry.npmjs.org/@docusaurus/plugin-google-tag-manager/-/plugin-google-tag-manager-2.4.0.tgz",
+ "integrity": "sha512-E66uGcYs4l7yitmp/8kMEVQftFPwV9iC62ORh47Veqzs6ExwnhzBkJmwDnwIysHBF1vlxnzET0Fl2LfL5fRR3A==",
+ "dependencies": {
+ "@docusaurus/core": "2.4.0",
+ "@docusaurus/types": "2.4.0",
+ "@docusaurus/utils-validation": "2.4.0",
+ "tslib": "^2.4.0"
+ },
+ "engines": {
+ "node": ">=16.14"
+ },
+ "peerDependencies": {
+ "react": "^16.8.4 || ^17.0.0",
+ "react-dom": "^16.8.4 || ^17.0.0"
+ }
+ },
+ "node_modules/@docusaurus/plugin-sitemap": {
+ "version": "2.4.0",
+ "resolved": "https://registry.npmjs.org/@docusaurus/plugin-sitemap/-/plugin-sitemap-2.4.0.tgz",
+ "integrity": "sha512-pZxh+ygfnI657sN8a/FkYVIAmVv0CGk71QMKqJBOfMmDHNN1FeDeFkBjWP49ejBqpqAhjufkv5UWq3UOu2soCw==",
+ "dependencies": {
+ "@docusaurus/core": "2.4.0",
+ "@docusaurus/logger": "2.4.0",
+ "@docusaurus/types": "2.4.0",
+ "@docusaurus/utils": "2.4.0",
+ "@docusaurus/utils-common": "2.4.0",
+ "@docusaurus/utils-validation": "2.4.0",
+ "fs-extra": "^10.1.0",
+ "sitemap": "^7.1.1",
+ "tslib": "^2.4.0"
+ },
+ "engines": {
+ "node": ">=16.14"
+ },
+ "peerDependencies": {
+ "react": "^16.8.4 || ^17.0.0",
+ "react-dom": "^16.8.4 || ^17.0.0"
+ }
+ },
+ "node_modules/@docusaurus/preset-classic": {
+ "version": "2.4.0",
+ "resolved": "https://registry.npmjs.org/@docusaurus/preset-classic/-/preset-classic-2.4.0.tgz",
+ "integrity": "sha512-/5z5o/9bc6+P5ool2y01PbJhoGddEGsC0ej1MF6mCoazk8A+kW4feoUd68l7Bnv01rCnG3xy7kHUQP97Y0grUA==",
+ "dependencies": {
+ "@docusaurus/core": "2.4.0",
+ "@docusaurus/plugin-content-blog": "2.4.0",
+ "@docusaurus/plugin-content-docs": "2.4.0",
+ "@docusaurus/plugin-content-pages": "2.4.0",
+ "@docusaurus/plugin-debug": "2.4.0",
+ "@docusaurus/plugin-google-analytics": "2.4.0",
+ "@docusaurus/plugin-google-gtag": "2.4.0",
+ "@docusaurus/plugin-google-tag-manager": "2.4.0",
+ "@docusaurus/plugin-sitemap": "2.4.0",
+ "@docusaurus/theme-classic": "2.4.0",
+ "@docusaurus/theme-common": "2.4.0",
+ "@docusaurus/theme-search-algolia": "2.4.0",
+ "@docusaurus/types": "2.4.0"
+ },
+ "engines": {
+ "node": ">=16.14"
+ },
+ "peerDependencies": {
+ "react": "^16.8.4 || ^17.0.0",
+ "react-dom": "^16.8.4 || ^17.0.0"
+ }
+ },
+ "node_modules/@docusaurus/react-loadable": {
+ "version": "5.5.2",
+ "resolved": "https://registry.npmjs.org/@docusaurus/react-loadable/-/react-loadable-5.5.2.tgz",
+ "integrity": "sha512-A3dYjdBGuy0IGT+wyLIGIKLRE+sAk1iNk0f1HjNDysO7u8lhL4N3VEm+FAubmJbAztn94F7MxBTPmnixbiyFdQ==",
+ "dependencies": {
+ "@types/react": "*",
+ "prop-types": "^15.6.2"
+ },
+ "peerDependencies": {
+ "react": "*"
+ }
+ },
+ "node_modules/@docusaurus/theme-classic": {
+ "version": "2.4.0",
+ "resolved": "https://registry.npmjs.org/@docusaurus/theme-classic/-/theme-classic-2.4.0.tgz",
+ "integrity": "sha512-GMDX5WU6Z0OC65eQFgl3iNNEbI9IMJz9f6KnOyuMxNUR6q0qVLsKCNopFUDfFNJ55UU50o7P7o21yVhkwpfJ9w==",
+ "dependencies": {
+ "@docusaurus/core": "2.4.0",
+ "@docusaurus/mdx-loader": "2.4.0",
+ "@docusaurus/module-type-aliases": "2.4.0",
+ "@docusaurus/plugin-content-blog": "2.4.0",
+ "@docusaurus/plugin-content-docs": "2.4.0",
+ "@docusaurus/plugin-content-pages": "2.4.0",
+ "@docusaurus/theme-common": "2.4.0",
+ "@docusaurus/theme-translations": "2.4.0",
+ "@docusaurus/types": "2.4.0",
+ "@docusaurus/utils": "2.4.0",
+ "@docusaurus/utils-common": "2.4.0",
+ "@docusaurus/utils-validation": "2.4.0",
+ "@mdx-js/react": "^1.6.22",
+ "clsx": "^1.2.1",
+ "copy-text-to-clipboard": "^3.0.1",
+ "infima": "0.2.0-alpha.43",
+ "lodash": "^4.17.21",
+ "nprogress": "^0.2.0",
+ "postcss": "^8.4.14",
+ "prism-react-renderer": "^1.3.5",
+ "prismjs": "^1.28.0",
+ "react-router-dom": "^5.3.3",
+ "rtlcss": "^3.5.0",
+ "tslib": "^2.4.0",
+ "utility-types": "^3.10.0"
+ },
+ "engines": {
+ "node": ">=16.14"
+ },
+ "peerDependencies": {
+ "react": "^16.8.4 || ^17.0.0",
+ "react-dom": "^16.8.4 || ^17.0.0"
+ }
+ },
+ "node_modules/@docusaurus/theme-common": {
+ "version": "2.4.0",
+ "resolved": "https://registry.npmjs.org/@docusaurus/theme-common/-/theme-common-2.4.0.tgz",
+ "integrity": "sha512-IkG/l5f/FLY6cBIxtPmFnxpuPzc5TupuqlOx+XDN+035MdQcAh8wHXXZJAkTeYDeZ3anIUSUIvWa7/nRKoQEfg==",
+ "dependencies": {
+ "@docusaurus/mdx-loader": "2.4.0",
+ "@docusaurus/module-type-aliases": "2.4.0",
+ "@docusaurus/plugin-content-blog": "2.4.0",
+ "@docusaurus/plugin-content-docs": "2.4.0",
+ "@docusaurus/plugin-content-pages": "2.4.0",
+ "@docusaurus/utils": "2.4.0",
+ "@docusaurus/utils-common": "2.4.0",
+ "@types/history": "^4.7.11",
+ "@types/react": "*",
+ "@types/react-router-config": "*",
+ "clsx": "^1.2.1",
+ "parse-numeric-range": "^1.3.0",
+ "prism-react-renderer": "^1.3.5",
+ "tslib": "^2.4.0",
+ "use-sync-external-store": "^1.2.0",
+ "utility-types": "^3.10.0"
+ },
+ "engines": {
+ "node": ">=16.14"
+ },
+ "peerDependencies": {
+ "react": "^16.8.4 || ^17.0.0",
+ "react-dom": "^16.8.4 || ^17.0.0"
+ }
+ },
+ "node_modules/@docusaurus/theme-search-algolia": {
+ "version": "2.4.0",
+ "resolved": "https://registry.npmjs.org/@docusaurus/theme-search-algolia/-/theme-search-algolia-2.4.0.tgz",
+ "integrity": "sha512-pPCJSCL1Qt4pu/Z0uxBAuke0yEBbxh0s4fOvimna7TEcBLPq0x06/K78AaABXrTVQM6S0vdocFl9EoNgU17hqA==",
+ "dependencies": {
+ "@docsearch/react": "^3.1.1",
+ "@docusaurus/core": "2.4.0",
+ "@docusaurus/logger": "2.4.0",
+ "@docusaurus/plugin-content-docs": "2.4.0",
+ "@docusaurus/theme-common": "2.4.0",
+ "@docusaurus/theme-translations": "2.4.0",
+ "@docusaurus/utils": "2.4.0",
+ "@docusaurus/utils-validation": "2.4.0",
+ "algoliasearch": "^4.13.1",
+ "algoliasearch-helper": "^3.10.0",
+ "clsx": "^1.2.1",
+ "eta": "^2.0.0",
+ "fs-extra": "^10.1.0",
+ "lodash": "^4.17.21",
+ "tslib": "^2.4.0",
+ "utility-types": "^3.10.0"
+ },
+ "engines": {
+ "node": ">=16.14"
+ },
+ "peerDependencies": {
+ "react": "^16.8.4 || ^17.0.0",
+ "react-dom": "^16.8.4 || ^17.0.0"
+ }
+ },
+ "node_modules/@docusaurus/theme-translations": {
+ "version": "2.4.0",
+ "resolved": "https://registry.npmjs.org/@docusaurus/theme-translations/-/theme-translations-2.4.0.tgz",
+ "integrity": "sha512-kEoITnPXzDPUMBHk3+fzEzbopxLD3fR5sDoayNH0vXkpUukA88/aDL1bqkhxWZHA3LOfJ3f0vJbOwmnXW5v85Q==",
+ "dependencies": {
+ "fs-extra": "^10.1.0",
+ "tslib": "^2.4.0"
+ },
+ "engines": {
+ "node": ">=16.14"
+ }
+ },
+ "node_modules/@docusaurus/types": {
+ "version": "2.4.0",
+ "resolved": "https://registry.npmjs.org/@docusaurus/types/-/types-2.4.0.tgz",
+ "integrity": "sha512-xaBXr+KIPDkIaef06c+i2HeTqVNixB7yFut5fBXPGI2f1rrmEV2vLMznNGsFwvZ5XmA3Quuefd4OGRkdo97Dhw==",
+ "dependencies": {
+ "@types/history": "^4.7.11",
+ "@types/react": "*",
+ "commander": "^5.1.0",
+ "joi": "^17.6.0",
+ "react-helmet-async": "^1.3.0",
+ "utility-types": "^3.10.0",
+ "webpack": "^5.73.0",
+ "webpack-merge": "^5.8.0"
+ },
+ "peerDependencies": {
+ "react": "^16.8.4 || ^17.0.0",
+ "react-dom": "^16.8.4 || ^17.0.0"
+ }
+ },
+ "node_modules/@docusaurus/utils": {
+ "version": "2.4.0",
+ "resolved": "https://registry.npmjs.org/@docusaurus/utils/-/utils-2.4.0.tgz",
+ "integrity": "sha512-89hLYkvtRX92j+C+ERYTuSUK6nF9bGM32QThcHPg2EDDHVw6FzYQXmX6/p+pU5SDyyx5nBlE4qXR92RxCAOqfg==",
+ "dependencies": {
+ "@docusaurus/logger": "2.4.0",
+ "@svgr/webpack": "^6.2.1",
+ "escape-string-regexp": "^4.0.0",
+ "file-loader": "^6.2.0",
+ "fs-extra": "^10.1.0",
+ "github-slugger": "^1.4.0",
+ "globby": "^11.1.0",
+ "gray-matter": "^4.0.3",
+ "js-yaml": "^4.1.0",
+ "lodash": "^4.17.21",
+ "micromatch": "^4.0.5",
+ "resolve-pathname": "^3.0.0",
+ "shelljs": "^0.8.5",
+ "tslib": "^2.4.0",
+ "url-loader": "^4.1.1",
+ "webpack": "^5.73.0"
+ },
+ "engines": {
+ "node": ">=16.14"
+ },
+ "peerDependencies": {
+ "@docusaurus/types": "*"
+ },
+ "peerDependenciesMeta": {
+ "@docusaurus/types": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/@docusaurus/utils-common": {
+ "version": "2.4.0",
+ "resolved": "https://registry.npmjs.org/@docusaurus/utils-common/-/utils-common-2.4.0.tgz",
+ "integrity": "sha512-zIMf10xuKxddYfLg5cS19x44zud/E9I7lj3+0bv8UIs0aahpErfNrGhijEfJpAfikhQ8tL3m35nH3hJ3sOG82A==",
+ "dependencies": {
+ "tslib": "^2.4.0"
+ },
+ "engines": {
+ "node": ">=16.14"
+ },
+ "peerDependencies": {
+ "@docusaurus/types": "*"
+ },
+ "peerDependenciesMeta": {
+ "@docusaurus/types": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/@docusaurus/utils-validation": {
+ "version": "2.4.0",
+ "resolved": "https://registry.npmjs.org/@docusaurus/utils-validation/-/utils-validation-2.4.0.tgz",
+ "integrity": "sha512-IrBsBbbAp6y7mZdJx4S4pIA7dUyWSA0GNosPk6ZJ0fX3uYIEQgcQSGIgTeSC+8xPEx3c16o03en1jSDpgQgz/w==",
+ "dependencies": {
+ "@docusaurus/logger": "2.4.0",
+ "@docusaurus/utils": "2.4.0",
+ "joi": "^17.6.0",
+ "js-yaml": "^4.1.0",
+ "tslib": "^2.4.0"
+ },
+ "engines": {
+ "node": ">=16.14"
+ }
+ },
+ "node_modules/@emotion/is-prop-valid": {
+ "version": "1.2.0",
+ "resolved": "https://registry.npmjs.org/@emotion/is-prop-valid/-/is-prop-valid-1.2.0.tgz",
+ "integrity": "sha512-3aDpDprjM0AwaxGE09bOPkNxHpBd+kA6jty3RnaEXdweX1DF1U3VQpPYb0g1IStAuK7SVQ1cy+bNBBKp4W3Fjg==",
+ "dependencies": {
+ "@emotion/memoize": "^0.8.0"
+ }
+ },
+ "node_modules/@emotion/memoize": {
+ "version": "0.8.0",
+ "resolved": "https://registry.npmjs.org/@emotion/memoize/-/memoize-0.8.0.tgz",
+ "integrity": "sha512-G/YwXTkv7Den9mXDO7AhLWkE3q+I92B+VqAE+dYG4NGPaHZGvt3G8Q0p9vmE+sq7rTGphUbAvmQ9YpbfMQGGlA=="
+ },
+ "node_modules/@emotion/stylis": {
+ "version": "0.8.5",
+ "resolved": "https://registry.npmjs.org/@emotion/stylis/-/stylis-0.8.5.tgz",
+ "integrity": "sha512-h6KtPihKFn3T9fuIrwvXXUOwlx3rfUvfZIcP5a6rh8Y7zjE3O06hT5Ss4S/YI1AYhuZ1kjaE/5EaOOI2NqSylQ=="
+ },
+ "node_modules/@emotion/unitless": {
+ "version": "0.7.5",
+ "resolved": "https://registry.npmjs.org/@emotion/unitless/-/unitless-0.7.5.tgz",
+ "integrity": "sha512-OWORNpfjMsSSUBVrRBVGECkhWcULOAJz9ZW8uK9qgxD+87M7jHRcvh/A96XXNhXTLmKcoYSQtBEX7lHMO7YRwg=="
+ },
+ "node_modules/@exodus/schemasafe": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmjs.org/@exodus/schemasafe/-/schemasafe-1.0.0.tgz",
+ "integrity": "sha512-2cyupPIZI69HQxEAPllLXBjQp4njDKkOjYRCYxvMZe3/LY9pp9fBM3Tb1wiFAdP6Emo4v3OEbCLGj6u73Q5KLw=="
+ },
+ "node_modules/@hapi/hoek": {
+ "version": "9.3.0",
+ "resolved": "https://registry.npmjs.org/@hapi/hoek/-/hoek-9.3.0.tgz",
+ "integrity": "sha512-/c6rf4UJlmHlC9b5BaNvzAcFv7HZ2QHaV0D4/HNlBdvFnvQq8RI4kYdhyPCl7Xj+oWvTWQ8ujhqS53LIgAe6KQ=="
+ },
+ "node_modules/@hapi/topo": {
+ "version": "5.1.0",
+ "resolved": "https://registry.npmjs.org/@hapi/topo/-/topo-5.1.0.tgz",
+ "integrity": "sha512-foQZKJig7Ob0BMAYBfcJk8d77QtOe7Wo4ox7ff1lQYoNNAb6jwcY1ncdoy2e9wQZzvNy7ODZCYJkK8kzmcAnAg==",
+ "dependencies": {
+ "@hapi/hoek": "^9.0.0"
+ }
+ },
+ "node_modules/@jest/schemas": {
+ "version": "29.4.3",
+ "resolved": "https://registry.npmjs.org/@jest/schemas/-/schemas-29.4.3.tgz",
+ "integrity": "sha512-VLYKXQmtmuEz6IxJsrZwzG9NvtkQsWNnWMsKxqWNu3+CnfzJQhp0WDDKWLVV9hLKr0l3SLLFRqcYHjhtyuDVxg==",
+ "dependencies": {
+ "@sinclair/typebox": "^0.25.16"
+ },
+ "engines": {
+ "node": "^14.15.0 || ^16.10.0 || >=18.0.0"
+ }
+ },
+ "node_modules/@jest/types": {
+ "version": "29.5.0",
+ "resolved": "https://registry.npmjs.org/@jest/types/-/types-29.5.0.tgz",
+ "integrity": "sha512-qbu7kN6czmVRc3xWFQcAN03RAUamgppVUdXrvl1Wr3jlNF93o9mJbGcDWrwGB6ht44u7efB1qCFgVQmca24Uog==",
+ "dependencies": {
+ "@jest/schemas": "^29.4.3",
+ "@types/istanbul-lib-coverage": "^2.0.0",
+ "@types/istanbul-reports": "^3.0.0",
+ "@types/node": "*",
+ "@types/yargs": "^17.0.8",
+ "chalk": "^4.0.0"
+ },
+ "engines": {
+ "node": "^14.15.0 || ^16.10.0 || >=18.0.0"
+ }
+ },
+ "node_modules/@jridgewell/gen-mapping": {
+ "version": "0.1.1",
+ "resolved": "https://registry.npmjs.org/@jridgewell/gen-mapping/-/gen-mapping-0.1.1.tgz",
+ "integrity": "sha512-sQXCasFk+U8lWYEe66WxRDOE9PjVz4vSM51fTu3Hw+ClTpUSQb718772vH3pyS5pShp6lvQM7SxgIDXXXmOX7w==",
+ "dependencies": {
+ "@jridgewell/set-array": "^1.0.0",
+ "@jridgewell/sourcemap-codec": "^1.4.10"
+ },
+ "engines": {
+ "node": ">=6.0.0"
+ }
+ },
+ "node_modules/@jridgewell/resolve-uri": {
+ "version": "3.1.0",
+ "resolved": "https://registry.npmjs.org/@jridgewell/resolve-uri/-/resolve-uri-3.1.0.tgz",
+ "integrity": "sha512-F2msla3tad+Mfht5cJq7LSXcdudKTWCVYUgw6pLFOOHSTtZlj6SWNYAp+AhuqLmWdBO2X5hPrLcu8cVP8fy28w==",
+ "engines": {
+ "node": ">=6.0.0"
+ }
+ },
+ "node_modules/@jridgewell/set-array": {
+ "version": "1.2.1",
+ "resolved": "https://registry.npmjs.org/@jridgewell/set-array/-/set-array-1.2.1.tgz",
+ "integrity": "sha512-R8gLRTZeyp03ymzP/6Lil/28tGeGEzhx1q2k703KGWRAI1VdvPIXdG70VJc2pAMw3NA6JKL5hhFu1sJX0Mnn/A==",
+ "engines": {
+ "node": ">=6.0.0"
+ }
+ },
+ "node_modules/@jridgewell/source-map": {
+ "version": "0.3.6",
+ "resolved": "https://registry.npmjs.org/@jridgewell/source-map/-/source-map-0.3.6.tgz",
+ "integrity": "sha512-1ZJTZebgqllO79ue2bm3rIGud/bOe0pP5BjSRCRxxYkEZS8STV7zN84UBbiYu7jy+eCKSnVIUgoWWE/tt+shMQ==",
+ "dependencies": {
+ "@jridgewell/gen-mapping": "^0.3.5",
+ "@jridgewell/trace-mapping": "^0.3.25"
+ }
+ },
+ "node_modules/@jridgewell/source-map/node_modules/@jridgewell/gen-mapping": {
+ "version": "0.3.5",
+ "resolved": "https://registry.npmjs.org/@jridgewell/gen-mapping/-/gen-mapping-0.3.5.tgz",
+ "integrity": "sha512-IzL8ZoEDIBRWEzlCcRhOaCupYyN5gdIK+Q6fbFdPDg6HqX6jpkItn7DFIpW9LQzXG6Df9sA7+OKnq0qlz/GaQg==",
+ "dependencies": {
+ "@jridgewell/set-array": "^1.2.1",
+ "@jridgewell/sourcemap-codec": "^1.4.10",
+ "@jridgewell/trace-mapping": "^0.3.24"
+ },
+ "engines": {
+ "node": ">=6.0.0"
+ }
+ },
+ "node_modules/@jridgewell/sourcemap-codec": {
+ "version": "1.4.14",
+ "resolved": "https://registry.npmjs.org/@jridgewell/sourcemap-codec/-/sourcemap-codec-1.4.14.tgz",
+ "integrity": "sha512-XPSJHWmi394fuUuzDnGz1wiKqWfo1yXecHQMRf2l6hztTO+nPru658AyDngaBe7isIxEkRsPR3FZh+s7iVa4Uw=="
+ },
+ "node_modules/@jridgewell/trace-mapping": {
+ "version": "0.3.25",
+ "resolved": "https://registry.npmjs.org/@jridgewell/trace-mapping/-/trace-mapping-0.3.25.tgz",
+ "integrity": "sha512-vNk6aEwybGtawWmy/PzwnGDOjCkLWSD2wqvjGGAgOAwCGWySYXfYoxt00IJkTF+8Lb57DwOb3Aa0o9CApepiYQ==",
+ "dependencies": {
+ "@jridgewell/resolve-uri": "^3.1.0",
+ "@jridgewell/sourcemap-codec": "^1.4.14"
+ }
+ },
+ "node_modules/@leichtgewicht/ip-codec": {
+ "version": "2.0.4",
+ "resolved": "https://registry.npmjs.org/@leichtgewicht/ip-codec/-/ip-codec-2.0.4.tgz",
+ "integrity": "sha512-Hcv+nVC0kZnQ3tD9GVu5xSMR4VVYOteQIr/hwFPVEvPdlXqgGEuRjiheChHgdM+JyqdgNcmzZOX/tnl0JOiI7A=="
+ },
+ "node_modules/@mdx-js/mdx": {
+ "version": "1.6.22",
+ "resolved": "https://registry.npmjs.org/@mdx-js/mdx/-/mdx-1.6.22.tgz",
+ "integrity": "sha512-AMxuLxPz2j5/6TpF/XSdKpQP1NlG0z11dFOlq+2IP/lSgl11GY8ji6S/rgsViN/L0BDvHvUMruRb7ub+24LUYA==",
+ "dependencies": {
+ "@babel/core": "7.12.9",
+ "@babel/plugin-syntax-jsx": "7.12.1",
+ "@babel/plugin-syntax-object-rest-spread": "7.8.3",
+ "@mdx-js/util": "1.6.22",
+ "babel-plugin-apply-mdx-type-prop": "1.6.22",
+ "babel-plugin-extract-import-names": "1.6.22",
+ "camelcase-css": "2.0.1",
+ "detab": "2.0.4",
+ "hast-util-raw": "6.0.1",
+ "lodash.uniq": "4.5.0",
+ "mdast-util-to-hast": "10.0.1",
+ "remark-footnotes": "2.0.0",
+ "remark-mdx": "1.6.22",
+ "remark-parse": "8.0.3",
+ "remark-squeeze-paragraphs": "4.0.0",
+ "style-to-object": "0.3.0",
+ "unified": "9.2.0",
+ "unist-builder": "2.0.3",
+ "unist-util-visit": "2.0.3"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/@mdx-js/mdx/node_modules/@babel/core": {
+ "version": "7.12.9",
+ "resolved": "https://registry.npmjs.org/@babel/core/-/core-7.12.9.tgz",
+ "integrity": "sha512-gTXYh3M5wb7FRXQy+FErKFAv90BnlOuNn1QkCK2lREoPAjrQCO49+HVSrFoe5uakFAF5eenS75KbO2vQiLrTMQ==",
+ "dependencies": {
+ "@babel/code-frame": "^7.10.4",
+ "@babel/generator": "^7.12.5",
+ "@babel/helper-module-transforms": "^7.12.1",
+ "@babel/helpers": "^7.12.5",
+ "@babel/parser": "^7.12.7",
+ "@babel/template": "^7.12.7",
+ "@babel/traverse": "^7.12.9",
+ "@babel/types": "^7.12.7",
+ "convert-source-map": "^1.7.0",
+ "debug": "^4.1.0",
+ "gensync": "^1.0.0-beta.1",
+ "json5": "^2.1.2",
+ "lodash": "^4.17.19",
+ "resolve": "^1.3.2",
+ "semver": "^5.4.1",
+ "source-map": "^0.5.0"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/babel"
+ }
+ },
+ "node_modules/@mdx-js/mdx/node_modules/@babel/plugin-syntax-jsx": {
+ "version": "7.12.1",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-syntax-jsx/-/plugin-syntax-jsx-7.12.1.tgz",
+ "integrity": "sha512-1yRi7yAtB0ETgxdY9ti/p2TivUxJkTdhu/ZbF9MshVGqOx1TdB3b7xCXs49Fupgg50N45KcAsRP/ZqWjs9SRjg==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.10.4"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@mdx-js/mdx/node_modules/semver": {
+ "version": "5.7.2",
+ "resolved": "https://registry.npmjs.org/semver/-/semver-5.7.2.tgz",
+ "integrity": "sha512-cBznnQ9KjJqU67B52RMC65CMarK2600WFnbkcaiwWq3xy/5haFJlshgnpjovMVJ+Hff49d8GEn0b87C5pDQ10g==",
+ "bin": {
+ "semver": "bin/semver"
+ }
+ },
+ "node_modules/@mdx-js/mdx/node_modules/source-map": {
+ "version": "0.5.7",
+ "resolved": "https://registry.npmjs.org/source-map/-/source-map-0.5.7.tgz",
+ "integrity": "sha512-LbrmJOMUSdEVxIKvdcJzQC+nQhe8FUZQTXQy6+I75skNgn3OoQ0DZA8YnFa7gp8tqtL3KPf1kmo0R5DoApeSGQ==",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/@mdx-js/mdx/node_modules/unified": {
+ "version": "9.2.0",
+ "resolved": "https://registry.npmjs.org/unified/-/unified-9.2.0.tgz",
+ "integrity": "sha512-vx2Z0vY+a3YoTj8+pttM3tiJHCwY5UFbYdiWrwBEbHmK8pvsPj2rtAX2BFfgXen8T39CJWblWRDT4L5WGXtDdg==",
+ "dependencies": {
+ "bail": "^1.0.0",
+ "extend": "^3.0.0",
+ "is-buffer": "^2.0.0",
+ "is-plain-obj": "^2.0.0",
+ "trough": "^1.0.0",
+ "vfile": "^4.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/@mdx-js/react": {
+ "version": "1.6.22",
+ "resolved": "https://registry.npmjs.org/@mdx-js/react/-/react-1.6.22.tgz",
+ "integrity": "sha512-TDoPum4SHdfPiGSAaRBw7ECyI8VaHpK8GJugbJIJuqyh6kzw9ZLJZW3HGL3NNrJGxcAixUvqROm+YuQOo5eXtg==",
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ },
+ "peerDependencies": {
+ "react": "^16.13.1 || ^17.0.0"
+ }
+ },
+ "node_modules/@mdx-js/util": {
+ "version": "1.6.22",
+ "resolved": "https://registry.npmjs.org/@mdx-js/util/-/util-1.6.22.tgz",
+ "integrity": "sha512-H1rQc1ZOHANWBvPcW+JpGwr+juXSxM8Q8YCkm3GhZd8REu1fHR3z99CErO1p9pkcfcxZnMdIZdIsXkOHY0NilA==",
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/@nodelib/fs.scandir": {
+ "version": "2.1.5",
+ "resolved": "https://registry.npmjs.org/@nodelib/fs.scandir/-/fs.scandir-2.1.5.tgz",
+ "integrity": "sha512-vq24Bq3ym5HEQm2NKCr3yXDwjc7vTsEThRDnkp2DK9p1uqLR+DHurm/NOTo0KG7HYHU7eppKZj3MyqYuMBf62g==",
+ "dependencies": {
+ "@nodelib/fs.stat": "2.0.5",
+ "run-parallel": "^1.1.9"
+ },
+ "engines": {
+ "node": ">= 8"
+ }
+ },
+ "node_modules/@nodelib/fs.stat": {
+ "version": "2.0.5",
+ "resolved": "https://registry.npmjs.org/@nodelib/fs.stat/-/fs.stat-2.0.5.tgz",
+ "integrity": "sha512-RkhPPp2zrqDAQA/2jNhnztcPAlv64XdhIp7a7454A5ovI7Bukxgt7MX7udwAu3zg1DcpPU0rz3VV1SeaqvY4+A==",
+ "engines": {
+ "node": ">= 8"
+ }
+ },
+ "node_modules/@nodelib/fs.walk": {
+ "version": "1.2.8",
+ "resolved": "https://registry.npmjs.org/@nodelib/fs.walk/-/fs.walk-1.2.8.tgz",
+ "integrity": "sha512-oGB+UxlgWcgQkgwo8GcEGwemoTFt3FIO9ababBmaGwXIoBKZ+GTy0pP185beGg7Llih/NSHSV2XAs1lnznocSg==",
+ "dependencies": {
+ "@nodelib/fs.scandir": "2.1.5",
+ "fastq": "^1.6.0"
+ },
+ "engines": {
+ "node": ">= 8"
+ }
+ },
+ "node_modules/@polka/url": {
+ "version": "1.0.0-next.21",
+ "resolved": "https://registry.npmjs.org/@polka/url/-/url-1.0.0-next.21.tgz",
+ "integrity": "sha512-a5Sab1C4/icpTZVzZc5Ghpz88yQtGOyNqYXcZgOssB2uuAr+wF/MvN6bgtW32q7HHrvBki+BsZ0OuNv6EV3K9g=="
+ },
+ "node_modules/@redocly/ajv": {
+ "version": "8.11.0",
+ "resolved": "https://registry.npmjs.org/@redocly/ajv/-/ajv-8.11.0.tgz",
+ "integrity": "sha512-9GWx27t7xWhDIR02PA18nzBdLcKQRgc46xNQvjFkrYk4UOmvKhJ/dawwiX0cCOeetN5LcaaiqQbVOWYK62SGHw==",
+ "dependencies": {
+ "fast-deep-equal": "^3.1.1",
+ "json-schema-traverse": "^1.0.0",
+ "require-from-string": "^2.0.2",
+ "uri-js": "^4.2.2"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/epoberezkin"
+ }
+ },
+ "node_modules/@redocly/ajv/node_modules/json-schema-traverse": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmjs.org/json-schema-traverse/-/json-schema-traverse-1.0.0.tgz",
+ "integrity": "sha512-NM8/P9n3XjXhIZn1lLhkFaACTOURQXjWhV4BA/RnOv8xvgqtqpAX9IO4mRQxSx1Rlo4tqzeqb0sOlruaOy3dug=="
+ },
+ "node_modules/@redocly/openapi-core": {
+ "version": "1.0.0-beta.123",
+ "resolved": "https://registry.npmjs.org/@redocly/openapi-core/-/openapi-core-1.0.0-beta.123.tgz",
+ "integrity": "sha512-W6MbUWpb/VaV+Kf0c3jmMIJw3WwwF7iK5nAfcOS+ZwrlbxtIl37+1hEydFlJ209vCR9HL12PaMwdh2Vpihj6Jw==",
+ "dependencies": {
+ "@redocly/ajv": "^8.11.0",
+ "@types/node": "^14.11.8",
+ "colorette": "^1.2.0",
+ "js-levenshtein": "^1.1.6",
+ "js-yaml": "^4.1.0",
+ "lodash.isequal": "^4.5.0",
+ "minimatch": "^5.0.1",
+ "node-fetch": "^2.6.1",
+ "pluralize": "^8.0.0",
+ "yaml-ast-parser": "0.0.43"
+ },
+ "engines": {
+ "node": ">=12.0.0"
+ }
+ },
+ "node_modules/@redocly/openapi-core/node_modules/@types/node": {
+ "version": "14.18.42",
+ "resolved": "https://registry.npmjs.org/@types/node/-/node-14.18.42.tgz",
+ "integrity": "sha512-xefu+RBie4xWlK8hwAzGh3npDz/4VhF6icY/shU+zv/1fNn+ZVG7T7CRwe9LId9sAYRPxI+59QBPuKL3WpyGRg=="
+ },
+ "node_modules/@redocly/openapi-core/node_modules/brace-expansion": {
+ "version": "2.0.2",
+ "resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-2.0.2.tgz",
+ "integrity": "sha512-Jt0vHyM+jmUBqojB7E1NIYadt0vI0Qxjxd2TErW94wDz+E2LAm5vKMXXwg6ZZBTHPuUlDgQHKXvjGBdfcF1ZDQ==",
+ "license": "MIT",
+ "dependencies": {
+ "balanced-match": "^1.0.0"
+ }
+ },
+ "node_modules/@redocly/openapi-core/node_modules/colorette": {
+ "version": "1.4.0",
+ "resolved": "https://registry.npmjs.org/colorette/-/colorette-1.4.0.tgz",
+ "integrity": "sha512-Y2oEozpomLn7Q3HFP7dpww7AtMJplbM9lGZP6RDfHqmbeRjiwRg4n6VM6j4KLmRke85uWEI7JqF17f3pqdRA0g=="
+ },
+ "node_modules/@redocly/openapi-core/node_modules/minimatch": {
+ "version": "5.1.6",
+ "resolved": "https://registry.npmjs.org/minimatch/-/minimatch-5.1.6.tgz",
+ "integrity": "sha512-lKwV/1brpG6mBUFHtb7NUmtABCb2WZZmm2wNiOA5hAb8VdCS4B3dtMWyvcoViccwAW/COERjXLt0zP1zXUN26g==",
+ "dependencies": {
+ "brace-expansion": "^2.0.1"
+ },
+ "engines": {
+ "node": ">=10"
+ }
+ },
+ "node_modules/@sideway/address": {
+ "version": "4.1.4",
+ "resolved": "https://registry.npmjs.org/@sideway/address/-/address-4.1.4.tgz",
+ "integrity": "sha512-7vwq+rOHVWjyXxVlR76Agnvhy8I9rpzjosTESvmhNeXOXdZZB15Fl+TI9x1SiHZH5Jv2wTGduSxFDIaq0m3DUw==",
+ "dependencies": {
+ "@hapi/hoek": "^9.0.0"
+ }
+ },
+ "node_modules/@sideway/formula": {
+ "version": "3.0.1",
+ "resolved": "https://registry.npmjs.org/@sideway/formula/-/formula-3.0.1.tgz",
+ "integrity": "sha512-/poHZJJVjx3L+zVD6g9KgHfYnb443oi7wLu/XKojDviHy6HOEOA6z1Trk5aR1dGcmPenJEgb2sK2I80LeS3MIg=="
+ },
+ "node_modules/@sideway/pinpoint": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/@sideway/pinpoint/-/pinpoint-2.0.0.tgz",
+ "integrity": "sha512-RNiOoTPkptFtSVzQevY/yWtZwf/RxyVnPy/OcA9HBM3MlGDnBEYL5B41H0MTn0Uec8Hi+2qUtTfG2WWZBmMejQ=="
+ },
+ "node_modules/@sinclair/typebox": {
+ "version": "0.25.24",
+ "resolved": "https://registry.npmjs.org/@sinclair/typebox/-/typebox-0.25.24.tgz",
+ "integrity": "sha512-XJfwUVUKDHF5ugKwIcxEgc9k8b7HbznCp6eUfWgu710hMPNIO4aw4/zB5RogDQz8nd6gyCDpU9O/m6qYEWY6yQ=="
+ },
+ "node_modules/@sindresorhus/is": {
+ "version": "0.14.0",
+ "resolved": "https://registry.npmjs.org/@sindresorhus/is/-/is-0.14.0.tgz",
+ "integrity": "sha512-9NET910DNaIPngYnLLPeg+Ogzqsi9uM4mSboU5y6p8S5DzMTVEsJZrawi+BoDNUVBa2DhJqQYUFvMDfgU062LQ==",
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/@slorber/static-site-generator-webpack-plugin": {
+ "version": "4.0.7",
+ "resolved": "https://registry.npmjs.org/@slorber/static-site-generator-webpack-plugin/-/static-site-generator-webpack-plugin-4.0.7.tgz",
+ "integrity": "sha512-Ug7x6z5lwrz0WqdnNFOMYrDQNTPAprvHLSh6+/fmml3qUiz6l5eq+2MzLKWtn/q5K5NpSiFsZTP/fck/3vjSxA==",
+ "dependencies": {
+ "eval": "^0.1.8",
+ "p-map": "^4.0.0",
+ "webpack-sources": "^3.2.2"
+ },
+ "engines": {
+ "node": ">=14"
+ }
+ },
+ "node_modules/@svgr/babel-plugin-add-jsx-attribute": {
+ "version": "6.5.1",
+ "resolved": "https://registry.npmjs.org/@svgr/babel-plugin-add-jsx-attribute/-/babel-plugin-add-jsx-attribute-6.5.1.tgz",
+ "integrity": "sha512-9PYGcXrAxitycIjRmZB+Q0JaN07GZIWaTBIGQzfaZv+qr1n8X1XUEJ5rZ/vx6OVD9RRYlrNnXWExQXcmZeD/BQ==",
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/gregberge"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@svgr/babel-plugin-remove-jsx-attribute": {
+ "version": "7.0.0",
+ "resolved": "https://registry.npmjs.org/@svgr/babel-plugin-remove-jsx-attribute/-/babel-plugin-remove-jsx-attribute-7.0.0.tgz",
+ "integrity": "sha512-iiZaIvb3H/c7d3TH2HBeK91uI2rMhZNwnsIrvd7ZwGLkFw6mmunOCoVnjdYua662MqGFxlN9xTq4fv9hgR4VXQ==",
+ "engines": {
+ "node": ">=14"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/gregberge"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@svgr/babel-plugin-remove-jsx-empty-expression": {
+ "version": "7.0.0",
+ "resolved": "https://registry.npmjs.org/@svgr/babel-plugin-remove-jsx-empty-expression/-/babel-plugin-remove-jsx-empty-expression-7.0.0.tgz",
+ "integrity": "sha512-sQQmyo+qegBx8DfFc04PFmIO1FP1MHI1/QEpzcIcclo5OAISsOJPW76ZIs0bDyO/DBSJEa/tDa1W26pVtt0FRw==",
+ "engines": {
+ "node": ">=14"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/gregberge"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@svgr/babel-plugin-replace-jsx-attribute-value": {
+ "version": "6.5.1",
+ "resolved": "https://registry.npmjs.org/@svgr/babel-plugin-replace-jsx-attribute-value/-/babel-plugin-replace-jsx-attribute-value-6.5.1.tgz",
+ "integrity": "sha512-8DPaVVE3fd5JKuIC29dqyMB54sA6mfgki2H2+swh+zNJoynC8pMPzOkidqHOSc6Wj032fhl8Z0TVn1GiPpAiJg==",
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/gregberge"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@svgr/babel-plugin-svg-dynamic-title": {
+ "version": "6.5.1",
+ "resolved": "https://registry.npmjs.org/@svgr/babel-plugin-svg-dynamic-title/-/babel-plugin-svg-dynamic-title-6.5.1.tgz",
+ "integrity": "sha512-FwOEi0Il72iAzlkaHrlemVurgSQRDFbk0OC8dSvD5fSBPHltNh7JtLsxmZUhjYBZo2PpcU/RJvvi6Q0l7O7ogw==",
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/gregberge"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@svgr/babel-plugin-svg-em-dimensions": {
+ "version": "6.5.1",
+ "resolved": "https://registry.npmjs.org/@svgr/babel-plugin-svg-em-dimensions/-/babel-plugin-svg-em-dimensions-6.5.1.tgz",
+ "integrity": "sha512-gWGsiwjb4tw+ITOJ86ndY/DZZ6cuXMNE/SjcDRg+HLuCmwpcjOktwRF9WgAiycTqJD/QXqL2f8IzE2Rzh7aVXA==",
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/gregberge"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@svgr/babel-plugin-transform-react-native-svg": {
+ "version": "6.5.1",
+ "resolved": "https://registry.npmjs.org/@svgr/babel-plugin-transform-react-native-svg/-/babel-plugin-transform-react-native-svg-6.5.1.tgz",
+ "integrity": "sha512-2jT3nTayyYP7kI6aGutkyfJ7UMGtuguD72OjeGLwVNyfPRBD8zQthlvL+fAbAKk5n9ZNcvFkp/b1lZ7VsYqVJg==",
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/gregberge"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@svgr/babel-plugin-transform-svg-component": {
+ "version": "6.5.1",
+ "resolved": "https://registry.npmjs.org/@svgr/babel-plugin-transform-svg-component/-/babel-plugin-transform-svg-component-6.5.1.tgz",
+ "integrity": "sha512-a1p6LF5Jt33O3rZoVRBqdxL350oge54iZWHNI6LJB5tQ7EelvD/Mb1mfBiZNAan0dt4i3VArkFRjA4iObuNykQ==",
+ "engines": {
+ "node": ">=12"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/gregberge"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@svgr/babel-preset": {
+ "version": "6.5.1",
+ "resolved": "https://registry.npmjs.org/@svgr/babel-preset/-/babel-preset-6.5.1.tgz",
+ "integrity": "sha512-6127fvO/FF2oi5EzSQOAjo1LE3OtNVh11R+/8FXa+mHx1ptAaS4cknIjnUA7e6j6fwGGJ17NzaTJFUwOV2zwCw==",
+ "dependencies": {
+ "@svgr/babel-plugin-add-jsx-attribute": "^6.5.1",
+ "@svgr/babel-plugin-remove-jsx-attribute": "*",
+ "@svgr/babel-plugin-remove-jsx-empty-expression": "*",
+ "@svgr/babel-plugin-replace-jsx-attribute-value": "^6.5.1",
+ "@svgr/babel-plugin-svg-dynamic-title": "^6.5.1",
+ "@svgr/babel-plugin-svg-em-dimensions": "^6.5.1",
+ "@svgr/babel-plugin-transform-react-native-svg": "^6.5.1",
+ "@svgr/babel-plugin-transform-svg-component": "^6.5.1"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/gregberge"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/@svgr/core": {
+ "version": "6.5.1",
+ "resolved": "https://registry.npmjs.org/@svgr/core/-/core-6.5.1.tgz",
+ "integrity": "sha512-/xdLSWxK5QkqG524ONSjvg3V/FkNyCv538OIBdQqPNaAta3AsXj/Bd2FbvR87yMbXO2hFSWiAe/Q6IkVPDw+mw==",
+ "dependencies": {
+ "@babel/core": "^7.19.6",
+ "@svgr/babel-preset": "^6.5.1",
+ "@svgr/plugin-jsx": "^6.5.1",
+ "camelcase": "^6.2.0",
+ "cosmiconfig": "^7.0.1"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/gregberge"
+ }
+ },
+ "node_modules/@svgr/hast-util-to-babel-ast": {
+ "version": "6.5.1",
+ "resolved": "https://registry.npmjs.org/@svgr/hast-util-to-babel-ast/-/hast-util-to-babel-ast-6.5.1.tgz",
+ "integrity": "sha512-1hnUxxjd83EAxbL4a0JDJoD3Dao3hmjvyvyEV8PzWmLK3B9m9NPlW7GKjFyoWE8nM7HnXzPcmmSyOW8yOddSXw==",
+ "dependencies": {
+ "@babel/types": "^7.20.0",
+ "entities": "^4.4.0"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/gregberge"
+ }
+ },
+ "node_modules/@svgr/plugin-jsx": {
+ "version": "6.5.1",
+ "resolved": "https://registry.npmjs.org/@svgr/plugin-jsx/-/plugin-jsx-6.5.1.tgz",
+ "integrity": "sha512-+UdQxI3jgtSjCykNSlEMuy1jSRQlGC7pqBCPvkG/2dATdWo082zHTTK3uhnAju2/6XpE6B5mZ3z4Z8Ns01S8Gw==",
+ "dependencies": {
+ "@babel/core": "^7.19.6",
+ "@svgr/babel-preset": "^6.5.1",
+ "@svgr/hast-util-to-babel-ast": "^6.5.1",
+ "svg-parser": "^2.0.4"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/gregberge"
+ },
+ "peerDependencies": {
+ "@svgr/core": "^6.0.0"
+ }
+ },
+ "node_modules/@svgr/plugin-svgo": {
+ "version": "6.5.1",
+ "resolved": "https://registry.npmjs.org/@svgr/plugin-svgo/-/plugin-svgo-6.5.1.tgz",
+ "integrity": "sha512-omvZKf8ixP9z6GWgwbtmP9qQMPX4ODXi+wzbVZgomNFsUIlHA1sf4fThdwTWSsZGgvGAG6yE+b/F5gWUkcZ/iQ==",
+ "dependencies": {
+ "cosmiconfig": "^7.0.1",
+ "deepmerge": "^4.2.2",
+ "svgo": "^2.8.0"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/gregberge"
+ },
+ "peerDependencies": {
+ "@svgr/core": "*"
+ }
+ },
+ "node_modules/@svgr/webpack": {
+ "version": "6.5.1",
+ "resolved": "https://registry.npmjs.org/@svgr/webpack/-/webpack-6.5.1.tgz",
+ "integrity": "sha512-cQ/AsnBkXPkEK8cLbv4Dm7JGXq2XrumKnL1dRpJD9rIO2fTIlJI9a1uCciYG1F2aUsox/hJQyNGbt3soDxSRkA==",
+ "dependencies": {
+ "@babel/core": "^7.19.6",
+ "@babel/plugin-transform-react-constant-elements": "^7.18.12",
+ "@babel/preset-env": "^7.19.4",
+ "@babel/preset-react": "^7.18.6",
+ "@babel/preset-typescript": "^7.18.6",
+ "@svgr/core": "^6.5.1",
+ "@svgr/plugin-jsx": "^6.5.1",
+ "@svgr/plugin-svgo": "^6.5.1"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/gregberge"
+ }
+ },
+ "node_modules/@szmarczak/http-timer": {
+ "version": "1.1.2",
+ "resolved": "https://registry.npmjs.org/@szmarczak/http-timer/-/http-timer-1.1.2.tgz",
+ "integrity": "sha512-XIB2XbzHTN6ieIjfIMV9hlVcfPU26s2vafYWQcZHWXHOxiaRZYEDKEwdl129Zyg50+foYV2jCgtrqSA6qNuNSA==",
+ "dependencies": {
+ "defer-to-connect": "^1.0.1"
+ },
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/@trysound/sax": {
+ "version": "0.2.0",
+ "resolved": "https://registry.npmjs.org/@trysound/sax/-/sax-0.2.0.tgz",
+ "integrity": "sha512-L7z9BgrNEcYyUYtF+HaEfiS5ebkh9jXqbszz7pC0hRBPaatV0XjSD3+eHrpqFemQfgwiFF0QPIarnIihIDn7OA==",
+ "engines": {
+ "node": ">=10.13.0"
+ }
+ },
+ "node_modules/@types/body-parser": {
+ "version": "1.19.2",
+ "resolved": "https://registry.npmjs.org/@types/body-parser/-/body-parser-1.19.2.tgz",
+ "integrity": "sha512-ALYone6pm6QmwZoAgeyNksccT9Q4AWZQ6PvfwR37GT6r6FWUPguq6sUmNGSMV2Wr761oQoBxwGGa6DR5o1DC9g==",
+ "dependencies": {
+ "@types/connect": "*",
+ "@types/node": "*"
+ }
+ },
+ "node_modules/@types/bonjour": {
+ "version": "3.5.10",
+ "resolved": "https://registry.npmjs.org/@types/bonjour/-/bonjour-3.5.10.tgz",
+ "integrity": "sha512-p7ienRMiS41Nu2/igbJxxLDWrSZ0WxM8UQgCeO9KhoVF7cOVFkrKsiDr1EsJIla8vV3oEEjGcz11jc5yimhzZw==",
+ "dependencies": {
+ "@types/node": "*"
+ }
+ },
+ "node_modules/@types/connect": {
+ "version": "3.4.35",
+ "resolved": "https://registry.npmjs.org/@types/connect/-/connect-3.4.35.tgz",
+ "integrity": "sha512-cdeYyv4KWoEgpBISTxWvqYsVy444DOqehiF3fM3ne10AmJ62RSyNkUnxMJXHQWRQQX2eR94m5y1IZyDwBjV9FQ==",
+ "dependencies": {
+ "@types/node": "*"
+ }
+ },
+ "node_modules/@types/connect-history-api-fallback": {
+ "version": "1.3.5",
+ "resolved": "https://registry.npmjs.org/@types/connect-history-api-fallback/-/connect-history-api-fallback-1.3.5.tgz",
+ "integrity": "sha512-h8QJa8xSb1WD4fpKBDcATDNGXghFj6/3GRWG6dhmRcu0RX1Ubasur2Uvx5aeEwlf0MwblEC2bMzzMQntxnw/Cw==",
+ "dependencies": {
+ "@types/express-serve-static-core": "*",
+ "@types/node": "*"
+ }
+ },
+ "node_modules/@types/estree": {
+ "version": "1.0.5",
+ "resolved": "https://registry.npmjs.org/@types/estree/-/estree-1.0.5.tgz",
+ "integrity": "sha512-/kYRxGDLWzHOB7q+wtSUQlFrtcdUccpfy+X+9iMBpHK8QLLhx2wIPYuS5DYtR9Wa/YlZAbIovy7qVdB1Aq6Lyw=="
+ },
+ "node_modules/@types/express": {
+ "version": "4.17.17",
+ "resolved": "https://registry.npmjs.org/@types/express/-/express-4.17.17.tgz",
+ "integrity": "sha512-Q4FmmuLGBG58btUnfS1c1r/NQdlp3DMfGDGig8WhfpA2YRUtEkxAjkZb0yvplJGYdF1fsQ81iMDcH24sSCNC/Q==",
+ "dependencies": {
+ "@types/body-parser": "*",
+ "@types/express-serve-static-core": "^4.17.33",
+ "@types/qs": "*",
+ "@types/serve-static": "*"
+ }
+ },
+ "node_modules/@types/express-serve-static-core": {
+ "version": "4.17.33",
+ "resolved": "https://registry.npmjs.org/@types/express-serve-static-core/-/express-serve-static-core-4.17.33.tgz",
+ "integrity": "sha512-TPBqmR/HRYI3eC2E5hmiivIzv+bidAfXofM+sbonAGvyDhySGw9/PQZFt2BLOrjUUR++4eJVpx6KnLQK1Fk9tA==",
+ "dependencies": {
+ "@types/node": "*",
+ "@types/qs": "*",
+ "@types/range-parser": "*"
+ }
+ },
+ "node_modules/@types/hast": {
+ "version": "2.3.4",
+ "resolved": "https://registry.npmjs.org/@types/hast/-/hast-2.3.4.tgz",
+ "integrity": "sha512-wLEm0QvaoawEDoTRwzTXp4b4jpwiJDvR5KMnFnVodm3scufTlBOWRD6N1OBf9TZMhjlNsSfcO5V+7AF4+Vy+9g==",
+ "dependencies": {
+ "@types/unist": "*"
+ }
+ },
+ "node_modules/@types/history": {
+ "version": "4.7.11",
+ "resolved": "https://registry.npmjs.org/@types/history/-/history-4.7.11.tgz",
+ "integrity": "sha512-qjDJRrmvBMiTx+jyLxvLfJU7UznFuokDv4f3WRuriHKERccVpFU+8XMQUAbDzoiJCsmexxRExQeMwwCdamSKDA=="
+ },
+ "node_modules/@types/html-minifier-terser": {
+ "version": "6.1.0",
+ "resolved": "https://registry.npmjs.org/@types/html-minifier-terser/-/html-minifier-terser-6.1.0.tgz",
+ "integrity": "sha512-oh/6byDPnL1zeNXFrDXFLyZjkr1MsBG667IM792caf1L2UPOOMf65NFzjUH/ltyfwjAGfs1rsX1eftK0jC/KIg=="
+ },
+ "node_modules/@types/http-proxy": {
+ "version": "1.17.10",
+ "resolved": "https://registry.npmjs.org/@types/http-proxy/-/http-proxy-1.17.10.tgz",
+ "integrity": "sha512-Qs5aULi+zV1bwKAg5z1PWnDXWmsn+LxIvUGv6E2+OOMYhclZMO+OXd9pYVf2gLykf2I7IV2u7oTHwChPNsvJ7g==",
+ "dependencies": {
+ "@types/node": "*"
+ }
+ },
+ "node_modules/@types/istanbul-lib-coverage": {
+ "version": "2.0.4",
+ "resolved": "https://registry.npmjs.org/@types/istanbul-lib-coverage/-/istanbul-lib-coverage-2.0.4.tgz",
+ "integrity": "sha512-z/QT1XN4K4KYuslS23k62yDIDLwLFkzxOuMplDtObz0+y7VqJCaO2o+SPwHCvLFZh7xazvvoor2tA/hPz9ee7g=="
+ },
+ "node_modules/@types/istanbul-lib-report": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/@types/istanbul-lib-report/-/istanbul-lib-report-3.0.0.tgz",
+ "integrity": "sha512-plGgXAPfVKFoYfa9NpYDAkseG+g6Jr294RqeqcqDixSbU34MZVJRi/P+7Y8GDpzkEwLaGZZOpKIEmeVZNtKsrg==",
+ "dependencies": {
+ "@types/istanbul-lib-coverage": "*"
+ }
+ },
+ "node_modules/@types/istanbul-reports": {
+ "version": "3.0.1",
+ "resolved": "https://registry.npmjs.org/@types/istanbul-reports/-/istanbul-reports-3.0.1.tgz",
+ "integrity": "sha512-c3mAZEuK0lvBp8tmuL74XRKn1+y2dcwOUpH7x4WrF6gk1GIgiluDRgMYQtw2OFcBvAJWlt6ASU3tSqxp0Uu0Aw==",
+ "dependencies": {
+ "@types/istanbul-lib-report": "*"
+ }
+ },
+ "node_modules/@types/json-schema": {
+ "version": "7.0.11",
+ "resolved": "https://registry.npmjs.org/@types/json-schema/-/json-schema-7.0.11.tgz",
+ "integrity": "sha512-wOuvG1SN4Us4rez+tylwwwCV1psiNVOkJeM3AUWUNWg/jDQY2+HE/444y5gc+jBmRqASOm2Oeh5c1axHobwRKQ=="
+ },
+ "node_modules/@types/mdast": {
+ "version": "3.0.11",
+ "resolved": "https://registry.npmjs.org/@types/mdast/-/mdast-3.0.11.tgz",
+ "integrity": "sha512-Y/uImid8aAwrEA24/1tcRZwpxX3pIFTSilcNDKSPn+Y2iDywSEachzRuvgAYYLR3wpGXAsMbv5lvKLDZLeYPAw==",
+ "dependencies": {
+ "@types/unist": "*"
+ }
+ },
+ "node_modules/@types/mime": {
+ "version": "3.0.1",
+ "resolved": "https://registry.npmjs.org/@types/mime/-/mime-3.0.1.tgz",
+ "integrity": "sha512-Y4XFY5VJAuw0FgAqPNd6NNoV44jbq9Bz2L7Rh/J6jLTiHBSBJa9fxqQIvkIld4GsoDOcCbvzOUAbLPsSKKg+uA=="
+ },
+ "node_modules/@types/node": {
+ "version": "18.15.11",
+ "resolved": "https://registry.npmjs.org/@types/node/-/node-18.15.11.tgz",
+ "integrity": "sha512-E5Kwq2n4SbMzQOn6wnmBjuK9ouqlURrcZDVfbo9ftDDTFt3nk7ZKK4GMOzoYgnpQJKcxwQw+lGaBvvlMo0qN/Q=="
+ },
+ "node_modules/@types/parse-json": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/@types/parse-json/-/parse-json-4.0.0.tgz",
+ "integrity": "sha512-//oorEZjL6sbPcKUaCdIGlIUeH26mgzimjBB77G6XRgnDl/L5wOnpyBGRe/Mmf5CVW3PwEBE1NjiMZ/ssFh4wA=="
+ },
+ "node_modules/@types/parse5": {
+ "version": "5.0.3",
+ "resolved": "https://registry.npmjs.org/@types/parse5/-/parse5-5.0.3.tgz",
+ "integrity": "sha512-kUNnecmtkunAoQ3CnjmMkzNU/gtxG8guhi+Fk2U/kOpIKjIMKnXGp4IJCgQJrXSgMsWYimYG4TGjz/UzbGEBTw=="
+ },
+ "node_modules/@types/prop-types": {
+ "version": "15.7.5",
+ "resolved": "https://registry.npmjs.org/@types/prop-types/-/prop-types-15.7.5.tgz",
+ "integrity": "sha512-JCB8C6SnDoQf0cNycqd/35A7MjcnK+ZTqE7judS6o7utxUCg6imJg3QK2qzHKszlTjcj2cn+NwMB2i96ubpj7w=="
+ },
+ "node_modules/@types/qs": {
+ "version": "6.9.7",
+ "resolved": "https://registry.npmjs.org/@types/qs/-/qs-6.9.7.tgz",
+ "integrity": "sha512-FGa1F62FT09qcrueBA6qYTrJPVDzah9a+493+o2PCXsesWHIn27G98TsSMs3WPNbZIEj4+VJf6saSFpvD+3Zsw=="
+ },
+ "node_modules/@types/range-parser": {
+ "version": "1.2.4",
+ "resolved": "https://registry.npmjs.org/@types/range-parser/-/range-parser-1.2.4.tgz",
+ "integrity": "sha512-EEhsLsD6UsDM1yFhAvy0Cjr6VwmpMWqFBCb9w07wVugF7w9nfajxLuVmngTIpgS6svCnm6Vaw+MZhoDCKnOfsw=="
+ },
+ "node_modules/@types/react": {
+ "version": "18.0.31",
+ "resolved": "https://registry.npmjs.org/@types/react/-/react-18.0.31.tgz",
+ "integrity": "sha512-EEG67of7DsvRDU6BLLI0p+k1GojDLz9+lZsnCpCRTa/lOokvyPBvp8S5x+A24hME3yyQuIipcP70KJ6H7Qupww==",
+ "dependencies": {
+ "@types/prop-types": "*",
+ "@types/scheduler": "*",
+ "csstype": "^3.0.2"
+ }
+ },
+ "node_modules/@types/react-router": {
+ "version": "5.1.20",
+ "resolved": "https://registry.npmjs.org/@types/react-router/-/react-router-5.1.20.tgz",
+ "integrity": "sha512-jGjmu/ZqS7FjSH6owMcD5qpq19+1RS9DeVRqfl1FeBMxTDQAGwlMWOcs52NDoXaNKyG3d1cYQFMs9rCrb88o9Q==",
+ "dependencies": {
+ "@types/history": "^4.7.11",
+ "@types/react": "*"
+ }
+ },
+ "node_modules/@types/react-router-config": {
+ "version": "5.0.6",
+ "resolved": "https://registry.npmjs.org/@types/react-router-config/-/react-router-config-5.0.6.tgz",
+ "integrity": "sha512-db1mx37a1EJDf1XeX8jJN7R3PZABmJQXR8r28yUjVMFSjkmnQo6X6pOEEmNl+Tp2gYQOGPdYbFIipBtdElZ3Yg==",
+ "dependencies": {
+ "@types/history": "^4.7.11",
+ "@types/react": "*",
+ "@types/react-router": "*"
+ }
+ },
+ "node_modules/@types/react-router-dom": {
+ "version": "5.3.3",
+ "resolved": "https://registry.npmjs.org/@types/react-router-dom/-/react-router-dom-5.3.3.tgz",
+ "integrity": "sha512-kpqnYK4wcdm5UaWI3fLcELopqLrHgLqNsdpHauzlQktfkHL3npOSwtj1Uz9oKBAzs7lFtVkV8j83voAz2D8fhw==",
+ "dependencies": {
+ "@types/history": "^4.7.11",
+ "@types/react": "*",
+ "@types/react-router": "*"
+ }
+ },
+ "node_modules/@types/retry": {
+ "version": "0.12.0",
+ "resolved": "https://registry.npmjs.org/@types/retry/-/retry-0.12.0.tgz",
+ "integrity": "sha512-wWKOClTTiizcZhXnPY4wikVAwmdYHp8q6DmC+EJUzAMsycb7HB32Kh9RN4+0gExjmPmZSAQjgURXIGATPegAvA=="
+ },
+ "node_modules/@types/sax": {
+ "version": "1.2.4",
+ "resolved": "https://registry.npmjs.org/@types/sax/-/sax-1.2.4.tgz",
+ "integrity": "sha512-pSAff4IAxJjfAXUG6tFkO7dsSbTmf8CtUpfhhZ5VhkRpC4628tJhh3+V6H1E+/Gs9piSzYKT5yzHO5M4GG9jkw==",
+ "dependencies": {
+ "@types/node": "*"
+ }
+ },
+ "node_modules/@types/scheduler": {
+ "version": "0.16.3",
+ "resolved": "https://registry.npmjs.org/@types/scheduler/-/scheduler-0.16.3.tgz",
+ "integrity": "sha512-5cJ8CB4yAx7BH1oMvdU0Jh9lrEXyPkar6F9G/ERswkCuvP4KQZfZkSjcMbAICCpQTN4OuZn8tz0HiKv9TGZgrQ=="
+ },
+ "node_modules/@types/serve-index": {
+ "version": "1.9.1",
+ "resolved": "https://registry.npmjs.org/@types/serve-index/-/serve-index-1.9.1.tgz",
+ "integrity": "sha512-d/Hs3nWDxNL2xAczmOVZNj92YZCS6RGxfBPjKzuu/XirCgXdpKEb88dYNbrYGint6IVWLNP+yonwVAuRC0T2Dg==",
+ "dependencies": {
+ "@types/express": "*"
+ }
+ },
+ "node_modules/@types/serve-static": {
+ "version": "1.15.1",
+ "resolved": "https://registry.npmjs.org/@types/serve-static/-/serve-static-1.15.1.tgz",
+ "integrity": "sha512-NUo5XNiAdULrJENtJXZZ3fHtfMolzZwczzBbnAeBbqBwG+LaG6YaJtuwzwGSQZ2wsCrxjEhNNjAkKigy3n8teQ==",
+ "dependencies": {
+ "@types/mime": "*",
+ "@types/node": "*"
+ }
+ },
+ "node_modules/@types/sockjs": {
+ "version": "0.3.33",
+ "resolved": "https://registry.npmjs.org/@types/sockjs/-/sockjs-0.3.33.tgz",
+ "integrity": "sha512-f0KEEe05NvUnat+boPTZ0dgaLZ4SfSouXUgv5noUiefG2ajgKjmETo9ZJyuqsl7dfl2aHlLJUiki6B4ZYldiiw==",
+ "dependencies": {
+ "@types/node": "*"
+ }
+ },
+ "node_modules/@types/unist": {
+ "version": "2.0.6",
+ "resolved": "https://registry.npmjs.org/@types/unist/-/unist-2.0.6.tgz",
+ "integrity": "sha512-PBjIUxZHOuj0R15/xuwJYjFi+KZdNFrehocChv4g5hu6aFroHue8m0lBP0POdK2nKzbw0cgV1mws8+V/JAcEkQ=="
+ },
+ "node_modules/@types/ws": {
+ "version": "8.5.4",
+ "resolved": "https://registry.npmjs.org/@types/ws/-/ws-8.5.4.tgz",
+ "integrity": "sha512-zdQDHKUgcX/zBc4GrwsE/7dVdAD8JR4EuiAXiiUhhfyIJXXb2+PrGshFyeXWQPMmmZ2XxgaqclgpIC7eTXc1mg==",
+ "dependencies": {
+ "@types/node": "*"
+ }
+ },
+ "node_modules/@types/yargs": {
+ "version": "17.0.24",
+ "resolved": "https://registry.npmjs.org/@types/yargs/-/yargs-17.0.24.tgz",
+ "integrity": "sha512-6i0aC7jV6QzQB8ne1joVZ0eSFIstHsCrobmOtghM11yGlH0j43FKL2UhWdELkyps0zuf7qVTUVCCR+tgSlyLLw==",
+ "dependencies": {
+ "@types/yargs-parser": "*"
+ }
+ },
+ "node_modules/@types/yargs-parser": {
+ "version": "21.0.0",
+ "resolved": "https://registry.npmjs.org/@types/yargs-parser/-/yargs-parser-21.0.0.tgz",
+ "integrity": "sha512-iO9ZQHkZxHn4mSakYV0vFHAVDyEOIJQrV2uZ06HxEPcx+mt8swXoZHIbaaJ2crJYFfErySgktuTZ3BeLz+XmFA=="
+ },
+ "node_modules/@webassemblyjs/ast": {
+ "version": "1.12.1",
+ "resolved": "https://registry.npmjs.org/@webassemblyjs/ast/-/ast-1.12.1.tgz",
+ "integrity": "sha512-EKfMUOPRRUTy5UII4qJDGPpqfwjOmZ5jeGFwid9mnoqIFK+e0vqoi1qH56JpmZSzEL53jKnNzScdmftJyG5xWg==",
+ "dependencies": {
+ "@webassemblyjs/helper-numbers": "1.11.6",
+ "@webassemblyjs/helper-wasm-bytecode": "1.11.6"
+ }
+ },
+ "node_modules/@webassemblyjs/floating-point-hex-parser": {
+ "version": "1.11.6",
+ "resolved": "https://registry.npmjs.org/@webassemblyjs/floating-point-hex-parser/-/floating-point-hex-parser-1.11.6.tgz",
+ "integrity": "sha512-ejAj9hfRJ2XMsNHk/v6Fu2dGS+i4UaXBXGemOfQ/JfQ6mdQg/WXtwleQRLLS4OvfDhv8rYnVwH27YJLMyYsxhw=="
+ },
+ "node_modules/@webassemblyjs/helper-api-error": {
+ "version": "1.11.6",
+ "resolved": "https://registry.npmjs.org/@webassemblyjs/helper-api-error/-/helper-api-error-1.11.6.tgz",
+ "integrity": "sha512-o0YkoP4pVu4rN8aTJgAyj9hC2Sv5UlkzCHhxqWj8butaLvnpdc2jOwh4ewE6CX0txSfLn/UYaV/pheS2Txg//Q=="
+ },
+ "node_modules/@webassemblyjs/helper-buffer": {
+ "version": "1.12.1",
+ "resolved": "https://registry.npmjs.org/@webassemblyjs/helper-buffer/-/helper-buffer-1.12.1.tgz",
+ "integrity": "sha512-nzJwQw99DNDKr9BVCOZcLuJJUlqkJh+kVzVl6Fmq/tI5ZtEyWT1KZMyOXltXLZJmDtvLCDgwsyrkohEtopTXCw=="
+ },
+ "node_modules/@webassemblyjs/helper-numbers": {
+ "version": "1.11.6",
+ "resolved": "https://registry.npmjs.org/@webassemblyjs/helper-numbers/-/helper-numbers-1.11.6.tgz",
+ "integrity": "sha512-vUIhZ8LZoIWHBohiEObxVm6hwP034jwmc9kuq5GdHZH0wiLVLIPcMCdpJzG4C11cHoQ25TFIQj9kaVADVX7N3g==",
+ "dependencies": {
+ "@webassemblyjs/floating-point-hex-parser": "1.11.6",
+ "@webassemblyjs/helper-api-error": "1.11.6",
+ "@xtuc/long": "4.2.2"
+ }
+ },
+ "node_modules/@webassemblyjs/helper-wasm-bytecode": {
+ "version": "1.11.6",
+ "resolved": "https://registry.npmjs.org/@webassemblyjs/helper-wasm-bytecode/-/helper-wasm-bytecode-1.11.6.tgz",
+ "integrity": "sha512-sFFHKwcmBprO9e7Icf0+gddyWYDViL8bpPjJJl0WHxCdETktXdmtWLGVzoHbqUcY4Be1LkNfwTmXOJUFZYSJdA=="
+ },
+ "node_modules/@webassemblyjs/helper-wasm-section": {
+ "version": "1.12.1",
+ "resolved": "https://registry.npmjs.org/@webassemblyjs/helper-wasm-section/-/helper-wasm-section-1.12.1.tgz",
+ "integrity": "sha512-Jif4vfB6FJlUlSbgEMHUyk1j234GTNG9dBJ4XJdOySoj518Xj0oGsNi59cUQF4RRMS9ouBUxDDdyBVfPTypa5g==",
+ "dependencies": {
+ "@webassemblyjs/ast": "1.12.1",
+ "@webassemblyjs/helper-buffer": "1.12.1",
+ "@webassemblyjs/helper-wasm-bytecode": "1.11.6",
+ "@webassemblyjs/wasm-gen": "1.12.1"
+ }
+ },
+ "node_modules/@webassemblyjs/ieee754": {
+ "version": "1.11.6",
+ "resolved": "https://registry.npmjs.org/@webassemblyjs/ieee754/-/ieee754-1.11.6.tgz",
+ "integrity": "sha512-LM4p2csPNvbij6U1f19v6WR56QZ8JcHg3QIJTlSwzFcmx6WSORicYj6I63f9yU1kEUtrpG+kjkiIAkevHpDXrg==",
+ "dependencies": {
+ "@xtuc/ieee754": "^1.2.0"
+ }
+ },
+ "node_modules/@webassemblyjs/leb128": {
+ "version": "1.11.6",
+ "resolved": "https://registry.npmjs.org/@webassemblyjs/leb128/-/leb128-1.11.6.tgz",
+ "integrity": "sha512-m7a0FhE67DQXgouf1tbN5XQcdWoNgaAuoULHIfGFIEVKA6tu/edls6XnIlkmS6FrXAquJRPni3ZZKjw6FSPjPQ==",
+ "dependencies": {
+ "@xtuc/long": "4.2.2"
+ }
+ },
+ "node_modules/@webassemblyjs/utf8": {
+ "version": "1.11.6",
+ "resolved": "https://registry.npmjs.org/@webassemblyjs/utf8/-/utf8-1.11.6.tgz",
+ "integrity": "sha512-vtXf2wTQ3+up9Zsg8sa2yWiQpzSsMyXj0qViVP6xKGCUT8p8YJ6HqI7l5eCnWx1T/FYdsv07HQs2wTFbbof/RA=="
+ },
+ "node_modules/@webassemblyjs/wasm-edit": {
+ "version": "1.12.1",
+ "resolved": "https://registry.npmjs.org/@webassemblyjs/wasm-edit/-/wasm-edit-1.12.1.tgz",
+ "integrity": "sha512-1DuwbVvADvS5mGnXbE+c9NfA8QRcZ6iKquqjjmR10k6o+zzsRVesil54DKexiowcFCPdr/Q0qaMgB01+SQ1u6g==",
+ "dependencies": {
+ "@webassemblyjs/ast": "1.12.1",
+ "@webassemblyjs/helper-buffer": "1.12.1",
+ "@webassemblyjs/helper-wasm-bytecode": "1.11.6",
+ "@webassemblyjs/helper-wasm-section": "1.12.1",
+ "@webassemblyjs/wasm-gen": "1.12.1",
+ "@webassemblyjs/wasm-opt": "1.12.1",
+ "@webassemblyjs/wasm-parser": "1.12.1",
+ "@webassemblyjs/wast-printer": "1.12.1"
+ }
+ },
+ "node_modules/@webassemblyjs/wasm-gen": {
+ "version": "1.12.1",
+ "resolved": "https://registry.npmjs.org/@webassemblyjs/wasm-gen/-/wasm-gen-1.12.1.tgz",
+ "integrity": "sha512-TDq4Ojh9fcohAw6OIMXqiIcTq5KUXTGRkVxbSo1hQnSy6lAM5GSdfwWeSxpAo0YzgsgF182E/U0mDNhuA0tW7w==",
+ "dependencies": {
+ "@webassemblyjs/ast": "1.12.1",
+ "@webassemblyjs/helper-wasm-bytecode": "1.11.6",
+ "@webassemblyjs/ieee754": "1.11.6",
+ "@webassemblyjs/leb128": "1.11.6",
+ "@webassemblyjs/utf8": "1.11.6"
+ }
+ },
+ "node_modules/@webassemblyjs/wasm-opt": {
+ "version": "1.12.1",
+ "resolved": "https://registry.npmjs.org/@webassemblyjs/wasm-opt/-/wasm-opt-1.12.1.tgz",
+ "integrity": "sha512-Jg99j/2gG2iaz3hijw857AVYekZe2SAskcqlWIZXjji5WStnOpVoat3gQfT/Q5tb2djnCjBtMocY/Su1GfxPBg==",
+ "dependencies": {
+ "@webassemblyjs/ast": "1.12.1",
+ "@webassemblyjs/helper-buffer": "1.12.1",
+ "@webassemblyjs/wasm-gen": "1.12.1",
+ "@webassemblyjs/wasm-parser": "1.12.1"
+ }
+ },
+ "node_modules/@webassemblyjs/wasm-parser": {
+ "version": "1.12.1",
+ "resolved": "https://registry.npmjs.org/@webassemblyjs/wasm-parser/-/wasm-parser-1.12.1.tgz",
+ "integrity": "sha512-xikIi7c2FHXysxXe3COrVUPSheuBtpcfhbpFj4gmu7KRLYOzANztwUU0IbsqvMqzuNK2+glRGWCEqZo1WCLyAQ==",
+ "dependencies": {
+ "@webassemblyjs/ast": "1.12.1",
+ "@webassemblyjs/helper-api-error": "1.11.6",
+ "@webassemblyjs/helper-wasm-bytecode": "1.11.6",
+ "@webassemblyjs/ieee754": "1.11.6",
+ "@webassemblyjs/leb128": "1.11.6",
+ "@webassemblyjs/utf8": "1.11.6"
+ }
+ },
+ "node_modules/@webassemblyjs/wast-printer": {
+ "version": "1.12.1",
+ "resolved": "https://registry.npmjs.org/@webassemblyjs/wast-printer/-/wast-printer-1.12.1.tgz",
+ "integrity": "sha512-+X4WAlOisVWQMikjbcvY2e0rwPsKQ9F688lksZhBcPycBBuii3O7m8FACbDMWDojpAqvjIncrG8J0XHKyQfVeA==",
+ "dependencies": {
+ "@webassemblyjs/ast": "1.12.1",
+ "@xtuc/long": "4.2.2"
+ }
+ },
+ "node_modules/@xtuc/ieee754": {
+ "version": "1.2.0",
+ "resolved": "https://registry.npmjs.org/@xtuc/ieee754/-/ieee754-1.2.0.tgz",
+ "integrity": "sha512-DX8nKgqcGwsc0eJSqYt5lwP4DH5FlHnmuWWBRy7X0NcaGR0ZtuyeESgMwTYVEtxmsNGY+qit4QYT/MIYTOTPeA=="
+ },
+ "node_modules/@xtuc/long": {
+ "version": "4.2.2",
+ "resolved": "https://registry.npmjs.org/@xtuc/long/-/long-4.2.2.tgz",
+ "integrity": "sha512-NuHqBY1PB/D8xU6s/thBgOAiAP7HOYDQ32+BFZILJ8ivkUkAHQnWfn6WhL79Owj1qmUnoN/YPhktdIoucipkAQ=="
+ },
+ "node_modules/accepts": {
+ "version": "1.3.8",
+ "resolved": "https://registry.npmjs.org/accepts/-/accepts-1.3.8.tgz",
+ "integrity": "sha512-PYAthTa2m2VKxuvSD3DPC/Gy+U+sOA1LAuT8mkmRuvw+NACSaeXEQ+NHcVF7rONl6qcaxV3Uuemwawk+7+SJLw==",
+ "dependencies": {
+ "mime-types": "~2.1.34",
+ "negotiator": "0.6.3"
+ },
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/accepts/node_modules/mime-db": {
+ "version": "1.52.0",
+ "resolved": "https://registry.npmjs.org/mime-db/-/mime-db-1.52.0.tgz",
+ "integrity": "sha512-sPU4uV7dYlvtWJxwwxHD0PuihVNiE7TyAbQ5SWxDCB9mUYvOgroQOwYQQOKPJ8CIbE+1ETVlOoK1UC2nU3gYvg==",
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/accepts/node_modules/mime-types": {
+ "version": "2.1.35",
+ "resolved": "https://registry.npmjs.org/mime-types/-/mime-types-2.1.35.tgz",
+ "integrity": "sha512-ZDY+bPm5zTTF+YpCrAU9nK0UgICYPT0QtT1NZWFv4s++TNkcgVaT0g6+4R2uI4MjQjzysHB1zxuWL50hzaeXiw==",
+ "dependencies": {
+ "mime-db": "1.52.0"
+ },
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/acorn": {
+ "version": "8.8.2",
+ "resolved": "https://registry.npmjs.org/acorn/-/acorn-8.8.2.tgz",
+ "integrity": "sha512-xjIYgE8HBrkpd/sJqOGNspf8uHG+NOHGOw6a/Urj8taM2EXfdNAH2oFcPeIFfsv3+kz/mJrS5VuMqbNLjCa2vw==",
+ "bin": {
+ "acorn": "bin/acorn"
+ },
+ "engines": {
+ "node": ">=0.4.0"
+ }
+ },
+ "node_modules/acorn-import-attributes": {
+ "version": "1.9.5",
+ "resolved": "https://registry.npmjs.org/acorn-import-attributes/-/acorn-import-attributes-1.9.5.tgz",
+ "integrity": "sha512-n02Vykv5uA3eHGM/Z2dQrcD56kL8TyDb2p1+0P83PClMnC/nc+anbQRhIOWnSq4Ke/KvDPrY3C9hDtC/A3eHnQ==",
+ "peerDependencies": {
+ "acorn": "^8"
+ }
+ },
+ "node_modules/acorn-walk": {
+ "version": "8.2.0",
+ "resolved": "https://registry.npmjs.org/acorn-walk/-/acorn-walk-8.2.0.tgz",
+ "integrity": "sha512-k+iyHEuPgSw6SbuDpGQM+06HQUa04DZ3o+F6CSzXMvvI5KMvnaEqXe+YVe555R9nn6GPt404fos4wcgpw12SDA==",
+ "engines": {
+ "node": ">=0.4.0"
+ }
+ },
+ "node_modules/address": {
+ "version": "1.2.2",
+ "resolved": "https://registry.npmjs.org/address/-/address-1.2.2.tgz",
+ "integrity": "sha512-4B/qKCfeE/ODUaAUpSwfzazo5x29WD4r3vXiWsB7I2mSDAihwEqKO+g8GELZUQSSAo5e1XTYh3ZVfLyxBc12nA==",
+ "engines": {
+ "node": ">= 10.0.0"
+ }
+ },
+ "node_modules/aggregate-error": {
+ "version": "3.1.0",
+ "resolved": "https://registry.npmjs.org/aggregate-error/-/aggregate-error-3.1.0.tgz",
+ "integrity": "sha512-4I7Td01quW/RpocfNayFdFVk1qSuoh0E7JrbRJ16nH01HhKFQ88INq9Sd+nd72zqRySlr9BmDA8xlEJ6vJMrYA==",
+ "dependencies": {
+ "clean-stack": "^2.0.0",
+ "indent-string": "^4.0.0"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/ajv": {
+ "version": "6.12.6",
+ "resolved": "https://registry.npmjs.org/ajv/-/ajv-6.12.6.tgz",
+ "integrity": "sha512-j3fVLgvTo527anyYyJOGTYJbG+vnnQYvE0m5mmkc1TK+nxAppkCLMIL0aZ4dblVCNoGShhm+kzE4ZUykBoMg4g==",
+ "dependencies": {
+ "fast-deep-equal": "^3.1.1",
+ "fast-json-stable-stringify": "^2.0.0",
+ "json-schema-traverse": "^0.4.1",
+ "uri-js": "^4.2.2"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/epoberezkin"
+ }
+ },
+ "node_modules/ajv-formats": {
+ "version": "2.1.1",
+ "resolved": "https://registry.npmjs.org/ajv-formats/-/ajv-formats-2.1.1.tgz",
+ "integrity": "sha512-Wx0Kx52hxE7C18hkMEggYlEifqWZtYaRgouJor+WMdPnQyEK13vgEWyVNup7SoeeoLMsr4kf5h6dOW11I15MUA==",
+ "dependencies": {
+ "ajv": "^8.0.0"
+ },
+ "peerDependencies": {
+ "ajv": "^8.0.0"
+ },
+ "peerDependenciesMeta": {
+ "ajv": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/ajv-formats/node_modules/ajv": {
+ "version": "8.12.0",
+ "resolved": "https://registry.npmjs.org/ajv/-/ajv-8.12.0.tgz",
+ "integrity": "sha512-sRu1kpcO9yLtYxBKvqfTeh9KzZEwO3STyX1HT+4CaDzC6HpTGYhIhPIzj9XuKU7KYDwnaeh5hcOwjy1QuJzBPA==",
+ "dependencies": {
+ "fast-deep-equal": "^3.1.1",
+ "json-schema-traverse": "^1.0.0",
+ "require-from-string": "^2.0.2",
+ "uri-js": "^4.2.2"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/epoberezkin"
+ }
+ },
+ "node_modules/ajv-formats/node_modules/json-schema-traverse": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmjs.org/json-schema-traverse/-/json-schema-traverse-1.0.0.tgz",
+ "integrity": "sha512-NM8/P9n3XjXhIZn1lLhkFaACTOURQXjWhV4BA/RnOv8xvgqtqpAX9IO4mRQxSx1Rlo4tqzeqb0sOlruaOy3dug=="
+ },
+ "node_modules/ajv-keywords": {
+ "version": "3.5.2",
+ "resolved": "https://registry.npmjs.org/ajv-keywords/-/ajv-keywords-3.5.2.tgz",
+ "integrity": "sha512-5p6WTN0DdTGVQk6VjcEju19IgaHudalcfabD7yhDGeA6bcQnmL+CpveLJq/3hvfwd1aof6L386Ougkx6RfyMIQ==",
+ "peerDependencies": {
+ "ajv": "^6.9.1"
+ }
+ },
+ "node_modules/algoliasearch": {
+ "version": "4.16.0",
+ "resolved": "https://registry.npmjs.org/algoliasearch/-/algoliasearch-4.16.0.tgz",
+ "integrity": "sha512-HAjKJ6bBblaXqO4dYygF4qx251GuJ6zCZt+qbJ+kU7sOC+yc84pawEjVpJByh+cGP2APFCsao2Giz50cDlKNPA==",
+ "dependencies": {
+ "@algolia/cache-browser-local-storage": "4.16.0",
+ "@algolia/cache-common": "4.16.0",
+ "@algolia/cache-in-memory": "4.16.0",
+ "@algolia/client-account": "4.16.0",
+ "@algolia/client-analytics": "4.16.0",
+ "@algolia/client-common": "4.16.0",
+ "@algolia/client-personalization": "4.16.0",
+ "@algolia/client-search": "4.16.0",
+ "@algolia/logger-common": "4.16.0",
+ "@algolia/logger-console": "4.16.0",
+ "@algolia/requester-browser-xhr": "4.16.0",
+ "@algolia/requester-common": "4.16.0",
+ "@algolia/requester-node-http": "4.16.0",
+ "@algolia/transporter": "4.16.0"
+ }
+ },
+ "node_modules/algoliasearch-helper": {
+ "version": "3.12.0",
+ "resolved": "https://registry.npmjs.org/algoliasearch-helper/-/algoliasearch-helper-3.12.0.tgz",
+ "integrity": "sha512-/j1U3PEwdan0n6P/QqSnSpNSLC5+cEMvyljd5CnmNmUjDlGrys+vFEOwjVEnqELIiAGMHEA/Nl3CiKVFBUYqyQ==",
+ "dependencies": {
+ "@algolia/events": "^4.0.1"
+ },
+ "peerDependencies": {
+ "algoliasearch": ">= 3.1 < 6"
+ }
+ },
+ "node_modules/ansi-align": {
+ "version": "3.0.1",
+ "resolved": "https://registry.npmjs.org/ansi-align/-/ansi-align-3.0.1.tgz",
+ "integrity": "sha512-IOfwwBF5iczOjp/WeY4YxyjqAFMQoZufdQWDd19SEExbVLNXqvpzSJ/M7Za4/sCPmQ0+GRquoA7bGcINcxew6w==",
+ "dependencies": {
+ "string-width": "^4.1.0"
+ }
+ },
+ "node_modules/ansi-align/node_modules/emoji-regex": {
+ "version": "8.0.0",
+ "resolved": "https://registry.npmjs.org/emoji-regex/-/emoji-regex-8.0.0.tgz",
+ "integrity": "sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A=="
+ },
+ "node_modules/ansi-align/node_modules/string-width": {
+ "version": "4.2.3",
+ "resolved": "https://registry.npmjs.org/string-width/-/string-width-4.2.3.tgz",
+ "integrity": "sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==",
+ "dependencies": {
+ "emoji-regex": "^8.0.0",
+ "is-fullwidth-code-point": "^3.0.0",
+ "strip-ansi": "^6.0.1"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/ansi-html-community": {
+ "version": "0.0.8",
+ "resolved": "https://registry.npmjs.org/ansi-html-community/-/ansi-html-community-0.0.8.tgz",
+ "integrity": "sha512-1APHAyr3+PCamwNw3bXCPp4HFLONZt/yIH0sZp0/469KWNTEy+qN5jQ3GVX6DMZ1UXAi34yVwtTeaG/HpBuuzw==",
+ "engines": [
+ "node >= 0.8.0"
+ ],
+ "bin": {
+ "ansi-html": "bin/ansi-html"
+ }
+ },
+ "node_modules/ansi-regex": {
+ "version": "5.0.1",
+ "resolved": "https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.1.tgz",
+ "integrity": "sha512-quJQXlTSUGL2LH9SUXo8VwsY4soanhgo6LNSm84E1LBcE8s3O0wpdiRzyR9z/ZZJMlMWv37qOOb9pdJlMUEKFQ==",
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/ansi-styles": {
+ "version": "4.3.0",
+ "resolved": "https://registry.npmjs.org/ansi-styles/-/ansi-styles-4.3.0.tgz",
+ "integrity": "sha512-zbB9rCJAT1rbjiVDb2hqKFHNYLxgtk8NURxZ3IZwD3F6NtxbXZQCnnSi1Lkx+IDohdPlFp222wVALIheZJQSEg==",
+ "dependencies": {
+ "color-convert": "^2.0.1"
+ },
+ "engines": {
+ "node": ">=8"
+ },
+ "funding": {
+ "url": "https://github.com/chalk/ansi-styles?sponsor=1"
+ }
+ },
+ "node_modules/anymatch": {
+ "version": "3.1.3",
+ "resolved": "https://registry.npmjs.org/anymatch/-/anymatch-3.1.3.tgz",
+ "integrity": "sha512-KMReFUr0B4t+D+OBkjR3KYqvocp2XaSzO55UcB6mgQMd3KbcE+mWTyvVV7D/zsdEbNnV6acZUutkiHQXvTr1Rw==",
+ "dependencies": {
+ "normalize-path": "^3.0.0",
+ "picomatch": "^2.0.4"
+ },
+ "engines": {
+ "node": ">= 8"
+ }
+ },
+ "node_modules/arg": {
+ "version": "5.0.2",
+ "resolved": "https://registry.npmjs.org/arg/-/arg-5.0.2.tgz",
+ "integrity": "sha512-PYjyFOLKQ9y57JvQ6QLo8dAgNqswh8M1RMJYdQduT6xbWSgK36P/Z/v+p888pM69jMMfS8Xd8F6I1kQ/I9HUGg=="
+ },
+ "node_modules/argparse": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/argparse/-/argparse-2.0.1.tgz",
+ "integrity": "sha512-8+9WqebbFzpX9OR+Wa6O29asIogeRMzcGtAINdpMHHyAg10f05aSFVBbcEqGf/PXw1EjAZ+q2/bEBg3DvurK3Q=="
+ },
+ "node_modules/array-flatten": {
+ "version": "2.1.2",
+ "resolved": "https://registry.npmjs.org/array-flatten/-/array-flatten-2.1.2.tgz",
+ "integrity": "sha512-hNfzcOV8W4NdualtqBFPyVO+54DSJuZGY9qT4pRroB6S9e3iiido2ISIC5h9R2sPJ8H3FHCIiEnsv1lPXO3KtQ=="
+ },
+ "node_modules/array-union": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmjs.org/array-union/-/array-union-2.1.0.tgz",
+ "integrity": "sha512-HGyxoOTYUyCM6stUe6EJgnd4EoewAI7zMdfqO+kGjnlZmBDz/cR5pf8r/cR4Wq60sL/p0IkcjUEEPwS3GFrIyw==",
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/asap": {
+ "version": "2.0.6",
+ "resolved": "https://registry.npmjs.org/asap/-/asap-2.0.6.tgz",
+ "integrity": "sha512-BSHWgDSAiKs50o2Re8ppvp3seVHXSRM44cdSsT9FfNEUUZLOGWVCsiWaRPWM1Znn+mqZ1OfVZ3z3DWEzSp7hRA=="
+ },
+ "node_modules/at-least-node": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmjs.org/at-least-node/-/at-least-node-1.0.0.tgz",
+ "integrity": "sha512-+q/t7Ekv1EDY2l6Gda6LLiX14rU9TV20Wa3ofeQmwPFZbOMo9DXrLbOjFaaclkXKWidIaopwAObQDqwWtGUjqg==",
+ "engines": {
+ "node": ">= 4.0.0"
+ }
+ },
+ "node_modules/autoprefixer": {
+ "version": "10.4.14",
+ "resolved": "https://registry.npmjs.org/autoprefixer/-/autoprefixer-10.4.14.tgz",
+ "integrity": "sha512-FQzyfOsTlwVzjHxKEqRIAdJx9niO6VCBCoEwax/VLSoQF29ggECcPuBqUMZ+u8jCZOPSy8b8/8KnuFbp0SaFZQ==",
+ "funding": [
+ {
+ "type": "opencollective",
+ "url": "https://opencollective.com/postcss/"
+ },
+ {
+ "type": "tidelift",
+ "url": "https://tidelift.com/funding/github/npm/autoprefixer"
+ }
+ ],
+ "dependencies": {
+ "browserslist": "^4.21.5",
+ "caniuse-lite": "^1.0.30001464",
+ "fraction.js": "^4.2.0",
+ "normalize-range": "^0.1.2",
+ "picocolors": "^1.0.0",
+ "postcss-value-parser": "^4.2.0"
+ },
+ "bin": {
+ "autoprefixer": "bin/autoprefixer"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14"
+ },
+ "peerDependencies": {
+ "postcss": "^8.1.0"
+ }
+ },
+ "node_modules/axios": {
+ "version": "0.25.0",
+ "resolved": "https://registry.npmjs.org/axios/-/axios-0.25.0.tgz",
+ "integrity": "sha512-cD8FOb0tRH3uuEe6+evtAbgJtfxr7ly3fQjYcMcuPlgkwVS9xboaVIpcDV+cYQe+yGykgwZCs1pzjntcGa6l5g==",
+ "dependencies": {
+ "follow-redirects": "^1.14.7"
+ }
+ },
+ "node_modules/babel-loader": {
+ "version": "8.3.0",
+ "resolved": "https://registry.npmjs.org/babel-loader/-/babel-loader-8.3.0.tgz",
+ "integrity": "sha512-H8SvsMF+m9t15HNLMipppzkC+Y2Yq+v3SonZyU70RBL/h1gxPkH08Ot8pEE9Z4Kd+czyWJClmFS8qzIP9OZ04Q==",
+ "dependencies": {
+ "find-cache-dir": "^3.3.1",
+ "loader-utils": "^2.0.0",
+ "make-dir": "^3.1.0",
+ "schema-utils": "^2.6.5"
+ },
+ "engines": {
+ "node": ">= 8.9"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0",
+ "webpack": ">=2"
+ }
+ },
+ "node_modules/babel-plugin-apply-mdx-type-prop": {
+ "version": "1.6.22",
+ "resolved": "https://registry.npmjs.org/babel-plugin-apply-mdx-type-prop/-/babel-plugin-apply-mdx-type-prop-1.6.22.tgz",
+ "integrity": "sha512-VefL+8o+F/DfK24lPZMtJctrCVOfgbqLAGZSkxwhazQv4VxPg3Za/i40fu22KR2m8eEda+IfSOlPLUSIiLcnCQ==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "7.10.4",
+ "@mdx-js/util": "1.6.22"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.11.6"
+ }
+ },
+ "node_modules/babel-plugin-apply-mdx-type-prop/node_modules/@babel/helper-plugin-utils": {
+ "version": "7.10.4",
+ "resolved": "https://registry.npmjs.org/@babel/helper-plugin-utils/-/helper-plugin-utils-7.10.4.tgz",
+ "integrity": "sha512-O4KCvQA6lLiMU9l2eawBPMf1xPP8xPfB3iEQw150hOVTqj/rfXz0ThTb4HEzqQfs2Bmo5Ay8BzxfzVtBrr9dVg=="
+ },
+ "node_modules/babel-plugin-dynamic-import-node": {
+ "version": "2.3.3",
+ "resolved": "https://registry.npmjs.org/babel-plugin-dynamic-import-node/-/babel-plugin-dynamic-import-node-2.3.3.tgz",
+ "integrity": "sha512-jZVI+s9Zg3IqA/kdi0i6UDCybUI3aSBLnglhYbSSjKlV7yF1F/5LWv8MakQmvYpnbJDS6fcBL2KzHSxNCMtWSQ==",
+ "dependencies": {
+ "object.assign": "^4.1.0"
+ }
+ },
+ "node_modules/babel-plugin-extract-import-names": {
+ "version": "1.6.22",
+ "resolved": "https://registry.npmjs.org/babel-plugin-extract-import-names/-/babel-plugin-extract-import-names-1.6.22.tgz",
+ "integrity": "sha512-yJ9BsJaISua7d8zNT7oRG1ZLBJCIdZ4PZqmH8qa9N5AK01ifk3fnkc98AXhtzE7UkfCsEumvoQWgoYLhOnJ7jQ==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "7.10.4"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/babel-plugin-extract-import-names/node_modules/@babel/helper-plugin-utils": {
+ "version": "7.10.4",
+ "resolved": "https://registry.npmjs.org/@babel/helper-plugin-utils/-/helper-plugin-utils-7.10.4.tgz",
+ "integrity": "sha512-O4KCvQA6lLiMU9l2eawBPMf1xPP8xPfB3iEQw150hOVTqj/rfXz0ThTb4HEzqQfs2Bmo5Ay8BzxfzVtBrr9dVg=="
+ },
+ "node_modules/babel-plugin-polyfill-corejs2": {
+ "version": "0.3.3",
+ "resolved": "https://registry.npmjs.org/babel-plugin-polyfill-corejs2/-/babel-plugin-polyfill-corejs2-0.3.3.tgz",
+ "integrity": "sha512-8hOdmFYFSZhqg2C/JgLUQ+t52o5nirNwaWM2B9LWteozwIvM14VSwdsCAUET10qT+kmySAlseadmfeeSWFCy+Q==",
+ "dependencies": {
+ "@babel/compat-data": "^7.17.7",
+ "@babel/helper-define-polyfill-provider": "^0.3.3",
+ "semver": "^6.1.1"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/babel-plugin-polyfill-corejs2/node_modules/semver": {
+ "version": "6.3.1",
+ "resolved": "https://registry.npmjs.org/semver/-/semver-6.3.1.tgz",
+ "integrity": "sha512-BR7VvDCVHO+q2xBEWskxS6DJE1qRnb7DxzUrogb71CWoSficBxYsiAGd+Kl0mmq/MprG9yArRkyrQxTO6XjMzA==",
+ "bin": {
+ "semver": "bin/semver.js"
+ }
+ },
+ "node_modules/babel-plugin-polyfill-corejs3": {
+ "version": "0.6.0",
+ "resolved": "https://registry.npmjs.org/babel-plugin-polyfill-corejs3/-/babel-plugin-polyfill-corejs3-0.6.0.tgz",
+ "integrity": "sha512-+eHqR6OPcBhJOGgsIar7xoAB1GcSwVUA3XjAd7HJNzOXT4wv6/H7KIdA/Nc60cvUlDbKApmqNvD1B1bzOt4nyA==",
+ "dependencies": {
+ "@babel/helper-define-polyfill-provider": "^0.3.3",
+ "core-js-compat": "^3.25.1"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/babel-plugin-polyfill-regenerator": {
+ "version": "0.4.1",
+ "resolved": "https://registry.npmjs.org/babel-plugin-polyfill-regenerator/-/babel-plugin-polyfill-regenerator-0.4.1.tgz",
+ "integrity": "sha512-NtQGmyQDXjQqQ+IzRkBVwEOz9lQ4zxAQZgoAYEtU9dJjnl1Oc98qnN7jcp+bE7O7aYzVpavXE3/VKXNzUbh7aw==",
+ "dependencies": {
+ "@babel/helper-define-polyfill-provider": "^0.3.3"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/babel-plugin-syntax-jsx": {
+ "version": "6.18.0",
+ "resolved": "https://registry.npmjs.org/babel-plugin-syntax-jsx/-/babel-plugin-syntax-jsx-6.18.0.tgz",
+ "integrity": "sha512-qrPaCSo9c8RHNRHIotaufGbuOBN8rtdC4QrrFFc43vyWCCz7Kl7GL1PGaXtMGQZUXrkCjNEgxDfmAuAabr/rlw=="
+ },
+ "node_modules/bail": {
+ "version": "1.0.5",
+ "resolved": "https://registry.npmjs.org/bail/-/bail-1.0.5.tgz",
+ "integrity": "sha512-xFbRxM1tahm08yHBP16MMjVUAvDaBMD38zsM9EMAUN61omwLmKlOpB/Zku5QkjZ8TZ4vn53pj+t518cH0S03RQ==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/balanced-match": {
+ "version": "1.0.2",
+ "resolved": "https://registry.npmjs.org/balanced-match/-/balanced-match-1.0.2.tgz",
+ "integrity": "sha512-3oSeUO0TMV67hN1AmbXsK4yaqU7tjiHlbxRDZOpH0KW9+CeX4bRAaX0Anxt0tx2MrpRpWwQaPwIlISEJhYU5Pw=="
+ },
+ "node_modules/base16": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmjs.org/base16/-/base16-1.0.0.tgz",
+ "integrity": "sha512-pNdYkNPiJUnEhnfXV56+sQy8+AaPcG3POZAUnwr4EeqCUZFz4u2PePbo3e5Gj4ziYPCWGUZT9RHisvJKnwFuBQ=="
+ },
+ "node_modules/batch": {
+ "version": "0.6.1",
+ "resolved": "https://registry.npmjs.org/batch/-/batch-0.6.1.tgz",
+ "integrity": "sha512-x+VAiMRL6UPkx+kudNvxTl6hB2XNNCG2r+7wixVfIYwu/2HKRXimwQyaumLjMveWvT2Hkd/cAJw+QBMfJ/EKVw=="
+ },
+ "node_modules/big.js": {
+ "version": "5.2.2",
+ "resolved": "https://registry.npmjs.org/big.js/-/big.js-5.2.2.tgz",
+ "integrity": "sha512-vyL2OymJxmarO8gxMr0mhChsO9QGwhynfuu4+MHTAW6czfq9humCB7rKpUjDd9YUiDPU4mzpyupFSvOClAwbmQ==",
+ "engines": {
+ "node": "*"
+ }
+ },
+ "node_modules/binary-extensions": {
+ "version": "2.2.0",
+ "resolved": "https://registry.npmjs.org/binary-extensions/-/binary-extensions-2.2.0.tgz",
+ "integrity": "sha512-jDctJ/IVQbZoJykoeHbhXpOlNBqGNcwXJKJog42E5HDPUwQTSdjCHdihjj0DlnheQ7blbT6dHOafNAiS8ooQKA==",
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/body-parser": {
+ "version": "1.20.3",
+ "resolved": "https://registry.npmjs.org/body-parser/-/body-parser-1.20.3.tgz",
+ "integrity": "sha512-7rAxByjUMqQ3/bHJy7D6OGXvx/MMc4IqBn/X0fcM1QUcAItpZrBEYhWGem+tzXH90c+G01ypMcYJBO9Y30203g==",
+ "dependencies": {
+ "bytes": "3.1.2",
+ "content-type": "~1.0.5",
+ "debug": "2.6.9",
+ "depd": "2.0.0",
+ "destroy": "1.2.0",
+ "http-errors": "2.0.0",
+ "iconv-lite": "0.4.24",
+ "on-finished": "2.4.1",
+ "qs": "6.13.0",
+ "raw-body": "2.5.2",
+ "type-is": "~1.6.18",
+ "unpipe": "1.0.0"
+ },
+ "engines": {
+ "node": ">= 0.8",
+ "npm": "1.2.8000 || >= 1.4.16"
+ }
+ },
+ "node_modules/body-parser/node_modules/bytes": {
+ "version": "3.1.2",
+ "resolved": "https://registry.npmjs.org/bytes/-/bytes-3.1.2.tgz",
+ "integrity": "sha512-/Nf7TyzTx6S3yRJObOAV7956r8cr2+Oj8AC5dt8wSP3BQAoeX58NoHyCU8P8zGkNXStjTSi6fzO6F0pBdcYbEg==",
+ "engines": {
+ "node": ">= 0.8"
+ }
+ },
+ "node_modules/body-parser/node_modules/debug": {
+ "version": "2.6.9",
+ "resolved": "https://registry.npmjs.org/debug/-/debug-2.6.9.tgz",
+ "integrity": "sha512-bC7ElrdJaJnPbAP+1EotYvqZsb3ecl5wi6Bfi6BJTUcNowp6cvspg0jXznRTKDjm/E7AdgFBVeAPVMNcKGsHMA==",
+ "dependencies": {
+ "ms": "2.0.0"
+ }
+ },
+ "node_modules/body-parser/node_modules/ms": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/ms/-/ms-2.0.0.tgz",
+ "integrity": "sha512-Tpp60P6IUJDTuOq/5Z8cdskzJujfwqfOTkrwIwj7IRISpnkJnT6SyJ4PCPnGMoFjC9ddhal5KVIYtAt97ix05A=="
+ },
+ "node_modules/bonjour-service": {
+ "version": "1.1.1",
+ "resolved": "https://registry.npmjs.org/bonjour-service/-/bonjour-service-1.1.1.tgz",
+ "integrity": "sha512-Z/5lQRMOG9k7W+FkeGTNjh7htqn/2LMnfOvBZ8pynNZCM9MwkQkI3zeI4oz09uWdcgmgHugVvBqxGg4VQJ5PCg==",
+ "dependencies": {
+ "array-flatten": "^2.1.2",
+ "dns-equal": "^1.0.0",
+ "fast-deep-equal": "^3.1.3",
+ "multicast-dns": "^7.2.5"
+ }
+ },
+ "node_modules/boolbase": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmjs.org/boolbase/-/boolbase-1.0.0.tgz",
+ "integrity": "sha512-JZOSA7Mo9sNGB8+UjSgzdLtokWAky1zbztM3WRLCbZ70/3cTANmQmOdR7y2g+J0e2WXywy1yS468tY+IruqEww=="
+ },
+ "node_modules/boxen": {
+ "version": "6.2.1",
+ "resolved": "https://registry.npmjs.org/boxen/-/boxen-6.2.1.tgz",
+ "integrity": "sha512-H4PEsJXfFI/Pt8sjDWbHlQPx4zL/bvSQjcilJmaulGt5mLDorHOHpmdXAJcBcmru7PhYSp/cDMWRko4ZUMFkSw==",
+ "dependencies": {
+ "ansi-align": "^3.0.1",
+ "camelcase": "^6.2.0",
+ "chalk": "^4.1.2",
+ "cli-boxes": "^3.0.0",
+ "string-width": "^5.0.1",
+ "type-fest": "^2.5.0",
+ "widest-line": "^4.0.1",
+ "wrap-ansi": "^8.0.1"
+ },
+ "engines": {
+ "node": "^12.20.0 || ^14.13.1 || >=16.0.0"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/brace-expansion": {
+ "version": "1.1.12",
+ "resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.12.tgz",
+ "integrity": "sha512-9T9UjW3r0UW5c1Q7GTwllptXwhvYmEzFhzMfZ9H7FQWt+uZePjZPjBP/W1ZEyZ1twGWom5/56TF4lPcqjnDHcg==",
+ "license": "MIT",
+ "dependencies": {
+ "balanced-match": "^1.0.0",
+ "concat-map": "0.0.1"
+ }
+ },
+ "node_modules/braces": {
+ "version": "3.0.3",
+ "resolved": "https://registry.npmjs.org/braces/-/braces-3.0.3.tgz",
+ "integrity": "sha512-yQbXgO/OSZVD2IsiLlro+7Hf6Q18EJrKSEsdoMzKePKXct3gvD8oLcOQdIzGupr5Fj+EDe8gO/lxc1BzfMpxvA==",
+ "dependencies": {
+ "fill-range": "^7.1.1"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/browserslist": {
+ "version": "4.23.3",
+ "resolved": "https://registry.npmjs.org/browserslist/-/browserslist-4.23.3.tgz",
+ "integrity": "sha512-btwCFJVjI4YWDNfau8RhZ+B1Q/VLoUITrm3RlP6y1tYGWIOa+InuYiRGXUBXo8nA1qKmHMyLB/iVQg5TT4eFoA==",
+ "funding": [
+ {
+ "type": "opencollective",
+ "url": "https://opencollective.com/browserslist"
+ },
+ {
+ "type": "tidelift",
+ "url": "https://tidelift.com/funding/github/npm/browserslist"
+ },
+ {
+ "type": "github",
+ "url": "https://github.com/sponsors/ai"
+ }
+ ],
+ "dependencies": {
+ "caniuse-lite": "^1.0.30001646",
+ "electron-to-chromium": "^1.5.4",
+ "node-releases": "^2.0.18",
+ "update-browserslist-db": "^1.1.0"
+ },
+ "bin": {
+ "browserslist": "cli.js"
+ },
+ "engines": {
+ "node": "^6 || ^7 || ^8 || ^9 || ^10 || ^11 || ^12 || >=13.7"
+ }
+ },
+ "node_modules/buffer-from": {
+ "version": "1.1.2",
+ "resolved": "https://registry.npmjs.org/buffer-from/-/buffer-from-1.1.2.tgz",
+ "integrity": "sha512-E+XQCRwSbaaiChtv6k6Dwgc+bx+Bs6vuKJHHl5kox/BaKbhiXzqQOwK4cO22yElGp2OCmjwVhT3HmxgyPGnJfQ=="
+ },
+ "node_modules/bytes": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/bytes/-/bytes-3.0.0.tgz",
+ "integrity": "sha512-pMhOfFDPiv9t5jjIXkHosWmkSyQbvsgEVNkz0ERHbuLh2T/7j4Mqqpz523Fe8MVY89KC6Sh/QfS2sM+SjgFDcw==",
+ "engines": {
+ "node": ">= 0.8"
+ }
+ },
+ "node_modules/cacheable-request": {
+ "version": "6.1.0",
+ "resolved": "https://registry.npmjs.org/cacheable-request/-/cacheable-request-6.1.0.tgz",
+ "integrity": "sha512-Oj3cAGPCqOZX7Rz64Uny2GYAZNliQSqfbePrgAQ1wKAihYmCUnraBtJtKcGR4xz7wF+LoJC+ssFZvv5BgF9Igg==",
+ "dependencies": {
+ "clone-response": "^1.0.2",
+ "get-stream": "^5.1.0",
+ "http-cache-semantics": "^4.0.0",
+ "keyv": "^3.0.0",
+ "lowercase-keys": "^2.0.0",
+ "normalize-url": "^4.1.0",
+ "responselike": "^1.0.2"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/cacheable-request/node_modules/get-stream": {
+ "version": "5.2.0",
+ "resolved": "https://registry.npmjs.org/get-stream/-/get-stream-5.2.0.tgz",
+ "integrity": "sha512-nBF+F1rAZVCu/p7rjzgA+Yb4lfYXrpl7a6VmJrU8wF9I1CKvP/QwPNZHnOlwbTkY6dvtFIzFMSyQXbLoTQPRpA==",
+ "dependencies": {
+ "pump": "^3.0.0"
+ },
+ "engines": {
+ "node": ">=8"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/cacheable-request/node_modules/lowercase-keys": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/lowercase-keys/-/lowercase-keys-2.0.0.tgz",
+ "integrity": "sha512-tqNXrS78oMOE73NMxK4EMLQsQowWf8jKooH9g7xPavRT706R6bkQJ6DY2Te7QukaZsulxa30wQ7bk0pm4XiHmA==",
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/cacheable-request/node_modules/normalize-url": {
+ "version": "4.5.1",
+ "resolved": "https://registry.npmjs.org/normalize-url/-/normalize-url-4.5.1.tgz",
+ "integrity": "sha512-9UZCFRHQdNrfTpGg8+1INIg93B6zE0aXMVFkw1WFwvO4SlZywU6aLg5Of0Ap/PgcbSw4LNxvMWXMeugwMCX0AA==",
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/call-bind": {
+ "version": "1.0.7",
+ "resolved": "https://registry.npmjs.org/call-bind/-/call-bind-1.0.7.tgz",
+ "integrity": "sha512-GHTSNSYICQ7scH7sZ+M2rFopRoLh8t2bLSW6BbgrtLsahOIB5iyAVJf9GjWK3cYTDaMj4XdBpM1cA6pIS0Kv2w==",
+ "dependencies": {
+ "es-define-property": "^1.0.0",
+ "es-errors": "^1.3.0",
+ "function-bind": "^1.1.2",
+ "get-intrinsic": "^1.2.4",
+ "set-function-length": "^1.2.1"
+ },
+ "engines": {
+ "node": ">= 0.4"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/ljharb"
+ }
+ },
+ "node_modules/call-me-maybe": {
+ "version": "1.0.2",
+ "resolved": "https://registry.npmjs.org/call-me-maybe/-/call-me-maybe-1.0.2.tgz",
+ "integrity": "sha512-HpX65o1Hnr9HH25ojC1YGs7HCQLq0GCOibSaWER0eNpgJ/Z1MZv2mTc7+xh6WOPxbRVcmgbv4hGU+uSQ/2xFZQ=="
+ },
+ "node_modules/callsites": {
+ "version": "3.1.0",
+ "resolved": "https://registry.npmjs.org/callsites/-/callsites-3.1.0.tgz",
+ "integrity": "sha512-P8BjAsXvZS+VIDUI11hHCQEv74YT67YUi5JJFNWIqL235sBmjX4+qx9Muvls5ivyNENctx46xQLQ3aTuE7ssaQ==",
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/camel-case": {
+ "version": "4.1.2",
+ "resolved": "https://registry.npmjs.org/camel-case/-/camel-case-4.1.2.tgz",
+ "integrity": "sha512-gxGWBrTT1JuMx6R+o5PTXMmUnhnVzLQ9SNutD4YqKtI6ap897t3tKECYla6gCWEkplXnlNybEkZg9GEGxKFCgw==",
+ "dependencies": {
+ "pascal-case": "^3.1.2",
+ "tslib": "^2.0.3"
+ }
+ },
+ "node_modules/camelcase": {
+ "version": "6.3.0",
+ "resolved": "https://registry.npmjs.org/camelcase/-/camelcase-6.3.0.tgz",
+ "integrity": "sha512-Gmy6FhYlCY7uOElZUSbxo2UCDH8owEk996gkbrpsgGtrJLM3J7jGxl9Ic7Qwwj4ivOE5AWZWRMecDdF7hqGjFA==",
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/camelcase-css": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/camelcase-css/-/camelcase-css-2.0.1.tgz",
+ "integrity": "sha512-QOSvevhslijgYwRx6Rv7zKdMF8lbRmx+uQGx2+vDc+KI/eBnsy9kit5aj23AgGu3pa4t9AgwbnXWqS+iOY+2aA==",
+ "engines": {
+ "node": ">= 6"
+ }
+ },
+ "node_modules/camelize": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmjs.org/camelize/-/camelize-1.0.1.tgz",
+ "integrity": "sha512-dU+Tx2fsypxTgtLoE36npi3UqcjSSMNYfkqgmoEhtZrraP5VWq0K7FkWVTYa8eMPtnU/G2txVsfdCJTn9uzpuQ==",
+ "funding": {
+ "url": "https://github.com/sponsors/ljharb"
+ }
+ },
+ "node_modules/caniuse-api": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/caniuse-api/-/caniuse-api-3.0.0.tgz",
+ "integrity": "sha512-bsTwuIg/BZZK/vreVTYYbSWoe2F+71P7K5QGEX+pT250DZbfU1MQ5prOKpPR+LL6uWKK3KMwMCAS74QB3Um1uw==",
+ "dependencies": {
+ "browserslist": "^4.0.0",
+ "caniuse-lite": "^1.0.0",
+ "lodash.memoize": "^4.1.2",
+ "lodash.uniq": "^4.5.0"
+ }
+ },
+ "node_modules/caniuse-lite": {
+ "version": "1.0.30001653",
+ "resolved": "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001653.tgz",
+ "integrity": "sha512-XGWQVB8wFQ2+9NZwZ10GxTYC5hk0Fa+q8cSkr0tgvMhYhMHP/QC+WTgrePMDBWiWc/pV+1ik82Al20XOK25Gcw==",
+ "funding": [
+ {
+ "type": "opencollective",
+ "url": "https://opencollective.com/browserslist"
+ },
+ {
+ "type": "tidelift",
+ "url": "https://tidelift.com/funding/github/npm/caniuse-lite"
+ },
+ {
+ "type": "github",
+ "url": "https://github.com/sponsors/ai"
+ }
+ ]
+ },
+ "node_modules/ccount": {
+ "version": "1.1.0",
+ "resolved": "https://registry.npmjs.org/ccount/-/ccount-1.1.0.tgz",
+ "integrity": "sha512-vlNK021QdI7PNeiUh/lKkC/mNHHfV0m/Ad5JoI0TYtlBnJAslM/JIkm/tGC88bkLIwO6OQ5uV6ztS6kVAtCDlg==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/chalk": {
+ "version": "4.1.2",
+ "resolved": "https://registry.npmjs.org/chalk/-/chalk-4.1.2.tgz",
+ "integrity": "sha512-oKnbhFyRIXpUuez8iBMmyEa4nbj4IOQyuhc/wy9kY7/WVPcwIO9VA668Pu8RkO7+0G76SLROeyw9CpQ061i4mA==",
+ "dependencies": {
+ "ansi-styles": "^4.1.0",
+ "supports-color": "^7.1.0"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/chalk/chalk?sponsor=1"
+ }
+ },
+ "node_modules/character-entities": {
+ "version": "1.2.4",
+ "resolved": "https://registry.npmjs.org/character-entities/-/character-entities-1.2.4.tgz",
+ "integrity": "sha512-iBMyeEHxfVnIakwOuDXpVkc54HijNgCyQB2w0VfGQThle6NXn50zU6V/u+LDhxHcDUPojn6Kpga3PTAD8W1bQw==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/character-entities-legacy": {
+ "version": "1.1.4",
+ "resolved": "https://registry.npmjs.org/character-entities-legacy/-/character-entities-legacy-1.1.4.tgz",
+ "integrity": "sha512-3Xnr+7ZFS1uxeiUDvV02wQ+QDbc55o97tIV5zHScSPJpcLm/r0DFPcoY3tYRp+VZukxuMeKgXYmsXQHO05zQeA==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/character-reference-invalid": {
+ "version": "1.1.4",
+ "resolved": "https://registry.npmjs.org/character-reference-invalid/-/character-reference-invalid-1.1.4.tgz",
+ "integrity": "sha512-mKKUkUbhPpQlCOfIuZkvSEgktjPFIsZKRRbC6KWVEMvlzblj3i3asQv5ODsrwt0N3pHAEvjP8KTQPHkp0+6jOg==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/cheerio": {
+ "version": "1.0.0-rc.12",
+ "resolved": "https://registry.npmjs.org/cheerio/-/cheerio-1.0.0-rc.12.tgz",
+ "integrity": "sha512-VqR8m68vM46BNnuZ5NtnGBKIE/DfN0cRIzg9n40EIq9NOv90ayxLBXA8fXC5gquFRGJSTRqBq25Jt2ECLR431Q==",
+ "dependencies": {
+ "cheerio-select": "^2.1.0",
+ "dom-serializer": "^2.0.0",
+ "domhandler": "^5.0.3",
+ "domutils": "^3.0.1",
+ "htmlparser2": "^8.0.1",
+ "parse5": "^7.0.0",
+ "parse5-htmlparser2-tree-adapter": "^7.0.0"
+ },
+ "engines": {
+ "node": ">= 6"
+ },
+ "funding": {
+ "url": "https://github.com/cheeriojs/cheerio?sponsor=1"
+ }
+ },
+ "node_modules/cheerio-select": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmjs.org/cheerio-select/-/cheerio-select-2.1.0.tgz",
+ "integrity": "sha512-9v9kG0LvzrlcungtnJtpGNxY+fzECQKhK4EGJX2vByejiMX84MFNQw4UxPJl3bFbTMw+Dfs37XaIkCwTZfLh4g==",
+ "dependencies": {
+ "boolbase": "^1.0.0",
+ "css-select": "^5.1.0",
+ "css-what": "^6.1.0",
+ "domelementtype": "^2.3.0",
+ "domhandler": "^5.0.3",
+ "domutils": "^3.0.1"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/fb55"
+ }
+ },
+ "node_modules/chokidar": {
+ "version": "3.5.3",
+ "resolved": "https://registry.npmjs.org/chokidar/-/chokidar-3.5.3.tgz",
+ "integrity": "sha512-Dr3sfKRP6oTcjf2JmUmFJfeVMvXBdegxB0iVQ5eb2V10uFJUCAS8OByZdVAyVb8xXNz3GjjTgj9kLWsZTqE6kw==",
+ "funding": [
+ {
+ "type": "individual",
+ "url": "https://paulmillr.com/funding/"
+ }
+ ],
+ "dependencies": {
+ "anymatch": "~3.1.2",
+ "braces": "~3.0.2",
+ "glob-parent": "~5.1.2",
+ "is-binary-path": "~2.1.0",
+ "is-glob": "~4.0.1",
+ "normalize-path": "~3.0.0",
+ "readdirp": "~3.6.0"
+ },
+ "engines": {
+ "node": ">= 8.10.0"
+ },
+ "optionalDependencies": {
+ "fsevents": "~2.3.2"
+ }
+ },
+ "node_modules/chrome-trace-event": {
+ "version": "1.0.3",
+ "resolved": "https://registry.npmjs.org/chrome-trace-event/-/chrome-trace-event-1.0.3.tgz",
+ "integrity": "sha512-p3KULyQg4S7NIHixdwbGX+nFHkoBiA4YQmyWtjb8XngSKV124nJmRysgAeujbUVb15vh+RvFUfCPqU7rXk+hZg==",
+ "engines": {
+ "node": ">=6.0"
+ }
+ },
+ "node_modules/ci-info": {
+ "version": "3.8.0",
+ "resolved": "https://registry.npmjs.org/ci-info/-/ci-info-3.8.0.tgz",
+ "integrity": "sha512-eXTggHWSooYhq49F2opQhuHWgzucfF2YgODK4e1566GQs5BIfP30B0oenwBJHfWxAs2fyPB1s7Mg949zLf61Yw==",
+ "funding": [
+ {
+ "type": "github",
+ "url": "https://github.com/sponsors/sibiraj-s"
+ }
+ ],
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/classnames": {
+ "version": "2.3.2",
+ "resolved": "https://registry.npmjs.org/classnames/-/classnames-2.3.2.tgz",
+ "integrity": "sha512-CSbhY4cFEJRe6/GQzIk5qXZ4Jeg5pcsP7b5peFSDpffpe1cqjASH/n9UTjBwOp6XpMSTwQ8Za2K5V02ueA7Tmw=="
+ },
+ "node_modules/clean-css": {
+ "version": "5.3.2",
+ "resolved": "https://registry.npmjs.org/clean-css/-/clean-css-5.3.2.tgz",
+ "integrity": "sha512-JVJbM+f3d3Q704rF4bqQ5UUyTtuJ0JRKNbTKVEeujCCBoMdkEi+V+e8oktO9qGQNSvHrFTM6JZRXrUvGR1czww==",
+ "dependencies": {
+ "source-map": "~0.6.0"
+ },
+ "engines": {
+ "node": ">= 10.0"
+ }
+ },
+ "node_modules/clean-stack": {
+ "version": "2.2.0",
+ "resolved": "https://registry.npmjs.org/clean-stack/-/clean-stack-2.2.0.tgz",
+ "integrity": "sha512-4diC9HaTE+KRAMWhDhrGOECgWZxoevMc5TlkObMqNSsVU62PYzXZ/SMTjzyGAFF1YusgxGcSWTEXBhp0CPwQ1A==",
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/cli-boxes": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/cli-boxes/-/cli-boxes-3.0.0.tgz",
+ "integrity": "sha512-/lzGpEWL/8PfI0BmBOPRwp0c/wFNX1RdUML3jK/RcSBA9T8mZDdQpqYBKtCFTOfQbwPqWEOpjqW+Fnayc0969g==",
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/cli-table3": {
+ "version": "0.6.3",
+ "resolved": "https://registry.npmjs.org/cli-table3/-/cli-table3-0.6.3.tgz",
+ "integrity": "sha512-w5Jac5SykAeZJKntOxJCrm63Eg5/4dhMWIcuTbo9rpE+brgaSZo0RuNJZeOyMgsUdhDeojvgyQLmjI+K50ZGyg==",
+ "dependencies": {
+ "string-width": "^4.2.0"
+ },
+ "engines": {
+ "node": "10.* || >= 12.*"
+ },
+ "optionalDependencies": {
+ "@colors/colors": "1.5.0"
+ }
+ },
+ "node_modules/cli-table3/node_modules/emoji-regex": {
+ "version": "8.0.0",
+ "resolved": "https://registry.npmjs.org/emoji-regex/-/emoji-regex-8.0.0.tgz",
+ "integrity": "sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A=="
+ },
+ "node_modules/cli-table3/node_modules/string-width": {
+ "version": "4.2.3",
+ "resolved": "https://registry.npmjs.org/string-width/-/string-width-4.2.3.tgz",
+ "integrity": "sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==",
+ "dependencies": {
+ "emoji-regex": "^8.0.0",
+ "is-fullwidth-code-point": "^3.0.0",
+ "strip-ansi": "^6.0.1"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/cliui": {
+ "version": "7.0.4",
+ "resolved": "https://registry.npmjs.org/cliui/-/cliui-7.0.4.tgz",
+ "integrity": "sha512-OcRE68cOsVMXp1Yvonl/fzkQOyjLSu/8bhPDfQt0e0/Eb283TKP20Fs2MqoPsr9SwA595rRCA+QMzYc9nBP+JQ==",
+ "dependencies": {
+ "string-width": "^4.2.0",
+ "strip-ansi": "^6.0.0",
+ "wrap-ansi": "^7.0.0"
+ }
+ },
+ "node_modules/cliui/node_modules/emoji-regex": {
+ "version": "8.0.0",
+ "resolved": "https://registry.npmjs.org/emoji-regex/-/emoji-regex-8.0.0.tgz",
+ "integrity": "sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A=="
+ },
+ "node_modules/cliui/node_modules/string-width": {
+ "version": "4.2.3",
+ "resolved": "https://registry.npmjs.org/string-width/-/string-width-4.2.3.tgz",
+ "integrity": "sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==",
+ "dependencies": {
+ "emoji-regex": "^8.0.0",
+ "is-fullwidth-code-point": "^3.0.0",
+ "strip-ansi": "^6.0.1"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/cliui/node_modules/wrap-ansi": {
+ "version": "7.0.0",
+ "resolved": "https://registry.npmjs.org/wrap-ansi/-/wrap-ansi-7.0.0.tgz",
+ "integrity": "sha512-YVGIj2kamLSTxw6NsZjoBxfSwsn0ycdesmc4p+Q21c5zPuZ1pl+NfxVdxPtdHvmNVOQ6XSYG4AUtyt/Fi7D16Q==",
+ "dependencies": {
+ "ansi-styles": "^4.0.0",
+ "string-width": "^4.1.0",
+ "strip-ansi": "^6.0.0"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/chalk/wrap-ansi?sponsor=1"
+ }
+ },
+ "node_modules/clone-deep": {
+ "version": "4.0.1",
+ "resolved": "https://registry.npmjs.org/clone-deep/-/clone-deep-4.0.1.tgz",
+ "integrity": "sha512-neHB9xuzh/wk0dIHweyAXv2aPGZIVk3pLMe+/RNzINf17fe0OG96QroktYAUm7SM1PBnzTabaLboqqxDyMU+SQ==",
+ "dependencies": {
+ "is-plain-object": "^2.0.4",
+ "kind-of": "^6.0.2",
+ "shallow-clone": "^3.0.0"
+ },
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/clone-response": {
+ "version": "1.0.3",
+ "resolved": "https://registry.npmjs.org/clone-response/-/clone-response-1.0.3.tgz",
+ "integrity": "sha512-ROoL94jJH2dUVML2Y/5PEDNaSHgeOdSDicUyS7izcF63G6sTc/FTjLub4b8Il9S8S0beOfYt0TaA5qvFK+w0wA==",
+ "dependencies": {
+ "mimic-response": "^1.0.0"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/clsx": {
+ "version": "1.2.1",
+ "resolved": "https://registry.npmjs.org/clsx/-/clsx-1.2.1.tgz",
+ "integrity": "sha512-EcR6r5a8bj6pu3ycsa/E/cKVGuTgZJZdsyUYHOksG/UHIiKfjxzRxYJpyVBwYaQeOvghal9fcc4PidlgzugAQg==",
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/collapse-white-space": {
+ "version": "1.0.6",
+ "resolved": "https://registry.npmjs.org/collapse-white-space/-/collapse-white-space-1.0.6.tgz",
+ "integrity": "sha512-jEovNnrhMuqyCcjfEJA56v0Xq8SkIoPKDyaHahwo3POf4qcSXqMYuwNcOTzp74vTsR9Tn08z4MxWqAhcekogkQ==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/color-convert": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/color-convert/-/color-convert-2.0.1.tgz",
+ "integrity": "sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ==",
+ "dependencies": {
+ "color-name": "~1.1.4"
+ },
+ "engines": {
+ "node": ">=7.0.0"
+ }
+ },
+ "node_modules/color-name": {
+ "version": "1.1.4",
+ "resolved": "https://registry.npmjs.org/color-name/-/color-name-1.1.4.tgz",
+ "integrity": "sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA=="
+ },
+ "node_modules/colord": {
+ "version": "2.9.3",
+ "resolved": "https://registry.npmjs.org/colord/-/colord-2.9.3.tgz",
+ "integrity": "sha512-jeC1axXpnb0/2nn/Y1LPuLdgXBLH7aDcHu4KEKfqw3CUhX7ZpfBSlPKyqXE6btIgEzfWtrX3/tyBCaCvXvMkOw=="
+ },
+ "node_modules/colorette": {
+ "version": "2.0.19",
+ "resolved": "https://registry.npmjs.org/colorette/-/colorette-2.0.19.tgz",
+ "integrity": "sha512-3tlv/dIP7FWvj3BsbHrGLJ6l/oKh1O3TcgBqMn+yyCagOxc23fyzDS6HypQbgxWbkpDnf52p1LuR4eWDQ/K9WQ=="
+ },
+ "node_modules/combine-promises": {
+ "version": "1.1.0",
+ "resolved": "https://registry.npmjs.org/combine-promises/-/combine-promises-1.1.0.tgz",
+ "integrity": "sha512-ZI9jvcLDxqwaXEixOhArm3r7ReIivsXkpbyEWyeOhzz1QS0iSgBPnWvEqvIQtYyamGCYA88gFhmUrs9hrrQ0pg==",
+ "engines": {
+ "node": ">=10"
+ }
+ },
+ "node_modules/comma-separated-tokens": {
+ "version": "1.0.8",
+ "resolved": "https://registry.npmjs.org/comma-separated-tokens/-/comma-separated-tokens-1.0.8.tgz",
+ "integrity": "sha512-GHuDRO12Sypu2cV70d1dkA2EUmXHgntrzbpvOB+Qy+49ypNfGgFQIC2fhhXbnyrJRynDCAARsT7Ou0M6hirpfw==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/commander": {
+ "version": "5.1.0",
+ "resolved": "https://registry.npmjs.org/commander/-/commander-5.1.0.tgz",
+ "integrity": "sha512-P0CysNDQ7rtVw4QIQtm+MRxV66vKFSvlsQvGYXZWR3qFU0jlMKHZZZgw8e+8DSah4UDKMqnknRDQz+xuQXQ/Zg==",
+ "engines": {
+ "node": ">= 6"
+ }
+ },
+ "node_modules/commondir": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmjs.org/commondir/-/commondir-1.0.1.tgz",
+ "integrity": "sha512-W9pAhw0ja1Edb5GVdIF1mjZw/ASI0AlShXM83UUGe2DVr5TdAPEA1OA8m/g8zWp9x6On7gqufY+FatDbC3MDQg=="
+ },
+ "node_modules/compressible": {
+ "version": "2.0.18",
+ "resolved": "https://registry.npmjs.org/compressible/-/compressible-2.0.18.tgz",
+ "integrity": "sha512-AF3r7P5dWxL8MxyITRMlORQNaOA2IkAFaTr4k7BUumjPtRpGDTZpl0Pb1XCO6JeDCBdp126Cgs9sMxqSjgYyRg==",
+ "dependencies": {
+ "mime-db": ">= 1.43.0 < 2"
+ },
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/compressible/node_modules/mime-db": {
+ "version": "1.52.0",
+ "resolved": "https://registry.npmjs.org/mime-db/-/mime-db-1.52.0.tgz",
+ "integrity": "sha512-sPU4uV7dYlvtWJxwwxHD0PuihVNiE7TyAbQ5SWxDCB9mUYvOgroQOwYQQOKPJ8CIbE+1ETVlOoK1UC2nU3gYvg==",
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/compression": {
+ "version": "1.8.1",
+ "resolved": "https://registry.npmjs.org/compression/-/compression-1.8.1.tgz",
+ "integrity": "sha512-9mAqGPHLakhCLeNyxPkK4xVo746zQ/czLH1Ky+vkitMnWfWZps8r0qXuwhwizagCRttsL4lfG4pIOvaWLpAP0w==",
+ "license": "MIT",
+ "dependencies": {
+ "bytes": "3.1.2",
+ "compressible": "~2.0.18",
+ "debug": "2.6.9",
+ "negotiator": "~0.6.4",
+ "on-headers": "~1.1.0",
+ "safe-buffer": "5.2.1",
+ "vary": "~1.1.2"
+ },
+ "engines": {
+ "node": ">= 0.8.0"
+ }
+ },
+ "node_modules/compression/node_modules/bytes": {
+ "version": "3.1.2",
+ "resolved": "https://registry.npmjs.org/bytes/-/bytes-3.1.2.tgz",
+ "integrity": "sha512-/Nf7TyzTx6S3yRJObOAV7956r8cr2+Oj8AC5dt8wSP3BQAoeX58NoHyCU8P8zGkNXStjTSi6fzO6F0pBdcYbEg==",
+ "license": "MIT",
+ "engines": {
+ "node": ">= 0.8"
+ }
+ },
+ "node_modules/compression/node_modules/debug": {
+ "version": "2.6.9",
+ "resolved": "https://registry.npmjs.org/debug/-/debug-2.6.9.tgz",
+ "integrity": "sha512-bC7ElrdJaJnPbAP+1EotYvqZsb3ecl5wi6Bfi6BJTUcNowp6cvspg0jXznRTKDjm/E7AdgFBVeAPVMNcKGsHMA==",
+ "dependencies": {
+ "ms": "2.0.0"
+ }
+ },
+ "node_modules/compression/node_modules/ms": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/ms/-/ms-2.0.0.tgz",
+ "integrity": "sha512-Tpp60P6IUJDTuOq/5Z8cdskzJujfwqfOTkrwIwj7IRISpnkJnT6SyJ4PCPnGMoFjC9ddhal5KVIYtAt97ix05A=="
+ },
+ "node_modules/compression/node_modules/negotiator": {
+ "version": "0.6.4",
+ "resolved": "https://registry.npmjs.org/negotiator/-/negotiator-0.6.4.tgz",
+ "integrity": "sha512-myRT3DiWPHqho5PrJaIRyaMv2kgYf0mUVgBNOYMuCH5Ki1yEiQaf/ZJuQ62nvpc44wL5WDbTX7yGJi1Neevw8w==",
+ "license": "MIT",
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/concat-map": {
+ "version": "0.0.1",
+ "resolved": "https://registry.npmjs.org/concat-map/-/concat-map-0.0.1.tgz",
+ "integrity": "sha512-/Srv4dswyQNBfohGpz9o6Yb3Gz3SrUDqBH5rTuhGR7ahtlbYKnVxw2bCFMRljaA7EXHaXZ8wsHdodFvbkhKmqg=="
+ },
+ "node_modules/configstore": {
+ "version": "5.0.1",
+ "resolved": "https://registry.npmjs.org/configstore/-/configstore-5.0.1.tgz",
+ "integrity": "sha512-aMKprgk5YhBNyH25hj8wGt2+D52Sw1DRRIzqBwLp2Ya9mFmY8KPvvtvmna8SxVR9JMZ4kzMD68N22vlaRpkeFA==",
+ "dependencies": {
+ "dot-prop": "^5.2.0",
+ "graceful-fs": "^4.1.2",
+ "make-dir": "^3.0.0",
+ "unique-string": "^2.0.0",
+ "write-file-atomic": "^3.0.0",
+ "xdg-basedir": "^4.0.0"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/connect-history-api-fallback": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/connect-history-api-fallback/-/connect-history-api-fallback-2.0.0.tgz",
+ "integrity": "sha512-U73+6lQFmfiNPrYbXqr6kZ1i1wiRqXnp2nhMsINseWXO8lDau0LGEffJ8kQi4EjLZympVgRdvqjAgiZ1tgzDDA==",
+ "engines": {
+ "node": ">=0.8"
+ }
+ },
+ "node_modules/consola": {
+ "version": "2.15.3",
+ "resolved": "https://registry.npmjs.org/consola/-/consola-2.15.3.tgz",
+ "integrity": "sha512-9vAdYbHj6x2fLKC4+oPH0kFzY/orMZyG2Aj+kNylHxKGJ/Ed4dpNyAQYwJOdqO4zdM7XpVHmyejQDcQHrnuXbw=="
+ },
+ "node_modules/content-disposition": {
+ "version": "0.5.2",
+ "resolved": "https://registry.npmjs.org/content-disposition/-/content-disposition-0.5.2.tgz",
+ "integrity": "sha512-kRGRZw3bLlFISDBgwTSA1TMBFN6J6GWDeubmDE3AF+3+yXL8hTWv8r5rkLbqYXY4RjPk/EzHnClI3zQf1cFmHA==",
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/content-type": {
+ "version": "1.0.5",
+ "resolved": "https://registry.npmjs.org/content-type/-/content-type-1.0.5.tgz",
+ "integrity": "sha512-nTjqfcBFEipKdXCv4YDQWCfmcLZKm81ldF0pAopTvyrFGVbcR6P/VAAd5G7N+0tTr8QqiU0tFadD6FK4NtJwOA==",
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/convert-source-map": {
+ "version": "1.9.0",
+ "resolved": "https://registry.npmjs.org/convert-source-map/-/convert-source-map-1.9.0.tgz",
+ "integrity": "sha512-ASFBup0Mz1uyiIjANan1jzLQami9z1PoYSZCiiYW2FczPbenXc45FZdBZLzOT+r6+iciuEModtmCti+hjaAk0A=="
+ },
+ "node_modules/cookie": {
+ "version": "0.7.1",
+ "resolved": "https://registry.npmjs.org/cookie/-/cookie-0.7.1.tgz",
+ "integrity": "sha512-6DnInpx7SJ2AK3+CTUE/ZM0vWTUboZCegxhC2xiIydHR9jNuTAASBrfEpHhiGOZw/nX51bHt6YQl8jsGo4y/0w==",
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/cookie-signature": {
+ "version": "1.0.6",
+ "resolved": "https://registry.npmjs.org/cookie-signature/-/cookie-signature-1.0.6.tgz",
+ "integrity": "sha512-QADzlaHc8icV8I7vbaJXJwod9HWYp8uCqf1xa4OfNu1T7JVxQIrUgOWtHdNDtPiywmFbiS12VjotIXLrKM3orQ=="
+ },
+ "node_modules/copy-text-to-clipboard": {
+ "version": "3.1.0",
+ "resolved": "https://registry.npmjs.org/copy-text-to-clipboard/-/copy-text-to-clipboard-3.1.0.tgz",
+ "integrity": "sha512-PFM6BnjLnOON/lB3ta/Jg7Ywsv+l9kQGD4TWDCSlRBGmqnnTM5MrDkhAFgw+8HZt0wW6Q2BBE4cmy9sq+s9Qng==",
+ "engines": {
+ "node": ">=12"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/copy-webpack-plugin": {
+ "version": "11.0.0",
+ "resolved": "https://registry.npmjs.org/copy-webpack-plugin/-/copy-webpack-plugin-11.0.0.tgz",
+ "integrity": "sha512-fX2MWpamkW0hZxMEg0+mYnA40LTosOSa5TqZ9GYIBzyJa9C3QUaMPSE2xAi/buNr8u89SfD9wHSQVBzrRa/SOQ==",
+ "dependencies": {
+ "fast-glob": "^3.2.11",
+ "glob-parent": "^6.0.1",
+ "globby": "^13.1.1",
+ "normalize-path": "^3.0.0",
+ "schema-utils": "^4.0.0",
+ "serialize-javascript": "^6.0.0"
+ },
+ "engines": {
+ "node": ">= 14.15.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/webpack"
+ },
+ "peerDependencies": {
+ "webpack": "^5.1.0"
+ }
+ },
+ "node_modules/copy-webpack-plugin/node_modules/ajv": {
+ "version": "8.12.0",
+ "resolved": "https://registry.npmjs.org/ajv/-/ajv-8.12.0.tgz",
+ "integrity": "sha512-sRu1kpcO9yLtYxBKvqfTeh9KzZEwO3STyX1HT+4CaDzC6HpTGYhIhPIzj9XuKU7KYDwnaeh5hcOwjy1QuJzBPA==",
+ "dependencies": {
+ "fast-deep-equal": "^3.1.1",
+ "json-schema-traverse": "^1.0.0",
+ "require-from-string": "^2.0.2",
+ "uri-js": "^4.2.2"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/epoberezkin"
+ }
+ },
+ "node_modules/copy-webpack-plugin/node_modules/ajv-keywords": {
+ "version": "5.1.0",
+ "resolved": "https://registry.npmjs.org/ajv-keywords/-/ajv-keywords-5.1.0.tgz",
+ "integrity": "sha512-YCS/JNFAUyr5vAuhk1DWm1CBxRHW9LbJ2ozWeemrIqpbsqKjHVxYPyi5GC0rjZIT5JxJ3virVTS8wk4i/Z+krw==",
+ "dependencies": {
+ "fast-deep-equal": "^3.1.3"
+ },
+ "peerDependencies": {
+ "ajv": "^8.8.2"
+ }
+ },
+ "node_modules/copy-webpack-plugin/node_modules/glob-parent": {
+ "version": "6.0.2",
+ "resolved": "https://registry.npmjs.org/glob-parent/-/glob-parent-6.0.2.tgz",
+ "integrity": "sha512-XxwI8EOhVQgWp6iDL+3b0r86f4d6AX6zSU55HfB4ydCEuXLXc5FcYeOu+nnGftS4TEju/11rt4KJPTMgbfmv4A==",
+ "dependencies": {
+ "is-glob": "^4.0.3"
+ },
+ "engines": {
+ "node": ">=10.13.0"
+ }
+ },
+ "node_modules/copy-webpack-plugin/node_modules/globby": {
+ "version": "13.1.3",
+ "resolved": "https://registry.npmjs.org/globby/-/globby-13.1.3.tgz",
+ "integrity": "sha512-8krCNHXvlCgHDpegPzleMq07yMYTO2sXKASmZmquEYWEmCx6J5UTRbp5RwMJkTJGtcQ44YpiUYUiN0b9mzy8Bw==",
+ "dependencies": {
+ "dir-glob": "^3.0.1",
+ "fast-glob": "^3.2.11",
+ "ignore": "^5.2.0",
+ "merge2": "^1.4.1",
+ "slash": "^4.0.0"
+ },
+ "engines": {
+ "node": "^12.20.0 || ^14.13.1 || >=16.0.0"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/copy-webpack-plugin/node_modules/json-schema-traverse": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmjs.org/json-schema-traverse/-/json-schema-traverse-1.0.0.tgz",
+ "integrity": "sha512-NM8/P9n3XjXhIZn1lLhkFaACTOURQXjWhV4BA/RnOv8xvgqtqpAX9IO4mRQxSx1Rlo4tqzeqb0sOlruaOy3dug=="
+ },
+ "node_modules/copy-webpack-plugin/node_modules/schema-utils": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/schema-utils/-/schema-utils-4.0.0.tgz",
+ "integrity": "sha512-1edyXKgh6XnJsJSQ8mKWXnN/BVaIbFMLpouRUrXgVq7WYne5kw3MW7UPhO44uRXQSIpTSXoJbmrR2X0w9kUTyg==",
+ "dependencies": {
+ "@types/json-schema": "^7.0.9",
+ "ajv": "^8.8.0",
+ "ajv-formats": "^2.1.1",
+ "ajv-keywords": "^5.0.0"
+ },
+ "engines": {
+ "node": ">= 12.13.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/webpack"
+ }
+ },
+ "node_modules/copy-webpack-plugin/node_modules/slash": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/slash/-/slash-4.0.0.tgz",
+ "integrity": "sha512-3dOsAHXXUkQTpOYcoAxLIorMTp4gIQr5IW3iVb7A7lFIp0VHhnynm9izx6TssdrIcVIESAlVjtnO2K8bg+Coew==",
+ "engines": {
+ "node": ">=12"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/copyfiles": {
+ "version": "2.4.1",
+ "resolved": "https://registry.npmjs.org/copyfiles/-/copyfiles-2.4.1.tgz",
+ "integrity": "sha512-fereAvAvxDrQDOXybk3Qu3dPbOoKoysFMWtkY3mv5BsL8//OSZVL5DCLYqgRfY5cWirgRzlC+WSrxp6Bo3eNZg==",
+ "dependencies": {
+ "glob": "^7.0.5",
+ "minimatch": "^3.0.3",
+ "mkdirp": "^1.0.4",
+ "noms": "0.0.0",
+ "through2": "^2.0.1",
+ "untildify": "^4.0.0",
+ "yargs": "^16.1.0"
+ },
+ "bin": {
+ "copyfiles": "copyfiles",
+ "copyup": "copyfiles"
+ }
+ },
+ "node_modules/core-js": {
+ "version": "3.29.1",
+ "resolved": "https://registry.npmjs.org/core-js/-/core-js-3.29.1.tgz",
+ "integrity": "sha512-+jwgnhg6cQxKYIIjGtAHq2nwUOolo9eoFZ4sHfUH09BLXBgxnH4gA0zEd+t+BO2cNB8idaBtZFcFTRjQJRJmAw==",
+ "hasInstallScript": true,
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/core-js"
+ }
+ },
+ "node_modules/core-js-compat": {
+ "version": "3.29.1",
+ "resolved": "https://registry.npmjs.org/core-js-compat/-/core-js-compat-3.29.1.tgz",
+ "integrity": "sha512-QmchCua884D8wWskMX8tW5ydINzd8oSJVx38lx/pVkFGqztxt73GYre3pm/hyYq8bPf+MW5In4I/uRShFDsbrA==",
+ "dependencies": {
+ "browserslist": "^4.21.5"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/core-js"
+ }
+ },
+ "node_modules/core-js-pure": {
+ "version": "3.41.0",
+ "resolved": "https://registry.npmjs.org/core-js-pure/-/core-js-pure-3.41.0.tgz",
+ "integrity": "sha512-71Gzp96T9YPk63aUvE5Q5qP+DryB4ZloUZPSOebGM88VNw8VNfvdA7z6kGA8iGOTEzAomsRidp4jXSmUIJsL+Q==",
+ "hasInstallScript": true,
+ "license": "MIT",
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/core-js"
+ }
+ },
+ "node_modules/core-util-is": {
+ "version": "1.0.3",
+ "resolved": "https://registry.npmjs.org/core-util-is/-/core-util-is-1.0.3.tgz",
+ "integrity": "sha512-ZQBvi1DcpJ4GDqanjucZ2Hj3wEO5pZDS89BWbkcrvdxksJorwUDDZamX9ldFkp9aw2lmBDLgkObEA4DWNJ9FYQ=="
+ },
+ "node_modules/cosmiconfig": {
+ "version": "7.1.0",
+ "resolved": "https://registry.npmjs.org/cosmiconfig/-/cosmiconfig-7.1.0.tgz",
+ "integrity": "sha512-AdmX6xUzdNASswsFtmwSt7Vj8po9IuqXm0UXz7QKPuEUmPB4XyjGfaAr2PSuELMwkRMVH1EpIkX5bTZGRB3eCA==",
+ "dependencies": {
+ "@types/parse-json": "^4.0.0",
+ "import-fresh": "^3.2.1",
+ "parse-json": "^5.0.0",
+ "path-type": "^4.0.0",
+ "yaml": "^1.10.0"
+ },
+ "engines": {
+ "node": ">=10"
+ }
+ },
+ "node_modules/cross-fetch": {
+ "version": "3.1.5",
+ "resolved": "https://registry.npmjs.org/cross-fetch/-/cross-fetch-3.1.5.tgz",
+ "integrity": "sha512-lvb1SBsI0Z7GDwmuid+mU3kWVBwTVUbe7S0H52yaaAdQOXq2YktTCZdlAcNKFzE6QtRz0snpw9bNiPeOIkkQvw==",
+ "dependencies": {
+ "node-fetch": "2.6.7"
+ }
+ },
+ "node_modules/cross-spawn": {
+ "version": "7.0.6",
+ "resolved": "https://registry.npmjs.org/cross-spawn/-/cross-spawn-7.0.6.tgz",
+ "integrity": "sha512-uV2QOWP2nWzsy2aMp8aRibhi9dlzF5Hgh5SHaB9OiTGEyDTiJJyx0uy51QXdyWbtAHNua4XJzUKca3OzKUd3vA==",
+ "dependencies": {
+ "path-key": "^3.1.0",
+ "shebang-command": "^2.0.0",
+ "which": "^2.0.1"
+ },
+ "engines": {
+ "node": ">= 8"
+ }
+ },
+ "node_modules/crypto-random-string": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/crypto-random-string/-/crypto-random-string-2.0.0.tgz",
+ "integrity": "sha512-v1plID3y9r/lPhviJ1wrXpLeyUIGAZ2SHNYTEapm7/8A9nLPoyvVp3RK/EPFqn5kEznyWgYZNsRtYYIWbuG8KA==",
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/css-color-keywords": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmjs.org/css-color-keywords/-/css-color-keywords-1.0.0.tgz",
+ "integrity": "sha512-FyyrDHZKEjXDpNJYvVsV960FiqQyXc/LlYmsxl2BcdMb2WPx0OGRVgTg55rPSyLSNMqP52R9r8geSp7apN3Ofg==",
+ "engines": {
+ "node": ">=4"
+ }
+ },
+ "node_modules/css-declaration-sorter": {
+ "version": "6.4.0",
+ "resolved": "https://registry.npmjs.org/css-declaration-sorter/-/css-declaration-sorter-6.4.0.tgz",
+ "integrity": "sha512-jDfsatwWMWN0MODAFuHszfjphEXfNw9JUAhmY4pLu3TyTU+ohUpsbVtbU+1MZn4a47D9kqh03i4eyOm+74+zew==",
+ "engines": {
+ "node": "^10 || ^12 || >=14"
+ },
+ "peerDependencies": {
+ "postcss": "^8.0.9"
+ }
+ },
+ "node_modules/css-loader": {
+ "version": "6.7.3",
+ "resolved": "https://registry.npmjs.org/css-loader/-/css-loader-6.7.3.tgz",
+ "integrity": "sha512-qhOH1KlBMnZP8FzRO6YCH9UHXQhVMcEGLyNdb7Hv2cpcmJbW0YrddO+tG1ab5nT41KpHIYGsbeHqxB9xPu1pKQ==",
+ "dependencies": {
+ "icss-utils": "^5.1.0",
+ "postcss": "^8.4.19",
+ "postcss-modules-extract-imports": "^3.0.0",
+ "postcss-modules-local-by-default": "^4.0.0",
+ "postcss-modules-scope": "^3.0.0",
+ "postcss-modules-values": "^4.0.0",
+ "postcss-value-parser": "^4.2.0",
+ "semver": "^7.3.8"
+ },
+ "engines": {
+ "node": ">= 12.13.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/webpack"
+ },
+ "peerDependencies": {
+ "webpack": "^5.0.0"
+ }
+ },
+ "node_modules/css-minimizer-webpack-plugin": {
+ "version": "4.2.2",
+ "resolved": "https://registry.npmjs.org/css-minimizer-webpack-plugin/-/css-minimizer-webpack-plugin-4.2.2.tgz",
+ "integrity": "sha512-s3Of/4jKfw1Hj9CxEO1E5oXhQAxlayuHO2y/ML+C6I9sQ7FdzfEV6QgMLN3vI+qFsjJGIAFLKtQK7t8BOXAIyA==",
+ "dependencies": {
+ "cssnano": "^5.1.8",
+ "jest-worker": "^29.1.2",
+ "postcss": "^8.4.17",
+ "schema-utils": "^4.0.0",
+ "serialize-javascript": "^6.0.0",
+ "source-map": "^0.6.1"
+ },
+ "engines": {
+ "node": ">= 14.15.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/webpack"
+ },
+ "peerDependencies": {
+ "webpack": "^5.0.0"
+ },
+ "peerDependenciesMeta": {
+ "@parcel/css": {
+ "optional": true
+ },
+ "@swc/css": {
+ "optional": true
+ },
+ "clean-css": {
+ "optional": true
+ },
+ "csso": {
+ "optional": true
+ },
+ "esbuild": {
+ "optional": true
+ },
+ "lightningcss": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/css-minimizer-webpack-plugin/node_modules/ajv": {
+ "version": "8.12.0",
+ "resolved": "https://registry.npmjs.org/ajv/-/ajv-8.12.0.tgz",
+ "integrity": "sha512-sRu1kpcO9yLtYxBKvqfTeh9KzZEwO3STyX1HT+4CaDzC6HpTGYhIhPIzj9XuKU7KYDwnaeh5hcOwjy1QuJzBPA==",
+ "dependencies": {
+ "fast-deep-equal": "^3.1.1",
+ "json-schema-traverse": "^1.0.0",
+ "require-from-string": "^2.0.2",
+ "uri-js": "^4.2.2"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/epoberezkin"
+ }
+ },
+ "node_modules/css-minimizer-webpack-plugin/node_modules/ajv-keywords": {
+ "version": "5.1.0",
+ "resolved": "https://registry.npmjs.org/ajv-keywords/-/ajv-keywords-5.1.0.tgz",
+ "integrity": "sha512-YCS/JNFAUyr5vAuhk1DWm1CBxRHW9LbJ2ozWeemrIqpbsqKjHVxYPyi5GC0rjZIT5JxJ3virVTS8wk4i/Z+krw==",
+ "dependencies": {
+ "fast-deep-equal": "^3.1.3"
+ },
+ "peerDependencies": {
+ "ajv": "^8.8.2"
+ }
+ },
+ "node_modules/css-minimizer-webpack-plugin/node_modules/json-schema-traverse": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmjs.org/json-schema-traverse/-/json-schema-traverse-1.0.0.tgz",
+ "integrity": "sha512-NM8/P9n3XjXhIZn1lLhkFaACTOURQXjWhV4BA/RnOv8xvgqtqpAX9IO4mRQxSx1Rlo4tqzeqb0sOlruaOy3dug=="
+ },
+ "node_modules/css-minimizer-webpack-plugin/node_modules/schema-utils": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/schema-utils/-/schema-utils-4.0.0.tgz",
+ "integrity": "sha512-1edyXKgh6XnJsJSQ8mKWXnN/BVaIbFMLpouRUrXgVq7WYne5kw3MW7UPhO44uRXQSIpTSXoJbmrR2X0w9kUTyg==",
+ "dependencies": {
+ "@types/json-schema": "^7.0.9",
+ "ajv": "^8.8.0",
+ "ajv-formats": "^2.1.1",
+ "ajv-keywords": "^5.0.0"
+ },
+ "engines": {
+ "node": ">= 12.13.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/webpack"
+ }
+ },
+ "node_modules/css-select": {
+ "version": "5.1.0",
+ "resolved": "https://registry.npmjs.org/css-select/-/css-select-5.1.0.tgz",
+ "integrity": "sha512-nwoRF1rvRRnnCqqY7updORDsuqKzqYJ28+oSMaJMMgOauh3fvwHqMS7EZpIPqK8GL+g9mKxF1vP/ZjSeNjEVHg==",
+ "dependencies": {
+ "boolbase": "^1.0.0",
+ "css-what": "^6.1.0",
+ "domhandler": "^5.0.2",
+ "domutils": "^3.0.1",
+ "nth-check": "^2.0.1"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/fb55"
+ }
+ },
+ "node_modules/css-to-react-native": {
+ "version": "3.2.0",
+ "resolved": "https://registry.npmjs.org/css-to-react-native/-/css-to-react-native-3.2.0.tgz",
+ "integrity": "sha512-e8RKaLXMOFii+02mOlqwjbD00KSEKqblnpO9e++1aXS1fPQOpS1YoqdVHBqPjHNoxeF2mimzVqawm2KCbEdtHQ==",
+ "dependencies": {
+ "camelize": "^1.0.0",
+ "css-color-keywords": "^1.0.0",
+ "postcss-value-parser": "^4.0.2"
+ }
+ },
+ "node_modules/css-tree": {
+ "version": "1.1.3",
+ "resolved": "https://registry.npmjs.org/css-tree/-/css-tree-1.1.3.tgz",
+ "integrity": "sha512-tRpdppF7TRazZrjJ6v3stzv93qxRcSsFmW6cX0Zm2NVKpxE1WV1HblnghVv9TreireHkqI/VDEsfolRF1p6y7Q==",
+ "dependencies": {
+ "mdn-data": "2.0.14",
+ "source-map": "^0.6.1"
+ },
+ "engines": {
+ "node": ">=8.0.0"
+ }
+ },
+ "node_modules/css-what": {
+ "version": "6.1.0",
+ "resolved": "https://registry.npmjs.org/css-what/-/css-what-6.1.0.tgz",
+ "integrity": "sha512-HTUrgRJ7r4dsZKU6GjmpfRK1O76h97Z8MfS1G0FozR+oF2kG6Vfe8JE6zwrkbxigziPHinCJ+gCPjA9EaBDtRw==",
+ "engines": {
+ "node": ">= 6"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/fb55"
+ }
+ },
+ "node_modules/cssesc": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/cssesc/-/cssesc-3.0.0.tgz",
+ "integrity": "sha512-/Tb/JcjK111nNScGob5MNtsntNM1aCNUDipB/TkwZFhyDrrE47SOx/18wF2bbjgc3ZzCSKW1T5nt5EbFoAz/Vg==",
+ "bin": {
+ "cssesc": "bin/cssesc"
+ },
+ "engines": {
+ "node": ">=4"
+ }
+ },
+ "node_modules/cssnano": {
+ "version": "5.1.15",
+ "resolved": "https://registry.npmjs.org/cssnano/-/cssnano-5.1.15.tgz",
+ "integrity": "sha512-j+BKgDcLDQA+eDifLx0EO4XSA56b7uut3BQFH+wbSaSTuGLuiyTa/wbRYthUXX8LC9mLg+WWKe8h+qJuwTAbHw==",
+ "dependencies": {
+ "cssnano-preset-default": "^5.2.14",
+ "lilconfig": "^2.0.3",
+ "yaml": "^1.10.2"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/cssnano"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/cssnano-preset-advanced": {
+ "version": "5.3.10",
+ "resolved": "https://registry.npmjs.org/cssnano-preset-advanced/-/cssnano-preset-advanced-5.3.10.tgz",
+ "integrity": "sha512-fnYJyCS9jgMU+cmHO1rPSPf9axbQyD7iUhLO5Df6O4G+fKIOMps+ZbU0PdGFejFBBZ3Pftf18fn1eG7MAPUSWQ==",
+ "dependencies": {
+ "autoprefixer": "^10.4.12",
+ "cssnano-preset-default": "^5.2.14",
+ "postcss-discard-unused": "^5.1.0",
+ "postcss-merge-idents": "^5.1.1",
+ "postcss-reduce-idents": "^5.2.0",
+ "postcss-zindex": "^5.1.0"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/cssnano-preset-default": {
+ "version": "5.2.14",
+ "resolved": "https://registry.npmjs.org/cssnano-preset-default/-/cssnano-preset-default-5.2.14.tgz",
+ "integrity": "sha512-t0SFesj/ZV2OTylqQVOrFgEh5uanxbO6ZAdeCrNsUQ6fVuXwYTxJPNAGvGTxHbD68ldIJNec7PyYZDBrfDQ+6A==",
+ "dependencies": {
+ "css-declaration-sorter": "^6.3.1",
+ "cssnano-utils": "^3.1.0",
+ "postcss-calc": "^8.2.3",
+ "postcss-colormin": "^5.3.1",
+ "postcss-convert-values": "^5.1.3",
+ "postcss-discard-comments": "^5.1.2",
+ "postcss-discard-duplicates": "^5.1.0",
+ "postcss-discard-empty": "^5.1.1",
+ "postcss-discard-overridden": "^5.1.0",
+ "postcss-merge-longhand": "^5.1.7",
+ "postcss-merge-rules": "^5.1.4",
+ "postcss-minify-font-values": "^5.1.0",
+ "postcss-minify-gradients": "^5.1.1",
+ "postcss-minify-params": "^5.1.4",
+ "postcss-minify-selectors": "^5.2.1",
+ "postcss-normalize-charset": "^5.1.0",
+ "postcss-normalize-display-values": "^5.1.0",
+ "postcss-normalize-positions": "^5.1.1",
+ "postcss-normalize-repeat-style": "^5.1.1",
+ "postcss-normalize-string": "^5.1.0",
+ "postcss-normalize-timing-functions": "^5.1.0",
+ "postcss-normalize-unicode": "^5.1.1",
+ "postcss-normalize-url": "^5.1.0",
+ "postcss-normalize-whitespace": "^5.1.1",
+ "postcss-ordered-values": "^5.1.3",
+ "postcss-reduce-initial": "^5.1.2",
+ "postcss-reduce-transforms": "^5.1.0",
+ "postcss-svgo": "^5.1.0",
+ "postcss-unique-selectors": "^5.1.1"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/cssnano-utils": {
+ "version": "3.1.0",
+ "resolved": "https://registry.npmjs.org/cssnano-utils/-/cssnano-utils-3.1.0.tgz",
+ "integrity": "sha512-JQNR19/YZhz4psLX/rQ9M83e3z2Wf/HdJbryzte4a3NSuafyp9w/I4U+hx5C2S9g41qlstH7DEWnZaaj83OuEA==",
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/csso": {
+ "version": "4.2.0",
+ "resolved": "https://registry.npmjs.org/csso/-/csso-4.2.0.tgz",
+ "integrity": "sha512-wvlcdIbf6pwKEk7vHj8/Bkc0B4ylXZruLvOgs9doS5eOsOpuodOV2zJChSpkp+pRpYQLQMeF04nr3Z68Sta9jA==",
+ "dependencies": {
+ "css-tree": "^1.1.2"
+ },
+ "engines": {
+ "node": ">=8.0.0"
+ }
+ },
+ "node_modules/csstype": {
+ "version": "3.1.1",
+ "resolved": "https://registry.npmjs.org/csstype/-/csstype-3.1.1.tgz",
+ "integrity": "sha512-DJR/VvkAvSZW9bTouZue2sSxDwdTN92uHjqeKVm+0dAqdfNykRzQ95tay8aXMBAAPpUiq4Qcug2L7neoRh2Egw=="
+ },
+ "node_modules/debug": {
+ "version": "4.3.4",
+ "resolved": "https://registry.npmjs.org/debug/-/debug-4.3.4.tgz",
+ "integrity": "sha512-PRWFHuSU3eDtQJPvnNY7Jcket1j0t5OuOsFzPPzsekD52Zl8qUfFIPEiswXqIvHWGVHOgX+7G/vCNNhehwxfkQ==",
+ "dependencies": {
+ "ms": "2.1.2"
+ },
+ "engines": {
+ "node": ">=6.0"
+ },
+ "peerDependenciesMeta": {
+ "supports-color": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/decko": {
+ "version": "1.2.0",
+ "resolved": "https://registry.npmjs.org/decko/-/decko-1.2.0.tgz",
+ "integrity": "sha512-m8FnyHXV1QX+S1cl+KPFDIl6NMkxtKsy6+U/aYyjrOqWMuwAwYWu7ePqrsUHtDR5Y8Yk2pi/KIDSgF+vT4cPOQ=="
+ },
+ "node_modules/decompress-response": {
+ "version": "3.3.0",
+ "resolved": "https://registry.npmjs.org/decompress-response/-/decompress-response-3.3.0.tgz",
+ "integrity": "sha512-BzRPQuY1ip+qDonAOz42gRm/pg9F768C+npV/4JOsxRC2sq+Rlk+Q4ZCAsOhnIaMrgarILY+RMUIvMmmX1qAEA==",
+ "dependencies": {
+ "mimic-response": "^1.0.0"
+ },
+ "engines": {
+ "node": ">=4"
+ }
+ },
+ "node_modules/deep-extend": {
+ "version": "0.6.0",
+ "resolved": "https://registry.npmjs.org/deep-extend/-/deep-extend-0.6.0.tgz",
+ "integrity": "sha512-LOHxIOaPYdHlJRtCQfDIVZtfw/ufM8+rVj649RIHzcm/vGwQRXFt6OPqIFWsm2XEMrNIEtWR64sY1LEKD2vAOA==",
+ "engines": {
+ "node": ">=4.0.0"
+ }
+ },
+ "node_modules/deepmerge": {
+ "version": "4.3.1",
+ "resolved": "https://registry.npmjs.org/deepmerge/-/deepmerge-4.3.1.tgz",
+ "integrity": "sha512-3sUqbMEc77XqpdNO7FRyRog+eW3ph+GYCbj+rK+uYyRMuwsVy0rMiVtPn+QJlKFvWP/1PYpapqYn0Me2knFn+A==",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/default-gateway": {
+ "version": "6.0.3",
+ "resolved": "https://registry.npmjs.org/default-gateway/-/default-gateway-6.0.3.tgz",
+ "integrity": "sha512-fwSOJsbbNzZ/CUFpqFBqYfYNLj1NbMPm8MMCIzHjC83iSJRBEGmDUxU+WP661BaBQImeC2yHwXtz+P/O9o+XEg==",
+ "dependencies": {
+ "execa": "^5.0.0"
+ },
+ "engines": {
+ "node": ">= 10"
+ }
+ },
+ "node_modules/defer-to-connect": {
+ "version": "1.1.3",
+ "resolved": "https://registry.npmjs.org/defer-to-connect/-/defer-to-connect-1.1.3.tgz",
+ "integrity": "sha512-0ISdNousHvZT2EiFlZeZAHBUvSxmKswVCEf8hW7KWgG4a8MVEu/3Vb6uWYozkjylyCxe0JBIiRB1jV45S70WVQ=="
+ },
+ "node_modules/define-data-property": {
+ "version": "1.1.4",
+ "resolved": "https://registry.npmjs.org/define-data-property/-/define-data-property-1.1.4.tgz",
+ "integrity": "sha512-rBMvIzlpA8v6E+SJZoo++HAYqsLrkg7MSfIinMPFhmkorw7X+dOXVJQs+QT69zGkzMyfDnIMN2Wid1+NbL3T+A==",
+ "dependencies": {
+ "es-define-property": "^1.0.0",
+ "es-errors": "^1.3.0",
+ "gopd": "^1.0.1"
+ },
+ "engines": {
+ "node": ">= 0.4"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/ljharb"
+ }
+ },
+ "node_modules/define-lazy-prop": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/define-lazy-prop/-/define-lazy-prop-2.0.0.tgz",
+ "integrity": "sha512-Ds09qNh8yw3khSjiJjiUInaGX9xlqZDY7JVryGxdxV7NPeuqQfplOpQ66yJFZut3jLa5zOwkXw1g9EI2uKh4Og==",
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/define-properties": {
+ "version": "1.2.0",
+ "resolved": "https://registry.npmjs.org/define-properties/-/define-properties-1.2.0.tgz",
+ "integrity": "sha512-xvqAVKGfT1+UAvPwKTVw/njhdQ8ZhXK4lI0bCIuCMrp2up9nPnaDftrLtmpTazqd1o+UY4zgzU+avtMbDP+ldA==",
+ "dependencies": {
+ "has-property-descriptors": "^1.0.0",
+ "object-keys": "^1.1.1"
+ },
+ "engines": {
+ "node": ">= 0.4"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/ljharb"
+ }
+ },
+ "node_modules/del": {
+ "version": "6.1.1",
+ "resolved": "https://registry.npmjs.org/del/-/del-6.1.1.tgz",
+ "integrity": "sha512-ua8BhapfP0JUJKC/zV9yHHDW/rDoDxP4Zhn3AkA6/xT6gY7jYXJiaeyBZznYVujhZZET+UgcbZiQ7sN3WqcImg==",
+ "dependencies": {
+ "globby": "^11.0.1",
+ "graceful-fs": "^4.2.4",
+ "is-glob": "^4.0.1",
+ "is-path-cwd": "^2.2.0",
+ "is-path-inside": "^3.0.2",
+ "p-map": "^4.0.0",
+ "rimraf": "^3.0.2",
+ "slash": "^3.0.0"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/depd": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/depd/-/depd-2.0.0.tgz",
+ "integrity": "sha512-g7nH6P6dyDioJogAAGprGpCtVImJhpPk/roCzdb3fIh61/s/nPsfR6onyMwkCAR/OlC3yBC0lESvUoQEAssIrw==",
+ "engines": {
+ "node": ">= 0.8"
+ }
+ },
+ "node_modules/destroy": {
+ "version": "1.2.0",
+ "resolved": "https://registry.npmjs.org/destroy/-/destroy-1.2.0.tgz",
+ "integrity": "sha512-2sJGJTaXIIaR1w4iJSNoN0hnMY7Gpc/n8D4qSCJw8QqFWXf7cuAgnEHxBpweaVcPevC2l3KpjYCx3NypQQgaJg==",
+ "engines": {
+ "node": ">= 0.8",
+ "npm": "1.2.8000 || >= 1.4.16"
+ }
+ },
+ "node_modules/detab": {
+ "version": "2.0.4",
+ "resolved": "https://registry.npmjs.org/detab/-/detab-2.0.4.tgz",
+ "integrity": "sha512-8zdsQA5bIkoRECvCrNKPla84lyoR7DSAyf7p0YgXzBO9PDJx8KntPUay7NS6yp+KdxdVtiE5SpHKtbp2ZQyA9g==",
+ "dependencies": {
+ "repeat-string": "^1.5.4"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/detect-node": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmjs.org/detect-node/-/detect-node-2.1.0.tgz",
+ "integrity": "sha512-T0NIuQpnTvFDATNuHN5roPwSBG83rFsuO+MXXH9/3N1eFbn4wcPjttvjMLEPWJ0RGUYgQE7cGgS3tNxbqCGM7g=="
+ },
+ "node_modules/detect-port": {
+ "version": "1.5.1",
+ "resolved": "https://registry.npmjs.org/detect-port/-/detect-port-1.5.1.tgz",
+ "integrity": "sha512-aBzdj76lueB6uUst5iAs7+0H/oOjqI5D16XUWxlWMIMROhcM0rfsNVk93zTngq1dDNpoXRr++Sus7ETAExppAQ==",
+ "dependencies": {
+ "address": "^1.0.1",
+ "debug": "4"
+ },
+ "bin": {
+ "detect": "bin/detect-port.js",
+ "detect-port": "bin/detect-port.js"
+ }
+ },
+ "node_modules/detect-port-alt": {
+ "version": "1.1.6",
+ "resolved": "https://registry.npmjs.org/detect-port-alt/-/detect-port-alt-1.1.6.tgz",
+ "integrity": "sha512-5tQykt+LqfJFBEYaDITx7S7cR7mJ/zQmLXZ2qt5w04ainYZw6tBf9dBunMjVeVOdYVRUzUOE4HkY5J7+uttb5Q==",
+ "dependencies": {
+ "address": "^1.0.1",
+ "debug": "^2.6.0"
+ },
+ "bin": {
+ "detect": "bin/detect-port",
+ "detect-port": "bin/detect-port"
+ },
+ "engines": {
+ "node": ">= 4.2.1"
+ }
+ },
+ "node_modules/detect-port-alt/node_modules/debug": {
+ "version": "2.6.9",
+ "resolved": "https://registry.npmjs.org/debug/-/debug-2.6.9.tgz",
+ "integrity": "sha512-bC7ElrdJaJnPbAP+1EotYvqZsb3ecl5wi6Bfi6BJTUcNowp6cvspg0jXznRTKDjm/E7AdgFBVeAPVMNcKGsHMA==",
+ "dependencies": {
+ "ms": "2.0.0"
+ }
+ },
+ "node_modules/detect-port-alt/node_modules/ms": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/ms/-/ms-2.0.0.tgz",
+ "integrity": "sha512-Tpp60P6IUJDTuOq/5Z8cdskzJujfwqfOTkrwIwj7IRISpnkJnT6SyJ4PCPnGMoFjC9ddhal5KVIYtAt97ix05A=="
+ },
+ "node_modules/dir-glob": {
+ "version": "3.0.1",
+ "resolved": "https://registry.npmjs.org/dir-glob/-/dir-glob-3.0.1.tgz",
+ "integrity": "sha512-WkrWp9GR4KXfKGYzOLmTuGVi1UWFfws377n9cc55/tb6DuqyF6pcQ5AbiHEshaDpY9v6oaSr2XCDidGmMwdzIA==",
+ "dependencies": {
+ "path-type": "^4.0.0"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/dns-equal": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmjs.org/dns-equal/-/dns-equal-1.0.0.tgz",
+ "integrity": "sha512-z+paD6YUQsk+AbGCEM4PrOXSss5gd66QfcVBFTKR/HpFL9jCqikS94HYwKww6fQyO7IxrIIyUu+g0Ka9tUS2Cg=="
+ },
+ "node_modules/dns-packet": {
+ "version": "5.5.0",
+ "resolved": "https://registry.npmjs.org/dns-packet/-/dns-packet-5.5.0.tgz",
+ "integrity": "sha512-USawdAUzRkV6xrqTjiAEp6M9YagZEzWcSUaZTcIFAiyQWW1SoI6KyId8y2+/71wbgHKQAKd+iupLv4YvEwYWvA==",
+ "dependencies": {
+ "@leichtgewicht/ip-codec": "^2.0.1"
+ },
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/docusaurus-plugin-redoc": {
+ "version": "1.6.0",
+ "resolved": "https://registry.npmjs.org/docusaurus-plugin-redoc/-/docusaurus-plugin-redoc-1.6.0.tgz",
+ "integrity": "sha512-bvOmVcJ9Lo6ymyaHCoXTjN6Ck7/Dog1KRsJgZilB6ukHQ7d6nJrAwAEoDF1rXto8tOvIUqVb6Zzy7qDPvBQA1Q==",
+ "dependencies": {
+ "@redocly/openapi-core": "1.0.0-beta.123",
+ "redoc": "2.0.0"
+ },
+ "engines": {
+ "node": ">=14"
+ },
+ "peerDependencies": {
+ "@docusaurus/utils": "^2.0.0"
+ }
+ },
+ "node_modules/docusaurus-theme-redoc": {
+ "version": "1.6.1",
+ "resolved": "https://registry.npmjs.org/docusaurus-theme-redoc/-/docusaurus-theme-redoc-1.6.1.tgz",
+ "integrity": "sha512-dPAESXH0PXUb5d8kL+KdIFmidap3BZNK5Aa8ilcT/Rp6X8LAwJXn878lQbwy4Wappg1dBR7u6JRVKY1oWIW1nw==",
+ "dependencies": {
+ "@redocly/openapi-core": "1.0.0-beta.123",
+ "clsx": "^1.2.1",
+ "copyfiles": "^2.4.1",
+ "lodash": "^4.17.21",
+ "mobx": "^6.8.0",
+ "redoc": "2.0.0",
+ "styled-components": "^5.3.6"
+ },
+ "engines": {
+ "node": ">=14"
+ },
+ "peerDependencies": {
+ "@docusaurus/theme-common": "^2.0.0"
+ }
+ },
+ "node_modules/dom-converter": {
+ "version": "0.2.0",
+ "resolved": "https://registry.npmjs.org/dom-converter/-/dom-converter-0.2.0.tgz",
+ "integrity": "sha512-gd3ypIPfOMr9h5jIKq8E3sHOTCjeirnl0WK5ZdS1AW0Odt0b1PaWaHdJ4Qk4klv+YB9aJBS7mESXjFoDQPu6DA==",
+ "dependencies": {
+ "utila": "~0.4"
+ }
+ },
+ "node_modules/dom-serializer": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/dom-serializer/-/dom-serializer-2.0.0.tgz",
+ "integrity": "sha512-wIkAryiqt/nV5EQKqQpo3SToSOV9J0DnbJqwK7Wv/Trc92zIAYZ4FlMu+JPFW1DfGFt81ZTCGgDEabffXeLyJg==",
+ "dependencies": {
+ "domelementtype": "^2.3.0",
+ "domhandler": "^5.0.2",
+ "entities": "^4.2.0"
+ },
+ "funding": {
+ "url": "https://github.com/cheeriojs/dom-serializer?sponsor=1"
+ }
+ },
+ "node_modules/domelementtype": {
+ "version": "2.3.0",
+ "resolved": "https://registry.npmjs.org/domelementtype/-/domelementtype-2.3.0.tgz",
+ "integrity": "sha512-OLETBj6w0OsagBwdXnPdN0cnMfF9opN69co+7ZrbfPGrdpPVNBUj02spi6B1N7wChLQiPn4CSH/zJvXw56gmHw==",
+ "funding": [
+ {
+ "type": "github",
+ "url": "https://github.com/sponsors/fb55"
+ }
+ ]
+ },
+ "node_modules/domhandler": {
+ "version": "5.0.3",
+ "resolved": "https://registry.npmjs.org/domhandler/-/domhandler-5.0.3.tgz",
+ "integrity": "sha512-cgwlv/1iFQiFnU96XXgROh8xTeetsnJiDsTc7TYCLFd9+/WNkIqPTxiM/8pSd8VIrhXGTf1Ny1q1hquVqDJB5w==",
+ "dependencies": {
+ "domelementtype": "^2.3.0"
+ },
+ "engines": {
+ "node": ">= 4"
+ },
+ "funding": {
+ "url": "https://github.com/fb55/domhandler?sponsor=1"
+ }
+ },
+ "node_modules/dompurify": {
+ "version": "2.5.6",
+ "resolved": "https://registry.npmjs.org/dompurify/-/dompurify-2.5.6.tgz",
+ "integrity": "sha512-zUTaUBO8pY4+iJMPE1B9XlO2tXVYIcEA4SNGtvDELzTSCQO7RzH+j7S180BmhmJId78lqGU2z19vgVx2Sxs/PQ=="
+ },
+ "node_modules/domutils": {
+ "version": "3.0.1",
+ "resolved": "https://registry.npmjs.org/domutils/-/domutils-3.0.1.tgz",
+ "integrity": "sha512-z08c1l761iKhDFtfXO04C7kTdPBLi41zwOZl00WS8b5eiaebNpY00HKbztwBq+e3vyqWNwWF3mP9YLUeqIrF+Q==",
+ "dependencies": {
+ "dom-serializer": "^2.0.0",
+ "domelementtype": "^2.3.0",
+ "domhandler": "^5.0.1"
+ },
+ "funding": {
+ "url": "https://github.com/fb55/domutils?sponsor=1"
+ }
+ },
+ "node_modules/dot-case": {
+ "version": "3.0.4",
+ "resolved": "https://registry.npmjs.org/dot-case/-/dot-case-3.0.4.tgz",
+ "integrity": "sha512-Kv5nKlh6yRrdrGvxeJ2e5y2eRUpkUosIW4A2AS38zwSz27zu7ufDwQPi5Jhs3XAlGNetl3bmnGhQsMtkKJnj3w==",
+ "dependencies": {
+ "no-case": "^3.0.4",
+ "tslib": "^2.0.3"
+ }
+ },
+ "node_modules/dot-prop": {
+ "version": "5.3.0",
+ "resolved": "https://registry.npmjs.org/dot-prop/-/dot-prop-5.3.0.tgz",
+ "integrity": "sha512-QM8q3zDe58hqUqjraQOmzZ1LIH9SWQJTlEKCH4kJ2oQvLZk7RbQXvtDM2XEq3fwkV9CCvvH4LA0AV+ogFsBM2Q==",
+ "dependencies": {
+ "is-obj": "^2.0.0"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/dot-prop/node_modules/is-obj": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/is-obj/-/is-obj-2.0.0.tgz",
+ "integrity": "sha512-drqDG3cbczxxEJRoOXcOjtdp1J/lyp1mNn0xaznRs8+muBhgQcrnbspox5X5fOw0HnMnbfDzvnEMEtqDEJEo8w==",
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/duplexer": {
+ "version": "0.1.2",
+ "resolved": "https://registry.npmjs.org/duplexer/-/duplexer-0.1.2.tgz",
+ "integrity": "sha512-jtD6YG370ZCIi/9GTaJKQxWTZD045+4R4hTk/x1UyoqadyJ9x9CgSi1RlVDQF8U2sxLLSnFkCaMihqljHIWgMg=="
+ },
+ "node_modules/duplexer3": {
+ "version": "0.1.5",
+ "resolved": "https://registry.npmjs.org/duplexer3/-/duplexer3-0.1.5.tgz",
+ "integrity": "sha512-1A8za6ws41LQgv9HrE/66jyC5yuSjQ3L/KOpFtoBilsAK2iA2wuS5rTt1OCzIvtS2V7nVmedsUU+DGRcjBmOYA=="
+ },
+ "node_modules/eastasianwidth": {
+ "version": "0.2.0",
+ "resolved": "https://registry.npmjs.org/eastasianwidth/-/eastasianwidth-0.2.0.tgz",
+ "integrity": "sha512-I88TYZWc9XiYHRQ4/3c5rjjfgkjhLyW2luGIheGERbNQ6OY7yTybanSpDXZa8y7VUP9YmDcYa+eyq4ca7iLqWA=="
+ },
+ "node_modules/ee-first": {
+ "version": "1.1.1",
+ "resolved": "https://registry.npmjs.org/ee-first/-/ee-first-1.1.1.tgz",
+ "integrity": "sha512-WMwm9LhRUo+WUaRN+vRuETqG89IgZphVSNkdFgeb6sS/E4OrDIN7t48CAewSHXc6C8lefD8KKfr5vY61brQlow=="
+ },
+ "node_modules/electron-to-chromium": {
+ "version": "1.5.13",
+ "resolved": "https://registry.npmjs.org/electron-to-chromium/-/electron-to-chromium-1.5.13.tgz",
+ "integrity": "sha512-lbBcvtIJ4J6sS4tb5TLp1b4LyfCdMkwStzXPyAgVgTRAsep4bvrAGaBOP7ZJtQMNJpSQ9SqG4brWOroNaQtm7Q=="
+ },
+ "node_modules/emoji-regex": {
+ "version": "9.2.2",
+ "resolved": "https://registry.npmjs.org/emoji-regex/-/emoji-regex-9.2.2.tgz",
+ "integrity": "sha512-L18DaJsXSUk2+42pv8mLs5jJT2hqFkFE4j21wOmgbUqsZ2hL72NsUU785g9RXgo3s0ZNgVl42TiHp3ZtOv/Vyg=="
+ },
+ "node_modules/emojis-list": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/emojis-list/-/emojis-list-3.0.0.tgz",
+ "integrity": "sha512-/kyM18EfinwXZbno9FyUGeFh87KC8HRQBQGildHZbEuRyWFOmv1U10o9BBp8XVZDVNNuQKyIGIu5ZYAAXJ0V2Q==",
+ "engines": {
+ "node": ">= 4"
+ }
+ },
+ "node_modules/emoticon": {
+ "version": "3.2.0",
+ "resolved": "https://registry.npmjs.org/emoticon/-/emoticon-3.2.0.tgz",
+ "integrity": "sha512-SNujglcLTTg+lDAcApPNgEdudaqQFiAbJCqzjNxJkvN9vAwCGi0uu8IUVvx+f16h+V44KCY6Y2yboroc9pilHg==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/encodeurl": {
+ "version": "1.0.2",
+ "resolved": "https://registry.npmjs.org/encodeurl/-/encodeurl-1.0.2.tgz",
+ "integrity": "sha512-TPJXq8JqFaVYm2CWmPvnP2Iyo4ZSM7/QKcSmuMLDObfpH5fi7RUGmd/rTDf+rut/saiDiQEeVTNgAmJEdAOx0w==",
+ "engines": {
+ "node": ">= 0.8"
+ }
+ },
+ "node_modules/end-of-stream": {
+ "version": "1.4.4",
+ "resolved": "https://registry.npmjs.org/end-of-stream/-/end-of-stream-1.4.4.tgz",
+ "integrity": "sha512-+uw1inIHVPQoaVuHzRyXd21icM+cnt4CzD5rW+NC1wjOUSTOs+Te7FOv7AhN7vS9x/oIyhLP5PR1H+phQAHu5Q==",
+ "dependencies": {
+ "once": "^1.4.0"
+ }
+ },
+ "node_modules/enhanced-resolve": {
+ "version": "5.17.1",
+ "resolved": "https://registry.npmjs.org/enhanced-resolve/-/enhanced-resolve-5.17.1.tgz",
+ "integrity": "sha512-LMHl3dXhTcfv8gM4kEzIUeTQ+7fpdA0l2tUf34BddXPkz2A5xJ5L/Pchd5BL6rdccM9QGvu0sWZzK1Z1t4wwyg==",
+ "dependencies": {
+ "graceful-fs": "^4.2.4",
+ "tapable": "^2.2.0"
+ },
+ "engines": {
+ "node": ">=10.13.0"
+ }
+ },
+ "node_modules/entities": {
+ "version": "4.4.0",
+ "resolved": "https://registry.npmjs.org/entities/-/entities-4.4.0.tgz",
+ "integrity": "sha512-oYp7156SP8LkeGD0GF85ad1X9Ai79WtRsZ2gxJqtBuzH+98YUV6jkHEKlZkMbcrjJjIVJNIDP/3WL9wQkoPbWA==",
+ "engines": {
+ "node": ">=0.12"
+ },
+ "funding": {
+ "url": "https://github.com/fb55/entities?sponsor=1"
+ }
+ },
+ "node_modules/error-ex": {
+ "version": "1.3.2",
+ "resolved": "https://registry.npmjs.org/error-ex/-/error-ex-1.3.2.tgz",
+ "integrity": "sha512-7dFHNmqeFSEt2ZBsCriorKnn3Z2pj+fd9kmI6QoWw4//DL+icEBfc0U7qJCisqrTsKTjw4fNFy2pW9OqStD84g==",
+ "dependencies": {
+ "is-arrayish": "^0.2.1"
+ }
+ },
+ "node_modules/es-define-property": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmjs.org/es-define-property/-/es-define-property-1.0.0.tgz",
+ "integrity": "sha512-jxayLKShrEqqzJ0eumQbVhTYQM27CfT1T35+gCgDFoL82JLsXqTJ76zv6A0YLOgEnLUMvLzsDsGIrl8NFpT2gQ==",
+ "dependencies": {
+ "get-intrinsic": "^1.2.4"
+ },
+ "engines": {
+ "node": ">= 0.4"
+ }
+ },
+ "node_modules/es-errors": {
+ "version": "1.3.0",
+ "resolved": "https://registry.npmjs.org/es-errors/-/es-errors-1.3.0.tgz",
+ "integrity": "sha512-Zf5H2Kxt2xjTvbJvP2ZWLEICxA6j+hAmMzIlypy4xcBg1vKVnx89Wy0GbS+kf5cwCVFFzdCFh2XSCFNULS6csw==",
+ "engines": {
+ "node": ">= 0.4"
+ }
+ },
+ "node_modules/es-module-lexer": {
+ "version": "1.5.4",
+ "resolved": "https://registry.npmjs.org/es-module-lexer/-/es-module-lexer-1.5.4.tgz",
+ "integrity": "sha512-MVNK56NiMrOwitFB7cqDwq0CQutbw+0BvLshJSse0MUNU+y1FC3bUS/AQg7oUng+/wKrrki7JfmwtVHkVfPLlw=="
+ },
+ "node_modules/es6-promise": {
+ "version": "3.3.1",
+ "resolved": "https://registry.npmjs.org/es6-promise/-/es6-promise-3.3.1.tgz",
+ "integrity": "sha512-SOp9Phqvqn7jtEUxPWdWfWoLmyt2VaJ6MpvP9Comy1MceMXqE6bxvaTu4iaxpYYPzhny28Lc+M87/c2cPK6lDg=="
+ },
+ "node_modules/escalade": {
+ "version": "3.1.2",
+ "resolved": "https://registry.npmjs.org/escalade/-/escalade-3.1.2.tgz",
+ "integrity": "sha512-ErCHMCae19vR8vQGe50xIsVomy19rg6gFu3+r3jkEO46suLMWBksvVyoGgQV+jOfl84ZSOSlmv6Gxa89PmTGmA==",
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/escape-goat": {
+ "version": "2.1.1",
+ "resolved": "https://registry.npmjs.org/escape-goat/-/escape-goat-2.1.1.tgz",
+ "integrity": "sha512-8/uIhbG12Csjy2JEW7D9pHbreaVaS/OpN3ycnyvElTdwM5n6GY6W6e2IPemfvGZeUMqZ9A/3GqIZMgKnBhAw/Q==",
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/escape-html": {
+ "version": "1.0.3",
+ "resolved": "https://registry.npmjs.org/escape-html/-/escape-html-1.0.3.tgz",
+ "integrity": "sha512-NiSupZ4OeuGwr68lGIeym/ksIZMJodUGOSCZ/FSnTxcrekbvqrgdUxlJOMpijaKZVjAJrWrGs/6Jy8OMuyj9ow=="
+ },
+ "node_modules/escape-string-regexp": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/escape-string-regexp/-/escape-string-regexp-4.0.0.tgz",
+ "integrity": "sha512-TtpcNJ3XAzx3Gq8sWRzJaVajRs0uVxA2YAkdb1jm2YkPz4G6egUFAyA3n5vtEIZefPk5Wa4UXbKuS5fKkJWdgA==",
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/eslint-scope": {
+ "version": "5.1.1",
+ "resolved": "https://registry.npmjs.org/eslint-scope/-/eslint-scope-5.1.1.tgz",
+ "integrity": "sha512-2NxwbF/hZ0KpepYN0cNbo+FN6XoK7GaHlQhgx/hIZl6Va0bF45RQOOwhLIy8lQDbuCiadSLCBnH2CFYquit5bw==",
+ "dependencies": {
+ "esrecurse": "^4.3.0",
+ "estraverse": "^4.1.1"
+ },
+ "engines": {
+ "node": ">=8.0.0"
+ }
+ },
+ "node_modules/esprima": {
+ "version": "4.0.1",
+ "resolved": "https://registry.npmjs.org/esprima/-/esprima-4.0.1.tgz",
+ "integrity": "sha512-eGuFFw7Upda+g4p+QHvnW0RyTX/SVeJBDM/gCtMARO0cLuT2HcEKnTPvhjV6aGeqrCB/sbNop0Kszm0jsaWU4A==",
+ "bin": {
+ "esparse": "bin/esparse.js",
+ "esvalidate": "bin/esvalidate.js"
+ },
+ "engines": {
+ "node": ">=4"
+ }
+ },
+ "node_modules/esrecurse": {
+ "version": "4.3.0",
+ "resolved": "https://registry.npmjs.org/esrecurse/-/esrecurse-4.3.0.tgz",
+ "integrity": "sha512-KmfKL3b6G+RXvP8N1vr3Tq1kL/oCFgn2NYXEtqP8/L3pKapUA4G8cFVaoF3SU323CD4XypR/ffioHmkti6/Tag==",
+ "dependencies": {
+ "estraverse": "^5.2.0"
+ },
+ "engines": {
+ "node": ">=4.0"
+ }
+ },
+ "node_modules/esrecurse/node_modules/estraverse": {
+ "version": "5.3.0",
+ "resolved": "https://registry.npmjs.org/estraverse/-/estraverse-5.3.0.tgz",
+ "integrity": "sha512-MMdARuVEQziNTeJD8DgMqmhwR11BRQ/cBP+pLtYdSTnf3MIO8fFeiINEbX36ZdNlfU/7A9f3gUw49B3oQsvwBA==",
+ "engines": {
+ "node": ">=4.0"
+ }
+ },
+ "node_modules/estraverse": {
+ "version": "4.3.0",
+ "resolved": "https://registry.npmjs.org/estraverse/-/estraverse-4.3.0.tgz",
+ "integrity": "sha512-39nnKffWz8xN1BU/2c79n9nB9HDzo0niYUqx6xyqUnyoAnQyyWpOTdZEeiCch8BBu515t4wp9ZmgVfVhn9EBpw==",
+ "engines": {
+ "node": ">=4.0"
+ }
+ },
+ "node_modules/esutils": {
+ "version": "2.0.3",
+ "resolved": "https://registry.npmjs.org/esutils/-/esutils-2.0.3.tgz",
+ "integrity": "sha512-kVscqXk4OCp68SZ0dkgEKVi6/8ij300KBWTJq32P/dYeWTSwK41WyTxalN1eRmA5Z9UU/LX9D7FWSmV9SAYx6g==",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/eta": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/eta/-/eta-2.0.1.tgz",
+ "integrity": "sha512-46E2qDPDm7QA+usjffUWz9KfXsxVZclPOuKsXs4ZWZdI/X1wpDF7AO424pt7fdYohCzWsIkXAhNGXSlwo5naAg==",
+ "engines": {
+ "node": ">=6.0.0"
+ },
+ "funding": {
+ "url": "https://github.com/eta-dev/eta?sponsor=1"
+ }
+ },
+ "node_modules/etag": {
+ "version": "1.8.1",
+ "resolved": "https://registry.npmjs.org/etag/-/etag-1.8.1.tgz",
+ "integrity": "sha512-aIL5Fx7mawVa300al2BnEE4iNvo1qETxLrPI/o05L7z6go7fCw1J6EQmbK4FmJ2AS7kgVF/KEZWufBfdClMcPg==",
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/eval": {
+ "version": "0.1.8",
+ "resolved": "https://registry.npmjs.org/eval/-/eval-0.1.8.tgz",
+ "integrity": "sha512-EzV94NYKoO09GLXGjXj9JIlXijVck4ONSr5wiCWDvhsvj5jxSrzTmRU/9C1DyB6uToszLs8aifA6NQ7lEQdvFw==",
+ "dependencies": {
+ "@types/node": "*",
+ "require-like": ">= 0.1.1"
+ },
+ "engines": {
+ "node": ">= 0.8"
+ }
+ },
+ "node_modules/eventemitter3": {
+ "version": "4.0.7",
+ "resolved": "https://registry.npmjs.org/eventemitter3/-/eventemitter3-4.0.7.tgz",
+ "integrity": "sha512-8guHBZCwKnFhYdHr2ysuRWErTwhoN2X8XELRlrRwpmfeY2jjuUN4taQMsULKUVo1K4DvZl+0pgfyoysHxvmvEw=="
+ },
+ "node_modules/events": {
+ "version": "3.3.0",
+ "resolved": "https://registry.npmjs.org/events/-/events-3.3.0.tgz",
+ "integrity": "sha512-mQw+2fkQbALzQ7V0MY0IqdnXNOeTtP4r0lN9z7AAawCXgqea7bDii20AYrIBrFd/Hx0M2Ocz6S111CaFkUcb0Q==",
+ "engines": {
+ "node": ">=0.8.x"
+ }
+ },
+ "node_modules/execa": {
+ "version": "5.1.1",
+ "resolved": "https://registry.npmjs.org/execa/-/execa-5.1.1.tgz",
+ "integrity": "sha512-8uSpZZocAZRBAPIEINJj3Lo9HyGitllczc27Eh5YYojjMFMn8yHMDMaUHE2Jqfq05D/wucwI4JGURyXt1vchyg==",
+ "dependencies": {
+ "cross-spawn": "^7.0.3",
+ "get-stream": "^6.0.0",
+ "human-signals": "^2.1.0",
+ "is-stream": "^2.0.0",
+ "merge-stream": "^2.0.0",
+ "npm-run-path": "^4.0.1",
+ "onetime": "^5.1.2",
+ "signal-exit": "^3.0.3",
+ "strip-final-newline": "^2.0.0"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sindresorhus/execa?sponsor=1"
+ }
+ },
+ "node_modules/execa/node_modules/get-stream": {
+ "version": "6.0.1",
+ "resolved": "https://registry.npmjs.org/get-stream/-/get-stream-6.0.1.tgz",
+ "integrity": "sha512-ts6Wi+2j3jQjqi70w5AlN8DFnkSwC+MqmxEzdEALB2qXZYV3X/b1CTfgPLGJNMeAWxdPfU8FO1ms3NUfaHCPYg==",
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/express": {
+ "version": "4.21.1",
+ "resolved": "https://registry.npmjs.org/express/-/express-4.21.1.tgz",
+ "integrity": "sha512-YSFlK1Ee0/GC8QaO91tHcDxJiE/X4FbpAyQWkxAvG6AXCuR65YzK8ua6D9hvi/TzUfZMpc+BwuM1IPw8fmQBiQ==",
+ "dependencies": {
+ "accepts": "~1.3.8",
+ "array-flatten": "1.1.1",
+ "body-parser": "1.20.3",
+ "content-disposition": "0.5.4",
+ "content-type": "~1.0.4",
+ "cookie": "0.7.1",
+ "cookie-signature": "1.0.6",
+ "debug": "2.6.9",
+ "depd": "2.0.0",
+ "encodeurl": "~2.0.0",
+ "escape-html": "~1.0.3",
+ "etag": "~1.8.1",
+ "finalhandler": "1.3.1",
+ "fresh": "0.5.2",
+ "http-errors": "2.0.0",
+ "merge-descriptors": "1.0.3",
+ "methods": "~1.1.2",
+ "on-finished": "2.4.1",
+ "parseurl": "~1.3.3",
+ "path-to-regexp": "0.1.10",
+ "proxy-addr": "~2.0.7",
+ "qs": "6.13.0",
+ "range-parser": "~1.2.1",
+ "safe-buffer": "5.2.1",
+ "send": "0.19.0",
+ "serve-static": "1.16.2",
+ "setprototypeof": "1.2.0",
+ "statuses": "2.0.1",
+ "type-is": "~1.6.18",
+ "utils-merge": "1.0.1",
+ "vary": "~1.1.2"
+ },
+ "engines": {
+ "node": ">= 0.10.0"
+ }
+ },
+ "node_modules/express/node_modules/array-flatten": {
+ "version": "1.1.1",
+ "resolved": "https://registry.npmjs.org/array-flatten/-/array-flatten-1.1.1.tgz",
+ "integrity": "sha512-PCVAQswWemu6UdxsDFFX/+gVeYqKAod3D3UVm91jHwynguOwAvYPhx8nNlM++NqRcK6CxxpUafjmhIdKiHibqg=="
+ },
+ "node_modules/express/node_modules/content-disposition": {
+ "version": "0.5.4",
+ "resolved": "https://registry.npmjs.org/content-disposition/-/content-disposition-0.5.4.tgz",
+ "integrity": "sha512-FveZTNuGw04cxlAiWbzi6zTAL/lhehaWbTtgluJh4/E95DqMwTmha3KZN1aAWA8cFIhHzMZUvLevkw5Rqk+tSQ==",
+ "dependencies": {
+ "safe-buffer": "5.2.1"
+ },
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/express/node_modules/debug": {
+ "version": "2.6.9",
+ "resolved": "https://registry.npmjs.org/debug/-/debug-2.6.9.tgz",
+ "integrity": "sha512-bC7ElrdJaJnPbAP+1EotYvqZsb3ecl5wi6Bfi6BJTUcNowp6cvspg0jXznRTKDjm/E7AdgFBVeAPVMNcKGsHMA==",
+ "dependencies": {
+ "ms": "2.0.0"
+ }
+ },
+ "node_modules/express/node_modules/encodeurl": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/encodeurl/-/encodeurl-2.0.0.tgz",
+ "integrity": "sha512-Q0n9HRi4m6JuGIV1eFlmvJB7ZEVxu93IrMyiMsGC0lrMJMWzRgx6WGquyfQgZVb31vhGgXnfmPNNXmxnOkRBrg==",
+ "engines": {
+ "node": ">= 0.8"
+ }
+ },
+ "node_modules/express/node_modules/ms": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/ms/-/ms-2.0.0.tgz",
+ "integrity": "sha512-Tpp60P6IUJDTuOq/5Z8cdskzJujfwqfOTkrwIwj7IRISpnkJnT6SyJ4PCPnGMoFjC9ddhal5KVIYtAt97ix05A=="
+ },
+ "node_modules/express/node_modules/path-to-regexp": {
+ "version": "0.1.10",
+ "resolved": "https://registry.npmjs.org/path-to-regexp/-/path-to-regexp-0.1.10.tgz",
+ "integrity": "sha512-7lf7qcQidTku0Gu3YDPc8DJ1q7OOucfa/BSsIwjuh56VU7katFvuM8hULfkwB3Fns/rsVF7PwPKVw1sl5KQS9w=="
+ },
+ "node_modules/express/node_modules/range-parser": {
+ "version": "1.2.1",
+ "resolved": "https://registry.npmjs.org/range-parser/-/range-parser-1.2.1.tgz",
+ "integrity": "sha512-Hrgsx+orqoygnmhFbKaHE6c296J+HTAQXoxEF6gNupROmmGJRoyzfG3ccAveqCBrwr/2yxQ5BVd/GTl5agOwSg==",
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/extend": {
+ "version": "3.0.2",
+ "resolved": "https://registry.npmjs.org/extend/-/extend-3.0.2.tgz",
+ "integrity": "sha512-fjquC59cD7CyW6urNXK0FBufkZcoiGG80wTuPujX590cB5Ttln20E2UB4S/WARVqhXffZl2LNgS+gQdPIIim/g=="
+ },
+ "node_modules/extend-shallow": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/extend-shallow/-/extend-shallow-2.0.1.tgz",
+ "integrity": "sha512-zCnTtlxNoAiDc3gqY2aYAWFx7XWWiasuF2K8Me5WbN8otHKTUKBwjPtNpRs/rbUZm7KxWAaNj7P1a/p52GbVug==",
+ "dependencies": {
+ "is-extendable": "^0.1.0"
+ },
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/fast-deep-equal": {
+ "version": "3.1.3",
+ "resolved": "https://registry.npmjs.org/fast-deep-equal/-/fast-deep-equal-3.1.3.tgz",
+ "integrity": "sha512-f3qQ9oQy9j2AhBe/H9VC91wLmKBCCU/gDOnKNAYG5hswO7BLKj09Hc5HYNz9cGI++xlpDCIgDaitVs03ATR84Q=="
+ },
+ "node_modules/fast-glob": {
+ "version": "3.2.12",
+ "resolved": "https://registry.npmjs.org/fast-glob/-/fast-glob-3.2.12.tgz",
+ "integrity": "sha512-DVj4CQIYYow0BlaelwK1pHl5n5cRSJfM60UA0zK891sVInoPri2Ekj7+e1CT3/3qxXenpI+nBBmQAcJPJgaj4w==",
+ "dependencies": {
+ "@nodelib/fs.stat": "^2.0.2",
+ "@nodelib/fs.walk": "^1.2.3",
+ "glob-parent": "^5.1.2",
+ "merge2": "^1.3.0",
+ "micromatch": "^4.0.4"
+ },
+ "engines": {
+ "node": ">=8.6.0"
+ }
+ },
+ "node_modules/fast-json-stable-stringify": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmjs.org/fast-json-stable-stringify/-/fast-json-stable-stringify-2.1.0.tgz",
+ "integrity": "sha512-lhd/wF+Lk98HZoTCtlVraHtfh5XYijIjalXck7saUtuanSDyLMxnHhSXEDJqHxD7msR8D0uCmqlkwjCV8xvwHw=="
+ },
+ "node_modules/fast-safe-stringify": {
+ "version": "2.1.1",
+ "resolved": "https://registry.npmjs.org/fast-safe-stringify/-/fast-safe-stringify-2.1.1.tgz",
+ "integrity": "sha512-W+KJc2dmILlPplD/H4K9l9LcAHAfPtP6BY84uVLXQ6Evcz9Lcg33Y2z1IVblT6xdY54PXYVHEv+0Wpq8Io6zkA=="
+ },
+ "node_modules/fast-url-parser": {
+ "version": "1.1.3",
+ "resolved": "https://registry.npmjs.org/fast-url-parser/-/fast-url-parser-1.1.3.tgz",
+ "integrity": "sha512-5jOCVXADYNuRkKFzNJ0dCCewsZiYo0dz8QNYljkOpFC6r2U4OBmKtvm/Tsuh4w1YYdDqDb31a8TVhBJ2OJKdqQ==",
+ "dependencies": {
+ "punycode": "^1.3.2"
+ }
+ },
+ "node_modules/fastq": {
+ "version": "1.15.0",
+ "resolved": "https://registry.npmjs.org/fastq/-/fastq-1.15.0.tgz",
+ "integrity": "sha512-wBrocU2LCXXa+lWBt8RoIRD89Fi8OdABODa/kEnyeyjS5aZO5/GNvI5sEINADqP/h8M29UHTHUb53sUu5Ihqdw==",
+ "dependencies": {
+ "reusify": "^1.0.4"
+ }
+ },
+ "node_modules/faye-websocket": {
+ "version": "0.11.4",
+ "resolved": "https://registry.npmjs.org/faye-websocket/-/faye-websocket-0.11.4.tgz",
+ "integrity": "sha512-CzbClwlXAuiRQAlUyfqPgvPoNKTckTPGfwZV4ZdAhVcP2lh9KUxJg2b5GkE7XbjKQ3YJnQ9z6D9ntLAlB+tP8g==",
+ "dependencies": {
+ "websocket-driver": ">=0.5.1"
+ },
+ "engines": {
+ "node": ">=0.8.0"
+ }
+ },
+ "node_modules/fbemitter": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/fbemitter/-/fbemitter-3.0.0.tgz",
+ "integrity": "sha512-KWKaceCwKQU0+HPoop6gn4eOHk50bBv/VxjJtGMfwmJt3D29JpN4H4eisCtIPA+a8GVBam+ldMMpMjJUvpDyHw==",
+ "dependencies": {
+ "fbjs": "^3.0.0"
+ }
+ },
+ "node_modules/fbjs": {
+ "version": "3.0.4",
+ "resolved": "https://registry.npmjs.org/fbjs/-/fbjs-3.0.4.tgz",
+ "integrity": "sha512-ucV0tDODnGV3JCnnkmoszb5lf4bNpzjv80K41wd4k798Etq+UYD0y0TIfalLjZoKgjive6/adkRnszwapiDgBQ==",
+ "dependencies": {
+ "cross-fetch": "^3.1.5",
+ "fbjs-css-vars": "^1.0.0",
+ "loose-envify": "^1.0.0",
+ "object-assign": "^4.1.0",
+ "promise": "^7.1.1",
+ "setimmediate": "^1.0.5",
+ "ua-parser-js": "^0.7.30"
+ }
+ },
+ "node_modules/fbjs-css-vars": {
+ "version": "1.0.2",
+ "resolved": "https://registry.npmjs.org/fbjs-css-vars/-/fbjs-css-vars-1.0.2.tgz",
+ "integrity": "sha512-b2XGFAFdWZWg0phtAWLHCk836A1Xann+I+Dgd3Gk64MHKZO44FfoD1KxyvbSh0qZsIoXQGGlVztIY+oitJPpRQ=="
+ },
+ "node_modules/feed": {
+ "version": "4.2.2",
+ "resolved": "https://registry.npmjs.org/feed/-/feed-4.2.2.tgz",
+ "integrity": "sha512-u5/sxGfiMfZNtJ3OvQpXcvotFpYkL0n9u9mM2vkui2nGo8b4wvDkJ8gAkYqbA8QpGyFCv3RK0Z+Iv+9veCS9bQ==",
+ "dependencies": {
+ "xml-js": "^1.6.11"
+ },
+ "engines": {
+ "node": ">=0.4.0"
+ }
+ },
+ "node_modules/file-loader": {
+ "version": "6.2.0",
+ "resolved": "https://registry.npmjs.org/file-loader/-/file-loader-6.2.0.tgz",
+ "integrity": "sha512-qo3glqyTa61Ytg4u73GultjHGjdRyig3tG6lPtyX/jOEJvHif9uB0/OCI2Kif6ctF3caQTW2G5gym21oAsI4pw==",
+ "dependencies": {
+ "loader-utils": "^2.0.0",
+ "schema-utils": "^3.0.0"
+ },
+ "engines": {
+ "node": ">= 10.13.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/webpack"
+ },
+ "peerDependencies": {
+ "webpack": "^4.0.0 || ^5.0.0"
+ }
+ },
+ "node_modules/file-loader/node_modules/schema-utils": {
+ "version": "3.1.1",
+ "resolved": "https://registry.npmjs.org/schema-utils/-/schema-utils-3.1.1.tgz",
+ "integrity": "sha512-Y5PQxS4ITlC+EahLuXaY86TXfR7Dc5lw294alXOq86JAHCihAIZfqv8nNCWvaEJvaC51uN9hbLGeV0cFBdH+Fw==",
+ "dependencies": {
+ "@types/json-schema": "^7.0.8",
+ "ajv": "^6.12.5",
+ "ajv-keywords": "^3.5.2"
+ },
+ "engines": {
+ "node": ">= 10.13.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/webpack"
+ }
+ },
+ "node_modules/filesize": {
+ "version": "8.0.7",
+ "resolved": "https://registry.npmjs.org/filesize/-/filesize-8.0.7.tgz",
+ "integrity": "sha512-pjmC+bkIF8XI7fWaH8KxHcZL3DPybs1roSKP4rKDvy20tAWwIObE4+JIseG2byfGKhud5ZnM4YSGKBz7Sh0ndQ==",
+ "engines": {
+ "node": ">= 0.4.0"
+ }
+ },
+ "node_modules/fill-range": {
+ "version": "7.1.1",
+ "resolved": "https://registry.npmjs.org/fill-range/-/fill-range-7.1.1.tgz",
+ "integrity": "sha512-YsGpe3WHLK8ZYi4tWDg2Jy3ebRz2rXowDxnld4bkQB00cc/1Zw9AWnC0i9ztDJitivtQvaI9KaLyKrc+hBW0yg==",
+ "dependencies": {
+ "to-regex-range": "^5.0.1"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/finalhandler": {
+ "version": "1.3.1",
+ "resolved": "https://registry.npmjs.org/finalhandler/-/finalhandler-1.3.1.tgz",
+ "integrity": "sha512-6BN9trH7bp3qvnrRyzsBz+g3lZxTNZTbVO2EV1CS0WIcDbawYVdYvGflME/9QP0h0pYlCDBCTjYa9nZzMDpyxQ==",
+ "dependencies": {
+ "debug": "2.6.9",
+ "encodeurl": "~2.0.0",
+ "escape-html": "~1.0.3",
+ "on-finished": "2.4.1",
+ "parseurl": "~1.3.3",
+ "statuses": "2.0.1",
+ "unpipe": "~1.0.0"
+ },
+ "engines": {
+ "node": ">= 0.8"
+ }
+ },
+ "node_modules/finalhandler/node_modules/debug": {
+ "version": "2.6.9",
+ "resolved": "https://registry.npmjs.org/debug/-/debug-2.6.9.tgz",
+ "integrity": "sha512-bC7ElrdJaJnPbAP+1EotYvqZsb3ecl5wi6Bfi6BJTUcNowp6cvspg0jXznRTKDjm/E7AdgFBVeAPVMNcKGsHMA==",
+ "dependencies": {
+ "ms": "2.0.0"
+ }
+ },
+ "node_modules/finalhandler/node_modules/encodeurl": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/encodeurl/-/encodeurl-2.0.0.tgz",
+ "integrity": "sha512-Q0n9HRi4m6JuGIV1eFlmvJB7ZEVxu93IrMyiMsGC0lrMJMWzRgx6WGquyfQgZVb31vhGgXnfmPNNXmxnOkRBrg==",
+ "engines": {
+ "node": ">= 0.8"
+ }
+ },
+ "node_modules/finalhandler/node_modules/ms": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/ms/-/ms-2.0.0.tgz",
+ "integrity": "sha512-Tpp60P6IUJDTuOq/5Z8cdskzJujfwqfOTkrwIwj7IRISpnkJnT6SyJ4PCPnGMoFjC9ddhal5KVIYtAt97ix05A=="
+ },
+ "node_modules/find-cache-dir": {
+ "version": "3.3.2",
+ "resolved": "https://registry.npmjs.org/find-cache-dir/-/find-cache-dir-3.3.2.tgz",
+ "integrity": "sha512-wXZV5emFEjrridIgED11OoUKLxiYjAcqot/NJdAkOhlJ+vGzwhOAfcG5OX1jP+S0PcjEn8bdMJv+g2jwQ3Onig==",
+ "dependencies": {
+ "commondir": "^1.0.1",
+ "make-dir": "^3.0.2",
+ "pkg-dir": "^4.1.0"
+ },
+ "engines": {
+ "node": ">=8"
+ },
+ "funding": {
+ "url": "https://github.com/avajs/find-cache-dir?sponsor=1"
+ }
+ },
+ "node_modules/find-up": {
+ "version": "4.1.0",
+ "resolved": "https://registry.npmjs.org/find-up/-/find-up-4.1.0.tgz",
+ "integrity": "sha512-PpOwAdQ/YlXQ2vj8a3h8IipDuYRi3wceVQQGYWxNINccq40Anw7BlsEXCMbt1Zt+OLA6Fq9suIpIWD0OsnISlw==",
+ "dependencies": {
+ "locate-path": "^5.0.0",
+ "path-exists": "^4.0.0"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/flux": {
+ "version": "4.0.4",
+ "resolved": "https://registry.npmjs.org/flux/-/flux-4.0.4.tgz",
+ "integrity": "sha512-NCj3XlayA2UsapRpM7va6wU1+9rE5FIL7qoMcmxWHRzbp0yujihMBm9BBHZ1MDIk5h5o2Bl6eGiCe8rYELAmYw==",
+ "dependencies": {
+ "fbemitter": "^3.0.0",
+ "fbjs": "^3.0.1"
+ },
+ "peerDependencies": {
+ "react": "^15.0.2 || ^16.0.0 || ^17.0.0"
+ }
+ },
+ "node_modules/follow-redirects": {
+ "version": "1.15.6",
+ "resolved": "https://registry.npmjs.org/follow-redirects/-/follow-redirects-1.15.6.tgz",
+ "integrity": "sha512-wWN62YITEaOpSK584EZXJafH1AGpO8RVgElfkuXbTOrPX4fIfOyEpW/CsiNd8JdYrAoOvafRTOEnvsO++qCqFA==",
+ "funding": [
+ {
+ "type": "individual",
+ "url": "https://github.com/sponsors/RubenVerborgh"
+ }
+ ],
+ "engines": {
+ "node": ">=4.0"
+ },
+ "peerDependenciesMeta": {
+ "debug": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/foreach": {
+ "version": "2.0.6",
+ "resolved": "https://registry.npmjs.org/foreach/-/foreach-2.0.6.tgz",
+ "integrity": "sha512-k6GAGDyqLe9JaebCsFCoudPPWfihKu8pylYXRlqP1J7ms39iPoTtk2fviNglIeQEwdh0bQeKJ01ZPyuyQvKzwg=="
+ },
+ "node_modules/fork-ts-checker-webpack-plugin": {
+ "version": "6.5.3",
+ "resolved": "https://registry.npmjs.org/fork-ts-checker-webpack-plugin/-/fork-ts-checker-webpack-plugin-6.5.3.tgz",
+ "integrity": "sha512-SbH/l9ikmMWycd5puHJKTkZJKddF4iRLyW3DeZ08HTI7NGyLS38MXd/KGgeWumQO7YNQbW2u/NtPT2YowbPaGQ==",
+ "dependencies": {
+ "@babel/code-frame": "^7.8.3",
+ "@types/json-schema": "^7.0.5",
+ "chalk": "^4.1.0",
+ "chokidar": "^3.4.2",
+ "cosmiconfig": "^6.0.0",
+ "deepmerge": "^4.2.2",
+ "fs-extra": "^9.0.0",
+ "glob": "^7.1.6",
+ "memfs": "^3.1.2",
+ "minimatch": "^3.0.4",
+ "schema-utils": "2.7.0",
+ "semver": "^7.3.2",
+ "tapable": "^1.0.0"
+ },
+ "engines": {
+ "node": ">=10",
+ "yarn": ">=1.0.0"
+ },
+ "peerDependencies": {
+ "eslint": ">= 6",
+ "typescript": ">= 2.7",
+ "vue-template-compiler": "*",
+ "webpack": ">= 4"
+ },
+ "peerDependenciesMeta": {
+ "eslint": {
+ "optional": true
+ },
+ "vue-template-compiler": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/fork-ts-checker-webpack-plugin/node_modules/cosmiconfig": {
+ "version": "6.0.0",
+ "resolved": "https://registry.npmjs.org/cosmiconfig/-/cosmiconfig-6.0.0.tgz",
+ "integrity": "sha512-xb3ZL6+L8b9JLLCx3ZdoZy4+2ECphCMo2PwqgP1tlfVq6M6YReyzBJtvWWtbDSpNr9hn96pkCiZqUcFEc+54Qg==",
+ "dependencies": {
+ "@types/parse-json": "^4.0.0",
+ "import-fresh": "^3.1.0",
+ "parse-json": "^5.0.0",
+ "path-type": "^4.0.0",
+ "yaml": "^1.7.2"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/fork-ts-checker-webpack-plugin/node_modules/fs-extra": {
+ "version": "9.1.0",
+ "resolved": "https://registry.npmjs.org/fs-extra/-/fs-extra-9.1.0.tgz",
+ "integrity": "sha512-hcg3ZmepS30/7BSFqRvoo3DOMQu7IjqxO5nCDt+zM9XWjb33Wg7ziNT+Qvqbuc3+gWpzO02JubVyk2G4Zvo1OQ==",
+ "dependencies": {
+ "at-least-node": "^1.0.0",
+ "graceful-fs": "^4.2.0",
+ "jsonfile": "^6.0.1",
+ "universalify": "^2.0.0"
+ },
+ "engines": {
+ "node": ">=10"
+ }
+ },
+ "node_modules/fork-ts-checker-webpack-plugin/node_modules/schema-utils": {
+ "version": "2.7.0",
+ "resolved": "https://registry.npmjs.org/schema-utils/-/schema-utils-2.7.0.tgz",
+ "integrity": "sha512-0ilKFI6QQF5nxDZLFn2dMjvc4hjg/Wkg7rHd3jK6/A4a1Hl9VFdQWvgB1UMGoU94pad1P/8N7fMcEnLnSiju8A==",
+ "dependencies": {
+ "@types/json-schema": "^7.0.4",
+ "ajv": "^6.12.2",
+ "ajv-keywords": "^3.4.1"
+ },
+ "engines": {
+ "node": ">= 8.9.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/webpack"
+ }
+ },
+ "node_modules/fork-ts-checker-webpack-plugin/node_modules/tapable": {
+ "version": "1.1.3",
+ "resolved": "https://registry.npmjs.org/tapable/-/tapable-1.1.3.tgz",
+ "integrity": "sha512-4WK/bYZmj8xLr+HUCODHGF1ZFzsYffasLUgEiMBY4fgtltdO6B4WJtlSbPaDTLpYTcGVwM2qLnFTICEcNxs3kA==",
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/forwarded": {
+ "version": "0.2.0",
+ "resolved": "https://registry.npmjs.org/forwarded/-/forwarded-0.2.0.tgz",
+ "integrity": "sha512-buRG0fpBtRHSTCOASe6hD258tEubFoRLb4ZNA6NxMVHNw2gOcwHo9wyablzMzOA5z9xA9L1KNjk/Nt6MT9aYow==",
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/fraction.js": {
+ "version": "4.2.0",
+ "resolved": "https://registry.npmjs.org/fraction.js/-/fraction.js-4.2.0.tgz",
+ "integrity": "sha512-MhLuK+2gUcnZe8ZHlaaINnQLl0xRIGRfcGk2yl8xoQAfHrSsL3rYu6FCmBdkdbhc9EPlwyGHewaRsvwRMJtAlA==",
+ "engines": {
+ "node": "*"
+ },
+ "funding": {
+ "type": "patreon",
+ "url": "https://www.patreon.com/infusion"
+ }
+ },
+ "node_modules/fresh": {
+ "version": "0.5.2",
+ "resolved": "https://registry.npmjs.org/fresh/-/fresh-0.5.2.tgz",
+ "integrity": "sha512-zJ2mQYM18rEFOudeV4GShTGIQ7RbzA7ozbU9I/XBpm7kqgMywgmylMwXHxZJmkVoYkna9d2pVXVXPdYTP9ej8Q==",
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/fs-extra": {
+ "version": "10.1.0",
+ "resolved": "https://registry.npmjs.org/fs-extra/-/fs-extra-10.1.0.tgz",
+ "integrity": "sha512-oRXApq54ETRj4eMiFzGnHWGy+zo5raudjuxN0b8H7s/RU2oW0Wvsx9O0ACRN/kRq9E8Vu/ReskGB5o3ji+FzHQ==",
+ "dependencies": {
+ "graceful-fs": "^4.2.0",
+ "jsonfile": "^6.0.1",
+ "universalify": "^2.0.0"
+ },
+ "engines": {
+ "node": ">=12"
+ }
+ },
+ "node_modules/fs-monkey": {
+ "version": "1.0.3",
+ "resolved": "https://registry.npmjs.org/fs-monkey/-/fs-monkey-1.0.3.tgz",
+ "integrity": "sha512-cybjIfiiE+pTWicSCLFHSrXZ6EilF30oh91FDP9S2B051prEa7QWfrVTQm10/dDpswBDXZugPa1Ogu8Yh+HV0Q=="
+ },
+ "node_modules/fs.realpath": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmjs.org/fs.realpath/-/fs.realpath-1.0.0.tgz",
+ "integrity": "sha512-OO0pH2lK6a0hZnAdau5ItzHPI6pUlvI7jMVnxUQRtw4owF2wk8lOSabtGDCTP4Ggrg2MbGnWO9X8K1t4+fGMDw=="
+ },
+ "node_modules/fsevents": {
+ "version": "2.3.2",
+ "resolved": "https://registry.npmjs.org/fsevents/-/fsevents-2.3.2.tgz",
+ "integrity": "sha512-xiqMQR4xAeHTuB9uWm+fFRcIOgKBMiOBP+eXiyT7jsgVCq1bkVygt00oASowB7EdtpOHaaPgKt812P9ab+DDKA==",
+ "hasInstallScript": true,
+ "optional": true,
+ "os": [
+ "darwin"
+ ],
+ "engines": {
+ "node": "^8.16.0 || ^10.6.0 || >=11.0.0"
+ }
+ },
+ "node_modules/function-bind": {
+ "version": "1.1.2",
+ "resolved": "https://registry.npmjs.org/function-bind/-/function-bind-1.1.2.tgz",
+ "integrity": "sha512-7XHNxH7qX9xG5mIwxkhumTox/MIRNcOgDrxWsMt2pAr23WHp6MrRlN7FBSFpCpr+oVO0F744iUgR82nJMfG2SA==",
+ "funding": {
+ "url": "https://github.com/sponsors/ljharb"
+ }
+ },
+ "node_modules/gensync": {
+ "version": "1.0.0-beta.2",
+ "resolved": "https://registry.npmjs.org/gensync/-/gensync-1.0.0-beta.2.tgz",
+ "integrity": "sha512-3hN7NaskYvMDLQY55gnW3NQ+mesEAepTqlg+VEbj7zzqEMBVNhzcGYYeqFo/TlYz6eQiFcp1HcsCZO+nGgS8zg==",
+ "engines": {
+ "node": ">=6.9.0"
+ }
+ },
+ "node_modules/get-caller-file": {
+ "version": "2.0.5",
+ "resolved": "https://registry.npmjs.org/get-caller-file/-/get-caller-file-2.0.5.tgz",
+ "integrity": "sha512-DyFP3BM/3YHTQOCUL/w0OZHR0lpKeGrxotcHWcqNEdnltqFwXVfhEBQ94eIo34AfQpo0rGki4cyIiftY06h2Fg==",
+ "engines": {
+ "node": "6.* || 8.* || >= 10.*"
+ }
+ },
+ "node_modules/get-intrinsic": {
+ "version": "1.2.4",
+ "resolved": "https://registry.npmjs.org/get-intrinsic/-/get-intrinsic-1.2.4.tgz",
+ "integrity": "sha512-5uYhsJH8VJBTv7oslg4BznJYhDoRI6waYCxMmCdnTrcCrHA/fCFKoTFz2JKKE0HdDFUF7/oQuhzumXJK7paBRQ==",
+ "dependencies": {
+ "es-errors": "^1.3.0",
+ "function-bind": "^1.1.2",
+ "has-proto": "^1.0.1",
+ "has-symbols": "^1.0.3",
+ "hasown": "^2.0.0"
+ },
+ "engines": {
+ "node": ">= 0.4"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/ljharb"
+ }
+ },
+ "node_modules/get-own-enumerable-property-symbols": {
+ "version": "3.0.2",
+ "resolved": "https://registry.npmjs.org/get-own-enumerable-property-symbols/-/get-own-enumerable-property-symbols-3.0.2.tgz",
+ "integrity": "sha512-I0UBV/XOz1XkIJHEUDMZAbzCThU/H8DxmSfmdGcKPnVhu2VfFqr34jr9777IyaTYvxjedWhqVIilEDsCdP5G6g=="
+ },
+ "node_modules/get-stream": {
+ "version": "4.1.0",
+ "resolved": "https://registry.npmjs.org/get-stream/-/get-stream-4.1.0.tgz",
+ "integrity": "sha512-GMat4EJ5161kIy2HevLlr4luNjBgvmj413KaQA7jt4V8B4RDsfpHk7WQ9GVqfYyyx8OS/L66Kox+rJRNklLK7w==",
+ "dependencies": {
+ "pump": "^3.0.0"
+ },
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/github-slugger": {
+ "version": "1.5.0",
+ "resolved": "https://registry.npmjs.org/github-slugger/-/github-slugger-1.5.0.tgz",
+ "integrity": "sha512-wIh+gKBI9Nshz2o46B0B3f5k/W+WI9ZAv6y5Dn5WJ5SK1t0TnDimB4WE5rmTD05ZAIn8HALCZVmCsvj0w0v0lw=="
+ },
+ "node_modules/glob": {
+ "version": "7.2.3",
+ "resolved": "https://registry.npmjs.org/glob/-/glob-7.2.3.tgz",
+ "integrity": "sha512-nFR0zLpU2YCaRxwoCJvL6UvCH2JFyFVIvwTLsIf21AuHlMskA1hhTdk+LlYJtOlYt9v6dvszD2BGRqBL+iQK9Q==",
+ "dependencies": {
+ "fs.realpath": "^1.0.0",
+ "inflight": "^1.0.4",
+ "inherits": "2",
+ "minimatch": "^3.1.1",
+ "once": "^1.3.0",
+ "path-is-absolute": "^1.0.0"
+ },
+ "engines": {
+ "node": "*"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/isaacs"
+ }
+ },
+ "node_modules/glob-parent": {
+ "version": "5.1.2",
+ "resolved": "https://registry.npmjs.org/glob-parent/-/glob-parent-5.1.2.tgz",
+ "integrity": "sha512-AOIgSQCepiJYwP3ARnGx+5VnTu2HBYdzbGP45eLw1vr3zB3vZLeyed1sC9hnbcOc9/SrMyM5RPQrkGz4aS9Zow==",
+ "dependencies": {
+ "is-glob": "^4.0.1"
+ },
+ "engines": {
+ "node": ">= 6"
+ }
+ },
+ "node_modules/glob-to-regexp": {
+ "version": "0.4.1",
+ "resolved": "https://registry.npmjs.org/glob-to-regexp/-/glob-to-regexp-0.4.1.tgz",
+ "integrity": "sha512-lkX1HJXwyMcprw/5YUZc2s7DrpAiHB21/V+E1rHUrVNokkvB6bqMzT0VfV6/86ZNabt1k14YOIaT7nDvOX3Iiw=="
+ },
+ "node_modules/global-dirs": {
+ "version": "3.0.1",
+ "resolved": "https://registry.npmjs.org/global-dirs/-/global-dirs-3.0.1.tgz",
+ "integrity": "sha512-NBcGGFbBA9s1VzD41QXDG+3++t9Mn5t1FpLdhESY6oKY4gYTFpX4wO3sqGUa0Srjtbfj3szX0RnemmrVRUdULA==",
+ "dependencies": {
+ "ini": "2.0.0"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/global-dirs/node_modules/ini": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/ini/-/ini-2.0.0.tgz",
+ "integrity": "sha512-7PnF4oN3CvZF23ADhA5wRaYEQpJ8qygSkbtTXWBeXWXmEVRXK+1ITciHWwHhsjv1TmW0MgacIv6hEi5pX5NQdA==",
+ "engines": {
+ "node": ">=10"
+ }
+ },
+ "node_modules/global-modules": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/global-modules/-/global-modules-2.0.0.tgz",
+ "integrity": "sha512-NGbfmJBp9x8IxyJSd1P+otYK8vonoJactOogrVfFRIAEY1ukil8RSKDz2Yo7wh1oihl51l/r6W4epkeKJHqL8A==",
+ "dependencies": {
+ "global-prefix": "^3.0.0"
+ },
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/global-prefix": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/global-prefix/-/global-prefix-3.0.0.tgz",
+ "integrity": "sha512-awConJSVCHVGND6x3tmMaKcQvwXLhjdkmomy2W+Goaui8YPgYgXJZewhg3fWC+DlfqqQuWg8AwqjGTD2nAPVWg==",
+ "dependencies": {
+ "ini": "^1.3.5",
+ "kind-of": "^6.0.2",
+ "which": "^1.3.1"
+ },
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/global-prefix/node_modules/which": {
+ "version": "1.3.1",
+ "resolved": "https://registry.npmjs.org/which/-/which-1.3.1.tgz",
+ "integrity": "sha512-HxJdYWq1MTIQbJ3nw0cqssHoTNU267KlrDuGZ1WYlxDStUtKUhOaJmh112/TZmHxxUfuJqPXSOm7tDyas0OSIQ==",
+ "dependencies": {
+ "isexe": "^2.0.0"
+ },
+ "bin": {
+ "which": "bin/which"
+ }
+ },
+ "node_modules/globals": {
+ "version": "11.12.0",
+ "resolved": "https://registry.npmjs.org/globals/-/globals-11.12.0.tgz",
+ "integrity": "sha512-WOBp/EEGUiIsJSp7wcv/y6MO+lV9UoncWqxuFfm8eBwzWNgyfBd6Gz+IeKQ9jCmyhoH99g15M3T+QaVHFjizVA==",
+ "engines": {
+ "node": ">=4"
+ }
+ },
+ "node_modules/globby": {
+ "version": "11.1.0",
+ "resolved": "https://registry.npmjs.org/globby/-/globby-11.1.0.tgz",
+ "integrity": "sha512-jhIXaOzy1sb8IyocaruWSn1TjmnBVs8Ayhcy83rmxNJ8q2uWKCAj3CnJY+KpGSXCueAPc0i05kVvVKtP1t9S3g==",
+ "dependencies": {
+ "array-union": "^2.1.0",
+ "dir-glob": "^3.0.1",
+ "fast-glob": "^3.2.9",
+ "ignore": "^5.2.0",
+ "merge2": "^1.4.1",
+ "slash": "^3.0.0"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/gopd": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmjs.org/gopd/-/gopd-1.0.1.tgz",
+ "integrity": "sha512-d65bNlIadxvpb/A2abVdlqKqV563juRnZ1Wtk6s1sIR8uNsXR70xqIzVqxVf1eTqDunwT2MkczEeaezCKTZhwA==",
+ "dependencies": {
+ "get-intrinsic": "^1.1.3"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/ljharb"
+ }
+ },
+ "node_modules/got": {
+ "version": "9.6.0",
+ "resolved": "https://registry.npmjs.org/got/-/got-9.6.0.tgz",
+ "integrity": "sha512-R7eWptXuGYxwijs0eV+v3o6+XH1IqVK8dJOEecQfTmkncw9AV4dcw/Dhxi8MdlqPthxxpZyizMzyg8RTmEsG+Q==",
+ "dependencies": {
+ "@sindresorhus/is": "^0.14.0",
+ "@szmarczak/http-timer": "^1.1.2",
+ "cacheable-request": "^6.0.0",
+ "decompress-response": "^3.3.0",
+ "duplexer3": "^0.1.4",
+ "get-stream": "^4.1.0",
+ "lowercase-keys": "^1.0.1",
+ "mimic-response": "^1.0.1",
+ "p-cancelable": "^1.0.0",
+ "to-readable-stream": "^1.0.0",
+ "url-parse-lax": "^3.0.0"
+ },
+ "engines": {
+ "node": ">=8.6"
+ }
+ },
+ "node_modules/graceful-fs": {
+ "version": "4.2.11",
+ "resolved": "https://registry.npmjs.org/graceful-fs/-/graceful-fs-4.2.11.tgz",
+ "integrity": "sha512-RbJ5/jmFcNNCcDV5o9eTnBLJ/HszWV0P73bc+Ff4nS/rJj+YaS6IGyiOL0VoBYX+l1Wrl3k63h/KrH+nhJ0XvQ=="
+ },
+ "node_modules/gray-matter": {
+ "version": "4.0.3",
+ "resolved": "https://registry.npmjs.org/gray-matter/-/gray-matter-4.0.3.tgz",
+ "integrity": "sha512-5v6yZd4JK3eMI3FqqCouswVqwugaA9r4dNZB1wwcmrD02QkV5H0y7XBQW8QwQqEaZY1pM9aqORSORhJRdNK44Q==",
+ "dependencies": {
+ "js-yaml": "^3.13.1",
+ "kind-of": "^6.0.2",
+ "section-matter": "^1.0.0",
+ "strip-bom-string": "^1.0.0"
+ },
+ "engines": {
+ "node": ">=6.0"
+ }
+ },
+ "node_modules/gray-matter/node_modules/argparse": {
+ "version": "1.0.10",
+ "resolved": "https://registry.npmjs.org/argparse/-/argparse-1.0.10.tgz",
+ "integrity": "sha512-o5Roy6tNG4SL/FOkCAN6RzjiakZS25RLYFrcMttJqbdd8BWrnA+fGz57iN5Pb06pvBGvl5gQ0B48dJlslXvoTg==",
+ "dependencies": {
+ "sprintf-js": "~1.0.2"
+ }
+ },
+ "node_modules/gray-matter/node_modules/js-yaml": {
+ "version": "3.14.1",
+ "resolved": "https://registry.npmjs.org/js-yaml/-/js-yaml-3.14.1.tgz",
+ "integrity": "sha512-okMH7OXXJ7YrN9Ok3/SXrnu4iX9yOk+25nqX4imS2npuvTYDmo/QEZoqwZkYaIDk3jVvBOTOIEgEhaLOynBS9g==",
+ "dependencies": {
+ "argparse": "^1.0.7",
+ "esprima": "^4.0.0"
+ },
+ "bin": {
+ "js-yaml": "bin/js-yaml.js"
+ }
+ },
+ "node_modules/gzip-size": {
+ "version": "6.0.0",
+ "resolved": "https://registry.npmjs.org/gzip-size/-/gzip-size-6.0.0.tgz",
+ "integrity": "sha512-ax7ZYomf6jqPTQ4+XCpUGyXKHk5WweS+e05MBO4/y3WJ5RkmPXNKvX+bx1behVILVwr6JSQvZAku021CHPXG3Q==",
+ "dependencies": {
+ "duplexer": "^0.1.2"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/handle-thing": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/handle-thing/-/handle-thing-2.0.1.tgz",
+ "integrity": "sha512-9Qn4yBxelxoh2Ow62nP+Ka/kMnOXRi8BXnRaUwezLNhqelnN49xKz4F/dPP8OYLxLxq6JDtZb2i9XznUQbNPTg=="
+ },
+ "node_modules/has": {
+ "version": "1.0.3",
+ "resolved": "https://registry.npmjs.org/has/-/has-1.0.3.tgz",
+ "integrity": "sha512-f2dvO0VU6Oej7RkWJGrehjbzMAjFp5/VKPp5tTpWIV4JHHZK1/BxbFRtf/siA2SWTe09caDmVtYYzWEIbBS4zw==",
+ "dependencies": {
+ "function-bind": "^1.1.1"
+ },
+ "engines": {
+ "node": ">= 0.4.0"
+ }
+ },
+ "node_modules/has-flag": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/has-flag/-/has-flag-4.0.0.tgz",
+ "integrity": "sha512-EykJT/Q1KjTWctppgIAgfSO0tKVuZUjhgMr17kqTumMl6Afv3EISleU7qZUzoXDFTAHTDC4NOoG/ZxU3EvlMPQ==",
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/has-property-descriptors": {
+ "version": "1.0.2",
+ "resolved": "https://registry.npmjs.org/has-property-descriptors/-/has-property-descriptors-1.0.2.tgz",
+ "integrity": "sha512-55JNKuIW+vq4Ke1BjOTjM2YctQIvCT7GFzHwmfZPGo5wnrgkid0YQtnAleFSqumZm4az3n2BS+erby5ipJdgrg==",
+ "dependencies": {
+ "es-define-property": "^1.0.0"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/ljharb"
+ }
+ },
+ "node_modules/has-proto": {
+ "version": "1.0.3",
+ "resolved": "https://registry.npmjs.org/has-proto/-/has-proto-1.0.3.tgz",
+ "integrity": "sha512-SJ1amZAJUiZS+PhsVLf5tGydlaVB8EdFpaSO4gmiUKUOxk8qzn5AIy4ZeJUmh22znIdk/uMAUT2pl3FxzVUH+Q==",
+ "engines": {
+ "node": ">= 0.4"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/ljharb"
+ }
+ },
+ "node_modules/has-symbols": {
+ "version": "1.0.3",
+ "resolved": "https://registry.npmjs.org/has-symbols/-/has-symbols-1.0.3.tgz",
+ "integrity": "sha512-l3LCuF6MgDNwTDKkdYGEihYjt5pRPbEg46rtlmnSPlUbgmB8LOIrKJbYYFBSbnPaJexMKtiPO8hmeRjRz2Td+A==",
+ "engines": {
+ "node": ">= 0.4"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/ljharb"
+ }
+ },
+ "node_modules/has-yarn": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmjs.org/has-yarn/-/has-yarn-2.1.0.tgz",
+ "integrity": "sha512-UqBRqi4ju7T+TqGNdqAO0PaSVGsDGJUBQvk9eUWNGRY1CFGDzYhLWoM7JQEemnlvVcv/YEmc2wNW8BC24EnUsw==",
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/hasown": {
+ "version": "2.0.2",
+ "resolved": "https://registry.npmjs.org/hasown/-/hasown-2.0.2.tgz",
+ "integrity": "sha512-0hJU9SCPvmMzIBdZFqNPXWa6dqh7WdH0cII9y+CyS8rG3nL48Bclra9HmKhVVUHyPWNH5Y7xDwAB7bfgSjkUMQ==",
+ "dependencies": {
+ "function-bind": "^1.1.2"
+ },
+ "engines": {
+ "node": ">= 0.4"
+ }
+ },
+ "node_modules/hast-to-hyperscript": {
+ "version": "9.0.1",
+ "resolved": "https://registry.npmjs.org/hast-to-hyperscript/-/hast-to-hyperscript-9.0.1.tgz",
+ "integrity": "sha512-zQgLKqF+O2F72S1aa4y2ivxzSlko3MAvxkwG8ehGmNiqd98BIN3JM1rAJPmplEyLmGLO2QZYJtIneOSZ2YbJuA==",
+ "dependencies": {
+ "@types/unist": "^2.0.3",
+ "comma-separated-tokens": "^1.0.0",
+ "property-information": "^5.3.0",
+ "space-separated-tokens": "^1.0.0",
+ "style-to-object": "^0.3.0",
+ "unist-util-is": "^4.0.0",
+ "web-namespaces": "^1.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/hast-util-from-parse5": {
+ "version": "6.0.1",
+ "resolved": "https://registry.npmjs.org/hast-util-from-parse5/-/hast-util-from-parse5-6.0.1.tgz",
+ "integrity": "sha512-jeJUWiN5pSxW12Rh01smtVkZgZr33wBokLzKLwinYOUfSzm1Nl/c3GUGebDyOKjdsRgMvoVbV0VpAcpjF4NrJA==",
+ "dependencies": {
+ "@types/parse5": "^5.0.0",
+ "hastscript": "^6.0.0",
+ "property-information": "^5.0.0",
+ "vfile": "^4.0.0",
+ "vfile-location": "^3.2.0",
+ "web-namespaces": "^1.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/hast-util-parse-selector": {
+ "version": "2.2.5",
+ "resolved": "https://registry.npmjs.org/hast-util-parse-selector/-/hast-util-parse-selector-2.2.5.tgz",
+ "integrity": "sha512-7j6mrk/qqkSehsM92wQjdIgWM2/BW61u/53G6xmC8i1OmEdKLHbk419QKQUjz6LglWsfqoiHmyMRkP1BGjecNQ==",
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/hast-util-raw": {
+ "version": "6.0.1",
+ "resolved": "https://registry.npmjs.org/hast-util-raw/-/hast-util-raw-6.0.1.tgz",
+ "integrity": "sha512-ZMuiYA+UF7BXBtsTBNcLBF5HzXzkyE6MLzJnL605LKE8GJylNjGc4jjxazAHUtcwT5/CEt6afRKViYB4X66dig==",
+ "dependencies": {
+ "@types/hast": "^2.0.0",
+ "hast-util-from-parse5": "^6.0.0",
+ "hast-util-to-parse5": "^6.0.0",
+ "html-void-elements": "^1.0.0",
+ "parse5": "^6.0.0",
+ "unist-util-position": "^3.0.0",
+ "vfile": "^4.0.0",
+ "web-namespaces": "^1.0.0",
+ "xtend": "^4.0.0",
+ "zwitch": "^1.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/hast-util-raw/node_modules/parse5": {
+ "version": "6.0.1",
+ "resolved": "https://registry.npmjs.org/parse5/-/parse5-6.0.1.tgz",
+ "integrity": "sha512-Ofn/CTFzRGTTxwpNEs9PP93gXShHcTq255nzRYSKe8AkVpZY7e1fpmTfOyoIvjP5HG7Z2ZM7VS9PPhQGW2pOpw=="
+ },
+ "node_modules/hast-util-to-parse5": {
+ "version": "6.0.0",
+ "resolved": "https://registry.npmjs.org/hast-util-to-parse5/-/hast-util-to-parse5-6.0.0.tgz",
+ "integrity": "sha512-Lu5m6Lgm/fWuz8eWnrKezHtVY83JeRGaNQ2kn9aJgqaxvVkFCZQBEhgodZUDUvoodgyROHDb3r5IxAEdl6suJQ==",
+ "dependencies": {
+ "hast-to-hyperscript": "^9.0.0",
+ "property-information": "^5.0.0",
+ "web-namespaces": "^1.0.0",
+ "xtend": "^4.0.0",
+ "zwitch": "^1.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/hastscript": {
+ "version": "6.0.0",
+ "resolved": "https://registry.npmjs.org/hastscript/-/hastscript-6.0.0.tgz",
+ "integrity": "sha512-nDM6bvd7lIqDUiYEiu5Sl/+6ReP0BMk/2f4U/Rooccxkj0P5nm+acM5PrGJ/t5I8qPGiqZSE6hVAwZEdZIvP4w==",
+ "dependencies": {
+ "@types/hast": "^2.0.0",
+ "comma-separated-tokens": "^1.0.0",
+ "hast-util-parse-selector": "^2.0.0",
+ "property-information": "^5.0.0",
+ "space-separated-tokens": "^1.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/he": {
+ "version": "1.2.0",
+ "resolved": "https://registry.npmjs.org/he/-/he-1.2.0.tgz",
+ "integrity": "sha512-F/1DnUGPopORZi0ni+CvrCgHQ5FyEAHRLSApuYWMmrbSwoN2Mn/7k+Gl38gJnR7yyDZk6WLXwiGod1JOWNDKGw==",
+ "bin": {
+ "he": "bin/he"
+ }
+ },
+ "node_modules/history": {
+ "version": "4.10.1",
+ "resolved": "https://registry.npmjs.org/history/-/history-4.10.1.tgz",
+ "integrity": "sha512-36nwAD620w12kuzPAsyINPWJqlNbij+hpK1k9XRloDtym8mxzGYl2c17LnV6IAGB2Dmg4tEa7G7DlawS0+qjew==",
+ "dependencies": {
+ "@babel/runtime": "^7.1.2",
+ "loose-envify": "^1.2.0",
+ "resolve-pathname": "^3.0.0",
+ "tiny-invariant": "^1.0.2",
+ "tiny-warning": "^1.0.0",
+ "value-equal": "^1.0.1"
+ }
+ },
+ "node_modules/hoist-non-react-statics": {
+ "version": "3.3.2",
+ "resolved": "https://registry.npmjs.org/hoist-non-react-statics/-/hoist-non-react-statics-3.3.2.tgz",
+ "integrity": "sha512-/gGivxi8JPKWNm/W0jSmzcMPpfpPLc3dY/6GxhX2hQ9iGj3aDfklV4ET7NjKpSinLpJ5vafa9iiGIEZg10SfBw==",
+ "dependencies": {
+ "react-is": "^16.7.0"
+ }
+ },
+ "node_modules/hpack.js": {
+ "version": "2.1.6",
+ "resolved": "https://registry.npmjs.org/hpack.js/-/hpack.js-2.1.6.tgz",
+ "integrity": "sha512-zJxVehUdMGIKsRaNt7apO2Gqp0BdqW5yaiGHXXmbpvxgBYVZnAql+BJb4RO5ad2MgpbZKn5G6nMnegrH1FcNYQ==",
+ "dependencies": {
+ "inherits": "^2.0.1",
+ "obuf": "^1.0.0",
+ "readable-stream": "^2.0.1",
+ "wbuf": "^1.1.0"
+ }
+ },
+ "node_modules/hpack.js/node_modules/isarray": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmjs.org/isarray/-/isarray-1.0.0.tgz",
+ "integrity": "sha512-VLghIWNM6ELQzo7zwmcg0NmTVyWKYjvIeM83yjp0wRDTmUnrM678fQbcKBo6n2CJEF0szoG//ytg+TKla89ALQ=="
+ },
+ "node_modules/hpack.js/node_modules/readable-stream": {
+ "version": "2.3.8",
+ "resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-2.3.8.tgz",
+ "integrity": "sha512-8p0AUk4XODgIewSi0l8Epjs+EVnWiK7NoDIEGU0HhE7+ZyY8D1IMY7odu5lRrFXGg71L15KG8QrPmum45RTtdA==",
+ "dependencies": {
+ "core-util-is": "~1.0.0",
+ "inherits": "~2.0.3",
+ "isarray": "~1.0.0",
+ "process-nextick-args": "~2.0.0",
+ "safe-buffer": "~5.1.1",
+ "string_decoder": "~1.1.1",
+ "util-deprecate": "~1.0.1"
+ }
+ },
+ "node_modules/hpack.js/node_modules/safe-buffer": {
+ "version": "5.1.2",
+ "resolved": "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.1.2.tgz",
+ "integrity": "sha512-Gd2UZBJDkXlY7GbJxfsE8/nvKkUEU1G38c1siN6QP6a9PT9MmHB8GnpscSmMJSoF8LOIrt8ud/wPtojys4G6+g=="
+ },
+ "node_modules/hpack.js/node_modules/string_decoder": {
+ "version": "1.1.1",
+ "resolved": "https://registry.npmjs.org/string_decoder/-/string_decoder-1.1.1.tgz",
+ "integrity": "sha512-n/ShnvDi6FHbbVfviro+WojiFzv+s8MPMHBczVePfUpDJLwoLT0ht1l4YwBCbi8pJAveEEdnkHyPyTP/mzRfwg==",
+ "dependencies": {
+ "safe-buffer": "~5.1.0"
+ }
+ },
+ "node_modules/htm": {
+ "version": "3.1.1",
+ "resolved": "https://registry.npmjs.org/htm/-/htm-3.1.1.tgz",
+ "integrity": "sha512-983Vyg8NwUE7JkZ6NmOqpCZ+sh1bKv2iYTlUkzlWmA5JD2acKoxd4KVxbMmxX/85mtfdnDmTFoNKcg5DGAvxNQ=="
+ },
+ "node_modules/html-entities": {
+ "version": "2.3.3",
+ "resolved": "https://registry.npmjs.org/html-entities/-/html-entities-2.3.3.tgz",
+ "integrity": "sha512-DV5Ln36z34NNTDgnz0EWGBLZENelNAtkiFA4kyNOG2tDI6Mz1uSWiq1wAKdyjnJwyDiDO7Fa2SO1CTxPXL8VxA=="
+ },
+ "node_modules/html-minifier-terser": {
+ "version": "6.1.0",
+ "resolved": "https://registry.npmjs.org/html-minifier-terser/-/html-minifier-terser-6.1.0.tgz",
+ "integrity": "sha512-YXxSlJBZTP7RS3tWnQw74ooKa6L9b9i9QYXY21eUEvhZ3u9XLfv6OnFsQq6RxkhHygsaUMvYsZRV5rU/OVNZxw==",
+ "dependencies": {
+ "camel-case": "^4.1.2",
+ "clean-css": "^5.2.2",
+ "commander": "^8.3.0",
+ "he": "^1.2.0",
+ "param-case": "^3.0.4",
+ "relateurl": "^0.2.7",
+ "terser": "^5.10.0"
+ },
+ "bin": {
+ "html-minifier-terser": "cli.js"
+ },
+ "engines": {
+ "node": ">=12"
+ }
+ },
+ "node_modules/html-minifier-terser/node_modules/commander": {
+ "version": "8.3.0",
+ "resolved": "https://registry.npmjs.org/commander/-/commander-8.3.0.tgz",
+ "integrity": "sha512-OkTL9umf+He2DZkUq8f8J9of7yL6RJKI24dVITBmNfZBmri9zYZQrKkuXiKhyfPSu8tUhnVBB1iKXevvnlR4Ww==",
+ "engines": {
+ "node": ">= 12"
+ }
+ },
+ "node_modules/html-tags": {
+ "version": "3.2.0",
+ "resolved": "https://registry.npmjs.org/html-tags/-/html-tags-3.2.0.tgz",
+ "integrity": "sha512-vy7ClnArOZwCnqZgvv+ddgHgJiAFXe3Ge9ML5/mBctVJoUoYPCdxVucOywjDARn6CVoh3dRSFdPHy2sX80L0Wg==",
+ "engines": {
+ "node": ">=8"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/html-void-elements": {
+ "version": "1.0.5",
+ "resolved": "https://registry.npmjs.org/html-void-elements/-/html-void-elements-1.0.5.tgz",
+ "integrity": "sha512-uE/TxKuyNIcx44cIWnjr/rfIATDH7ZaOMmstu0CwhFG1Dunhlp4OC6/NMbhiwoq5BpW0ubi303qnEk/PZj614w==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/html-webpack-plugin": {
+ "version": "5.5.0",
+ "resolved": "https://registry.npmjs.org/html-webpack-plugin/-/html-webpack-plugin-5.5.0.tgz",
+ "integrity": "sha512-sy88PC2cRTVxvETRgUHFrL4No3UxvcH8G1NepGhqaTT+GXN2kTamqasot0inS5hXeg1cMbFDt27zzo9p35lZVw==",
+ "dependencies": {
+ "@types/html-minifier-terser": "^6.0.0",
+ "html-minifier-terser": "^6.0.2",
+ "lodash": "^4.17.21",
+ "pretty-error": "^4.0.0",
+ "tapable": "^2.0.0"
+ },
+ "engines": {
+ "node": ">=10.13.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/html-webpack-plugin"
+ },
+ "peerDependencies": {
+ "webpack": "^5.20.0"
+ }
+ },
+ "node_modules/htmlparser2": {
+ "version": "8.0.2",
+ "resolved": "https://registry.npmjs.org/htmlparser2/-/htmlparser2-8.0.2.tgz",
+ "integrity": "sha512-GYdjWKDkbRLkZ5geuHs5NY1puJ+PXwP7+fHPRz06Eirsb9ugf6d8kkXav6ADhcODhFFPMIXyxkxSuMf3D6NCFA==",
+ "funding": [
+ "https://github.com/fb55/htmlparser2?sponsor=1",
+ {
+ "type": "github",
+ "url": "https://github.com/sponsors/fb55"
+ }
+ ],
+ "dependencies": {
+ "domelementtype": "^2.3.0",
+ "domhandler": "^5.0.3",
+ "domutils": "^3.0.1",
+ "entities": "^4.4.0"
+ }
+ },
+ "node_modules/http-cache-semantics": {
+ "version": "4.1.1",
+ "resolved": "https://registry.npmjs.org/http-cache-semantics/-/http-cache-semantics-4.1.1.tgz",
+ "integrity": "sha512-er295DKPVsV82j5kw1Gjt+ADA/XYHsajl82cGNQG2eyoPkvgUhX+nDIyelzhIWbbsXP39EHcI6l5tYs2FYqYXQ=="
+ },
+ "node_modules/http-deceiver": {
+ "version": "1.2.7",
+ "resolved": "https://registry.npmjs.org/http-deceiver/-/http-deceiver-1.2.7.tgz",
+ "integrity": "sha512-LmpOGxTfbpgtGVxJrj5k7asXHCgNZp5nLfp+hWc8QQRqtb7fUy6kRY3BO1h9ddF6yIPYUARgxGOwB42DnxIaNw=="
+ },
+ "node_modules/http-errors": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/http-errors/-/http-errors-2.0.0.tgz",
+ "integrity": "sha512-FtwrG/euBzaEjYeRqOgly7G0qviiXoJWnvEH2Z1plBdXgbyjv34pHTSb9zoeHMyDy33+DWy5Wt9Wo+TURtOYSQ==",
+ "dependencies": {
+ "depd": "2.0.0",
+ "inherits": "2.0.4",
+ "setprototypeof": "1.2.0",
+ "statuses": "2.0.1",
+ "toidentifier": "1.0.1"
+ },
+ "engines": {
+ "node": ">= 0.8"
+ }
+ },
+ "node_modules/http-parser-js": {
+ "version": "0.5.8",
+ "resolved": "https://registry.npmjs.org/http-parser-js/-/http-parser-js-0.5.8.tgz",
+ "integrity": "sha512-SGeBX54F94Wgu5RH3X5jsDtf4eHyRogWX1XGT3b4HuW3tQPM4AaBzoUji/4AAJNXCEOWZ5O0DgZmJw1947gD5Q=="
+ },
+ "node_modules/http-proxy": {
+ "version": "1.18.1",
+ "resolved": "https://registry.npmjs.org/http-proxy/-/http-proxy-1.18.1.tgz",
+ "integrity": "sha512-7mz/721AbnJwIVbnaSv1Cz3Am0ZLT/UBwkC92VlxhXv/k/BBQfM2fXElQNC27BVGr0uwUpplYPQM9LnaBMR5NQ==",
+ "dependencies": {
+ "eventemitter3": "^4.0.0",
+ "follow-redirects": "^1.0.0",
+ "requires-port": "^1.0.0"
+ },
+ "engines": {
+ "node": ">=8.0.0"
+ }
+ },
+ "node_modules/http-proxy-middleware": {
+ "version": "2.0.9",
+ "resolved": "https://registry.npmjs.org/http-proxy-middleware/-/http-proxy-middleware-2.0.9.tgz",
+ "integrity": "sha512-c1IyJYLYppU574+YI7R4QyX2ystMtVXZwIdzazUIPIJsHuWNd+mho2j+bKoHftndicGj9yh+xjd+l0yj7VeT1Q==",
+ "license": "MIT",
+ "dependencies": {
+ "@types/http-proxy": "^1.17.8",
+ "http-proxy": "^1.18.1",
+ "is-glob": "^4.0.1",
+ "is-plain-obj": "^3.0.0",
+ "micromatch": "^4.0.2"
+ },
+ "engines": {
+ "node": ">=12.0.0"
+ },
+ "peerDependencies": {
+ "@types/express": "^4.17.13"
+ },
+ "peerDependenciesMeta": {
+ "@types/express": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/http-proxy-middleware/node_modules/is-plain-obj": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/is-plain-obj/-/is-plain-obj-3.0.0.tgz",
+ "integrity": "sha512-gwsOE28k+23GP1B6vFl1oVh/WOzmawBrKwo5Ev6wMKzPkaXaCDIQKzLnvsA42DRlbVTWorkgTKIviAKCWkfUwA==",
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/http2-client": {
+ "version": "1.3.5",
+ "resolved": "https://registry.npmjs.org/http2-client/-/http2-client-1.3.5.tgz",
+ "integrity": "sha512-EC2utToWl4RKfs5zd36Mxq7nzHHBuomZboI0yYL6Y0RmBgT7Sgkq4rQ0ezFTYoIsSs7Tm9SJe+o2FcAg6GBhGA=="
+ },
+ "node_modules/human-signals": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmjs.org/human-signals/-/human-signals-2.1.0.tgz",
+ "integrity": "sha512-B4FFZ6q/T2jhhksgkbEW3HBvWIfDW85snkQgawt07S7J5QXTk6BkNV+0yAeZrM5QpMAdYlocGoljn0sJ/WQkFw==",
+ "engines": {
+ "node": ">=10.17.0"
+ }
+ },
+ "node_modules/iconv-lite": {
+ "version": "0.4.24",
+ "resolved": "https://registry.npmjs.org/iconv-lite/-/iconv-lite-0.4.24.tgz",
+ "integrity": "sha512-v3MXnZAcvnywkTUEZomIActle7RXXeedOR31wwl7VlyoXO4Qi9arvSenNQWne1TcRwhCL1HwLI21bEqdpj8/rA==",
+ "dependencies": {
+ "safer-buffer": ">= 2.1.2 < 3"
+ },
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/icss-utils": {
+ "version": "5.1.0",
+ "resolved": "https://registry.npmjs.org/icss-utils/-/icss-utils-5.1.0.tgz",
+ "integrity": "sha512-soFhflCVWLfRNOPU3iv5Z9VUdT44xFRbzjLsEzSr5AQmgqPMTHdU3PMT1Cf1ssx8fLNJDA1juftYl+PUcv3MqA==",
+ "engines": {
+ "node": "^10 || ^12 || >= 14"
+ },
+ "peerDependencies": {
+ "postcss": "^8.1.0"
+ }
+ },
+ "node_modules/ignore": {
+ "version": "5.2.4",
+ "resolved": "https://registry.npmjs.org/ignore/-/ignore-5.2.4.tgz",
+ "integrity": "sha512-MAb38BcSbH0eHNBxn7ql2NH/kX33OkB3lZ1BNdh7ENeRChHTYsTvWrMubiIAMNS2llXEEgZ1MUOBtXChP3kaFQ==",
+ "engines": {
+ "node": ">= 4"
+ }
+ },
+ "node_modules/image-size": {
+ "version": "1.0.2",
+ "resolved": "https://registry.npmjs.org/image-size/-/image-size-1.0.2.tgz",
+ "integrity": "sha512-xfOoWjceHntRb3qFCrh5ZFORYH8XCdYpASltMhZ/Q0KZiOwjdE/Yl2QCiWdwD+lygV5bMCvauzgu5PxBX/Yerg==",
+ "dependencies": {
+ "queue": "6.0.2"
+ },
+ "bin": {
+ "image-size": "bin/image-size.js"
+ },
+ "engines": {
+ "node": ">=14.0.0"
+ }
+ },
+ "node_modules/immer": {
+ "version": "9.0.21",
+ "resolved": "https://registry.npmjs.org/immer/-/immer-9.0.21.tgz",
+ "integrity": "sha512-bc4NBHqOqSfRW7POMkHd51LvClaeMXpm8dx0e8oE2GORbq5aRK7Bxl4FyzVLdGtLmvLKL7BTDBG5ACQm4HWjTA==",
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/immer"
+ }
+ },
+ "node_modules/import-fresh": {
+ "version": "3.3.0",
+ "resolved": "https://registry.npmjs.org/import-fresh/-/import-fresh-3.3.0.tgz",
+ "integrity": "sha512-veYYhQa+D1QBKznvhUHxb8faxlrwUnxseDAbAp457E0wLNio2bOSKnjYDhMj+YiAq61xrMGhQk9iXVk5FzgQMw==",
+ "dependencies": {
+ "parent-module": "^1.0.0",
+ "resolve-from": "^4.0.0"
+ },
+ "engines": {
+ "node": ">=6"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/import-lazy": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmjs.org/import-lazy/-/import-lazy-2.1.0.tgz",
+ "integrity": "sha512-m7ZEHgtw69qOGw+jwxXkHlrlIPdTGkyh66zXZ1ajZbxkDBNjSY/LGbmjc7h0s2ELsUDTAhFr55TrPSSqJGPG0A==",
+ "engines": {
+ "node": ">=4"
+ }
+ },
+ "node_modules/imurmurhash": {
+ "version": "0.1.4",
+ "resolved": "https://registry.npmjs.org/imurmurhash/-/imurmurhash-0.1.4.tgz",
+ "integrity": "sha512-JmXMZ6wuvDmLiHEml9ykzqO6lwFbof0GG4IkcGaENdCRDDmMVnny7s5HsIgHCbaq0w2MyPhDqkhTUgS2LU2PHA==",
+ "engines": {
+ "node": ">=0.8.19"
+ }
+ },
+ "node_modules/indent-string": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/indent-string/-/indent-string-4.0.0.tgz",
+ "integrity": "sha512-EdDDZu4A2OyIK7Lr/2zG+w5jmbuk1DVBnEwREQvBzspBJkCEbRa8GxU1lghYcaGJCnRWibjDXlq779X1/y5xwg==",
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/infima": {
+ "version": "0.2.0-alpha.43",
+ "resolved": "https://registry.npmjs.org/infima/-/infima-0.2.0-alpha.43.tgz",
+ "integrity": "sha512-2uw57LvUqW0rK/SWYnd/2rRfxNA5DDNOh33jxF7fy46VWoNhGxiUQyVZHbBMjQ33mQem0cjdDVwgWVAmlRfgyQ==",
+ "engines": {
+ "node": ">=12"
+ }
+ },
+ "node_modules/inflight": {
+ "version": "1.0.6",
+ "resolved": "https://registry.npmjs.org/inflight/-/inflight-1.0.6.tgz",
+ "integrity": "sha512-k92I/b08q4wvFscXCLvqfsHCrjrF7yiXsQuIVvVE7N82W3+aqpzuUdBbfhWcy/FZR3/4IgflMgKLOsvPDrGCJA==",
+ "dependencies": {
+ "once": "^1.3.0",
+ "wrappy": "1"
+ }
+ },
+ "node_modules/inherits": {
+ "version": "2.0.4",
+ "resolved": "https://registry.npmjs.org/inherits/-/inherits-2.0.4.tgz",
+ "integrity": "sha512-k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ=="
+ },
+ "node_modules/ini": {
+ "version": "1.3.8",
+ "resolved": "https://registry.npmjs.org/ini/-/ini-1.3.8.tgz",
+ "integrity": "sha512-JV/yugV2uzW5iMRSiZAyDtQd+nxtUnjeLt0acNdw98kKLrvuRVyB80tsREOE7yvGVgalhZ6RNXCmEHkUKBKxew=="
+ },
+ "node_modules/inline-style-parser": {
+ "version": "0.1.1",
+ "resolved": "https://registry.npmjs.org/inline-style-parser/-/inline-style-parser-0.1.1.tgz",
+ "integrity": "sha512-7NXolsK4CAS5+xvdj5OMMbI962hU/wvwoxk+LWR9Ek9bVtyuuYScDN6eS0rUm6TxApFpw7CX1o4uJzcd4AyD3Q=="
+ },
+ "node_modules/interpret": {
+ "version": "1.4.0",
+ "resolved": "https://registry.npmjs.org/interpret/-/interpret-1.4.0.tgz",
+ "integrity": "sha512-agE4QfB2Lkp9uICn7BAqoscw4SZP9kTE2hxiFI3jBPmXJfdqiahTbUuKGsMoN2GtqL9AxhYioAcVvgsb1HvRbA==",
+ "engines": {
+ "node": ">= 0.10"
+ }
+ },
+ "node_modules/invariant": {
+ "version": "2.2.4",
+ "resolved": "https://registry.npmjs.org/invariant/-/invariant-2.2.4.tgz",
+ "integrity": "sha512-phJfQVBuaJM5raOpJjSfkiD6BpbCE4Ns//LaXl6wGYtUBY83nWS6Rf9tXm2e8VaK60JEjYldbPif/A2B1C2gNA==",
+ "dependencies": {
+ "loose-envify": "^1.0.0"
+ }
+ },
+ "node_modules/ipaddr.js": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/ipaddr.js/-/ipaddr.js-2.0.1.tgz",
+ "integrity": "sha512-1qTgH9NG+IIJ4yfKs2e6Pp1bZg8wbDbKHT21HrLIeYBTRLgMYKnMTPAuI3Lcs61nfx5h1xlXnbJtH1kX5/d/ng==",
+ "engines": {
+ "node": ">= 10"
+ }
+ },
+ "node_modules/is-alphabetical": {
+ "version": "1.0.4",
+ "resolved": "https://registry.npmjs.org/is-alphabetical/-/is-alphabetical-1.0.4.tgz",
+ "integrity": "sha512-DwzsA04LQ10FHTZuL0/grVDk4rFoVH1pjAToYwBrHSxcrBIGQuXrQMtD5U1b0U2XVgKZCTLLP8u2Qxqhy3l2Vg==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/is-alphanumerical": {
+ "version": "1.0.4",
+ "resolved": "https://registry.npmjs.org/is-alphanumerical/-/is-alphanumerical-1.0.4.tgz",
+ "integrity": "sha512-UzoZUr+XfVz3t3v4KyGEniVL9BDRoQtY7tOyrRybkVNjDFWyo1yhXNGrrBTQxp3ib9BLAWs7k2YKBQsFRkZG9A==",
+ "dependencies": {
+ "is-alphabetical": "^1.0.0",
+ "is-decimal": "^1.0.0"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/is-arrayish": {
+ "version": "0.2.1",
+ "resolved": "https://registry.npmjs.org/is-arrayish/-/is-arrayish-0.2.1.tgz",
+ "integrity": "sha512-zz06S8t0ozoDXMG+ube26zeCTNXcKIPJZJi8hBrF4idCLms4CG9QtK7qBl1boi5ODzFpjswb5JPmHCbMpjaYzg=="
+ },
+ "node_modules/is-binary-path": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmjs.org/is-binary-path/-/is-binary-path-2.1.0.tgz",
+ "integrity": "sha512-ZMERYes6pDydyuGidse7OsHxtbI7WVeUEozgR/g7rd0xUimYNlvZRE/K2MgZTjWy725IfelLeVcEM97mmtRGXw==",
+ "dependencies": {
+ "binary-extensions": "^2.0.0"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/is-buffer": {
+ "version": "2.0.5",
+ "resolved": "https://registry.npmjs.org/is-buffer/-/is-buffer-2.0.5.tgz",
+ "integrity": "sha512-i2R6zNFDwgEHJyQUtJEk0XFi1i0dPFn/oqjK3/vPCcDeJvW5NQ83V8QbicfF1SupOaB0h8ntgBC2YiE7dfyctQ==",
+ "funding": [
+ {
+ "type": "github",
+ "url": "https://github.com/sponsors/feross"
+ },
+ {
+ "type": "patreon",
+ "url": "https://www.patreon.com/feross"
+ },
+ {
+ "type": "consulting",
+ "url": "https://feross.org/support"
+ }
+ ],
+ "engines": {
+ "node": ">=4"
+ }
+ },
+ "node_modules/is-ci": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/is-ci/-/is-ci-2.0.0.tgz",
+ "integrity": "sha512-YfJT7rkpQB0updsdHLGWrvhBJfcfzNNawYDNIyQXJz0IViGf75O8EBPKSdvw2rF+LGCsX4FZ8tcr3b19LcZq4w==",
+ "dependencies": {
+ "ci-info": "^2.0.0"
+ },
+ "bin": {
+ "is-ci": "bin.js"
+ }
+ },
+ "node_modules/is-ci/node_modules/ci-info": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/ci-info/-/ci-info-2.0.0.tgz",
+ "integrity": "sha512-5tK7EtrZ0N+OLFMthtqOj4fI2Jeb88C4CAZPu25LDVUgXJ0A3Js4PMGqrn0JU1W0Mh1/Z8wZzYPxqUrXeBboCQ=="
+ },
+ "node_modules/is-core-module": {
+ "version": "2.11.0",
+ "resolved": "https://registry.npmjs.org/is-core-module/-/is-core-module-2.11.0.tgz",
+ "integrity": "sha512-RRjxlvLDkD1YJwDbroBHMb+cukurkDWNyHx7D3oNB5x9rb5ogcksMC5wHCadcXoo67gVr/+3GFySh3134zi6rw==",
+ "dependencies": {
+ "has": "^1.0.3"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/ljharb"
+ }
+ },
+ "node_modules/is-decimal": {
+ "version": "1.0.4",
+ "resolved": "https://registry.npmjs.org/is-decimal/-/is-decimal-1.0.4.tgz",
+ "integrity": "sha512-RGdriMmQQvZ2aqaQq3awNA6dCGtKpiDFcOzrTWrDAT2MiWrKQVPmxLGHl7Y2nNu6led0kEyoX0enY0qXYsv9zw==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/is-docker": {
+ "version": "2.2.1",
+ "resolved": "https://registry.npmjs.org/is-docker/-/is-docker-2.2.1.tgz",
+ "integrity": "sha512-F+i2BKsFrH66iaUFc0woD8sLy8getkwTwtOBjvs56Cx4CgJDeKQeqfz8wAYiSb8JOprWhHH5p77PbmYCvvUuXQ==",
+ "bin": {
+ "is-docker": "cli.js"
+ },
+ "engines": {
+ "node": ">=8"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/is-extendable": {
+ "version": "0.1.1",
+ "resolved": "https://registry.npmjs.org/is-extendable/-/is-extendable-0.1.1.tgz",
+ "integrity": "sha512-5BMULNob1vgFX6EjQw5izWDxrecWK9AM72rugNr0TFldMOi0fj6Jk+zeKIt0xGj4cEfQIJth4w3OKWOJ4f+AFw==",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/is-extglob": {
+ "version": "2.1.1",
+ "resolved": "https://registry.npmjs.org/is-extglob/-/is-extglob-2.1.1.tgz",
+ "integrity": "sha512-SbKbANkN603Vi4jEZv49LeVJMn4yGwsbzZworEoyEiutsN3nJYdbO36zfhGJ6QEDpOZIFkDtnq5JRxmvl3jsoQ==",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/is-fullwidth-code-point": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/is-fullwidth-code-point/-/is-fullwidth-code-point-3.0.0.tgz",
+ "integrity": "sha512-zymm5+u+sCsSWyD9qNaejV3DFvhCKclKdizYaJUuHA83RLjb7nSuGnddCHGv0hk+KY7BMAlsWeK4Ueg6EV6XQg==",
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/is-glob": {
+ "version": "4.0.3",
+ "resolved": "https://registry.npmjs.org/is-glob/-/is-glob-4.0.3.tgz",
+ "integrity": "sha512-xelSayHH36ZgE7ZWhli7pW34hNbNl8Ojv5KVmkJD4hBdD3th8Tfk9vYasLM+mXWOZhFkgZfxhLSnrwRr4elSSg==",
+ "dependencies": {
+ "is-extglob": "^2.1.1"
+ },
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/is-hexadecimal": {
+ "version": "1.0.4",
+ "resolved": "https://registry.npmjs.org/is-hexadecimal/-/is-hexadecimal-1.0.4.tgz",
+ "integrity": "sha512-gyPJuv83bHMpocVYoqof5VDiZveEoGoFL8m3BXNb2VW8Xs+rz9kqO8LOQ5DH6EsuvilT1ApazU0pyl+ytbPtlw==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/is-installed-globally": {
+ "version": "0.4.0",
+ "resolved": "https://registry.npmjs.org/is-installed-globally/-/is-installed-globally-0.4.0.tgz",
+ "integrity": "sha512-iwGqO3J21aaSkC7jWnHP/difazwS7SFeIqxv6wEtLU8Y5KlzFTjyqcSIT0d8s4+dDhKytsk9PJZ2BkS5eZwQRQ==",
+ "dependencies": {
+ "global-dirs": "^3.0.0",
+ "is-path-inside": "^3.0.2"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/is-npm": {
+ "version": "5.0.0",
+ "resolved": "https://registry.npmjs.org/is-npm/-/is-npm-5.0.0.tgz",
+ "integrity": "sha512-WW/rQLOazUq+ST/bCAVBp/2oMERWLsR7OrKyt052dNDk4DHcDE0/7QSXITlmi+VBcV13DfIbysG3tZJm5RfdBA==",
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/is-number": {
+ "version": "7.0.0",
+ "resolved": "https://registry.npmjs.org/is-number/-/is-number-7.0.0.tgz",
+ "integrity": "sha512-41Cifkg6e8TylSpdtTpeLVMqvSBEVzTttHvERD741+pnZ8ANv0004MRL43QKPDlK9cGvNp6NZWZUBlbGXYxxng==",
+ "engines": {
+ "node": ">=0.12.0"
+ }
+ },
+ "node_modules/is-obj": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmjs.org/is-obj/-/is-obj-1.0.1.tgz",
+ "integrity": "sha512-l4RyHgRqGN4Y3+9JHVrNqO+tN0rV5My76uW5/nuO4K1b6vw5G8d/cmFjP9tRfEsdhZNt0IFdZuK/c2Vr4Nb+Qg==",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/is-path-cwd": {
+ "version": "2.2.0",
+ "resolved": "https://registry.npmjs.org/is-path-cwd/-/is-path-cwd-2.2.0.tgz",
+ "integrity": "sha512-w942bTcih8fdJPJmQHFzkS76NEP8Kzzvmw92cXsazb8intwLqPibPPdXf4ANdKV3rYMuuQYGIWtvz9JilB3NFQ==",
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/is-path-inside": {
+ "version": "3.0.3",
+ "resolved": "https://registry.npmjs.org/is-path-inside/-/is-path-inside-3.0.3.tgz",
+ "integrity": "sha512-Fd4gABb+ycGAmKou8eMftCupSir5lRxqf4aD/vd0cD2qc4HL07OjCeuHMr8Ro4CoMaeCKDB0/ECBOVWjTwUvPQ==",
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/is-plain-obj": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmjs.org/is-plain-obj/-/is-plain-obj-2.1.0.tgz",
+ "integrity": "sha512-YWnfyRwxL/+SsrWYfOpUtz5b3YD+nyfkHvjbcanzk8zgyO4ASD67uVMRt8k5bM4lLMDnXfriRhOpemw+NfT1eA==",
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/is-plain-object": {
+ "version": "2.0.4",
+ "resolved": "https://registry.npmjs.org/is-plain-object/-/is-plain-object-2.0.4.tgz",
+ "integrity": "sha512-h5PpgXkWitc38BBMYawTYMWJHFZJVnBquFE57xFpjB8pJFiF6gZ+bU+WyI/yqXiFR5mdLsgYNaPe8uao6Uv9Og==",
+ "dependencies": {
+ "isobject": "^3.0.1"
+ },
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/is-regexp": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmjs.org/is-regexp/-/is-regexp-1.0.0.tgz",
+ "integrity": "sha512-7zjFAPO4/gwyQAAgRRmqeEeyIICSdmCqa3tsVHMdBzaXXRiqopZL4Cyghg/XulGWrtABTpbnYYzzIRffLkP4oA==",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/is-root": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmjs.org/is-root/-/is-root-2.1.0.tgz",
+ "integrity": "sha512-AGOriNp96vNBd3HtU+RzFEc75FfR5ymiYv8E553I71SCeXBiMsVDUtdio1OEFvrPyLIQ9tVR5RxXIFe5PUFjMg==",
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/is-stream": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/is-stream/-/is-stream-2.0.1.tgz",
+ "integrity": "sha512-hFoiJiTl63nn+kstHGBtewWSKnQLpyb155KHheA1l39uvtO9nWIop1p3udqPcUd/xbF1VLMO4n7OI6p7RbngDg==",
+ "engines": {
+ "node": ">=8"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/is-typedarray": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmjs.org/is-typedarray/-/is-typedarray-1.0.0.tgz",
+ "integrity": "sha512-cyA56iCMHAh5CdzjJIa4aohJyeO1YbwLi3Jc35MmRU6poroFjIGZzUzupGiRPOjgHg9TLu43xbpwXk523fMxKA=="
+ },
+ "node_modules/is-whitespace-character": {
+ "version": "1.0.4",
+ "resolved": "https://registry.npmjs.org/is-whitespace-character/-/is-whitespace-character-1.0.4.tgz",
+ "integrity": "sha512-SDweEzfIZM0SJV0EUga669UTKlmL0Pq8Lno0QDQsPnvECB3IM2aP0gdx5TrU0A01MAPfViaZiI2V1QMZLaKK5w==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/is-word-character": {
+ "version": "1.0.4",
+ "resolved": "https://registry.npmjs.org/is-word-character/-/is-word-character-1.0.4.tgz",
+ "integrity": "sha512-5SMO8RVennx3nZrqtKwCGyyetPE9VDba5ugvKLaD4KopPG5kR4mQ7tNt/r7feL5yt5h3lpuBbIUmCOG2eSzXHA==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/is-wsl": {
+ "version": "2.2.0",
+ "resolved": "https://registry.npmjs.org/is-wsl/-/is-wsl-2.2.0.tgz",
+ "integrity": "sha512-fKzAra0rGJUUBwGBgNkHZuToZcn+TtXHpeCgmkMJMMYx1sQDYaCSyjJBSCa2nH1DGm7s3n1oBnohoVTBaN7Lww==",
+ "dependencies": {
+ "is-docker": "^2.0.0"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/is-yarn-global": {
+ "version": "0.3.0",
+ "resolved": "https://registry.npmjs.org/is-yarn-global/-/is-yarn-global-0.3.0.tgz",
+ "integrity": "sha512-VjSeb/lHmkoyd8ryPVIKvOCn4D1koMqY+vqyjjUfc3xyKtP4dYOxM44sZrnqQSzSds3xyOrUTLTC9LVCVgLngw=="
+ },
+ "node_modules/isarray": {
+ "version": "0.0.1",
+ "resolved": "https://registry.npmjs.org/isarray/-/isarray-0.0.1.tgz",
+ "integrity": "sha512-D2S+3GLxWH+uhrNEcoh/fnmYeP8E8/zHl644d/jdA0g2uyXvy3sb0qxotE+ne0LtccHknQzWwZEzhak7oJ0COQ=="
+ },
+ "node_modules/isexe": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/isexe/-/isexe-2.0.0.tgz",
+ "integrity": "sha512-RHxMLp9lnKHGHRng9QFhRCMbYAcVpn69smSGcq3f36xjgVVWThj4qqLbTLlq7Ssj8B+fIQ1EuCEGI2lKsyQeIw=="
+ },
+ "node_modules/isobject": {
+ "version": "3.0.1",
+ "resolved": "https://registry.npmjs.org/isobject/-/isobject-3.0.1.tgz",
+ "integrity": "sha512-WhB9zCku7EGTj/HQQRz5aUQEUeoQZH2bWcltRErOpymJ4boYE6wL9Tbr23krRPSZ+C5zqNSrSw+Cc7sZZ4b7vg==",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/jest-util": {
+ "version": "29.5.0",
+ "resolved": "https://registry.npmjs.org/jest-util/-/jest-util-29.5.0.tgz",
+ "integrity": "sha512-RYMgG/MTadOr5t8KdhejfvUU82MxsCu5MF6KuDUHl+NuwzUt+Sm6jJWxTJVrDR1j5M/gJVCPKQEpWXY+yIQ6lQ==",
+ "dependencies": {
+ "@jest/types": "^29.5.0",
+ "@types/node": "*",
+ "chalk": "^4.0.0",
+ "ci-info": "^3.2.0",
+ "graceful-fs": "^4.2.9",
+ "picomatch": "^2.2.3"
+ },
+ "engines": {
+ "node": "^14.15.0 || ^16.10.0 || >=18.0.0"
+ }
+ },
+ "node_modules/jest-worker": {
+ "version": "29.5.0",
+ "resolved": "https://registry.npmjs.org/jest-worker/-/jest-worker-29.5.0.tgz",
+ "integrity": "sha512-NcrQnevGoSp4b5kg+akIpthoAFHxPBcb5P6mYPY0fUNT+sSvmtu6jlkEle3anczUKIKEbMxFimk9oTP/tpIPgA==",
+ "dependencies": {
+ "@types/node": "*",
+ "jest-util": "^29.5.0",
+ "merge-stream": "^2.0.0",
+ "supports-color": "^8.0.0"
+ },
+ "engines": {
+ "node": "^14.15.0 || ^16.10.0 || >=18.0.0"
+ }
+ },
+ "node_modules/jest-worker/node_modules/supports-color": {
+ "version": "8.1.1",
+ "resolved": "https://registry.npmjs.org/supports-color/-/supports-color-8.1.1.tgz",
+ "integrity": "sha512-MpUEN2OodtUzxvKQl72cUF7RQ5EiHsGvSsVG0ia9c5RbWGL2CI4C7EpPS8UTBIplnlzZiNuV56w+FuNxy3ty2Q==",
+ "dependencies": {
+ "has-flag": "^4.0.0"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/chalk/supports-color?sponsor=1"
+ }
+ },
+ "node_modules/joi": {
+ "version": "17.9.1",
+ "resolved": "https://registry.npmjs.org/joi/-/joi-17.9.1.tgz",
+ "integrity": "sha512-FariIi9j6QODKATGBrEX7HZcja8Bsh3rfdGYy/Sb65sGlZWK/QWesU1ghk7aJWDj95knjXlQfSmzFSPPkLVsfw==",
+ "dependencies": {
+ "@hapi/hoek": "^9.0.0",
+ "@hapi/topo": "^5.0.0",
+ "@sideway/address": "^4.1.3",
+ "@sideway/formula": "^3.0.1",
+ "@sideway/pinpoint": "^2.0.0"
+ }
+ },
+ "node_modules/js-levenshtein": {
+ "version": "1.1.6",
+ "resolved": "https://registry.npmjs.org/js-levenshtein/-/js-levenshtein-1.1.6.tgz",
+ "integrity": "sha512-X2BB11YZtrRqY4EnQcLX5Rh373zbK4alC1FW7D7MBhL2gtcC17cTnr6DmfHZeS0s2rTHjUTMMHfG7gO8SSdw+g==",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/js-tokens": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/js-tokens/-/js-tokens-4.0.0.tgz",
+ "integrity": "sha512-RdJUflcE3cUzKiMqQgsCu06FPu9UdIJO0beYbPhHN4k6apgJtifcoCtT9bcxOpYBtpD2kCM6Sbzg4CausW/PKQ=="
+ },
+ "node_modules/js-yaml": {
+ "version": "4.1.0",
+ "resolved": "https://registry.npmjs.org/js-yaml/-/js-yaml-4.1.0.tgz",
+ "integrity": "sha512-wpxZs9NoxZaJESJGIZTyDEaYpl0FKSA+FB9aJiyemKhMwkxQg63h4T1KJgUGHpTqPDNRcmmYLugrRjJlBtWvRA==",
+ "dependencies": {
+ "argparse": "^2.0.1"
+ },
+ "bin": {
+ "js-yaml": "bin/js-yaml.js"
+ }
+ },
+ "node_modules/jsesc": {
+ "version": "2.5.2",
+ "resolved": "https://registry.npmjs.org/jsesc/-/jsesc-2.5.2.tgz",
+ "integrity": "sha512-OYu7XEzjkCQ3C5Ps3QIZsQfNpqoJyZZA99wd9aWd05NCtC5pWOkShK2mkL6HXQR6/Cy2lbNdPlZBpuQHXE63gA==",
+ "bin": {
+ "jsesc": "bin/jsesc"
+ },
+ "engines": {
+ "node": ">=4"
+ }
+ },
+ "node_modules/json-buffer": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/json-buffer/-/json-buffer-3.0.0.tgz",
+ "integrity": "sha512-CuUqjv0FUZIdXkHPI8MezCnFCdaTAacej1TZYulLoAg1h/PhwkdXFN4V/gzY4g+fMBCOV2xF+rp7t2XD2ns/NQ=="
+ },
+ "node_modules/json-parse-even-better-errors": {
+ "version": "2.3.1",
+ "resolved": "https://registry.npmjs.org/json-parse-even-better-errors/-/json-parse-even-better-errors-2.3.1.tgz",
+ "integrity": "sha512-xyFwyhro/JEof6Ghe2iz2NcXoj2sloNsWr/XsERDK/oiPCfaNhl5ONfp+jQdAZRQQ0IJWNzH9zIZF7li91kh2w=="
+ },
+ "node_modules/json-pointer": {
+ "version": "0.6.2",
+ "resolved": "https://registry.npmjs.org/json-pointer/-/json-pointer-0.6.2.tgz",
+ "integrity": "sha512-vLWcKbOaXlO+jvRy4qNd+TI1QUPZzfJj1tpJ3vAXDych5XJf93ftpUKe5pKCrzyIIwgBJcOcCVRUfqQP25afBw==",
+ "dependencies": {
+ "foreach": "^2.0.4"
+ }
+ },
+ "node_modules/json-schema-traverse": {
+ "version": "0.4.1",
+ "resolved": "https://registry.npmjs.org/json-schema-traverse/-/json-schema-traverse-0.4.1.tgz",
+ "integrity": "sha512-xbbCH5dCYU5T8LcEhhuh7HJ88HXuW3qsI3Y0zOZFKfZEHcpWiHU/Jxzk629Brsab/mMiHQti9wMP+845RPe3Vg=="
+ },
+ "node_modules/json5": {
+ "version": "2.2.3",
+ "resolved": "https://registry.npmjs.org/json5/-/json5-2.2.3.tgz",
+ "integrity": "sha512-XmOWe7eyHYH14cLdVPoyg+GOH3rYX++KpzrylJwSW98t3Nk+U8XOl8FWKOgwtzdb8lXGf6zYwDUzeHMWfxasyg==",
+ "bin": {
+ "json5": "lib/cli.js"
+ },
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/jsonfile": {
+ "version": "6.1.0",
+ "resolved": "https://registry.npmjs.org/jsonfile/-/jsonfile-6.1.0.tgz",
+ "integrity": "sha512-5dgndWOriYSm5cnYaJNhalLNDKOqFwyDB/rr1E9ZsGciGvKPs8R2xYGCacuf3z6K1YKDz182fd+fY3cn3pMqXQ==",
+ "dependencies": {
+ "universalify": "^2.0.0"
+ },
+ "optionalDependencies": {
+ "graceful-fs": "^4.1.6"
+ }
+ },
+ "node_modules/keyv": {
+ "version": "3.1.0",
+ "resolved": "https://registry.npmjs.org/keyv/-/keyv-3.1.0.tgz",
+ "integrity": "sha512-9ykJ/46SN/9KPM/sichzQ7OvXyGDYKGTaDlKMGCAlg2UK8KRy4jb0d8sFc+0Tt0YYnThq8X2RZgCg74RPxgcVA==",
+ "dependencies": {
+ "json-buffer": "3.0.0"
+ }
+ },
+ "node_modules/kind-of": {
+ "version": "6.0.3",
+ "resolved": "https://registry.npmjs.org/kind-of/-/kind-of-6.0.3.tgz",
+ "integrity": "sha512-dcS1ul+9tmeD95T+x28/ehLgd9mENa3LsvDTtzm3vyBEO7RPptvAD+t44WVXaUjTBRcrpFeFlC8WCruUR456hw==",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/kleur": {
+ "version": "3.0.3",
+ "resolved": "https://registry.npmjs.org/kleur/-/kleur-3.0.3.tgz",
+ "integrity": "sha512-eTIzlVOSUR+JxdDFepEYcBMtZ9Qqdef+rnzWdRZuMbOywu5tO2w2N7rqjoANZ5k9vywhL6Br1VRjUIgTQx4E8w==",
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/klona": {
+ "version": "2.0.6",
+ "resolved": "https://registry.npmjs.org/klona/-/klona-2.0.6.tgz",
+ "integrity": "sha512-dhG34DXATL5hSxJbIexCft8FChFXtmskoZYnoPWjXQuebWYCNkVeV3KkGegCK9CP1oswI/vQibS2GY7Em/sJJA==",
+ "engines": {
+ "node": ">= 8"
+ }
+ },
+ "node_modules/latest-version": {
+ "version": "5.1.0",
+ "resolved": "https://registry.npmjs.org/latest-version/-/latest-version-5.1.0.tgz",
+ "integrity": "sha512-weT+r0kTkRQdCdYCNtkMwWXQTMEswKrFBkm4ckQOMVhhqhIMI1UT2hMj+1iigIhgSZm5gTmrRXBNoGUgaTY1xA==",
+ "dependencies": {
+ "package-json": "^6.3.0"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/launch-editor": {
+ "version": "2.6.0",
+ "resolved": "https://registry.npmjs.org/launch-editor/-/launch-editor-2.6.0.tgz",
+ "integrity": "sha512-JpDCcQnyAAzZZaZ7vEiSqL690w7dAEyLao+KC96zBplnYbJS7TYNjvM3M7y3dGz+v7aIsJk3hllWuc0kWAjyRQ==",
+ "dependencies": {
+ "picocolors": "^1.0.0",
+ "shell-quote": "^1.7.3"
+ }
+ },
+ "node_modules/leven": {
+ "version": "3.1.0",
+ "resolved": "https://registry.npmjs.org/leven/-/leven-3.1.0.tgz",
+ "integrity": "sha512-qsda+H8jTaUaN/x5vzW2rzc+8Rw4TAQ/4KjB46IwK5VH+IlVeeeje/EoZRpiXvIqjFgK84QffqPztGI3VBLG1A==",
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/lilconfig": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmjs.org/lilconfig/-/lilconfig-2.1.0.tgz",
+ "integrity": "sha512-utWOt/GHzuUxnLKxB6dk81RoOeoNeHgbrXiuGk4yyF5qlRz+iIVWu56E2fqGHFrXz0QNUhLB/8nKqvRH66JKGQ==",
+ "engines": {
+ "node": ">=10"
+ }
+ },
+ "node_modules/lines-and-columns": {
+ "version": "1.2.4",
+ "resolved": "https://registry.npmjs.org/lines-and-columns/-/lines-and-columns-1.2.4.tgz",
+ "integrity": "sha512-7ylylesZQ/PV29jhEDl3Ufjo6ZX7gCqJr5F7PKrqc93v7fzSymt1BpwEU8nAUXs8qzzvqhbjhK5QZg6Mt/HkBg=="
+ },
+ "node_modules/loader-runner": {
+ "version": "4.3.0",
+ "resolved": "https://registry.npmjs.org/loader-runner/-/loader-runner-4.3.0.tgz",
+ "integrity": "sha512-3R/1M+yS3j5ou80Me59j7F9IMs4PXs3VqRrm0TU3AbKPxlmpoY1TNscJV/oGJXo8qCatFGTfDbY6W6ipGOYXfg==",
+ "engines": {
+ "node": ">=6.11.5"
+ }
+ },
+ "node_modules/loader-utils": {
+ "version": "2.0.4",
+ "resolved": "https://registry.npmjs.org/loader-utils/-/loader-utils-2.0.4.tgz",
+ "integrity": "sha512-xXqpXoINfFhgua9xiqD8fPFHgkoq1mmmpE92WlDbm9rNRd/EbRb+Gqf908T2DMfuHjjJlksiK2RbHVOdD/MqSw==",
+ "dependencies": {
+ "big.js": "^5.2.2",
+ "emojis-list": "^3.0.0",
+ "json5": "^2.1.2"
+ },
+ "engines": {
+ "node": ">=8.9.0"
+ }
+ },
+ "node_modules/locate-path": {
+ "version": "5.0.0",
+ "resolved": "https://registry.npmjs.org/locate-path/-/locate-path-5.0.0.tgz",
+ "integrity": "sha512-t7hw9pI+WvuwNJXwk5zVHpyhIqzg2qTlklJOf0mVxGSbe3Fp2VieZcduNYjaLDoy6p9uGpQEGWG87WpMKlNq8g==",
+ "dependencies": {
+ "p-locate": "^4.1.0"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/lodash": {
+ "version": "4.17.21",
+ "resolved": "https://registry.npmjs.org/lodash/-/lodash-4.17.21.tgz",
+ "integrity": "sha512-v2kDEe57lecTulaDIuNTPy3Ry4gLGJ6Z1O3vE1krgXZNrsQ+LFTGHVxVjcXPs17LhbZVGedAJv8XZ1tvj5FvSg=="
+ },
+ "node_modules/lodash.curry": {
+ "version": "4.1.1",
+ "resolved": "https://registry.npmjs.org/lodash.curry/-/lodash.curry-4.1.1.tgz",
+ "integrity": "sha512-/u14pXGviLaweY5JI0IUzgzF2J6Ne8INyzAZjImcryjgkZ+ebruBxy2/JaOOkTqScddcYtakjhSaeemV8lR0tA=="
+ },
+ "node_modules/lodash.debounce": {
+ "version": "4.0.8",
+ "resolved": "https://registry.npmjs.org/lodash.debounce/-/lodash.debounce-4.0.8.tgz",
+ "integrity": "sha512-FT1yDzDYEoYWhnSGnpE/4Kj1fLZkDFyqRb7fNt6FdYOSxlUWAtp42Eh6Wb0rGIv/m9Bgo7x4GhQbm5Ys4SG5ow=="
+ },
+ "node_modules/lodash.flow": {
+ "version": "3.5.0",
+ "resolved": "https://registry.npmjs.org/lodash.flow/-/lodash.flow-3.5.0.tgz",
+ "integrity": "sha512-ff3BX/tSioo+XojX4MOsOMhJw0nZoUEF011LX8g8d3gvjVbxd89cCio4BCXronjxcTUIJUoqKEUA+n4CqvvRPw=="
+ },
+ "node_modules/lodash.isequal": {
+ "version": "4.5.0",
+ "resolved": "https://registry.npmjs.org/lodash.isequal/-/lodash.isequal-4.5.0.tgz",
+ "integrity": "sha512-pDo3lu8Jhfjqls6GkMgpahsF9kCyayhgykjyLMNFTKWrpVdAQtYyB4muAMWozBB4ig/dtWAmsMxLEI8wuz+DYQ=="
+ },
+ "node_modules/lodash.memoize": {
+ "version": "4.1.2",
+ "resolved": "https://registry.npmjs.org/lodash.memoize/-/lodash.memoize-4.1.2.tgz",
+ "integrity": "sha512-t7j+NzmgnQzTAYXcsHYLgimltOV1MXHtlOWf6GjL9Kj8GK5FInw5JotxvbOs+IvV1/Dzo04/fCGfLVs7aXb4Ag=="
+ },
+ "node_modules/lodash.uniq": {
+ "version": "4.5.0",
+ "resolved": "https://registry.npmjs.org/lodash.uniq/-/lodash.uniq-4.5.0.tgz",
+ "integrity": "sha512-xfBaXQd9ryd9dlSDvnvI0lvxfLJlYAZzXomUYzLKtUeOQvOP5piqAWuGtrhWeqaXK9hhoM/iyJc5AV+XfsX3HQ=="
+ },
+ "node_modules/loose-envify": {
+ "version": "1.4.0",
+ "resolved": "https://registry.npmjs.org/loose-envify/-/loose-envify-1.4.0.tgz",
+ "integrity": "sha512-lyuxPGr/Wfhrlem2CL/UcnUc1zcqKAImBDzukY7Y5F/yQiNdko6+fRLevlw1HgMySw7f611UIY408EtxRSoK3Q==",
+ "dependencies": {
+ "js-tokens": "^3.0.0 || ^4.0.0"
+ },
+ "bin": {
+ "loose-envify": "cli.js"
+ }
+ },
+ "node_modules/lower-case": {
+ "version": "2.0.2",
+ "resolved": "https://registry.npmjs.org/lower-case/-/lower-case-2.0.2.tgz",
+ "integrity": "sha512-7fm3l3NAF9WfN6W3JOmf5drwpVqX78JtoGJ3A6W0a6ZnldM41w2fV5D490psKFTpMds8TJse/eHLFFsNHHjHgg==",
+ "dependencies": {
+ "tslib": "^2.0.3"
+ }
+ },
+ "node_modules/lowercase-keys": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmjs.org/lowercase-keys/-/lowercase-keys-1.0.1.tgz",
+ "integrity": "sha512-G2Lj61tXDnVFFOi8VZds+SoQjtQC3dgokKdDG2mTm1tx4m50NUHBOZSBwQQHyy0V12A0JTG4icfZQH+xPyh8VA==",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/lru-cache": {
+ "version": "5.1.1",
+ "resolved": "https://registry.npmjs.org/lru-cache/-/lru-cache-5.1.1.tgz",
+ "integrity": "sha512-KpNARQA3Iwv+jTA0utUVVbrh+Jlrr1Fv0e56GGzAFOXN7dk/FviaDW8LHmK52DlcH4WP2n6gI8vN1aesBFgo9w==",
+ "dependencies": {
+ "yallist": "^3.0.2"
+ }
+ },
+ "node_modules/lunr": {
+ "version": "2.3.9",
+ "resolved": "https://registry.npmjs.org/lunr/-/lunr-2.3.9.tgz",
+ "integrity": "sha512-zTU3DaZaF3Rt9rhN3uBMGQD3dD2/vFQqnvZCDv4dl5iOzq2IZQqTxu90r4E5J+nP70J3ilqVCrbho2eWaeW8Ow=="
+ },
+ "node_modules/lunr-languages": {
+ "version": "1.10.0",
+ "resolved": "https://registry.npmjs.org/lunr-languages/-/lunr-languages-1.10.0.tgz",
+ "integrity": "sha512-BBjKKcwrieJlzwwc9M5H/MRXGJ2qyOSDx/NXYiwkuKjiLOOoouh0WsDzeqcLoUWcX31y7i8sb8IgsZKObdUCkw=="
+ },
+ "node_modules/make-dir": {
+ "version": "3.1.0",
+ "resolved": "https://registry.npmjs.org/make-dir/-/make-dir-3.1.0.tgz",
+ "integrity": "sha512-g3FeP20LNwhALb/6Cz6Dd4F2ngze0jz7tbzrD2wAV+o9FeNHe4rL+yK2md0J/fiSf1sa1ADhXqi5+oVwOM/eGw==",
+ "dependencies": {
+ "semver": "^6.0.0"
+ },
+ "engines": {
+ "node": ">=8"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/make-dir/node_modules/semver": {
+ "version": "6.3.1",
+ "resolved": "https://registry.npmjs.org/semver/-/semver-6.3.1.tgz",
+ "integrity": "sha512-BR7VvDCVHO+q2xBEWskxS6DJE1qRnb7DxzUrogb71CWoSficBxYsiAGd+Kl0mmq/MprG9yArRkyrQxTO6XjMzA==",
+ "bin": {
+ "semver": "bin/semver.js"
+ }
+ },
+ "node_modules/mark.js": {
+ "version": "8.11.1",
+ "resolved": "https://registry.npmjs.org/mark.js/-/mark.js-8.11.1.tgz",
+ "integrity": "sha512-1I+1qpDt4idfgLQG+BNWmrqku+7/2bi5nLf4YwF8y8zXvmfiTBY3PV3ZibfrjBueCByROpuBjLLFCajqkgYoLQ=="
+ },
+ "node_modules/markdown-escapes": {
+ "version": "1.0.4",
+ "resolved": "https://registry.npmjs.org/markdown-escapes/-/markdown-escapes-1.0.4.tgz",
+ "integrity": "sha512-8z4efJYk43E0upd0NbVXwgSTQs6cT3T06etieCMEg7dRbzCbxUCK/GHlX8mhHRDcp+OLlHkPKsvqQTCvsRl2cg==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/marked": {
+ "version": "4.3.0",
+ "resolved": "https://registry.npmjs.org/marked/-/marked-4.3.0.tgz",
+ "integrity": "sha512-PRsaiG84bK+AMvxziE/lCFss8juXjNaWzVbN5tXAm4XjeaS9NAHhop+PjQxz2A9h8Q4M/xGmzP8vqNwy6JeK0A==",
+ "bin": {
+ "marked": "bin/marked.js"
+ },
+ "engines": {
+ "node": ">= 12"
+ }
+ },
+ "node_modules/mdast-squeeze-paragraphs": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/mdast-squeeze-paragraphs/-/mdast-squeeze-paragraphs-4.0.0.tgz",
+ "integrity": "sha512-zxdPn69hkQ1rm4J+2Cs2j6wDEv7O17TfXTJ33tl/+JPIoEmtV9t2ZzBM5LPHE8QlHsmVD8t3vPKCyY3oH+H8MQ==",
+ "dependencies": {
+ "unist-util-remove": "^2.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-definitions": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/mdast-util-definitions/-/mdast-util-definitions-4.0.0.tgz",
+ "integrity": "sha512-k8AJ6aNnUkB7IE+5azR9h81O5EQ/cTDXtWdMq9Kk5KcEW/8ritU5CeLg/9HhOC++nALHBlaogJ5jz0Ybk3kPMQ==",
+ "dependencies": {
+ "unist-util-visit": "^2.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-to-hast": {
+ "version": "10.0.1",
+ "resolved": "https://registry.npmjs.org/mdast-util-to-hast/-/mdast-util-to-hast-10.0.1.tgz",
+ "integrity": "sha512-BW3LM9SEMnjf4HXXVApZMt8gLQWVNXc3jryK0nJu/rOXPOnlkUjmdkDlmxMirpbU9ILncGFIwLH/ubnWBbcdgA==",
+ "dependencies": {
+ "@types/mdast": "^3.0.0",
+ "@types/unist": "^2.0.0",
+ "mdast-util-definitions": "^4.0.0",
+ "mdurl": "^1.0.0",
+ "unist-builder": "^2.0.0",
+ "unist-util-generated": "^1.0.0",
+ "unist-util-position": "^3.0.0",
+ "unist-util-visit": "^2.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdast-util-to-string": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/mdast-util-to-string/-/mdast-util-to-string-2.0.0.tgz",
+ "integrity": "sha512-AW4DRS3QbBayY/jJmD8437V1Gombjf8RSOUCMFBuo5iHi58AGEgVCKQ+ezHkZZDpAQS75hcBMpLqjpJTjtUL7w==",
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/mdn-data": {
+ "version": "2.0.14",
+ "resolved": "https://registry.npmjs.org/mdn-data/-/mdn-data-2.0.14.tgz",
+ "integrity": "sha512-dn6wd0uw5GsdswPFfsgMp5NSB0/aDe6fK94YJV/AJDYXL6HVLWBsxeq7js7Ad+mU2K9LAlwpk6kN2D5mwCPVow=="
+ },
+ "node_modules/mdurl": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmjs.org/mdurl/-/mdurl-1.0.1.tgz",
+ "integrity": "sha512-/sKlQJCBYVY9Ers9hqzKou4H6V5UWc/M59TH2dvkt+84itfnq7uFOMLpOiOS4ujvHP4etln18fmIxA5R5fll0g=="
+ },
+ "node_modules/media-typer": {
+ "version": "0.3.0",
+ "resolved": "https://registry.npmjs.org/media-typer/-/media-typer-0.3.0.tgz",
+ "integrity": "sha512-dq+qelQ9akHpcOl/gUVRTxVIOkAJ1wR3QAvb4RsVjS8oVoFjDGTc679wJYmUmknUF5HwMLOgb5O+a3KxfWapPQ==",
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/memfs": {
+ "version": "3.4.13",
+ "resolved": "https://registry.npmjs.org/memfs/-/memfs-3.4.13.tgz",
+ "integrity": "sha512-omTM41g3Skpvx5dSYeZIbXKcXoAVc/AoMNwn9TKx++L/gaen/+4TTttmu8ZSch5vfVJ8uJvGbroTsIlslRg6lg==",
+ "dependencies": {
+ "fs-monkey": "^1.0.3"
+ },
+ "engines": {
+ "node": ">= 4.0.0"
+ }
+ },
+ "node_modules/merge-descriptors": {
+ "version": "1.0.3",
+ "resolved": "https://registry.npmjs.org/merge-descriptors/-/merge-descriptors-1.0.3.tgz",
+ "integrity": "sha512-gaNvAS7TZ897/rVaZ0nMtAyxNyi/pdbjbAwUpFQpN70GqnVfOiXpeUUMKRBmzXaSQ8DdTX4/0ms62r2K+hE6mQ==",
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/merge-stream": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/merge-stream/-/merge-stream-2.0.0.tgz",
+ "integrity": "sha512-abv/qOcuPfk3URPfDzmZU1LKmuw8kT+0nIHvKrKgFrwifol/doWcdA4ZqsWQ8ENrFKkd67Mfpo/LovbIUsbt3w=="
+ },
+ "node_modules/merge2": {
+ "version": "1.4.1",
+ "resolved": "https://registry.npmjs.org/merge2/-/merge2-1.4.1.tgz",
+ "integrity": "sha512-8q7VEgMJW4J8tcfVPy8g09NcQwZdbwFEqhe/WZkoIzjn/3TGDwtOCYtXGxA3O8tPzpczCCDgv+P2P5y00ZJOOg==",
+ "engines": {
+ "node": ">= 8"
+ }
+ },
+ "node_modules/methods": {
+ "version": "1.1.2",
+ "resolved": "https://registry.npmjs.org/methods/-/methods-1.1.2.tgz",
+ "integrity": "sha512-iclAHeNqNm68zFtnZ0e+1L2yUIdvzNoauKU4WBA3VvH/vPFieF7qfRlwUZU+DA9P9bPXIS90ulxoUoCH23sV2w==",
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/micromatch": {
+ "version": "4.0.8",
+ "resolved": "https://registry.npmjs.org/micromatch/-/micromatch-4.0.8.tgz",
+ "integrity": "sha512-PXwfBhYu0hBCPw8Dn0E+WDYb7af3dSLVWKi3HGv84IdF4TyFoC0ysxFd0Goxw7nSv4T/PzEJQxsYsEiFCKo2BA==",
+ "dependencies": {
+ "braces": "^3.0.3",
+ "picomatch": "^2.3.1"
+ },
+ "engines": {
+ "node": ">=8.6"
+ }
+ },
+ "node_modules/mime": {
+ "version": "1.6.0",
+ "resolved": "https://registry.npmjs.org/mime/-/mime-1.6.0.tgz",
+ "integrity": "sha512-x0Vn8spI+wuJ1O6S7gnbaQg8Pxh4NNHb7KSINmEWKiPE4RKOplvijn+NkmYmmRgP68mc70j2EbeTFRsrswaQeg==",
+ "bin": {
+ "mime": "cli.js"
+ },
+ "engines": {
+ "node": ">=4"
+ }
+ },
+ "node_modules/mime-db": {
+ "version": "1.33.0",
+ "resolved": "https://registry.npmjs.org/mime-db/-/mime-db-1.33.0.tgz",
+ "integrity": "sha512-BHJ/EKruNIqJf/QahvxwQZXKygOQ256myeN/Ew+THcAa5q+PjyTTMMeNQC4DZw5AwfvelsUrA6B67NKMqXDbzQ==",
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/mime-types": {
+ "version": "2.1.18",
+ "resolved": "https://registry.npmjs.org/mime-types/-/mime-types-2.1.18.tgz",
+ "integrity": "sha512-lc/aahn+t4/SWV/qcmumYjymLsWfN3ELhpmVuUFjgsORruuZPVSwAQryq+HHGvO/SI2KVX26bx+En+zhM8g8hQ==",
+ "dependencies": {
+ "mime-db": "~1.33.0"
+ },
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/mimic-fn": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmjs.org/mimic-fn/-/mimic-fn-2.1.0.tgz",
+ "integrity": "sha512-OqbOk5oEQeAZ8WXWydlu9HJjz9WVdEIvamMCcXmuqUYjTknH/sqsWvhQ3vgwKFRR1HpjvNBKQ37nbJgYzGqGcg==",
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/mimic-response": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmjs.org/mimic-response/-/mimic-response-1.0.1.tgz",
+ "integrity": "sha512-j5EctnkH7amfV/q5Hgmoal1g2QHFJRraOtmx0JpIqkxhBhI/lJSl1nMpQ45hVarwNETOoWEimndZ4QK0RHxuxQ==",
+ "engines": {
+ "node": ">=4"
+ }
+ },
+ "node_modules/mini-css-extract-plugin": {
+ "version": "2.7.5",
+ "resolved": "https://registry.npmjs.org/mini-css-extract-plugin/-/mini-css-extract-plugin-2.7.5.tgz",
+ "integrity": "sha512-9HaR++0mlgom81s95vvNjxkg52n2b5s//3ZTI1EtzFb98awsLSivs2LMsVqnQ3ay0PVhqWcGNyDaTE961FOcjQ==",
+ "dependencies": {
+ "schema-utils": "^4.0.0"
+ },
+ "engines": {
+ "node": ">= 12.13.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/webpack"
+ },
+ "peerDependencies": {
+ "webpack": "^5.0.0"
+ }
+ },
+ "node_modules/mini-css-extract-plugin/node_modules/ajv": {
+ "version": "8.12.0",
+ "resolved": "https://registry.npmjs.org/ajv/-/ajv-8.12.0.tgz",
+ "integrity": "sha512-sRu1kpcO9yLtYxBKvqfTeh9KzZEwO3STyX1HT+4CaDzC6HpTGYhIhPIzj9XuKU7KYDwnaeh5hcOwjy1QuJzBPA==",
+ "dependencies": {
+ "fast-deep-equal": "^3.1.1",
+ "json-schema-traverse": "^1.0.0",
+ "require-from-string": "^2.0.2",
+ "uri-js": "^4.2.2"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/epoberezkin"
+ }
+ },
+ "node_modules/mini-css-extract-plugin/node_modules/ajv-keywords": {
+ "version": "5.1.0",
+ "resolved": "https://registry.npmjs.org/ajv-keywords/-/ajv-keywords-5.1.0.tgz",
+ "integrity": "sha512-YCS/JNFAUyr5vAuhk1DWm1CBxRHW9LbJ2ozWeemrIqpbsqKjHVxYPyi5GC0rjZIT5JxJ3virVTS8wk4i/Z+krw==",
+ "dependencies": {
+ "fast-deep-equal": "^3.1.3"
+ },
+ "peerDependencies": {
+ "ajv": "^8.8.2"
+ }
+ },
+ "node_modules/mini-css-extract-plugin/node_modules/json-schema-traverse": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmjs.org/json-schema-traverse/-/json-schema-traverse-1.0.0.tgz",
+ "integrity": "sha512-NM8/P9n3XjXhIZn1lLhkFaACTOURQXjWhV4BA/RnOv8xvgqtqpAX9IO4mRQxSx1Rlo4tqzeqb0sOlruaOy3dug=="
+ },
+ "node_modules/mini-css-extract-plugin/node_modules/schema-utils": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/schema-utils/-/schema-utils-4.0.0.tgz",
+ "integrity": "sha512-1edyXKgh6XnJsJSQ8mKWXnN/BVaIbFMLpouRUrXgVq7WYne5kw3MW7UPhO44uRXQSIpTSXoJbmrR2X0w9kUTyg==",
+ "dependencies": {
+ "@types/json-schema": "^7.0.9",
+ "ajv": "^8.8.0",
+ "ajv-formats": "^2.1.1",
+ "ajv-keywords": "^5.0.0"
+ },
+ "engines": {
+ "node": ">= 12.13.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/webpack"
+ }
+ },
+ "node_modules/minimalistic-assert": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmjs.org/minimalistic-assert/-/minimalistic-assert-1.0.1.tgz",
+ "integrity": "sha512-UtJcAD4yEaGtjPezWuO9wC4nwUnVH/8/Im3yEHQP4b67cXlD/Qr9hdITCU1xDbSEXg2XKNaP8jsReV7vQd00/A=="
+ },
+ "node_modules/minimatch": {
+ "version": "3.1.2",
+ "resolved": "https://registry.npmjs.org/minimatch/-/minimatch-3.1.2.tgz",
+ "integrity": "sha512-J7p63hRiAjw1NDEww1W7i37+ByIrOWO5XQQAzZ3VOcL0PNybwpfmV/N05zFAzwQ9USyEcX6t3UO+K5aqBQOIHw==",
+ "dependencies": {
+ "brace-expansion": "^1.1.7"
+ },
+ "engines": {
+ "node": "*"
+ }
+ },
+ "node_modules/minimist": {
+ "version": "1.2.8",
+ "resolved": "https://registry.npmjs.org/minimist/-/minimist-1.2.8.tgz",
+ "integrity": "sha512-2yyAR8qBkN3YuheJanUpWC5U3bb5osDywNB8RzDVlDwDHbocAJveqqj1u8+SVD7jkWT4yvsHCpWqqWqAxb0zCA==",
+ "funding": {
+ "url": "https://github.com/sponsors/ljharb"
+ }
+ },
+ "node_modules/mkdirp": {
+ "version": "1.0.4",
+ "resolved": "https://registry.npmjs.org/mkdirp/-/mkdirp-1.0.4.tgz",
+ "integrity": "sha512-vVqVZQyf3WLx2Shd0qJ9xuvqgAyKPLAiqITEtqW0oIUjzo3PePDd6fW9iFz30ef7Ysp/oiWqbhszeGWW2T6Gzw==",
+ "bin": {
+ "mkdirp": "bin/cmd.js"
+ },
+ "engines": {
+ "node": ">=10"
+ }
+ },
+ "node_modules/mobx": {
+ "version": "6.9.0",
+ "resolved": "https://registry.npmjs.org/mobx/-/mobx-6.9.0.tgz",
+ "integrity": "sha512-HdKewQEREEJgsWnErClfbFoVebze6rGazxFLU/XUyrII8dORfVszN1V0BMRnQSzcgsNNtkX8DHj3nC6cdWE9YQ==",
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/mobx"
+ }
+ },
+ "node_modules/mobx-react": {
+ "version": "7.6.0",
+ "resolved": "https://registry.npmjs.org/mobx-react/-/mobx-react-7.6.0.tgz",
+ "integrity": "sha512-+HQUNuh7AoQ9ZnU6c4rvbiVVl+wEkb9WqYsVDzGLng+Dqj1XntHu79PvEWKtSMoMj67vFp/ZPXcElosuJO8ckA==",
+ "dependencies": {
+ "mobx-react-lite": "^3.4.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/mobx"
+ },
+ "peerDependencies": {
+ "mobx": "^6.1.0",
+ "react": "^16.8.0 || ^17 || ^18"
+ },
+ "peerDependenciesMeta": {
+ "react-dom": {
+ "optional": true
+ },
+ "react-native": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/mobx-react/node_modules/mobx-react-lite": {
+ "version": "3.4.3",
+ "resolved": "https://registry.npmjs.org/mobx-react-lite/-/mobx-react-lite-3.4.3.tgz",
+ "integrity": "sha512-NkJREyFTSUXR772Qaai51BnE1voWx56LOL80xG7qkZr6vo8vEaLF3sz1JNUVh+rxmUzxYaqOhfuxTfqUh0FXUg==",
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/mobx"
+ },
+ "peerDependencies": {
+ "mobx": "^6.1.0",
+ "react": "^16.8.0 || ^17 || ^18"
+ },
+ "peerDependenciesMeta": {
+ "react-dom": {
+ "optional": true
+ },
+ "react-native": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/mrmime": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmjs.org/mrmime/-/mrmime-1.0.1.tgz",
+ "integrity": "sha512-hzzEagAgDyoU1Q6yg5uI+AorQgdvMCur3FcKf7NhMKWsaYg+RnbTyHRa/9IlLF9rf455MOCtcqqrQQ83pPP7Uw==",
+ "engines": {
+ "node": ">=10"
+ }
+ },
+ "node_modules/ms": {
+ "version": "2.1.2",
+ "resolved": "https://registry.npmjs.org/ms/-/ms-2.1.2.tgz",
+ "integrity": "sha512-sGkPx+VjMtmA6MX27oA4FBFELFCZZ4S4XqeGOXCv68tT+jb3vk/RyaKWP0PTKyWtmLSM0b+adUTEvbs1PEaH2w=="
+ },
+ "node_modules/multicast-dns": {
+ "version": "7.2.5",
+ "resolved": "https://registry.npmjs.org/multicast-dns/-/multicast-dns-7.2.5.tgz",
+ "integrity": "sha512-2eznPJP8z2BFLX50tf0LuODrpINqP1RVIm/CObbTcBRITQgmC/TjcREF1NeTBzIcR5XO/ukWo+YHOjBbFwIupg==",
+ "dependencies": {
+ "dns-packet": "^5.2.2",
+ "thunky": "^1.0.2"
+ },
+ "bin": {
+ "multicast-dns": "cli.js"
+ }
+ },
+ "node_modules/nanoid": {
+ "version": "3.3.8",
+ "resolved": "https://registry.npmjs.org/nanoid/-/nanoid-3.3.8.tgz",
+ "integrity": "sha512-WNLf5Sd8oZxOm+TzppcYk8gVOgP+l58xNy58D0nbUnOxOWRWvlcCV4kUF7ltmI6PsrLl/BgKEyS4mqsGChFN0w==",
+ "funding": [
+ {
+ "type": "github",
+ "url": "https://github.com/sponsors/ai"
+ }
+ ],
+ "license": "MIT",
+ "bin": {
+ "nanoid": "bin/nanoid.cjs"
+ },
+ "engines": {
+ "node": "^10 || ^12 || ^13.7 || ^14 || >=15.0.1"
+ }
+ },
+ "node_modules/negotiator": {
+ "version": "0.6.3",
+ "resolved": "https://registry.npmjs.org/negotiator/-/negotiator-0.6.3.tgz",
+ "integrity": "sha512-+EUsqGPLsM+j/zdChZjsnX51g4XrHFOIXwfnCVPGlQk/k5giakcKsuxCObBRu6DSm9opw/O6slWbJdghQM4bBg==",
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/neo-async": {
+ "version": "2.6.2",
+ "resolved": "https://registry.npmjs.org/neo-async/-/neo-async-2.6.2.tgz",
+ "integrity": "sha512-Yd3UES5mWCSqR+qNT93S3UoYUkqAZ9lLg8a7g9rimsWmYGK8cVToA4/sF3RrshdyV3sAGMXVUmpMYOw+dLpOuw=="
+ },
+ "node_modules/no-case": {
+ "version": "3.0.4",
+ "resolved": "https://registry.npmjs.org/no-case/-/no-case-3.0.4.tgz",
+ "integrity": "sha512-fgAN3jGAh+RoxUGZHTSOLJIqUc2wmoBwGR4tbpNAKmmovFoWq0OdRkb0VkldReO2a2iBT/OEulG9XSUc10r3zg==",
+ "dependencies": {
+ "lower-case": "^2.0.2",
+ "tslib": "^2.0.3"
+ }
+ },
+ "node_modules/node-emoji": {
+ "version": "1.11.0",
+ "resolved": "https://registry.npmjs.org/node-emoji/-/node-emoji-1.11.0.tgz",
+ "integrity": "sha512-wo2DpQkQp7Sjm2A0cq+sN7EHKO6Sl0ctXeBdFZrL9T9+UywORbufTcTZxom8YqpLQt/FqNMUkOpkZrJVYSKD3A==",
+ "dependencies": {
+ "lodash": "^4.17.21"
+ }
+ },
+ "node_modules/node-fetch": {
+ "version": "2.6.7",
+ "resolved": "https://registry.npmjs.org/node-fetch/-/node-fetch-2.6.7.tgz",
+ "integrity": "sha512-ZjMPFEfVx5j+y2yF35Kzx5sF7kDzxuDj6ziH4FFbOp87zKDZNx8yExJIb05OGF4Nlt9IHFIMBkRl41VdvcNdbQ==",
+ "dependencies": {
+ "whatwg-url": "^5.0.0"
+ },
+ "engines": {
+ "node": "4.x || >=6.0.0"
+ },
+ "peerDependencies": {
+ "encoding": "^0.1.0"
+ },
+ "peerDependenciesMeta": {
+ "encoding": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/node-fetch-h2": {
+ "version": "2.3.0",
+ "resolved": "https://registry.npmjs.org/node-fetch-h2/-/node-fetch-h2-2.3.0.tgz",
+ "integrity": "sha512-ofRW94Ab0T4AOh5Fk8t0h8OBWrmjb0SSB20xh1H8YnPV9EJ+f5AMoYSUQ2zgJ4Iq2HAK0I2l5/Nequ8YzFS3Hg==",
+ "dependencies": {
+ "http2-client": "^1.2.5"
+ },
+ "engines": {
+ "node": "4.x || >=6.0.0"
+ }
+ },
+ "node_modules/node-forge": {
+ "version": "1.3.1",
+ "resolved": "https://registry.npmjs.org/node-forge/-/node-forge-1.3.1.tgz",
+ "integrity": "sha512-dPEtOeMvF9VMcYV/1Wb8CPoVAXtp6MKMlcbAt4ddqmGqUJ6fQZFXkNZNkNlfevtNkGtaSoXf/vNNNSvgrdXwtA==",
+ "engines": {
+ "node": ">= 6.13.0"
+ }
+ },
+ "node_modules/node-readfiles": {
+ "version": "0.2.0",
+ "resolved": "https://registry.npmjs.org/node-readfiles/-/node-readfiles-0.2.0.tgz",
+ "integrity": "sha512-SU00ZarexNlE4Rjdm83vglt5Y9yiQ+XI1XpflWlb7q7UTN1JUItm69xMeiQCTxtTfnzt+83T8Cx+vI2ED++VDA==",
+ "dependencies": {
+ "es6-promise": "^3.2.1"
+ }
+ },
+ "node_modules/node-releases": {
+ "version": "2.0.18",
+ "resolved": "https://registry.npmjs.org/node-releases/-/node-releases-2.0.18.tgz",
+ "integrity": "sha512-d9VeXT4SJ7ZeOqGX6R5EM022wpL+eWPooLI+5UpWn2jCT1aosUQEhQP214x33Wkwx3JQMvIm+tIoVOdodFS40g=="
+ },
+ "node_modules/noms": {
+ "version": "0.0.0",
+ "resolved": "https://registry.npmjs.org/noms/-/noms-0.0.0.tgz",
+ "integrity": "sha512-lNDU9VJaOPxUmXcLb+HQFeUgQQPtMI24Gt6hgfuMHRJgMRHMF/qZ4HJD3GDru4sSw9IQl2jPjAYnQrdIeLbwow==",
+ "dependencies": {
+ "inherits": "^2.0.1",
+ "readable-stream": "~1.0.31"
+ }
+ },
+ "node_modules/noms/node_modules/readable-stream": {
+ "version": "1.0.34",
+ "resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-1.0.34.tgz",
+ "integrity": "sha512-ok1qVCJuRkNmvebYikljxJA/UEsKwLl2nI1OmaqAu4/UE+h0wKCHok4XkL/gvi39OacXvw59RJUOFUkDib2rHg==",
+ "dependencies": {
+ "core-util-is": "~1.0.0",
+ "inherits": "~2.0.1",
+ "isarray": "0.0.1",
+ "string_decoder": "~0.10.x"
+ }
+ },
+ "node_modules/noms/node_modules/string_decoder": {
+ "version": "0.10.31",
+ "resolved": "https://registry.npmjs.org/string_decoder/-/string_decoder-0.10.31.tgz",
+ "integrity": "sha512-ev2QzSzWPYmy9GuqfIVildA4OdcGLeFZQrq5ys6RtiuF+RQQiZWr8TZNyAcuVXyQRYfEO+MsoB/1BuQVhOJuoQ=="
+ },
+ "node_modules/normalize-path": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/normalize-path/-/normalize-path-3.0.0.tgz",
+ "integrity": "sha512-6eZs5Ls3WtCisHWp9S2GUy8dqkpGi4BVSz3GaqiE6ezub0512ESztXUwUB6C6IKbQkY2Pnb/mD4WYojCRwcwLA==",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/normalize-range": {
+ "version": "0.1.2",
+ "resolved": "https://registry.npmjs.org/normalize-range/-/normalize-range-0.1.2.tgz",
+ "integrity": "sha512-bdok/XvKII3nUpklnV6P2hxtMNrCboOjAcyBuQnWEhO665FwrSNRxU+AqpsyvO6LgGYPspN+lu5CLtw4jPRKNA==",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/normalize-url": {
+ "version": "6.1.0",
+ "resolved": "https://registry.npmjs.org/normalize-url/-/normalize-url-6.1.0.tgz",
+ "integrity": "sha512-DlL+XwOy3NxAQ8xuC0okPgK46iuVNAK01YN7RueYBqqFeGsBjV9XmCAzAdgt+667bCl5kPh9EqKKDwnaPG1I7A==",
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/npm-run-path": {
+ "version": "4.0.1",
+ "resolved": "https://registry.npmjs.org/npm-run-path/-/npm-run-path-4.0.1.tgz",
+ "integrity": "sha512-S48WzZW777zhNIrn7gxOlISNAqi9ZC/uQFnRdbeIHhZhCA6UqpkOT8T1G7BvfdgP4Er8gF4sUbaS0i7QvIfCWw==",
+ "dependencies": {
+ "path-key": "^3.0.0"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/nprogress": {
+ "version": "0.2.0",
+ "resolved": "https://registry.npmjs.org/nprogress/-/nprogress-0.2.0.tgz",
+ "integrity": "sha512-I19aIingLgR1fmhftnbWWO3dXc0hSxqHQHQb3H8m+K3TnEn/iSeTZZOyvKXWqQESMwuUVnatlCnZdLBZZt2VSA=="
+ },
+ "node_modules/nth-check": {
+ "version": "2.1.1",
+ "resolved": "https://registry.npmjs.org/nth-check/-/nth-check-2.1.1.tgz",
+ "integrity": "sha512-lqjrjmaOoAnWfMmBPL+XNnynZh2+swxiX3WUE0s4yEHI6m+AwrK2UZOimIRl3X/4QctVqS8AiZjFqyOGrMXb/w==",
+ "dependencies": {
+ "boolbase": "^1.0.0"
+ },
+ "funding": {
+ "url": "https://github.com/fb55/nth-check?sponsor=1"
+ }
+ },
+ "node_modules/oas-kit-common": {
+ "version": "1.0.8",
+ "resolved": "https://registry.npmjs.org/oas-kit-common/-/oas-kit-common-1.0.8.tgz",
+ "integrity": "sha512-pJTS2+T0oGIwgjGpw7sIRU8RQMcUoKCDWFLdBqKB2BNmGpbBMH2sdqAaOXUg8OzonZHU0L7vfJu1mJFEiYDWOQ==",
+ "dependencies": {
+ "fast-safe-stringify": "^2.0.7"
+ }
+ },
+ "node_modules/oas-linter": {
+ "version": "3.2.2",
+ "resolved": "https://registry.npmjs.org/oas-linter/-/oas-linter-3.2.2.tgz",
+ "integrity": "sha512-KEGjPDVoU5K6swgo9hJVA/qYGlwfbFx+Kg2QB/kd7rzV5N8N5Mg6PlsoCMohVnQmo+pzJap/F610qTodKzecGQ==",
+ "dependencies": {
+ "@exodus/schemasafe": "^1.0.0-rc.2",
+ "should": "^13.2.1",
+ "yaml": "^1.10.0"
+ },
+ "funding": {
+ "url": "https://github.com/Mermade/oas-kit?sponsor=1"
+ }
+ },
+ "node_modules/oas-resolver": {
+ "version": "2.5.6",
+ "resolved": "https://registry.npmjs.org/oas-resolver/-/oas-resolver-2.5.6.tgz",
+ "integrity": "sha512-Yx5PWQNZomfEhPPOphFbZKi9W93CocQj18NlD2Pa4GWZzdZpSJvYwoiuurRI7m3SpcChrnO08hkuQDL3FGsVFQ==",
+ "dependencies": {
+ "node-fetch-h2": "^2.3.0",
+ "oas-kit-common": "^1.0.8",
+ "reftools": "^1.1.9",
+ "yaml": "^1.10.0",
+ "yargs": "^17.0.1"
+ },
+ "bin": {
+ "resolve": "resolve.js"
+ },
+ "funding": {
+ "url": "https://github.com/Mermade/oas-kit?sponsor=1"
+ }
+ },
+ "node_modules/oas-resolver/node_modules/cliui": {
+ "version": "8.0.1",
+ "resolved": "https://registry.npmjs.org/cliui/-/cliui-8.0.1.tgz",
+ "integrity": "sha512-BSeNnyus75C4//NQ9gQt1/csTXyo/8Sb+afLAkzAptFuMsod9HFokGNudZpi/oQV73hnVK+sR+5PVRMd+Dr7YQ==",
+ "dependencies": {
+ "string-width": "^4.2.0",
+ "strip-ansi": "^6.0.1",
+ "wrap-ansi": "^7.0.0"
+ },
+ "engines": {
+ "node": ">=12"
+ }
+ },
+ "node_modules/oas-resolver/node_modules/emoji-regex": {
+ "version": "8.0.0",
+ "resolved": "https://registry.npmjs.org/emoji-regex/-/emoji-regex-8.0.0.tgz",
+ "integrity": "sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A=="
+ },
+ "node_modules/oas-resolver/node_modules/string-width": {
+ "version": "4.2.3",
+ "resolved": "https://registry.npmjs.org/string-width/-/string-width-4.2.3.tgz",
+ "integrity": "sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==",
+ "dependencies": {
+ "emoji-regex": "^8.0.0",
+ "is-fullwidth-code-point": "^3.0.0",
+ "strip-ansi": "^6.0.1"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/oas-resolver/node_modules/wrap-ansi": {
+ "version": "7.0.0",
+ "resolved": "https://registry.npmjs.org/wrap-ansi/-/wrap-ansi-7.0.0.tgz",
+ "integrity": "sha512-YVGIj2kamLSTxw6NsZjoBxfSwsn0ycdesmc4p+Q21c5zPuZ1pl+NfxVdxPtdHvmNVOQ6XSYG4AUtyt/Fi7D16Q==",
+ "dependencies": {
+ "ansi-styles": "^4.0.0",
+ "string-width": "^4.1.0",
+ "strip-ansi": "^6.0.0"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/chalk/wrap-ansi?sponsor=1"
+ }
+ },
+ "node_modules/oas-resolver/node_modules/yargs": {
+ "version": "17.7.1",
+ "resolved": "https://registry.npmjs.org/yargs/-/yargs-17.7.1.tgz",
+ "integrity": "sha512-cwiTb08Xuv5fqF4AovYacTFNxk62th7LKJ6BL9IGUpTJrWoU7/7WdQGTP2SjKf1dUNBGzDd28p/Yfs/GI6JrLw==",
+ "dependencies": {
+ "cliui": "^8.0.1",
+ "escalade": "^3.1.1",
+ "get-caller-file": "^2.0.5",
+ "require-directory": "^2.1.1",
+ "string-width": "^4.2.3",
+ "y18n": "^5.0.5",
+ "yargs-parser": "^21.1.1"
+ },
+ "engines": {
+ "node": ">=12"
+ }
+ },
+ "node_modules/oas-resolver/node_modules/yargs-parser": {
+ "version": "21.1.1",
+ "resolved": "https://registry.npmjs.org/yargs-parser/-/yargs-parser-21.1.1.tgz",
+ "integrity": "sha512-tVpsJW7DdjecAiFpbIB1e3qxIQsE6NoPc5/eTdrbbIC4h0LVsWhnoa3g+m2HclBIujHzsxZ4VJVA+GUuc2/LBw==",
+ "engines": {
+ "node": ">=12"
+ }
+ },
+ "node_modules/oas-schema-walker": {
+ "version": "1.1.5",
+ "resolved": "https://registry.npmjs.org/oas-schema-walker/-/oas-schema-walker-1.1.5.tgz",
+ "integrity": "sha512-2yucenq1a9YPmeNExoUa9Qwrt9RFkjqaMAA1X+U7sbb0AqBeTIdMHky9SQQ6iN94bO5NW0W4TRYXerG+BdAvAQ==",
+ "funding": {
+ "url": "https://github.com/Mermade/oas-kit?sponsor=1"
+ }
+ },
+ "node_modules/oas-validator": {
+ "version": "5.0.8",
+ "resolved": "https://registry.npmjs.org/oas-validator/-/oas-validator-5.0.8.tgz",
+ "integrity": "sha512-cu20/HE5N5HKqVygs3dt94eYJfBi0TsZvPVXDhbXQHiEityDN+RROTleefoKRKKJ9dFAF2JBkDHgvWj0sjKGmw==",
+ "dependencies": {
+ "call-me-maybe": "^1.0.1",
+ "oas-kit-common": "^1.0.8",
+ "oas-linter": "^3.2.2",
+ "oas-resolver": "^2.5.6",
+ "oas-schema-walker": "^1.1.5",
+ "reftools": "^1.1.9",
+ "should": "^13.2.1",
+ "yaml": "^1.10.0"
+ },
+ "funding": {
+ "url": "https://github.com/Mermade/oas-kit?sponsor=1"
+ }
+ },
+ "node_modules/object-assign": {
+ "version": "4.1.1",
+ "resolved": "https://registry.npmjs.org/object-assign/-/object-assign-4.1.1.tgz",
+ "integrity": "sha512-rJgTQnkUnH1sFw8yT6VSU3zD3sWmu6sZhIseY8VX+GRu3P6F7Fu+JNDoXfklElbLJSnc3FUQHVe4cU5hj+BcUg==",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/object-inspect": {
+ "version": "1.13.2",
+ "resolved": "https://registry.npmjs.org/object-inspect/-/object-inspect-1.13.2.tgz",
+ "integrity": "sha512-IRZSRuzJiynemAXPYtPe5BoI/RESNYR7TYm50MC5Mqbd3Jmw5y790sErYw3V6SryFJD64b74qQQs9wn5Bg/k3g==",
+ "engines": {
+ "node": ">= 0.4"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/ljharb"
+ }
+ },
+ "node_modules/object-keys": {
+ "version": "1.1.1",
+ "resolved": "https://registry.npmjs.org/object-keys/-/object-keys-1.1.1.tgz",
+ "integrity": "sha512-NuAESUOUMrlIXOfHKzD6bpPu3tYt3xvjNdRIQ+FeT0lNb4K8WR70CaDxhuNguS2XG+GjkyMwOzsN5ZktImfhLA==",
+ "engines": {
+ "node": ">= 0.4"
+ }
+ },
+ "node_modules/object.assign": {
+ "version": "4.1.4",
+ "resolved": "https://registry.npmjs.org/object.assign/-/object.assign-4.1.4.tgz",
+ "integrity": "sha512-1mxKf0e58bvyjSCtKYY4sRe9itRk3PJpquJOjeIkz885CczcI4IvJJDLPS72oowuSh+pBxUFROpX+TU++hxhZQ==",
+ "dependencies": {
+ "call-bind": "^1.0.2",
+ "define-properties": "^1.1.4",
+ "has-symbols": "^1.0.3",
+ "object-keys": "^1.1.1"
+ },
+ "engines": {
+ "node": ">= 0.4"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/ljharb"
+ }
+ },
+ "node_modules/obuf": {
+ "version": "1.1.2",
+ "resolved": "https://registry.npmjs.org/obuf/-/obuf-1.1.2.tgz",
+ "integrity": "sha512-PX1wu0AmAdPqOL1mWhqmlOd8kOIZQwGZw6rh7uby9fTc5lhaOWFLX3I6R1hrF9k3zUY40e6igsLGkDXK92LJNg=="
+ },
+ "node_modules/on-finished": {
+ "version": "2.4.1",
+ "resolved": "https://registry.npmjs.org/on-finished/-/on-finished-2.4.1.tgz",
+ "integrity": "sha512-oVlzkg3ENAhCk2zdv7IJwd/QUD4z2RxRwpkcGY8psCVcCYZNq4wYnVWALHM+brtuJjePWiYF/ClmuDr8Ch5+kg==",
+ "dependencies": {
+ "ee-first": "1.1.1"
+ },
+ "engines": {
+ "node": ">= 0.8"
+ }
+ },
+ "node_modules/on-headers": {
+ "version": "1.1.0",
+ "resolved": "https://registry.npmjs.org/on-headers/-/on-headers-1.1.0.tgz",
+ "integrity": "sha512-737ZY3yNnXy37FHkQxPzt4UZ2UWPWiCZWLvFZ4fu5cueciegX0zGPnrlY6bwRg4FdQOe9YU8MkmJwGhoMybl8A==",
+ "license": "MIT",
+ "engines": {
+ "node": ">= 0.8"
+ }
+ },
+ "node_modules/once": {
+ "version": "1.4.0",
+ "resolved": "https://registry.npmjs.org/once/-/once-1.4.0.tgz",
+ "integrity": "sha512-lNaJgI+2Q5URQBkccEKHTQOPaXdUxnZZElQTZY0MFUAuaEqe1E+Nyvgdz/aIyNi6Z9MzO5dv1H8n58/GELp3+w==",
+ "dependencies": {
+ "wrappy": "1"
+ }
+ },
+ "node_modules/onetime": {
+ "version": "5.1.2",
+ "resolved": "https://registry.npmjs.org/onetime/-/onetime-5.1.2.tgz",
+ "integrity": "sha512-kbpaSSGJTWdAY5KPVeMOKXSrPtr8C8C7wodJbcsd51jRnmD+GZu8Y0VoU6Dm5Z4vWr0Ig/1NKuWRKf7j5aaYSg==",
+ "dependencies": {
+ "mimic-fn": "^2.1.0"
+ },
+ "engines": {
+ "node": ">=6"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/open": {
+ "version": "8.4.2",
+ "resolved": "https://registry.npmjs.org/open/-/open-8.4.2.tgz",
+ "integrity": "sha512-7x81NCL719oNbsq/3mh+hVrAWmFuEYUqrq/Iw3kUzH8ReypT9QQ0BLoJS7/G9k6N81XjW4qHWtjWwe/9eLy1EQ==",
+ "dependencies": {
+ "define-lazy-prop": "^2.0.0",
+ "is-docker": "^2.1.1",
+ "is-wsl": "^2.2.0"
+ },
+ "engines": {
+ "node": ">=12"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/openapi-sampler": {
+ "version": "1.3.1",
+ "resolved": "https://registry.npmjs.org/openapi-sampler/-/openapi-sampler-1.3.1.tgz",
+ "integrity": "sha512-Ert9mvc2tLPmmInwSyGZS+v4Ogu9/YoZuq9oP3EdUklg2cad6+IGndP9yqJJwbgdXwZibiq5fpv6vYujchdJFg==",
+ "dependencies": {
+ "@types/json-schema": "^7.0.7",
+ "json-pointer": "0.6.2"
+ }
+ },
+ "node_modules/opener": {
+ "version": "1.5.2",
+ "resolved": "https://registry.npmjs.org/opener/-/opener-1.5.2.tgz",
+ "integrity": "sha512-ur5UIdyw5Y7yEj9wLzhqXiy6GZ3Mwx0yGI+5sMn2r0N0v3cKJvUmFH5yPP+WXh9e0xfyzyJX95D8l088DNFj7A==",
+ "bin": {
+ "opener": "bin/opener-bin.js"
+ }
+ },
+ "node_modules/p-cancelable": {
+ "version": "1.1.0",
+ "resolved": "https://registry.npmjs.org/p-cancelable/-/p-cancelable-1.1.0.tgz",
+ "integrity": "sha512-s73XxOZ4zpt1edZYZzvhqFa6uvQc1vwUa0K0BdtIZgQMAJj9IbebH+JkgKZc9h+B05PKHLOTl4ajG1BmNrVZlw==",
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/p-limit": {
+ "version": "2.3.0",
+ "resolved": "https://registry.npmjs.org/p-limit/-/p-limit-2.3.0.tgz",
+ "integrity": "sha512-//88mFWSJx8lxCzwdAABTJL2MyWB12+eIY7MDL2SqLmAkeKU9qxRvWuSyTjm3FUmpBEMuFfckAIqEaVGUDxb6w==",
+ "dependencies": {
+ "p-try": "^2.0.0"
+ },
+ "engines": {
+ "node": ">=6"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/p-locate": {
+ "version": "4.1.0",
+ "resolved": "https://registry.npmjs.org/p-locate/-/p-locate-4.1.0.tgz",
+ "integrity": "sha512-R79ZZ/0wAxKGu3oYMlz8jy/kbhsNrS7SKZ7PxEHBgJ5+F2mtFW2fK2cOtBh1cHYkQsbzFV7I+EoRKe6Yt0oK7A==",
+ "dependencies": {
+ "p-limit": "^2.2.0"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/p-map": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/p-map/-/p-map-4.0.0.tgz",
+ "integrity": "sha512-/bjOqmgETBYB5BoEeGVea8dmvHb2m9GLy1E9W43yeyfP6QQCZGFNa+XRceJEuDB6zqr+gKpIAmlLebMpykw/MQ==",
+ "dependencies": {
+ "aggregate-error": "^3.0.0"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/p-retry": {
+ "version": "4.6.2",
+ "resolved": "https://registry.npmjs.org/p-retry/-/p-retry-4.6.2.tgz",
+ "integrity": "sha512-312Id396EbJdvRONlngUx0NydfrIQ5lsYu0znKVUzVvArzEIt08V1qhtyESbGVd1FGX7UKtiFp5uwKZdM8wIuQ==",
+ "dependencies": {
+ "@types/retry": "0.12.0",
+ "retry": "^0.13.1"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/p-try": {
+ "version": "2.2.0",
+ "resolved": "https://registry.npmjs.org/p-try/-/p-try-2.2.0.tgz",
+ "integrity": "sha512-R4nPAVTAU0B9D35/Gk3uJf/7XYbQcyohSKdvAxIRSNghFl4e71hVoGnBNQz9cWaXxO2I10KTC+3jMdvvoKw6dQ==",
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/package-json": {
+ "version": "6.5.0",
+ "resolved": "https://registry.npmjs.org/package-json/-/package-json-6.5.0.tgz",
+ "integrity": "sha512-k3bdm2n25tkyxcjSKzB5x8kfVxlMdgsbPr0GkZcwHsLpba6cBjqCt1KlcChKEvxHIcTB1FVMuwoijZ26xex5MQ==",
+ "dependencies": {
+ "got": "^9.6.0",
+ "registry-auth-token": "^4.0.0",
+ "registry-url": "^5.0.0",
+ "semver": "^6.2.0"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/package-json/node_modules/semver": {
+ "version": "6.3.1",
+ "resolved": "https://registry.npmjs.org/semver/-/semver-6.3.1.tgz",
+ "integrity": "sha512-BR7VvDCVHO+q2xBEWskxS6DJE1qRnb7DxzUrogb71CWoSficBxYsiAGd+Kl0mmq/MprG9yArRkyrQxTO6XjMzA==",
+ "bin": {
+ "semver": "bin/semver.js"
+ }
+ },
+ "node_modules/param-case": {
+ "version": "3.0.4",
+ "resolved": "https://registry.npmjs.org/param-case/-/param-case-3.0.4.tgz",
+ "integrity": "sha512-RXlj7zCYokReqWpOPH9oYivUzLYZ5vAPIfEmCTNViosC78F8F0H9y7T7gG2M39ymgutxF5gcFEsyZQSph9Bp3A==",
+ "dependencies": {
+ "dot-case": "^3.0.4",
+ "tslib": "^2.0.3"
+ }
+ },
+ "node_modules/parent-module": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmjs.org/parent-module/-/parent-module-1.0.1.tgz",
+ "integrity": "sha512-GQ2EWRpQV8/o+Aw8YqtfZZPfNRWZYkbidE9k5rpl/hC3vtHHBfGm2Ifi6qWV+coDGkrUKZAxE3Lot5kcsRlh+g==",
+ "dependencies": {
+ "callsites": "^3.0.0"
+ },
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/parse-entities": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/parse-entities/-/parse-entities-2.0.0.tgz",
+ "integrity": "sha512-kkywGpCcRYhqQIchaWqZ875wzpS/bMKhz5HnN3p7wveJTkTtyAB/AlnS0f8DFSqYW1T82t6yEAkEcB+A1I3MbQ==",
+ "dependencies": {
+ "character-entities": "^1.0.0",
+ "character-entities-legacy": "^1.0.0",
+ "character-reference-invalid": "^1.0.0",
+ "is-alphanumerical": "^1.0.0",
+ "is-decimal": "^1.0.0",
+ "is-hexadecimal": "^1.0.0"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/parse-json": {
+ "version": "5.2.0",
+ "resolved": "https://registry.npmjs.org/parse-json/-/parse-json-5.2.0.tgz",
+ "integrity": "sha512-ayCKvm/phCGxOkYRSCM82iDwct8/EonSEgCSxWxD7ve6jHggsFl4fZVQBPRNgQoKiuV/odhFrGzQXZwbifC8Rg==",
+ "dependencies": {
+ "@babel/code-frame": "^7.0.0",
+ "error-ex": "^1.3.1",
+ "json-parse-even-better-errors": "^2.3.0",
+ "lines-and-columns": "^1.1.6"
+ },
+ "engines": {
+ "node": ">=8"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/parse-numeric-range": {
+ "version": "1.3.0",
+ "resolved": "https://registry.npmjs.org/parse-numeric-range/-/parse-numeric-range-1.3.0.tgz",
+ "integrity": "sha512-twN+njEipszzlMJd4ONUYgSfZPDxgHhT9Ahed5uTigpQn90FggW4SA/AIPq/6a149fTbE9qBEcSwE3FAEp6wQQ=="
+ },
+ "node_modules/parse5": {
+ "version": "7.1.2",
+ "resolved": "https://registry.npmjs.org/parse5/-/parse5-7.1.2.tgz",
+ "integrity": "sha512-Czj1WaSVpaoj0wbhMzLmWD69anp2WH7FXMB9n1Sy8/ZFF9jolSQVMu1Ij5WIyGmcBmhk7EOndpO4mIpihVqAXw==",
+ "dependencies": {
+ "entities": "^4.4.0"
+ },
+ "funding": {
+ "url": "https://github.com/inikulin/parse5?sponsor=1"
+ }
+ },
+ "node_modules/parse5-htmlparser2-tree-adapter": {
+ "version": "7.0.0",
+ "resolved": "https://registry.npmjs.org/parse5-htmlparser2-tree-adapter/-/parse5-htmlparser2-tree-adapter-7.0.0.tgz",
+ "integrity": "sha512-B77tOZrqqfUfnVcOrUvfdLbz4pu4RopLD/4vmu3HUPswwTA8OH0EMW9BlWR2B0RCoiZRAHEUu7IxeP1Pd1UU+g==",
+ "dependencies": {
+ "domhandler": "^5.0.2",
+ "parse5": "^7.0.0"
+ },
+ "funding": {
+ "url": "https://github.com/inikulin/parse5?sponsor=1"
+ }
+ },
+ "node_modules/parseurl": {
+ "version": "1.3.3",
+ "resolved": "https://registry.npmjs.org/parseurl/-/parseurl-1.3.3.tgz",
+ "integrity": "sha512-CiyeOxFT/JZyN5m0z9PfXw4SCBJ6Sygz1Dpl0wqjlhDEGGBP1GnsUVEL0p63hoG1fcj3fHynXi9NYO4nWOL+qQ==",
+ "engines": {
+ "node": ">= 0.8"
+ }
+ },
+ "node_modules/pascal-case": {
+ "version": "3.1.2",
+ "resolved": "https://registry.npmjs.org/pascal-case/-/pascal-case-3.1.2.tgz",
+ "integrity": "sha512-uWlGT3YSnK9x3BQJaOdcZwrnV6hPpd8jFH1/ucpiLRPh/2zCVJKS19E4GvYHvaCcACn3foXZ0cLB9Wrx1KGe5g==",
+ "dependencies": {
+ "no-case": "^3.0.4",
+ "tslib": "^2.0.3"
+ }
+ },
+ "node_modules/path-browserify": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmjs.org/path-browserify/-/path-browserify-1.0.1.tgz",
+ "integrity": "sha512-b7uo2UCUOYZcnF/3ID0lulOJi/bafxa1xPe7ZPsammBSpjSWQkjNxlt635YGS2MiR9GjvuXCtz2emr3jbsz98g=="
+ },
+ "node_modules/path-exists": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/path-exists/-/path-exists-4.0.0.tgz",
+ "integrity": "sha512-ak9Qy5Q7jYb2Wwcey5Fpvg2KoAc/ZIhLSLOSBmRmygPsGwkVVt0fZa0qrtMz+m6tJTAHfZQ8FnmB4MG4LWy7/w==",
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/path-is-absolute": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmjs.org/path-is-absolute/-/path-is-absolute-1.0.1.tgz",
+ "integrity": "sha512-AVbw3UJ2e9bq64vSaS9Am0fje1Pa8pbGqTTsmXfaIiMpnr5DlDhfJOuLj9Sf95ZPVDAUerDfEk88MPmPe7UCQg==",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/path-is-inside": {
+ "version": "1.0.2",
+ "resolved": "https://registry.npmjs.org/path-is-inside/-/path-is-inside-1.0.2.tgz",
+ "integrity": "sha512-DUWJr3+ULp4zXmol/SZkFf3JGsS9/SIv+Y3Rt93/UjPpDpklB5f1er4O3POIbUuUJ3FXgqte2Q7SrU6zAqwk8w=="
+ },
+ "node_modules/path-key": {
+ "version": "3.1.1",
+ "resolved": "https://registry.npmjs.org/path-key/-/path-key-3.1.1.tgz",
+ "integrity": "sha512-ojmeN0qd+y0jszEtoY48r0Peq5dwMEkIlCOu6Q5f41lfkswXuKtYrhgoTpLnyIcHm24Uhqx+5Tqm2InSwLhE6Q==",
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/path-parse": {
+ "version": "1.0.7",
+ "resolved": "https://registry.npmjs.org/path-parse/-/path-parse-1.0.7.tgz",
+ "integrity": "sha512-LDJzPVEEEPR+y48z93A0Ed0yXb8pAByGWo/k5YYdYgpY2/2EsOsksJrq7lOHxryrVOn1ejG6oAp8ahvOIQD8sw=="
+ },
+ "node_modules/path-to-regexp": {
+ "version": "1.8.0",
+ "resolved": "https://registry.npmjs.org/path-to-regexp/-/path-to-regexp-1.8.0.tgz",
+ "integrity": "sha512-n43JRhlUKUAlibEJhPeir1ncUID16QnEjNpwzNdO3Lm4ywrBpBZ5oLD0I6br9evr1Y9JTqwRtAh7JLoOzAQdVA==",
+ "dependencies": {
+ "isarray": "0.0.1"
+ }
+ },
+ "node_modules/path-type": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/path-type/-/path-type-4.0.0.tgz",
+ "integrity": "sha512-gDKb8aZMDeD/tZWs9P6+q0J9Mwkdl6xMV8TjnGP3qJVJ06bdMgkbBlLU8IdfOsIsFz2BW1rNVT3XuNEl8zPAvw==",
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/perfect-scrollbar": {
+ "version": "1.5.5",
+ "resolved": "https://registry.npmjs.org/perfect-scrollbar/-/perfect-scrollbar-1.5.5.tgz",
+ "integrity": "sha512-dzalfutyP3e/FOpdlhVryN4AJ5XDVauVWxybSkLZmakFE2sS3y3pc4JnSprw8tGmHvkaG5Edr5T7LBTZ+WWU2g=="
+ },
+ "node_modules/picocolors": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmjs.org/picocolors/-/picocolors-1.0.1.tgz",
+ "integrity": "sha512-anP1Z8qwhkbmu7MFP5iTt+wQKXgwzf7zTyGlcdzabySa9vd0Xt392U0rVmz9poOaBj0uHJKyyo9/upk0HrEQew=="
+ },
+ "node_modules/picomatch": {
+ "version": "2.3.1",
+ "resolved": "https://registry.npmjs.org/picomatch/-/picomatch-2.3.1.tgz",
+ "integrity": "sha512-JU3teHTNjmE2VCGFzuY8EXzCDVwEqB2a8fsIvwaStHhAWJEeVd1o1QD80CU6+ZdEXXSLbSsuLwJjkCBWqRQUVA==",
+ "engines": {
+ "node": ">=8.6"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/jonschlinkert"
+ }
+ },
+ "node_modules/pkg-dir": {
+ "version": "4.2.0",
+ "resolved": "https://registry.npmjs.org/pkg-dir/-/pkg-dir-4.2.0.tgz",
+ "integrity": "sha512-HRDzbaKjC+AOWVXxAU/x54COGeIv9eb+6CkDSQoNTt4XyWoIJvuPsXizxu/Fr23EiekbtZwmh1IcIG/l/a10GQ==",
+ "dependencies": {
+ "find-up": "^4.0.0"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/pkg-up": {
+ "version": "3.1.0",
+ "resolved": "https://registry.npmjs.org/pkg-up/-/pkg-up-3.1.0.tgz",
+ "integrity": "sha512-nDywThFk1i4BQK4twPQ6TA4RT8bDY96yeuCVBWL3ePARCiEKDRSrNGbFIgUJpLp+XeIR65v8ra7WuJOFUBtkMA==",
+ "dependencies": {
+ "find-up": "^3.0.0"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/pkg-up/node_modules/find-up": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/find-up/-/find-up-3.0.0.tgz",
+ "integrity": "sha512-1yD6RmLI1XBfxugvORwlck6f75tYL+iR0jqwsOrOxMZyGYqUuDhJ0l4AXdO1iX/FTs9cBAMEk1gWSEx1kSbylg==",
+ "dependencies": {
+ "locate-path": "^3.0.0"
+ },
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/pkg-up/node_modules/locate-path": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/locate-path/-/locate-path-3.0.0.tgz",
+ "integrity": "sha512-7AO748wWnIhNqAuaty2ZWHkQHRSNfPVIsPIfwEOWO22AmaoVrWavlOcMR5nzTLNYvp36X220/maaRsrec1G65A==",
+ "dependencies": {
+ "p-locate": "^3.0.0",
+ "path-exists": "^3.0.0"
+ },
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/pkg-up/node_modules/p-locate": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/p-locate/-/p-locate-3.0.0.tgz",
+ "integrity": "sha512-x+12w/To+4GFfgJhBEpiDcLozRJGegY+Ei7/z0tSLkMmxGZNybVMSfWj9aJn8Z5Fc7dBUNJOOVgPv2H7IwulSQ==",
+ "dependencies": {
+ "p-limit": "^2.0.0"
+ },
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/pkg-up/node_modules/path-exists": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/path-exists/-/path-exists-3.0.0.tgz",
+ "integrity": "sha512-bpC7GYwiDYQ4wYLe+FA8lhRjhQCMcQGuSgGGqDkg/QerRWw9CmGRT0iSOVRSZJ29NMLZgIzqaljJ63oaL4NIJQ==",
+ "engines": {
+ "node": ">=4"
+ }
+ },
+ "node_modules/pluralize": {
+ "version": "8.0.0",
+ "resolved": "https://registry.npmjs.org/pluralize/-/pluralize-8.0.0.tgz",
+ "integrity": "sha512-Nc3IT5yHzflTfbjgqWcCPpo7DaKy4FnpB0l/zCAW0Tc7jxAiuqSxHasntB3D7887LSrA93kDJ9IXovxJYxyLCA==",
+ "engines": {
+ "node": ">=4"
+ }
+ },
+ "node_modules/polished": {
+ "version": "4.2.2",
+ "resolved": "https://registry.npmjs.org/polished/-/polished-4.2.2.tgz",
+ "integrity": "sha512-Sz2Lkdxz6F2Pgnpi9U5Ng/WdWAUZxmHrNPoVlm3aAemxoy2Qy7LGjQg4uf8qKelDAUW94F4np3iH2YPf2qefcQ==",
+ "dependencies": {
+ "@babel/runtime": "^7.17.8"
+ },
+ "engines": {
+ "node": ">=10"
+ }
+ },
+ "node_modules/postcss": {
+ "version": "8.4.31",
+ "resolved": "https://registry.npmjs.org/postcss/-/postcss-8.4.31.tgz",
+ "integrity": "sha512-PS08Iboia9mts/2ygV3eLpY5ghnUcfLV/EXTOW1E2qYxJKGGBUtNjN76FYHnMs36RmARn41bC0AZmn+rR0OVpQ==",
+ "funding": [
+ {
+ "type": "opencollective",
+ "url": "https://opencollective.com/postcss/"
+ },
+ {
+ "type": "tidelift",
+ "url": "https://tidelift.com/funding/github/npm/postcss"
+ },
+ {
+ "type": "github",
+ "url": "https://github.com/sponsors/ai"
+ }
+ ],
+ "dependencies": {
+ "nanoid": "^3.3.6",
+ "picocolors": "^1.0.0",
+ "source-map-js": "^1.0.2"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14"
+ }
+ },
+ "node_modules/postcss-calc": {
+ "version": "8.2.4",
+ "resolved": "https://registry.npmjs.org/postcss-calc/-/postcss-calc-8.2.4.tgz",
+ "integrity": "sha512-SmWMSJmB8MRnnULldx0lQIyhSNvuDl9HfrZkaqqE/WHAhToYsAvDq+yAsA/kIyINDszOp3Rh0GFoNuH5Ypsm3Q==",
+ "dependencies": {
+ "postcss-selector-parser": "^6.0.9",
+ "postcss-value-parser": "^4.2.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.2"
+ }
+ },
+ "node_modules/postcss-colormin": {
+ "version": "5.3.1",
+ "resolved": "https://registry.npmjs.org/postcss-colormin/-/postcss-colormin-5.3.1.tgz",
+ "integrity": "sha512-UsWQG0AqTFQmpBegeLLc1+c3jIqBNB0zlDGRWR+dQ3pRKJL1oeMzyqmH3o2PIfn9MBdNrVPWhDbT769LxCTLJQ==",
+ "dependencies": {
+ "browserslist": "^4.21.4",
+ "caniuse-api": "^3.0.0",
+ "colord": "^2.9.1",
+ "postcss-value-parser": "^4.2.0"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/postcss-convert-values": {
+ "version": "5.1.3",
+ "resolved": "https://registry.npmjs.org/postcss-convert-values/-/postcss-convert-values-5.1.3.tgz",
+ "integrity": "sha512-82pC1xkJZtcJEfiLw6UXnXVXScgtBrjlO5CBmuDQc+dlb88ZYheFsjTn40+zBVi3DkfF7iezO0nJUPLcJK3pvA==",
+ "dependencies": {
+ "browserslist": "^4.21.4",
+ "postcss-value-parser": "^4.2.0"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/postcss-discard-comments": {
+ "version": "5.1.2",
+ "resolved": "https://registry.npmjs.org/postcss-discard-comments/-/postcss-discard-comments-5.1.2.tgz",
+ "integrity": "sha512-+L8208OVbHVF2UQf1iDmRcbdjJkuBF6IS29yBDSiWUIzpYaAhtNl6JYnYm12FnkeCwQqF5LeklOu6rAqgfBZqQ==",
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/postcss-discard-duplicates": {
+ "version": "5.1.0",
+ "resolved": "https://registry.npmjs.org/postcss-discard-duplicates/-/postcss-discard-duplicates-5.1.0.tgz",
+ "integrity": "sha512-zmX3IoSI2aoenxHV6C7plngHWWhUOV3sP1T8y2ifzxzbtnuhk1EdPwm0S1bIUNaJ2eNbWeGLEwzw8huPD67aQw==",
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/postcss-discard-empty": {
+ "version": "5.1.1",
+ "resolved": "https://registry.npmjs.org/postcss-discard-empty/-/postcss-discard-empty-5.1.1.tgz",
+ "integrity": "sha512-zPz4WljiSuLWsI0ir4Mcnr4qQQ5e1Ukc3i7UfE2XcrwKK2LIPIqE5jxMRxO6GbI3cv//ztXDsXwEWT3BHOGh3A==",
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/postcss-discard-overridden": {
+ "version": "5.1.0",
+ "resolved": "https://registry.npmjs.org/postcss-discard-overridden/-/postcss-discard-overridden-5.1.0.tgz",
+ "integrity": "sha512-21nOL7RqWR1kasIVdKs8HNqQJhFxLsyRfAnUDm4Fe4t4mCWL9OJiHvlHPjcd8zc5Myu89b/7wZDnOSjFgeWRtw==",
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/postcss-discard-unused": {
+ "version": "5.1.0",
+ "resolved": "https://registry.npmjs.org/postcss-discard-unused/-/postcss-discard-unused-5.1.0.tgz",
+ "integrity": "sha512-KwLWymI9hbwXmJa0dkrzpRbSJEh0vVUd7r8t0yOGPcfKzyJJxFM8kLyC5Ev9avji6nY95pOp1W6HqIrfT+0VGw==",
+ "dependencies": {
+ "postcss-selector-parser": "^6.0.5"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/postcss-loader": {
+ "version": "7.1.0",
+ "resolved": "https://registry.npmjs.org/postcss-loader/-/postcss-loader-7.1.0.tgz",
+ "integrity": "sha512-vTD2DJ8vJD0Vr1WzMQkRZWRjcynGh3t7NeoLg+Sb1TeuK7etiZfL/ZwHbaVa3M+Qni7Lj/29voV9IggnIUjlIw==",
+ "dependencies": {
+ "cosmiconfig": "^8.0.0",
+ "klona": "^2.0.6",
+ "semver": "^7.3.8"
+ },
+ "engines": {
+ "node": ">= 14.15.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/webpack"
+ },
+ "peerDependencies": {
+ "postcss": "^7.0.0 || ^8.0.1",
+ "webpack": "^5.0.0"
+ }
+ },
+ "node_modules/postcss-loader/node_modules/cosmiconfig": {
+ "version": "8.1.3",
+ "resolved": "https://registry.npmjs.org/cosmiconfig/-/cosmiconfig-8.1.3.tgz",
+ "integrity": "sha512-/UkO2JKI18b5jVMJUp0lvKFMpa/Gye+ZgZjKD+DGEN9y7NRcf/nK1A0sp67ONmKtnDCNMS44E6jrk0Yc3bDuUw==",
+ "dependencies": {
+ "import-fresh": "^3.2.1",
+ "js-yaml": "^4.1.0",
+ "parse-json": "^5.0.0",
+ "path-type": "^4.0.0"
+ },
+ "engines": {
+ "node": ">=14"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/d-fischer"
+ }
+ },
+ "node_modules/postcss-merge-idents": {
+ "version": "5.1.1",
+ "resolved": "https://registry.npmjs.org/postcss-merge-idents/-/postcss-merge-idents-5.1.1.tgz",
+ "integrity": "sha512-pCijL1TREiCoog5nQp7wUe+TUonA2tC2sQ54UGeMmryK3UFGIYKqDyjnqd6RcuI4znFn9hWSLNN8xKE/vWcUQw==",
+ "dependencies": {
+ "cssnano-utils": "^3.1.0",
+ "postcss-value-parser": "^4.2.0"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/postcss-merge-longhand": {
+ "version": "5.1.7",
+ "resolved": "https://registry.npmjs.org/postcss-merge-longhand/-/postcss-merge-longhand-5.1.7.tgz",
+ "integrity": "sha512-YCI9gZB+PLNskrK0BB3/2OzPnGhPkBEwmwhfYk1ilBHYVAZB7/tkTHFBAnCrvBBOmeYyMYw3DMjT55SyxMBzjQ==",
+ "dependencies": {
+ "postcss-value-parser": "^4.2.0",
+ "stylehacks": "^5.1.1"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/postcss-merge-rules": {
+ "version": "5.1.4",
+ "resolved": "https://registry.npmjs.org/postcss-merge-rules/-/postcss-merge-rules-5.1.4.tgz",
+ "integrity": "sha512-0R2IuYpgU93y9lhVbO/OylTtKMVcHb67zjWIfCiKR9rWL3GUk1677LAqD/BcHizukdZEjT8Ru3oHRoAYoJy44g==",
+ "dependencies": {
+ "browserslist": "^4.21.4",
+ "caniuse-api": "^3.0.0",
+ "cssnano-utils": "^3.1.0",
+ "postcss-selector-parser": "^6.0.5"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/postcss-minify-font-values": {
+ "version": "5.1.0",
+ "resolved": "https://registry.npmjs.org/postcss-minify-font-values/-/postcss-minify-font-values-5.1.0.tgz",
+ "integrity": "sha512-el3mYTgx13ZAPPirSVsHqFzl+BBBDrXvbySvPGFnQcTI4iNslrPaFq4muTkLZmKlGk4gyFAYUBMH30+HurREyA==",
+ "dependencies": {
+ "postcss-value-parser": "^4.2.0"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/postcss-minify-gradients": {
+ "version": "5.1.1",
+ "resolved": "https://registry.npmjs.org/postcss-minify-gradients/-/postcss-minify-gradients-5.1.1.tgz",
+ "integrity": "sha512-VGvXMTpCEo4qHTNSa9A0a3D+dxGFZCYwR6Jokk+/3oB6flu2/PnPXAh2x7x52EkY5xlIHLm+Le8tJxe/7TNhzw==",
+ "dependencies": {
+ "colord": "^2.9.1",
+ "cssnano-utils": "^3.1.0",
+ "postcss-value-parser": "^4.2.0"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/postcss-minify-params": {
+ "version": "5.1.4",
+ "resolved": "https://registry.npmjs.org/postcss-minify-params/-/postcss-minify-params-5.1.4.tgz",
+ "integrity": "sha512-+mePA3MgdmVmv6g+30rn57USjOGSAyuxUmkfiWpzalZ8aiBkdPYjXWtHuwJGm1v5Ojy0Z0LaSYhHaLJQB0P8Jw==",
+ "dependencies": {
+ "browserslist": "^4.21.4",
+ "cssnano-utils": "^3.1.0",
+ "postcss-value-parser": "^4.2.0"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/postcss-minify-selectors": {
+ "version": "5.2.1",
+ "resolved": "https://registry.npmjs.org/postcss-minify-selectors/-/postcss-minify-selectors-5.2.1.tgz",
+ "integrity": "sha512-nPJu7OjZJTsVUmPdm2TcaiohIwxP+v8ha9NehQ2ye9szv4orirRU3SDdtUmKH+10nzn0bAyOXZ0UEr7OpvLehg==",
+ "dependencies": {
+ "postcss-selector-parser": "^6.0.5"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/postcss-modules-extract-imports": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/postcss-modules-extract-imports/-/postcss-modules-extract-imports-3.0.0.tgz",
+ "integrity": "sha512-bdHleFnP3kZ4NYDhuGlVK+CMrQ/pqUm8bx/oGL93K6gVwiclvX5x0n76fYMKuIGKzlABOy13zsvqjb0f92TEXw==",
+ "engines": {
+ "node": "^10 || ^12 || >= 14"
+ },
+ "peerDependencies": {
+ "postcss": "^8.1.0"
+ }
+ },
+ "node_modules/postcss-modules-local-by-default": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/postcss-modules-local-by-default/-/postcss-modules-local-by-default-4.0.0.tgz",
+ "integrity": "sha512-sT7ihtmGSF9yhm6ggikHdV0hlziDTX7oFoXtuVWeDd3hHObNkcHRo9V3yg7vCAY7cONyxJC/XXCmmiHHcvX7bQ==",
+ "dependencies": {
+ "icss-utils": "^5.0.0",
+ "postcss-selector-parser": "^6.0.2",
+ "postcss-value-parser": "^4.1.0"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >= 14"
+ },
+ "peerDependencies": {
+ "postcss": "^8.1.0"
+ }
+ },
+ "node_modules/postcss-modules-scope": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/postcss-modules-scope/-/postcss-modules-scope-3.0.0.tgz",
+ "integrity": "sha512-hncihwFA2yPath8oZ15PZqvWGkWf+XUfQgUGamS4LqoP1anQLOsOJw0vr7J7IwLpoY9fatA2qiGUGmuZL0Iqlg==",
+ "dependencies": {
+ "postcss-selector-parser": "^6.0.4"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >= 14"
+ },
+ "peerDependencies": {
+ "postcss": "^8.1.0"
+ }
+ },
+ "node_modules/postcss-modules-values": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/postcss-modules-values/-/postcss-modules-values-4.0.0.tgz",
+ "integrity": "sha512-RDxHkAiEGI78gS2ofyvCsu7iycRv7oqw5xMWn9iMoR0N/7mf9D50ecQqUo5BZ9Zh2vH4bCUR/ktCqbB9m8vJjQ==",
+ "dependencies": {
+ "icss-utils": "^5.0.0"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >= 14"
+ },
+ "peerDependencies": {
+ "postcss": "^8.1.0"
+ }
+ },
+ "node_modules/postcss-normalize-charset": {
+ "version": "5.1.0",
+ "resolved": "https://registry.npmjs.org/postcss-normalize-charset/-/postcss-normalize-charset-5.1.0.tgz",
+ "integrity": "sha512-mSgUJ+pd/ldRGVx26p2wz9dNZ7ji6Pn8VWBajMXFf8jk7vUoSrZ2lt/wZR7DtlZYKesmZI680qjr2CeFF2fbUg==",
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/postcss-normalize-display-values": {
+ "version": "5.1.0",
+ "resolved": "https://registry.npmjs.org/postcss-normalize-display-values/-/postcss-normalize-display-values-5.1.0.tgz",
+ "integrity": "sha512-WP4KIM4o2dazQXWmFaqMmcvsKmhdINFblgSeRgn8BJ6vxaMyaJkwAzpPpuvSIoG/rmX3M+IrRZEz2H0glrQNEA==",
+ "dependencies": {
+ "postcss-value-parser": "^4.2.0"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/postcss-normalize-positions": {
+ "version": "5.1.1",
+ "resolved": "https://registry.npmjs.org/postcss-normalize-positions/-/postcss-normalize-positions-5.1.1.tgz",
+ "integrity": "sha512-6UpCb0G4eofTCQLFVuI3EVNZzBNPiIKcA1AKVka+31fTVySphr3VUgAIULBhxZkKgwLImhzMR2Bw1ORK+37INg==",
+ "dependencies": {
+ "postcss-value-parser": "^4.2.0"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/postcss-normalize-repeat-style": {
+ "version": "5.1.1",
+ "resolved": "https://registry.npmjs.org/postcss-normalize-repeat-style/-/postcss-normalize-repeat-style-5.1.1.tgz",
+ "integrity": "sha512-mFpLspGWkQtBcWIRFLmewo8aC3ImN2i/J3v8YCFUwDnPu3Xz4rLohDO26lGjwNsQxB3YF0KKRwspGzE2JEuS0g==",
+ "dependencies": {
+ "postcss-value-parser": "^4.2.0"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/postcss-normalize-string": {
+ "version": "5.1.0",
+ "resolved": "https://registry.npmjs.org/postcss-normalize-string/-/postcss-normalize-string-5.1.0.tgz",
+ "integrity": "sha512-oYiIJOf4T9T1N4i+abeIc7Vgm/xPCGih4bZz5Nm0/ARVJ7K6xrDlLwvwqOydvyL3RHNf8qZk6vo3aatiw/go3w==",
+ "dependencies": {
+ "postcss-value-parser": "^4.2.0"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/postcss-normalize-timing-functions": {
+ "version": "5.1.0",
+ "resolved": "https://registry.npmjs.org/postcss-normalize-timing-functions/-/postcss-normalize-timing-functions-5.1.0.tgz",
+ "integrity": "sha512-DOEkzJ4SAXv5xkHl0Wa9cZLF3WCBhF3o1SKVxKQAa+0pYKlueTpCgvkFAHfk+Y64ezX9+nITGrDZeVGgITJXjg==",
+ "dependencies": {
+ "postcss-value-parser": "^4.2.0"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/postcss-normalize-unicode": {
+ "version": "5.1.1",
+ "resolved": "https://registry.npmjs.org/postcss-normalize-unicode/-/postcss-normalize-unicode-5.1.1.tgz",
+ "integrity": "sha512-qnCL5jzkNUmKVhZoENp1mJiGNPcsJCs1aaRmURmeJGES23Z/ajaln+EPTD+rBeNkSryI+2WTdW+lwcVdOikrpA==",
+ "dependencies": {
+ "browserslist": "^4.21.4",
+ "postcss-value-parser": "^4.2.0"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/postcss-normalize-url": {
+ "version": "5.1.0",
+ "resolved": "https://registry.npmjs.org/postcss-normalize-url/-/postcss-normalize-url-5.1.0.tgz",
+ "integrity": "sha512-5upGeDO+PVthOxSmds43ZeMeZfKH+/DKgGRD7TElkkyS46JXAUhMzIKiCa7BabPeIy3AQcTkXwVVN7DbqsiCew==",
+ "dependencies": {
+ "normalize-url": "^6.0.1",
+ "postcss-value-parser": "^4.2.0"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/postcss-normalize-whitespace": {
+ "version": "5.1.1",
+ "resolved": "https://registry.npmjs.org/postcss-normalize-whitespace/-/postcss-normalize-whitespace-5.1.1.tgz",
+ "integrity": "sha512-83ZJ4t3NUDETIHTa3uEg6asWjSBYL5EdkVB0sDncx9ERzOKBVJIUeDO9RyA9Zwtig8El1d79HBp0JEi8wvGQnA==",
+ "dependencies": {
+ "postcss-value-parser": "^4.2.0"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/postcss-ordered-values": {
+ "version": "5.1.3",
+ "resolved": "https://registry.npmjs.org/postcss-ordered-values/-/postcss-ordered-values-5.1.3.tgz",
+ "integrity": "sha512-9UO79VUhPwEkzbb3RNpqqghc6lcYej1aveQteWY+4POIwlqkYE21HKWaLDF6lWNuqCobEAyTovVhtI32Rbv2RQ==",
+ "dependencies": {
+ "cssnano-utils": "^3.1.0",
+ "postcss-value-parser": "^4.2.0"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/postcss-reduce-idents": {
+ "version": "5.2.0",
+ "resolved": "https://registry.npmjs.org/postcss-reduce-idents/-/postcss-reduce-idents-5.2.0.tgz",
+ "integrity": "sha512-BTrLjICoSB6gxbc58D5mdBK8OhXRDqud/zodYfdSi52qvDHdMwk+9kB9xsM8yJThH/sZU5A6QVSmMmaN001gIg==",
+ "dependencies": {
+ "postcss-value-parser": "^4.2.0"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/postcss-reduce-initial": {
+ "version": "5.1.2",
+ "resolved": "https://registry.npmjs.org/postcss-reduce-initial/-/postcss-reduce-initial-5.1.2.tgz",
+ "integrity": "sha512-dE/y2XRaqAi6OvjzD22pjTUQ8eOfc6m/natGHgKFBK9DxFmIm69YmaRVQrGgFlEfc1HePIurY0TmDeROK05rIg==",
+ "dependencies": {
+ "browserslist": "^4.21.4",
+ "caniuse-api": "^3.0.0"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/postcss-reduce-transforms": {
+ "version": "5.1.0",
+ "resolved": "https://registry.npmjs.org/postcss-reduce-transforms/-/postcss-reduce-transforms-5.1.0.tgz",
+ "integrity": "sha512-2fbdbmgir5AvpW9RLtdONx1QoYG2/EtqpNQbFASDlixBbAYuTcJ0dECwlqNqH7VbaUnEnh8SrxOe2sRIn24XyQ==",
+ "dependencies": {
+ "postcss-value-parser": "^4.2.0"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/postcss-selector-parser": {
+ "version": "6.0.11",
+ "resolved": "https://registry.npmjs.org/postcss-selector-parser/-/postcss-selector-parser-6.0.11.tgz",
+ "integrity": "sha512-zbARubNdogI9j7WY4nQJBiNqQf3sLS3wCP4WfOidu+p28LofJqDH1tcXypGrcmMHhDk2t9wGhCsYe/+szLTy1g==",
+ "dependencies": {
+ "cssesc": "^3.0.0",
+ "util-deprecate": "^1.0.2"
+ },
+ "engines": {
+ "node": ">=4"
+ }
+ },
+ "node_modules/postcss-sort-media-queries": {
+ "version": "4.3.0",
+ "resolved": "https://registry.npmjs.org/postcss-sort-media-queries/-/postcss-sort-media-queries-4.3.0.tgz",
+ "integrity": "sha512-jAl8gJM2DvuIJiI9sL1CuiHtKM4s5aEIomkU8G3LFvbP+p8i7Sz8VV63uieTgoewGqKbi+hxBTiOKJlB35upCg==",
+ "dependencies": {
+ "sort-css-media-queries": "2.1.0"
+ },
+ "engines": {
+ "node": ">=10.0.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.4.16"
+ }
+ },
+ "node_modules/postcss-svgo": {
+ "version": "5.1.0",
+ "resolved": "https://registry.npmjs.org/postcss-svgo/-/postcss-svgo-5.1.0.tgz",
+ "integrity": "sha512-D75KsH1zm5ZrHyxPakAxJWtkyXew5qwS70v56exwvw542d9CRtTo78K0WeFxZB4G7JXKKMbEZtZayTGdIky/eA==",
+ "dependencies": {
+ "postcss-value-parser": "^4.2.0",
+ "svgo": "^2.7.0"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/postcss-unique-selectors": {
+ "version": "5.1.1",
+ "resolved": "https://registry.npmjs.org/postcss-unique-selectors/-/postcss-unique-selectors-5.1.1.tgz",
+ "integrity": "sha512-5JiODlELrz8L2HwxfPnhOWZYWDxVHWL83ufOv84NrcgipI7TaeRsatAhK4Tr2/ZiYldpK/wBvw5BD3qfaK96GA==",
+ "dependencies": {
+ "postcss-selector-parser": "^6.0.5"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/postcss-value-parser": {
+ "version": "4.2.0",
+ "resolved": "https://registry.npmjs.org/postcss-value-parser/-/postcss-value-parser-4.2.0.tgz",
+ "integrity": "sha512-1NNCs6uurfkVbeXG4S8JFT9t19m45ICnif8zWLd5oPSZ50QnwMfK+H3jv408d4jw/7Bttv5axS5IiHoLaVNHeQ=="
+ },
+ "node_modules/postcss-zindex": {
+ "version": "5.1.0",
+ "resolved": "https://registry.npmjs.org/postcss-zindex/-/postcss-zindex-5.1.0.tgz",
+ "integrity": "sha512-fgFMf0OtVSBR1va1JNHYgMxYk73yhn/qb4uQDq1DLGYolz8gHCyr/sesEuGUaYs58E3ZJRcpoGuPVoB7Meiq9A==",
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/preact": {
+ "version": "10.13.2",
+ "resolved": "https://registry.npmjs.org/preact/-/preact-10.13.2.tgz",
+ "integrity": "sha512-q44QFLhOhty2Bd0Y46fnYW0gD/cbVM9dUVtNTDKPcdXSMA7jfY+Jpd6rk3GB0lcQss0z5s/6CmVP0Z/hV+g6pw==",
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/preact"
+ }
+ },
+ "node_modules/prepend-http": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/prepend-http/-/prepend-http-2.0.0.tgz",
+ "integrity": "sha512-ravE6m9Atw9Z/jjttRUZ+clIXogdghyZAuWJ3qEzjT+jI/dL1ifAqhZeC5VHzQp1MSt1+jxKkFNemj/iO7tVUA==",
+ "engines": {
+ "node": ">=4"
+ }
+ },
+ "node_modules/pretty-error": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/pretty-error/-/pretty-error-4.0.0.tgz",
+ "integrity": "sha512-AoJ5YMAcXKYxKhuJGdcvse+Voc6v1RgnsR3nWcYU7q4t6z0Q6T86sv5Zq8VIRbOWWFpvdGE83LtdSMNd+6Y0xw==",
+ "dependencies": {
+ "lodash": "^4.17.20",
+ "renderkid": "^3.0.0"
+ }
+ },
+ "node_modules/pretty-time": {
+ "version": "1.1.0",
+ "resolved": "https://registry.npmjs.org/pretty-time/-/pretty-time-1.1.0.tgz",
+ "integrity": "sha512-28iF6xPQrP8Oa6uxE6a1biz+lWeTOAPKggvjB8HAs6nVMKZwf5bG++632Dx614hIWgUPkgivRfG+a8uAXGTIbA==",
+ "engines": {
+ "node": ">=4"
+ }
+ },
+ "node_modules/prism-react-renderer": {
+ "version": "1.3.5",
+ "resolved": "https://registry.npmjs.org/prism-react-renderer/-/prism-react-renderer-1.3.5.tgz",
+ "integrity": "sha512-IJ+MSwBWKG+SM3b2SUfdrhC+gu01QkV2KmRQgREThBfSQRoufqRfxfHUxpG1WcaFjP+kojcFyO9Qqtpgt3qLCg==",
+ "peerDependencies": {
+ "react": ">=0.14.9"
+ }
+ },
+ "node_modules/prismjs": {
+ "version": "1.30.0",
+ "resolved": "https://registry.npmjs.org/prismjs/-/prismjs-1.30.0.tgz",
+ "integrity": "sha512-DEvV2ZF2r2/63V+tK8hQvrR2ZGn10srHbXviTlcv7Kpzw8jWiNTqbVgjO3IY8RxrrOUF8VPMQQFysYYYv0YZxw==",
+ "license": "MIT",
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/process-nextick-args": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/process-nextick-args/-/process-nextick-args-2.0.1.tgz",
+ "integrity": "sha512-3ouUOpQhtgrbOa17J7+uxOTpITYWaGP7/AhoR3+A+/1e9skrzelGi/dXzEYyvbxubEF6Wn2ypscTKiKJFFn1ag=="
+ },
+ "node_modules/promise": {
+ "version": "7.3.1",
+ "resolved": "https://registry.npmjs.org/promise/-/promise-7.3.1.tgz",
+ "integrity": "sha512-nolQXZ/4L+bP/UGlkfaIujX9BKxGwmQ9OT4mOt5yvy8iK1h3wqTEJCijzGANTCCl9nWjY41juyAn2K3Q1hLLTg==",
+ "dependencies": {
+ "asap": "~2.0.3"
+ }
+ },
+ "node_modules/prompts": {
+ "version": "2.4.2",
+ "resolved": "https://registry.npmjs.org/prompts/-/prompts-2.4.2.tgz",
+ "integrity": "sha512-NxNv/kLguCA7p3jE8oL2aEBsrJWgAakBpgmgK6lpPWV+WuOmY6r2/zbAVnP+T8bQlA0nzHXSJSJW0Hq7ylaD2Q==",
+ "dependencies": {
+ "kleur": "^3.0.3",
+ "sisteransi": "^1.0.5"
+ },
+ "engines": {
+ "node": ">= 6"
+ }
+ },
+ "node_modules/prop-types": {
+ "version": "15.8.1",
+ "resolved": "https://registry.npmjs.org/prop-types/-/prop-types-15.8.1.tgz",
+ "integrity": "sha512-oj87CgZICdulUohogVAR7AjlC0327U4el4L6eAvOqCeudMDVU0NThNaV+b9Df4dXgSP1gXMTnPdhfe/2qDH5cg==",
+ "dependencies": {
+ "loose-envify": "^1.4.0",
+ "object-assign": "^4.1.1",
+ "react-is": "^16.13.1"
+ }
+ },
+ "node_modules/property-information": {
+ "version": "5.6.0",
+ "resolved": "https://registry.npmjs.org/property-information/-/property-information-5.6.0.tgz",
+ "integrity": "sha512-YUHSPk+A30YPv+0Qf8i9Mbfe/C0hdPXk1s1jPVToV8pk8BQtpw10ct89Eo7OWkutrwqvT0eicAxlOg3dOAu8JA==",
+ "dependencies": {
+ "xtend": "^4.0.0"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/proxy-addr": {
+ "version": "2.0.7",
+ "resolved": "https://registry.npmjs.org/proxy-addr/-/proxy-addr-2.0.7.tgz",
+ "integrity": "sha512-llQsMLSUDUPT44jdrU/O37qlnifitDP+ZwrmmZcoSKyLKvtZxpyV0n2/bD/N4tBAAZ/gJEdZU7KMraoK1+XYAg==",
+ "dependencies": {
+ "forwarded": "0.2.0",
+ "ipaddr.js": "1.9.1"
+ },
+ "engines": {
+ "node": ">= 0.10"
+ }
+ },
+ "node_modules/proxy-addr/node_modules/ipaddr.js": {
+ "version": "1.9.1",
+ "resolved": "https://registry.npmjs.org/ipaddr.js/-/ipaddr.js-1.9.1.tgz",
+ "integrity": "sha512-0KI/607xoxSToH7GjN1FfSbLoU0+btTicjsQSWQlh/hZykN8KpmMf7uYwPW3R+akZ6R/w18ZlXSHBYXiYUPO3g==",
+ "engines": {
+ "node": ">= 0.10"
+ }
+ },
+ "node_modules/pump": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/pump/-/pump-3.0.0.tgz",
+ "integrity": "sha512-LwZy+p3SFs1Pytd/jYct4wpv49HiYCqd9Rlc5ZVdk0V+8Yzv6jR5Blk3TRmPL1ft69TxP0IMZGJ+WPFU2BFhww==",
+ "dependencies": {
+ "end-of-stream": "^1.1.0",
+ "once": "^1.3.1"
+ }
+ },
+ "node_modules/punycode": {
+ "version": "1.4.1",
+ "resolved": "https://registry.npmjs.org/punycode/-/punycode-1.4.1.tgz",
+ "integrity": "sha512-jmYNElW7yvO7TV33CjSmvSiE2yco3bV2czu/OzDKdMNVZQWfxCblURLhf+47syQRBntjfLdd/H0egrzIG+oaFQ=="
+ },
+ "node_modules/pupa": {
+ "version": "2.1.1",
+ "resolved": "https://registry.npmjs.org/pupa/-/pupa-2.1.1.tgz",
+ "integrity": "sha512-l1jNAspIBSFqbT+y+5FosojNpVpF94nlI+wDUpqP9enwOTfHx9f0gh5nB96vl+6yTpsJsypeNrwfzPrKuHB41A==",
+ "dependencies": {
+ "escape-goat": "^2.0.0"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/pure-color": {
+ "version": "1.3.0",
+ "resolved": "https://registry.npmjs.org/pure-color/-/pure-color-1.3.0.tgz",
+ "integrity": "sha512-QFADYnsVoBMw1srW7OVKEYjG+MbIa49s54w1MA1EDY6r2r/sTcKKYqRX1f4GYvnXP7eN/Pe9HFcX+hwzmrXRHA=="
+ },
+ "node_modules/qs": {
+ "version": "6.13.0",
+ "resolved": "https://registry.npmjs.org/qs/-/qs-6.13.0.tgz",
+ "integrity": "sha512-+38qI9SOr8tfZ4QmJNplMUxqjbe7LKvvZgWdExBOmd+egZTtjLB67Gu0HRX3u/XOq7UU2Nx6nsjvS16Z9uwfpg==",
+ "dependencies": {
+ "side-channel": "^1.0.6"
+ },
+ "engines": {
+ "node": ">=0.6"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/ljharb"
+ }
+ },
+ "node_modules/queue": {
+ "version": "6.0.2",
+ "resolved": "https://registry.npmjs.org/queue/-/queue-6.0.2.tgz",
+ "integrity": "sha512-iHZWu+q3IdFZFX36ro/lKBkSvfkztY5Y7HMiPlOUjhupPcG2JMfst2KKEpu5XndviX/3UhFbRngUPNKtgvtZiA==",
+ "dependencies": {
+ "inherits": "~2.0.3"
+ }
+ },
+ "node_modules/queue-microtask": {
+ "version": "1.2.3",
+ "resolved": "https://registry.npmjs.org/queue-microtask/-/queue-microtask-1.2.3.tgz",
+ "integrity": "sha512-NuaNSa6flKT5JaSYQzJok04JzTL1CA6aGhv5rfLW3PgqA+M2ChpZQnAC8h8i4ZFkBS8X5RqkDBHA7r4hej3K9A==",
+ "funding": [
+ {
+ "type": "github",
+ "url": "https://github.com/sponsors/feross"
+ },
+ {
+ "type": "patreon",
+ "url": "https://www.patreon.com/feross"
+ },
+ {
+ "type": "consulting",
+ "url": "https://feross.org/support"
+ }
+ ]
+ },
+ "node_modules/randombytes": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmjs.org/randombytes/-/randombytes-2.1.0.tgz",
+ "integrity": "sha512-vYl3iOX+4CKUWuxGi9Ukhie6fsqXqS9FE2Zaic4tNFD2N2QQaXOMFbuKK4QmDHC0JO6B1Zp41J0LpT0oR68amQ==",
+ "dependencies": {
+ "safe-buffer": "^5.1.0"
+ }
+ },
+ "node_modules/range-parser": {
+ "version": "1.2.0",
+ "resolved": "https://registry.npmjs.org/range-parser/-/range-parser-1.2.0.tgz",
+ "integrity": "sha512-kA5WQoNVo4t9lNx2kQNFCxKeBl5IbbSNBl1M/tLkw9WCn+hxNBAW5Qh8gdhs63CJnhjJ2zQWFoqPJP2sK1AV5A==",
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/raw-body": {
+ "version": "2.5.2",
+ "resolved": "https://registry.npmjs.org/raw-body/-/raw-body-2.5.2.tgz",
+ "integrity": "sha512-8zGqypfENjCIqGhgXToC8aB2r7YrBX+AQAfIPs/Mlk+BtPTztOvTS01NRW/3Eh60J+a48lt8qsCzirQ6loCVfA==",
+ "dependencies": {
+ "bytes": "3.1.2",
+ "http-errors": "2.0.0",
+ "iconv-lite": "0.4.24",
+ "unpipe": "1.0.0"
+ },
+ "engines": {
+ "node": ">= 0.8"
+ }
+ },
+ "node_modules/raw-body/node_modules/bytes": {
+ "version": "3.1.2",
+ "resolved": "https://registry.npmjs.org/bytes/-/bytes-3.1.2.tgz",
+ "integrity": "sha512-/Nf7TyzTx6S3yRJObOAV7956r8cr2+Oj8AC5dt8wSP3BQAoeX58NoHyCU8P8zGkNXStjTSi6fzO6F0pBdcYbEg==",
+ "engines": {
+ "node": ">= 0.8"
+ }
+ },
+ "node_modules/rc": {
+ "version": "1.2.8",
+ "resolved": "https://registry.npmjs.org/rc/-/rc-1.2.8.tgz",
+ "integrity": "sha512-y3bGgqKj3QBdxLbLkomlohkvsA8gdAiUQlSBJnBhfn+BPxg4bc62d8TcBW15wavDfgexCgccckhcZvywyQYPOw==",
+ "dependencies": {
+ "deep-extend": "^0.6.0",
+ "ini": "~1.3.0",
+ "minimist": "^1.2.0",
+ "strip-json-comments": "~2.0.1"
+ },
+ "bin": {
+ "rc": "cli.js"
+ }
+ },
+ "node_modules/rc/node_modules/strip-json-comments": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/strip-json-comments/-/strip-json-comments-2.0.1.tgz",
+ "integrity": "sha512-4gB8na07fecVVkOI6Rs4e7T6NOTki5EmL7TUduTs6bu3EdnSycntVJ4re8kgZA+wx9IueI2Y11bfbgwtzuE0KQ==",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/react": {
+ "version": "17.0.2",
+ "resolved": "https://registry.npmjs.org/react/-/react-17.0.2.tgz",
+ "integrity": "sha512-gnhPt75i/dq/z3/6q/0asP78D0u592D5L1pd7M8P+dck6Fu/jJeL6iVVK23fptSUZj8Vjf++7wXA8UNclGQcbA==",
+ "dependencies": {
+ "loose-envify": "^1.1.0",
+ "object-assign": "^4.1.1"
+ },
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/react-base16-styling": {
+ "version": "0.6.0",
+ "resolved": "https://registry.npmjs.org/react-base16-styling/-/react-base16-styling-0.6.0.tgz",
+ "integrity": "sha512-yvh/7CArceR/jNATXOKDlvTnPKPmGZz7zsenQ3jUwLzHkNUR0CvY3yGYJbWJ/nnxsL8Sgmt5cO3/SILVuPO6TQ==",
+ "dependencies": {
+ "base16": "^1.0.0",
+ "lodash.curry": "^4.0.1",
+ "lodash.flow": "^3.3.0",
+ "pure-color": "^1.2.0"
+ }
+ },
+ "node_modules/react-dev-utils": {
+ "version": "12.0.1",
+ "resolved": "https://registry.npmjs.org/react-dev-utils/-/react-dev-utils-12.0.1.tgz",
+ "integrity": "sha512-84Ivxmr17KjUupyqzFode6xKhjwuEJDROWKJy/BthkL7Wn6NJ8h4WE6k/exAv6ImS+0oZLRRW5j/aINMHyeGeQ==",
+ "dependencies": {
+ "@babel/code-frame": "^7.16.0",
+ "address": "^1.1.2",
+ "browserslist": "^4.18.1",
+ "chalk": "^4.1.2",
+ "cross-spawn": "^7.0.3",
+ "detect-port-alt": "^1.1.6",
+ "escape-string-regexp": "^4.0.0",
+ "filesize": "^8.0.6",
+ "find-up": "^5.0.0",
+ "fork-ts-checker-webpack-plugin": "^6.5.0",
+ "global-modules": "^2.0.0",
+ "globby": "^11.0.4",
+ "gzip-size": "^6.0.0",
+ "immer": "^9.0.7",
+ "is-root": "^2.1.0",
+ "loader-utils": "^3.2.0",
+ "open": "^8.4.0",
+ "pkg-up": "^3.1.0",
+ "prompts": "^2.4.2",
+ "react-error-overlay": "^6.0.11",
+ "recursive-readdir": "^2.2.2",
+ "shell-quote": "^1.7.3",
+ "strip-ansi": "^6.0.1",
+ "text-table": "^0.2.0"
+ },
+ "engines": {
+ "node": ">=14"
+ }
+ },
+ "node_modules/react-dev-utils/node_modules/find-up": {
+ "version": "5.0.0",
+ "resolved": "https://registry.npmjs.org/find-up/-/find-up-5.0.0.tgz",
+ "integrity": "sha512-78/PXT1wlLLDgTzDs7sjq9hzz0vXD+zn+7wypEe4fXQxCmdmqfGsEPQxmiCSQI3ajFV91bVSsvNtrJRiW6nGng==",
+ "dependencies": {
+ "locate-path": "^6.0.0",
+ "path-exists": "^4.0.0"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/react-dev-utils/node_modules/loader-utils": {
+ "version": "3.2.1",
+ "resolved": "https://registry.npmjs.org/loader-utils/-/loader-utils-3.2.1.tgz",
+ "integrity": "sha512-ZvFw1KWS3GVyYBYb7qkmRM/WwL2TQQBxgCK62rlvm4WpVQ23Nb4tYjApUlfjrEGvOs7KHEsmyUn75OHZrJMWPw==",
+ "engines": {
+ "node": ">= 12.13.0"
+ }
+ },
+ "node_modules/react-dev-utils/node_modules/locate-path": {
+ "version": "6.0.0",
+ "resolved": "https://registry.npmjs.org/locate-path/-/locate-path-6.0.0.tgz",
+ "integrity": "sha512-iPZK6eYjbxRu3uB4/WZ3EsEIMJFMqAoopl3R+zuq0UjcAm/MO6KCweDgPfP3elTztoKP3KtnVHxTn2NHBSDVUw==",
+ "dependencies": {
+ "p-locate": "^5.0.0"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/react-dev-utils/node_modules/p-limit": {
+ "version": "3.1.0",
+ "resolved": "https://registry.npmjs.org/p-limit/-/p-limit-3.1.0.tgz",
+ "integrity": "sha512-TYOanM3wGwNGsZN2cVTYPArw454xnXj5qmWF1bEoAc4+cU/ol7GVh7odevjp1FNHduHc3KZMcFduxU5Xc6uJRQ==",
+ "dependencies": {
+ "yocto-queue": "^0.1.0"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/react-dev-utils/node_modules/p-locate": {
+ "version": "5.0.0",
+ "resolved": "https://registry.npmjs.org/p-locate/-/p-locate-5.0.0.tgz",
+ "integrity": "sha512-LaNjtRWUBY++zB5nE/NwcaoMylSPk+S+ZHNB1TzdbMJMny6dynpAGt7X/tl/QYq3TIeE6nxHppbo2LGymrG5Pw==",
+ "dependencies": {
+ "p-limit": "^3.0.2"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/react-dom": {
+ "version": "17.0.2",
+ "resolved": "https://registry.npmjs.org/react-dom/-/react-dom-17.0.2.tgz",
+ "integrity": "sha512-s4h96KtLDUQlsENhMn1ar8t2bEa+q/YAtj8pPPdIjPDGBDIVNsrD9aXNWqspUe6AzKCIG0C1HZZLqLV7qpOBGA==",
+ "dependencies": {
+ "loose-envify": "^1.1.0",
+ "object-assign": "^4.1.1",
+ "scheduler": "^0.20.2"
+ },
+ "peerDependencies": {
+ "react": "17.0.2"
+ }
+ },
+ "node_modules/react-error-overlay": {
+ "version": "6.0.11",
+ "resolved": "https://registry.npmjs.org/react-error-overlay/-/react-error-overlay-6.0.11.tgz",
+ "integrity": "sha512-/6UZ2qgEyH2aqzYZgQPxEnz33NJ2gNsnHA2o5+o4wW9bLM/JYQitNP9xPhsXwC08hMMovfGe/8retsdDsczPRg=="
+ },
+ "node_modules/react-fast-compare": {
+ "version": "3.2.1",
+ "resolved": "https://registry.npmjs.org/react-fast-compare/-/react-fast-compare-3.2.1.tgz",
+ "integrity": "sha512-xTYf9zFim2pEif/Fw16dBiXpe0hoy5PxcD8+OwBnTtNLfIm3g6WxhKNurY+6OmdH1u6Ta/W/Vl6vjbYP1MFnDg=="
+ },
+ "node_modules/react-helmet-async": {
+ "version": "1.3.0",
+ "resolved": "https://registry.npmjs.org/react-helmet-async/-/react-helmet-async-1.3.0.tgz",
+ "integrity": "sha512-9jZ57/dAn9t3q6hneQS0wukqC2ENOBgMNVEhb/ZG9ZSxUetzVIw4iAmEU38IaVg3QGYauQPhSeUTuIUtFglWpg==",
+ "dependencies": {
+ "@babel/runtime": "^7.12.5",
+ "invariant": "^2.2.4",
+ "prop-types": "^15.7.2",
+ "react-fast-compare": "^3.2.0",
+ "shallowequal": "^1.1.0"
+ },
+ "peerDependencies": {
+ "react": "^16.6.0 || ^17.0.0 || ^18.0.0",
+ "react-dom": "^16.6.0 || ^17.0.0 || ^18.0.0"
+ }
+ },
+ "node_modules/react-is": {
+ "version": "16.13.1",
+ "resolved": "https://registry.npmjs.org/react-is/-/react-is-16.13.1.tgz",
+ "integrity": "sha512-24e6ynE2H+OKt4kqsOvNd8kBpV65zoxbA4BVsEOB3ARVWQki/DHzaUoC5KuON/BiccDaCCTZBuOcfZs70kR8bQ=="
+ },
+ "node_modules/react-json-view": {
+ "version": "1.21.3",
+ "resolved": "https://registry.npmjs.org/react-json-view/-/react-json-view-1.21.3.tgz",
+ "integrity": "sha512-13p8IREj9/x/Ye4WI/JpjhoIwuzEgUAtgJZNBJckfzJt1qyh24BdTm6UQNGnyTq9dapQdrqvquZTo3dz1X6Cjw==",
+ "dependencies": {
+ "flux": "^4.0.1",
+ "react-base16-styling": "^0.6.0",
+ "react-lifecycles-compat": "^3.0.4",
+ "react-textarea-autosize": "^8.3.2"
+ },
+ "peerDependencies": {
+ "react": "^17.0.0 || ^16.3.0 || ^15.5.4",
+ "react-dom": "^17.0.0 || ^16.3.0 || ^15.5.4"
+ }
+ },
+ "node_modules/react-lifecycles-compat": {
+ "version": "3.0.4",
+ "resolved": "https://registry.npmjs.org/react-lifecycles-compat/-/react-lifecycles-compat-3.0.4.tgz",
+ "integrity": "sha512-fBASbA6LnOU9dOU2eW7aQ8xmYBSXUIWr+UmF9b1efZBazGNO+rcXT/icdKnYm2pTwcRylVUYwW7H1PHfLekVzA=="
+ },
+ "node_modules/react-loadable": {
+ "name": "@docusaurus/react-loadable",
+ "version": "5.5.2",
+ "resolved": "https://registry.npmjs.org/@docusaurus/react-loadable/-/react-loadable-5.5.2.tgz",
+ "integrity": "sha512-A3dYjdBGuy0IGT+wyLIGIKLRE+sAk1iNk0f1HjNDysO7u8lhL4N3VEm+FAubmJbAztn94F7MxBTPmnixbiyFdQ==",
+ "dependencies": {
+ "@types/react": "*",
+ "prop-types": "^15.6.2"
+ },
+ "peerDependencies": {
+ "react": "*"
+ }
+ },
+ "node_modules/react-loadable-ssr-addon-v5-slorber": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmjs.org/react-loadable-ssr-addon-v5-slorber/-/react-loadable-ssr-addon-v5-slorber-1.0.1.tgz",
+ "integrity": "sha512-lq3Lyw1lGku8zUEJPDxsNm1AfYHBrO9Y1+olAYwpUJ2IGFBskM0DMKok97A6LWUpHm+o7IvQBOWu9MLenp9Z+A==",
+ "dependencies": {
+ "@babel/runtime": "^7.10.3"
+ },
+ "engines": {
+ "node": ">=10.13.0"
+ },
+ "peerDependencies": {
+ "react-loadable": "*",
+ "webpack": ">=4.41.1 || 5.x"
+ }
+ },
+ "node_modules/react-router": {
+ "version": "5.3.4",
+ "resolved": "https://registry.npmjs.org/react-router/-/react-router-5.3.4.tgz",
+ "integrity": "sha512-Ys9K+ppnJah3QuaRiLxk+jDWOR1MekYQrlytiXxC1RyfbdsZkS5pvKAzCCr031xHixZwpnsYNT5xysdFHQaYsA==",
+ "dependencies": {
+ "@babel/runtime": "^7.12.13",
+ "history": "^4.9.0",
+ "hoist-non-react-statics": "^3.1.0",
+ "loose-envify": "^1.3.1",
+ "path-to-regexp": "^1.7.0",
+ "prop-types": "^15.6.2",
+ "react-is": "^16.6.0",
+ "tiny-invariant": "^1.0.2",
+ "tiny-warning": "^1.0.0"
+ },
+ "peerDependencies": {
+ "react": ">=15"
+ }
+ },
+ "node_modules/react-router-config": {
+ "version": "5.1.1",
+ "resolved": "https://registry.npmjs.org/react-router-config/-/react-router-config-5.1.1.tgz",
+ "integrity": "sha512-DuanZjaD8mQp1ppHjgnnUnyOlqYXZVjnov/JzFhjLEwd3Z4dYjMSnqrEzzGThH47vpCOqPPwJM2FtthLeJ8Pbg==",
+ "dependencies": {
+ "@babel/runtime": "^7.1.2"
+ },
+ "peerDependencies": {
+ "react": ">=15",
+ "react-router": ">=5"
+ }
+ },
+ "node_modules/react-router-dom": {
+ "version": "5.3.4",
+ "resolved": "https://registry.npmjs.org/react-router-dom/-/react-router-dom-5.3.4.tgz",
+ "integrity": "sha512-m4EqFMHv/Ih4kpcBCONHbkT68KoAeHN4p3lAGoNryfHi0dMy0kCzEZakiKRsvg5wHZ/JLrLW8o8KomWiz/qbYQ==",
+ "dependencies": {
+ "@babel/runtime": "^7.12.13",
+ "history": "^4.9.0",
+ "loose-envify": "^1.3.1",
+ "prop-types": "^15.6.2",
+ "react-router": "5.3.4",
+ "tiny-invariant": "^1.0.2",
+ "tiny-warning": "^1.0.0"
+ },
+ "peerDependencies": {
+ "react": ">=15"
+ }
+ },
+ "node_modules/react-tabs": {
+ "version": "3.2.3",
+ "resolved": "https://registry.npmjs.org/react-tabs/-/react-tabs-3.2.3.tgz",
+ "integrity": "sha512-jx325RhRVnS9DdFbeF511z0T0WEqEoMl1uCE3LoZ6VaZZm7ytatxbum0B8bCTmaiV0KsU+4TtLGTGevCic7SWg==",
+ "dependencies": {
+ "clsx": "^1.1.0",
+ "prop-types": "^15.5.0"
+ },
+ "peerDependencies": {
+ "react": "^16.3.0 || ^17.0.0-0"
+ }
+ },
+ "node_modules/react-textarea-autosize": {
+ "version": "8.4.1",
+ "resolved": "https://registry.npmjs.org/react-textarea-autosize/-/react-textarea-autosize-8.4.1.tgz",
+ "integrity": "sha512-aD2C+qK6QypknC+lCMzteOdIjoMbNlgSFmJjCV+DrfTPwp59i/it9mMNf2HDzvRjQgKAyBDPyLJhcrzElf2U4Q==",
+ "dependencies": {
+ "@babel/runtime": "^7.20.13",
+ "use-composed-ref": "^1.3.0",
+ "use-latest": "^1.2.1"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "peerDependencies": {
+ "react": "^16.8.0 || ^17.0.0 || ^18.0.0"
+ }
+ },
+ "node_modules/readable-stream": {
+ "version": "3.6.2",
+ "resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-3.6.2.tgz",
+ "integrity": "sha512-9u/sniCrY3D5WdsERHzHE4G2YCXqoG5FTHUiCC4SIbr6XcLZBY05ya9EKjYek9O5xOAwjGq+1JdGBAS7Q9ScoA==",
+ "dependencies": {
+ "inherits": "^2.0.3",
+ "string_decoder": "^1.1.1",
+ "util-deprecate": "^1.0.1"
+ },
+ "engines": {
+ "node": ">= 6"
+ }
+ },
+ "node_modules/readdirp": {
+ "version": "3.6.0",
+ "resolved": "https://registry.npmjs.org/readdirp/-/readdirp-3.6.0.tgz",
+ "integrity": "sha512-hOS089on8RduqdbhvQ5Z37A0ESjsqz6qnRcffsMU3495FuTdqSm+7bhJ29JvIOsBDEEnan5DPu9t3To9VRlMzA==",
+ "dependencies": {
+ "picomatch": "^2.2.1"
+ },
+ "engines": {
+ "node": ">=8.10.0"
+ }
+ },
+ "node_modules/reading-time": {
+ "version": "1.5.0",
+ "resolved": "https://registry.npmjs.org/reading-time/-/reading-time-1.5.0.tgz",
+ "integrity": "sha512-onYyVhBNr4CmAxFsKS7bz+uTLRakypIe4R+5A824vBSkQy/hB3fZepoVEf8OVAxzLvK+H/jm9TzpI3ETSm64Kg=="
+ },
+ "node_modules/rechoir": {
+ "version": "0.6.2",
+ "resolved": "https://registry.npmjs.org/rechoir/-/rechoir-0.6.2.tgz",
+ "integrity": "sha512-HFM8rkZ+i3zrV+4LQjwQ0W+ez98pApMGM3HUrN04j3CqzPOzl9nmP15Y8YXNm8QHGv/eacOVEjqhmWpkRV0NAw==",
+ "dependencies": {
+ "resolve": "^1.1.6"
+ },
+ "engines": {
+ "node": ">= 0.10"
+ }
+ },
+ "node_modules/recursive-readdir": {
+ "version": "2.2.3",
+ "resolved": "https://registry.npmjs.org/recursive-readdir/-/recursive-readdir-2.2.3.tgz",
+ "integrity": "sha512-8HrF5ZsXk5FAH9dgsx3BlUer73nIhuj+9OrQwEbLTPOBzGkL1lsFCR01am+v+0m2Cmbs1nP12hLDl5FA7EszKA==",
+ "dependencies": {
+ "minimatch": "^3.0.5"
+ },
+ "engines": {
+ "node": ">=6.0.0"
+ }
+ },
+ "node_modules/redoc": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/redoc/-/redoc-2.0.0.tgz",
+ "integrity": "sha512-rU8iLdAkT89ywOkYk66Mr+IofqaMASlRvTew0dJvopCORMIPUcPMxjlJbJNC6wsn2vvMnpUFLQ/0ISDWn9BWag==",
+ "dependencies": {
+ "@redocly/openapi-core": "^1.0.0-beta.104",
+ "classnames": "^2.3.1",
+ "decko": "^1.2.0",
+ "dompurify": "^2.2.8",
+ "eventemitter3": "^4.0.7",
+ "json-pointer": "^0.6.2",
+ "lunr": "^2.3.9",
+ "mark.js": "^8.11.1",
+ "marked": "^4.0.15",
+ "mobx-react": "^7.2.0",
+ "openapi-sampler": "^1.3.0",
+ "path-browserify": "^1.0.1",
+ "perfect-scrollbar": "^1.5.5",
+ "polished": "^4.1.3",
+ "prismjs": "^1.27.0",
+ "prop-types": "^15.7.2",
+ "react-tabs": "^3.2.2",
+ "slugify": "~1.4.7",
+ "stickyfill": "^1.1.1",
+ "style-loader": "^3.3.1",
+ "swagger2openapi": "^7.0.6",
+ "url-template": "^2.0.8"
+ },
+ "engines": {
+ "node": ">=6.9",
+ "npm": ">=3.0.0"
+ },
+ "peerDependencies": {
+ "core-js": "^3.1.4",
+ "mobx": "^6.0.4",
+ "react": "^16.8.4 || ^17.0.0",
+ "react-dom": "^16.8.4 || ^17.0.0",
+ "styled-components": "^4.1.1 || ^5.1.1"
+ }
+ },
+ "node_modules/redocusaurus": {
+ "version": "1.6.1",
+ "resolved": "https://registry.npmjs.org/redocusaurus/-/redocusaurus-1.6.1.tgz",
+ "integrity": "sha512-6yzay69VX+gh0Dwv1MCOFW0Ih3GcwrU+MyWLjTOrLk+oZ7hLM/4aMJA3yZUZcyvVHbrbZfhJsl970A3D0B1P0g==",
+ "dependencies": {
+ "docusaurus-plugin-redoc": "1.6.0",
+ "docusaurus-theme-redoc": "1.6.1"
+ },
+ "engines": {
+ "node": ">=14"
+ },
+ "peerDependencies": {
+ "@docusaurus/theme-common": "^2.0.0",
+ "@docusaurus/utils": "^2.0.0"
+ }
+ },
+ "node_modules/reftools": {
+ "version": "1.1.9",
+ "resolved": "https://registry.npmjs.org/reftools/-/reftools-1.1.9.tgz",
+ "integrity": "sha512-OVede/NQE13xBQ+ob5CKd5KyeJYU2YInb1bmV4nRoOfquZPkAkxuOXicSe1PvqIuZZ4kD13sPKBbR7UFDmli6w==",
+ "funding": {
+ "url": "https://github.com/Mermade/oas-kit?sponsor=1"
+ }
+ },
+ "node_modules/regenerate": {
+ "version": "1.4.2",
+ "resolved": "https://registry.npmjs.org/regenerate/-/regenerate-1.4.2.tgz",
+ "integrity": "sha512-zrceR/XhGYU/d/opr2EKO7aRHUeiBI8qjtfHqADTwZd6Szfy16la6kqD0MIUs5z5hx6AaKa+PixpPrR289+I0A=="
+ },
+ "node_modules/regenerate-unicode-properties": {
+ "version": "10.1.0",
+ "resolved": "https://registry.npmjs.org/regenerate-unicode-properties/-/regenerate-unicode-properties-10.1.0.tgz",
+ "integrity": "sha512-d1VudCLoIGitcU/hEg2QqvyGZQmdC0Lf8BqdOMXGFSvJP4bNV1+XqbPQeHHLD51Jh4QJJ225dlIFvY4Ly6MXmQ==",
+ "dependencies": {
+ "regenerate": "^1.4.2"
+ },
+ "engines": {
+ "node": ">=4"
+ }
+ },
+ "node_modules/regenerator-runtime": {
+ "version": "0.14.1",
+ "resolved": "https://registry.npmjs.org/regenerator-runtime/-/regenerator-runtime-0.14.1.tgz",
+ "integrity": "sha512-dYnhHh0nJoMfnkZs6GmmhFknAGRrLznOu5nc9ML+EJxGvrx6H7teuevqVqCuPcPK//3eDrrjQhehXVx9cnkGdw==",
+ "license": "MIT"
+ },
+ "node_modules/regenerator-transform": {
+ "version": "0.15.1",
+ "resolved": "https://registry.npmjs.org/regenerator-transform/-/regenerator-transform-0.15.1.tgz",
+ "integrity": "sha512-knzmNAcuyxV+gQCufkYcvOqX/qIIfHLv0u5x79kRxuGojfYVky1f15TzZEu2Avte8QGepvUNTnLskf8E6X6Vyg==",
+ "dependencies": {
+ "@babel/runtime": "^7.8.4"
+ }
+ },
+ "node_modules/regexpu-core": {
+ "version": "5.3.2",
+ "resolved": "https://registry.npmjs.org/regexpu-core/-/regexpu-core-5.3.2.tgz",
+ "integrity": "sha512-RAM5FlZz+Lhmo7db9L298p2vHP5ZywrVXmVXpmAD9GuL5MPH6t9ROw1iA/wfHkQ76Qe7AaPF0nGuim96/IrQMQ==",
+ "dependencies": {
+ "@babel/regjsgen": "^0.8.0",
+ "regenerate": "^1.4.2",
+ "regenerate-unicode-properties": "^10.1.0",
+ "regjsparser": "^0.9.1",
+ "unicode-match-property-ecmascript": "^2.0.0",
+ "unicode-match-property-value-ecmascript": "^2.1.0"
+ },
+ "engines": {
+ "node": ">=4"
+ }
+ },
+ "node_modules/registry-auth-token": {
+ "version": "4.2.2",
+ "resolved": "https://registry.npmjs.org/registry-auth-token/-/registry-auth-token-4.2.2.tgz",
+ "integrity": "sha512-PC5ZysNb42zpFME6D/XlIgtNGdTl8bBOCw90xQLVMpzuuubJKYDWFAEuUNc+Cn8Z8724tg2SDhDRrkVEsqfDMg==",
+ "dependencies": {
+ "rc": "1.2.8"
+ },
+ "engines": {
+ "node": ">=6.0.0"
+ }
+ },
+ "node_modules/registry-url": {
+ "version": "5.1.0",
+ "resolved": "https://registry.npmjs.org/registry-url/-/registry-url-5.1.0.tgz",
+ "integrity": "sha512-8acYXXTI0AkQv6RAOjE3vOaIXZkT9wo4LOFbBKYQEEnnMNBpKqdUrI6S4NT0KPIo/WVvJ5tE/X5LF/TQUf0ekw==",
+ "dependencies": {
+ "rc": "^1.2.8"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/regjsparser": {
+ "version": "0.9.1",
+ "resolved": "https://registry.npmjs.org/regjsparser/-/regjsparser-0.9.1.tgz",
+ "integrity": "sha512-dQUtn90WanSNl+7mQKcXAgZxvUe7Z0SqXlgzv0za4LwiUhyzBC58yQO3liFoUgu8GiJVInAhJjkj1N0EtQ5nkQ==",
+ "dependencies": {
+ "jsesc": "~0.5.0"
+ },
+ "bin": {
+ "regjsparser": "bin/parser"
+ }
+ },
+ "node_modules/regjsparser/node_modules/jsesc": {
+ "version": "0.5.0",
+ "resolved": "https://registry.npmjs.org/jsesc/-/jsesc-0.5.0.tgz",
+ "integrity": "sha512-uZz5UnB7u4T9LvwmFqXii7pZSouaRPorGs5who1Ip7VO0wxanFvBL7GkM6dTHlgX+jhBApRetaWpnDabOeTcnA==",
+ "bin": {
+ "jsesc": "bin/jsesc"
+ }
+ },
+ "node_modules/relateurl": {
+ "version": "0.2.7",
+ "resolved": "https://registry.npmjs.org/relateurl/-/relateurl-0.2.7.tgz",
+ "integrity": "sha512-G08Dxvm4iDN3MLM0EsP62EDV9IuhXPR6blNz6Utcp7zyV3tr4HVNINt6MpaRWbxoOHT3Q7YN2P+jaHX8vUbgog==",
+ "engines": {
+ "node": ">= 0.10"
+ }
+ },
+ "node_modules/remark-emoji": {
+ "version": "2.2.0",
+ "resolved": "https://registry.npmjs.org/remark-emoji/-/remark-emoji-2.2.0.tgz",
+ "integrity": "sha512-P3cj9s5ggsUvWw5fS2uzCHJMGuXYRb0NnZqYlNecewXt8QBU9n5vW3DUUKOhepS8F9CwdMx9B8a3i7pqFWAI5w==",
+ "dependencies": {
+ "emoticon": "^3.2.0",
+ "node-emoji": "^1.10.0",
+ "unist-util-visit": "^2.0.3"
+ }
+ },
+ "node_modules/remark-footnotes": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/remark-footnotes/-/remark-footnotes-2.0.0.tgz",
+ "integrity": "sha512-3Clt8ZMH75Ayjp9q4CorNeyjwIxHFcTkaektplKGl2A1jNGEUey8cKL0ZC5vJwfcD5GFGsNLImLG/NGzWIzoMQ==",
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/remark-mdx": {
+ "version": "1.6.22",
+ "resolved": "https://registry.npmjs.org/remark-mdx/-/remark-mdx-1.6.22.tgz",
+ "integrity": "sha512-phMHBJgeV76uyFkH4rvzCftLfKCr2RZuF+/gmVcaKrpsihyzmhXjA0BEMDaPTXG5y8qZOKPVo83NAOX01LPnOQ==",
+ "dependencies": {
+ "@babel/core": "7.12.9",
+ "@babel/helper-plugin-utils": "7.10.4",
+ "@babel/plugin-proposal-object-rest-spread": "7.12.1",
+ "@babel/plugin-syntax-jsx": "7.12.1",
+ "@mdx-js/util": "1.6.22",
+ "is-alphabetical": "1.0.4",
+ "remark-parse": "8.0.3",
+ "unified": "9.2.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/remark-mdx/node_modules/@babel/core": {
+ "version": "7.12.9",
+ "resolved": "https://registry.npmjs.org/@babel/core/-/core-7.12.9.tgz",
+ "integrity": "sha512-gTXYh3M5wb7FRXQy+FErKFAv90BnlOuNn1QkCK2lREoPAjrQCO49+HVSrFoe5uakFAF5eenS75KbO2vQiLrTMQ==",
+ "dependencies": {
+ "@babel/code-frame": "^7.10.4",
+ "@babel/generator": "^7.12.5",
+ "@babel/helper-module-transforms": "^7.12.1",
+ "@babel/helpers": "^7.12.5",
+ "@babel/parser": "^7.12.7",
+ "@babel/template": "^7.12.7",
+ "@babel/traverse": "^7.12.9",
+ "@babel/types": "^7.12.7",
+ "convert-source-map": "^1.7.0",
+ "debug": "^4.1.0",
+ "gensync": "^1.0.0-beta.1",
+ "json5": "^2.1.2",
+ "lodash": "^4.17.19",
+ "resolve": "^1.3.2",
+ "semver": "^5.4.1",
+ "source-map": "^0.5.0"
+ },
+ "engines": {
+ "node": ">=6.9.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/babel"
+ }
+ },
+ "node_modules/remark-mdx/node_modules/@babel/helper-plugin-utils": {
+ "version": "7.10.4",
+ "resolved": "https://registry.npmjs.org/@babel/helper-plugin-utils/-/helper-plugin-utils-7.10.4.tgz",
+ "integrity": "sha512-O4KCvQA6lLiMU9l2eawBPMf1xPP8xPfB3iEQw150hOVTqj/rfXz0ThTb4HEzqQfs2Bmo5Ay8BzxfzVtBrr9dVg=="
+ },
+ "node_modules/remark-mdx/node_modules/@babel/plugin-proposal-object-rest-spread": {
+ "version": "7.12.1",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-proposal-object-rest-spread/-/plugin-proposal-object-rest-spread-7.12.1.tgz",
+ "integrity": "sha512-s6SowJIjzlhx8o7lsFx5zmY4At6CTtDvgNQDdPzkBQucle58A6b/TTeEBYtyDgmcXjUTM+vE8YOGHZzzbc/ioA==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.10.4",
+ "@babel/plugin-syntax-object-rest-spread": "^7.8.0",
+ "@babel/plugin-transform-parameters": "^7.12.1"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/remark-mdx/node_modules/@babel/plugin-syntax-jsx": {
+ "version": "7.12.1",
+ "resolved": "https://registry.npmjs.org/@babel/plugin-syntax-jsx/-/plugin-syntax-jsx-7.12.1.tgz",
+ "integrity": "sha512-1yRi7yAtB0ETgxdY9ti/p2TivUxJkTdhu/ZbF9MshVGqOx1TdB3b7xCXs49Fupgg50N45KcAsRP/ZqWjs9SRjg==",
+ "dependencies": {
+ "@babel/helper-plugin-utils": "^7.10.4"
+ },
+ "peerDependencies": {
+ "@babel/core": "^7.0.0-0"
+ }
+ },
+ "node_modules/remark-mdx/node_modules/semver": {
+ "version": "5.7.2",
+ "resolved": "https://registry.npmjs.org/semver/-/semver-5.7.2.tgz",
+ "integrity": "sha512-cBznnQ9KjJqU67B52RMC65CMarK2600WFnbkcaiwWq3xy/5haFJlshgnpjovMVJ+Hff49d8GEn0b87C5pDQ10g==",
+ "bin": {
+ "semver": "bin/semver"
+ }
+ },
+ "node_modules/remark-mdx/node_modules/source-map": {
+ "version": "0.5.7",
+ "resolved": "https://registry.npmjs.org/source-map/-/source-map-0.5.7.tgz",
+ "integrity": "sha512-LbrmJOMUSdEVxIKvdcJzQC+nQhe8FUZQTXQy6+I75skNgn3OoQ0DZA8YnFa7gp8tqtL3KPf1kmo0R5DoApeSGQ==",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/remark-mdx/node_modules/unified": {
+ "version": "9.2.0",
+ "resolved": "https://registry.npmjs.org/unified/-/unified-9.2.0.tgz",
+ "integrity": "sha512-vx2Z0vY+a3YoTj8+pttM3tiJHCwY5UFbYdiWrwBEbHmK8pvsPj2rtAX2BFfgXen8T39CJWblWRDT4L5WGXtDdg==",
+ "dependencies": {
+ "bail": "^1.0.0",
+ "extend": "^3.0.0",
+ "is-buffer": "^2.0.0",
+ "is-plain-obj": "^2.0.0",
+ "trough": "^1.0.0",
+ "vfile": "^4.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/remark-parse": {
+ "version": "8.0.3",
+ "resolved": "https://registry.npmjs.org/remark-parse/-/remark-parse-8.0.3.tgz",
+ "integrity": "sha512-E1K9+QLGgggHxCQtLt++uXltxEprmWzNfg+MxpfHsZlrddKzZ/hZyWHDbK3/Ap8HJQqYJRXP+jHczdL6q6i85Q==",
+ "dependencies": {
+ "ccount": "^1.0.0",
+ "collapse-white-space": "^1.0.2",
+ "is-alphabetical": "^1.0.0",
+ "is-decimal": "^1.0.0",
+ "is-whitespace-character": "^1.0.0",
+ "is-word-character": "^1.0.0",
+ "markdown-escapes": "^1.0.0",
+ "parse-entities": "^2.0.0",
+ "repeat-string": "^1.5.4",
+ "state-toggle": "^1.0.0",
+ "trim": "0.0.1",
+ "trim-trailing-lines": "^1.0.0",
+ "unherit": "^1.0.4",
+ "unist-util-remove-position": "^2.0.0",
+ "vfile-location": "^3.0.0",
+ "xtend": "^4.0.1"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/remark-squeeze-paragraphs": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/remark-squeeze-paragraphs/-/remark-squeeze-paragraphs-4.0.0.tgz",
+ "integrity": "sha512-8qRqmL9F4nuLPIgl92XUuxI3pFxize+F1H0e/W3llTk0UsjJaj01+RrirkMw7P21RKe4X6goQhYRSvNWX+70Rw==",
+ "dependencies": {
+ "mdast-squeeze-paragraphs": "^4.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/renderkid": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/renderkid/-/renderkid-3.0.0.tgz",
+ "integrity": "sha512-q/7VIQA8lmM1hF+jn+sFSPWGlMkSAeNYcPLmDQx2zzuiDfaLrOmumR8iaUKlenFgh0XRPIUeSPlH3A+AW3Z5pg==",
+ "dependencies": {
+ "css-select": "^4.1.3",
+ "dom-converter": "^0.2.0",
+ "htmlparser2": "^6.1.0",
+ "lodash": "^4.17.21",
+ "strip-ansi": "^6.0.1"
+ }
+ },
+ "node_modules/renderkid/node_modules/css-select": {
+ "version": "4.3.0",
+ "resolved": "https://registry.npmjs.org/css-select/-/css-select-4.3.0.tgz",
+ "integrity": "sha512-wPpOYtnsVontu2mODhA19JrqWxNsfdatRKd64kmpRbQgh1KtItko5sTnEpPdpSaJszTOhEMlF/RPz28qj4HqhQ==",
+ "dependencies": {
+ "boolbase": "^1.0.0",
+ "css-what": "^6.0.1",
+ "domhandler": "^4.3.1",
+ "domutils": "^2.8.0",
+ "nth-check": "^2.0.1"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/fb55"
+ }
+ },
+ "node_modules/renderkid/node_modules/dom-serializer": {
+ "version": "1.4.1",
+ "resolved": "https://registry.npmjs.org/dom-serializer/-/dom-serializer-1.4.1.tgz",
+ "integrity": "sha512-VHwB3KfrcOOkelEG2ZOfxqLZdfkil8PtJi4P8N2MMXucZq2yLp75ClViUlOVwyoHEDjYU433Aq+5zWP61+RGag==",
+ "dependencies": {
+ "domelementtype": "^2.0.1",
+ "domhandler": "^4.2.0",
+ "entities": "^2.0.0"
+ },
+ "funding": {
+ "url": "https://github.com/cheeriojs/dom-serializer?sponsor=1"
+ }
+ },
+ "node_modules/renderkid/node_modules/domhandler": {
+ "version": "4.3.1",
+ "resolved": "https://registry.npmjs.org/domhandler/-/domhandler-4.3.1.tgz",
+ "integrity": "sha512-GrwoxYN+uWlzO8uhUXRl0P+kHE4GtVPfYzVLcUxPL7KNdHKj66vvlhiweIHqYYXWlw+T8iLMp42Lm67ghw4WMQ==",
+ "dependencies": {
+ "domelementtype": "^2.2.0"
+ },
+ "engines": {
+ "node": ">= 4"
+ },
+ "funding": {
+ "url": "https://github.com/fb55/domhandler?sponsor=1"
+ }
+ },
+ "node_modules/renderkid/node_modules/domutils": {
+ "version": "2.8.0",
+ "resolved": "https://registry.npmjs.org/domutils/-/domutils-2.8.0.tgz",
+ "integrity": "sha512-w96Cjofp72M5IIhpjgobBimYEfoPjx1Vx0BSX9P30WBdZW2WIKU0T1Bd0kz2eNZ9ikjKgHbEyKx8BB6H1L3h3A==",
+ "dependencies": {
+ "dom-serializer": "^1.0.1",
+ "domelementtype": "^2.2.0",
+ "domhandler": "^4.2.0"
+ },
+ "funding": {
+ "url": "https://github.com/fb55/domutils?sponsor=1"
+ }
+ },
+ "node_modules/renderkid/node_modules/entities": {
+ "version": "2.2.0",
+ "resolved": "https://registry.npmjs.org/entities/-/entities-2.2.0.tgz",
+ "integrity": "sha512-p92if5Nz619I0w+akJrLZH0MX0Pb5DX39XOwQTtXSdQQOaYH03S1uIQp4mhOZtAXrxq4ViO67YTiLBo2638o9A==",
+ "funding": {
+ "url": "https://github.com/fb55/entities?sponsor=1"
+ }
+ },
+ "node_modules/renderkid/node_modules/htmlparser2": {
+ "version": "6.1.0",
+ "resolved": "https://registry.npmjs.org/htmlparser2/-/htmlparser2-6.1.0.tgz",
+ "integrity": "sha512-gyyPk6rgonLFEDGoeRgQNaEUvdJ4ktTmmUh/h2t7s+M8oPpIPxgNACWa+6ESR57kXstwqPiCut0V8NRpcwgU7A==",
+ "funding": [
+ "https://github.com/fb55/htmlparser2?sponsor=1",
+ {
+ "type": "github",
+ "url": "https://github.com/sponsors/fb55"
+ }
+ ],
+ "dependencies": {
+ "domelementtype": "^2.0.1",
+ "domhandler": "^4.0.0",
+ "domutils": "^2.5.2",
+ "entities": "^2.0.0"
+ }
+ },
+ "node_modules/repeat-string": {
+ "version": "1.6.1",
+ "resolved": "https://registry.npmjs.org/repeat-string/-/repeat-string-1.6.1.tgz",
+ "integrity": "sha512-PV0dzCYDNfRi1jCDbJzpW7jNNDRuCOG/jI5ctQcGKt/clZD+YcPS3yIlWuTJMmESC8aevCFmWJy5wjAFgNqN6w==",
+ "engines": {
+ "node": ">=0.10"
+ }
+ },
+ "node_modules/require-directory": {
+ "version": "2.1.1",
+ "resolved": "https://registry.npmjs.org/require-directory/-/require-directory-2.1.1.tgz",
+ "integrity": "sha512-fGxEI7+wsG9xrvdjsrlmL22OMTTiHRwAMroiEeMgq8gzoLC/PQr7RsRDSTLUg/bZAZtF+TVIkHc6/4RIKrui+Q==",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/require-from-string": {
+ "version": "2.0.2",
+ "resolved": "https://registry.npmjs.org/require-from-string/-/require-from-string-2.0.2.tgz",
+ "integrity": "sha512-Xf0nWe6RseziFMu+Ap9biiUbmplq6S9/p+7w7YXP/JBHhrUDDUhwa+vANyubuqfZWTveU//DYVGsDG7RKL/vEw==",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/require-like": {
+ "version": "0.1.2",
+ "resolved": "https://registry.npmjs.org/require-like/-/require-like-0.1.2.tgz",
+ "integrity": "sha512-oyrU88skkMtDdauHDuKVrgR+zuItqr6/c//FXzvmxRGMexSDc6hNvJInGW3LL46n+8b50RykrvwSUIIQH2LQ5A==",
+ "engines": {
+ "node": "*"
+ }
+ },
+ "node_modules/requires-port": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmjs.org/requires-port/-/requires-port-1.0.0.tgz",
+ "integrity": "sha512-KigOCHcocU3XODJxsu8i/j8T9tzT4adHiecwORRQ0ZZFcp7ahwXuRU1m+yuO90C5ZUyGeGfocHDI14M3L3yDAQ=="
+ },
+ "node_modules/resolve": {
+ "version": "1.22.1",
+ "resolved": "https://registry.npmjs.org/resolve/-/resolve-1.22.1.tgz",
+ "integrity": "sha512-nBpuuYuY5jFsli/JIs1oldw6fOQCBioohqWZg/2hiaOybXOft4lonv85uDOKXdf8rhyK159cxU5cDcK/NKk8zw==",
+ "dependencies": {
+ "is-core-module": "^2.9.0",
+ "path-parse": "^1.0.7",
+ "supports-preserve-symlinks-flag": "^1.0.0"
+ },
+ "bin": {
+ "resolve": "bin/resolve"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/ljharb"
+ }
+ },
+ "node_modules/resolve-from": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/resolve-from/-/resolve-from-4.0.0.tgz",
+ "integrity": "sha512-pb/MYmXstAkysRFx8piNI1tGFNQIFA3vkE3Gq4EuA1dF6gHp/+vgZqsCGJapvy8N3Q+4o7FwvquPJcnZ7RYy4g==",
+ "engines": {
+ "node": ">=4"
+ }
+ },
+ "node_modules/resolve-pathname": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/resolve-pathname/-/resolve-pathname-3.0.0.tgz",
+ "integrity": "sha512-C7rARubxI8bXFNB/hqcp/4iUeIXJhJZvFPFPiSPRnhU5UPxzMFIl+2E6yY6c4k9giDJAhtV+enfA+G89N6Csng=="
+ },
+ "node_modules/responselike": {
+ "version": "1.0.2",
+ "resolved": "https://registry.npmjs.org/responselike/-/responselike-1.0.2.tgz",
+ "integrity": "sha512-/Fpe5guzJk1gPqdJLJR5u7eG/gNY4nImjbRDaVWVMRhne55TCmj2i9Q+54PBRfatRC8v/rIiv9BN0pMd9OV5EQ==",
+ "dependencies": {
+ "lowercase-keys": "^1.0.0"
+ }
+ },
+ "node_modules/retry": {
+ "version": "0.13.1",
+ "resolved": "https://registry.npmjs.org/retry/-/retry-0.13.1.tgz",
+ "integrity": "sha512-XQBQ3I8W1Cge0Seh+6gjj03LbmRFWuoszgK9ooCpwYIrhhoO80pfq4cUkU5DkknwfOfFteRwlZ56PYOGYyFWdg==",
+ "engines": {
+ "node": ">= 4"
+ }
+ },
+ "node_modules/reusify": {
+ "version": "1.0.4",
+ "resolved": "https://registry.npmjs.org/reusify/-/reusify-1.0.4.tgz",
+ "integrity": "sha512-U9nH88a3fc/ekCF1l0/UP1IosiuIjyTh7hBvXVMHYgVcfGvt897Xguj2UOLDeI5BG2m7/uwyaLVT6fbtCwTyzw==",
+ "engines": {
+ "iojs": ">=1.0.0",
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/rimraf": {
+ "version": "3.0.2",
+ "resolved": "https://registry.npmjs.org/rimraf/-/rimraf-3.0.2.tgz",
+ "integrity": "sha512-JZkJMZkAGFFPP2YqXZXPbMlMBgsxzE8ILs4lMIX/2o0L9UBw9O/Y3o6wFw/i9YLapcUJWwqbi3kdxIPdC62TIA==",
+ "dependencies": {
+ "glob": "^7.1.3"
+ },
+ "bin": {
+ "rimraf": "bin.js"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/isaacs"
+ }
+ },
+ "node_modules/rtl-detect": {
+ "version": "1.0.4",
+ "resolved": "https://registry.npmjs.org/rtl-detect/-/rtl-detect-1.0.4.tgz",
+ "integrity": "sha512-EBR4I2VDSSYr7PkBmFy04uhycIpDKp+21p/jARYXlCSjQksTBQcJ0HFUPOO79EPPH5JS6VAhiIQbycf0O3JAxQ=="
+ },
+ "node_modules/rtlcss": {
+ "version": "3.5.0",
+ "resolved": "https://registry.npmjs.org/rtlcss/-/rtlcss-3.5.0.tgz",
+ "integrity": "sha512-wzgMaMFHQTnyi9YOwsx9LjOxYXJPzS8sYnFaKm6R5ysvTkwzHiB0vxnbHwchHQT65PTdBjDG21/kQBWI7q9O7A==",
+ "dependencies": {
+ "find-up": "^5.0.0",
+ "picocolors": "^1.0.0",
+ "postcss": "^8.3.11",
+ "strip-json-comments": "^3.1.1"
+ },
+ "bin": {
+ "rtlcss": "bin/rtlcss.js"
+ }
+ },
+ "node_modules/rtlcss/node_modules/find-up": {
+ "version": "5.0.0",
+ "resolved": "https://registry.npmjs.org/find-up/-/find-up-5.0.0.tgz",
+ "integrity": "sha512-78/PXT1wlLLDgTzDs7sjq9hzz0vXD+zn+7wypEe4fXQxCmdmqfGsEPQxmiCSQI3ajFV91bVSsvNtrJRiW6nGng==",
+ "dependencies": {
+ "locate-path": "^6.0.0",
+ "path-exists": "^4.0.0"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/rtlcss/node_modules/locate-path": {
+ "version": "6.0.0",
+ "resolved": "https://registry.npmjs.org/locate-path/-/locate-path-6.0.0.tgz",
+ "integrity": "sha512-iPZK6eYjbxRu3uB4/WZ3EsEIMJFMqAoopl3R+zuq0UjcAm/MO6KCweDgPfP3elTztoKP3KtnVHxTn2NHBSDVUw==",
+ "dependencies": {
+ "p-locate": "^5.0.0"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/rtlcss/node_modules/p-limit": {
+ "version": "3.1.0",
+ "resolved": "https://registry.npmjs.org/p-limit/-/p-limit-3.1.0.tgz",
+ "integrity": "sha512-TYOanM3wGwNGsZN2cVTYPArw454xnXj5qmWF1bEoAc4+cU/ol7GVh7odevjp1FNHduHc3KZMcFduxU5Xc6uJRQ==",
+ "dependencies": {
+ "yocto-queue": "^0.1.0"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/rtlcss/node_modules/p-locate": {
+ "version": "5.0.0",
+ "resolved": "https://registry.npmjs.org/p-locate/-/p-locate-5.0.0.tgz",
+ "integrity": "sha512-LaNjtRWUBY++zB5nE/NwcaoMylSPk+S+ZHNB1TzdbMJMny6dynpAGt7X/tl/QYq3TIeE6nxHppbo2LGymrG5Pw==",
+ "dependencies": {
+ "p-limit": "^3.0.2"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/run-parallel": {
+ "version": "1.2.0",
+ "resolved": "https://registry.npmjs.org/run-parallel/-/run-parallel-1.2.0.tgz",
+ "integrity": "sha512-5l4VyZR86LZ/lDxZTR6jqL8AFE2S0IFLMP26AbjsLVADxHdhB/c0GUsH+y39UfCi3dzz8OlQuPmnaJOMoDHQBA==",
+ "funding": [
+ {
+ "type": "github",
+ "url": "https://github.com/sponsors/feross"
+ },
+ {
+ "type": "patreon",
+ "url": "https://www.patreon.com/feross"
+ },
+ {
+ "type": "consulting",
+ "url": "https://feross.org/support"
+ }
+ ],
+ "dependencies": {
+ "queue-microtask": "^1.2.2"
+ }
+ },
+ "node_modules/rxjs": {
+ "version": "7.8.0",
+ "resolved": "https://registry.npmjs.org/rxjs/-/rxjs-7.8.0.tgz",
+ "integrity": "sha512-F2+gxDshqmIub1KdvZkaEfGDwLNpPvk9Fs6LD/MyQxNgMds/WH9OdDDXOmxUZpME+iSK3rQCctkL0DYyytUqMg==",
+ "dependencies": {
+ "tslib": "^2.1.0"
+ }
+ },
+ "node_modules/safe-buffer": {
+ "version": "5.2.1",
+ "resolved": "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.2.1.tgz",
+ "integrity": "sha512-rp3So07KcdmmKbGvgaNxQSJr7bGVSVk5S9Eq1F+ppbRo70+YeaDxkw5Dd8NPN+GD6bjnYm2VuPuCXmpuYvmCXQ==",
+ "funding": [
+ {
+ "type": "github",
+ "url": "https://github.com/sponsors/feross"
+ },
+ {
+ "type": "patreon",
+ "url": "https://www.patreon.com/feross"
+ },
+ {
+ "type": "consulting",
+ "url": "https://feross.org/support"
+ }
+ ]
+ },
+ "node_modules/safer-buffer": {
+ "version": "2.1.2",
+ "resolved": "https://registry.npmjs.org/safer-buffer/-/safer-buffer-2.1.2.tgz",
+ "integrity": "sha512-YZo3K82SD7Riyi0E1EQPojLz7kpepnSQI9IyPbHHg1XXXevb5dJI7tpyN2ADxGcQbHG7vcyRHk0cbwqcQriUtg=="
+ },
+ "node_modules/sax": {
+ "version": "1.2.4",
+ "resolved": "https://registry.npmjs.org/sax/-/sax-1.2.4.tgz",
+ "integrity": "sha512-NqVDv9TpANUjFm0N8uM5GxL36UgKi9/atZw+x7YFnQ8ckwFGKrl4xX4yWtrey3UJm5nP1kUbnYgLopqWNSRhWw=="
+ },
+ "node_modules/scheduler": {
+ "version": "0.20.2",
+ "resolved": "https://registry.npmjs.org/scheduler/-/scheduler-0.20.2.tgz",
+ "integrity": "sha512-2eWfGgAqqWFGqtdMmcL5zCMK1U8KlXv8SQFGglL3CEtd0aDVDWgeF/YoCmvln55m5zSk3J/20hTaSBeSObsQDQ==",
+ "dependencies": {
+ "loose-envify": "^1.1.0",
+ "object-assign": "^4.1.1"
+ }
+ },
+ "node_modules/schema-utils": {
+ "version": "2.7.1",
+ "resolved": "https://registry.npmjs.org/schema-utils/-/schema-utils-2.7.1.tgz",
+ "integrity": "sha512-SHiNtMOUGWBQJwzISiVYKu82GiV4QYGePp3odlY1tuKO7gPtphAT5R/py0fA6xtbgLL/RvtJZnU9b8s0F1q0Xg==",
+ "dependencies": {
+ "@types/json-schema": "^7.0.5",
+ "ajv": "^6.12.4",
+ "ajv-keywords": "^3.5.2"
+ },
+ "engines": {
+ "node": ">= 8.9.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/webpack"
+ }
+ },
+ "node_modules/section-matter": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmjs.org/section-matter/-/section-matter-1.0.0.tgz",
+ "integrity": "sha512-vfD3pmTzGpufjScBh50YHKzEu2lxBWhVEHsNGoEXmCmn2hKGfeNLYMzCJpe8cD7gqX7TJluOVpBkAequ6dgMmA==",
+ "dependencies": {
+ "extend-shallow": "^2.0.1",
+ "kind-of": "^6.0.0"
+ },
+ "engines": {
+ "node": ">=4"
+ }
+ },
+ "node_modules/select-hose": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/select-hose/-/select-hose-2.0.0.tgz",
+ "integrity": "sha512-mEugaLK+YfkijB4fx0e6kImuJdCIt2LxCRcbEYPqRGCs4F2ogyfZU5IAZRdjCP8JPq2AtdNoC/Dux63d9Kiryg=="
+ },
+ "node_modules/selfsigned": {
+ "version": "2.1.1",
+ "resolved": "https://registry.npmjs.org/selfsigned/-/selfsigned-2.1.1.tgz",
+ "integrity": "sha512-GSL3aowiF7wa/WtSFwnUrludWFoNhftq8bUkH9pkzjpN2XSPOAYEgg6e0sS9s0rZwgJzJiQRPU18A6clnoW5wQ==",
+ "dependencies": {
+ "node-forge": "^1"
+ },
+ "engines": {
+ "node": ">=10"
+ }
+ },
+ "node_modules/semver": {
+ "version": "7.5.4",
+ "resolved": "https://registry.npmjs.org/semver/-/semver-7.5.4.tgz",
+ "integrity": "sha512-1bCSESV6Pv+i21Hvpxp3Dx+pSD8lIPt8uVjRrxAUt/nbswYc+tK6Y2btiULjd4+fnq15PX+nqQDC7Oft7WkwcA==",
+ "dependencies": {
+ "lru-cache": "^6.0.0"
+ },
+ "bin": {
+ "semver": "bin/semver.js"
+ },
+ "engines": {
+ "node": ">=10"
+ }
+ },
+ "node_modules/semver-diff": {
+ "version": "3.1.1",
+ "resolved": "https://registry.npmjs.org/semver-diff/-/semver-diff-3.1.1.tgz",
+ "integrity": "sha512-GX0Ix/CJcHyB8c4ykpHGIAvLyOwOobtM/8d+TQkAd81/bEjgPHrfba41Vpesr7jX/t8Uh+R3EX9eAS5be+jQYg==",
+ "dependencies": {
+ "semver": "^6.3.0"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/semver-diff/node_modules/semver": {
+ "version": "6.3.1",
+ "resolved": "https://registry.npmjs.org/semver/-/semver-6.3.1.tgz",
+ "integrity": "sha512-BR7VvDCVHO+q2xBEWskxS6DJE1qRnb7DxzUrogb71CWoSficBxYsiAGd+Kl0mmq/MprG9yArRkyrQxTO6XjMzA==",
+ "bin": {
+ "semver": "bin/semver.js"
+ }
+ },
+ "node_modules/semver/node_modules/lru-cache": {
+ "version": "6.0.0",
+ "resolved": "https://registry.npmjs.org/lru-cache/-/lru-cache-6.0.0.tgz",
+ "integrity": "sha512-Jo6dJ04CmSjuznwJSS3pUeWmd/H0ffTlkXXgwZi+eq1UCmqQwCh+eLsYOYCwY991i2Fah4h1BEMCx4qThGbsiA==",
+ "dependencies": {
+ "yallist": "^4.0.0"
+ },
+ "engines": {
+ "node": ">=10"
+ }
+ },
+ "node_modules/semver/node_modules/yallist": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/yallist/-/yallist-4.0.0.tgz",
+ "integrity": "sha512-3wdGidZyq5PB084XLES5TpOSRA3wjXAlIWMhum2kRcv/41Sn2emQ0dycQW4uZXLejwKvg6EsvbdlVL+FYEct7A=="
+ },
+ "node_modules/send": {
+ "version": "0.19.0",
+ "resolved": "https://registry.npmjs.org/send/-/send-0.19.0.tgz",
+ "integrity": "sha512-dW41u5VfLXu8SJh5bwRmyYUbAoSB3c9uQh6L8h/KtsFREPWpbX1lrljJo186Jc4nmci/sGUZ9a0a0J2zgfq2hw==",
+ "dependencies": {
+ "debug": "2.6.9",
+ "depd": "2.0.0",
+ "destroy": "1.2.0",
+ "encodeurl": "~1.0.2",
+ "escape-html": "~1.0.3",
+ "etag": "~1.8.1",
+ "fresh": "0.5.2",
+ "http-errors": "2.0.0",
+ "mime": "1.6.0",
+ "ms": "2.1.3",
+ "on-finished": "2.4.1",
+ "range-parser": "~1.2.1",
+ "statuses": "2.0.1"
+ },
+ "engines": {
+ "node": ">= 0.8.0"
+ }
+ },
+ "node_modules/send/node_modules/debug": {
+ "version": "2.6.9",
+ "resolved": "https://registry.npmjs.org/debug/-/debug-2.6.9.tgz",
+ "integrity": "sha512-bC7ElrdJaJnPbAP+1EotYvqZsb3ecl5wi6Bfi6BJTUcNowp6cvspg0jXznRTKDjm/E7AdgFBVeAPVMNcKGsHMA==",
+ "dependencies": {
+ "ms": "2.0.0"
+ }
+ },
+ "node_modules/send/node_modules/debug/node_modules/ms": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/ms/-/ms-2.0.0.tgz",
+ "integrity": "sha512-Tpp60P6IUJDTuOq/5Z8cdskzJujfwqfOTkrwIwj7IRISpnkJnT6SyJ4PCPnGMoFjC9ddhal5KVIYtAt97ix05A=="
+ },
+ "node_modules/send/node_modules/ms": {
+ "version": "2.1.3",
+ "resolved": "https://registry.npmjs.org/ms/-/ms-2.1.3.tgz",
+ "integrity": "sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA=="
+ },
+ "node_modules/send/node_modules/range-parser": {
+ "version": "1.2.1",
+ "resolved": "https://registry.npmjs.org/range-parser/-/range-parser-1.2.1.tgz",
+ "integrity": "sha512-Hrgsx+orqoygnmhFbKaHE6c296J+HTAQXoxEF6gNupROmmGJRoyzfG3ccAveqCBrwr/2yxQ5BVd/GTl5agOwSg==",
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/serialize-javascript": {
+ "version": "6.0.2",
+ "resolved": "https://registry.npmjs.org/serialize-javascript/-/serialize-javascript-6.0.2.tgz",
+ "integrity": "sha512-Saa1xPByTTq2gdeFZYLLo+RFE35NHZkAbqZeWNd3BpzppeVisAqpDjcp8dyf6uIvEqJRd46jemmyA4iFIeVk8g==",
+ "license": "BSD-3-Clause",
+ "dependencies": {
+ "randombytes": "^2.1.0"
+ }
+ },
+ "node_modules/serve-handler": {
+ "version": "6.1.5",
+ "resolved": "https://registry.npmjs.org/serve-handler/-/serve-handler-6.1.5.tgz",
+ "integrity": "sha512-ijPFle6Hwe8zfmBxJdE+5fta53fdIY0lHISJvuikXB3VYFafRjMRpOffSPvCYsbKyBA7pvy9oYr/BT1O3EArlg==",
+ "dependencies": {
+ "bytes": "3.0.0",
+ "content-disposition": "0.5.2",
+ "fast-url-parser": "1.1.3",
+ "mime-types": "2.1.18",
+ "minimatch": "3.1.2",
+ "path-is-inside": "1.0.2",
+ "path-to-regexp": "2.2.1",
+ "range-parser": "1.2.0"
+ }
+ },
+ "node_modules/serve-handler/node_modules/path-to-regexp": {
+ "version": "2.2.1",
+ "resolved": "https://registry.npmjs.org/path-to-regexp/-/path-to-regexp-2.2.1.tgz",
+ "integrity": "sha512-gu9bD6Ta5bwGrrU8muHzVOBFFREpp2iRkVfhBJahwJ6p6Xw20SjT0MxLnwkjOibQmGSYhiUnf2FLe7k+jcFmGQ=="
+ },
+ "node_modules/serve-index": {
+ "version": "1.9.1",
+ "resolved": "https://registry.npmjs.org/serve-index/-/serve-index-1.9.1.tgz",
+ "integrity": "sha512-pXHfKNP4qujrtteMrSBb0rc8HJ9Ms/GrXwcUtUtD5s4ewDJI8bT3Cz2zTVRMKtri49pLx2e0Ya8ziP5Ya2pZZw==",
+ "dependencies": {
+ "accepts": "~1.3.4",
+ "batch": "0.6.1",
+ "debug": "2.6.9",
+ "escape-html": "~1.0.3",
+ "http-errors": "~1.6.2",
+ "mime-types": "~2.1.17",
+ "parseurl": "~1.3.2"
+ },
+ "engines": {
+ "node": ">= 0.8.0"
+ }
+ },
+ "node_modules/serve-index/node_modules/debug": {
+ "version": "2.6.9",
+ "resolved": "https://registry.npmjs.org/debug/-/debug-2.6.9.tgz",
+ "integrity": "sha512-bC7ElrdJaJnPbAP+1EotYvqZsb3ecl5wi6Bfi6BJTUcNowp6cvspg0jXznRTKDjm/E7AdgFBVeAPVMNcKGsHMA==",
+ "dependencies": {
+ "ms": "2.0.0"
+ }
+ },
+ "node_modules/serve-index/node_modules/depd": {
+ "version": "1.1.2",
+ "resolved": "https://registry.npmjs.org/depd/-/depd-1.1.2.tgz",
+ "integrity": "sha512-7emPTl6Dpo6JRXOXjLRxck+FlLRX5847cLKEn00PLAgc3g2hTZZgr+e4c2v6QpSmLeFP3n5yUo7ft6avBK/5jQ==",
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/serve-index/node_modules/http-errors": {
+ "version": "1.6.3",
+ "resolved": "https://registry.npmjs.org/http-errors/-/http-errors-1.6.3.tgz",
+ "integrity": "sha512-lks+lVC8dgGyh97jxvxeYTWQFvh4uw4yC12gVl63Cg30sjPX4wuGcdkICVXDAESr6OJGjqGA8Iz5mkeN6zlD7A==",
+ "dependencies": {
+ "depd": "~1.1.2",
+ "inherits": "2.0.3",
+ "setprototypeof": "1.1.0",
+ "statuses": ">= 1.4.0 < 2"
+ },
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/serve-index/node_modules/inherits": {
+ "version": "2.0.3",
+ "resolved": "https://registry.npmjs.org/inherits/-/inherits-2.0.3.tgz",
+ "integrity": "sha512-x00IRNXNy63jwGkJmzPigoySHbaqpNuzKbBOmzK+g2OdZpQ9w+sxCN+VSB3ja7IAge2OP2qpfxTjeNcyjmW1uw=="
+ },
+ "node_modules/serve-index/node_modules/ms": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/ms/-/ms-2.0.0.tgz",
+ "integrity": "sha512-Tpp60P6IUJDTuOq/5Z8cdskzJujfwqfOTkrwIwj7IRISpnkJnT6SyJ4PCPnGMoFjC9ddhal5KVIYtAt97ix05A=="
+ },
+ "node_modules/serve-index/node_modules/setprototypeof": {
+ "version": "1.1.0",
+ "resolved": "https://registry.npmjs.org/setprototypeof/-/setprototypeof-1.1.0.tgz",
+ "integrity": "sha512-BvE/TwpZX4FXExxOxZyRGQQv651MSwmWKZGqvmPcRIjDqWub67kTKuIMx43cZZrS/cBBzwBcNDWoFxt2XEFIpQ=="
+ },
+ "node_modules/serve-index/node_modules/statuses": {
+ "version": "1.5.0",
+ "resolved": "https://registry.npmjs.org/statuses/-/statuses-1.5.0.tgz",
+ "integrity": "sha512-OpZ3zP+jT1PI7I8nemJX4AKmAX070ZkYPVWV/AaKTJl+tXCTGyVdC1a4SL8RUQYEwk/f34ZX8UTykN68FwrqAA==",
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/serve-static": {
+ "version": "1.16.2",
+ "resolved": "https://registry.npmjs.org/serve-static/-/serve-static-1.16.2.tgz",
+ "integrity": "sha512-VqpjJZKadQB/PEbEwvFdO43Ax5dFBZ2UECszz8bQ7pi7wt//PWe1P6MN7eCnjsatYtBT6EuiClbjSWP2WrIoTw==",
+ "dependencies": {
+ "encodeurl": "~2.0.0",
+ "escape-html": "~1.0.3",
+ "parseurl": "~1.3.3",
+ "send": "0.19.0"
+ },
+ "engines": {
+ "node": ">= 0.8.0"
+ }
+ },
+ "node_modules/serve-static/node_modules/encodeurl": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/encodeurl/-/encodeurl-2.0.0.tgz",
+ "integrity": "sha512-Q0n9HRi4m6JuGIV1eFlmvJB7ZEVxu93IrMyiMsGC0lrMJMWzRgx6WGquyfQgZVb31vhGgXnfmPNNXmxnOkRBrg==",
+ "engines": {
+ "node": ">= 0.8"
+ }
+ },
+ "node_modules/set-function-length": {
+ "version": "1.2.2",
+ "resolved": "https://registry.npmjs.org/set-function-length/-/set-function-length-1.2.2.tgz",
+ "integrity": "sha512-pgRc4hJ4/sNjWCSS9AmnS40x3bNMDTknHgL5UaMBTMyJnU90EgWh1Rz+MC9eFu4BuN/UwZjKQuY/1v3rM7HMfg==",
+ "dependencies": {
+ "define-data-property": "^1.1.4",
+ "es-errors": "^1.3.0",
+ "function-bind": "^1.1.2",
+ "get-intrinsic": "^1.2.4",
+ "gopd": "^1.0.1",
+ "has-property-descriptors": "^1.0.2"
+ },
+ "engines": {
+ "node": ">= 0.4"
+ }
+ },
+ "node_modules/setimmediate": {
+ "version": "1.0.5",
+ "resolved": "https://registry.npmjs.org/setimmediate/-/setimmediate-1.0.5.tgz",
+ "integrity": "sha512-MATJdZp8sLqDl/68LfQmbP8zKPLQNV6BIZoIgrscFDQ+RsvK/BxeDQOgyxKKoh0y/8h3BqVFnCqQ/gd+reiIXA=="
+ },
+ "node_modules/setprototypeof": {
+ "version": "1.2.0",
+ "resolved": "https://registry.npmjs.org/setprototypeof/-/setprototypeof-1.2.0.tgz",
+ "integrity": "sha512-E5LDX7Wrp85Kil5bhZv46j8jOeboKq5JMmYM3gVGdGH8xFpPWXUMsNrlODCrkoxMEeNi/XZIwuRvY4XNwYMJpw=="
+ },
+ "node_modules/shallow-clone": {
+ "version": "3.0.1",
+ "resolved": "https://registry.npmjs.org/shallow-clone/-/shallow-clone-3.0.1.tgz",
+ "integrity": "sha512-/6KqX+GVUdqPuPPd2LxDDxzX6CAbjJehAAOKlNpqqUpAqPM6HeL8f+o3a+JsyGjn2lv0WY8UsTgUJjU9Ok55NA==",
+ "dependencies": {
+ "kind-of": "^6.0.2"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/shallowequal": {
+ "version": "1.1.0",
+ "resolved": "https://registry.npmjs.org/shallowequal/-/shallowequal-1.1.0.tgz",
+ "integrity": "sha512-y0m1JoUZSlPAjXVtPPW70aZWfIL/dSP7AFkRnniLCrK/8MDKog3TySTBmckD+RObVxH0v4Tox67+F14PdED2oQ=="
+ },
+ "node_modules/shebang-command": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/shebang-command/-/shebang-command-2.0.0.tgz",
+ "integrity": "sha512-kHxr2zZpYtdmrN1qDjrrX/Z1rR1kG8Dx+gkpK1G4eXmvXswmcE1hTWBWYUzlraYw1/yZp6YuDY77YtvbN0dmDA==",
+ "dependencies": {
+ "shebang-regex": "^3.0.0"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/shebang-regex": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/shebang-regex/-/shebang-regex-3.0.0.tgz",
+ "integrity": "sha512-7++dFhtcx3353uBaq8DDR4NuxBetBzC7ZQOhmTQInHEd6bSrXdiEyzCvG07Z44UYdLShWUyXt5M/yhz8ekcb1A==",
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/shell-quote": {
+ "version": "1.8.0",
+ "resolved": "https://registry.npmjs.org/shell-quote/-/shell-quote-1.8.0.tgz",
+ "integrity": "sha512-QHsz8GgQIGKlRi24yFc6a6lN69Idnx634w49ay6+jA5yFh7a1UY+4Rp6HPx/L/1zcEDPEij8cIsiqR6bQsE5VQ==",
+ "funding": {
+ "url": "https://github.com/sponsors/ljharb"
+ }
+ },
+ "node_modules/shelljs": {
+ "version": "0.8.5",
+ "resolved": "https://registry.npmjs.org/shelljs/-/shelljs-0.8.5.tgz",
+ "integrity": "sha512-TiwcRcrkhHvbrZbnRcFYMLl30Dfov3HKqzp5tO5b4pt6G/SezKcYhmDg15zXVBswHmctSAQKznqNW2LO5tTDow==",
+ "dependencies": {
+ "glob": "^7.0.0",
+ "interpret": "^1.0.0",
+ "rechoir": "^0.6.2"
+ },
+ "bin": {
+ "shjs": "bin/shjs"
+ },
+ "engines": {
+ "node": ">=4"
+ }
+ },
+ "node_modules/should": {
+ "version": "13.2.3",
+ "resolved": "https://registry.npmjs.org/should/-/should-13.2.3.tgz",
+ "integrity": "sha512-ggLesLtu2xp+ZxI+ysJTmNjh2U0TsC+rQ/pfED9bUZZ4DKefP27D+7YJVVTvKsmjLpIi9jAa7itwDGkDDmt1GQ==",
+ "dependencies": {
+ "should-equal": "^2.0.0",
+ "should-format": "^3.0.3",
+ "should-type": "^1.4.0",
+ "should-type-adaptors": "^1.0.1",
+ "should-util": "^1.0.0"
+ }
+ },
+ "node_modules/should-equal": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/should-equal/-/should-equal-2.0.0.tgz",
+ "integrity": "sha512-ZP36TMrK9euEuWQYBig9W55WPC7uo37qzAEmbjHz4gfyuXrEUgF8cUvQVO+w+d3OMfPvSRQJ22lSm8MQJ43LTA==",
+ "dependencies": {
+ "should-type": "^1.4.0"
+ }
+ },
+ "node_modules/should-format": {
+ "version": "3.0.3",
+ "resolved": "https://registry.npmjs.org/should-format/-/should-format-3.0.3.tgz",
+ "integrity": "sha512-hZ58adtulAk0gKtua7QxevgUaXTTXxIi8t41L3zo9AHvjXO1/7sdLECuHeIN2SRtYXpNkmhoUP2pdeWgricQ+Q==",
+ "dependencies": {
+ "should-type": "^1.3.0",
+ "should-type-adaptors": "^1.0.1"
+ }
+ },
+ "node_modules/should-type": {
+ "version": "1.4.0",
+ "resolved": "https://registry.npmjs.org/should-type/-/should-type-1.4.0.tgz",
+ "integrity": "sha512-MdAsTu3n25yDbIe1NeN69G4n6mUnJGtSJHygX3+oN0ZbO3DTiATnf7XnYJdGT42JCXurTb1JI0qOBR65shvhPQ=="
+ },
+ "node_modules/should-type-adaptors": {
+ "version": "1.1.0",
+ "resolved": "https://registry.npmjs.org/should-type-adaptors/-/should-type-adaptors-1.1.0.tgz",
+ "integrity": "sha512-JA4hdoLnN+kebEp2Vs8eBe9g7uy0zbRo+RMcU0EsNy+R+k049Ki+N5tT5Jagst2g7EAja+euFuoXFCa8vIklfA==",
+ "dependencies": {
+ "should-type": "^1.3.0",
+ "should-util": "^1.0.0"
+ }
+ },
+ "node_modules/should-util": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmjs.org/should-util/-/should-util-1.0.1.tgz",
+ "integrity": "sha512-oXF8tfxx5cDk8r2kYqlkUJzZpDBqVY/II2WhvU0n9Y3XYvAYRmeaf1PvvIvTgPnv4KJ+ES5M0PyDq5Jp+Ygy2g=="
+ },
+ "node_modules/side-channel": {
+ "version": "1.0.6",
+ "resolved": "https://registry.npmjs.org/side-channel/-/side-channel-1.0.6.tgz",
+ "integrity": "sha512-fDW/EZ6Q9RiO8eFG8Hj+7u/oW+XrPTIChwCOM2+th2A6OblDtYYIpve9m+KvI9Z4C9qSEXlaGR6bTEYHReuglA==",
+ "dependencies": {
+ "call-bind": "^1.0.7",
+ "es-errors": "^1.3.0",
+ "get-intrinsic": "^1.2.4",
+ "object-inspect": "^1.13.1"
+ },
+ "engines": {
+ "node": ">= 0.4"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/ljharb"
+ }
+ },
+ "node_modules/signal-exit": {
+ "version": "3.0.7",
+ "resolved": "https://registry.npmjs.org/signal-exit/-/signal-exit-3.0.7.tgz",
+ "integrity": "sha512-wnD2ZE+l+SPC/uoS0vXeE9L1+0wuaMqKlfz9AMUo38JsyLSBWSFcHR1Rri62LZc12vLr1gb3jl7iwQhgwpAbGQ=="
+ },
+ "node_modules/sirv": {
+ "version": "1.0.19",
+ "resolved": "https://registry.npmjs.org/sirv/-/sirv-1.0.19.tgz",
+ "integrity": "sha512-JuLThK3TnZG1TAKDwNIqNq6QA2afLOCcm+iE8D1Kj3GA40pSPsxQjjJl0J8X3tsR7T+CP1GavpzLwYkgVLWrZQ==",
+ "dependencies": {
+ "@polka/url": "^1.0.0-next.20",
+ "mrmime": "^1.0.0",
+ "totalist": "^1.0.0"
+ },
+ "engines": {
+ "node": ">= 10"
+ }
+ },
+ "node_modules/sisteransi": {
+ "version": "1.0.5",
+ "resolved": "https://registry.npmjs.org/sisteransi/-/sisteransi-1.0.5.tgz",
+ "integrity": "sha512-bLGGlR1QxBcynn2d5YmDX4MGjlZvy2MRBDRNHLJ8VI6l6+9FUiyTFNJ0IveOSP0bcXgVDPRcfGqA0pjaqUpfVg=="
+ },
+ "node_modules/sitemap": {
+ "version": "7.1.1",
+ "resolved": "https://registry.npmjs.org/sitemap/-/sitemap-7.1.1.tgz",
+ "integrity": "sha512-mK3aFtjz4VdJN0igpIJrinf3EO8U8mxOPsTBzSsy06UtjZQJ3YY3o3Xa7zSc5nMqcMrRwlChHZ18Kxg0caiPBg==",
+ "dependencies": {
+ "@types/node": "^17.0.5",
+ "@types/sax": "^1.2.1",
+ "arg": "^5.0.0",
+ "sax": "^1.2.4"
+ },
+ "bin": {
+ "sitemap": "dist/cli.js"
+ },
+ "engines": {
+ "node": ">=12.0.0",
+ "npm": ">=5.6.0"
+ }
+ },
+ "node_modules/sitemap/node_modules/@types/node": {
+ "version": "17.0.45",
+ "resolved": "https://registry.npmjs.org/@types/node/-/node-17.0.45.tgz",
+ "integrity": "sha512-w+tIMs3rq2afQdsPJlODhoUEKzFP1ayaoyl1CcnwtIlsVe7K7bA1NGm4s3PraqTLlXnbIN84zuBlxBWo1u9BLw=="
+ },
+ "node_modules/slash": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/slash/-/slash-3.0.0.tgz",
+ "integrity": "sha512-g9Q1haeby36OSStwb4ntCGGGaKsaVSjQ68fBxoQcutl5fS1vuY18H3wSt3jFyFtrkx+Kz0V1G85A4MyAdDMi2Q==",
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/slugify": {
+ "version": "1.4.7",
+ "resolved": "https://registry.npmjs.org/slugify/-/slugify-1.4.7.tgz",
+ "integrity": "sha512-tf+h5W1IrjNm/9rKKj0JU2MDMruiopx0jjVA5zCdBtcGjfp0+c5rHw/zADLC3IeKlGHtVbHtpfzvYA0OYT+HKg==",
+ "engines": {
+ "node": ">=8.0.0"
+ }
+ },
+ "node_modules/sockjs": {
+ "version": "0.3.24",
+ "resolved": "https://registry.npmjs.org/sockjs/-/sockjs-0.3.24.tgz",
+ "integrity": "sha512-GJgLTZ7vYb/JtPSSZ10hsOYIvEYsjbNU+zPdIHcUaWVNUEPivzxku31865sSSud0Da0W4lEeOPlmw93zLQchuQ==",
+ "dependencies": {
+ "faye-websocket": "^0.11.3",
+ "uuid": "^8.3.2",
+ "websocket-driver": "^0.7.4"
+ }
+ },
+ "node_modules/sort-css-media-queries": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmjs.org/sort-css-media-queries/-/sort-css-media-queries-2.1.0.tgz",
+ "integrity": "sha512-IeWvo8NkNiY2vVYdPa27MCQiR0MN0M80johAYFVxWWXQ44KU84WNxjslwBHmc/7ZL2ccwkM7/e6S5aiKZXm7jA==",
+ "engines": {
+ "node": ">= 6.3.0"
+ }
+ },
+ "node_modules/source-map": {
+ "version": "0.6.1",
+ "resolved": "https://registry.npmjs.org/source-map/-/source-map-0.6.1.tgz",
+ "integrity": "sha512-UjgapumWlbMhkBgzT7Ykc5YXUT46F0iKu8SGXq0bcwP5dz/h0Plj6enJqjz1Zbq2l5WaqYnrVbwWOWMyF3F47g==",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/source-map-js": {
+ "version": "1.0.2",
+ "resolved": "https://registry.npmjs.org/source-map-js/-/source-map-js-1.0.2.tgz",
+ "integrity": "sha512-R0XvVJ9WusLiqTCEiGCmICCMplcCkIwwR11mOSD9CR5u+IXYdiseeEuXCVAjS54zqwkLcPNnmU4OeJ6tUrWhDw==",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/source-map-support": {
+ "version": "0.5.21",
+ "resolved": "https://registry.npmjs.org/source-map-support/-/source-map-support-0.5.21.tgz",
+ "integrity": "sha512-uBHU3L3czsIyYXKX88fdrGovxdSCoTGDRZ6SYXtSRxLZUzHg5P/66Ht6uoUlHu9EZod+inXhKo3qQgwXUT/y1w==",
+ "dependencies": {
+ "buffer-from": "^1.0.0",
+ "source-map": "^0.6.0"
+ }
+ },
+ "node_modules/space-separated-tokens": {
+ "version": "1.1.5",
+ "resolved": "https://registry.npmjs.org/space-separated-tokens/-/space-separated-tokens-1.1.5.tgz",
+ "integrity": "sha512-q/JSVd1Lptzhf5bkYm4ob4iWPjx0KiRe3sRFBNrVqbJkFaBm5vbbowy1mymoPNLRa52+oadOhJ+K49wsSeSjTA==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/spdy": {
+ "version": "4.0.2",
+ "resolved": "https://registry.npmjs.org/spdy/-/spdy-4.0.2.tgz",
+ "integrity": "sha512-r46gZQZQV+Kl9oItvl1JZZqJKGr+oEkB08A6BzkiR7593/7IbtuncXHd2YoYeTsG4157ZssMu9KYvUHLcjcDoA==",
+ "dependencies": {
+ "debug": "^4.1.0",
+ "handle-thing": "^2.0.0",
+ "http-deceiver": "^1.2.7",
+ "select-hose": "^2.0.0",
+ "spdy-transport": "^3.0.0"
+ },
+ "engines": {
+ "node": ">=6.0.0"
+ }
+ },
+ "node_modules/spdy-transport": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/spdy-transport/-/spdy-transport-3.0.0.tgz",
+ "integrity": "sha512-hsLVFE5SjA6TCisWeJXFKniGGOpBgMLmerfO2aCyCU5s7nJ/rpAepqmFifv/GCbSbueEeAJJnmSQ2rKC/g8Fcw==",
+ "dependencies": {
+ "debug": "^4.1.0",
+ "detect-node": "^2.0.4",
+ "hpack.js": "^2.1.6",
+ "obuf": "^1.1.2",
+ "readable-stream": "^3.0.6",
+ "wbuf": "^1.7.3"
+ }
+ },
+ "node_modules/sprintf-js": {
+ "version": "1.0.3",
+ "resolved": "https://registry.npmjs.org/sprintf-js/-/sprintf-js-1.0.3.tgz",
+ "integrity": "sha512-D9cPgkvLlV3t3IzL0D0YLvGA9Ahk4PcvVwUbN0dSGr1aP0Nrt4AEnTUbuGvquEC0mA64Gqt1fzirlRs5ibXx8g=="
+ },
+ "node_modules/stable": {
+ "version": "0.1.8",
+ "resolved": "https://registry.npmjs.org/stable/-/stable-0.1.8.tgz",
+ "integrity": "sha512-ji9qxRnOVfcuLDySj9qzhGSEFVobyt1kIOSkj1qZzYLzq7Tos/oUUWvotUPQLlrsidqsK6tBH89Bc9kL5zHA6w==",
+ "deprecated": "Modern JS already guarantees Array#sort() is a stable sort, so this library is deprecated. See the compatibility table on MDN: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/sort#browser_compatibility"
+ },
+ "node_modules/state-toggle": {
+ "version": "1.0.3",
+ "resolved": "https://registry.npmjs.org/state-toggle/-/state-toggle-1.0.3.tgz",
+ "integrity": "sha512-d/5Z4/2iiCnHw6Xzghyhb+GcmF89bxwgXG60wjIiZaxnymbyOmI8Hk4VqHXiVVp6u2ysaskFfXg3ekCj4WNftQ==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/statuses": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/statuses/-/statuses-2.0.1.tgz",
+ "integrity": "sha512-RwNA9Z/7PrK06rYLIzFMlaF+l73iwpzsqRIFgbMLbTcLD6cOao82TaWefPXQvB2fOC4AjuYSEndS7N/mTCbkdQ==",
+ "engines": {
+ "node": ">= 0.8"
+ }
+ },
+ "node_modules/std-env": {
+ "version": "3.3.2",
+ "resolved": "https://registry.npmjs.org/std-env/-/std-env-3.3.2.tgz",
+ "integrity": "sha512-uUZI65yrV2Qva5gqE0+A7uVAvO40iPo6jGhs7s8keRfHCmtg+uB2X6EiLGCI9IgL1J17xGhvoOqSz79lzICPTA=="
+ },
+ "node_modules/stickyfill": {
+ "version": "1.1.1",
+ "resolved": "https://registry.npmjs.org/stickyfill/-/stickyfill-1.1.1.tgz",
+ "integrity": "sha512-GCp7vHAfpao+Qh/3Flh9DXEJ/qSi0KJwJw6zYlZOtRYXWUIpMM6mC2rIep/dK8RQqwW0KxGJIllmjPIBOGN8AA=="
+ },
+ "node_modules/string_decoder": {
+ "version": "1.3.0",
+ "resolved": "https://registry.npmjs.org/string_decoder/-/string_decoder-1.3.0.tgz",
+ "integrity": "sha512-hkRX8U1WjJFd8LsDJ2yQ/wWWxaopEsABU1XfkM8A+j0+85JAGppt16cr1Whg6KIbb4okU6Mql6BOj+uup/wKeA==",
+ "dependencies": {
+ "safe-buffer": "~5.2.0"
+ }
+ },
+ "node_modules/string-width": {
+ "version": "5.1.2",
+ "resolved": "https://registry.npmjs.org/string-width/-/string-width-5.1.2.tgz",
+ "integrity": "sha512-HnLOCR3vjcY8beoNLtcjZ5/nxn2afmME6lhrDrebokqMap+XbeW8n9TXpPDOqdGK5qcI3oT0GKTW6wC7EMiVqA==",
+ "dependencies": {
+ "eastasianwidth": "^0.2.0",
+ "emoji-regex": "^9.2.2",
+ "strip-ansi": "^7.0.1"
+ },
+ "engines": {
+ "node": ">=12"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/string-width/node_modules/ansi-regex": {
+ "version": "6.0.1",
+ "resolved": "https://registry.npmjs.org/ansi-regex/-/ansi-regex-6.0.1.tgz",
+ "integrity": "sha512-n5M855fKb2SsfMIiFFoVrABHJC8QtHwVx+mHWP3QcEqBHYienj5dHSgjbxtC0WEZXYt4wcD6zrQElDPhFuZgfA==",
+ "engines": {
+ "node": ">=12"
+ },
+ "funding": {
+ "url": "https://github.com/chalk/ansi-regex?sponsor=1"
+ }
+ },
+ "node_modules/string-width/node_modules/strip-ansi": {
+ "version": "7.0.1",
+ "resolved": "https://registry.npmjs.org/strip-ansi/-/strip-ansi-7.0.1.tgz",
+ "integrity": "sha512-cXNxvT8dFNRVfhVME3JAe98mkXDYN2O1l7jmcwMnOslDeESg1rF/OZMtK0nRAhiari1unG5cD4jG3rapUAkLbw==",
+ "dependencies": {
+ "ansi-regex": "^6.0.1"
+ },
+ "engines": {
+ "node": ">=12"
+ },
+ "funding": {
+ "url": "https://github.com/chalk/strip-ansi?sponsor=1"
+ }
+ },
+ "node_modules/stringify-object": {
+ "version": "3.3.0",
+ "resolved": "https://registry.npmjs.org/stringify-object/-/stringify-object-3.3.0.tgz",
+ "integrity": "sha512-rHqiFh1elqCQ9WPLIC8I0Q/g/wj5J1eMkyoiD6eoQApWHP0FtlK7rqnhmabL5VUY9JQCcqwwvlOaSuutekgyrw==",
+ "dependencies": {
+ "get-own-enumerable-property-symbols": "^3.0.0",
+ "is-obj": "^1.0.1",
+ "is-regexp": "^1.0.0"
+ },
+ "engines": {
+ "node": ">=4"
+ }
+ },
+ "node_modules/strip-ansi": {
+ "version": "6.0.1",
+ "resolved": "https://registry.npmjs.org/strip-ansi/-/strip-ansi-6.0.1.tgz",
+ "integrity": "sha512-Y38VPSHcqkFrCpFnQ9vuSXmquuv5oXOKpGeT6aGrr3o3Gc9AlVa6JBfUSOCnbxGGZF+/0ooI7KrPuUSztUdU5A==",
+ "dependencies": {
+ "ansi-regex": "^5.0.1"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/strip-bom-string": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmjs.org/strip-bom-string/-/strip-bom-string-1.0.0.tgz",
+ "integrity": "sha512-uCC2VHvQRYu+lMh4My/sFNmF2klFymLX1wHJeXnbEJERpV/ZsVuonzerjfrGpIGF7LBVa1O7i9kjiWvJiFck8g==",
+ "engines": {
+ "node": ">=0.10.0"
+ }
+ },
+ "node_modules/strip-final-newline": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/strip-final-newline/-/strip-final-newline-2.0.0.tgz",
+ "integrity": "sha512-BrpvfNAE3dcvq7ll3xVumzjKjZQ5tI1sEUIKr3Uoks0XUl45St3FlatVqef9prk4jRDzhW6WZg+3bk93y6pLjA==",
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/strip-json-comments": {
+ "version": "3.1.1",
+ "resolved": "https://registry.npmjs.org/strip-json-comments/-/strip-json-comments-3.1.1.tgz",
+ "integrity": "sha512-6fPc+R4ihwqP6N/aIv2f1gMH8lOVtWQHoqC4yK6oSDVVocumAsfCqjkXnqiYMhmMwS/mEHLp7Vehlt3ql6lEig==",
+ "engines": {
+ "node": ">=8"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/style-loader": {
+ "version": "3.3.2",
+ "resolved": "https://registry.npmjs.org/style-loader/-/style-loader-3.3.2.tgz",
+ "integrity": "sha512-RHs/vcrKdQK8wZliteNK4NKzxvLBzpuHMqYmUVWeKa6MkaIQ97ZTOS0b+zapZhy6GcrgWnvWYCMHRirC3FsUmw==",
+ "engines": {
+ "node": ">= 12.13.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/webpack"
+ },
+ "peerDependencies": {
+ "webpack": "^5.0.0"
+ }
+ },
+ "node_modules/style-to-object": {
+ "version": "0.3.0",
+ "resolved": "https://registry.npmjs.org/style-to-object/-/style-to-object-0.3.0.tgz",
+ "integrity": "sha512-CzFnRRXhzWIdItT3OmF8SQfWyahHhjq3HwcMNCNLn+N7klOOqPjMeG/4JSu77D7ypZdGvSzvkrbyeTMizz2VrA==",
+ "dependencies": {
+ "inline-style-parser": "0.1.1"
+ }
+ },
+ "node_modules/styled-components": {
+ "version": "5.3.9",
+ "resolved": "https://registry.npmjs.org/styled-components/-/styled-components-5.3.9.tgz",
+ "integrity": "sha512-Aj3kb13B75DQBo2oRwRa/APdB5rSmwUfN5exyarpX+x/tlM/rwZA2vVk2vQgVSP6WKaZJHWwiFrzgHt+CLtB4A==",
+ "dependencies": {
+ "@babel/helper-module-imports": "^7.0.0",
+ "@babel/traverse": "^7.4.5",
+ "@emotion/is-prop-valid": "^1.1.0",
+ "@emotion/stylis": "^0.8.4",
+ "@emotion/unitless": "^0.7.4",
+ "babel-plugin-styled-components": ">= 1.12.0",
+ "css-to-react-native": "^3.0.0",
+ "hoist-non-react-statics": "^3.0.0",
+ "shallowequal": "^1.1.0",
+ "supports-color": "^5.5.0"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/styled-components"
+ },
+ "peerDependencies": {
+ "react": ">= 16.8.0",
+ "react-dom": ">= 16.8.0",
+ "react-is": ">= 16.8.0"
+ }
+ },
+ "node_modules/styled-components/node_modules/babel-plugin-styled-components": {
+ "version": "2.0.7",
+ "resolved": "https://registry.npmjs.org/babel-plugin-styled-components/-/babel-plugin-styled-components-2.0.7.tgz",
+ "integrity": "sha512-i7YhvPgVqRKfoQ66toiZ06jPNA3p6ierpfUuEWxNF+fV27Uv5gxBkf8KZLHUCc1nFA9j6+80pYoIpqCeyW3/bA==",
+ "dependencies": {
+ "@babel/helper-annotate-as-pure": "^7.16.0",
+ "@babel/helper-module-imports": "^7.16.0",
+ "babel-plugin-syntax-jsx": "^6.18.0",
+ "lodash": "^4.17.11",
+ "picomatch": "^2.3.0"
+ },
+ "peerDependencies": {
+ "styled-components": ">= 2"
+ }
+ },
+ "node_modules/styled-components/node_modules/has-flag": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/has-flag/-/has-flag-3.0.0.tgz",
+ "integrity": "sha512-sKJf1+ceQBr4SMkvQnBDNDtf4TXpVhVGateu0t918bl30FnbE2m4vNLX+VWe/dpjlb+HugGYzW7uQXH98HPEYw==",
+ "engines": {
+ "node": ">=4"
+ }
+ },
+ "node_modules/styled-components/node_modules/supports-color": {
+ "version": "5.5.0",
+ "resolved": "https://registry.npmjs.org/supports-color/-/supports-color-5.5.0.tgz",
+ "integrity": "sha512-QjVjwdXIt408MIiAqCX4oUKsgU2EqAGzs2Ppkm4aQYbjm+ZEWEcW4SfFNTr4uMNZma0ey4f5lgLrkB0aX0QMow==",
+ "dependencies": {
+ "has-flag": "^3.0.0"
+ },
+ "engines": {
+ "node": ">=4"
+ }
+ },
+ "node_modules/stylehacks": {
+ "version": "5.1.1",
+ "resolved": "https://registry.npmjs.org/stylehacks/-/stylehacks-5.1.1.tgz",
+ "integrity": "sha512-sBpcd5Hx7G6seo7b1LkpttvTz7ikD0LlH5RmdcBNb6fFR0Fl7LQwHDFr300q4cwUqi+IYrFGmsIHieMBfnN/Bw==",
+ "dependencies": {
+ "browserslist": "^4.21.4",
+ "postcss-selector-parser": "^6.0.4"
+ },
+ "engines": {
+ "node": "^10 || ^12 || >=14.0"
+ },
+ "peerDependencies": {
+ "postcss": "^8.2.15"
+ }
+ },
+ "node_modules/supports-color": {
+ "version": "7.2.0",
+ "resolved": "https://registry.npmjs.org/supports-color/-/supports-color-7.2.0.tgz",
+ "integrity": "sha512-qpCAvRl9stuOHveKsn7HncJRvv501qIacKzQlO/+Lwxc9+0q2wLyv4Dfvt80/DPn2pqOBsJdDiogXGR9+OvwRw==",
+ "dependencies": {
+ "has-flag": "^4.0.0"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/supports-preserve-symlinks-flag": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmjs.org/supports-preserve-symlinks-flag/-/supports-preserve-symlinks-flag-1.0.0.tgz",
+ "integrity": "sha512-ot0WnXS9fgdkgIcePe6RHNk1WA8+muPa6cSjeR3V8K27q9BB1rTE3R1p7Hv0z1ZyAc8s6Vvv8DIyWf681MAt0w==",
+ "engines": {
+ "node": ">= 0.4"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/ljharb"
+ }
+ },
+ "node_modules/svg-parser": {
+ "version": "2.0.4",
+ "resolved": "https://registry.npmjs.org/svg-parser/-/svg-parser-2.0.4.tgz",
+ "integrity": "sha512-e4hG1hRwoOdRb37cIMSgzNsxyzKfayW6VOflrwvR+/bzrkyxY/31WkbgnQpgtrNp1SdpJvpUAGTa/ZoiPNDuRQ=="
+ },
+ "node_modules/svgo": {
+ "version": "2.8.0",
+ "resolved": "https://registry.npmjs.org/svgo/-/svgo-2.8.0.tgz",
+ "integrity": "sha512-+N/Q9kV1+F+UeWYoSiULYo4xYSDQlTgb+ayMobAXPwMnLvop7oxKMo9OzIrX5x3eS4L4f2UHhc9axXwY8DpChg==",
+ "dependencies": {
+ "@trysound/sax": "0.2.0",
+ "commander": "^7.2.0",
+ "css-select": "^4.1.3",
+ "css-tree": "^1.1.3",
+ "csso": "^4.2.0",
+ "picocolors": "^1.0.0",
+ "stable": "^0.1.8"
+ },
+ "bin": {
+ "svgo": "bin/svgo"
+ },
+ "engines": {
+ "node": ">=10.13.0"
+ }
+ },
+ "node_modules/svgo/node_modules/commander": {
+ "version": "7.2.0",
+ "resolved": "https://registry.npmjs.org/commander/-/commander-7.2.0.tgz",
+ "integrity": "sha512-QrWXB+ZQSVPmIWIhtEO9H+gwHaMGYiF5ChvoJ+K9ZGHG/sVsa6yiesAD1GC/x46sET00Xlwo1u49RVVVzvcSkw==",
+ "engines": {
+ "node": ">= 10"
+ }
+ },
+ "node_modules/svgo/node_modules/css-select": {
+ "version": "4.3.0",
+ "resolved": "https://registry.npmjs.org/css-select/-/css-select-4.3.0.tgz",
+ "integrity": "sha512-wPpOYtnsVontu2mODhA19JrqWxNsfdatRKd64kmpRbQgh1KtItko5sTnEpPdpSaJszTOhEMlF/RPz28qj4HqhQ==",
+ "dependencies": {
+ "boolbase": "^1.0.0",
+ "css-what": "^6.0.1",
+ "domhandler": "^4.3.1",
+ "domutils": "^2.8.0",
+ "nth-check": "^2.0.1"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/fb55"
+ }
+ },
+ "node_modules/svgo/node_modules/dom-serializer": {
+ "version": "1.4.1",
+ "resolved": "https://registry.npmjs.org/dom-serializer/-/dom-serializer-1.4.1.tgz",
+ "integrity": "sha512-VHwB3KfrcOOkelEG2ZOfxqLZdfkil8PtJi4P8N2MMXucZq2yLp75ClViUlOVwyoHEDjYU433Aq+5zWP61+RGag==",
+ "dependencies": {
+ "domelementtype": "^2.0.1",
+ "domhandler": "^4.2.0",
+ "entities": "^2.0.0"
+ },
+ "funding": {
+ "url": "https://github.com/cheeriojs/dom-serializer?sponsor=1"
+ }
+ },
+ "node_modules/svgo/node_modules/domhandler": {
+ "version": "4.3.1",
+ "resolved": "https://registry.npmjs.org/domhandler/-/domhandler-4.3.1.tgz",
+ "integrity": "sha512-GrwoxYN+uWlzO8uhUXRl0P+kHE4GtVPfYzVLcUxPL7KNdHKj66vvlhiweIHqYYXWlw+T8iLMp42Lm67ghw4WMQ==",
+ "dependencies": {
+ "domelementtype": "^2.2.0"
+ },
+ "engines": {
+ "node": ">= 4"
+ },
+ "funding": {
+ "url": "https://github.com/fb55/domhandler?sponsor=1"
+ }
+ },
+ "node_modules/svgo/node_modules/domutils": {
+ "version": "2.8.0",
+ "resolved": "https://registry.npmjs.org/domutils/-/domutils-2.8.0.tgz",
+ "integrity": "sha512-w96Cjofp72M5IIhpjgobBimYEfoPjx1Vx0BSX9P30WBdZW2WIKU0T1Bd0kz2eNZ9ikjKgHbEyKx8BB6H1L3h3A==",
+ "dependencies": {
+ "dom-serializer": "^1.0.1",
+ "domelementtype": "^2.2.0",
+ "domhandler": "^4.2.0"
+ },
+ "funding": {
+ "url": "https://github.com/fb55/domutils?sponsor=1"
+ }
+ },
+ "node_modules/svgo/node_modules/entities": {
+ "version": "2.2.0",
+ "resolved": "https://registry.npmjs.org/entities/-/entities-2.2.0.tgz",
+ "integrity": "sha512-p92if5Nz619I0w+akJrLZH0MX0Pb5DX39XOwQTtXSdQQOaYH03S1uIQp4mhOZtAXrxq4ViO67YTiLBo2638o9A==",
+ "funding": {
+ "url": "https://github.com/fb55/entities?sponsor=1"
+ }
+ },
+ "node_modules/swagger2openapi": {
+ "version": "7.0.8",
+ "resolved": "https://registry.npmjs.org/swagger2openapi/-/swagger2openapi-7.0.8.tgz",
+ "integrity": "sha512-upi/0ZGkYgEcLeGieoz8gT74oWHA0E7JivX7aN9mAf+Tc7BQoRBvnIGHoPDw+f9TXTW4s6kGYCZJtauP6OYp7g==",
+ "dependencies": {
+ "call-me-maybe": "^1.0.1",
+ "node-fetch": "^2.6.1",
+ "node-fetch-h2": "^2.3.0",
+ "node-readfiles": "^0.2.0",
+ "oas-kit-common": "^1.0.8",
+ "oas-resolver": "^2.5.6",
+ "oas-schema-walker": "^1.1.5",
+ "oas-validator": "^5.0.8",
+ "reftools": "^1.1.9",
+ "yaml": "^1.10.0",
+ "yargs": "^17.0.1"
+ },
+ "bin": {
+ "boast": "boast.js",
+ "oas-validate": "oas-validate.js",
+ "swagger2openapi": "swagger2openapi.js"
+ },
+ "funding": {
+ "url": "https://github.com/Mermade/oas-kit?sponsor=1"
+ }
+ },
+ "node_modules/swagger2openapi/node_modules/cliui": {
+ "version": "8.0.1",
+ "resolved": "https://registry.npmjs.org/cliui/-/cliui-8.0.1.tgz",
+ "integrity": "sha512-BSeNnyus75C4//NQ9gQt1/csTXyo/8Sb+afLAkzAptFuMsod9HFokGNudZpi/oQV73hnVK+sR+5PVRMd+Dr7YQ==",
+ "dependencies": {
+ "string-width": "^4.2.0",
+ "strip-ansi": "^6.0.1",
+ "wrap-ansi": "^7.0.0"
+ },
+ "engines": {
+ "node": ">=12"
+ }
+ },
+ "node_modules/swagger2openapi/node_modules/emoji-regex": {
+ "version": "8.0.0",
+ "resolved": "https://registry.npmjs.org/emoji-regex/-/emoji-regex-8.0.0.tgz",
+ "integrity": "sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A=="
+ },
+ "node_modules/swagger2openapi/node_modules/string-width": {
+ "version": "4.2.3",
+ "resolved": "https://registry.npmjs.org/string-width/-/string-width-4.2.3.tgz",
+ "integrity": "sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==",
+ "dependencies": {
+ "emoji-regex": "^8.0.0",
+ "is-fullwidth-code-point": "^3.0.0",
+ "strip-ansi": "^6.0.1"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/swagger2openapi/node_modules/wrap-ansi": {
+ "version": "7.0.0",
+ "resolved": "https://registry.npmjs.org/wrap-ansi/-/wrap-ansi-7.0.0.tgz",
+ "integrity": "sha512-YVGIj2kamLSTxw6NsZjoBxfSwsn0ycdesmc4p+Q21c5zPuZ1pl+NfxVdxPtdHvmNVOQ6XSYG4AUtyt/Fi7D16Q==",
+ "dependencies": {
+ "ansi-styles": "^4.0.0",
+ "string-width": "^4.1.0",
+ "strip-ansi": "^6.0.0"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/chalk/wrap-ansi?sponsor=1"
+ }
+ },
+ "node_modules/swagger2openapi/node_modules/yargs": {
+ "version": "17.7.1",
+ "resolved": "https://registry.npmjs.org/yargs/-/yargs-17.7.1.tgz",
+ "integrity": "sha512-cwiTb08Xuv5fqF4AovYacTFNxk62th7LKJ6BL9IGUpTJrWoU7/7WdQGTP2SjKf1dUNBGzDd28p/Yfs/GI6JrLw==",
+ "dependencies": {
+ "cliui": "^8.0.1",
+ "escalade": "^3.1.1",
+ "get-caller-file": "^2.0.5",
+ "require-directory": "^2.1.1",
+ "string-width": "^4.2.3",
+ "y18n": "^5.0.5",
+ "yargs-parser": "^21.1.1"
+ },
+ "engines": {
+ "node": ">=12"
+ }
+ },
+ "node_modules/swagger2openapi/node_modules/yargs-parser": {
+ "version": "21.1.1",
+ "resolved": "https://registry.npmjs.org/yargs-parser/-/yargs-parser-21.1.1.tgz",
+ "integrity": "sha512-tVpsJW7DdjecAiFpbIB1e3qxIQsE6NoPc5/eTdrbbIC4h0LVsWhnoa3g+m2HclBIujHzsxZ4VJVA+GUuc2/LBw==",
+ "engines": {
+ "node": ">=12"
+ }
+ },
+ "node_modules/tapable": {
+ "version": "2.2.1",
+ "resolved": "https://registry.npmjs.org/tapable/-/tapable-2.2.1.tgz",
+ "integrity": "sha512-GNzQvQTOIP6RyTfE2Qxb8ZVlNmw0n88vp1szwWRimP02mnTsx3Wtn5qRdqY9w2XduFNUgvOwhNnQsjwCp+kqaQ==",
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/terser": {
+ "version": "5.31.6",
+ "resolved": "https://registry.npmjs.org/terser/-/terser-5.31.6.tgz",
+ "integrity": "sha512-PQ4DAriWzKj+qgehQ7LK5bQqCFNMmlhjR2PFFLuqGCpuCAauxemVBWwWOxo3UIwWQx8+Pr61Df++r76wDmkQBg==",
+ "dependencies": {
+ "@jridgewell/source-map": "^0.3.3",
+ "acorn": "^8.8.2",
+ "commander": "^2.20.0",
+ "source-map-support": "~0.5.20"
+ },
+ "bin": {
+ "terser": "bin/terser"
+ },
+ "engines": {
+ "node": ">=10"
+ }
+ },
+ "node_modules/terser-webpack-plugin": {
+ "version": "5.3.10",
+ "resolved": "https://registry.npmjs.org/terser-webpack-plugin/-/terser-webpack-plugin-5.3.10.tgz",
+ "integrity": "sha512-BKFPWlPDndPs+NGGCr1U59t0XScL5317Y0UReNrHaw9/FwhPENlq6bfgs+4yPfyP51vqC1bQ4rp1EfXW5ZSH9w==",
+ "dependencies": {
+ "@jridgewell/trace-mapping": "^0.3.20",
+ "jest-worker": "^27.4.5",
+ "schema-utils": "^3.1.1",
+ "serialize-javascript": "^6.0.1",
+ "terser": "^5.26.0"
+ },
+ "engines": {
+ "node": ">= 10.13.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/webpack"
+ },
+ "peerDependencies": {
+ "webpack": "^5.1.0"
+ },
+ "peerDependenciesMeta": {
+ "@swc/core": {
+ "optional": true
+ },
+ "esbuild": {
+ "optional": true
+ },
+ "uglify-js": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/terser-webpack-plugin/node_modules/jest-worker": {
+ "version": "27.5.1",
+ "resolved": "https://registry.npmjs.org/jest-worker/-/jest-worker-27.5.1.tgz",
+ "integrity": "sha512-7vuh85V5cdDofPyxn58nrPjBktZo0u9x1g8WtjQol+jZDaE+fhN+cIvTj11GndBnMnyfrUOG1sZQxCdjKh+DKg==",
+ "dependencies": {
+ "@types/node": "*",
+ "merge-stream": "^2.0.0",
+ "supports-color": "^8.0.0"
+ },
+ "engines": {
+ "node": ">= 10.13.0"
+ }
+ },
+ "node_modules/terser-webpack-plugin/node_modules/schema-utils": {
+ "version": "3.1.1",
+ "resolved": "https://registry.npmjs.org/schema-utils/-/schema-utils-3.1.1.tgz",
+ "integrity": "sha512-Y5PQxS4ITlC+EahLuXaY86TXfR7Dc5lw294alXOq86JAHCihAIZfqv8nNCWvaEJvaC51uN9hbLGeV0cFBdH+Fw==",
+ "dependencies": {
+ "@types/json-schema": "^7.0.8",
+ "ajv": "^6.12.5",
+ "ajv-keywords": "^3.5.2"
+ },
+ "engines": {
+ "node": ">= 10.13.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/webpack"
+ }
+ },
+ "node_modules/terser-webpack-plugin/node_modules/supports-color": {
+ "version": "8.1.1",
+ "resolved": "https://registry.npmjs.org/supports-color/-/supports-color-8.1.1.tgz",
+ "integrity": "sha512-MpUEN2OodtUzxvKQl72cUF7RQ5EiHsGvSsVG0ia9c5RbWGL2CI4C7EpPS8UTBIplnlzZiNuV56w+FuNxy3ty2Q==",
+ "dependencies": {
+ "has-flag": "^4.0.0"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/chalk/supports-color?sponsor=1"
+ }
+ },
+ "node_modules/terser/node_modules/commander": {
+ "version": "2.20.3",
+ "resolved": "https://registry.npmjs.org/commander/-/commander-2.20.3.tgz",
+ "integrity": "sha512-GpVkmM8vF2vQUkj2LvZmD35JxeJOLCwJ9cUkugyk2nuhbv3+mJvpLYYt+0+USMxE+oj+ey/lJEnhZw75x/OMcQ=="
+ },
+ "node_modules/text-table": {
+ "version": "0.2.0",
+ "resolved": "https://registry.npmjs.org/text-table/-/text-table-0.2.0.tgz",
+ "integrity": "sha512-N+8UisAXDGk8PFXP4HAzVR9nbfmVJ3zYLAWiTIoqC5v5isinhr+r5uaO8+7r3BMfuNIufIsA7RdpVgacC2cSpw=="
+ },
+ "node_modules/through2": {
+ "version": "2.0.5",
+ "resolved": "https://registry.npmjs.org/through2/-/through2-2.0.5.tgz",
+ "integrity": "sha512-/mrRod8xqpA+IHSLyGCQ2s8SPHiCDEeQJSep1jqLYeEUClOFG2Qsh+4FU6G9VeqpZnGW/Su8LQGc4YKni5rYSQ==",
+ "dependencies": {
+ "readable-stream": "~2.3.6",
+ "xtend": "~4.0.1"
+ }
+ },
+ "node_modules/through2/node_modules/isarray": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmjs.org/isarray/-/isarray-1.0.0.tgz",
+ "integrity": "sha512-VLghIWNM6ELQzo7zwmcg0NmTVyWKYjvIeM83yjp0wRDTmUnrM678fQbcKBo6n2CJEF0szoG//ytg+TKla89ALQ=="
+ },
+ "node_modules/through2/node_modules/readable-stream": {
+ "version": "2.3.8",
+ "resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-2.3.8.tgz",
+ "integrity": "sha512-8p0AUk4XODgIewSi0l8Epjs+EVnWiK7NoDIEGU0HhE7+ZyY8D1IMY7odu5lRrFXGg71L15KG8QrPmum45RTtdA==",
+ "dependencies": {
+ "core-util-is": "~1.0.0",
+ "inherits": "~2.0.3",
+ "isarray": "~1.0.0",
+ "process-nextick-args": "~2.0.0",
+ "safe-buffer": "~5.1.1",
+ "string_decoder": "~1.1.1",
+ "util-deprecate": "~1.0.1"
+ }
+ },
+ "node_modules/through2/node_modules/safe-buffer": {
+ "version": "5.1.2",
+ "resolved": "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.1.2.tgz",
+ "integrity": "sha512-Gd2UZBJDkXlY7GbJxfsE8/nvKkUEU1G38c1siN6QP6a9PT9MmHB8GnpscSmMJSoF8LOIrt8ud/wPtojys4G6+g=="
+ },
+ "node_modules/through2/node_modules/string_decoder": {
+ "version": "1.1.1",
+ "resolved": "https://registry.npmjs.org/string_decoder/-/string_decoder-1.1.1.tgz",
+ "integrity": "sha512-n/ShnvDi6FHbbVfviro+WojiFzv+s8MPMHBczVePfUpDJLwoLT0ht1l4YwBCbi8pJAveEEdnkHyPyTP/mzRfwg==",
+ "dependencies": {
+ "safe-buffer": "~5.1.0"
+ }
+ },
+ "node_modules/thunky": {
+ "version": "1.1.0",
+ "resolved": "https://registry.npmjs.org/thunky/-/thunky-1.1.0.tgz",
+ "integrity": "sha512-eHY7nBftgThBqOyHGVN+l8gF0BucP09fMo0oO/Lb0w1OF80dJv+lDVpXG60WMQvkcxAkNybKsrEIE3ZtKGmPrA=="
+ },
+ "node_modules/tiny-invariant": {
+ "version": "1.3.1",
+ "resolved": "https://registry.npmjs.org/tiny-invariant/-/tiny-invariant-1.3.1.tgz",
+ "integrity": "sha512-AD5ih2NlSssTCwsMznbvwMZpJ1cbhkGd2uueNxzv2jDlEeZdU04JQfRnggJQ8DrcVBGjAsCKwFBbDlVNtEMlzw=="
+ },
+ "node_modules/tiny-warning": {
+ "version": "1.0.3",
+ "resolved": "https://registry.npmjs.org/tiny-warning/-/tiny-warning-1.0.3.tgz",
+ "integrity": "sha512-lBN9zLN/oAf68o3zNXYrdCt1kP8WsiGW8Oo2ka41b2IM5JL/S1CTyX1rW0mb/zSuJun0ZUrDxx4sqvYS2FWzPA=="
+ },
+ "node_modules/to-readable-stream": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmjs.org/to-readable-stream/-/to-readable-stream-1.0.0.tgz",
+ "integrity": "sha512-Iq25XBt6zD5npPhlLVXGFN3/gyR2/qODcKNNyTMd4vbm39HUaOiAM4PMq0eMVC/Tkxz+Zjdsc55g9yyz+Yq00Q==",
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/to-regex-range": {
+ "version": "5.0.1",
+ "resolved": "https://registry.npmjs.org/to-regex-range/-/to-regex-range-5.0.1.tgz",
+ "integrity": "sha512-65P7iz6X5yEr1cwcgvQxbbIw7Uk3gOy5dIdtZ4rDveLqhrdJP+Li/Hx6tyK0NEb+2GCyneCMJiGqrADCSNk8sQ==",
+ "dependencies": {
+ "is-number": "^7.0.0"
+ },
+ "engines": {
+ "node": ">=8.0"
+ }
+ },
+ "node_modules/toidentifier": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmjs.org/toidentifier/-/toidentifier-1.0.1.tgz",
+ "integrity": "sha512-o5sSPKEkg/DIQNmH43V0/uerLrpzVedkUh8tGNvaeXpfpuwjKenlSox/2O/BTlZUtEe+JG7s5YhEz608PlAHRA==",
+ "engines": {
+ "node": ">=0.6"
+ }
+ },
+ "node_modules/totalist": {
+ "version": "1.1.0",
+ "resolved": "https://registry.npmjs.org/totalist/-/totalist-1.1.0.tgz",
+ "integrity": "sha512-gduQwd1rOdDMGxFG1gEvhV88Oirdo2p+KjoYFU7k2g+i7n6AFFbDQ5kMPUsW0pNbfQsB/cwXvT1i4Bue0s9g5g==",
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/tr46": {
+ "version": "0.0.3",
+ "resolved": "https://registry.npmjs.org/tr46/-/tr46-0.0.3.tgz",
+ "integrity": "sha512-N3WMsuqV66lT30CrXNbEjx4GEwlow3v6rr4mCcv6prnfwhS01rkgyFdjPNBYd9br7LpXV1+Emh01fHnq2Gdgrw=="
+ },
+ "node_modules/trim": {
+ "version": "0.0.1",
+ "resolved": "https://registry.npmjs.org/trim/-/trim-0.0.1.tgz",
+ "integrity": "sha512-YzQV+TZg4AxpKxaTHK3c3D+kRDCGVEE7LemdlQZoQXn0iennk10RsIoY6ikzAqJTc9Xjl9C1/waHom/J86ziAQ==",
+ "deprecated": "Use String.prototype.trim() instead"
+ },
+ "node_modules/trim-trailing-lines": {
+ "version": "1.1.4",
+ "resolved": "https://registry.npmjs.org/trim-trailing-lines/-/trim-trailing-lines-1.1.4.tgz",
+ "integrity": "sha512-rjUWSqnfTNrjbB9NQWfPMH/xRK1deHeGsHoVfpxJ++XeYXE0d6B1En37AHfw3jtfTU7dzMzZL2jjpe8Qb5gLIQ==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/trough": {
+ "version": "1.0.5",
+ "resolved": "https://registry.npmjs.org/trough/-/trough-1.0.5.tgz",
+ "integrity": "sha512-rvuRbTarPXmMb79SmzEp8aqXNKcK+y0XaB298IXueQ8I2PsrATcPBCSPyK/dDNa2iWOhKlfNnOjdAOTBU/nkFA==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/tslib": {
+ "version": "2.5.0",
+ "resolved": "https://registry.npmjs.org/tslib/-/tslib-2.5.0.tgz",
+ "integrity": "sha512-336iVw3rtn2BUK7ORdIAHTyxHGRIHVReokCR3XjbckJMK7ms8FysBfhLR8IXnAgy7T0PTPNBWKiH514FOW/WSg=="
+ },
+ "node_modules/type-fest": {
+ "version": "2.19.0",
+ "resolved": "https://registry.npmjs.org/type-fest/-/type-fest-2.19.0.tgz",
+ "integrity": "sha512-RAH822pAdBgcNMAfWnCBU3CFZcfZ/i1eZjwFU/dsLKumyuuP3niueg2UAukXYF0E2AAoc82ZSSf9J0WQBinzHA==",
+ "engines": {
+ "node": ">=12.20"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/type-is": {
+ "version": "1.6.18",
+ "resolved": "https://registry.npmjs.org/type-is/-/type-is-1.6.18.tgz",
+ "integrity": "sha512-TkRKr9sUTxEH8MdfuCSP7VizJyzRNMjj2J2do2Jr3Kym598JVdEksuzPQCnlFPW4ky9Q+iA+ma9BGm06XQBy8g==",
+ "dependencies": {
+ "media-typer": "0.3.0",
+ "mime-types": "~2.1.24"
+ },
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/type-is/node_modules/mime-db": {
+ "version": "1.52.0",
+ "resolved": "https://registry.npmjs.org/mime-db/-/mime-db-1.52.0.tgz",
+ "integrity": "sha512-sPU4uV7dYlvtWJxwwxHD0PuihVNiE7TyAbQ5SWxDCB9mUYvOgroQOwYQQOKPJ8CIbE+1ETVlOoK1UC2nU3gYvg==",
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/type-is/node_modules/mime-types": {
+ "version": "2.1.35",
+ "resolved": "https://registry.npmjs.org/mime-types/-/mime-types-2.1.35.tgz",
+ "integrity": "sha512-ZDY+bPm5zTTF+YpCrAU9nK0UgICYPT0QtT1NZWFv4s++TNkcgVaT0g6+4R2uI4MjQjzysHB1zxuWL50hzaeXiw==",
+ "dependencies": {
+ "mime-db": "1.52.0"
+ },
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/typedarray-to-buffer": {
+ "version": "3.1.5",
+ "resolved": "https://registry.npmjs.org/typedarray-to-buffer/-/typedarray-to-buffer-3.1.5.tgz",
+ "integrity": "sha512-zdu8XMNEDepKKR+XYOXAVPtWui0ly0NtohUscw+UmaHiAWT8hrV1rr//H6V+0DvJ3OQ19S979M0laLfX8rm82Q==",
+ "dependencies": {
+ "is-typedarray": "^1.0.0"
+ }
+ },
+ "node_modules/typescript": {
+ "version": "5.0.2",
+ "resolved": "https://registry.npmjs.org/typescript/-/typescript-5.0.2.tgz",
+ "integrity": "sha512-wVORMBGO/FAs/++blGNeAVdbNKtIh1rbBL2EyQ1+J9lClJ93KiiKe8PmFIVdXhHcyv44SL9oglmfeSsndo0jRw==",
+ "peer": true,
+ "bin": {
+ "tsc": "bin/tsc",
+ "tsserver": "bin/tsserver"
+ },
+ "engines": {
+ "node": ">=12.20"
+ }
+ },
+ "node_modules/ua-parser-js": {
+ "version": "0.7.34",
+ "resolved": "https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.34.tgz",
+ "integrity": "sha512-cJMeh/eOILyGu0ejgTKB95yKT3zOenSe9UGE3vj6WfiOwgGYnmATUsnDixMFvdU+rNMvWih83hrUP8VwhF9yXQ==",
+ "funding": [
+ {
+ "type": "opencollective",
+ "url": "https://opencollective.com/ua-parser-js"
+ },
+ {
+ "type": "paypal",
+ "url": "https://paypal.me/faisalman"
+ }
+ ],
+ "engines": {
+ "node": "*"
+ }
+ },
+ "node_modules/unherit": {
+ "version": "1.1.3",
+ "resolved": "https://registry.npmjs.org/unherit/-/unherit-1.1.3.tgz",
+ "integrity": "sha512-Ft16BJcnapDKp0+J/rqFC3Rrk6Y/Ng4nzsC028k2jdDII/rdZ7Wd3pPT/6+vIIxRagwRc9K0IUX0Ra4fKvw+WQ==",
+ "dependencies": {
+ "inherits": "^2.0.0",
+ "xtend": "^4.0.0"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/unicode-canonical-property-names-ecmascript": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/unicode-canonical-property-names-ecmascript/-/unicode-canonical-property-names-ecmascript-2.0.0.tgz",
+ "integrity": "sha512-yY5PpDlfVIU5+y/BSCxAJRBIS1Zc2dDG3Ujq+sR0U+JjUevW2JhocOF+soROYDSaAezOzOKuyyixhD6mBknSmQ==",
+ "engines": {
+ "node": ">=4"
+ }
+ },
+ "node_modules/unicode-match-property-ecmascript": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/unicode-match-property-ecmascript/-/unicode-match-property-ecmascript-2.0.0.tgz",
+ "integrity": "sha512-5kaZCrbp5mmbz5ulBkDkbY0SsPOjKqVS35VpL9ulMPfSl0J0Xsm+9Evphv9CoIZFwre7aJoa94AY6seMKGVN5Q==",
+ "dependencies": {
+ "unicode-canonical-property-names-ecmascript": "^2.0.0",
+ "unicode-property-aliases-ecmascript": "^2.0.0"
+ },
+ "engines": {
+ "node": ">=4"
+ }
+ },
+ "node_modules/unicode-match-property-value-ecmascript": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmjs.org/unicode-match-property-value-ecmascript/-/unicode-match-property-value-ecmascript-2.1.0.tgz",
+ "integrity": "sha512-qxkjQt6qjg/mYscYMC0XKRn3Rh0wFPlfxB0xkt9CfyTvpX1Ra0+rAmdX2QyAobptSEvuy4RtpPRui6XkV+8wjA==",
+ "engines": {
+ "node": ">=4"
+ }
+ },
+ "node_modules/unicode-property-aliases-ecmascript": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmjs.org/unicode-property-aliases-ecmascript/-/unicode-property-aliases-ecmascript-2.1.0.tgz",
+ "integrity": "sha512-6t3foTQI9qne+OZoVQB/8x8rk2k1eVy1gRXhV3oFQ5T6R1dqQ1xtin3XqSlx3+ATBkliTaR/hHyJBm+LVPNM8w==",
+ "engines": {
+ "node": ">=4"
+ }
+ },
+ "node_modules/unified": {
+ "version": "9.2.2",
+ "resolved": "https://registry.npmjs.org/unified/-/unified-9.2.2.tgz",
+ "integrity": "sha512-Sg7j110mtefBD+qunSLO1lqOEKdrwBFBrR6Qd8f4uwkhWNlbkaqwHse6e7QvD3AP/MNoJdEDLaf8OxYyoWgorQ==",
+ "dependencies": {
+ "bail": "^1.0.0",
+ "extend": "^3.0.0",
+ "is-buffer": "^2.0.0",
+ "is-plain-obj": "^2.0.0",
+ "trough": "^1.0.0",
+ "vfile": "^4.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/unique-string": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/unique-string/-/unique-string-2.0.0.tgz",
+ "integrity": "sha512-uNaeirEPvpZWSgzwsPGtU2zVSTrn/8L5q/IexZmH0eH6SA73CmAA5U4GwORTxQAZs95TAXLNqeLoPPNO5gZfWg==",
+ "dependencies": {
+ "crypto-random-string": "^2.0.0"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/unist-builder": {
+ "version": "2.0.3",
+ "resolved": "https://registry.npmjs.org/unist-builder/-/unist-builder-2.0.3.tgz",
+ "integrity": "sha512-f98yt5pnlMWlzP539tPc4grGMsFaQQlP/vM396b00jngsiINumNmsY8rkXjfoi1c6QaM8nQ3vaGDuoKWbe/1Uw==",
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/unist-util-generated": {
+ "version": "1.1.6",
+ "resolved": "https://registry.npmjs.org/unist-util-generated/-/unist-util-generated-1.1.6.tgz",
+ "integrity": "sha512-cln2Mm1/CZzN5ttGK7vkoGw+RZ8VcUH6BtGbq98DDtRGquAAOXig1mrBQYelOwMXYS8rK+vZDyyojSjp7JX+Lg==",
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/unist-util-is": {
+ "version": "4.1.0",
+ "resolved": "https://registry.npmjs.org/unist-util-is/-/unist-util-is-4.1.0.tgz",
+ "integrity": "sha512-ZOQSsnce92GrxSqlnEEseX0gi7GH9zTJZ0p9dtu87WRb/37mMPO2Ilx1s/t9vBHrFhbgweUwb+t7cIn5dxPhZg==",
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/unist-util-position": {
+ "version": "3.1.0",
+ "resolved": "https://registry.npmjs.org/unist-util-position/-/unist-util-position-3.1.0.tgz",
+ "integrity": "sha512-w+PkwCbYSFw8vpgWD0v7zRCl1FpY3fjDSQ3/N/wNd9Ffa4gPi8+4keqt99N3XW6F99t/mUzp2xAhNmfKWp95QA==",
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/unist-util-remove": {
+ "version": "2.1.0",
+ "resolved": "https://registry.npmjs.org/unist-util-remove/-/unist-util-remove-2.1.0.tgz",
+ "integrity": "sha512-J8NYPyBm4baYLdCbjmf1bhPu45Cr1MWTm77qd9istEkzWpnN6O9tMsEbB2JhNnBCqGENRqEWomQ+He6au0B27Q==",
+ "dependencies": {
+ "unist-util-is": "^4.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/unist-util-remove-position": {
+ "version": "2.0.1",
+ "resolved": "https://registry.npmjs.org/unist-util-remove-position/-/unist-util-remove-position-2.0.1.tgz",
+ "integrity": "sha512-fDZsLYIe2uT+oGFnuZmy73K6ZxOPG/Qcm+w7jbEjaFcJgbQ6cqjs/eSPzXhsmGpAsWPkqZM9pYjww5QTn3LHMA==",
+ "dependencies": {
+ "unist-util-visit": "^2.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/unist-util-stringify-position": {
+ "version": "2.0.3",
+ "resolved": "https://registry.npmjs.org/unist-util-stringify-position/-/unist-util-stringify-position-2.0.3.tgz",
+ "integrity": "sha512-3faScn5I+hy9VleOq/qNbAd6pAx7iH5jYBMS9I1HgQVijz/4mv5Bvw5iw1sC/90CODiKo81G/ps8AJrISn687g==",
+ "dependencies": {
+ "@types/unist": "^2.0.2"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/unist-util-visit": {
+ "version": "2.0.3",
+ "resolved": "https://registry.npmjs.org/unist-util-visit/-/unist-util-visit-2.0.3.tgz",
+ "integrity": "sha512-iJ4/RczbJMkD0712mGktuGpm/U4By4FfDonL7N/9tATGIF4imikjOuagyMY53tnZq3NP6BcmlrHhEKAfGWjh7Q==",
+ "dependencies": {
+ "@types/unist": "^2.0.0",
+ "unist-util-is": "^4.0.0",
+ "unist-util-visit-parents": "^3.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/unist-util-visit-parents": {
+ "version": "3.1.1",
+ "resolved": "https://registry.npmjs.org/unist-util-visit-parents/-/unist-util-visit-parents-3.1.1.tgz",
+ "integrity": "sha512-1KROIZWo6bcMrZEwiH2UrXDyalAa0uqzWCxCJj6lPOvTve2WkfgCytoDTPaMnodXh1WrXOq0haVYHj99ynJlsg==",
+ "dependencies": {
+ "@types/unist": "^2.0.0",
+ "unist-util-is": "^4.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/universalify": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/universalify/-/universalify-2.0.0.tgz",
+ "integrity": "sha512-hAZsKq7Yy11Zu1DE0OzWjw7nnLZmJZYTDZZyEFHZdUhV8FkH5MCfoU1XMaxXovpyW5nq5scPqq0ZDP9Zyl04oQ==",
+ "engines": {
+ "node": ">= 10.0.0"
+ }
+ },
+ "node_modules/unpipe": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmjs.org/unpipe/-/unpipe-1.0.0.tgz",
+ "integrity": "sha512-pjy2bYhSsufwWlKwPc+l3cN7+wuJlK6uz0YdJEOlQDbl6jo/YlPi4mb8agUkVC8BF7V8NuzeyPNqRksA3hztKQ==",
+ "engines": {
+ "node": ">= 0.8"
+ }
+ },
+ "node_modules/untildify": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/untildify/-/untildify-4.0.0.tgz",
+ "integrity": "sha512-KK8xQ1mkzZeg9inewmFVDNkg3l5LUhoq9kN6iWYB/CC9YMG8HA+c1Q8HwDe6dEX7kErrEVNVBO3fWsVq5iDgtw==",
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/update-browserslist-db": {
+ "version": "1.1.0",
+ "resolved": "https://registry.npmjs.org/update-browserslist-db/-/update-browserslist-db-1.1.0.tgz",
+ "integrity": "sha512-EdRAaAyk2cUE1wOf2DkEhzxqOQvFOoRJFNS6NeyJ01Gp2beMRpBAINjM2iDXE3KCuKhwnvHIQCJm6ThL2Z+HzQ==",
+ "funding": [
+ {
+ "type": "opencollective",
+ "url": "https://opencollective.com/browserslist"
+ },
+ {
+ "type": "tidelift",
+ "url": "https://tidelift.com/funding/github/npm/browserslist"
+ },
+ {
+ "type": "github",
+ "url": "https://github.com/sponsors/ai"
+ }
+ ],
+ "dependencies": {
+ "escalade": "^3.1.2",
+ "picocolors": "^1.0.1"
+ },
+ "bin": {
+ "update-browserslist-db": "cli.js"
+ },
+ "peerDependencies": {
+ "browserslist": ">= 4.21.0"
+ }
+ },
+ "node_modules/update-notifier": {
+ "version": "5.1.0",
+ "resolved": "https://registry.npmjs.org/update-notifier/-/update-notifier-5.1.0.tgz",
+ "integrity": "sha512-ItnICHbeMh9GqUy31hFPrD1kcuZ3rpxDZbf4KUDavXwS0bW5m7SLbDQpGX3UYr072cbrF5hFUs3r5tUsPwjfHw==",
+ "dependencies": {
+ "boxen": "^5.0.0",
+ "chalk": "^4.1.0",
+ "configstore": "^5.0.1",
+ "has-yarn": "^2.1.0",
+ "import-lazy": "^2.1.0",
+ "is-ci": "^2.0.0",
+ "is-installed-globally": "^0.4.0",
+ "is-npm": "^5.0.0",
+ "is-yarn-global": "^0.3.0",
+ "latest-version": "^5.1.0",
+ "pupa": "^2.1.1",
+ "semver": "^7.3.4",
+ "semver-diff": "^3.1.1",
+ "xdg-basedir": "^4.0.0"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/yeoman/update-notifier?sponsor=1"
+ }
+ },
+ "node_modules/update-notifier/node_modules/boxen": {
+ "version": "5.1.2",
+ "resolved": "https://registry.npmjs.org/boxen/-/boxen-5.1.2.tgz",
+ "integrity": "sha512-9gYgQKXx+1nP8mP7CzFyaUARhg7D3n1dF/FnErWmu9l6JvGpNUN278h0aSb+QjoiKSWG+iZ3uHrcqk0qrY9RQQ==",
+ "dependencies": {
+ "ansi-align": "^3.0.0",
+ "camelcase": "^6.2.0",
+ "chalk": "^4.1.0",
+ "cli-boxes": "^2.2.1",
+ "string-width": "^4.2.2",
+ "type-fest": "^0.20.2",
+ "widest-line": "^3.1.0",
+ "wrap-ansi": "^7.0.0"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/update-notifier/node_modules/cli-boxes": {
+ "version": "2.2.1",
+ "resolved": "https://registry.npmjs.org/cli-boxes/-/cli-boxes-2.2.1.tgz",
+ "integrity": "sha512-y4coMcylgSCdVinjiDBuR8PCC2bLjyGTwEmPb9NHR/QaNU6EUOXcTY/s6VjGMD6ENSEaeQYHCY0GNGS5jfMwPw==",
+ "engines": {
+ "node": ">=6"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/update-notifier/node_modules/emoji-regex": {
+ "version": "8.0.0",
+ "resolved": "https://registry.npmjs.org/emoji-regex/-/emoji-regex-8.0.0.tgz",
+ "integrity": "sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A=="
+ },
+ "node_modules/update-notifier/node_modules/string-width": {
+ "version": "4.2.3",
+ "resolved": "https://registry.npmjs.org/string-width/-/string-width-4.2.3.tgz",
+ "integrity": "sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==",
+ "dependencies": {
+ "emoji-regex": "^8.0.0",
+ "is-fullwidth-code-point": "^3.0.0",
+ "strip-ansi": "^6.0.1"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/update-notifier/node_modules/type-fest": {
+ "version": "0.20.2",
+ "resolved": "https://registry.npmjs.org/type-fest/-/type-fest-0.20.2.tgz",
+ "integrity": "sha512-Ne+eE4r0/iWnpAxD852z3A+N0Bt5RN//NjJwRd2VFHEmrywxf5vsZlh4R6lixl6B+wz/8d+maTSAkN1FIkI3LQ==",
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/update-notifier/node_modules/widest-line": {
+ "version": "3.1.0",
+ "resolved": "https://registry.npmjs.org/widest-line/-/widest-line-3.1.0.tgz",
+ "integrity": "sha512-NsmoXalsWVDMGupxZ5R08ka9flZjjiLvHVAWYOKtiKM8ujtZWr9cRffak+uSE48+Ob8ObalXpwyeUiyDD6QFgg==",
+ "dependencies": {
+ "string-width": "^4.0.0"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/update-notifier/node_modules/wrap-ansi": {
+ "version": "7.0.0",
+ "resolved": "https://registry.npmjs.org/wrap-ansi/-/wrap-ansi-7.0.0.tgz",
+ "integrity": "sha512-YVGIj2kamLSTxw6NsZjoBxfSwsn0ycdesmc4p+Q21c5zPuZ1pl+NfxVdxPtdHvmNVOQ6XSYG4AUtyt/Fi7D16Q==",
+ "dependencies": {
+ "ansi-styles": "^4.0.0",
+ "string-width": "^4.1.0",
+ "strip-ansi": "^6.0.0"
+ },
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/chalk/wrap-ansi?sponsor=1"
+ }
+ },
+ "node_modules/uri-js": {
+ "version": "4.4.1",
+ "resolved": "https://registry.npmjs.org/uri-js/-/uri-js-4.4.1.tgz",
+ "integrity": "sha512-7rKUyy33Q1yc98pQ1DAmLtwX109F7TIfWlW1Ydo8Wl1ii1SeHieeh0HHfPeL2fMXK6z0s8ecKs9frCuLJvndBg==",
+ "dependencies": {
+ "punycode": "^2.1.0"
+ }
+ },
+ "node_modules/uri-js/node_modules/punycode": {
+ "version": "2.3.0",
+ "resolved": "https://registry.npmjs.org/punycode/-/punycode-2.3.0.tgz",
+ "integrity": "sha512-rRV+zQD8tVFys26lAGR9WUuS4iUAngJScM+ZRSKtvl5tKeZ2t5bvdNFdNHBW9FWR4guGHlgmsZ1G7BSm2wTbuA==",
+ "engines": {
+ "node": ">=6"
+ }
+ },
+ "node_modules/url-loader": {
+ "version": "4.1.1",
+ "resolved": "https://registry.npmjs.org/url-loader/-/url-loader-4.1.1.tgz",
+ "integrity": "sha512-3BTV812+AVHHOJQO8O5MkWgZ5aosP7GnROJwvzLS9hWDj00lZ6Z0wNak423Lp9PBZN05N+Jk/N5Si8jRAlGyWA==",
+ "dependencies": {
+ "loader-utils": "^2.0.0",
+ "mime-types": "^2.1.27",
+ "schema-utils": "^3.0.0"
+ },
+ "engines": {
+ "node": ">= 10.13.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/webpack"
+ },
+ "peerDependencies": {
+ "file-loader": "*",
+ "webpack": "^4.0.0 || ^5.0.0"
+ },
+ "peerDependenciesMeta": {
+ "file-loader": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/url-loader/node_modules/mime-db": {
+ "version": "1.52.0",
+ "resolved": "https://registry.npmjs.org/mime-db/-/mime-db-1.52.0.tgz",
+ "integrity": "sha512-sPU4uV7dYlvtWJxwwxHD0PuihVNiE7TyAbQ5SWxDCB9mUYvOgroQOwYQQOKPJ8CIbE+1ETVlOoK1UC2nU3gYvg==",
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/url-loader/node_modules/mime-types": {
+ "version": "2.1.35",
+ "resolved": "https://registry.npmjs.org/mime-types/-/mime-types-2.1.35.tgz",
+ "integrity": "sha512-ZDY+bPm5zTTF+YpCrAU9nK0UgICYPT0QtT1NZWFv4s++TNkcgVaT0g6+4R2uI4MjQjzysHB1zxuWL50hzaeXiw==",
+ "dependencies": {
+ "mime-db": "1.52.0"
+ },
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/url-loader/node_modules/schema-utils": {
+ "version": "3.1.1",
+ "resolved": "https://registry.npmjs.org/schema-utils/-/schema-utils-3.1.1.tgz",
+ "integrity": "sha512-Y5PQxS4ITlC+EahLuXaY86TXfR7Dc5lw294alXOq86JAHCihAIZfqv8nNCWvaEJvaC51uN9hbLGeV0cFBdH+Fw==",
+ "dependencies": {
+ "@types/json-schema": "^7.0.8",
+ "ajv": "^6.12.5",
+ "ajv-keywords": "^3.5.2"
+ },
+ "engines": {
+ "node": ">= 10.13.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/webpack"
+ }
+ },
+ "node_modules/url-parse-lax": {
+ "version": "3.0.0",
+ "resolved": "https://registry.npmjs.org/url-parse-lax/-/url-parse-lax-3.0.0.tgz",
+ "integrity": "sha512-NjFKA0DidqPa5ciFcSrXnAltTtzz84ogy+NebPvfEgAck0+TNg4UJ4IN+fB7zRZfbgUf0syOo9MDxFkDSMuFaQ==",
+ "dependencies": {
+ "prepend-http": "^2.0.0"
+ },
+ "engines": {
+ "node": ">=4"
+ }
+ },
+ "node_modules/url-template": {
+ "version": "2.0.8",
+ "resolved": "https://registry.npmjs.org/url-template/-/url-template-2.0.8.tgz",
+ "integrity": "sha512-XdVKMF4SJ0nP/O7XIPB0JwAEuT9lDIYnNsK8yGVe43y0AWoKeJNdv3ZNWh7ksJ6KqQFjOO6ox/VEitLnaVNufw=="
+ },
+ "node_modules/use-composed-ref": {
+ "version": "1.3.0",
+ "resolved": "https://registry.npmjs.org/use-composed-ref/-/use-composed-ref-1.3.0.tgz",
+ "integrity": "sha512-GLMG0Jc/jiKov/3Ulid1wbv3r54K9HlMW29IWcDFPEqFkSO2nS0MuefWgMJpeHQ9YJeXDL3ZUF+P3jdXlZX/cQ==",
+ "peerDependencies": {
+ "react": "^16.8.0 || ^17.0.0 || ^18.0.0"
+ }
+ },
+ "node_modules/use-isomorphic-layout-effect": {
+ "version": "1.1.2",
+ "resolved": "https://registry.npmjs.org/use-isomorphic-layout-effect/-/use-isomorphic-layout-effect-1.1.2.tgz",
+ "integrity": "sha512-49L8yCO3iGT/ZF9QttjwLF/ZD9Iwto5LnH5LmEdk/6cFmXddqi2ulF0edxTwjj+7mqvpVVGQWvbXZdn32wRSHA==",
+ "peerDependencies": {
+ "react": "^16.8.0 || ^17.0.0 || ^18.0.0"
+ },
+ "peerDependenciesMeta": {
+ "@types/react": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/use-latest": {
+ "version": "1.2.1",
+ "resolved": "https://registry.npmjs.org/use-latest/-/use-latest-1.2.1.tgz",
+ "integrity": "sha512-xA+AVm/Wlg3e2P/JiItTziwS7FK92LWrDB0p+hgXloIMuVCeJJ8v6f0eeHyPZaJrM+usM1FkFfbNCrJGs8A/zw==",
+ "dependencies": {
+ "use-isomorphic-layout-effect": "^1.1.1"
+ },
+ "peerDependencies": {
+ "react": "^16.8.0 || ^17.0.0 || ^18.0.0"
+ },
+ "peerDependenciesMeta": {
+ "@types/react": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/use-sync-external-store": {
+ "version": "1.2.0",
+ "resolved": "https://registry.npmjs.org/use-sync-external-store/-/use-sync-external-store-1.2.0.tgz",
+ "integrity": "sha512-eEgnFxGQ1Ife9bzYs6VLi8/4X6CObHMw9Qr9tPY43iKwsPw8xE8+EFsf/2cFZ5S3esXgpWgtSCtLNS41F+sKPA==",
+ "peerDependencies": {
+ "react": "^16.8.0 || ^17.0.0 || ^18.0.0"
+ }
+ },
+ "node_modules/util-deprecate": {
+ "version": "1.0.2",
+ "resolved": "https://registry.npmjs.org/util-deprecate/-/util-deprecate-1.0.2.tgz",
+ "integrity": "sha512-EPD5q1uXyFxJpCrLnCc1nHnq3gOa6DZBocAIiI2TaSCA7VCJ1UJDMagCzIkXNsUYfD1daK//LTEQ8xiIbrHtcw=="
+ },
+ "node_modules/utila": {
+ "version": "0.4.0",
+ "resolved": "https://registry.npmjs.org/utila/-/utila-0.4.0.tgz",
+ "integrity": "sha512-Z0DbgELS9/L/75wZbro8xAnT50pBVFQZ+hUEueGDU5FN51YSCYM+jdxsfCiHjwNP/4LCDD0i/graKpeBnOXKRA=="
+ },
+ "node_modules/utility-types": {
+ "version": "3.10.0",
+ "resolved": "https://registry.npmjs.org/utility-types/-/utility-types-3.10.0.tgz",
+ "integrity": "sha512-O11mqxmi7wMKCo6HKFt5AhO4BwY3VV68YU07tgxfz8zJTIxr4BpsezN49Ffwy9j3ZpwwJp4fkRwjRzq3uWE6Rg==",
+ "engines": {
+ "node": ">= 4"
+ }
+ },
+ "node_modules/utils-merge": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmjs.org/utils-merge/-/utils-merge-1.0.1.tgz",
+ "integrity": "sha512-pMZTvIkT1d+TFGvDOqodOclx0QWkkgi6Tdoa8gC8ffGAAqz9pzPTZWAybbsHHoED/ztMtkv/VoYTYyShUn81hA==",
+ "engines": {
+ "node": ">= 0.4.0"
+ }
+ },
+ "node_modules/uuid": {
+ "version": "8.3.2",
+ "resolved": "https://registry.npmjs.org/uuid/-/uuid-8.3.2.tgz",
+ "integrity": "sha512-+NYs2QeMWy+GWFOEm9xnn6HCDp0l7QBD7ml8zLUmJ+93Q5NF0NocErnwkTkXVFNiX3/fpC6afS8Dhb/gz7R7eg==",
+ "bin": {
+ "uuid": "dist/bin/uuid"
+ }
+ },
+ "node_modules/value-equal": {
+ "version": "1.0.1",
+ "resolved": "https://registry.npmjs.org/value-equal/-/value-equal-1.0.1.tgz",
+ "integrity": "sha512-NOJ6JZCAWr0zlxZt+xqCHNTEKOsrks2HQd4MqhP1qy4z1SkbEP467eNx6TgDKXMvUOb+OENfJCZwM+16n7fRfw=="
+ },
+ "node_modules/vary": {
+ "version": "1.1.2",
+ "resolved": "https://registry.npmjs.org/vary/-/vary-1.1.2.tgz",
+ "integrity": "sha512-BNGbWLfd0eUPabhkXUVm0j8uuvREyTh5ovRa/dyow/BqAbZJyC+5fU+IzQOzmAKzYqYRAISoRhdQr3eIZ/PXqg==",
+ "engines": {
+ "node": ">= 0.8"
+ }
+ },
+ "node_modules/vfile": {
+ "version": "4.2.1",
+ "resolved": "https://registry.npmjs.org/vfile/-/vfile-4.2.1.tgz",
+ "integrity": "sha512-O6AE4OskCG5S1emQ/4gl8zK586RqA3srz3nfK/Viy0UPToBc5Trp9BVFb1u0CjsKrAWwnpr4ifM/KBXPWwJbCA==",
+ "dependencies": {
+ "@types/unist": "^2.0.0",
+ "is-buffer": "^2.0.0",
+ "unist-util-stringify-position": "^2.0.0",
+ "vfile-message": "^2.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/vfile-location": {
+ "version": "3.2.0",
+ "resolved": "https://registry.npmjs.org/vfile-location/-/vfile-location-3.2.0.tgz",
+ "integrity": "sha512-aLEIZKv/oxuCDZ8lkJGhuhztf/BW4M+iHdCwglA/eWc+vtuRFJj8EtgceYFX4LRjOhCAAiNHsKGssC6onJ+jbA==",
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/vfile-message": {
+ "version": "2.0.4",
+ "resolved": "https://registry.npmjs.org/vfile-message/-/vfile-message-2.0.4.tgz",
+ "integrity": "sha512-DjssxRGkMvifUOJre00juHoP9DPWuzjxKuMDrhNbk2TdaYYBNMStsNhEOt3idrtI12VQYM/1+iM0KOzXi4pxwQ==",
+ "dependencies": {
+ "@types/unist": "^2.0.0",
+ "unist-util-stringify-position": "^2.0.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/unified"
+ }
+ },
+ "node_modules/wait-on": {
+ "version": "6.0.1",
+ "resolved": "https://registry.npmjs.org/wait-on/-/wait-on-6.0.1.tgz",
+ "integrity": "sha512-zht+KASY3usTY5u2LgaNqn/Cd8MukxLGjdcZxT2ns5QzDmTFc4XoWBgC+C/na+sMRZTuVygQoMYwdcVjHnYIVw==",
+ "dependencies": {
+ "axios": "^0.25.0",
+ "joi": "^17.6.0",
+ "lodash": "^4.17.21",
+ "minimist": "^1.2.5",
+ "rxjs": "^7.5.4"
+ },
+ "bin": {
+ "wait-on": "bin/wait-on"
+ },
+ "engines": {
+ "node": ">=10.0.0"
+ }
+ },
+ "node_modules/watchpack": {
+ "version": "2.4.2",
+ "resolved": "https://registry.npmjs.org/watchpack/-/watchpack-2.4.2.tgz",
+ "integrity": "sha512-TnbFSbcOCcDgjZ4piURLCbJ3nJhznVh9kw6F6iokjiFPl8ONxe9A6nMDVXDiNbrSfLILs6vB07F7wLBrwPYzJw==",
+ "dependencies": {
+ "glob-to-regexp": "^0.4.1",
+ "graceful-fs": "^4.1.2"
+ },
+ "engines": {
+ "node": ">=10.13.0"
+ }
+ },
+ "node_modules/wbuf": {
+ "version": "1.7.3",
+ "resolved": "https://registry.npmjs.org/wbuf/-/wbuf-1.7.3.tgz",
+ "integrity": "sha512-O84QOnr0icsbFGLS0O3bI5FswxzRr8/gHwWkDlQFskhSPryQXvrTMxjxGP4+iWYoauLoBvfDpkrOauZ+0iZpDA==",
+ "dependencies": {
+ "minimalistic-assert": "^1.0.0"
+ }
+ },
+ "node_modules/web-namespaces": {
+ "version": "1.1.4",
+ "resolved": "https://registry.npmjs.org/web-namespaces/-/web-namespaces-1.1.4.tgz",
+ "integrity": "sha512-wYxSGajtmoP4WxfejAPIr4l0fVh+jeMXZb08wNc0tMg6xsfZXj3cECqIK0G7ZAqUq0PP8WlMDtaOGVBTAWztNw==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ },
+ "node_modules/webidl-conversions": {
+ "version": "3.0.1",
+ "resolved": "https://registry.npmjs.org/webidl-conversions/-/webidl-conversions-3.0.1.tgz",
+ "integrity": "sha512-2JAn3z8AR6rjK8Sm8orRC0h/bcl/DqL7tRPdGZ4I1CjdF+EaMLmYxBHyXuKL849eucPFhvBoxMsflfOb8kxaeQ=="
+ },
+ "node_modules/webpack": {
+ "version": "5.94.0",
+ "resolved": "https://registry.npmjs.org/webpack/-/webpack-5.94.0.tgz",
+ "integrity": "sha512-KcsGn50VT+06JH/iunZJedYGUJS5FGjow8wb9c0v5n1Om8O1g4L6LjtfxwlXIATopoQu+vOXXa7gYisWxCoPyg==",
+ "dependencies": {
+ "@types/estree": "^1.0.5",
+ "@webassemblyjs/ast": "^1.12.1",
+ "@webassemblyjs/wasm-edit": "^1.12.1",
+ "@webassemblyjs/wasm-parser": "^1.12.1",
+ "acorn": "^8.7.1",
+ "acorn-import-attributes": "^1.9.5",
+ "browserslist": "^4.21.10",
+ "chrome-trace-event": "^1.0.2",
+ "enhanced-resolve": "^5.17.1",
+ "es-module-lexer": "^1.2.1",
+ "eslint-scope": "5.1.1",
+ "events": "^3.2.0",
+ "glob-to-regexp": "^0.4.1",
+ "graceful-fs": "^4.2.11",
+ "json-parse-even-better-errors": "^2.3.1",
+ "loader-runner": "^4.2.0",
+ "mime-types": "^2.1.27",
+ "neo-async": "^2.6.2",
+ "schema-utils": "^3.2.0",
+ "tapable": "^2.1.1",
+ "terser-webpack-plugin": "^5.3.10",
+ "watchpack": "^2.4.1",
+ "webpack-sources": "^3.2.3"
+ },
+ "bin": {
+ "webpack": "bin/webpack.js"
+ },
+ "engines": {
+ "node": ">=10.13.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/webpack"
+ },
+ "peerDependenciesMeta": {
+ "webpack-cli": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/webpack-bundle-analyzer": {
+ "version": "4.8.0",
+ "resolved": "https://registry.npmjs.org/webpack-bundle-analyzer/-/webpack-bundle-analyzer-4.8.0.tgz",
+ "integrity": "sha512-ZzoSBePshOKhr+hd8u6oCkZVwpVaXgpw23ScGLFpR6SjYI7+7iIWYarjN6OEYOfRt8o7ZyZZQk0DuMizJ+LEIg==",
+ "dependencies": {
+ "@discoveryjs/json-ext": "0.5.7",
+ "acorn": "^8.0.4",
+ "acorn-walk": "^8.0.0",
+ "chalk": "^4.1.0",
+ "commander": "^7.2.0",
+ "gzip-size": "^6.0.0",
+ "lodash": "^4.17.20",
+ "opener": "^1.5.2",
+ "sirv": "^1.0.7",
+ "ws": "^7.3.1"
+ },
+ "bin": {
+ "webpack-bundle-analyzer": "lib/bin/analyzer.js"
+ },
+ "engines": {
+ "node": ">= 10.13.0"
+ }
+ },
+ "node_modules/webpack-bundle-analyzer/node_modules/commander": {
+ "version": "7.2.0",
+ "resolved": "https://registry.npmjs.org/commander/-/commander-7.2.0.tgz",
+ "integrity": "sha512-QrWXB+ZQSVPmIWIhtEO9H+gwHaMGYiF5ChvoJ+K9ZGHG/sVsa6yiesAD1GC/x46sET00Xlwo1u49RVVVzvcSkw==",
+ "engines": {
+ "node": ">= 10"
+ }
+ },
+ "node_modules/webpack-dev-middleware": {
+ "version": "5.3.4",
+ "resolved": "https://registry.npmjs.org/webpack-dev-middleware/-/webpack-dev-middleware-5.3.4.tgz",
+ "integrity": "sha512-BVdTqhhs+0IfoeAf7EoH5WE+exCmqGerHfDM0IL096Px60Tq2Mn9MAbnaGUe6HiMa41KMCYF19gyzZmBcq/o4Q==",
+ "dependencies": {
+ "colorette": "^2.0.10",
+ "memfs": "^3.4.3",
+ "mime-types": "^2.1.31",
+ "range-parser": "^1.2.1",
+ "schema-utils": "^4.0.0"
+ },
+ "engines": {
+ "node": ">= 12.13.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/webpack"
+ },
+ "peerDependencies": {
+ "webpack": "^4.0.0 || ^5.0.0"
+ }
+ },
+ "node_modules/webpack-dev-middleware/node_modules/ajv": {
+ "version": "8.12.0",
+ "resolved": "https://registry.npmjs.org/ajv/-/ajv-8.12.0.tgz",
+ "integrity": "sha512-sRu1kpcO9yLtYxBKvqfTeh9KzZEwO3STyX1HT+4CaDzC6HpTGYhIhPIzj9XuKU7KYDwnaeh5hcOwjy1QuJzBPA==",
+ "dependencies": {
+ "fast-deep-equal": "^3.1.1",
+ "json-schema-traverse": "^1.0.0",
+ "require-from-string": "^2.0.2",
+ "uri-js": "^4.2.2"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/epoberezkin"
+ }
+ },
+ "node_modules/webpack-dev-middleware/node_modules/ajv-keywords": {
+ "version": "5.1.0",
+ "resolved": "https://registry.npmjs.org/ajv-keywords/-/ajv-keywords-5.1.0.tgz",
+ "integrity": "sha512-YCS/JNFAUyr5vAuhk1DWm1CBxRHW9LbJ2ozWeemrIqpbsqKjHVxYPyi5GC0rjZIT5JxJ3virVTS8wk4i/Z+krw==",
+ "dependencies": {
+ "fast-deep-equal": "^3.1.3"
+ },
+ "peerDependencies": {
+ "ajv": "^8.8.2"
+ }
+ },
+ "node_modules/webpack-dev-middleware/node_modules/json-schema-traverse": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmjs.org/json-schema-traverse/-/json-schema-traverse-1.0.0.tgz",
+ "integrity": "sha512-NM8/P9n3XjXhIZn1lLhkFaACTOURQXjWhV4BA/RnOv8xvgqtqpAX9IO4mRQxSx1Rlo4tqzeqb0sOlruaOy3dug=="
+ },
+ "node_modules/webpack-dev-middleware/node_modules/mime-db": {
+ "version": "1.52.0",
+ "resolved": "https://registry.npmjs.org/mime-db/-/mime-db-1.52.0.tgz",
+ "integrity": "sha512-sPU4uV7dYlvtWJxwwxHD0PuihVNiE7TyAbQ5SWxDCB9mUYvOgroQOwYQQOKPJ8CIbE+1ETVlOoK1UC2nU3gYvg==",
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/webpack-dev-middleware/node_modules/mime-types": {
+ "version": "2.1.35",
+ "resolved": "https://registry.npmjs.org/mime-types/-/mime-types-2.1.35.tgz",
+ "integrity": "sha512-ZDY+bPm5zTTF+YpCrAU9nK0UgICYPT0QtT1NZWFv4s++TNkcgVaT0g6+4R2uI4MjQjzysHB1zxuWL50hzaeXiw==",
+ "dependencies": {
+ "mime-db": "1.52.0"
+ },
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/webpack-dev-middleware/node_modules/range-parser": {
+ "version": "1.2.1",
+ "resolved": "https://registry.npmjs.org/range-parser/-/range-parser-1.2.1.tgz",
+ "integrity": "sha512-Hrgsx+orqoygnmhFbKaHE6c296J+HTAQXoxEF6gNupROmmGJRoyzfG3ccAveqCBrwr/2yxQ5BVd/GTl5agOwSg==",
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/webpack-dev-middleware/node_modules/schema-utils": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/schema-utils/-/schema-utils-4.0.0.tgz",
+ "integrity": "sha512-1edyXKgh6XnJsJSQ8mKWXnN/BVaIbFMLpouRUrXgVq7WYne5kw3MW7UPhO44uRXQSIpTSXoJbmrR2X0w9kUTyg==",
+ "dependencies": {
+ "@types/json-schema": "^7.0.9",
+ "ajv": "^8.8.0",
+ "ajv-formats": "^2.1.1",
+ "ajv-keywords": "^5.0.0"
+ },
+ "engines": {
+ "node": ">= 12.13.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/webpack"
+ }
+ },
+ "node_modules/webpack-dev-server": {
+ "version": "4.13.1",
+ "resolved": "https://registry.npmjs.org/webpack-dev-server/-/webpack-dev-server-4.13.1.tgz",
+ "integrity": "sha512-5tWg00bnWbYgkN+pd5yISQKDejRBYGEw15RaEEslH+zdbNDxxaZvEAO2WulaSaFKb5n3YG8JXsGaDsut1D0xdA==",
+ "dependencies": {
+ "@types/bonjour": "^3.5.9",
+ "@types/connect-history-api-fallback": "^1.3.5",
+ "@types/express": "^4.17.13",
+ "@types/serve-index": "^1.9.1",
+ "@types/serve-static": "^1.13.10",
+ "@types/sockjs": "^0.3.33",
+ "@types/ws": "^8.5.1",
+ "ansi-html-community": "^0.0.8",
+ "bonjour-service": "^1.0.11",
+ "chokidar": "^3.5.3",
+ "colorette": "^2.0.10",
+ "compression": "^1.7.4",
+ "connect-history-api-fallback": "^2.0.0",
+ "default-gateway": "^6.0.3",
+ "express": "^4.17.3",
+ "graceful-fs": "^4.2.6",
+ "html-entities": "^2.3.2",
+ "http-proxy-middleware": "^2.0.3",
+ "ipaddr.js": "^2.0.1",
+ "launch-editor": "^2.6.0",
+ "open": "^8.0.9",
+ "p-retry": "^4.5.0",
+ "rimraf": "^3.0.2",
+ "schema-utils": "^4.0.0",
+ "selfsigned": "^2.1.1",
+ "serve-index": "^1.9.1",
+ "sockjs": "^0.3.24",
+ "spdy": "^4.0.2",
+ "webpack-dev-middleware": "^5.3.1",
+ "ws": "^8.13.0"
+ },
+ "bin": {
+ "webpack-dev-server": "bin/webpack-dev-server.js"
+ },
+ "engines": {
+ "node": ">= 12.13.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/webpack"
+ },
+ "peerDependencies": {
+ "webpack": "^4.37.0 || ^5.0.0"
+ },
+ "peerDependenciesMeta": {
+ "webpack": {
+ "optional": true
+ },
+ "webpack-cli": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/webpack-dev-server/node_modules/ajv": {
+ "version": "8.12.0",
+ "resolved": "https://registry.npmjs.org/ajv/-/ajv-8.12.0.tgz",
+ "integrity": "sha512-sRu1kpcO9yLtYxBKvqfTeh9KzZEwO3STyX1HT+4CaDzC6HpTGYhIhPIzj9XuKU7KYDwnaeh5hcOwjy1QuJzBPA==",
+ "dependencies": {
+ "fast-deep-equal": "^3.1.1",
+ "json-schema-traverse": "^1.0.0",
+ "require-from-string": "^2.0.2",
+ "uri-js": "^4.2.2"
+ },
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/epoberezkin"
+ }
+ },
+ "node_modules/webpack-dev-server/node_modules/ajv-keywords": {
+ "version": "5.1.0",
+ "resolved": "https://registry.npmjs.org/ajv-keywords/-/ajv-keywords-5.1.0.tgz",
+ "integrity": "sha512-YCS/JNFAUyr5vAuhk1DWm1CBxRHW9LbJ2ozWeemrIqpbsqKjHVxYPyi5GC0rjZIT5JxJ3virVTS8wk4i/Z+krw==",
+ "dependencies": {
+ "fast-deep-equal": "^3.1.3"
+ },
+ "peerDependencies": {
+ "ajv": "^8.8.2"
+ }
+ },
+ "node_modules/webpack-dev-server/node_modules/json-schema-traverse": {
+ "version": "1.0.0",
+ "resolved": "https://registry.npmjs.org/json-schema-traverse/-/json-schema-traverse-1.0.0.tgz",
+ "integrity": "sha512-NM8/P9n3XjXhIZn1lLhkFaACTOURQXjWhV4BA/RnOv8xvgqtqpAX9IO4mRQxSx1Rlo4tqzeqb0sOlruaOy3dug=="
+ },
+ "node_modules/webpack-dev-server/node_modules/schema-utils": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/schema-utils/-/schema-utils-4.0.0.tgz",
+ "integrity": "sha512-1edyXKgh6XnJsJSQ8mKWXnN/BVaIbFMLpouRUrXgVq7WYne5kw3MW7UPhO44uRXQSIpTSXoJbmrR2X0w9kUTyg==",
+ "dependencies": {
+ "@types/json-schema": "^7.0.9",
+ "ajv": "^8.8.0",
+ "ajv-formats": "^2.1.1",
+ "ajv-keywords": "^5.0.0"
+ },
+ "engines": {
+ "node": ">= 12.13.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/webpack"
+ }
+ },
+ "node_modules/webpack-dev-server/node_modules/ws": {
+ "version": "8.13.0",
+ "resolved": "https://registry.npmjs.org/ws/-/ws-8.13.0.tgz",
+ "integrity": "sha512-x9vcZYTrFPC7aSIbj7sRCYo7L/Xb8Iy+pW0ng0wt2vCJv7M9HOMy0UoN3rr+IFC7hb7vXoqS+P9ktyLLLhO+LA==",
+ "engines": {
+ "node": ">=10.0.0"
+ },
+ "peerDependencies": {
+ "bufferutil": "^4.0.1",
+ "utf-8-validate": ">=5.0.2"
+ },
+ "peerDependenciesMeta": {
+ "bufferutil": {
+ "optional": true
+ },
+ "utf-8-validate": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/webpack-merge": {
+ "version": "5.8.0",
+ "resolved": "https://registry.npmjs.org/webpack-merge/-/webpack-merge-5.8.0.tgz",
+ "integrity": "sha512-/SaI7xY0831XwP6kzuwhKWVKDP9t1QY1h65lAFLbZqMPIuYcD9QAW4u9STIbU9kaJbPBB/geU/gLr1wDjOhQ+Q==",
+ "dependencies": {
+ "clone-deep": "^4.0.1",
+ "wildcard": "^2.0.0"
+ },
+ "engines": {
+ "node": ">=10.0.0"
+ }
+ },
+ "node_modules/webpack-sources": {
+ "version": "3.2.3",
+ "resolved": "https://registry.npmjs.org/webpack-sources/-/webpack-sources-3.2.3.tgz",
+ "integrity": "sha512-/DyMEOrDgLKKIG0fmvtz+4dUX/3Ghozwgm6iPp8KRhvn+eQf9+Q7GWxVNMk3+uCPWfdXYC4ExGBckIXdFEfH1w==",
+ "engines": {
+ "node": ">=10.13.0"
+ }
+ },
+ "node_modules/webpack/node_modules/mime-db": {
+ "version": "1.52.0",
+ "resolved": "https://registry.npmjs.org/mime-db/-/mime-db-1.52.0.tgz",
+ "integrity": "sha512-sPU4uV7dYlvtWJxwwxHD0PuihVNiE7TyAbQ5SWxDCB9mUYvOgroQOwYQQOKPJ8CIbE+1ETVlOoK1UC2nU3gYvg==",
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/webpack/node_modules/mime-types": {
+ "version": "2.1.35",
+ "resolved": "https://registry.npmjs.org/mime-types/-/mime-types-2.1.35.tgz",
+ "integrity": "sha512-ZDY+bPm5zTTF+YpCrAU9nK0UgICYPT0QtT1NZWFv4s++TNkcgVaT0g6+4R2uI4MjQjzysHB1zxuWL50hzaeXiw==",
+ "dependencies": {
+ "mime-db": "1.52.0"
+ },
+ "engines": {
+ "node": ">= 0.6"
+ }
+ },
+ "node_modules/webpack/node_modules/schema-utils": {
+ "version": "3.3.0",
+ "resolved": "https://registry.npmjs.org/schema-utils/-/schema-utils-3.3.0.tgz",
+ "integrity": "sha512-pN/yOAvcC+5rQ5nERGuwrjLlYvLTbCibnZ1I7B1LaiAz9BRBlE9GMgE/eqV30P7aJQUf7Ddimy/RsbYO/GrVGg==",
+ "dependencies": {
+ "@types/json-schema": "^7.0.8",
+ "ajv": "^6.12.5",
+ "ajv-keywords": "^3.5.2"
+ },
+ "engines": {
+ "node": ">= 10.13.0"
+ },
+ "funding": {
+ "type": "opencollective",
+ "url": "https://opencollective.com/webpack"
+ }
+ },
+ "node_modules/webpackbar": {
+ "version": "5.0.2",
+ "resolved": "https://registry.npmjs.org/webpackbar/-/webpackbar-5.0.2.tgz",
+ "integrity": "sha512-BmFJo7veBDgQzfWXl/wwYXr/VFus0614qZ8i9znqcl9fnEdiVkdbi0TedLQ6xAK92HZHDJ0QmyQ0fmuZPAgCYQ==",
+ "dependencies": {
+ "chalk": "^4.1.0",
+ "consola": "^2.15.3",
+ "pretty-time": "^1.1.0",
+ "std-env": "^3.0.1"
+ },
+ "engines": {
+ "node": ">=12"
+ },
+ "peerDependencies": {
+ "webpack": "3 || 4 || 5"
+ }
+ },
+ "node_modules/websocket-driver": {
+ "version": "0.7.4",
+ "resolved": "https://registry.npmjs.org/websocket-driver/-/websocket-driver-0.7.4.tgz",
+ "integrity": "sha512-b17KeDIQVjvb0ssuSDF2cYXSg2iztliJ4B9WdsuB6J952qCPKmnVq4DyW5motImXHDC1cBT/1UezrJVsKw5zjg==",
+ "dependencies": {
+ "http-parser-js": ">=0.5.1",
+ "safe-buffer": ">=5.1.0",
+ "websocket-extensions": ">=0.1.1"
+ },
+ "engines": {
+ "node": ">=0.8.0"
+ }
+ },
+ "node_modules/websocket-extensions": {
+ "version": "0.1.4",
+ "resolved": "https://registry.npmjs.org/websocket-extensions/-/websocket-extensions-0.1.4.tgz",
+ "integrity": "sha512-OqedPIGOfsDlo31UNwYbCFMSaO9m9G/0faIHj5/dZFDMFqPTcx6UwqyOy3COEaEOg/9VsGIpdqn62W5KhoKSpg==",
+ "engines": {
+ "node": ">=0.8.0"
+ }
+ },
+ "node_modules/whatwg-url": {
+ "version": "5.0.0",
+ "resolved": "https://registry.npmjs.org/whatwg-url/-/whatwg-url-5.0.0.tgz",
+ "integrity": "sha512-saE57nupxk6v3HY35+jzBwYa0rKSy0XR8JSxZPwgLr7ys0IBzhGviA1/TUGJLmSVqs8pb9AnvICXEuOHLprYTw==",
+ "dependencies": {
+ "tr46": "~0.0.3",
+ "webidl-conversions": "^3.0.0"
+ }
+ },
+ "node_modules/which": {
+ "version": "2.0.2",
+ "resolved": "https://registry.npmjs.org/which/-/which-2.0.2.tgz",
+ "integrity": "sha512-BLI3Tl1TW3Pvl70l3yq3Y64i+awpwXqsGBYWkkqMtnbXgrMD+yj7rhW0kuEDxzJaYXGjEW5ogapKNMEKNMjibA==",
+ "dependencies": {
+ "isexe": "^2.0.0"
+ },
+ "bin": {
+ "node-which": "bin/node-which"
+ },
+ "engines": {
+ "node": ">= 8"
+ }
+ },
+ "node_modules/widest-line": {
+ "version": "4.0.1",
+ "resolved": "https://registry.npmjs.org/widest-line/-/widest-line-4.0.1.tgz",
+ "integrity": "sha512-o0cyEG0e8GPzT4iGHphIOh0cJOV8fivsXxddQasHPHfoZf1ZexrfeA21w2NaEN1RHE+fXlfISmOE8R9N3u3Qig==",
+ "dependencies": {
+ "string-width": "^5.0.1"
+ },
+ "engines": {
+ "node": ">=12"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/wildcard": {
+ "version": "2.0.0",
+ "resolved": "https://registry.npmjs.org/wildcard/-/wildcard-2.0.0.tgz",
+ "integrity": "sha512-JcKqAHLPxcdb9KM49dufGXn2x3ssnfjbcaQdLlfZsL9rH9wgDQjUtDxbo8NE0F6SFvydeu1VhZe7hZuHsB2/pw=="
+ },
+ "node_modules/wrap-ansi": {
+ "version": "8.1.0",
+ "resolved": "https://registry.npmjs.org/wrap-ansi/-/wrap-ansi-8.1.0.tgz",
+ "integrity": "sha512-si7QWI6zUMq56bESFvagtmzMdGOtoxfR+Sez11Mobfc7tm+VkUckk9bW2UeffTGVUbOksxmSw0AA2gs8g71NCQ==",
+ "dependencies": {
+ "ansi-styles": "^6.1.0",
+ "string-width": "^5.0.1",
+ "strip-ansi": "^7.0.1"
+ },
+ "engines": {
+ "node": ">=12"
+ },
+ "funding": {
+ "url": "https://github.com/chalk/wrap-ansi?sponsor=1"
+ }
+ },
+ "node_modules/wrap-ansi/node_modules/ansi-regex": {
+ "version": "6.0.1",
+ "resolved": "https://registry.npmjs.org/ansi-regex/-/ansi-regex-6.0.1.tgz",
+ "integrity": "sha512-n5M855fKb2SsfMIiFFoVrABHJC8QtHwVx+mHWP3QcEqBHYienj5dHSgjbxtC0WEZXYt4wcD6zrQElDPhFuZgfA==",
+ "engines": {
+ "node": ">=12"
+ },
+ "funding": {
+ "url": "https://github.com/chalk/ansi-regex?sponsor=1"
+ }
+ },
+ "node_modules/wrap-ansi/node_modules/ansi-styles": {
+ "version": "6.2.1",
+ "resolved": "https://registry.npmjs.org/ansi-styles/-/ansi-styles-6.2.1.tgz",
+ "integrity": "sha512-bN798gFfQX+viw3R7yrGWRqnrN2oRkEkUjjl4JNn4E8GxxbjtG3FbrEIIY3l8/hrwUwIeCZvi4QuOTP4MErVug==",
+ "engines": {
+ "node": ">=12"
+ },
+ "funding": {
+ "url": "https://github.com/chalk/ansi-styles?sponsor=1"
+ }
+ },
+ "node_modules/wrap-ansi/node_modules/strip-ansi": {
+ "version": "7.0.1",
+ "resolved": "https://registry.npmjs.org/strip-ansi/-/strip-ansi-7.0.1.tgz",
+ "integrity": "sha512-cXNxvT8dFNRVfhVME3JAe98mkXDYN2O1l7jmcwMnOslDeESg1rF/OZMtK0nRAhiari1unG5cD4jG3rapUAkLbw==",
+ "dependencies": {
+ "ansi-regex": "^6.0.1"
+ },
+ "engines": {
+ "node": ">=12"
+ },
+ "funding": {
+ "url": "https://github.com/chalk/strip-ansi?sponsor=1"
+ }
+ },
+ "node_modules/wrappy": {
+ "version": "1.0.2",
+ "resolved": "https://registry.npmjs.org/wrappy/-/wrappy-1.0.2.tgz",
+ "integrity": "sha512-l4Sp/DRseor9wL6EvV2+TuQn63dMkPjZ/sp9XkghTEbV9KlPS1xUsZ3u7/IQO4wxtcFB4bgpQPRcR3QCvezPcQ=="
+ },
+ "node_modules/write-file-atomic": {
+ "version": "3.0.3",
+ "resolved": "https://registry.npmjs.org/write-file-atomic/-/write-file-atomic-3.0.3.tgz",
+ "integrity": "sha512-AvHcyZ5JnSfq3ioSyjrBkH9yW4m7Ayk8/9My/DD9onKeu/94fwrMocemO2QAJFAlnnDN+ZDS+ZjAR5ua1/PV/Q==",
+ "dependencies": {
+ "imurmurhash": "^0.1.4",
+ "is-typedarray": "^1.0.0",
+ "signal-exit": "^3.0.2",
+ "typedarray-to-buffer": "^3.1.5"
+ }
+ },
+ "node_modules/ws": {
+ "version": "7.5.9",
+ "resolved": "https://registry.npmjs.org/ws/-/ws-7.5.9.tgz",
+ "integrity": "sha512-F+P9Jil7UiSKSkppIiD94dN07AwvFixvLIj1Og1Rl9GGMuNipJnV9JzjD6XuqmAeiswGvUmNLjr5cFuXwNS77Q==",
+ "engines": {
+ "node": ">=8.3.0"
+ },
+ "peerDependencies": {
+ "bufferutil": "^4.0.1",
+ "utf-8-validate": "^5.0.2"
+ },
+ "peerDependenciesMeta": {
+ "bufferutil": {
+ "optional": true
+ },
+ "utf-8-validate": {
+ "optional": true
+ }
+ }
+ },
+ "node_modules/xdg-basedir": {
+ "version": "4.0.0",
+ "resolved": "https://registry.npmjs.org/xdg-basedir/-/xdg-basedir-4.0.0.tgz",
+ "integrity": "sha512-PSNhEJDejZYV7h50BohL09Er9VaIefr2LMAf3OEmpCkjOi34eYyQYAXUTjEQtZJTKcF0E2UKTh+osDLsgNim9Q==",
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/xml-js": {
+ "version": "1.6.11",
+ "resolved": "https://registry.npmjs.org/xml-js/-/xml-js-1.6.11.tgz",
+ "integrity": "sha512-7rVi2KMfwfWFl+GpPg6m80IVMWXLRjO+PxTq7V2CDhoGak0wzYzFgUY2m4XJ47OGdXd8eLE8EmwfAmdjw7lC1g==",
+ "dependencies": {
+ "sax": "^1.2.4"
+ },
+ "bin": {
+ "xml-js": "bin/cli.js"
+ }
+ },
+ "node_modules/xtend": {
+ "version": "4.0.2",
+ "resolved": "https://registry.npmjs.org/xtend/-/xtend-4.0.2.tgz",
+ "integrity": "sha512-LKYU1iAXJXUgAXn9URjiu+MWhyUXHsvfp7mcuYm9dSUKK0/CjtrUwFAxD82/mCWbtLsGjFIad0wIsod4zrTAEQ==",
+ "engines": {
+ "node": ">=0.4"
+ }
+ },
+ "node_modules/y18n": {
+ "version": "5.0.8",
+ "resolved": "https://registry.npmjs.org/y18n/-/y18n-5.0.8.tgz",
+ "integrity": "sha512-0pfFzegeDWJHJIAmTLRP2DwHjdF5s7jo9tuztdQxAhINCdvS+3nGINqPd00AphqJR/0LhANUS6/+7SCb98YOfA==",
+ "engines": {
+ "node": ">=10"
+ }
+ },
+ "node_modules/yallist": {
+ "version": "3.1.1",
+ "resolved": "https://registry.npmjs.org/yallist/-/yallist-3.1.1.tgz",
+ "integrity": "sha512-a4UGQaWPH59mOXUYnAG2ewncQS4i4F43Tv3JoAM+s2VDAmS9NsK8GpDMLrCHPksFT7h3K6TOoUNn2pb7RoXx4g=="
+ },
+ "node_modules/yaml": {
+ "version": "1.10.2",
+ "resolved": "https://registry.npmjs.org/yaml/-/yaml-1.10.2.tgz",
+ "integrity": "sha512-r3vXyErRCYJ7wg28yvBY5VSoAF8ZvlcW9/BwUzEtUsjvX/DKs24dIkuwjtuprwJJHsbyUbLApepYTR1BN4uHrg==",
+ "engines": {
+ "node": ">= 6"
+ }
+ },
+ "node_modules/yaml-ast-parser": {
+ "version": "0.0.43",
+ "resolved": "https://registry.npmjs.org/yaml-ast-parser/-/yaml-ast-parser-0.0.43.tgz",
+ "integrity": "sha512-2PTINUwsRqSd+s8XxKaJWQlUuEMHJQyEuh2edBbW8KNJz0SJPwUSD2zRWqezFEdN7IzAgeuYHFUCF7o8zRdZ0A=="
+ },
+ "node_modules/yargs": {
+ "version": "16.2.0",
+ "resolved": "https://registry.npmjs.org/yargs/-/yargs-16.2.0.tgz",
+ "integrity": "sha512-D1mvvtDG0L5ft/jGWkLpG1+m0eQxOfaBvTNELraWj22wSVUMWxZUvYgJYcKh6jGGIkJFhH4IZPQhR4TKpc8mBw==",
+ "dependencies": {
+ "cliui": "^7.0.2",
+ "escalade": "^3.1.1",
+ "get-caller-file": "^2.0.5",
+ "require-directory": "^2.1.1",
+ "string-width": "^4.2.0",
+ "y18n": "^5.0.5",
+ "yargs-parser": "^20.2.2"
+ },
+ "engines": {
+ "node": ">=10"
+ }
+ },
+ "node_modules/yargs-parser": {
+ "version": "20.2.9",
+ "resolved": "https://registry.npmjs.org/yargs-parser/-/yargs-parser-20.2.9.tgz",
+ "integrity": "sha512-y11nGElTIV+CT3Zv9t7VKl+Q3hTQoT9a1Qzezhhl6Rp21gJ/IVTW7Z3y9EWXhuUBC2Shnf+DX0antecpAwSP8w==",
+ "engines": {
+ "node": ">=10"
+ }
+ },
+ "node_modules/yargs/node_modules/emoji-regex": {
+ "version": "8.0.0",
+ "resolved": "https://registry.npmjs.org/emoji-regex/-/emoji-regex-8.0.0.tgz",
+ "integrity": "sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A=="
+ },
+ "node_modules/yargs/node_modules/string-width": {
+ "version": "4.2.3",
+ "resolved": "https://registry.npmjs.org/string-width/-/string-width-4.2.3.tgz",
+ "integrity": "sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==",
+ "dependencies": {
+ "emoji-regex": "^8.0.0",
+ "is-fullwidth-code-point": "^3.0.0",
+ "strip-ansi": "^6.0.1"
+ },
+ "engines": {
+ "node": ">=8"
+ }
+ },
+ "node_modules/yocto-queue": {
+ "version": "0.1.0",
+ "resolved": "https://registry.npmjs.org/yocto-queue/-/yocto-queue-0.1.0.tgz",
+ "integrity": "sha512-rVksvsnNCdJ/ohGc6xgPwyN8eheCxsiLM8mxuE/t/mOVqJewPuO1miLpTHQiRgTKCLexL4MeAFVagts7HmNZ2Q==",
+ "engines": {
+ "node": ">=10"
+ },
+ "funding": {
+ "url": "https://github.com/sponsors/sindresorhus"
+ }
+ },
+ "node_modules/zwitch": {
+ "version": "1.0.5",
+ "resolved": "https://registry.npmjs.org/zwitch/-/zwitch-1.0.5.tgz",
+ "integrity": "sha512-V50KMwwzqJV0NpZIZFwfOD5/lyny3WlSzRiXgA0G7VUnRlqttta1L6UQIHzd6EuBY/cHGfwTIck7w1yH6Q5zUw==",
+ "funding": {
+ "type": "github",
+ "url": "https://github.com/sponsors/wooorm"
+ }
+ }
+ }
+}
diff --git a/docs/package.json b/docs/package.json
new file mode 100644
index 000000000000..5fcdd055efe0
--- /dev/null
+++ b/docs/package.json
@@ -0,0 +1,45 @@
+{
+ "name": "",
+ "version": "0.0.0",
+ "private": true,
+ "scripts": {
+ "docusaurus": "docusaurus",
+ "start": "docusaurus start",
+ "build": "docusaurus build",
+ "swizzle": "docusaurus swizzle",
+ "deploy": "docusaurus deploy",
+ "clear": "docusaurus clear",
+ "serve": "docusaurus serve",
+ "write-translations": "docusaurus write-translations",
+ "write-heading-ids": "docusaurus write-heading-ids"
+ },
+ "dependencies": {
+ "@cmfcmf/docusaurus-search-local": "^1.0.0",
+ "@docusaurus/core": "2.4.0",
+ "@docusaurus/preset-classic": "2.4.0",
+ "@mdx-js/react": "^1.6.22",
+ "clsx": "^1.2.1",
+ "prism-react-renderer": "^1.3.5",
+ "react": "^17.0.2",
+ "react-dom": "^17.0.2",
+ "redocusaurus": "^1.6.1"
+ },
+ "devDependencies": {
+ "@docusaurus/module-type-aliases": "2.4.0"
+ },
+ "browserslist": {
+ "production": [
+ ">0.5%",
+ "not dead",
+ "not op_mini all"
+ ],
+ "development": [
+ "last 1 chrome version",
+ "last 1 firefox version",
+ "last 1 safari version"
+ ]
+ },
+ "engines": {
+ "node": ">=16.14"
+ }
+}
diff --git a/docs/scripts/generate_api_schema.py b/docs/scripts/generate_api_schema.py
new file mode 100644
index 000000000000..cd50a7522d38
--- /dev/null
+++ b/docs/scripts/generate_api_schema.py
@@ -0,0 +1,31 @@
+import yaml
+import os
+
+from fastapi.openapi.utils import get_openapi
+
+DISPATCH_JWT_SECRET = "test"
+
+# Placeholder configuration values so the Dispatch app can be imported
+# below without a real database or encryption key; they are never used
+# to make a connection when only generating the OpenAPI schema.
+DATABASE_HOSTNAME = "foo"
+DATABASE_CREDENTIALS = "bar:bar"
+DISPATCH_ENCRYPTION_KEY = "baz"
+
+os.environ["DISPATCH_JWT_SECRET"] = DISPATCH_JWT_SECRET
+os.environ["DATABASE_HOSTNAME"] = DATABASE_HOSTNAME
+os.environ["DATABASE_CREDENTIALS"] = DATABASE_CREDENTIALS
+os.environ["DISPATCH_ENCRYPTION_KEY"] = DISPATCH_ENCRYPTION_KEY
+
+from dispatch.main import api as app # noqa
+
+
+with open("openapi.yaml", "w") as f:
+ yaml.dump(
+ get_openapi(
+ title=app.title,
+ version=app.version,
+ openapi_version=app.openapi_version,
+ description=app.description,
+ routes=app.routes,
+ ),
+ f,
+ )
diff --git a/docs/scripts/openapi.yaml b/docs/scripts/openapi.yaml
new file mode 100644
index 000000000000..b3d3a4b3dfc1
--- /dev/null
+++ b/docs/scripts/openapi.yaml
@@ -0,0 +1,23240 @@
+components:
+ schemas:
+ AlertCreate:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ external_link:
+ nullable: true
+ title: External Link
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ nullable: false
+ title: Name
+ type: string
+ originator:
+ nullable: true
+ title: Originator
+ type: string
+ title: AlertCreate
+ type: object
+ AlertRead:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ external_link:
+ nullable: true
+ title: External Link
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ nullable: false
+ title: Name
+ type: string
+ originator:
+ nullable: true
+ title: Originator
+ type: string
+ required:
+ - id
+ title: AlertRead
+ type: object
+ AlertUpdate:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ external_link:
+ nullable: true
+ title: External Link
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ nullable: false
+ title: Name
+ type: string
+ originator:
+ nullable: true
+ title: Originator
+ type: string
+ title: AlertUpdate
+ type: object
+ CaseCreate:
+ properties:
+ assignee:
+ $ref: "#/components/schemas/ParticipantUpdate"
+ case_priority:
+ $ref: "#/components/schemas/CasePriorityRead"
+ case_severity:
+ $ref: "#/components/schemas/CaseSeverityRead"
+ case_type:
+ $ref: "#/components/schemas/CaseTypeRead"
+ description:
+ title: Description
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__case__models__ProjectRead"
+ reporter:
+ $ref: "#/components/schemas/ParticipantUpdate"
+ resolution:
+ title: Resolution
+ type: string
+ resolution_reason:
+ $ref: "#/components/schemas/CaseResolutionReason"
+ status:
+ $ref: "#/components/schemas/CaseStatus"
+ tags:
+ default: []
+ items:
+ $ref: "#/components/schemas/TagRead"
+ title: Tags
+ type: array
+ title:
+ title: Title
+ type: string
+ visibility:
+ $ref: "#/components/schemas/Visibility"
+ required:
+ - title
+ title: CaseCreate
+ type: object
+ CasePriorityBase:
+ properties:
+ color:
+ format: color
+ nullable: true
+ title: Color
+ type: string
+ default:
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ page_assignee:
+ title: Page Assignee
+ type: boolean
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ view_order:
+ title: View Order
+ type: integer
+ required:
+ - name
+ title: CasePriorityBase
+ type: object
+ CasePriorityCreate:
+ properties:
+ color:
+ format: color
+ nullable: true
+ title: Color
+ type: string
+ default:
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ page_assignee:
+ title: Page Assignee
+ type: boolean
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ view_order:
+ title: View Order
+ type: integer
+ required:
+ - name
+ title: CasePriorityCreate
+ type: object
+ CasePriorityPagination:
+ properties:
+ items:
+ default: []
+ items:
+ $ref: "#/components/schemas/CasePriorityRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - total
+ title: CasePriorityPagination
+ type: object
+ CasePriorityRead:
+ properties:
+ color:
+ format: color
+ nullable: true
+ title: Color
+ type: string
+ default:
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ page_assignee:
+ title: Page Assignee
+ type: boolean
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ view_order:
+ title: View Order
+ type: integer
+ required:
+ - name
+ - id
+ title: CasePriorityRead
+ type: object
+ CasePriorityUpdate:
+ properties:
+ color:
+ format: color
+ nullable: true
+ title: Color
+ type: string
+ default:
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ page_assignee:
+ title: Page Assignee
+ type: boolean
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ view_order:
+ title: View Order
+ type: integer
+ required:
+ - name
+ title: CasePriorityUpdate
+ type: object
+ CaseReadMinimal:
+ properties:
+ assignee:
+ $ref: "#/components/schemas/ParticipantReadMinimal"
+ case_priority:
+ $ref: "#/components/schemas/CasePriorityRead"
+ case_severity:
+ $ref: "#/components/schemas/CaseSeverityRead"
+ case_type:
+ $ref: "#/components/schemas/CaseTypeRead"
+ closed_at:
+ format: date-time
+ title: Closed At
+ type: string
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ description:
+ title: Description
+ type: string
+ duplicates:
+ default: []
+ items:
+ $ref: "#/components/schemas/CaseReadMinimal"
+ title: Duplicates
+ type: array
+ escalated_at:
+ format: date-time
+ title: Escalated At
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ incidents:
+ default: []
+ items:
+ $ref: "#/components/schemas/IncidentReadMinimal"
+ title: Incidents
+ type: array
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__case__models__ProjectRead"
+ related:
+ default: []
+ items:
+ $ref: "#/components/schemas/CaseReadMinimal"
+ title: Related
+ type: array
+ reported_at:
+ format: date-time
+ title: Reported At
+ type: string
+ resolution:
+ title: Resolution
+ type: string
+ resolution_reason:
+ $ref: "#/components/schemas/CaseResolutionReason"
+ status:
+ $ref: "#/components/schemas/CaseStatus"
+ title:
+ title: Title
+ type: string
+ triage_at:
+ format: date-time
+ title: Triage At
+ type: string
+ visibility:
+ $ref: "#/components/schemas/Visibility"
+ required:
+ - title
+ - id
+ - case_priority
+ - case_severity
+ - case_type
+ - project
+ title: CaseReadMinimal
+ type: object
+ CaseResolutionReason:
+ description: An enumeration.
+ enum:
+ - False Positive
+ - User Acknowledged
+ - Mitigated
+ - Escalated
+ title: CaseResolutionReason
+ type: string
+ CaseSeverityBase:
+ properties:
+ color:
+ format: color
+ nullable: true
+ title: Color
+ type: string
+ default:
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ view_order:
+ title: View Order
+ type: integer
+ required:
+ - name
+ title: CaseSeverityBase
+ type: object
+ CaseSeverityCreate:
+ properties:
+ color:
+ format: color
+ nullable: true
+ title: Color
+ type: string
+ default:
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ view_order:
+ title: View Order
+ type: integer
+ required:
+ - name
+ title: CaseSeverityCreate
+ type: object
+ CaseSeverityPagination:
+ properties:
+ items:
+ default: []
+ items:
+ $ref: "#/components/schemas/CaseSeverityRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - total
+ title: CaseSeverityPagination
+ type: object
+ CaseSeverityRead:
+ properties:
+ color:
+ format: color
+ nullable: true
+ title: Color
+ type: string
+ default:
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ view_order:
+ title: View Order
+ type: integer
+ required:
+ - name
+ - id
+ title: CaseSeverityRead
+ type: object
+ CaseSeverityUpdate:
+ properties:
+ color:
+ format: color
+ nullable: true
+ title: Color
+ type: string
+ default:
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ view_order:
+ title: View Order
+ type: integer
+ required:
+ - name
+ title: CaseSeverityUpdate
+ type: object
+ CaseStatus:
+ description: An enumeration.
+ enum:
+ - New
+ - Triage
+ - Escalated
+ - Stable
+ - Closed
+ title: CaseStatus
+ type: string
+ CaseTypeBase:
+ properties:
+ case_template_document:
+ $ref: "#/components/schemas/dispatch__case__type__models__Document"
+ conversation_target:
+ title: Conversation Target
+ type: string
+ default:
+ default: false
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ exclude_from_metrics:
+ default: false
+ title: Exclude From Metrics
+ type: boolean
+ incident_type:
+ $ref: "#/components/schemas/IncidentType"
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ oncall_service:
+ $ref: "#/components/schemas/dispatch__case__type__models__Service"
+ plugin_metadata:
+ default: []
+ items:
+ $ref: "#/components/schemas/PluginMetadata"
+ title: Plugin Metadata
+ type: array
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ visibility:
+ nullable: true
+ title: Visibility
+ type: string
+ required:
+ - name
+ title: CaseTypeBase
+ type: object
+ CaseTypeCreate:
+ properties:
+ case_template_document:
+ $ref: "#/components/schemas/dispatch__case__type__models__Document"
+ conversation_target:
+ title: Conversation Target
+ type: string
+ default:
+ default: false
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ exclude_from_metrics:
+ default: false
+ title: Exclude From Metrics
+ type: boolean
+ incident_type:
+ $ref: "#/components/schemas/IncidentType"
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ oncall_service:
+ $ref: "#/components/schemas/dispatch__case__type__models__Service"
+ plugin_metadata:
+ default: []
+ items:
+ $ref: "#/components/schemas/PluginMetadata"
+ title: Plugin Metadata
+ type: array
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ visibility:
+ nullable: true
+ title: Visibility
+ type: string
+ required:
+ - name
+ title: CaseTypeCreate
+ type: object
+ CaseTypePagination:
+ properties:
+ items:
+ default: []
+ items:
+ $ref: "#/components/schemas/CaseTypeRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - total
+ title: CaseTypePagination
+ type: object
+ CaseTypeRead:
+ properties:
+ case_template_document:
+ $ref: "#/components/schemas/dispatch__case__type__models__Document"
+ conversation_target:
+ title: Conversation Target
+ type: string
+ default:
+ default: false
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ exclude_from_metrics:
+ default: false
+ title: Exclude From Metrics
+ type: boolean
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ incident_type:
+ $ref: "#/components/schemas/IncidentType"
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ oncall_service:
+ $ref: "#/components/schemas/dispatch__case__type__models__Service"
+ plugin_metadata:
+ default: []
+ items:
+ $ref: "#/components/schemas/PluginMetadata"
+ title: Plugin Metadata
+ type: array
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ visibility:
+ nullable: true
+ title: Visibility
+ type: string
+ required:
+ - name
+ - id
+ title: CaseTypeRead
+ type: object
+ CaseTypeUpdate:
+ properties:
+ case_template_document:
+ $ref: "#/components/schemas/dispatch__case__type__models__Document"
+ conversation_target:
+ title: Conversation Target
+ type: string
+ default:
+ default: false
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ exclude_from_metrics:
+ default: false
+ title: Exclude From Metrics
+ type: boolean
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ incident_type:
+ $ref: "#/components/schemas/IncidentType"
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ oncall_service:
+ $ref: "#/components/schemas/dispatch__case__type__models__Service"
+ plugin_metadata:
+ default: []
+ items:
+ $ref: "#/components/schemas/PluginMetadata"
+ title: Plugin Metadata
+ type: array
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ visibility:
+ nullable: true
+ title: Visibility
+ type: string
+ required:
+ - name
+ title: CaseTypeUpdate
+ type: object
+ CaseUpdate:
+ properties:
+ assignee:
+ $ref: "#/components/schemas/ParticipantUpdate"
+ case_priority:
+ $ref: "#/components/schemas/CasePriorityBase"
+ case_severity:
+ $ref: "#/components/schemas/CaseSeverityBase"
+ case_type:
+ $ref: "#/components/schemas/CaseTypeBase"
+ description:
+ title: Description
+ type: string
+ duplicates:
+ default: []
+ items:
+ $ref: "#/components/schemas/dispatch__case__models__CaseRead"
+ title: Duplicates
+ type: array
+ escalated_at:
+ format: date-time
+ title: Escalated At
+ type: string
+ incidents:
+ default: []
+ items:
+ $ref: "#/components/schemas/IncidentReadMinimal"
+ title: Incidents
+ type: array
+ related:
+ default: []
+ items:
+ $ref: "#/components/schemas/dispatch__case__models__CaseRead"
+ title: Related
+ type: array
+ reported_at:
+ format: date-time
+ title: Reported At
+ type: string
+ resolution:
+ title: Resolution
+ type: string
+ resolution_reason:
+ $ref: "#/components/schemas/CaseResolutionReason"
+ status:
+ $ref: "#/components/schemas/CaseStatus"
+ tags:
+ default: []
+ items:
+ $ref: "#/components/schemas/TagRead"
+ title: Tags
+ type: array
+ title:
+ title: Title
+ type: string
+ triage_at:
+ format: date-time
+ title: Triage At
+ type: string
+ visibility:
+ $ref: "#/components/schemas/Visibility"
+ required:
+ - title
+ title: CaseUpdate
+ type: object
+ ConferenceRead:
+ properties:
+ conference_challenge:
+ nullable: true
+ title: Conference Challenge
+ type: string
+ conference_id:
+ nullable: true
+ title: Conference Id
+ type: string
+ description:
+ nullable: true
+ title: Description
+ type: string
+ resource_id:
+ nullable: true
+ title: Resource Id
+ type: string
+ resource_type:
+ nullable: true
+ title: Resource Type
+ type: string
+ weblink:
+ nullable: true
+ title: Weblink
+ type: string
+ title: ConferenceRead
+ type: object
+ ConversationRead:
+ properties:
+ channel_id:
+ nullable: true
+ title: Channel Id
+ type: string
+ description:
+ nullable: true
+ title: Description
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ resource_id:
+ nullable: true
+ title: Resource Id
+ type: string
+ resource_type:
+ nullable: true
+ title: Resource Type
+ type: string
+ thread_id:
+ nullable: true
+ title: Thread Id
+ type: string
+ weblink:
+ nullable: true
+ title: Weblink
+ type: string
+ required:
+ - id
+ title: ConversationRead
+ type: object
+ DefinitionCreate:
+ properties:
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ source:
+ nullable: true
+ title: Source
+ type: string
+ terms:
+ default: []
+ items:
+ $ref: "#/components/schemas/DefinitionTerm"
+ title: Terms
+ type: array
+ text:
+ title: Text
+ type: string
+ required:
+ - text
+ - project
+ title: DefinitionCreate
+ type: object
+ DefinitionPagination:
+ properties:
+ items:
+ default: []
+ items:
+ $ref: "#/components/schemas/DefinitionRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - total
+ title: DefinitionPagination
+ type: object
+ DefinitionRead:
+ properties:
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ source:
+ nullable: true
+ title: Source
+ type: string
+ terms:
+ items:
+ $ref: "#/components/schemas/DefinitionTerm"
+ title: Terms
+ type: array
+ text:
+ title: Text
+ type: string
+ required:
+ - text
+ - id
+ title: DefinitionRead
+ type: object
+ DefinitionTerm:
+ properties:
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ text:
+ title: Text
+ type: string
+ title: DefinitionTerm
+ type: object
+ DefinitionUpdate:
+ properties:
+ source:
+ nullable: true
+ title: Source
+ type: string
+ terms:
+ default: []
+ items:
+ $ref: "#/components/schemas/DefinitionTerm"
+ title: Terms
+ type: array
+ text:
+ title: Text
+ type: string
+ required:
+ - text
+ title: DefinitionUpdate
+ type: object
+ DocumentCreate:
+ properties:
+ created_at:
+ format: date-time
+ nullable: true
+ title: Created At
+ type: string
+ description:
+ nullable: true
+ title: Description
+ type: string
+ evergreen:
+ default: false
+ title: Evergreen
+ type: boolean
+ evergreen_last_reminder_at:
+ format: date-time
+ nullable: true
+ title: Evergreen Last Reminder At
+ type: string
+ evergreen_owner:
+ format: email
+ title: Evergreen Owner
+ type: string
+ evergreen_reminder_interval:
+ default: 90
+ title: Evergreen Reminder Interval
+ type: integer
+ filters:
+ default: []
+ items:
+ $ref: "#/components/schemas/SearchFilterRead"
+ title: Filters
+ type: array
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ resource_id:
+ nullable: true
+ title: Resource Id
+ type: string
+ resource_type:
+ nullable: true
+ title: Resource Type
+ type: string
+ updated_at:
+ format: date-time
+ nullable: true
+ title: Updated At
+ type: string
+ weblink:
+ nullable: true
+ title: Weblink
+ type: string
+ required:
+ - name
+ - project
+ title: DocumentCreate
+ type: object
+ DocumentPagination:
+ properties:
+ items:
+ default: []
+ items:
+ $ref: "#/components/schemas/DocumentRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - total
+ title: DocumentPagination
+ type: object
+ DocumentRead:
+ properties:
+ created_at:
+ format: date-time
+ nullable: true
+ title: Created At
+ type: string
+ description:
+ nullable: true
+ title: Description
+ type: string
+ evergreen:
+ default: false
+ title: Evergreen
+ type: boolean
+ evergreen_last_reminder_at:
+ format: date-time
+ nullable: true
+ title: Evergreen Last Reminder At
+ type: string
+ evergreen_owner:
+ format: email
+ title: Evergreen Owner
+ type: string
+ evergreen_reminder_interval:
+ default: 90
+ title: Evergreen Reminder Interval
+ type: integer
+ filters:
+ default: []
+ items:
+ $ref: "#/components/schemas/SearchFilterRead"
+ title: Filters
+ type: array
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ resource_id:
+ nullable: true
+ title: Resource Id
+ type: string
+ resource_type:
+ nullable: true
+ title: Resource Type
+ type: string
+ updated_at:
+ format: date-time
+ nullable: true
+ title: Updated At
+ type: string
+ weblink:
+ nullable: true
+ title: Weblink
+ type: string
+ required:
+ - name
+ - id
+ title: DocumentRead
+ type: object
+ DocumentUpdate:
+ properties:
+ created_at:
+ format: date-time
+ nullable: true
+ title: Created At
+ type: string
+ description:
+ nullable: true
+ title: Description
+ type: string
+ evergreen:
+ default: false
+ title: Evergreen
+ type: boolean
+ evergreen_last_reminder_at:
+ format: date-time
+ nullable: true
+ title: Evergreen Last Reminder At
+ type: string
+ evergreen_owner:
+ format: email
+ title: Evergreen Owner
+ type: string
+ evergreen_reminder_interval:
+ default: 90
+ title: Evergreen Reminder Interval
+ type: integer
+ filters:
+ items:
+ $ref: "#/components/schemas/SearchFilterRead"
+ title: Filters
+ type: array
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ resource_id:
+ nullable: true
+ title: Resource Id
+ type: string
+ resource_type:
+ nullable: true
+ title: Resource Type
+ type: string
+ updated_at:
+ format: date-time
+ nullable: true
+ title: Updated At
+ type: string
+ weblink:
+ nullable: true
+ title: Weblink
+ type: string
+ required:
+ - name
+ title: DocumentUpdate
+ type: object
+ EntityCreate:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ entity_type:
+ $ref: "#/components/schemas/EntityTypeCreate"
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ nullable: true
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ source:
+ nullable: true
+ title: Source
+ type: string
+ value:
+ nullable: true
+ title: Value
+ type: string
+ required:
+ - entity_type
+ - project
+ title: EntityCreate
+ type: object
+ EntityPagination:
+ properties:
+ items:
+ items:
+ $ref: "#/components/schemas/EntityRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - items
+ - total
+ title: EntityPagination
+ type: object
+ EntityRead:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ entity_type:
+ $ref: "#/components/schemas/EntityTypeRead"
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ nullable: true
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ source:
+ nullable: true
+ title: Source
+ type: string
+ value:
+ nullable: true
+ title: Value
+ type: string
+ required:
+ - id
+ - project
+ title: EntityRead
+ type: object
+ EntityTypeCreate:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ global_find:
+ title: Global Find
+ type: boolean
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ jpath:
+ nullable: true
+ title: Jpath
+ type: string
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ regular_expression:
+ nullable: true
+ title: Regular Expression
+ type: string
+ required:
+ - project
+ title: EntityTypeCreate
+ type: object
+ EntityTypePagination:
+ properties:
+ items:
+ items:
+ $ref: "#/components/schemas/EntityTypeRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - items
+ - total
+ title: EntityTypePagination
+ type: object
+ EntityTypeRead:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ global_find:
+ title: Global Find
+ type: boolean
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ jpath:
+ nullable: true
+ title: Jpath
+ type: string
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ regular_expression:
+ nullable: true
+ title: Regular Expression
+ type: string
+ required:
+ - id
+ - project
+ title: EntityTypeRead
+ type: object
+ EntityTypeUpdate:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ global_find:
+ title: Global Find
+ type: boolean
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ jpath:
+ nullable: true
+ title: Jpath
+ type: string
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ regular_expression:
+ nullable: true
+ title: Regular Expression
+ type: string
+ title: EntityTypeUpdate
+ type: object
+ EntityUpdate:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ entity_type:
+ $ref: "#/components/schemas/EntityTypeUpdate"
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ nullable: true
+ title: Name
+ type: string
+ source:
+ nullable: true
+ title: Source
+ type: string
+ value:
+ nullable: true
+ title: Value
+ type: string
+ title: EntityUpdate
+ type: object
+ ErrorMessage:
+ properties:
+ msg:
+ title: Msg
+ type: string
+ required:
+ - msg
+ title: ErrorMessage
+ type: object
+ ErrorResponse:
+ properties:
+ detail:
+ items:
+ $ref: "#/components/schemas/ErrorMessage"
+ title: Detail
+ type: array
+ title: ErrorResponse
+ type: object
+ EventRead:
+ properties:
+ description:
+ title: Description
+ type: string
+ details:
+ title: Details
+ type: object
+ ended_at:
+ format: date-time
+ title: Ended At
+ type: string
+ source:
+ title: Source
+ type: string
+ started_at:
+ format: date-time
+ title: Started At
+ type: string
+ uuid:
+ format: uuid
+ title: Uuid
+ type: string
+ required:
+ - uuid
+ - started_at
+ - ended_at
+ - source
+ - description
+ title: EventRead
+ type: object
+ ExecutiveReportCreate:
+ properties:
+ current_status:
+ title: Current Status
+ type: string
+ next_steps:
+ title: Next Steps
+ type: string
+ overview:
+ title: Overview
+ type: string
+ required:
+ - current_status
+ - overview
+ - next_steps
+ title: ExecutiveReportCreate
+ type: object
+ FeedbackCreate:
+ properties:
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ feedback:
+ nullable: true
+ title: Feedback
+ type: string
+ incident:
+ $ref: "#/components/schemas/IncidentReadMinimal"
+ participant:
+ $ref: "#/components/schemas/ParticipantRead"
+ rating:
+ allOf:
+ - $ref: "#/components/schemas/FeedbackRating"
+ default: Very satisfied
+ title: FeedbackCreate
+ type: object
+ FeedbackPagination:
+ properties:
+ items:
+ items:
+ $ref: "#/components/schemas/FeedbackRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - items
+ - total
+ title: FeedbackPagination
+ type: object
+ FeedbackRating:
+ description: An enumeration.
+ enum:
+ - Very satisfied
+ - Somewhat satisfied
+ - Neither satisfied nor dissatisfied
+ - Somewhat dissatisfied
+ - Very dissatisfied
+ title: FeedbackRating
+ type: string
+ FeedbackRead:
+ properties:
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ feedback:
+ nullable: true
+ title: Feedback
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ incident:
+ $ref: "#/components/schemas/IncidentReadMinimal"
+ participant:
+ $ref: "#/components/schemas/ParticipantRead"
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ rating:
+ allOf:
+ - $ref: "#/components/schemas/FeedbackRating"
+ default: Very satisfied
+ required:
+ - id
+ title: FeedbackRead
+ type: object
+ FeedbackUpdate:
+ properties:
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ feedback:
+ nullable: true
+ title: Feedback
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ incident:
+ $ref: "#/components/schemas/IncidentReadMinimal"
+ participant:
+ $ref: "#/components/schemas/ParticipantRead"
+ rating:
+ allOf:
+ - $ref: "#/components/schemas/FeedbackRating"
+ default: Very satisfied
+ title: FeedbackUpdate
+ type: object
+ GroupRead:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ email:
+ format: email
+ title: Email
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ resource_id:
+ nullable: true
+ title: Resource Id
+ type: string
+ resource_type:
+ nullable: true
+ title: Resource Type
+ type: string
+ weblink:
+ nullable: true
+ title: Weblink
+ type: string
+ required:
+ - name
+ - email
+ - id
+ title: GroupRead
+ type: object
+ HTTPValidationError:
+ properties:
+ detail:
+ items:
+ $ref: "#/components/schemas/ValidationError"
+ title: Detail
+ type: array
+ title: HTTPValidationError
+ type: object
+ IncidentCostCreate:
+ properties:
+ amount:
+ default: 0
+ title: Amount
+ type: number
+ incident_cost_type:
+ $ref: "#/components/schemas/IncidentCostTypeRead"
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ required:
+ - incident_cost_type
+ - project
+ title: IncidentCostCreate
+ type: object
+ IncidentCostPagination:
+ properties:
+ items:
+ default: []
+ items:
+ $ref: "#/components/schemas/IncidentCostRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - total
+ title: IncidentCostPagination
+ type: object
+ IncidentCostRead:
+ properties:
+ amount:
+ default: 0
+ title: Amount
+ type: number
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ incident_cost_type:
+ $ref: "#/components/schemas/IncidentCostTypeRead"
+ required:
+ - id
+ - incident_cost_type
+ title: IncidentCostRead
+ type: object
+ IncidentCostTypeCreate:
+ properties:
+ category:
+ nullable: true
+ title: Category
+ type: string
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ default:
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ details:
+ default: {}
+ title: Details
+ type: object
+ editable:
+ title: Editable
+ type: boolean
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ required:
+ - name
+ - project
+ title: IncidentCostTypeCreate
+ type: object
+ IncidentCostTypePagination:
+ properties:
+ items:
+ default: []
+ items:
+ $ref: "#/components/schemas/IncidentCostTypeRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - total
+ title: IncidentCostTypePagination
+ type: object
+ IncidentCostTypeRead:
+ properties:
+ category:
+ nullable: true
+ title: Category
+ type: string
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ default:
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ details:
+ default: {}
+ title: Details
+ type: object
+ editable:
+ title: Editable
+ type: boolean
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ required:
+ - name
+ - id
+ title: IncidentCostTypeRead
+ type: object
+ IncidentCostTypeUpdate:
+ properties:
+ category:
+ nullable: true
+ title: Category
+ type: string
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ default:
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ details:
+ default: {}
+ title: Details
+ type: object
+ editable:
+ title: Editable
+ type: boolean
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ required:
+ - name
+ title: IncidentCostTypeUpdate
+ type: object
+ IncidentCostUpdate:
+ properties:
+ amount:
+ default: 0
+ title: Amount
+ type: number
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ incident_cost_type:
+ $ref: "#/components/schemas/IncidentCostTypeRead"
+ required:
+ - incident_cost_type
+ title: IncidentCostUpdate
+ type: object
+ IncidentCreate:
+ properties:
+ commander:
+ $ref: "#/components/schemas/ParticipantUpdate"
+ description:
+ title: Description
+ type: string
+ incident_priority:
+ $ref: "#/components/schemas/IncidentPriorityCreate"
+ incident_severity:
+ $ref: "#/components/schemas/IncidentSeverityCreate"
+ incident_type:
+ $ref: "#/components/schemas/IncidentTypeCreate"
+ project:
+ $ref: "#/components/schemas/dispatch__incident__models__ProjectRead"
+ reporter:
+ $ref: "#/components/schemas/ParticipantUpdate"
+ resolution:
+ title: Resolution
+ type: string
+ status:
+ $ref: "#/components/schemas/IncidentStatus"
+ tags:
+ default: []
+ items:
+ $ref: "#/components/schemas/TagRead"
+ title: Tags
+ type: array
+ title:
+ title: Title
+ type: string
+ visibility:
+ $ref: "#/components/schemas/Visibility"
+ required:
+ - title
+ - description
+ title: IncidentCreate
+ type: object
+ IncidentPriorityBase:
+ properties:
+ color:
+ format: color
+ nullable: true
+ title: Color
+ type: string
+ default:
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ executive_report_reminder:
+ title: Executive Report Reminder
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ page_commander:
+ title: Page Commander
+ type: boolean
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ tactical_report_reminder:
+ title: Tactical Report Reminder
+ type: integer
+ view_order:
+ title: View Order
+ type: integer
+ required:
+ - name
+ title: IncidentPriorityBase
+ type: object
+ IncidentPriorityCreate:
+ properties:
+ color:
+ format: color
+ nullable: true
+ title: Color
+ type: string
+ default:
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ executive_report_reminder:
+ title: Executive Report Reminder
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ page_commander:
+ title: Page Commander
+ type: boolean
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ tactical_report_reminder:
+ title: Tactical Report Reminder
+ type: integer
+ view_order:
+ title: View Order
+ type: integer
+ required:
+ - name
+ title: IncidentPriorityCreate
+ type: object
+ IncidentPriorityPagination:
+ properties:
+ items:
+ default: []
+ items:
+ $ref: "#/components/schemas/IncidentPriorityRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - total
+ title: IncidentPriorityPagination
+ type: object
+ IncidentPriorityRead:
+ properties:
+ color:
+ format: color
+ nullable: true
+ title: Color
+ type: string
+ default:
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ executive_report_reminder:
+ title: Executive Report Reminder
+ type: integer
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ page_commander:
+ title: Page Commander
+ type: boolean
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ tactical_report_reminder:
+ title: Tactical Report Reminder
+ type: integer
+ view_order:
+ title: View Order
+ type: integer
+ required:
+ - name
+ - id
+ title: IncidentPriorityRead
+ type: object
+ IncidentPriorityReadMinimal:
+ properties:
+ color:
+ format: color
+ nullable: true
+ title: Color
+ type: string
+ default:
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ executive_report_reminder:
+ title: Executive Report Reminder
+ type: integer
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ page_commander:
+ title: Page Commander
+ type: boolean
+ tactical_report_reminder:
+ title: Tactical Report Reminder
+ type: integer
+ view_order:
+ title: View Order
+ type: integer
+ required:
+ - id
+ - name
+ title: IncidentPriorityReadMinimal
+ type: object
+ IncidentPriorityUpdate:
+ properties:
+ color:
+ format: color
+ nullable: true
+ title: Color
+ type: string
+ default:
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ executive_report_reminder:
+ title: Executive Report Reminder
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ page_commander:
+ title: Page Commander
+ type: boolean
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ tactical_report_reminder:
+ title: Tactical Report Reminder
+ type: integer
+ view_order:
+ title: View Order
+ type: integer
+ required:
+ - name
+ title: IncidentPriorityUpdate
+ type: object
+ IncidentRead:
+ properties:
+ cases:
+ default: []
+ items:
+ $ref: "#/components/schemas/dispatch__incident__models__CaseRead"
+ title: Cases
+ type: array
+ closed_at:
+ format: date-time
+ title: Closed At
+ type: string
+ commander:
+ $ref: "#/components/schemas/ParticipantRead"
+ commanders_location:
+ title: Commanders Location
+ type: string
+ conference:
+ $ref: "#/components/schemas/ConferenceRead"
+ conversation:
+ $ref: "#/components/schemas/ConversationRead"
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ description:
+ title: Description
+ type: string
+ documents:
+ default: []
+ items:
+ $ref: "#/components/schemas/DocumentRead"
+ title: Documents
+ type: array
+ duplicates:
+ default: []
+ items:
+ $ref: "#/components/schemas/IncidentReadMinimal"
+ title: Duplicates
+ type: array
+ events:
+ default: []
+ items:
+ $ref: "#/components/schemas/EventRead"
+ title: Events
+ type: array
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ incident_costs:
+ default: []
+ items:
+ $ref: "#/components/schemas/IncidentCostRead"
+ title: Incident Costs
+ type: array
+ incident_priority:
+ $ref: "#/components/schemas/IncidentPriorityRead"
+ incident_severity:
+ $ref: "#/components/schemas/IncidentSeverityRead"
+ incident_type:
+ $ref: "#/components/schemas/IncidentTypeRead"
+ last_executive_report:
+ $ref: "#/components/schemas/ReportRead"
+ last_tactical_report:
+ $ref: "#/components/schemas/ReportRead"
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ participants:
+ default: []
+ items:
+ $ref: "#/components/schemas/ParticipantRead"
+ title: Participants
+ type: array
+ participants_location:
+ title: Participants Location
+ type: string
+ participants_team:
+ title: Participants Team
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__incident__models__ProjectRead"
+ reported_at:
+ format: date-time
+ title: Reported At
+ type: string
+ reporter:
+ $ref: "#/components/schemas/ParticipantRead"
+ reporters_location:
+ title: Reporters Location
+ type: string
+ resolution:
+ title: Resolution
+ type: string
+ stable_at:
+ format: date-time
+ title: Stable At
+ type: string
+ status:
+ $ref: "#/components/schemas/IncidentStatus"
+ storage:
+ $ref: "#/components/schemas/StorageRead"
+ tags:
+ default: []
+ items:
+ $ref: "#/components/schemas/TagRead"
+ title: Tags
+ type: array
+ tasks:
+ default: []
+ items:
+ $ref: "#/components/schemas/dispatch__incident__models__TaskRead"
+ title: Tasks
+ type: array
+ terms:
+ default: []
+ items:
+ $ref: "#/components/schemas/TermRead"
+ title: Terms
+ type: array
+ ticket:
+ $ref: "#/components/schemas/TicketRead"
+ title:
+ title: Title
+ type: string
+ total_cost:
+ title: Total Cost
+ type: number
+ visibility:
+ $ref: "#/components/schemas/Visibility"
+ workflow_instances:
+ default: []
+ items:
+ $ref: "#/components/schemas/WorkflowInstanceRead"
+ title: Workflow Instances
+ type: array
+ required:
+ - title
+ - description
+ - id
+ - incident_priority
+ - incident_severity
+ - incident_type
+ - project
+ title: IncidentRead
+ type: object
+ IncidentReadMinimal:
+ properties:
+ closed_at:
+ format: date-time
+ title: Closed At
+ type: string
+ commander:
+ $ref: "#/components/schemas/ParticipantReadMinimal"
+ commanders_location:
+ title: Commanders Location
+ type: string
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ description:
+ title: Description
+ type: string
+ duplicates:
+ default: []
+ items:
+ $ref: "#/components/schemas/IncidentReadMinimal"
+ title: Duplicates
+ type: array
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ incident_costs:
+ default: []
+ items:
+ $ref: "#/components/schemas/IncidentCostRead"
+ title: Incident Costs
+ type: array
+ incident_priority:
+ $ref: "#/components/schemas/IncidentPriorityReadMinimal"
+ incident_severity:
+ $ref: "#/components/schemas/IncidentSeverityReadMinimal"
+ incident_type:
+ $ref: "#/components/schemas/IncidentTypeReadMinimal"
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ participants_location:
+ title: Participants Location
+ type: string
+ participants_team:
+ title: Participants Team
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__incident__models__ProjectRead"
+ reported_at:
+ format: date-time
+ title: Reported At
+ type: string
+ reporter:
+ $ref: "#/components/schemas/ParticipantReadMinimal"
+ reporters_location:
+ title: Reporters Location
+ type: string
+ resolution:
+ title: Resolution
+ type: string
+ stable_at:
+ format: date-time
+ title: Stable At
+ type: string
+ status:
+ $ref: "#/components/schemas/IncidentStatus"
+ tags:
+ default: []
+ items:
+ $ref: "#/components/schemas/TagRead"
+ title: Tags
+ type: array
+ title:
+ title: Title
+ type: string
+ total_cost:
+ title: Total Cost
+ type: number
+ visibility:
+ $ref: "#/components/schemas/Visibility"
+ required:
+ - title
+ - description
+ - id
+ - incident_priority
+ - incident_severity
+ - incident_type
+ - project
+ title: IncidentReadMinimal
+ type: object
+ IncidentRoleCreateUpdate:
+ properties:
+ enabled:
+ title: Enabled
+ type: boolean
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ incident_priorities:
+ items:
+ $ref: "#/components/schemas/IncidentPriorityRead"
+ title: Incident Priorities
+ type: array
+ incident_types:
+ items:
+ $ref: "#/components/schemas/IncidentTypeRead"
+ title: Incident Types
+ type: array
+ individual:
+ $ref: "#/components/schemas/IndividualContactRead"
+ order:
+ exclusiveMinimum: 0.0
+ title: Order
+ type: integer
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ service:
+ $ref: "#/components/schemas/ServiceRead"
+ tags:
+ items:
+ $ref: "#/components/schemas/TagRead"
+ title: Tags
+ type: array
+ title: IncidentRoleCreateUpdate
+ type: object
+ IncidentRoleRead:
+ properties:
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ incident_priorities:
+ items:
+ $ref: "#/components/schemas/IncidentPriorityRead"
+ title: Incident Priorities
+ type: array
+ incident_types:
+ items:
+ $ref: "#/components/schemas/IncidentTypeRead"
+ title: Incident Types
+ type: array
+ individual:
+ $ref: "#/components/schemas/IndividualContactRead"
+ order:
+ exclusiveMinimum: 0.0
+ title: Order
+ type: integer
+ role:
+ $ref: "#/components/schemas/ParticipantRoleType"
+ service:
+ $ref: "#/components/schemas/ServiceRead"
+ tags:
+ items:
+ $ref: "#/components/schemas/TagRead"
+ title: Tags
+ type: array
+ updated_at:
+ format: date-time
+ title: Updated At
+ type: string
+ required:
+ - id
+ - role
+ title: IncidentRoleRead
+ type: object
+ IncidentRoles:
+ properties:
+ policies:
+ default: []
+ items:
+ $ref: "#/components/schemas/IncidentRoleRead"
+ title: Policies
+ type: array
+ title: IncidentRoles
+ type: object
+ IncidentRolesCreateUpdate:
+ properties:
+ policies:
+ items:
+ $ref: "#/components/schemas/IncidentRoleCreateUpdate"
+ title: Policies
+ type: array
+ required:
+ - policies
+ title: IncidentRolesCreateUpdate
+ type: object
+ IncidentSeverityBase:
+ properties:
+ color:
+ format: color
+ nullable: true
+ title: Color
+ type: string
+ default:
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ view_order:
+ title: View Order
+ type: integer
+ required:
+ - name
+ title: IncidentSeverityBase
+ type: object
+ IncidentSeverityCreate:
+ properties:
+ color:
+ format: color
+ nullable: true
+ title: Color
+ type: string
+ default:
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ view_order:
+ title: View Order
+ type: integer
+ required:
+ - name
+ title: IncidentSeverityCreate
+ type: object
+ IncidentSeverityPagination:
+ properties:
+ items:
+ default: []
+ items:
+ $ref: "#/components/schemas/IncidentSeverityRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - total
+ title: IncidentSeverityPagination
+ type: object
+ IncidentSeverityRead:
+ properties:
+ color:
+ format: color
+ nullable: true
+ title: Color
+ type: string
+ default:
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ view_order:
+ title: View Order
+ type: integer
+ required:
+ - name
+ - id
+ title: IncidentSeverityRead
+ type: object
+ IncidentSeverityReadMinimal:
+ properties:
+ color:
+ format: color
+ nullable: true
+ title: Color
+ type: string
+ default:
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ required:
+ - id
+ - name
+ title: IncidentSeverityReadMinimal
+ type: object
+ IncidentSeverityUpdate:
+ properties:
+ color:
+ format: color
+ nullable: true
+ title: Color
+ type: string
+ default:
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ view_order:
+ title: View Order
+ type: integer
+ required:
+ - name
+ title: IncidentSeverityUpdate
+ type: object
+ IncidentStatus:
+ description: An enumeration.
+ enum:
+ - Active
+ - Stable
+ - Closed
+ title: IncidentStatus
+ type: string
+ IncidentType:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ visibility:
+ nullable: true
+ title: Visibility
+ type: string
+ required:
+ - id
+ - name
+ title: IncidentType
+ type: object
+ IncidentTypeBase:
+ properties:
+ default:
+ default: false
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ exclude_from_metrics:
+ default: false
+ title: Exclude From Metrics
+ type: boolean
+ executive_template_document:
+ $ref: "#/components/schemas/dispatch__incident__type__models__Document"
+ incident_template_document:
+ $ref: "#/components/schemas/dispatch__incident__type__models__Document"
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ plugin_metadata:
+ default: []
+ items:
+ $ref: "#/components/schemas/PluginMetadata"
+ title: Plugin Metadata
+ type: array
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ review_template_document:
+ $ref: "#/components/schemas/dispatch__incident__type__models__Document"
+ tracking_template_document:
+ $ref: "#/components/schemas/dispatch__incident__type__models__Document"
+ visibility:
+ nullable: true
+ title: Visibility
+ type: string
+ required:
+ - name
+ title: IncidentTypeBase
+ type: object
+ IncidentTypeCreate:
+ properties:
+ default:
+ default: false
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ exclude_from_metrics:
+ default: false
+ title: Exclude From Metrics
+ type: boolean
+ executive_template_document:
+ $ref: "#/components/schemas/dispatch__incident__type__models__Document"
+ incident_template_document:
+ $ref: "#/components/schemas/dispatch__incident__type__models__Document"
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ plugin_metadata:
+ default: []
+ items:
+ $ref: "#/components/schemas/PluginMetadata"
+ title: Plugin Metadata
+ type: array
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ review_template_document:
+ $ref: "#/components/schemas/dispatch__incident__type__models__Document"
+ tracking_template_document:
+ $ref: "#/components/schemas/dispatch__incident__type__models__Document"
+ visibility:
+ nullable: true
+ title: Visibility
+ type: string
+ required:
+ - name
+ title: IncidentTypeCreate
+ type: object
+ IncidentTypePagination:
+ properties:
+ items:
+ default: []
+ items:
+ $ref: "#/components/schemas/IncidentTypeRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - total
+ title: IncidentTypePagination
+ type: object
+ IncidentTypeRead:
+ properties:
+ default:
+ default: false
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ exclude_from_metrics:
+ default: false
+ title: Exclude From Metrics
+ type: boolean
+ executive_template_document:
+ $ref: "#/components/schemas/dispatch__incident__type__models__Document"
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ incident_template_document:
+ $ref: "#/components/schemas/dispatch__incident__type__models__Document"
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ plugin_metadata:
+ default: []
+ items:
+ $ref: "#/components/schemas/PluginMetadata"
+ title: Plugin Metadata
+ type: array
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ review_template_document:
+ $ref: "#/components/schemas/dispatch__incident__type__models__Document"
+ tracking_template_document:
+ $ref: "#/components/schemas/dispatch__incident__type__models__Document"
+ visibility:
+ nullable: true
+ title: Visibility
+ type: string
+ required:
+ - name
+ - id
+ title: IncidentTypeRead
+ type: object
+ IncidentTypeReadMinimal:
+ properties:
+ default:
+ default: false
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ exclude_from_metrics:
+ default: false
+ title: Exclude From Metrics
+ type: boolean
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ visibility:
+ nullable: true
+ title: Visibility
+ type: string
+ required:
+ - id
+ - name
+ title: IncidentTypeReadMinimal
+ type: object
+ IncidentTypeUpdate:
+ properties:
+ default:
+ default: false
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ exclude_from_metrics:
+ default: false
+ title: Exclude From Metrics
+ type: boolean
+ executive_template_document:
+ $ref: "#/components/schemas/dispatch__incident__type__models__Document"
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ incident_template_document:
+ $ref: "#/components/schemas/dispatch__incident__type__models__Document"
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ plugin_metadata:
+ default: []
+ items:
+ $ref: "#/components/schemas/PluginMetadata"
+ title: Plugin Metadata
+ type: array
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ review_template_document:
+ $ref: "#/components/schemas/dispatch__incident__type__models__Document"
+ tracking_template_document:
+ $ref: "#/components/schemas/dispatch__incident__type__models__Document"
+ visibility:
+ nullable: true
+ title: Visibility
+ type: string
+ required:
+ - name
+ title: IncidentTypeUpdate
+ type: object
+ IncidentUpdate:
+ properties:
+ cases:
+ default: []
+ items:
+ $ref: "#/components/schemas/dispatch__incident__models__CaseRead"
+ title: Cases
+ type: array
+ commander:
+ $ref: "#/components/schemas/ParticipantUpdate"
+ description:
+ title: Description
+ type: string
+ duplicates:
+ default: []
+ items:
+ $ref: "#/components/schemas/IncidentReadMinimal"
+ title: Duplicates
+ type: array
+ incident_costs:
+ default: []
+ items:
+ $ref: "#/components/schemas/IncidentCostUpdate"
+ title: Incident Costs
+ type: array
+ incident_priority:
+ $ref: "#/components/schemas/IncidentPriorityBase"
+ incident_severity:
+ $ref: "#/components/schemas/IncidentSeverityBase"
+ incident_type:
+ $ref: "#/components/schemas/IncidentTypeBase"
+ reported_at:
+ format: date-time
+ title: Reported At
+ type: string
+ reporter:
+ $ref: "#/components/schemas/ParticipantUpdate"
+ resolution:
+ title: Resolution
+ type: string
+ stable_at:
+ format: date-time
+ title: Stable At
+ type: string
+ status:
+ $ref: "#/components/schemas/IncidentStatus"
+ tags:
+ default: []
+ items:
+ $ref: "#/components/schemas/TagRead"
+ title: Tags
+ type: array
+ terms:
+ default: []
+ items:
+ $ref: "#/components/schemas/TermRead"
+ title: Terms
+ type: array
+ title:
+ title: Title
+ type: string
+ visibility:
+ $ref: "#/components/schemas/Visibility"
+ required:
+ - title
+ - description
+ - incident_priority
+ - incident_severity
+ - incident_type
+ title: IncidentUpdate
+ type: object
+ IndividualContactCreate:
+ properties:
+ company:
+ nullable: true
+ title: Company
+ type: string
+ contact_type:
+ nullable: true
+ title: Contact Type
+ type: string
+ email:
+ format: email
+ title: Email
+ type: string
+ external_id:
+ nullable: true
+ title: External Id
+ type: string
+ filters:
+ items:
+ $ref: "#/components/schemas/SearchFilterRead"
+ title: Filters
+ type: array
+ is_active:
+ default: true
+ title: Is Active
+ type: boolean
+ is_external:
+ default: false
+ title: Is External
+ type: boolean
+ mobile_phone:
+ nullable: true
+ title: Mobile Phone
+ type: string
+ name:
+ nullable: true
+ title: Name
+ type: string
+ notes:
+ nullable: true
+ title: Notes
+ type: string
+ office_phone:
+ nullable: true
+ title: Office Phone
+ type: string
+ owner:
+ nullable: true
+ title: Owner
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ title:
+ nullable: true
+ title: Title
+ type: string
+ weblink:
+ nullable: true
+ title: Weblink
+ type: string
+ required:
+ - email
+ - project
+ title: IndividualContactCreate
+ type: object
+ IndividualContactPagination:
+ properties:
+ items:
+ default: []
+ items:
+ $ref: "#/components/schemas/IndividualContactRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - total
+ title: IndividualContactPagination
+ type: object
+ IndividualContactRead:
+ properties:
+ company:
+ nullable: true
+ title: Company
+ type: string
+ contact_type:
+ nullable: true
+ title: Contact Type
+ type: string
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ email:
+ format: email
+ title: Email
+ type: string
+ external_id:
+ nullable: true
+ title: External Id
+ type: string
+ filters:
+ default: []
+ items:
+ $ref: "#/components/schemas/SearchFilterRead"
+ title: Filters
+ type: array
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ is_active:
+ default: true
+ title: Is Active
+ type: boolean
+ is_external:
+ default: false
+ title: Is External
+ type: boolean
+ mobile_phone:
+ nullable: true
+ title: Mobile Phone
+ type: string
+ name:
+ nullable: true
+ title: Name
+ type: string
+ notes:
+ nullable: true
+ title: Notes
+ type: string
+ office_phone:
+ nullable: true
+ title: Office Phone
+ type: string
+ owner:
+ nullable: true
+ title: Owner
+ type: string
+ title:
+ nullable: true
+ title: Title
+ type: string
+ updated_at:
+ format: date-time
+ title: Updated At
+ type: string
+ weblink:
+ nullable: true
+ title: Weblink
+ type: string
+ required:
+ - email
+ title: IndividualContactRead
+ type: object
+ IndividualContactReadMinimal:
+ properties:
+ company:
+ nullable: true
+ title: Company
+ type: string
+ contact_type:
+ nullable: true
+ title: Contact Type
+ type: string
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ email:
+ format: email
+ title: Email
+ type: string
+ external_id:
+ nullable: true
+ title: External Id
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ is_active:
+ default: true
+ title: Is Active
+ type: boolean
+ is_external:
+ default: false
+ title: Is External
+ type: boolean
+ mobile_phone:
+ nullable: true
+ title: Mobile Phone
+ type: string
+ name:
+ nullable: true
+ title: Name
+ type: string
+ notes:
+ nullable: true
+ title: Notes
+ type: string
+ office_phone:
+ nullable: true
+ title: Office Phone
+ type: string
+ owner:
+ nullable: true
+ title: Owner
+ type: string
+ title:
+ nullable: true
+ title: Title
+ type: string
+ updated_at:
+ format: date-time
+ title: Updated At
+ type: string
+ weblink:
+ nullable: true
+ title: Weblink
+ type: string
+ required:
+ - email
+ title: IndividualContactReadMinimal
+ type: object
+ IndividualContactUpdate:
+ properties:
+ company:
+ nullable: true
+ title: Company
+ type: string
+ contact_type:
+ nullable: true
+ title: Contact Type
+ type: string
+ email:
+ format: email
+ title: Email
+ type: string
+ external_id:
+ nullable: true
+ title: External Id
+ type: string
+ filters:
+ items:
+ $ref: "#/components/schemas/SearchFilterRead"
+ title: Filters
+ type: array
+ is_active:
+ default: true
+ title: Is Active
+ type: boolean
+ is_external:
+ default: false
+ title: Is External
+ type: boolean
+ mobile_phone:
+ nullable: true
+ title: Mobile Phone
+ type: string
+ name:
+ nullable: true
+ title: Name
+ type: string
+ notes:
+ nullable: true
+ title: Notes
+ type: string
+ office_phone:
+ nullable: true
+ title: Office Phone
+ type: string
+ owner:
+ nullable: true
+ title: Owner
+ type: string
+ title:
+ nullable: true
+ title: Title
+ type: string
+ weblink:
+ nullable: true
+ title: Weblink
+ type: string
+ required:
+ - email
+ title: IndividualContactUpdate
+ type: object
+ KeyValue:
+ properties:
+ key:
+ title: Key
+ type: string
+ value:
+ title: Value
+ type: string
+ required:
+ - key
+ - value
+ title: KeyValue
+ type: object
+ NotificationCreate:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ evergreen:
+ default: false
+ title: Evergreen
+ type: boolean
+ evergreen_last_reminder_at:
+ format: date-time
+ nullable: true
+ title: Evergreen Last Reminder At
+ type: string
+ evergreen_owner:
+ format: email
+ title: Evergreen Owner
+ type: string
+ evergreen_reminder_interval:
+ default: 90
+ title: Evergreen Reminder Interval
+ type: integer
+ filters:
+ items:
+ $ref: "#/components/schemas/SearchFilterRead"
+ title: Filters
+ type: array
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ target:
+ title: Target
+ type: string
+ type:
+ $ref: "#/components/schemas/NotificationTypeEnum"
+ required:
+ - name
+ - type
+ - target
+ - project
+ title: NotificationCreate
+ type: object
+ NotificationPagination:
+ properties:
+ items:
+ default: []
+ items:
+ $ref: "#/components/schemas/NotificationRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - total
+ title: NotificationPagination
+ type: object
+ NotificationRead:
+ properties:
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ evergreen:
+ default: false
+ title: Evergreen
+ type: boolean
+ evergreen_last_reminder_at:
+ format: date-time
+ nullable: true
+ title: Evergreen Last Reminder At
+ type: string
+ evergreen_owner:
+ format: email
+ title: Evergreen Owner
+ type: string
+ evergreen_reminder_interval:
+ default: 90
+ title: Evergreen Reminder Interval
+ type: integer
+ filters:
+ items:
+ $ref: "#/components/schemas/SearchFilterRead"
+ title: Filters
+ type: array
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ target:
+ title: Target
+ type: string
+ type:
+ $ref: "#/components/schemas/NotificationTypeEnum"
+ updated_at:
+ format: date-time
+ title: Updated At
+ type: string
+ required:
+ - name
+ - type
+ - target
+ - id
+ title: NotificationRead
+ type: object
+ NotificationTypeEnum:
+ description: An enumeration.
+ enum:
+ - conversation
+ - email
+ title: NotificationTypeEnum
+ type: string
+ NotificationUpdate:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ evergreen:
+ default: false
+ title: Evergreen
+ type: boolean
+ evergreen_last_reminder_at:
+ format: date-time
+ nullable: true
+ title: Evergreen Last Reminder At
+ type: string
+ evergreen_owner:
+ format: email
+ title: Evergreen Owner
+ type: string
+ evergreen_reminder_interval:
+ default: 90
+ title: Evergreen Reminder Interval
+ type: integer
+ filters:
+ items:
+ $ref: "#/components/schemas/SearchFilterUpdate"
+ title: Filters
+ type: array
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ target:
+ title: Target
+ type: string
+ type:
+ $ref: "#/components/schemas/NotificationTypeEnum"
+ required:
+ - name
+ - type
+ - target
+ title: NotificationUpdate
+ type: object
+ OrganizationCreate:
+ properties:
+ banner_color:
+ format: color
+ nullable: true
+ title: Banner Color
+ type: string
+ banner_enabled:
+ default: false
+ nullable: true
+ title: Banner Enabled
+ type: boolean
+ banner_text:
+ minLength: 3
+ nullable: true
+ pattern: ^(?!\s*$).+
+ title: Banner Text
+ type: string
+ default:
+ default: false
+ nullable: true
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ required:
+ - name
+ title: OrganizationCreate
+ type: object
+ OrganizationPagination:
+ properties:
+ items:
+ default: []
+ items:
+ $ref: "#/components/schemas/OrganizationRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - total
+ title: OrganizationPagination
+ type: object
+ OrganizationRead:
+ properties:
+ banner_color:
+ format: color
+ nullable: true
+ title: Banner Color
+ type: string
+ banner_enabled:
+ default: false
+ nullable: true
+ title: Banner Enabled
+ type: boolean
+ banner_text:
+ minLength: 3
+ nullable: true
+ pattern: ^(?!\s*$).+
+ title: Banner Text
+ type: string
+ default:
+ default: false
+ nullable: true
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ slug:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Slug
+ type: string
+ required:
+ - name
+ title: OrganizationRead
+ type: object
+ OrganizationUpdate:
+ properties:
+ banner_color:
+ format: color
+ nullable: true
+ title: Banner Color
+ type: string
+ banner_enabled:
+ default: false
+ nullable: true
+ title: Banner Enabled
+ type: boolean
+ banner_text:
+ minLength: 3
+ nullable: true
+ pattern: ^(?!\s*$).+
+ title: Banner Text
+ type: string
+ default:
+ default: false
+ nullable: true
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ title: OrganizationUpdate
+ type: object
+ ParticipantRead:
+ properties:
+ added_reason:
+ nullable: true
+ title: Added Reason
+ type: string
+ department:
+ nullable: true
+ title: Department
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ individual:
+ $ref: "#/components/schemas/IndividualContactRead"
+ location:
+ nullable: true
+ title: Location
+ type: string
+ participant_roles:
+ default: []
+ items:
+ $ref: "#/components/schemas/ParticipantRoleRead"
+ title: Participant Roles
+ type: array
+ team:
+ nullable: true
+ title: Team
+ type: string
+ required:
+ - id
+ title: ParticipantRead
+ type: object
+ ParticipantReadMinimal:
+ properties:
+ added_reason:
+ nullable: true
+ title: Added Reason
+ type: string
+ department:
+ nullable: true
+ title: Department
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ individual:
+ $ref: "#/components/schemas/IndividualContactReadMinimal"
+ location:
+ nullable: true
+ title: Location
+ type: string
+ participant_roles:
+ default: []
+ items:
+ $ref: "#/components/schemas/ParticipantRoleReadMinimal"
+ title: Participant Roles
+ type: array
+ team:
+ nullable: true
+ title: Team
+ type: string
+ required:
+ - id
+ title: ParticipantReadMinimal
+ type: object
+ ParticipantRoleRead:
+ properties:
+ activity:
+ title: Activity
+ type: integer
+ assumed_at:
+ format: date-time
+ title: Assumed At
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ renounced_at:
+ format: date-time
+ title: Renounced At
+ type: string
+ role:
+ title: Role
+ type: string
+ required:
+ - role
+ - id
+ title: ParticipantRoleRead
+ type: object
+ ParticipantRoleReadMinimal:
+ properties:
+ activity:
+ title: Activity
+ type: integer
+ assumed_at:
+ format: date-time
+ title: Assumed At
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ renounced_at:
+ format: date-time
+ title: Renounced At
+ type: string
+ role:
+ title: Role
+ type: string
+ required:
+ - role
+ - id
+ title: ParticipantRoleReadMinimal
+ type: object
+ ParticipantRoleType:
+ description: An enumeration.
+ enum:
+ - Assignee
+ - Incident Commander
+ - Liaison
+ - Scribe
+ - Participant
+ - Observer
+ - Reporter
+ title: ParticipantRoleType
+ type: string
+ ParticipantUpdate:
+ properties:
+ added_reason:
+ nullable: true
+ title: Added Reason
+ type: string
+ department:
+ nullable: true
+ title: Department
+ type: string
+ individual:
+ $ref: "#/components/schemas/IndividualContactRead"
+ location:
+ nullable: true
+ title: Location
+ type: string
+ team:
+ nullable: true
+ title: Team
+ type: string
+ title: ParticipantUpdate
+ type: object
+ PluginInstanceCreate:
+ properties:
+ configuration:
+ title: Configuration
+ type: object
+ enabled:
+ title: Enabled
+ type: boolean
+ plugin:
+ $ref: "#/components/schemas/PluginRead"
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ required:
+ - plugin
+ - project
+ title: PluginInstanceCreate
+ type: object
+ PluginInstancePagination:
+ properties:
+ items:
+ default: []
+ items:
+ $ref: "#/components/schemas/PluginInstanceRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - total
+ title: PluginInstancePagination
+ type: object
+ PluginInstanceRead:
+ properties:
+ configuration:
+ title: Configuration
+ type: object
+ configuration_schema:
+ title: Configuration Schema
+ enabled:
+ title: Enabled
+ type: boolean
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ plugin:
+ $ref: "#/components/schemas/PluginRead"
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ required:
+ - id
+ - plugin
+ title: PluginInstanceRead
+ type: object
+ PluginInstanceUpdate:
+ properties:
+ configuration:
+ title: Configuration
+ type: object
+ enabled:
+ title: Enabled
+ type: boolean
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ title: PluginInstanceUpdate
+ type: object
+ PluginMetadata:
+ properties:
+ metadata:
+ default: []
+ items:
+ $ref: "#/components/schemas/KeyValue"
+ title: Metadata
+ type: array
+ slug:
+ title: Slug
+ type: string
+ required:
+ - slug
+ title: PluginMetadata
+ type: object
+ PluginPagination:
+ properties:
+ items:
+ default: []
+ items:
+ $ref: "#/components/schemas/PluginRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - total
+ title: PluginPagination
+ type: object
+ PluginRead:
+ properties:
+ author:
+ title: Author
+ type: string
+ author_url:
+ title: Author Url
+ type: string
+ configuration_schema:
+ title: Configuration Schema
+ description:
+ nullable: true
+ title: Description
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ multiple:
+ title: Multiple
+ type: boolean
+ slug:
+ title: Slug
+ type: string
+ title:
+ title: Title
+ type: string
+ type:
+ title: Type
+ type: string
+ required:
+ - id
+ - title
+ - slug
+ - author
+ - author_url
+ - type
+ - multiple
+ title: PluginRead
+ type: object
+ ProjectCreate:
+ properties:
+ annual_employee_cost:
+ title: Annual Employee Cost
+ type: integer
+ business_year_hours:
+ title: Business Year Hours
+ type: integer
+ color:
+ nullable: true
+ title: Color
+ type: string
+ default:
+ default: false
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ organization:
+ $ref: "#/components/schemas/OrganizationRead"
+ owner_conversation:
+ nullable: true
+ title: Owner Conversation
+ type: string
+ owner_email:
+ format: email
+ nullable: true
+ title: Owner Email
+ type: string
+ required:
+ - name
+ - organization
+ title: ProjectCreate
+ type: object
+ ProjectPagination:
+ properties:
+ items:
+ default: []
+ items:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - total
+ title: ProjectPagination
+ type: object
+ ProjectUpdate:
+ properties:
+ annual_employee_cost:
+ title: Annual Employee Cost
+ type: integer
+ business_year_hours:
+ title: Business Year Hours
+ type: integer
+ color:
+ nullable: true
+ title: Color
+ type: string
+ default:
+ default: false
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ owner_conversation:
+ nullable: true
+ title: Owner Conversation
+ type: string
+ owner_email:
+ format: email
+ nullable: true
+ title: Owner Email
+ type: string
+ required:
+ - name
+ title: ProjectUpdate
+ type: object
+ QueryCreate:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ language:
+ nullable: true
+ title: Language
+ type: string
+ name:
+ nullable: false
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ source:
+ $ref: "#/components/schemas/SourceRead"
+ tags:
+ default: []
+ items:
+ $ref: "#/components/schemas/TagRead"
+ title: Tags
+ type: array
+ text:
+ nullable: true
+ title: Text
+ type: string
+ required:
+ - source
+ - project
+ title: QueryCreate
+ type: object
+ QueryPagination:
+ properties:
+ items:
+ items:
+ $ref: "#/components/schemas/QueryRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - items
+ - total
+ title: QueryPagination
+ type: object
+ QueryRead:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ language:
+ nullable: true
+ title: Language
+ type: string
+ name:
+ nullable: false
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ source:
+ $ref: "#/components/schemas/SourceRead"
+ tags:
+ default: []
+ items:
+ $ref: "#/components/schemas/TagRead"
+ title: Tags
+ type: array
+ text:
+ nullable: true
+ title: Text
+ type: string
+ required:
+ - source
+ - project
+ - id
+ title: QueryRead
+ type: object
+ QueryReadMinimal:
+ properties:
+ description:
+ title: Description
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ title: Name
+ type: string
+ required:
+ - id
+ - name
+ - description
+ title: QueryReadMinimal
+ type: object
+ QueryUpdate:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ language:
+ nullable: true
+ title: Language
+ type: string
+ name:
+ nullable: false
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ source:
+ $ref: "#/components/schemas/SourceRead"
+ tags:
+ default: []
+ items:
+ $ref: "#/components/schemas/TagRead"
+ title: Tags
+ type: array
+ text:
+ nullable: true
+ title: Text
+ type: string
+ required:
+ - source
+ - project
+ title: QueryUpdate
+ type: object
+ ReportRead:
+ properties:
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ details:
+ title: Details
+ type: object
+ id:
+ title: Id
+ type: integer
+ type:
+ $ref: "#/components/schemas/ReportTypes"
+ required:
+ - type
+ - id
+ title: ReportRead
+ type: object
+ ReportTypes:
+ description: An enumeration.
+ enum:
+ - Tactical Report
+ - Executive Report
+ title: ReportTypes
+ type: string
+ SearchFilterCreate:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ expression:
+ items:
+ type: object
+ title: Expression
+ type: array
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ subject:
+ allOf:
+ - $ref: "#/components/schemas/SearchFilterSubject"
+ default: incident
+ required:
+ - expression
+ - name
+ - project
+ title: SearchFilterCreate
+ type: object
+ SearchFilterPagination:
+ properties:
+ items:
+ items:
+ $ref: "#/components/schemas/SearchFilterRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - items
+ - total
+ title: SearchFilterPagination
+ type: object
+ SearchFilterRead:
+ properties:
+ creator:
+ $ref: "#/components/schemas/UserRead"
+ description:
+ nullable: true
+ title: Description
+ type: string
+ expression:
+ items:
+ type: object
+ title: Expression
+ type: array
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ subject:
+ allOf:
+ - $ref: "#/components/schemas/SearchFilterSubject"
+ default: incident
+ required:
+ - expression
+ - name
+ - id
+ title: SearchFilterRead
+ type: object
+ SearchFilterSubject:
+ description: An enumeration.
+ enum:
+ - case
+ - incident
+ title: SearchFilterSubject
+ type: string
+ SearchFilterUpdate:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ expression:
+ items:
+ type: object
+ title: Expression
+ type: array
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ subject:
+ allOf:
+ - $ref: "#/components/schemas/SearchFilterSubject"
+ default: incident
+ required:
+ - expression
+ - name
+ title: SearchFilterUpdate
+ type: object
+ SearchTypes:
+ description: An enumeration.
+ enum:
+ - Definition
+ - Document
+ - Incident
+ - IncidentPriority
+ - IncidentType
+ - IndividualContact
+ - Plugin
+ - Query
+ - SearchFilter
+ - Case
+ - Service
+ - Source
+ - Tag
+ - Task
+ - TeamContact
+ - Term
+ title: SearchTypes
+ type: string
+ ServiceCreate:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ evergreen:
+ default: false
+ title: Evergreen
+ type: boolean
+ evergreen_last_reminder_at:
+ format: date-time
+ nullable: true
+ title: Evergreen Last Reminder At
+ type: string
+ evergreen_owner:
+ format: email
+ title: Evergreen Owner
+ type: string
+ evergreen_reminder_interval:
+ default: 90
+ title: Evergreen Reminder Interval
+ type: integer
+ external_id:
+ nullable: true
+ title: External Id
+ type: string
+ filters:
+ default: []
+ items:
+ $ref: "#/components/schemas/SearchFilterRead"
+ title: Filters
+ type: array
+ is_active:
+ title: Is Active
+ type: boolean
+ name:
+ nullable: true
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ type:
+ nullable: true
+ title: Type
+ type: string
+ required:
+ - project
+ title: ServiceCreate
+ type: object
+ ServicePagination:
+ properties:
+ items:
+ default: []
+ items:
+ $ref: "#/components/schemas/ServiceRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - total
+ title: ServicePagination
+ type: object
+ ServiceRead:
+ properties:
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ description:
+ nullable: true
+ title: Description
+ type: string
+ evergreen:
+ default: false
+ title: Evergreen
+ type: boolean
+ evergreen_last_reminder_at:
+ format: date-time
+ nullable: true
+ title: Evergreen Last Reminder At
+ type: string
+ evergreen_owner:
+ format: email
+ title: Evergreen Owner
+ type: string
+ evergreen_reminder_interval:
+ default: 90
+ title: Evergreen Reminder Interval
+ type: integer
+ external_id:
+ nullable: true
+ title: External Id
+ type: string
+ filters:
+ default: []
+ items:
+ $ref: "#/components/schemas/SearchFilterRead"
+ title: Filters
+ type: array
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ is_active:
+ title: Is Active
+ type: boolean
+ name:
+ nullable: true
+ title: Name
+ type: string
+ type:
+ nullable: true
+ title: Type
+ type: string
+ updated_at:
+ format: date-time
+ title: Updated At
+ type: string
+ required:
+ - id
+ title: ServiceRead
+ type: object
+ ServiceUpdate:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ evergreen:
+ default: false
+ title: Evergreen
+ type: boolean
+ evergreen_last_reminder_at:
+ format: date-time
+ nullable: true
+ title: Evergreen Last Reminder At
+ type: string
+ evergreen_owner:
+ format: email
+ title: Evergreen Owner
+ type: string
+ evergreen_reminder_interval:
+ default: 90
+ title: Evergreen Reminder Interval
+ type: integer
+ external_id:
+ nullable: true
+ title: External Id
+ type: string
+ filters:
+ default: []
+ items:
+ $ref: "#/components/schemas/SearchFilterRead"
+ title: Filters
+ type: array
+ is_active:
+ title: Is Active
+ type: boolean
+ name:
+ nullable: true
+ title: Name
+ type: string
+ type:
+ nullable: true
+ title: Type
+ type: string
+ title: ServiceUpdate
+ type: object
+ SignalCreate:
+ properties:
+ case_priority:
+ $ref: "#/components/schemas/CasePriorityRead"
+ case_type:
+ $ref: "#/components/schemas/CaseTypeRead"
+ conversation_target:
+ title: Conversation Target
+ type: string
+ create_case:
+ default: true
+ title: Create Case
+ type: boolean
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ description:
+ title: Description
+ type: string
+ enabled:
+ default: false
+ title: Enabled
+ type: boolean
+ entity_types:
+ default: []
+ items:
+ $ref: "#/components/schemas/EntityTypeRead"
+ title: Entity Types
+ type: array
+ external_id:
+ title: External Id
+ type: string
+ external_url:
+ title: External Url
+ type: string
+ filters:
+ default: []
+ items:
+ $ref: "#/components/schemas/SignalFilterRead"
+ title: Filters
+ type: array
+ name:
+ title: Name
+ type: string
+ oncall_service:
+ $ref: "#/components/schemas/dispatch__signal__models__Service"
+ owner:
+ title: Owner
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ source:
+ $ref: "#/components/schemas/SourceBase"
+ tags:
+ default: []
+ items:
+ $ref: "#/components/schemas/TagRead"
+ title: Tags
+ type: array
+ variant:
+ title: Variant
+ type: string
+ workflows:
+ default: []
+ items:
+ $ref: "#/components/schemas/WorkflowRead"
+ title: Workflows
+ type: array
+ required:
+ - name
+ - owner
+ - external_id
+ - project
+ title: SignalCreate
+ type: object
+ SignalFilterAction:
+ description: An enumeration.
+ enum:
+ - deduplicate
+ - snooze
+ title: SignalFilterAction
+ type: string
+ SignalFilterCreate:
+ properties:
+ action:
+ allOf:
+ - $ref: "#/components/schemas/SignalFilterAction"
+ default: snooze
+ description:
+ nullable: true
+ title: Description
+ type: string
+ expiration:
+ format: date-time
+ nullable: true
+ title: Expiration
+ type: string
+ expression:
+ items:
+ type: object
+ title: Expression
+ type: array
+ mode:
+ allOf:
+ - $ref: "#/components/schemas/SignalFilterMode"
+ default: active
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ window:
+ default: 600
+ title: Window
+ type: integer
+ required:
+ - expression
+ - name
+ - project
+ title: SignalFilterCreate
+ type: object
+ SignalFilterMode:
+ description: An enumeration.
+ enum:
+ - active
+ - monitor
+ - inactive
+ - expired
+ title: SignalFilterMode
+ type: string
+ SignalFilterPagination:
+ properties:
+ items:
+ items:
+ $ref: "#/components/schemas/SignalFilterRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - items
+ - total
+ title: SignalFilterPagination
+ type: object
+ SignalFilterRead:
+ properties:
+ action:
+ allOf:
+ - $ref: "#/components/schemas/SignalFilterAction"
+ default: snooze
+ description:
+ nullable: true
+ title: Description
+ type: string
+ expiration:
+ format: date-time
+ nullable: true
+ title: Expiration
+ type: string
+ expression:
+ items:
+ type: object
+ title: Expression
+ type: array
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ mode:
+ allOf:
+ - $ref: "#/components/schemas/SignalFilterMode"
+ default: active
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ window:
+ default: 600
+ title: Window
+ type: integer
+ required:
+ - expression
+ - name
+ - id
+ title: SignalFilterRead
+ type: object
+ SignalFilterUpdate:
+ properties:
+ action:
+ allOf:
+ - $ref: "#/components/schemas/SignalFilterAction"
+ default: snooze
+ description:
+ nullable: true
+ title: Description
+ type: string
+ expiration:
+ format: date-time
+ nullable: true
+ title: Expiration
+ type: string
+ expression:
+ items:
+ type: object
+ title: Expression
+ type: array
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ mode:
+ allOf:
+ - $ref: "#/components/schemas/SignalFilterMode"
+ default: active
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ window:
+ default: 600
+ title: Window
+ type: integer
+ required:
+ - expression
+ - name
+ - id
+ title: SignalFilterUpdate
+ type: object
+ SignalInstanceCreate:
+ properties:
+ case:
+ $ref: "#/components/schemas/dispatch__case__models__CaseRead"
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ entities:
+ default: []
+ items:
+ $ref: "#/components/schemas/EntityRead"
+ title: Entities
+ type: array
+ filter_action:
+ $ref: "#/components/schemas/SignalFilterAction"
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ raw:
+ title: Raw
+ type: object
+ signal:
+ $ref: "#/components/schemas/dispatch__signal__models__SignalRead"
+ required:
+ - project
+ - raw
+ title: SignalInstanceCreate
+ type: object
+ SignalInstancePagination:
+ properties:
+ items:
+ items:
+ $ref: "#/components/schemas/dispatch__signal__models__SignalInstanceRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - items
+ - total
+ title: SignalInstancePagination
+ type: object
+ SignalPagination:
+ properties:
+ items:
+ items:
+ $ref: "#/components/schemas/dispatch__signal__models__SignalRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - items
+ - total
+ title: SignalPagination
+ type: object
+ SignalUpdate:
+ properties:
+ case_priority:
+ $ref: "#/components/schemas/CasePriorityRead"
+ case_type:
+ $ref: "#/components/schemas/CaseTypeRead"
+ conversation_target:
+ title: Conversation Target
+ type: string
+ create_case:
+ default: true
+ title: Create Case
+ type: boolean
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ description:
+ title: Description
+ type: string
+ enabled:
+ default: false
+ title: Enabled
+ type: boolean
+ entity_types:
+ default: []
+ items:
+ $ref: "#/components/schemas/EntityTypeRead"
+ title: Entity Types
+ type: array
+ external_id:
+ title: External Id
+ type: string
+ external_url:
+ title: External Url
+ type: string
+ filters:
+ default: []
+ items:
+ $ref: "#/components/schemas/SignalFilterRead"
+ title: Filters
+ type: array
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ title: Name
+ type: string
+ oncall_service:
+ $ref: "#/components/schemas/dispatch__signal__models__Service"
+ owner:
+ title: Owner
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ source:
+ $ref: "#/components/schemas/SourceBase"
+ tags:
+ default: []
+ items:
+ $ref: "#/components/schemas/TagRead"
+ title: Tags
+ type: array
+ variant:
+ title: Variant
+ type: string
+ workflows:
+ default: []
+ items:
+ $ref: "#/components/schemas/WorkflowRead"
+ title: Workflows
+ type: array
+ required:
+ - name
+ - owner
+ - external_id
+ - project
+ - id
+ title: SignalUpdate
+ type: object
+ SourceBase:
+ properties:
+ aggregated:
+ default: false
+ nullable: true
+ title: Aggregated
+ type: boolean
+ alerts:
+ default: []
+ items:
+ $ref: "#/components/schemas/AlertRead"
+ title: Alerts
+ type: array
+ cost:
+ title: Cost
+ type: number
+ data_last_loaded_at:
+ format: date-time
+ nullable: true
+ title: Last Loaded
+ type: string
+ delay:
+ nullable: true
+ title: Delay
+ type: integer
+ description:
+ nullable: true
+ title: Description
+ type: string
+ documentation:
+ nullable: true
+ title: Documentation
+ type: string
+ external_id:
+ nullable: true
+ title: External Id
+ type: string
+ incidents:
+ default: []
+ items:
+ $ref: "#/components/schemas/IncidentRead"
+ title: Incidents
+ type: array
+ links:
+ default: []
+ items: {}
+ title: Links
+ type: array
+ name:
+ nullable: false
+ title: Name
+ type: string
+ owner:
+ allOf:
+ - $ref: "#/components/schemas/ServiceRead"
+ nullable: true
+ title: Owner
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ queries:
+ default: []
+ items:
+ $ref: "#/components/schemas/QueryReadMinimal"
+ title: Queries
+ type: array
+ retention:
+ nullable: true
+ title: Retention
+ type: integer
+ sampling_rate:
+ description: Rate at which data is sampled (as a percentage); 100%
+ means all data is captured.
+ exclusiveMaximum: 101.0
+ exclusiveMinimum: 1.0
+ nullable: true
+ title: Sampling Rate
+ type: integer
+ size:
+ nullable: true
+ title: Size
+ type: integer
+ source_data_format:
+ $ref: "#/components/schemas/SourceDataFormatRead"
+ source_environment:
+ $ref: "#/components/schemas/SourceEnvironmentRead"
+ source_schema:
+ nullable: true
+ title: Source Schema
+ type: string
+ source_status:
+ $ref: "#/components/schemas/SourceStatusRead"
+ source_transport:
+ $ref: "#/components/schemas/SourceTransportRead"
+ source_type:
+ $ref: "#/components/schemas/SourceTypeRead"
+ tags:
+ default: []
+ items:
+ $ref: "#/components/schemas/TagRead"
+ title: Tags
+ type: array
+ required:
+ - project
+ title: SourceBase
+ type: object
+ SourceCreate:
+ properties:
+ aggregated:
+ default: false
+ nullable: true
+ title: Aggregated
+ type: boolean
+ alerts:
+ default: []
+ items:
+ $ref: "#/components/schemas/AlertRead"
+ title: Alerts
+ type: array
+ cost:
+ title: Cost
+ type: number
+ data_last_loaded_at:
+ format: date-time
+ nullable: true
+ title: Last Loaded
+ type: string
+ delay:
+ nullable: true
+ title: Delay
+ type: integer
+ description:
+ nullable: true
+ title: Description
+ type: string
+ documentation:
+ nullable: true
+ title: Documentation
+ type: string
+ external_id:
+ nullable: true
+ title: External Id
+ type: string
+ incidents:
+ default: []
+ items:
+ $ref: "#/components/schemas/IncidentRead"
+ title: Incidents
+ type: array
+ links:
+ default: []
+ items: {}
+ title: Links
+ type: array
+ name:
+ nullable: false
+ title: Name
+ type: string
+ owner:
+ allOf:
+ - $ref: "#/components/schemas/ServiceRead"
+ nullable: true
+ title: Owner
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ queries:
+ default: []
+ items:
+ $ref: "#/components/schemas/QueryReadMinimal"
+ title: Queries
+ type: array
+ retention:
+ nullable: true
+ title: Retention
+ type: integer
+ sampling_rate:
+ description: Rate at which data is sampled (as a percentage); 100%
+ means all data is captured.
+ exclusiveMaximum: 101.0
+ exclusiveMinimum: 1.0
+ nullable: true
+ title: Sampling Rate
+ type: integer
+ size:
+ nullable: true
+ title: Size
+ type: integer
+ source_data_format:
+ $ref: "#/components/schemas/SourceDataFormatRead"
+ source_environment:
+ $ref: "#/components/schemas/SourceEnvironmentRead"
+ source_schema:
+ nullable: true
+ title: Source Schema
+ type: string
+ source_status:
+ $ref: "#/components/schemas/SourceStatusRead"
+ source_transport:
+ $ref: "#/components/schemas/SourceTransportRead"
+ source_type:
+ $ref: "#/components/schemas/SourceTypeRead"
+ tags:
+ default: []
+ items:
+ $ref: "#/components/schemas/TagRead"
+ title: Tags
+ type: array
+ required:
+ - project
+ title: SourceCreate
+ type: object
+ SourceDataFormatCreate:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ name:
+ nullable: false
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ required:
+ - project
+ title: SourceDataFormatCreate
+ type: object
+ SourceDataFormatPagination:
+ properties:
+ items:
+ items:
+ $ref: "#/components/schemas/SourceDataFormatRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - items
+ - total
+ title: SourceDataFormatPagination
+ type: object
+ SourceDataFormatRead:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ nullable: false
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ required:
+ - id
+ - project
+ title: SourceDataFormatRead
+ type: object
+ SourceDataFormatUpdate:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ nullable: false
+ title: Name
+ type: string
+ required:
+ - id
+ title: SourceDataFormatUpdate
+ type: object
+ SourceEnvironmentCreate:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ name:
+ nullable: false
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ required:
+ - project
+ title: SourceEnvironmentCreate
+ type: object
+ SourceEnvironmentPagination:
+ properties:
+ items:
+ items:
+ $ref: "#/components/schemas/SourceEnvironmentRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - items
+ - total
+ title: SourceEnvironmentPagination
+ type: object
+ SourceEnvironmentRead:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ nullable: false
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ required:
+ - id
+ - project
+ title: SourceEnvironmentRead
+ type: object
+ SourceEnvironmentUpdate:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ nullable: false
+ title: Name
+ type: string
+ required:
+ - id
+ title: SourceEnvironmentUpdate
+ type: object
+ SourcePagination:
+ properties:
+ items:
+ items:
+ $ref: "#/components/schemas/SourceRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - items
+ - total
+ title: SourcePagination
+ type: object
+ SourceRead:
+ properties:
+ aggregated:
+ default: false
+ nullable: true
+ title: Aggregated
+ type: boolean
+ alerts:
+ default: []
+ items:
+ $ref: "#/components/schemas/AlertRead"
+ title: Alerts
+ type: array
+ cost:
+ title: Cost
+ type: number
+ data_last_loaded_at:
+ format: date-time
+ nullable: true
+ title: Last Loaded
+ type: string
+ delay:
+ nullable: true
+ title: Delay
+ type: integer
+ description:
+ nullable: true
+ title: Description
+ type: string
+ documentation:
+ nullable: true
+ title: Documentation
+ type: string
+ external_id:
+ nullable: true
+ title: External Id
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ incidents:
+ default: []
+ items:
+ $ref: "#/components/schemas/IncidentRead"
+ title: Incidents
+ type: array
+ links:
+ default: []
+ items: {}
+ title: Links
+ type: array
+ name:
+ nullable: false
+ title: Name
+ type: string
+ owner:
+ allOf:
+ - $ref: "#/components/schemas/ServiceRead"
+ nullable: true
+ title: Owner
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ queries:
+ default: []
+ items:
+ $ref: "#/components/schemas/QueryReadMinimal"
+ title: Queries
+ type: array
+ retention:
+ nullable: true
+ title: Retention
+ type: integer
+ sampling_rate:
+ description: Rate at which data is sampled (as a percentage); 100%
+ means all data is captured.
+ exclusiveMaximum: 101.0
+ exclusiveMinimum: 1.0
+ nullable: true
+ title: Sampling Rate
+ type: integer
+ size:
+ nullable: true
+ title: Size
+ type: integer
+ source_data_format:
+ $ref: "#/components/schemas/SourceDataFormatRead"
+ source_environment:
+ $ref: "#/components/schemas/SourceEnvironmentRead"
+ source_schema:
+ nullable: true
+ title: Source Schema
+ type: string
+ source_status:
+ $ref: "#/components/schemas/SourceStatusRead"
+ source_transport:
+ $ref: "#/components/schemas/SourceTransportRead"
+ source_type:
+ $ref: "#/components/schemas/SourceTypeRead"
+ tags:
+ default: []
+ items:
+ $ref: "#/components/schemas/TagRead"
+ title: Tags
+ type: array
+ required:
+ - project
+ - id
+ title: SourceRead
+ type: object
+ SourceStatusCreate:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ name:
+ nullable: false
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ required:
+ - project
+ title: SourceStatusCreate
+ type: object
+ SourceStatusPagination:
+ properties:
+ items:
+ items:
+ $ref: "#/components/schemas/SourceStatusRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - items
+ - total
+ title: SourceStatusPagination
+ type: object
+ SourceStatusRead:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ nullable: false
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ required:
+ - id
+ - project
+ title: SourceStatusRead
+ type: object
+ SourceStatusUpdate:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ nullable: false
+ title: Name
+ type: string
+ required:
+ - id
+ title: SourceStatusUpdate
+ type: object
+ SourceTransportCreate:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ name:
+ nullable: false
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ required:
+ - project
+ title: SourceTransportCreate
+ type: object
+ SourceTransportPagination:
+ properties:
+ items:
+ items:
+ $ref: "#/components/schemas/SourceTransportRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - items
+ - total
+ title: SourceTransportPagination
+ type: object
+ SourceTransportRead:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ nullable: false
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ required:
+ - id
+ - project
+ title: SourceTransportRead
+ type: object
+ SourceTransportUpdate:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ nullable: false
+ title: Name
+ type: string
+ required:
+ - id
+ title: SourceTransportUpdate
+ type: object
+ SourceTypeCreate:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ name:
+ nullable: false
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ required:
+ - project
+ title: SourceTypeCreate
+ type: object
+ SourceTypePagination:
+ properties:
+ items:
+ items:
+ $ref: "#/components/schemas/SourceTypeRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - items
+ - total
+ title: SourceTypePagination
+ type: object
+ SourceTypeRead:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ nullable: false
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ required:
+ - id
+ - project
+ title: SourceTypeRead
+ type: object
+ SourceTypeUpdate:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ nullable: false
+ title: Name
+ type: string
+ required:
+ - id
+ title: SourceTypeUpdate
+ type: object
+ SourceUpdate:
+ properties:
+ aggregated:
+ default: false
+ nullable: true
+ title: Aggregated
+ type: boolean
+ alerts:
+ default: []
+ items:
+ $ref: "#/components/schemas/AlertRead"
+ title: Alerts
+ type: array
+ cost:
+ title: Cost
+ type: number
+ data_last_loaded_at:
+ format: date-time
+ nullable: true
+ title: Last Loaded
+ type: string
+ delay:
+ nullable: true
+ title: Delay
+ type: integer
+ description:
+ nullable: true
+ title: Description
+ type: string
+ documentation:
+ nullable: true
+ title: Documentation
+ type: string
+ external_id:
+ nullable: true
+ title: External Id
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ incidents:
+ default: []
+ items:
+ $ref: "#/components/schemas/IncidentRead"
+ title: Incidents
+ type: array
+ links:
+ default: []
+ items: {}
+ title: Links
+ type: array
+ name:
+ nullable: false
+ title: Name
+ type: string
+ owner:
+ allOf:
+ - $ref: "#/components/schemas/ServiceRead"
+ nullable: true
+ title: Owner
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ queries:
+ default: []
+ items:
+ $ref: "#/components/schemas/QueryReadMinimal"
+ title: Queries
+ type: array
+ retention:
+ nullable: true
+ title: Retention
+ type: integer
+ sampling_rate:
+ description: Rate at which data is sampled (as a percentage); 100%
+ means all data is captured.
+ exclusiveMaximum: 101.0
+ exclusiveMinimum: 1.0
+ nullable: true
+ title: Sampling Rate
+ type: integer
+ size:
+ nullable: true
+ title: Size
+ type: integer
+ source_data_format:
+ $ref: "#/components/schemas/SourceDataFormatRead"
+ source_environment:
+ $ref: "#/components/schemas/SourceEnvironmentRead"
+ source_schema:
+ nullable: true
+ title: Source Schema
+ type: string
+ source_status:
+ $ref: "#/components/schemas/SourceStatusRead"
+ source_transport:
+ $ref: "#/components/schemas/SourceTransportRead"
+ source_type:
+ $ref: "#/components/schemas/SourceTypeRead"
+ tags:
+ default: []
+ items:
+ $ref: "#/components/schemas/TagRead"
+ title: Tags
+ type: array
+ required:
+ - project
+ title: SourceUpdate
+ type: object
+ StorageRead:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ resource_id:
+ nullable: true
+ title: Resource Id
+ type: string
+ resource_type:
+ nullable: true
+ title: Resource Type
+ type: string
+ weblink:
+ nullable: true
+ title: Weblink
+ type: string
+ title: StorageRead
+ type: object
+ TacticalReportCreate:
+ properties:
+ actions:
+ title: Actions
+ type: string
+ conditions:
+ title: Conditions
+ type: string
+ needs:
+ title: Needs
+ type: string
+ required:
+ - conditions
+ - actions
+ - needs
+ title: TacticalReportCreate
+ type: object
+ TagCreate:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ discoverable:
+ default: true
+ title: Discoverable
+ type: boolean
+ external_id:
+ nullable: true
+ title: External Id
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ nullable: true
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ source:
+ nullable: true
+ title: Source
+ type: string
+ tag_type:
+ $ref: "#/components/schemas/TagTypeCreate"
+ uri:
+ nullable: true
+ title: Uri
+ type: string
+ required:
+ - tag_type
+ - project
+ title: TagCreate
+ type: object
+ TagPagination:
+ properties:
+ items:
+ items:
+ $ref: "#/components/schemas/TagRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - items
+ - total
+ title: TagPagination
+ type: object
+ TagRead:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ discoverable:
+ default: true
+ title: Discoverable
+ type: boolean
+ external_id:
+ nullable: true
+ title: External Id
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ nullable: true
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ source:
+ nullable: true
+ title: Source
+ type: string
+ tag_type:
+ $ref: "#/components/schemas/TagTypeRead"
+ uri:
+ nullable: true
+ title: Uri
+ type: string
+ required:
+ - id
+ - project
+ title: TagRead
+ type: object
+ TagTypeCreate:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ exclusive:
+ default: false
+ title: Exclusive
+ type: boolean
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ required:
+ - name
+ - project
+ title: TagTypeCreate
+ type: object
+ TagTypePagination:
+ properties:
+ items:
+ items:
+ $ref: "#/components/schemas/TagTypeRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - items
+ - total
+ title: TagTypePagination
+ type: object
+ TagTypeRead:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ exclusive:
+ default: false
+ title: Exclusive
+ type: boolean
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ required:
+ - name
+ - id
+ - project
+ title: TagTypeRead
+ type: object
+ TagTypeUpdate:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ exclusive:
+ default: false
+ title: Exclusive
+ type: boolean
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ required:
+ - name
+ title: TagTypeUpdate
+ type: object
+ TagUpdate:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ discoverable:
+ default: true
+ title: Discoverable
+ type: boolean
+ external_id:
+ nullable: true
+ title: External Id
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ nullable: true
+ title: Name
+ type: string
+ source:
+ nullable: true
+ title: Source
+ type: string
+ tag_type:
+ $ref: "#/components/schemas/TagTypeUpdate"
+ uri:
+ nullable: true
+ title: Uri
+ type: string
+ title: TagUpdate
+ type: object
+ TaskCreate:
+ properties:
+ assignees:
+ default: []
+ items:
+ $ref: "#/components/schemas/ParticipantUpdate"
+ title: Assignees
+ type: array
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ creator:
+ $ref: "#/components/schemas/ParticipantUpdate"
+ description:
+ nullable: true
+ title: Description
+ type: string
+ incident:
+ $ref: "#/components/schemas/IncidentReadMinimal"
+ owner:
+ $ref: "#/components/schemas/ParticipantUpdate"
+ priority:
+ nullable: true
+ title: Priority
+ type: string
+ resolve_by:
+ format: date-time
+ title: Resolve By
+ type: string
+ resolved_at:
+ format: date-time
+ title: Resolved At
+ type: string
+ resource_id:
+ nullable: true
+ title: Resource Id
+ type: string
+ resource_type:
+ title: Resource Type
+ type: string
+ source:
+ nullable: true
+ title: Source
+ type: string
+ status:
+ allOf:
+ - $ref: "#/components/schemas/TaskStatus"
+ default: Open
+ updated_at:
+ format: date-time
+ title: Updated At
+ type: string
+ weblink:
+ nullable: true
+ title: Weblink
+ type: string
+ required:
+ - incident
+ title: TaskCreate
+ type: object
+ TaskStatus:
+ description: An enumeration.
+ enum:
+ - Open
+ - Resolved
+ title: TaskStatus
+ type: string
+ TaskUpdate:
+ properties:
+ assignees:
+ default: []
+ items:
+ $ref: "#/components/schemas/ParticipantUpdate"
+ title: Assignees
+ type: array
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ creator:
+ $ref: "#/components/schemas/ParticipantUpdate"
+ description:
+ nullable: true
+ title: Description
+ type: string
+ incident:
+ $ref: "#/components/schemas/IncidentReadMinimal"
+ owner:
+ $ref: "#/components/schemas/ParticipantUpdate"
+ priority:
+ nullable: true
+ title: Priority
+ type: string
+ resolve_by:
+ format: date-time
+ title: Resolve By
+ type: string
+ resolved_at:
+ format: date-time
+ title: Resolved At
+ type: string
+ resource_id:
+ nullable: true
+ title: Resource Id
+ type: string
+ resource_type:
+ nullable: true
+ title: Resource Type
+ type: string
+ source:
+ nullable: true
+ title: Source
+ type: string
+ status:
+ allOf:
+ - $ref: "#/components/schemas/TaskStatus"
+ default: Open
+ updated_at:
+ format: date-time
+ title: Updated At
+ type: string
+ weblink:
+ nullable: true
+ title: Weblink
+ type: string
+ required:
+ - incident
+ title: TaskUpdate
+ type: object
+ TeamContactCreate:
+ properties:
+ company:
+ nullable: true
+ title: Company
+ type: string
+ contact_type:
+ nullable: true
+ title: Contact Type
+ type: string
+ email:
+ format: email
+ title: Email
+ type: string
+ evergreen:
+ default: false
+ title: Evergreen
+ type: boolean
+ evergreen_last_reminder_at:
+ format: date-time
+ nullable: true
+ title: Evergreen Last Reminder At
+ type: string
+ evergreen_owner:
+ format: email
+ title: Evergreen Owner
+ type: string
+ evergreen_reminder_interval:
+ default: 90
+ title: Evergreen Reminder Interval
+ type: integer
+ filters:
+ default: []
+ items:
+ $ref: "#/components/schemas/SearchFilterRead"
+ title: Filters
+ type: array
+ is_active:
+ default: true
+ title: Is Active
+ type: boolean
+ is_external:
+ default: false
+ title: Is External
+ type: boolean
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ notes:
+ nullable: true
+ title: Notes
+ type: string
+ owner:
+ nullable: true
+ title: Owner
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ required:
+ - email
+ - name
+ - project
+ title: TeamContactCreate
+ type: object
+ TeamContactRead:
+ properties:
+ company:
+ nullable: true
+ title: Company
+ type: string
+ contact_type:
+ nullable: true
+ title: Contact Type
+ type: string
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ email:
+ format: email
+ title: Email
+ type: string
+ evergreen:
+ default: false
+ title: Evergreen
+ type: boolean
+ evergreen_last_reminder_at:
+ format: date-time
+ nullable: true
+ title: Evergreen Last Reminder At
+ type: string
+ evergreen_owner:
+ format: email
+ title: Evergreen Owner
+ type: string
+ evergreen_reminder_interval:
+ default: 90
+ title: Evergreen Reminder Interval
+ type: integer
+ filters:
+ default: []
+ items:
+ $ref: "#/components/schemas/SearchFilterRead"
+ title: Filters
+ type: array
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ is_active:
+ default: true
+ title: Is Active
+ type: boolean
+ is_external:
+ default: false
+ title: Is External
+ type: boolean
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ notes:
+ nullable: true
+ title: Notes
+ type: string
+ owner:
+ nullable: true
+ title: Owner
+ type: string
+ updated_at:
+ format: date-time
+ title: Updated At
+ type: string
+ required:
+ - email
+ - name
+ - id
+ - created_at
+ - updated_at
+ title: TeamContactRead
+ type: object
+ TeamContactUpdate:
+ properties:
+ company:
+ nullable: true
+ title: Company
+ type: string
+ contact_type:
+ nullable: true
+ title: Contact Type
+ type: string
+ email:
+ format: email
+ title: Email
+ type: string
+ evergreen:
+ default: false
+ title: Evergreen
+ type: boolean
+ evergreen_last_reminder_at:
+ format: date-time
+ nullable: true
+ title: Evergreen Last Reminder At
+ type: string
+ evergreen_owner:
+ format: email
+ title: Evergreen Owner
+ type: string
+ evergreen_reminder_interval:
+ default: 90
+ title: Evergreen Reminder Interval
+ type: integer
+ filters:
+ default: []
+ items:
+ $ref: "#/components/schemas/SearchFilterRead"
+ title: Filters
+ type: array
+ is_active:
+ default: true
+ title: Is Active
+ type: boolean
+ is_external:
+ default: false
+ title: Is External
+ type: boolean
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ notes:
+ nullable: true
+ title: Notes
+ type: string
+ owner:
+ nullable: true
+ title: Owner
+ type: string
+ required:
+ - email
+ - name
+ title: TeamContactUpdate
+ type: object
+ TeamPagination:
+ properties:
+ items:
+ default: []
+ items:
+ $ref: "#/components/schemas/TeamContactRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - total
+ title: TeamPagination
+ type: object
+ TermCreate:
+ properties:
+ definitions:
+ default: []
+ items:
+ $ref: "#/components/schemas/DefinitionRead"
+ title: Definitions
+ type: array
+ discoverable:
+ default: true
+ title: Discoverable
+ type: boolean
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ text:
+ nullable: true
+ title: Text
+ type: string
+ required:
+ - project
+ title: TermCreate
+ type: object
+ TermPagination:
+ properties:
+ items:
+ default: []
+ items:
+ $ref: "#/components/schemas/TermRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - total
+ title: TermPagination
+ type: object
+ TermRead:
+ properties:
+ definitions:
+ default: []
+ items:
+ $ref: "#/components/schemas/DefinitionRead"
+ title: Definitions
+ type: array
+ discoverable:
+ default: true
+ title: Discoverable
+ type: boolean
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ text:
+ nullable: true
+ title: Text
+ type: string
+ required:
+ - id
+ title: TermRead
+ type: object
+ TermUpdate:
+ properties:
+ definitions:
+ default: []
+ items:
+ $ref: "#/components/schemas/DefinitionRead"
+ title: Definitions
+ type: array
+ discoverable:
+ default: true
+ title: Discoverable
+ type: boolean
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ text:
+ nullable: true
+ title: Text
+ type: string
+ title: TermUpdate
+ type: object
+ TicketRead:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ resource_id:
+ nullable: true
+ title: Resource Id
+ type: string
+ resource_type:
+ nullable: true
+ title: Resource Type
+ type: string
+ weblink:
+ nullable: true
+ title: Weblink
+ type: string
+ title: TicketRead
+ type: object
+ UserLogin:
+ properties:
+ email:
+ format: email
+ title: Email
+ type: string
+ organizations:
+ default: []
+ items:
+ $ref: "#/components/schemas/UserOrganization"
+ title: Organizations
+ type: array
+ password:
+ title: Password
+ type: string
+ projects:
+ default: []
+ items:
+ $ref: "#/components/schemas/UserProject"
+ title: Projects
+ type: array
+ required:
+ - email
+ - password
+ title: UserLogin
+ type: object
+ UserLoginResponse:
+ properties:
+ projects:
+ items:
+ $ref: "#/components/schemas/UserProject"
+ title: Projects
+ type: array
+ token:
+ nullable: true
+ title: Token
+ type: string
+ title: UserLoginResponse
+ type: object
+ UserOrganization:
+ properties:
+ default:
+ default: false
+ title: Default
+ type: boolean
+ organization:
+ $ref: "#/components/schemas/OrganizationRead"
+ role:
+ nullable: true
+ title: Role
+ type: string
+ required:
+ - organization
+ title: UserOrganization
+ type: object
+ UserPagination:
+ properties:
+ items:
+ default: []
+ items:
+ $ref: "#/components/schemas/UserRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - total
+ title: UserPagination
+ type: object
+ UserProject:
+ properties:
+ default:
+ default: false
+ title: Default
+ type: boolean
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ role:
+ nullable: true
+ title: Role
+ type: string
+ required:
+ - project
+ title: UserProject
+ type: object
+ UserRead:
+ properties:
+ email:
+ format: email
+ title: Email
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ organizations:
+ default: []
+ items:
+ $ref: "#/components/schemas/UserOrganization"
+ title: Organizations
+ type: array
+ projects:
+ default: []
+ items:
+ $ref: "#/components/schemas/UserProject"
+ title: Projects
+ type: array
+ role:
+ nullable: true
+ title: Role
+ type: string
+ required:
+ - email
+ - id
+ title: UserRead
+ type: object
+ UserRegister:
+ properties:
+ email:
+ format: email
+ title: Email
+ type: string
+ organizations:
+ default: []
+ items:
+ $ref: "#/components/schemas/UserOrganization"
+ title: Organizations
+ type: array
+ password:
+ nullable: true
+ title: Password
+ type: string
+ projects:
+ default: []
+ items:
+ $ref: "#/components/schemas/UserProject"
+ title: Projects
+ type: array
+ required:
+ - email
+ title: UserRegister
+ type: object
+ UserRegisterResponse:
+ properties:
+ token:
+ nullable: true
+ title: Token
+ type: string
+ title: UserRegisterResponse
+ type: object
+ UserUpdate:
+ properties:
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ organizations:
+ items:
+ $ref: "#/components/schemas/UserOrganization"
+ title: Organizations
+ type: array
+ password:
+ nullable: true
+ title: Password
+ type: string
+ projects:
+ items:
+ $ref: "#/components/schemas/UserProject"
+ title: Projects
+ type: array
+ role:
+ nullable: true
+ title: Role
+ type: string
+ required:
+ - id
+ title: UserUpdate
+ type: object
+ ValidationError:
+ properties:
+ loc:
+ items:
+ anyOf:
+ - type: string
+ - type: integer
+ title: Location
+ type: array
+ msg:
+ title: Message
+ type: string
+ type:
+ title: Error Type
+ type: string
+ required:
+ - loc
+ - msg
+ - type
+ title: ValidationError
+ type: object
+ Visibility:
+ description: An enumeration.
+ enum:
+ - Open
+ - Restricted
+ title: Visibility
+ type: string
+ WorkflowCase:
+ properties:
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ required:
+ - id
+ title: WorkflowCase
+ type: object
+ WorkflowCreate:
+ properties:
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ parameters:
+ default: []
+ items:
+ type: object
+ title: Parameters
+ type: array
+ plugin_instance:
+ $ref: "#/components/schemas/PluginInstanceRead"
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ resource_id:
+ title: Resource Id
+ type: string
+ updated_at:
+ format: date-time
+ title: Updated At
+ type: string
+ required:
+ - name
+ - resource_id
+ - plugin_instance
+ - project
+ title: WorkflowCreate
+ type: object
+ WorkflowIncident:
+ properties:
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ required:
+ - id
+ title: WorkflowIncident
+ type: object
+ WorkflowInstanceCreate:
+ properties:
+ artifacts:
+ default: []
+ items:
+ $ref: "#/components/schemas/DocumentCreate"
+ title: Artifacts
+ type: array
+ case:
+ $ref: "#/components/schemas/WorkflowCase"
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ creator:
+ $ref: "#/components/schemas/ParticipantRead"
+ incident:
+ $ref: "#/components/schemas/WorkflowIncident"
+ parameters:
+ default: []
+ items:
+ type: object
+ title: Parameters
+ type: array
+ resource_id:
+ nullable: true
+ title: Resource Id
+ type: string
+ resource_type:
+ nullable: true
+ title: Resource Type
+ type: string
+ run_reason:
+ nullable: true
+ title: Run Reason
+ type: string
+ signal:
+ $ref: "#/components/schemas/WorkflowSignal"
+ status:
+ $ref: "#/components/schemas/WorkflowInstanceStatus"
+ updated_at:
+ format: date-time
+ title: Updated At
+ type: string
+ weblink:
+ nullable: true
+ title: Weblink
+ type: string
+ title: WorkflowInstanceCreate
+ type: object
+ WorkflowInstanceRead:
+ properties:
+ artifacts:
+ default: []
+ items:
+ $ref: "#/components/schemas/DocumentCreate"
+ title: Artifacts
+ type: array
+ case:
+ $ref: "#/components/schemas/WorkflowCase"
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ creator:
+ $ref: "#/components/schemas/ParticipantRead"
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ incident:
+ $ref: "#/components/schemas/WorkflowIncident"
+ parameters:
+ default: []
+ items:
+ type: object
+ title: Parameters
+ type: array
+ resource_id:
+ nullable: true
+ title: Resource Id
+ type: string
+ resource_type:
+ nullable: true
+ title: Resource Type
+ type: string
+ run_reason:
+ nullable: true
+ title: Run Reason
+ type: string
+ signal:
+ $ref: "#/components/schemas/WorkflowSignal"
+ status:
+ $ref: "#/components/schemas/WorkflowInstanceStatus"
+ updated_at:
+ format: date-time
+ title: Updated At
+ type: string
+ weblink:
+ nullable: true
+ title: Weblink
+ type: string
+ workflow:
+ $ref: "#/components/schemas/WorkflowRead"
+ required:
+ - id
+ - workflow
+ title: WorkflowInstanceRead
+ type: object
+ WorkflowInstanceStatus:
+ description: An enumeration.
+ enum:
+ - Submitted
+ - Created
+ - Running
+ - Completed
+ - Failed
+ title: WorkflowInstanceStatus
+ type: string
+ WorkflowPagination:
+ properties:
+ items:
+ default: []
+ items:
+ $ref: "#/components/schemas/WorkflowRead"
+ title: Items
+ type: array
+ total:
+ title: Total
+ type: integer
+ required:
+ - total
+ title: WorkflowPagination
+ type: object
+ WorkflowRead:
+ properties:
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ parameters:
+ default: []
+ items:
+ type: object
+ title: Parameters
+ type: array
+ plugin_instance:
+ $ref: "#/components/schemas/PluginInstanceRead"
+ resource_id:
+ title: Resource Id
+ type: string
+ updated_at:
+ format: date-time
+ title: Updated At
+ type: string
+ required:
+ - name
+ - resource_id
+ - plugin_instance
+ - id
+ title: WorkflowRead
+ type: object
+ WorkflowSignal:
+ properties:
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ required:
+ - id
+ title: WorkflowSignal
+ type: object
+ WorkflowUpdate:
+ properties:
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ description:
+ nullable: true
+ title: Description
+ type: string
+ enabled:
+ title: Enabled
+ type: boolean
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ parameters:
+ default: []
+ items:
+ type: object
+ title: Parameters
+ type: array
+ plugin_instance:
+ $ref: "#/components/schemas/PluginInstanceRead"
+ resource_id:
+ title: Resource Id
+ type: string
+ updated_at:
+ format: date-time
+ title: Updated At
+ type: string
+ required:
+ - name
+ - resource_id
+ - plugin_instance
+ title: WorkflowUpdate
+ type: object
+ dispatch__case__models__CaseRead:
+ properties:
+ assignee:
+ $ref: "#/components/schemas/ParticipantRead"
+ case_priority:
+ $ref: "#/components/schemas/CasePriorityRead"
+ case_severity:
+ $ref: "#/components/schemas/CaseSeverityRead"
+ case_type:
+ $ref: "#/components/schemas/CaseTypeRead"
+ closed_at:
+ format: date-time
+ title: Closed At
+ type: string
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ description:
+ title: Description
+ type: string
+ documents:
+ default: []
+ items:
+ $ref: "#/components/schemas/DocumentRead"
+ title: Documents
+ type: array
+ duplicates:
+ default: []
+ items:
+ $ref: "#/components/schemas/CaseReadMinimal"
+ title: Duplicates
+ type: array
+ escalated_at:
+ format: date-time
+ title: Escalated At
+ type: string
+ events:
+ default: []
+ items:
+ $ref: "#/components/schemas/EventRead"
+ title: Events
+ type: array
+ groups:
+ default: []
+ items:
+ $ref: "#/components/schemas/GroupRead"
+ title: Groups
+ type: array
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ incidents:
+ default: []
+ items:
+ $ref: "#/components/schemas/IncidentReadMinimal"
+ title: Incidents
+ type: array
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ participants:
+ default: []
+ items:
+ $ref: "#/components/schemas/ParticipantRead"
+ title: Participants
+ type: array
+ project:
+ $ref: "#/components/schemas/dispatch__case__models__ProjectRead"
+ related:
+ default: []
+ items:
+ $ref: "#/components/schemas/CaseReadMinimal"
+ title: Related
+ type: array
+ reported_at:
+ format: date-time
+ title: Reported At
+ type: string
+ resolution:
+ title: Resolution
+ type: string
+ resolution_reason:
+ $ref: "#/components/schemas/CaseResolutionReason"
+ signal_instances:
+ default: []
+ items:
+ $ref: "#/components/schemas/dispatch__case__models__SignalInstanceRead"
+ title: Signal Instances
+ type: array
+ status:
+ $ref: "#/components/schemas/CaseStatus"
+ storage:
+ $ref: "#/components/schemas/StorageRead"
+ tags:
+ default: []
+ items:
+ $ref: "#/components/schemas/TagRead"
+ title: Tags
+ type: array
+ ticket:
+ $ref: "#/components/schemas/TicketRead"
+ title:
+ title: Title
+ type: string
+ triage_at:
+ format: date-time
+ title: Triage At
+ type: string
+ visibility:
+ $ref: "#/components/schemas/Visibility"
+ workflow_instances:
+ default: []
+ items:
+ $ref: "#/components/schemas/WorkflowInstanceRead"
+ title: Workflow Instances
+ type: array
+ required:
+ - title
+ - id
+ - case_priority
+ - case_severity
+ - case_type
+ - project
+ title: CaseRead
+ type: object
+ dispatch__case__models__ProjectRead:
+ properties:
+ color:
+ title: Color
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ required:
+ - name
+ title: ProjectRead
+ type: object
+ dispatch__case__models__SignalInstanceRead:
+ properties:
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ entities:
+ default: []
+ items:
+ $ref: "#/components/schemas/EntityRead"
+ title: Entities
+ type: array
+ fingerprint:
+ title: Fingerprint
+ type: string
+ raw:
+ title: Raw
+ signal:
+ $ref: "#/components/schemas/dispatch__case__models__SignalRead"
+ tags:
+ default: []
+ items:
+ $ref: "#/components/schemas/TagRead"
+ title: Tags
+ type: array
+ required:
+ - signal
+ - created_at
+ title: SignalInstanceRead
+ type: object
+ dispatch__case__models__SignalRead:
+ properties:
+ description:
+ title: Description
+ type: string
+ external_id:
+ title: External Id
+ type: string
+ external_url:
+ title: External Url
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ title: Name
+ type: string
+ owner:
+ title: Owner
+ type: string
+ variant:
+ title: Variant
+ type: string
+ workflow_instances:
+ default: []
+ items:
+ $ref: "#/components/schemas/WorkflowInstanceRead"
+ title: Workflow Instances
+ type: array
+ required:
+ - id
+ - name
+ - owner
+ - external_id
+ title: SignalRead
+ type: object
+ dispatch__case__type__models__Document:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ resource_id:
+ nullable: true
+ title: Resource Id
+ type: string
+ resource_type:
+ nullable: true
+ title: Resource Type
+ type: string
+ weblink:
+ title: Weblink
+ type: string
+ required:
+ - id
+ - name
+ - weblink
+ title: Document
+ type: object
+ dispatch__case__type__models__Service:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ external_id:
+ title: External Id
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ is_active:
+ title: Is Active
+ type: boolean
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ type:
+ nullable: true
+ title: Type
+ type: string
+ required:
+ - id
+ - external_id
+ - name
+ title: Service
+ type: object
+ dispatch__incident__models__CaseRead:
+ properties:
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ required:
+ - id
+ title: CaseRead
+ type: object
+ dispatch__incident__models__ProjectRead:
+ properties:
+ color:
+ title: Color
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ required:
+ - name
+ title: ProjectRead
+ type: object
+ dispatch__incident__models__TaskRead:
+ properties:
+ assignees:
+ default: []
+ items:
+ $ref: "#/components/schemas/ParticipantRead"
+ title: Assignees
+ type: array
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ description:
+ nullable: true
+ title: Description
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ status:
+ allOf:
+ - $ref: "#/components/schemas/TaskStatus"
+ default: Open
+ weblink:
+ title: Weblink
+ type: string
+ required:
+ - id
+ title: TaskRead
+ type: object
+ dispatch__incident__type__models__Document:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ resource_id:
+ nullable: true
+ title: Resource Id
+ type: string
+ resource_type:
+ nullable: true
+ title: Resource Type
+ type: string
+ weblink:
+ title: Weblink
+ type: string
+ required:
+ - id
+ - name
+ - weblink
+ title: Document
+ type: object
+ dispatch__project__models__ProjectRead:
+ properties:
+ annual_employee_cost:
+ title: Annual Employee Cost
+ type: integer
+ business_year_hours:
+ title: Business Year Hours
+ type: integer
+ color:
+ nullable: true
+ title: Color
+ type: string
+ default:
+ default: false
+ title: Default
+ type: boolean
+ description:
+ nullable: true
+ title: Description
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ owner_conversation:
+ nullable: true
+ title: Owner Conversation
+ type: string
+ owner_email:
+ format: email
+ nullable: true
+ title: Owner Email
+ type: string
+ required:
+ - name
+ title: ProjectRead
+ type: object
+ dispatch__signal__models__Service:
+ properties:
+ description:
+ nullable: true
+ title: Description
+ type: string
+ external_id:
+ title: External Id
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ is_active:
+ title: Is Active
+ type: boolean
+ name:
+ minLength: 3
+ pattern: ^(?!\s*$).+
+ title: Name
+ type: string
+ type:
+ nullable: true
+ title: Type
+ type: string
+ required:
+ - id
+ - external_id
+ - name
+ title: Service
+ type: object
+ dispatch__signal__models__SignalInstanceRead:
+ properties:
+ case:
+ $ref: "#/components/schemas/dispatch__case__models__CaseRead"
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ entities:
+ default: []
+ items:
+ $ref: "#/components/schemas/EntityRead"
+ title: Entities
+ type: array
+ filter_action:
+ $ref: "#/components/schemas/SignalFilterAction"
+ fingerprint:
+ title: Fingerprint
+ type: string
+ id:
+ format: uuid
+ title: Id
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ raw:
+ title: Raw
+ type: object
+ signal:
+ $ref: "#/components/schemas/dispatch__signal__models__SignalRead"
+ required:
+ - project
+ - raw
+ - id
+ - signal
+ title: SignalInstanceRead
+ type: object
+ dispatch__signal__models__SignalRead:
+ properties:
+ case_priority:
+ $ref: "#/components/schemas/CasePriorityRead"
+ case_type:
+ $ref: "#/components/schemas/CaseTypeRead"
+ conversation_target:
+ title: Conversation Target
+ type: string
+ create_case:
+ default: true
+ title: Create Case
+ type: boolean
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ description:
+ title: Description
+ type: string
+ enabled:
+ default: false
+ title: Enabled
+ type: boolean
+ entity_types:
+ default: []
+ items:
+ $ref: "#/components/schemas/EntityTypeRead"
+ title: Entity Types
+ type: array
+ external_id:
+ title: External Id
+ type: string
+ external_url:
+ title: External Url
+ type: string
+ filters:
+ default: []
+ items:
+ $ref: "#/components/schemas/SignalFilterRead"
+ title: Filters
+ type: array
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ name:
+ title: Name
+ type: string
+ oncall_service:
+ $ref: "#/components/schemas/dispatch__signal__models__Service"
+ owner:
+ title: Owner
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ source:
+ $ref: "#/components/schemas/SourceBase"
+ tags:
+ default: []
+ items:
+ $ref: "#/components/schemas/TagRead"
+ title: Tags
+ type: array
+ variant:
+ title: Variant
+ type: string
+ workflows:
+ default: []
+ items:
+ $ref: "#/components/schemas/WorkflowRead"
+ title: Workflows
+ type: array
+ required:
+ - name
+ - owner
+ - external_id
+ - project
+ - id
+ title: SignalRead
+ type: object
+ dispatch__task__models__TaskRead:
+ properties:
+ assignees:
+ default: []
+ items:
+ $ref: "#/components/schemas/ParticipantRead"
+ title: Assignees
+ type: array
+ created_at:
+ format: date-time
+ title: Created At
+ type: string
+ creator:
+ $ref: "#/components/schemas/ParticipantRead"
+ description:
+ nullable: true
+ title: Description
+ type: string
+ id:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Id
+ type: integer
+ incident:
+ $ref: "#/components/schemas/IncidentReadMinimal"
+ owner:
+ $ref: "#/components/schemas/ParticipantRead"
+ priority:
+ nullable: true
+ title: Priority
+ type: string
+ project:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ resolve_by:
+ format: date-time
+ title: Resolve By
+ type: string
+ resolved_at:
+ format: date-time
+ title: Resolved At
+ type: string
+ resource_id:
+ nullable: true
+ title: Resource Id
+ type: string
+ resource_type:
+ nullable: true
+ title: Resource Type
+ type: string
+ source:
+ nullable: true
+ title: Source
+ type: string
+ status:
+ allOf:
+ - $ref: "#/components/schemas/TaskStatus"
+ default: Open
+ updated_at:
+ format: date-time
+ title: Updated At
+ type: string
+ weblink:
+ nullable: true
+ title: Weblink
+ type: string
+ required:
+ - incident
+ - id
+ title: TaskRead
+ type: object
+info:
+ description: Welcome to Dispatch's API documentation! Here you will be able to
+ discover all of the ways you can interact with the Dispatch API.
+ title: Dispatch
+ version: 0.1.0
+openapi: 3.0.2
+paths:
+ /organizations:
+ get:
+ description: Get all organizations.
+ operationId: get_organizations_organizations_get
+ parameters:
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/OrganizationPagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Organizations
+ tags:
+ - organizations
+ post:
+ description: Create a new organization.
+ operationId: create_organization_organizations_post
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/OrganizationCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/OrganizationRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Organization
+ tags:
+ - organizations
+ /organizations/{organization_id}:
+ get:
+ description: Get an organization.
+ operationId: get_organization_organizations__organization_id__get
+ parameters:
+ - in: path
+ name: organization_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Organization Id
+ type: integer
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/OrganizationRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Organization
+ tags:
+ - organizations
+ put:
+ description: Update an organization.
+ operationId: update_organization_organizations__organization_id__put
+ parameters:
+ - in: path
+ name: organization_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Organization Id
+ type: integer
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/OrganizationUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/OrganizationRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Organization
+ tags:
+ - organizations
+ /{organization}/auth/login:
+ post:
+ operationId: login_user__organization__auth_login_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/UserLogin"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/UserLoginResponse"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Login User
+ tags:
+ - auth
+ /{organization}/auth/me:
+ get:
+ operationId: get_me__organization__auth_me_get
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/UserRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Me
+ tags:
+ - auth
+ /{organization}/auth/register:
+ post:
+ operationId: register_user__organization__auth_register_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/UserRegister"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/UserRegisterResponse"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Register User
+ tags:
+ - auth
+ /{organization}/case_priorities:
+ get:
+ description: Returns all case priorities.
+ operationId: get_case_priorities__organization__case_priorities_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/CasePriorityPagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Case Priorities
+ tags:
+ - case_priorities
+ post:
+ description: Creates a new case priority.
+ operationId: create_case_priority__organization__case_priorities_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/CasePriorityCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/CasePriorityRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Case Priority
+ tags:
+ - case_priorities
+ /{organization}/case_priorities/{case_priority_id}:
+ get:
+ description: Gets a case priority.
+ operationId: get_case_priority__organization__case_priorities__case_priority_id__get
+ parameters:
+ - in: path
+ name: case_priority_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Case Priority Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/CasePriorityRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Case Priority
+ tags:
+ - case_priorities
+ put:
+ description: Updates an existing case priority.
+ operationId: update_case_priority__organization__case_priorities__case_priority_id__put
+ parameters:
+ - in: path
+ name: case_priority_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Case Priority Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/CasePriorityUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/CasePriorityRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Case Priority
+ tags:
+ - case_priorities
+ /{organization}/case_severities:
+ get:
+ description: Returns all case severities.
+ operationId: get_case_severities__organization__case_severities_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/CaseSeverityPagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Case Severities
+ tags:
+ - case_severities
+ post:
+ description: Creates a new case severity.
+ operationId: create_case_severity__organization__case_severities_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/CaseSeverityCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/CaseSeverityRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Case Severity
+ tags:
+ - case_severities
+ /{organization}/case_severities/{case_severity_id}:
+ get:
+ description: Gets a case severity.
+ operationId: get_case_severity__organization__case_severities__case_severity_id__get
+ parameters:
+ - in: path
+ name: case_severity_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Case Severity Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/CaseSeverityRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Case Severity
+ tags:
+ - case_severities
+ put:
+ description: Updates an existing case severity.
+ operationId: update_case_severity__organization__case_severities__case_severity_id__put
+ parameters:
+ - in: path
+ name: case_severity_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Case Severity Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/CaseSeverityUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/CaseSeverityRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Case Severity
+ tags:
+ - case_severities
+ /{organization}/case_types:
+ get:
+ description: Returns all case types.
+ operationId: get_case_types__organization__case_types_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/CaseTypePagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Case Types
+ tags:
+ - case_types
+ post:
+ description: Creates a new case type.
+ operationId: create_case_type__organization__case_types_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/CaseTypeCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/CaseTypeRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Case Type
+ tags:
+ - case_types
+ /{organization}/case_types/{case_type_id}:
+ get:
+ description: Gets a case type.
+ operationId: get_case_type__organization__case_types__case_type_id__get
+ parameters:
+ - in: path
+ name: case_type_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Case Type Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/CaseTypeRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Case Type
+ tags:
+ - case_types
+ put:
+ description: Updates an existing case type.
+ operationId: update_case_type__organization__case_types__case_type_id__put
+ parameters:
+ - in: path
+ name: case_type_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Case Type Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/CaseTypeUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/CaseTypeRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Case Type
+ tags:
+ - case_types
+ /{organization}/cases:
+ get:
+ description: Retrieves all cases.
+ operationId: get_cases__organization__cases_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: include[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Include[]
+ type: array
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Retrieves a list of cases.
+ tags:
+ - cases
+ post:
+ description: Creates a new case.
+ operationId: create_case__organization__cases_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/CaseCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/dispatch__case__models__CaseRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Creates a new case.
+ tags:
+ - cases
+ /{organization}/cases/{case_id}:
+ delete:
+ description: Deletes an existing case and its external resources.
+ operationId: delete_case__organization__cases__case_id__delete
+ parameters:
+ - in: path
+ name: case_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Case Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Deletes an existing case and its external resources.
+ tags:
+ - cases
+ get:
+ description: Retrieves the details of a single case.
+ operationId: get_case__organization__cases__case_id__get
+ parameters:
+ - in: path
+ name: case_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Case Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/dispatch__case__models__CaseRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Retrieves a single case.
+ tags:
+ - cases
+ put:
+ description: Updates an existing case.
+ operationId: update_case__organization__cases__case_id__put
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: path
+ name: case_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Case Id
+ type: integer
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/CaseUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/dispatch__case__models__CaseRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Updates an existing case.
+ tags:
+ - cases
+ /{organization}/cases/{case_id}/escalate:
+ put:
+ description: Escalates an existing case.
+ operationId: escalate_case__organization__cases__case_id__escalate_put
+ parameters:
+ - in: path
+ name: case_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Case Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Escalates an existing case.
+ tags:
+ - cases
+ /{organization}/data/alerts:
+ post:
+ description: Creates a new alert.
+ operationId: create_alert__organization__data_alerts_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/AlertCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/AlertRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Alert
+ tags:
+ - alerts
+ /{organization}/data/alerts/{alert_id}:
+ delete:
+ description: Deletes an alert, returning only an HTTP 200 OK if successful.
+ operationId: delete_alert__organization__data_alerts__alert_id__delete
+ parameters:
+ - in: path
+ name: alert_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Alert Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Delete Alert
+ tags:
+ - alerts
+ get:
+ description: Given its unique id, retrieve details about a single alert.
+ operationId: get_alert__organization__data_alerts__alert_id__get
+ parameters:
+ - in: path
+ name: alert_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Alert Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/AlertRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Alert
+ tags:
+ - alerts
+ put:
+ description: Updates an alert.
+ operationId: update_alert__organization__data_alerts__alert_id__put
+ parameters:
+ - in: path
+ name: alert_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Alert Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/AlertUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/AlertRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Alert
+ tags:
+ - alerts
+ /{organization}/data/queries:
+ get:
+ description: Get all queries, or only those matching a given search term.
+ operationId: get_queries__organization__data_queries_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/QueryPagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Queries
+ tags:
+ - queries
+ post:
+ description: Creates a new data query.
+ operationId: create_query__organization__data_queries_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/QueryCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/QueryRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Query
+ tags:
+ - queries
+ /{organization}/data/queries/{query_id}:
+ delete:
+ description: Deletes a data query, returning only an HTTP 200 OK if successful.
+ operationId: delete_query__organization__data_queries__query_id__delete
+ parameters:
+ - in: path
+ name: query_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Query Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Delete Query
+ tags:
+ - queries
+ get:
+ description: Given its unique ID, retrieve details about a single query.
+ operationId: get_query__organization__data_queries__query_id__get
+ parameters:
+ - in: path
+ name: query_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Query Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/QueryRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Query
+ tags:
+ - queries
+ put:
+ description: Updates a data query.
+ operationId: update_query__organization__data_queries__query_id__put
+ parameters:
+ - in: path
+ name: query_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Query Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/QueryUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/QueryRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Query
+ tags:
+ - queries
+ /{organization}/data/sources:
+ get:
+ description: Get all sources, or only those matching a given search term.
+ operationId: get_sources__organization__data_sources_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourcePagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Sources
+ tags:
+ - sources
+ post:
+ description: Creates a new source.
+ operationId: create_source__organization__data_sources_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Source
+ tags:
+ - sources
+ /{organization}/data/sources/dataFormats:
+ get:
+ description: Get all source data formats, or only those matching a given search
+ term.
+ operationId: get_source_data_formats__organization__data_sources_dataFormats_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceDataFormatPagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Source Data Formats
+ tags:
+ - source_data_formats
+ post:
+ description: Creates a new source data format.
+ operationId: create_source_data_format__organization__data_sources_dataFormats_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceDataFormatCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceDataFormatRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Source Data Format
+ tags:
+ - source_data_formats
+ /{organization}/data/sources/dataFormats/{source_data_format_id}:
+ delete:
+ description: Delete a source data format, returning only an HTTP 200 OK if successful.
+ operationId: delete_source_data_format__organization__data_sources_dataFormats__source_data_format_id__delete
+ parameters:
+ - in: path
+ name: source_data_format_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Source Data Format Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Delete Source Data Format
+ tags:
+ - source_data_formats
+ get:
+ description: Given its unique id, retrieve details about a source data format.
+ operationId: get_source_data_format__organization__data_sources_dataFormats__source_data_format_id__get
+ parameters:
+ - in: path
+ name: source_data_format_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Source Data Format Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceDataFormatRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Source Data Format
+ tags:
+ - source_data_formats
+ put:
+ description: Updates a source data format.
+ operationId: update_source_data_format__organization__data_sources_dataFormats__source_data_format_id__put
+ parameters:
+ - in: path
+ name: source_data_format_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Source Data Format Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceDataFormatUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceDataFormatRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Source Data Format
+ tags:
+ - source_data_formats
+ /{organization}/data/sources/environments:
+ get:
+ description: Get all source environments, or only those matching
+ a given search term.
+ operationId: get_source_environments__organization__data_sources_environments_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceEnvironmentPagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Source Environments
+ tags:
+ - source_environments
+ post:
+ description: Creates a new source environment.
+ operationId: create_source_environment__organization__data_sources_environments_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceEnvironmentCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceEnvironmentRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Source Environment
+ tags:
+ - source_environments
+ /{organization}/data/sources/environments/{source_environment_id}:
+ delete:
+ description: Delete a source environment, returning only an HTTP 200 OK if successful.
+ operationId: delete_source_environment__organization__data_sources_environments__source_environment_id__delete
+ parameters:
+ - in: path
+ name: source_environment_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Source Environment Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Delete Source Environment
+ tags:
+ - source_environments
+ get:
+ description: Given its unique id, retrieve details about a single source
+ environment.
+ operationId: get_source_environment__organization__data_sources_environments__source_environment_id__get
+ parameters:
+ - in: path
+ name: source_environment_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Source Environment Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceEnvironmentRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Source Environment
+ tags:
+ - source_environments
+ put:
+ description: Updates a source environment.
+ operationId: update_source_environment__organization__data_sources_environments__source_environment_id__put
+ parameters:
+ - in: path
+ name: source_environment_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Source Environment Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceEnvironmentUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceEnvironmentRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Source Environment
+ tags:
+ - source_environments
+ /{organization}/data/sources/statuses:
+ get:
+ description: Get all source statuses, or only those matching a given search
+ term.
+ operationId: get_source_statuses__organization__data_sources_statuses_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceStatusPagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Source Statuses
+ tags:
+ - source_statuses
+ post:
+ description: Creates a new source status.
+ operationId: create_source_status__organization__data_sources_statuses_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceStatusCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceStatusRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Source Status
+ tags:
+ - source_statuses
+ /{organization}/data/sources/statuses/{source_status_id}:
+ delete:
+ description: Deletes a source status, returning only an HTTP 200 OK if successful.
+ operationId: delete_source_status__organization__data_sources_statuses__source_status_id__delete
+ parameters:
+ - in: path
+ name: source_status_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Source Status Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Delete Source Status
+ tags:
+ - source_statuses
+ get:
+ description: Given its unique id, retrieve details about a single source status.
+ operationId: get_source_status__organization__data_sources_statuses__source_status_id__get
+ parameters:
+ - in: path
+ name: source_status_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Source Status Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceStatusRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Source Status
+ tags:
+ - source_statuses
+ put:
+ description: Updates a source status.
+ operationId: update_source_status__organization__data_sources_statuses__source_status_id__put
+ parameters:
+ - in: path
+ name: source_status_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Source Status Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceStatusUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceStatusRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Source Status
+ tags:
+ - source_statuses
+ /{organization}/data/sources/transports:
+ get:
+ description: Get all source transports, or only those matching a given search
+ term.
+ operationId: get_source_transports__organization__data_sources_transports_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceTransportPagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Source Transports
+ tags:
+ - source_transports
+ post:
+ description: Creates a new source transport.
+ operationId: create_source_transport__organization__data_sources_transports_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceTransportCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceTransportRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Source Transport
+ tags:
+ - source_transports
+ /{organization}/data/sources/transports/{source_transport_id}:
+ delete:
+ description: Deletes a source transport, returning only an HTTP 200 OK if successful.
+ operationId: delete_source_transport__organization__data_sources_transports__source_transport_id__delete
+ parameters:
+ - in: path
+ name: source_transport_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Source Transport Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Delete Source Transport
+ tags:
+ - source_transports
+ get:
+ description: Given its unique id, retrieve details about a single source transport.
+ operationId: get_source_transport__organization__data_sources_transports__source_transport_id__get
+ parameters:
+ - in: path
+ name: source_transport_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Source Transport Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceTransportRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Source Transport
+ tags:
+ - source_transports
+ put:
+ description: Updates a source transport.
+ operationId: update_source_transport__organization__data_sources_transports__source_transport_id__put
+ parameters:
+ - in: path
+ name: source_transport_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Source Transport Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceTransportUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceTransportRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Source Transport
+ tags:
+ - source_transports
+ /{organization}/data/sources/types:
+ get:
+ description: Get all source types, or only those matching a given search term.
+ operationId: get_source_types__organization__data_sources_types_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceTypePagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Source Types
+ tags:
+ - source_types
+ post:
+ description: Creates a new source type.
+ operationId: create_source_type__organization__data_sources_types_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceTypeCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceTypeRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Source Type
+ tags:
+ - source_types
+ /{organization}/data/sources/types/{source_type_id}:
+ delete:
+ description: Deletes a source type, returning only an HTTP 200 OK if successful.
+ operationId: delete_source_type__organization__data_sources_types__source_type_id__delete
+ parameters:
+ - in: path
+ name: source_type_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Source Type Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Delete Source Type
+ tags:
+ - source_types
+ get:
+ description: Given its unique id, retrieve details about a single source type.
+ operationId: get_source_type__organization__data_sources_types__source_type_id__get
+ parameters:
+ - in: path
+ name: source_type_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Source Type Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceTypeRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Source Type
+ tags:
+ - source_types
+ put:
+ description: Updates a source type.
+ operationId: update_source_type__organization__data_sources_types__source_type_id__put
+ parameters:
+ - in: path
+ name: source_type_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Source Type Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceTypeUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceTypeRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Source Type
+ tags:
+ - source_types
+ /{organization}/data/sources/{source_id}:
+ delete:
+ description: Deletes a source, returning only an HTTP 200 OK if successful.
+ operationId: delete_source__organization__data_sources__source_id__delete
+ parameters:
+ - in: path
+ name: source_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Source Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Delete Source
+ tags:
+ - sources
+ get:
+ description: Given its unique id, retrieve details about a single source.
+ operationId: get_source__organization__data_sources__source_id__get
+ parameters:
+ - in: path
+ name: source_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Source Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Source
+ tags:
+ - sources
+ put:
+ description: Updates a source.
+ operationId: update_source__organization__data_sources__source_id__put
+ parameters:
+ - in: path
+ name: source_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Source Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SourceRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Source
+ tags:
+ - sources
+ /{organization}/definitions:
+ get:
+ description: Get all definitions.
+ operationId: get_definitions__organization__definitions_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/DefinitionPagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Definitions
+ tags:
+ - definitions
+ post:
+ description: Create a new definition.
+ operationId: create_definition__organization__definitions_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/DefinitionCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/DefinitionRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Definition
+ tags:
+ - definitions
+ /{organization}/definitions/{definition_id}:
+ delete:
+ description: Delete a definition.
+ operationId: delete_definition__organization__definitions__definition_id__delete
+ parameters:
+ - in: path
+ name: definition_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Definition Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Delete Definition
+ tags:
+ - definitions
+ get:
+ description: Get a definition.
+ operationId: get_definition__organization__definitions__definition_id__get
+ parameters:
+ - in: path
+ name: definition_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Definition Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/DefinitionRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Definition
+ tags:
+ - definitions
+ put:
+ description: Update a definition.
+ operationId: update_definition__organization__definitions__definition_id__put
+ parameters:
+ - in: path
+ name: definition_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Definition Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/DefinitionUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/DefinitionRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Definition
+ tags:
+ - definitions
+ /{organization}/documents:
+ get:
+ description: Get all documents.
+ operationId: get_documents__organization__documents_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/DocumentPagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Documents
+ tags:
+ - documents
+ post:
+ description: Create a new document.
+ operationId: create_document__organization__documents_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/DocumentCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/DocumentRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Document
+ tags:
+ - documents
+ /{organization}/documents/{document_id}:
+ delete:
+ description: Delete a document.
+ operationId: delete_document__organization__documents__document_id__delete
+ parameters:
+ - in: path
+ name: document_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Document Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Delete Document
+ tags:
+ - documents
+ get:
+ description: Get a document.
+ operationId: get_document__organization__documents__document_id__get
+ parameters:
+ - in: path
+ name: document_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Document Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/DocumentRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Document
+ tags:
+ - documents
+ put:
+ description: Update a document.
+ operationId: update_document__organization__documents__document_id__put
+ parameters:
+ - in: path
+ name: document_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Document Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/DocumentUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/DocumentRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Document
+ tags:
+ - documents
+ /{organization}/entity:
+ get:
+ description: Get all entities, or only those matching a given search term.
+ operationId: get_entities__organization__entity_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/EntityPagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Entities
+ tags:
+ - entities
+ post:
+ description: Creates a new entity.
+ operationId: create_entity__organization__entity_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/EntityCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/EntityRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Entity
+ tags:
+ - entities
+ /{organization}/entity/{entity_id}:
+ delete:
+ description: Deletes an entity, returning only an HTTP 200 OK if successful.
+ operationId: delete_entity__organization__entity__entity_id__delete
+ parameters:
+ - in: path
+ name: entity_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Entity Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Delete Entity
+ tags:
+ - entities
+ get:
+ description: Given its unique id, retrieve details about a single entity.
+ operationId: get_entity__organization__entity__entity_id__get
+ parameters:
+ - in: path
+ name: entity_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Entity Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/EntityRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Entity
+ tags:
+ - entities
+ put:
+ description: Updates an existing entity.
+ operationId: update_entity__organization__entity__entity_id__put
+ parameters:
+ - in: path
+ name: entity_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Entity Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/EntityUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/EntityRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Entity
+ tags:
+ - entities
+ /{organization}/entity/{entity_id}/cases/{days_back}:
+ get:
+ operationId: count_cases_with_entity__organization__entity__entity_id__cases__days_back__get
+ parameters:
+ - in: path
+ name: days_back
+ required: true
+ schema:
+ title: Days Back
+ type: integer
+ - in: path
+ name: entity_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Entity Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Count Cases With Entity
+ tags:
+ - entities
+ /{organization}/entity/{entity_id}/signal_instances/{days_back}:
+ get:
+ operationId: get_signal_instances_by_entity__organization__entity__entity_id__signal_instances__days_back__get
+ parameters:
+ - in: path
+ name: days_back
+ required: true
+ schema:
+ title: Days Back
+ type: integer
+ - in: path
+ name: entity_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Entity Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Signal Instances By Entity
+ tags:
+ - entities
+ /{organization}/entity_type:
+ get:
+ description: Get all entity types, or only those matching a given search term.
+ operationId: get_entity_types__organization__entity_type_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/EntityTypePagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Entity Types
+ tags:
+ - entity_types
+ post:
+ description: Create a new entity type.
+ operationId: create_entity_type__organization__entity_type_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/EntityTypeCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/EntityTypeRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Entity Type
+ tags:
+ - entity_types
+ /{organization}/entity_type/{entity_type_id}:
+ delete:
+ description: Delete an entity type.
+ operationId: delete_entity_type__organization__entity_type__entity_type_id__delete
+ parameters:
+ - in: path
+ name: entity_type_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Entity Type Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Delete Entity Type
+ tags:
+ - entity_types
+ get:
+ description: Get an entity type by its id.
+ operationId: get_entity_type__organization__entity_type__entity_type_id__get
+ parameters:
+ - in: path
+ name: entity_type_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Entity Type Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/EntityTypeRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Entity Type
+ tags:
+ - entity_types
+ put:
+ description: Update an entity type.
+ operationId: update_entity_type__organization__entity_type__entity_type_id__put
+ parameters:
+ - in: path
+ name: entity_type_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Entity Type Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/EntityTypeUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/EntityTypeRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Entity Type
+ tags:
+ - entity_types
+ /{organization}/entity_type/{entity_type_id}/process:
+ put:
+ description: Process an entity type.
+ operationId: process_entity_type__organization__entity_type__entity_type_id__process_put
+ parameters:
+ - in: path
+ name: entity_type_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Entity Type Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/EntityTypeUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/EntityTypeRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Process Entity Type
+ tags:
+ - entity_types
+ /{organization}/events/slack/action:
+ post:
+ description: Handle all incoming Slack actions.
+ operationId: slack_actions__organization__events_slack_action_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Slack Actions
+ tags:
+ - events
+ /{organization}/events/slack/command:
+ post:
+ description: Handle all incoming Slack commands.
+ operationId: slack_commands__organization__events_slack_command_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Slack Commands
+ tags:
+ - events
+ /{organization}/events/slack/event:
+ post:
+ description: Handle all incoming Slack events.
+ operationId: slack_events__organization__events_slack_event_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Slack Events
+ tags:
+ - events
+ /{organization}/events/slack/menu:
+ post:
+ description: Handle all incoming Slack menu events.
+ operationId: slack_menus__organization__events_slack_menu_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Slack Menus
+ tags:
+ - events
+ /{organization}/feedback:
+ get:
+ description: Get all feedback entries, or only those matching a given search
+ term.
+ operationId: get_feedback_entries__organization__feedback_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/FeedbackPagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Feedback Entries
+ tags:
+ - feedback
+ post:
+ description: Create a new feedback entry.
+ operationId: create_feedback__organization__feedback_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/FeedbackCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/FeedbackRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Feedback
+ tags:
+ - feedback
+ /{organization}/feedback/{feedback_id}:
+ delete:
+ description: Delete a feedback entry, returning only an HTTP 200 OK if successful.
+ operationId: delete_feedback__organization__feedback__feedback_id__delete
+ parameters:
+ - in: path
+ name: feedback_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Feedback Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Delete Feedback
+ tags:
+ - feedback
+ get:
+ description: Get a feedback entry by its id.
+ operationId: get_feedback__organization__feedback__feedback_id__get
+ parameters:
+ - in: path
+ name: feedback_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Feedback Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/FeedbackRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Feedback
+ tags:
+ - feedback
+ put:
+ description: Updates a feedback entry by its id.
+ operationId: update_feedback__organization__feedback__feedback_id__put
+ parameters:
+ - in: path
+ name: feedback_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Feedback Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/FeedbackUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/FeedbackRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Feedback
+ tags:
+ - feedback
+ /{organization}/incident_cost_types:
+ get:
+ description: Get all incident cost types, or only those matching a given search
+ term.
+ operationId: get_incident_cost_types__organization__incident_cost_types_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentCostTypePagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Incident Cost Types
+ tags:
+ - incident_cost_types
+ post:
+ description: Create an incident cost type.
+ operationId: create_incident_cost_type__organization__incident_cost_types_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentCostTypeCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentCostTypeRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Incident Cost Type
+ tags:
+ - incident_cost_types
+ /{organization}/incident_cost_types/{incident_cost_type_id}:
+ delete:
+ description: Delete an incident cost type, returning only an HTTP 200 OK if
+ successful.
+ operationId: delete_incident_cost_type__organization__incident_cost_types__incident_cost_type_id__delete
+ parameters:
+ - in: path
+ name: incident_cost_type_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Incident Cost Type Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Delete Incident Cost Type
+ tags:
+ - incident_cost_types
+ get:
+ description: Get an incident cost type by its id.
+ operationId: get_incident_cost_type__organization__incident_cost_types__incident_cost_type_id__get
+ parameters:
+ - in: path
+ name: incident_cost_type_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Incident Cost Type Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentCostTypeRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Incident Cost Type
+ tags:
+ - incident_cost_types
+ put:
+ description: Update an incident cost type by its id.
+ operationId: update_incident_cost_type__organization__incident_cost_types__incident_cost_type_id__put
+ parameters:
+ - in: path
+ name: incident_cost_type_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Incident Cost Type Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentCostTypeUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentCostTypeRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Incident Cost Type
+ tags:
+ - incident_cost_types
+ /{organization}/incident_costs:
+ get:
+ description: Get all incident costs, or only those matching a given search term.
+ operationId: get_incident_costs__organization__incident_costs_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentCostPagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Incident Costs
+ tags:
+ - incident_costs
+ post:
+ description: Create an incident cost.
+ operationId: create_incident_cost__organization__incident_costs_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentCostCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentCostRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Incident Cost
+ tags:
+ - incident_costs
+ /{organization}/incident_costs/{incident_cost_id}:
+ delete:
+ description: Delete an incident cost, returning only an HTTP 200 OK if successful.
+ operationId: delete_incident_cost__organization__incident_costs__incident_cost_id__delete
+ parameters:
+ - in: path
+ name: incident_cost_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Incident Cost Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Delete Incident Cost
+ tags:
+ - incident_costs
+ get:
+ description: Get an incident cost by its id.
+ operationId: get_incident_cost__organization__incident_costs__incident_cost_id__get
+ parameters:
+ - in: path
+ name: incident_cost_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Incident Cost Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentCostRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Incident Cost
+ tags:
+ - incident_costs
+ put:
+ description: Update an incident cost by its id.
+ operationId: update_incident_cost__organization__incident_costs__incident_cost_id__put
+ parameters:
+ - in: path
+ name: incident_cost_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Incident Cost Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentCostUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentCostRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Incident Cost
+ tags:
+ - incident_costs
+ /{organization}/incident_priorities:
+ get:
+ description: Returns all incident priorities.
+ operationId: get_incident_priorities__organization__incident_priorities_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentPriorityPagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Incident Priorities
+ tags:
+ - incident_priorities
+ post:
+ description: Create a new incident priority.
+ operationId: create_incident_priority__organization__incident_priorities_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentPriorityCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentPriorityRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Incident Priority
+ tags:
+ - incident_priorities
+ /{organization}/incident_priorities/{incident_priority_id}:
+ get:
+ description: Get an incident priority.
+ operationId: get_incident_priority__organization__incident_priorities__incident_priority_id__get
+ parameters:
+ - in: path
+ name: incident_priority_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Incident Priority Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentPriorityRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Incident Priority
+ tags:
+ - incident_priorities
+ put:
+ description: Update an existing incident priority.
+ operationId: update_incident_priority__organization__incident_priorities__incident_priority_id__put
+ parameters:
+ - in: path
+ name: incident_priority_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Incident Priority Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentPriorityUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentPriorityRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Incident Priority
+ tags:
+ - incident_priorities
+ /{organization}/incident_roles/{role}:
+ get:
+ description: Get all incident role mappings.
+ operationId: get_incident_roles__organization__incident_roles__role__get
+ parameters:
+ - in: path
+ name: role
+ required: true
+ schema:
+ $ref: "#/components/schemas/ParticipantRoleType"
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: projectName
+ required: true
+ schema:
+ title: Projectname
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentRoles"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Incident Roles
+ tags:
+ - role
+ put:
+ description: Update an incident role mapping by its id.
+ operationId: update_incident_role__organization__incident_roles__role__put
+ parameters:
+ - in: path
+ name: role
+ required: true
+ schema:
+ $ref: "#/components/schemas/ParticipantRoleType"
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: projectName
+ required: true
+ schema:
+ title: Projectname
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentRolesCreateUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentRoles"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Incident Role
+ tags:
+ - role
+ /{organization}/incident_severities:
+ get:
+ description: Returns all incident severities.
+ operationId: get_incident_severities__organization__incident_severities_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentSeverityPagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Incident Severities
+ tags:
+ - incident_severities
+ post:
+ description: Creates a new incident severity.
+ operationId: create_incident_severity__organization__incident_severities_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentSeverityCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentSeverityRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Incident Severity
+ tags:
+ - incident_severities
+ /{organization}/incident_severities/{incident_severity_id}:
+ get:
+ description: Gets an incident severity.
+ operationId: get_incident_severity__organization__incident_severities__incident_severity_id__get
+ parameters:
+ - in: path
+ name: incident_severity_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Incident Severity Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentSeverityRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Incident Severity
+ tags:
+ - incident_severities
+ put:
+ description: Updates an existing incident severity.
+ operationId: update_incident_severity__organization__incident_severities__incident_severity_id__put
+ parameters:
+ - in: path
+ name: incident_severity_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Incident Severity Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentSeverityUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentSeverityRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Incident Severity
+ tags:
+ - incident_severities
+ /{organization}/incident_types:
+ get:
+ description: Returns all incident types.
+ operationId: get_incident_types__organization__incident_types_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentTypePagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Incident Types
+ tags:
+ - incident_types
+ post:
+ description: Create a new incident type.
+ operationId: create_incident_type__organization__incident_types_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentTypeCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentTypeRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Incident Type
+ tags:
+ - incident_types
+ /{organization}/incident_types/{incident_type_id}:
+ get:
+ description: Get an incident type.
+ operationId: get_incident_type__organization__incident_types__incident_type_id__get
+ parameters:
+ - in: path
+ name: incident_type_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Incident Type Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentTypeRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Incident Type
+ tags:
+ - incident_types
+ put:
+ description: Update an existing incident type.
+ operationId: update_incident_type__organization__incident_types__incident_type_id__put
+ parameters:
+ - in: path
+ name: incident_type_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Incident Type Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentTypeUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentTypeRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Incident Type
+ tags:
+ - incident_types
+ /{organization}/incidents:
+ get:
+ description: Retrieves a list of incidents.
+ operationId: get_incidents__organization__incidents_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: include[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Include[]
+ type: array
+ - in: query
+ name: expand
+ required: false
+ schema:
+ default: false
+ title: Expand
+ type: boolean
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Retrieve a list of incidents.
+ tags:
+ - incidents
+ post:
+ description: Creates a new incident.
+ operationId: create_incident__organization__incidents_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Creates a new incident.
+ tags:
+ - incidents
+ /{organization}/incidents/metric/forecast:
+ get:
+ description: Gets incident forecast data.
+ operationId: get_incident_forecast__organization__incidents_metric_forecast_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Gets incident forecast data.
+ tags:
+ - incidents
+ /{organization}/incidents/{incident_id}:
+ delete:
+ description: Deletes an incident and its external resources.
+ operationId: delete_incident__organization__incidents__incident_id__delete
+ parameters:
+ - in: path
+ name: incident_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Incident Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Deletes an incident and its external resources.
+ tags:
+ - incidents
+ get:
+ description: Retrieves the details of a single incident.
+ operationId: get_incident__organization__incidents__incident_id__get
+ parameters:
+ - in: path
+ name: incident_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Incident Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Retrieves a single incident.
+ tags:
+ - incidents
+ put:
+ description: Updates an existing incident.
+ operationId: update_incident__organization__incidents__incident_id__put
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: path
+ name: incident_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Incident Id
+ type: integer
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IncidentRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Updates an existing incident.
+ tags:
+ - incidents
+ /{organization}/incidents/{incident_id}/join:
+ post:
+ description: Adds an individual to an incident.
+ operationId: join_incident__organization__incidents__incident_id__join_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: path
+ name: incident_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Incident Id
+ type: integer
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Adds an individual to an incident.
+ tags:
+ - incidents
+ /{organization}/incidents/{incident_id}/report/executive:
+ post:
+ description: Creates an executive report.
+ operationId: create_executive_report__organization__incidents__incident_id__report_executive_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: path
+ name: incident_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Incident Id
+ type: integer
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ExecutiveReportCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Creates an executive report.
+ tags:
+ - incidents
+ /{organization}/incidents/{incident_id}/report/tactical:
+ post:
+ description: Creates a tactical report.
+ operationId: create_tactical_report__organization__incidents__incident_id__report_tactical_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: path
+ name: incident_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Incident Id
+ type: integer
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/TacticalReportCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Creates a tactical report.
+ tags:
+ - incidents
+ /{organization}/incidents/{incident_id}/subscribe:
+ post:
+ description: Subscribes an individual to an incident.
+ operationId: subscribe_to_incident__organization__incidents__incident_id__subscribe_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: path
+ name: incident_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Incident Id
+ type: integer
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Subscribes an individual to an incident.
+ tags:
+ - incidents
+ /{organization}/individuals:
+ get:
+ description: Retrieve individual contacts.
+ operationId: get_individuals__organization__individuals_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IndividualContactPagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Individuals
+ tags:
+ - individuals
+ post:
+ description: Creates a new individual contact.
+ operationId: create_individual__organization__individuals_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IndividualContactCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IndividualContactRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Individual
+ tags:
+ - individuals
+ /{organization}/individuals/{individual_contact_id}:
+ delete:
+ description: Deletes an individual contact.
+ operationId: delete_individual__organization__individuals__individual_contact_id__delete
+ parameters:
+ - in: path
+ name: individual_contact_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Individual Contact Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Deletes an individual contact.
+ tags:
+ - individuals
+ get:
+ description: Gets an individual contact.
+ operationId: get_individual__organization__individuals__individual_contact_id__get
+ parameters:
+ - in: path
+ name: individual_contact_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Individual Contact Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IndividualContactRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Individual
+ tags:
+ - individuals
+ put:
+ description: Updates an individual contact.
+ operationId: update_individual__organization__individuals__individual_contact_id__put
+ parameters:
+ - in: path
+ name: individual_contact_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Individual Contact Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IndividualContactUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/IndividualContactRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Updates an individual contact.
+ tags:
+ - individuals
+ /{organization}/notifications:
+ get:
+ description: Get all notifications, or only those matching a given search term.
+ operationId: get_notifications__organization__notifications_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/NotificationPagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Notifications
+ tags:
+ - notifications
+ post:
+ description: Create a notification.
+ operationId: create_notification__organization__notifications_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/NotificationCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/NotificationRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Notification
+ tags:
+ - notifications
+ /{organization}/notifications/{notification_id}:
+ delete:
+ description: Delete a notification, returning only an HTTP 200 OK if successful.
+ operationId: delete_notification__organization__notifications__notification_id__delete
+ parameters:
+ - in: path
+ name: notification_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Notification Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Delete Notification
+ tags:
+ - notifications
+ get:
+ description: Get a notification by its id.
+ operationId: get_notification__organization__notifications__notification_id__get
+ parameters:
+ - in: path
+ name: notification_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Notification Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/NotificationRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Notification
+ tags:
+ - notifications
+ put:
+ description: Update a notification by its id.
+ operationId: update_notification__organization__notifications__notification_id__put
+ parameters:
+ - in: path
+ name: notification_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Notification Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/NotificationUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/NotificationRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Notification
+ tags:
+ - notifications
+ /{organization}/plugins:
+ get:
+ description: Get all plugins.
+ operationId: get_plugins__organization__plugins_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/PluginPagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Plugins
+ tags:
+ - plugins
+ /{organization}/plugins/instances:
+ get:
+ description: Get all plugin instances.
+ operationId: get_plugin_instances__organization__plugins_instances_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/PluginInstancePagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Plugin Instances
+ tags:
+ - plugins
+ post:
+ description: Create a new plugin instance.
+ operationId: create_plugin_instance__organization__plugins_instances_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/PluginInstanceCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/PluginInstanceRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Plugin Instance
+ tags:
+ - plugins
+ /{organization}/plugins/instances/{plugin_instance_id}:
+ delete:
+ description: Deletes an existing plugin instance.
+ operationId: delete_plugin_instances__organization__plugins_instances__plugin_instance_id__delete
+ parameters:
+ - in: path
+ name: plugin_instance_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Plugin Instance Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Delete Plugin Instance
+ tags:
+ - plugins
+ get:
+ description: Get a plugin instance.
+ operationId: get_plugin_instance__organization__plugins_instances__plugin_instance_id__get
+ parameters:
+ - in: path
+ name: plugin_instance_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Plugin Instance Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/PluginInstanceRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Plugin Instance
+ tags:
+ - plugins
+ put:
+ description: Update a plugin instance.
+ operationId: update_plugin_instance__organization__plugins_instances__plugin_instance_id__put
+ parameters:
+ - in: path
+ name: plugin_instance_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Plugin Instance Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/PluginInstanceUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/PluginInstanceRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Plugin Instance
+ tags:
+ - plugins
+ /{organization}/projects:
+ get:
+ description: Get all projects.
+ operationId: get_projects__organization__projects_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ProjectPagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Projects
+ tags:
+ - projects
+ post:
+ description: Create a new project.
+ operationId: create_project__organization__projects_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ProjectCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create a new project.
+ tags:
+ - projects
+ /{organization}/projects/{project_id}:
+ delete:
+ description: Delete a project.
+ operationId: delete_project__organization__projects__project_id__delete
+ parameters:
+ - in: path
+ name: project_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Project Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Delete Project
+ tags:
+ - projects
+ get:
+ description: Get a project.
+ operationId: get_project__organization__projects__project_id__get
+ parameters:
+ - in: path
+ name: project_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Project Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get a project.
+ tags:
+ - projects
+ put:
+ description: Update a project.
+ operationId: update_project__organization__projects__project_id__put
+ parameters:
+ - in: path
+ name: project_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Project Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ProjectUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/dispatch__project__models__ProjectRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Project
+ tags:
+ - projects
+ /{organization}/search:
+ get:
+ description: Perform a search.
+ operationId: search__organization__search_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: type[]
+ required: true
+ schema:
+ items:
+ $ref: "#/components/schemas/SearchTypes"
+ type: array
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Search
+ tags:
+ - search
+ /{organization}/search/filters:
+ get:
+ description: Retrieve filters.
+ operationId: get_filters__organization__search_filters_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SearchFilterPagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Filters
+ tags:
+ - search_filters
+ post:
+ description: Create a new filter.
+ operationId: create_search_filter__organization__search_filters_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SearchFilterCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SearchFilterRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Search Filter
+ tags:
+ - search_filters
+ /{organization}/search/filters/{search_filter_id}:
+ delete:
+ description: Delete a search filter.
+ operationId: delete_filter__organization__search_filters__search_filter_id__delete
+ parameters:
+ - in: path
+ name: search_filter_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Search Filter Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Delete Filter
+ tags:
+ - search_filters
+ put:
+ description: Update a search filter.
+ operationId: update_search_filter__organization__search_filters__search_filter_id__put
+ parameters:
+ - in: path
+ name: search_filter_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Search Filter Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SearchFilterUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SearchFilterRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Search Filter
+ tags:
+ - search_filters
+ /{organization}/services:
+ get:
+ description: Retrieves all services.
+ operationId: get_services__organization__services_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ServicePagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Services
+ tags:
+ - services
+ post:
+ description: Creates a new service.
+ operationId: create_service__organization__services_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ example:
+ external_id: "234234"
+ is_active: true
+ name: myService
+ type: pagerduty
+ schema:
+ $ref: "#/components/schemas/ServiceCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ServiceRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Service
+ tags:
+ - services
+ /{organization}/services/{service_id}:
+ delete:
+ description: Deletes a service.
+ operationId: delete_service__organization__services__service_id__delete
+ parameters:
+ - in: path
+ name: service_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Service Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Delete Service
+ tags:
+ - services
+ get:
+ description: Gets a service.
+ operationId: get_service__organization__services__service_id__get
+ parameters:
+ - in: path
+ name: service_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Service Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ServiceRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Service
+ tags:
+ - services
+ put:
+ description: Updates an existing service.
+ operationId: update_service__organization__services__service_id__put
+ parameters:
+ - in: path
+ name: service_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Service Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ServiceUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ServiceRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Service
+ tags:
+ - services
+ /{organization}/signals:
+ get:
+ description: Get all signal definitions.
+ operationId: get_signals__organization__signals_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SignalPagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Signals
+ tags:
+ - signals
+ post:
+ description: Create a new signal.
+ operationId: create_signal__organization__signals_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SignalCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/dispatch__signal__models__SignalRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Signal
+ tags:
+ - signals
+ /{organization}/signals/filters:
+ get:
+ description: Get all signal filters.
+ operationId: get_signal_filters__organization__signals_filters_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SignalFilterPagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Signal Filters
+ tags:
+ - signals
+ post:
+ description: Create a new signal filter.
+ operationId: create_filter__organization__signals_filters_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SignalFilterCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SignalFilterRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Filter
+ tags:
+ - signals
+ /{organization}/signals/filters/{signal_filter_id}:
+ delete:
+ description: Deletes a signal filter.
+ operationId: delete_filter__organization__signals_filters__signal_filter_id__delete
+ parameters:
+ - in: path
+ name: signal_filter_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Signal Filter Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Delete Filter
+ tags:
+ - signals
+ put:
+ description: Updates an existing signal filter.
+ operationId: update_filter__organization__signals_filters__signal_filter_id__put
+ parameters:
+ - in: path
+ name: signal_filter_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Signal Filter Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SignalFilterUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/dispatch__signal__models__SignalRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Filter
+ tags:
+ - signals
+ /{organization}/signals/instances:
+ get:
+ description: Get all signal instances.
+ operationId: get_signal_instances__organization__signals_instances_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SignalInstancePagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Signal Instances
+ tags:
+ - signals
+ post:
+ description: Create a new signal instance.
+ operationId: create_signal_instance__organization__signals_instances_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SignalInstanceCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/dispatch__signal__models__SignalInstanceRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Signal Instance
+ tags:
+ - signals
+ /{organization}/signals/{signal_id}:
+ delete:
+ description: Deletes a signal.
+ operationId: delete_signal__organization__signals__signal_id__delete
+ parameters:
+ - in: path
+ name: signal_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Signal Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Delete Signal
+ tags:
+ - signals
+ get:
+ description: Get a signal by its ID.
+ operationId: get_signal__organization__signals__signal_id__get
+ parameters:
+ - in: path
+ name: signal_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Signal Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/dispatch__signal__models__SignalRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Signal
+ tags:
+ - signals
+ put:
+ description: Updates an existing signal.
+ operationId: update_signal__organization__signals__signal_id__put
+ parameters:
+ - in: path
+ name: signal_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Signal Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/SignalUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/dispatch__signal__models__SignalRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Signal
+ tags:
+ - signals
+ /{organization}/tag_types:
+ get:
+ description: Get all tag types, or only those matching a given search term.
+ operationId: get_tag_types__organization__tag_types_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/TagTypePagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Tag Types
+ tags:
+ - tag_types
+ post:
+ description: Create a new tag type.
+ operationId: create_tag_type__organization__tag_types_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/TagTypeCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/TagTypeRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Tag Type
+ tags:
+ - tag_types
+ /{organization}/tag_types/{tag_type_id}:
+ delete:
+ description: Delete a tag type.
+ operationId: delete_tag_type__organization__tag_types__tag_type_id__delete
+ parameters:
+ - in: path
+ name: tag_type_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Tag Type Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Delete Tag Type
+ tags:
+ - tag_types
+ get:
+ description: Get a tag type by its ID.
+ operationId: get_tag_type__organization__tag_types__tag_type_id__get
+ parameters:
+ - in: path
+ name: tag_type_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Tag Type Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/TagTypeRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Tag Type
+ tags:
+ - tag_types
+ put:
+ description: Update a tag type.
+ operationId: update_tag_type__organization__tag_types__tag_type_id__put
+ parameters:
+ - in: path
+ name: tag_type_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Tag Type Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/TagTypeUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/TagTypeRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Tag Type
+ tags:
+ - tag_types
+ /{organization}/tags:
+ get:
+ description: Get all tags, or only those matching a given search term.
+ operationId: get_tags__organization__tags_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/TagPagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Tags
+ tags:
+ - tags
+ post:
+ description: Creates a new tag.
+ operationId: create_tag__organization__tags_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/TagCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/TagRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Tag
+ tags:
+ - tags
+ /{organization}/tags/recommendations/{model_name}/{id}:
+ get:
+ description: Retrieves a tag recommendation based on the model and model id.
+ operationId: get_tag_recommendations__organization__tags_recommendations__model_name___id__get
+ parameters:
+ - in: path
+ name: model_name
+ required: true
+ schema:
+ title: Model Name
+ type: string
+ - in: path
+ name: id
+ required: true
+ schema:
+ title: Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/TagPagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Tag Recommendations
+ tags:
+ - tags
+ /{organization}/tags/{tag_id}:
+ delete:
+ description: Deletes a tag, returning only an HTTP 200 OK if successful.
+ operationId: delete_tag__organization__tags__tag_id__delete
+ parameters:
+ - in: path
+ name: tag_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Tag Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Delete Tag
+ tags:
+ - tags
+ get:
+ description: Given its unique ID, retrieve details about a single tag.
+ operationId: get_tag__organization__tags__tag_id__get
+ parameters:
+ - in: path
+ name: tag_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Tag Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/TagRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Tag
+ tags:
+ - tags
+ put:
+ description: Updates an existing tag.
+ operationId: update_tag__organization__tags__tag_id__put
+ parameters:
+ - in: path
+ name: tag_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Tag Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/TagUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/TagRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Tag
+ tags:
+ - tags
+ /{organization}/tasks:
+ get:
+ description: Retrieve all tasks.
+ operationId: get_tasks__organization__tasks_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: include[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Include[]
+ type: array
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Tasks
+ tags:
+ - tasks
+ post:
+ description: Creates a new task.
+ operationId: create_task__organization__tasks_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/TaskCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/dispatch__task__models__TaskRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Task
+ tags:
+ - tasks
+ /{organization}/tasks/{task_id}:
+ delete:
+ description: Deletes an existing task.
+ operationId: delete_task__organization__tasks__task_id__delete
+ parameters:
+ - in: path
+ name: task_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Task Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Delete Task
+ tags:
+ - tasks
+ put:
+ description: Updates an existing task.
+ operationId: update_task__organization__tasks__task_id__put
+ parameters:
+ - in: path
+ name: task_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Task Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/TaskUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/dispatch__task__models__TaskRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Task
+ tags:
+ - tasks
+ /{organization}/teams:
+ get:
+ description: Get all team contacts.
+ operationId: get_teams__organization__teams_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/TeamPagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Teams
+ tags:
+ - teams
+ post:
+ description: Create a new team contact.
+ operationId: create_team__organization__teams_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/TeamContactCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/TeamContactRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Team
+ tags:
+ - teams
+ /{organization}/teams/{team_contact_id}:
+ delete:
+ description: Delete a team contact.
+ operationId: delete_team__organization__teams__team_contact_id__delete
+ parameters:
+ - in: path
+ name: team_contact_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Team Contact Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Delete Team
+ tags:
+ - teams
+ get:
+ description: Get a team contact.
+ operationId: get_team__organization__teams__team_contact_id__get
+ parameters:
+ - in: path
+ name: team_contact_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Team Contact Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/TeamContactRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Team
+ tags:
+ - teams
+ put:
+ description: Update a team contact.
+ operationId: update_team__organization__teams__team_contact_id__put
+ parameters:
+ - in: path
+ name: team_contact_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Team Contact Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/TeamContactUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/TeamContactRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Team
+ tags:
+ - teams
+ /{organization}/terms:
+ get:
+ description: Retrieve all terms.
+ operationId: get_terms__organization__terms_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/TermPagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Terms
+ tags:
+ - terms
+ post:
+ description: Create a new term.
+ operationId: create_term__organization__terms_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/TermCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/TermRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Term
+ tags:
+ - terms
+ /{organization}/terms/{term_id}:
+ delete:
+ description: Delete a term.
+ operationId: delete_term__organization__terms__term_id__delete
+ parameters:
+ - in: path
+ name: term_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Term Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Delete Term
+ tags:
+ - terms
+ get:
+ description: Get a term.
+ operationId: get_term__organization__terms__term_id__get
+ parameters:
+ - in: path
+ name: term_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Term Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/TermRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Term
+ tags:
+ - terms
+ put:
+ description: Update a term.
+ operationId: update_term__organization__terms__term_id__put
+ parameters:
+ - in: path
+ name: term_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Term Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/TermUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/TermRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Term
+ tags:
+ - terms
+ /{organization}/users:
+ get:
+ description: Get all users.
+ operationId: get_users__organization__users_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/UserPagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Users
+ tags:
+ - users
+ /{organization}/users/{user_id}:
+ get:
+ description: Get a user.
+ operationId: get_user__organization__users__user_id__get
+ parameters:
+ - in: path
+ name: user_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: User Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/UserRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get User
+ tags:
+ - users
+ put:
+ description: Update a user.
+ operationId: update_user__organization__users__user_id__put
+ parameters:
+ - in: path
+ name: user_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: User Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/UserUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/UserRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update User
+ tags:
+ - users
+ /{organization}/workflows:
+ get:
+ description: Get all workflows.
+ operationId: get_workflows__organization__workflows_get
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ - in: query
+ name: page
+ required: false
+ schema:
+ default: 1
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Page
+ type: integer
+ - in: query
+ name: itemsPerPage
+ required: false
+ schema:
+ default: 5
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: -2.0
+ title: Itemsperpage
+ type: integer
+ - in: query
+ name: q
+ required: false
+ schema:
+ minLength: 1
+ pattern: ^[ -~]+$
+ title: Q
+ type: string
+ - in: query
+ name: filter
+ required: false
+ schema:
+ default: []
+ format: json-string
+ title: Filter
+ type: string
+ - in: query
+ name: sortBy[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: string
+ title: Sortby[]
+ type: array
+ - in: query
+ name: descending[]
+ required: false
+ schema:
+ default: []
+ items:
+ type: boolean
+ title: Descending[]
+ type: array
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/WorkflowPagination"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Workflows
+ tags:
+ - workflows
+ post:
+ description: Create a new workflow.
+ operationId: create_workflow__organization__workflows_post
+ parameters:
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/WorkflowCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/WorkflowRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Create Workflow
+ tags:
+ - workflows
+ /{organization}/workflows/instances/{workflow_instance_id}:
+ get:
+ description: Get a workflow instance.
+ operationId: get_workflow_instance__organization__workflows_instances__workflow_instance_id__get
+ parameters:
+ - in: path
+ name: workflow_instance_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Workflow Instance Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/WorkflowInstanceRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Workflow Instance
+ tags:
+ - workflows
+ /{organization}/workflows/{workflow_id}:
+ delete:
+ description: Delete a workflow.
+ operationId: delete_workflow__organization__workflows__workflow_id__delete
+ parameters:
+ - in: path
+ name: workflow_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Workflow Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema: {}
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Delete Workflow
+ tags:
+ - workflows
+ get:
+ description: Get a workflow.
+ operationId: get_workflow__organization__workflows__workflow_id__get
+ parameters:
+ - in: path
+ name: workflow_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Workflow Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/WorkflowRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Get Workflow
+ tags:
+ - workflows
+ put:
+ description: Update a workflow.
+ operationId: update_workflow__organization__workflows__workflow_id__put
+ parameters:
+ - in: path
+ name: workflow_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Workflow Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/WorkflowUpdate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/WorkflowRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Update Workflow
+ tags:
+ - workflows
+ /{organization}/workflows/{workflow_id}/run:
+ post:
+ description: Runs a workflow with a given set of parameters.
+ operationId: run_workflow__organization__workflows__workflow_id__run_post
+ parameters:
+ - in: path
+ name: workflow_id
+ required: true
+ schema:
+ exclusiveMaximum: 2147483647.0
+ exclusiveMinimum: 0.0
+ title: Workflow Id
+ type: integer
+ - in: path
+ name: organization
+ required: true
+ schema:
+ minLength: 3
+ pattern: ^[\w]+(?:_[\w]+)*$
+ title: Organization
+ type: string
+ requestBody:
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/WorkflowInstanceCreate"
+ required: true
+ responses:
+ "200":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/WorkflowInstanceRead"
+ description: Successful Response
+ "400":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Bad Request
+ "401":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Unauthorized
+ "403":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Forbidden
+ "404":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Not Found
+ "422":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/HTTPValidationError"
+ description: Validation Error
+ "500":
+ content:
+ application/json:
+ schema:
+ $ref: "#/components/schemas/ErrorResponse"
+ description: Internal Server Error
+ summary: Run Workflow
+ tags:
+ - workflows
diff --git a/docs/security.md b/docs/security.md
deleted file mode 100644
index 23df95d3df8c..000000000000
--- a/docs/security.md
+++ /dev/null
@@ -1,43 +0,0 @@
----
-description: >-
- We take the security of Dispatch seriously. The following are a set of
- policies we have adopted to ensure that security issues are addressed in a
- timely fashion.
----
-
-# Security
-
-## Reporting a security issue
-
-We ask that you do not report security issue to our normal GitHub issue tracker.
-
-If you believe you've identified a security issue with `Dispatch`, please report it via our public Netflix bug bounty program at: [https://bugcrowd.com/netflix](https://bugcrowd.com/netflix)
-
-Once you've submitted the issue it will be handled by our triage team, typically within 48 hours.
-
-## Support Versions
-
-At any given time, we will provide security support for the `master` branch as well as the two most recent releases.
-
-## Disclosure Process
-
-Our process for taking a security issue from private discussion to public disclosure involves multiple steps.
-
-Approximately one week before full public disclosure, we will send advance notification of the issue to a list of people and organizations, primarily composed of known users of `Dispatch`. This notification will consist of an email message containing:
-
-* A full description of the issue and the affected versions of `dispatch`.
-* The steps we will be taking to remedy the issue.
-* The patches, if any, that will be applied to `dispatch`.
-* The date on which the `dispatch` team will apply these patches, issue new releases, and publicly disclose the issue.
-
-Simultaneously, the reporter of the issue will receive notification of the date on which we plan to make the issue public.
-
-On the day of disclosure, we will take the following steps:
-
-* Apply the relevant patches to the `dispatch` repository. The commit messages for these patches will indicate that they are for security issues, but will not describe the issue in any detail; instead, they will warn of upcoming disclosure.
-* Issue the relevant releases.
-
-If a reported issue is believed to be particularly time-sensitive (due to a known exploit in the wild, for example), the time between advance notification and public disclosure may be shortened considerably.
-
-The list of people and organizations who receives advanced notification of security issues is not, and will not, be made public. This list generally consists of high-profile downstream users and is entirely at the discretion of the `dispatch` team.
-
diff --git a/docs/sidebars.js b/docs/sidebars.js
new file mode 100644
index 000000000000..d789e39f85b4
--- /dev/null
+++ b/docs/sidebars.js
@@ -0,0 +1,21 @@
+/**
+ * Creating a sidebar enables you to:
+ - create an ordered group of docs
+ - render a sidebar for each doc of that group
+ - provide next/previous navigation
+
+ The sidebars can be generated from the filesystem, or explicitly defined here.
+
+ Create as many sidebars as you want.
+ */
+
+// @ts-check
+
+/** @type {import('@docusaurus/plugin-content-docs').SidebarsConfig} */
+const sidebars = {
+ // By default, Docusaurus generates a sidebar from the docs folder structure
+ adminSidebar: [{ type: "autogenerated", dirName: "administration" }],
+ userGuideSidebar: [{ type: "autogenerated", dirName: "user-guide" }],
+}
+
+module.exports = sidebars
diff --git a/docs/src/components/HomepageFeatures/index.js b/docs/src/components/HomepageFeatures/index.js
new file mode 100644
index 000000000000..2c9d1de81593
--- /dev/null
+++ b/docs/src/components/HomepageFeatures/index.js
@@ -0,0 +1,59 @@
+import React from "react"
+import clsx from "clsx"
+import styles from "./styles.module.css"
+
+const FeatureList = [
+ {
+ title: "Easy to Use",
+ Svg: require("@site/static/img/undraw_docusaurus_mountain.svg").default,
+    description: (
+      <>
+        Dispatch was designed to stay out of the spotlight, instead opting to supercharge existing
+        tools (Slack, Google Docs, etc.) for use in incident response.
+      </>
+    ),
+ },
+ {
+ title: "Focus on What Matters",
+ Svg: require("@site/static/img/undraw_docusaurus_tree.svg").default,
+    description: (
+      <>
+        Dispatch lets you focus on your incident. Dispatch manages the timeline, documentation,
+        and people, leaving you free to focus on resolving the incident.
+      </>
+    ),
+ },
+ {
+ title: "API First",
+ Svg: require("@site/static/img/undraw_docusaurus_react.svg").default,
+    description: <>Extend or customize Dispatch via its API or integrated plugins.</>,
+ },
+]
+
+function Feature({ Svg, title, description }) {
+  return (
+    <div className={clsx("col col--4")}>
+      <div className="text--center">
+        <Svg className={styles.featureSvg} role="img" />
+      </div>
+      <div className="text--center padding-horiz--md">
+        <h3>{title}</h3>
+        <p>{description}</p>
+      </div>
+    </div>
+  )
+}
+
+export default function HomepageFeatures() {
+  return (
+    <section className={styles.features}>
+      <div className="container">
+        <div className="row">
+          {FeatureList.map((props, idx) => (
+            <Feature key={idx} {...props} />
+          ))}
+        </div>
+      </div>
+    </section>
+  )
+}
diff --git a/docs/src/components/HomepageFeatures/styles.module.css b/docs/src/components/HomepageFeatures/styles.module.css
new file mode 100644
index 000000000000..b248eb2e5dee
--- /dev/null
+++ b/docs/src/components/HomepageFeatures/styles.module.css
@@ -0,0 +1,11 @@
+.features {
+ display: flex;
+ align-items: center;
+ padding: 2rem 0;
+ width: 100%;
+}
+
+.featureSvg {
+ height: 200px;
+ width: 200px;
+}
diff --git a/docs/src/css/custom.css b/docs/src/css/custom.css
new file mode 100644
index 000000000000..8be6b8d2d80d
--- /dev/null
+++ b/docs/src/css/custom.css
@@ -0,0 +1,52 @@
+/**
+ * Any CSS included here will be global. The classic template
+ * bundles Infima by default. Infima is a CSS framework designed to
+ * work well for content-centric websites.
+ */
+
+/* You can override the default Infima variables here. */
+:root {
+ --ifm-color-primary: #e50914;
+ --ifm-color-primary-dark: #B20710;
+ --ifm-color-primary-darker: #82030a;
+ --ifm-color-primary-darkest: #580307;
+ --ifm-color-primary-light: #29d5b0;
+ --ifm-color-primary-lighter: #359962;
+ --ifm-color-primary-lightest: #3cad6e;
+ --ifm-code-font-size: 95%;
+ --docusaurus-highlighted-code-line-bg: rgba(0, 0, 0, 0.1);
+ --aa-primary-color-rgb: 229, 9, 20 !important;
+ --aa-muted-color-rgb: 255, 0, 0 !important;
+}
+
+/* For readability concerns, you should choose a lighter palette in dark mode. */
+[data-theme='dark'] {
+ --ifm-color-primary: #E50914;
+ --ifm-color-primary-dark: #B20710;
+ --ifm-color-primary-darker: #82030a;
+ --ifm-color-primary-darkest: #580307;
+ --ifm-color-primary-light: #29d5b0;
+ --ifm-color-primary-lighter: #32d8b4;
+ --ifm-color-primary-lightest: #4fddbf;
+ --docusaurus-highlighted-code-line-bg: rgba(0, 0, 0, 0.3);
+ --aa-primary-color-rgb: 229, 9, 20 !important;
+ --aa-muted-color-rgb: 255, 0, 0 !important;
+}
+
+.header-github-link:hover {
+ opacity: 0.6;
+}
+
+.header-github-link::before {
+ content: '';
+ width: 24px;
+ height: 24px;
+ display: flex;
+ background: url("data:image/svg+xml,%3Csvg viewBox='0 0 24 24' xmlns='http://www.w3.org/2000/svg'%3E%3Cpath d='M12 .297c-6.63 0-12 5.373-12 12 0 5.303 3.438 9.8 8.205 11.385.6.113.82-.258.82-.577 0-.285-.01-1.04-.015-2.04-3.338.724-4.042-1.61-4.042-1.61C4.422 18.07 3.633 17.7 3.633 17.7c-1.087-.744.084-.729.084-.729 1.205.084 1.838 1.236 1.838 1.236 1.07 1.835 2.809 1.305 3.495.998.108-.776.417-1.305.76-1.605-2.665-.3-5.466-1.332-5.466-5.93 0-1.31.465-2.38 1.235-3.22-.135-.303-.54-1.523.105-3.176 0 0 1.005-.322 3.3 1.23.96-.267 1.98-.399 3-.405 1.02.006 2.04.138 3 .405 2.28-1.552 3.285-1.23 3.285-1.23.645 1.653.24 2.873.12 3.176.765.84 1.23 1.91 1.23 3.22 0 4.61-2.805 5.625-5.475 5.92.42.36.81 1.096.81 2.22 0 1.606-.015 2.896-.015 3.286 0 .315.21.69.825.57C20.565 22.092 24 17.592 24 12.297c0-6.627-5.373-12-12-12'/%3E%3C/svg%3E")
+ no-repeat;
+}
+
+[data-theme='dark'] .header-github-link::before {
+ background: url("data:image/svg+xml,%3Csvg viewBox='0 0 24 24' xmlns='http://www.w3.org/2000/svg'%3E%3Cpath fill='white' d='M12 .297c-6.63 0-12 5.373-12 12 0 5.303 3.438 9.8 8.205 11.385.6.113.82-.258.82-.577 0-.285-.01-1.04-.015-2.04-3.338.724-4.042-1.61-4.042-1.61C4.422 18.07 3.633 17.7 3.633 17.7c-1.087-.744.084-.729.084-.729 1.205.084 1.838 1.236 1.838 1.236 1.07 1.835 2.809 1.305 3.495.998.108-.776.417-1.305.76-1.605-2.665-.3-5.466-1.332-5.466-5.93 0-1.31.465-2.38 1.235-3.22-.135-.303-.54-1.523.105-3.176 0 0 1.005-.322 3.3 1.23.96-.267 1.98-.399 3-.405 1.02.006 2.04.138 3 .405 2.28-1.552 3.285-1.23 3.285-1.23.645 1.653.24 2.873.12 3.176.765.84 1.23 1.91 1.23 3.22 0 4.61-2.805 5.625-5.475 5.92.42.36.81 1.096.81 2.22 0 1.606-.015 2.896-.015 3.286 0 .315.21.69.825.57C20.565 22.092 24 17.592 24 12.297c0-6.627-5.373-12-12-12'/%3E%3C/svg%3E")
+ no-repeat;
+}
diff --git a/docs/src/pages/index.js b/docs/src/pages/index.js
new file mode 100644
index 000000000000..78ce4c9c2ba1
--- /dev/null
+++ b/docs/src/pages/index.js
@@ -0,0 +1,35 @@
+import React from "react"
+import clsx from "clsx"
+import Link from "@docusaurus/Link"
+import useDocusaurusContext from "@docusaurus/useDocusaurusContext"
+import Layout from "@theme/Layout"
+import HomepageFeatures from "@site/src/components/HomepageFeatures"
+
+import styles from "./index.module.css"
+
+function HomepageHeader() {
+ const { siteConfig } = useDocusaurusContext()
+ return (
+    <header className={clsx("hero hero--primary", styles.heroBanner)}>
+      <div className="container">
+        <h1 className="hero__title">{siteConfig.title}</h1>
+        <p className="hero__subtitle">{siteConfig.tagline}</p>
+      </div>
+    </header>
+ )
+}
+
+export default function Home() {
+ const { siteConfig } = useDocusaurusContext()
+ return (
+    <Layout title={siteConfig.title} description={siteConfig.tagline}>
+      <HomepageHeader />
+      <main>
+        <HomepageFeatures />
+      </main>
+    </Layout>
+ )
+}
diff --git a/docs/src/pages/index.module.css b/docs/src/pages/index.module.css
new file mode 100644
index 000000000000..9f71a5da775b
--- /dev/null
+++ b/docs/src/pages/index.module.css
@@ -0,0 +1,23 @@
+/**
+ * CSS files with the .module.css suffix will be treated as CSS modules
+ * and scoped locally.
+ */
+
+.heroBanner {
+ padding: 4rem 0;
+ text-align: center;
+ position: relative;
+ overflow: hidden;
+}
+
+@media screen and (max-width: 996px) {
+ .heroBanner {
+ padding: 2rem;
+ }
+}
+
+.buttons {
+ display: flex;
+ align-items: center;
+ justify-content: center;
+}
diff --git a/docs/src/pages/markdown-page.md b/docs/src/pages/markdown-page.md
new file mode 100644
index 000000000000..9756c5b6685a
--- /dev/null
+++ b/docs/src/pages/markdown-page.md
@@ -0,0 +1,7 @@
+---
+title: Markdown page example
+---
+
+# Markdown page example
+
+You don't need React to write simple standalone pages.
diff --git a/src/dispatch/alembic/__init__.py b/docs/static/.nojekyll
similarity index 100%
rename from src/dispatch/alembic/__init__.py
rename to docs/static/.nojekyll
diff --git a/docs/static/img/admin-ui-admin.png b/docs/static/img/admin-ui-admin.png
new file mode 100644
index 000000000000..f7f7278a75c6
Binary files /dev/null and b/docs/static/img/admin-ui-admin.png differ
diff --git a/docs/static/img/admin-ui-associate-case-template.png b/docs/static/img/admin-ui-associate-case-template.png
new file mode 100644
index 000000000000..7cf8912f3718
Binary files /dev/null and b/docs/static/img/admin-ui-associate-case-template.png differ
diff --git a/docs/static/img/admin-ui-associate-incident-template.png b/docs/static/img/admin-ui-associate-incident-template.png
new file mode 100644
index 000000000000..df6d44266186
Binary files /dev/null and b/docs/static/img/admin-ui-associate-incident-template.png differ
diff --git a/docs/static/img/admin-ui-case-priorities.png b/docs/static/img/admin-ui-case-priorities.png
new file mode 100644
index 000000000000..7dcbf7883392
Binary files /dev/null and b/docs/static/img/admin-ui-case-priorities.png differ
diff --git a/docs/static/img/admin-ui-case-severities.png b/docs/static/img/admin-ui-case-severities.png
new file mode 100644
index 000000000000..3f9a918f7e30
Binary files /dev/null and b/docs/static/img/admin-ui-case-severities.png differ
diff --git a/docs/static/img/admin-ui-case-types.png b/docs/static/img/admin-ui-case-types.png
new file mode 100644
index 000000000000..fc4842a4309f
Binary files /dev/null and b/docs/static/img/admin-ui-case-types.png differ
diff --git a/docs/static/img/admin-ui-contacts-individuals.png b/docs/static/img/admin-ui-contacts-individuals.png
new file mode 100644
index 000000000000..1cbdcb10b69c
Binary files /dev/null and b/docs/static/img/admin-ui-contacts-individuals.png differ
diff --git a/docs/static/img/admin-ui-contacts-services.png b/docs/static/img/admin-ui-contacts-services.png
new file mode 100644
index 000000000000..4859d5003eda
Binary files /dev/null and b/docs/static/img/admin-ui-contacts-services.png differ
diff --git a/docs/static/img/admin-ui-contacts-teams.png b/docs/static/img/admin-ui-contacts-teams.png
new file mode 100644
index 000000000000..a7a81c8507af
Binary files /dev/null and b/docs/static/img/admin-ui-contacts-teams.png differ
diff --git a/docs/static/img/admin-ui-cost-model.png b/docs/static/img/admin-ui-cost-model.png
new file mode 100644
index 000000000000..c0148fc5bf35
Binary files /dev/null and b/docs/static/img/admin-ui-cost-model.png differ
diff --git a/docs/static/img/admin-ui-create-edit-runbook.png b/docs/static/img/admin-ui-create-edit-runbook.png
new file mode 100644
index 000000000000..ad76484f1ae6
Binary files /dev/null and b/docs/static/img/admin-ui-create-edit-runbook.png differ
diff --git a/docs/static/img/admin-ui-create-edit-template.png b/docs/static/img/admin-ui-create-edit-template.png
new file mode 100644
index 000000000000..5fe3df8ca8df
Binary files /dev/null and b/docs/static/img/admin-ui-create-edit-template.png differ
diff --git a/docs/static/img/admin-ui-dashboard-forecast.png b/docs/static/img/admin-ui-dashboard-forecast.png
new file mode 100644
index 000000000000..a3ac1d916391
Binary files /dev/null and b/docs/static/img/admin-ui-dashboard-forecast.png differ
diff --git a/docs/static/img/admin-ui-dashboard-priority.png b/docs/static/img/admin-ui-dashboard-priority.png
new file mode 100644
index 000000000000..8a53d0423a54
Binary files /dev/null and b/docs/static/img/admin-ui-dashboard-priority.png differ
diff --git a/docs/static/img/admin-ui-dashboard-top-line.png b/docs/static/img/admin-ui-dashboard-top-line.png
new file mode 100644
index 000000000000..762d009c2b99
Binary files /dev/null and b/docs/static/img/admin-ui-dashboard-top-line.png differ
diff --git a/docs/static/img/admin-ui-dashboard-type.png b/docs/static/img/admin-ui-dashboard-type.png
new file mode 100644
index 000000000000..ec428a5e5c9c
Binary files /dev/null and b/docs/static/img/admin-ui-dashboard-type.png differ
diff --git a/docs/static/img/admin-ui-edit-cost-model.png b/docs/static/img/admin-ui-edit-cost-model.png
new file mode 100644
index 000000000000..aefb98d84acd
Binary files /dev/null and b/docs/static/img/admin-ui-edit-cost-model.png differ
diff --git a/docs/static/img/admin-ui-incident-cost-types.png b/docs/static/img/admin-ui-incident-cost-types.png
new file mode 100644
index 000000000000..66606fdcaaeb
Binary files /dev/null and b/docs/static/img/admin-ui-incident-cost-types.png differ
diff --git a/docs/static/img/admin-ui-incident-feedback.png b/docs/static/img/admin-ui-incident-feedback.png
new file mode 100644
index 000000000000..48b7d361d179
Binary files /dev/null and b/docs/static/img/admin-ui-incident-feedback.png differ
diff --git a/docs/static/img/admin-ui-incident-plugins.png b/docs/static/img/admin-ui-incident-plugins.png
new file mode 100644
index 000000000000..7ecc5a1b3dde
Binary files /dev/null and b/docs/static/img/admin-ui-incident-plugins.png differ
diff --git a/docs/static/img/admin-ui-incident-priorities.png b/docs/static/img/admin-ui-incident-priorities.png
new file mode 100644
index 000000000000..9484634e1bea
Binary files /dev/null and b/docs/static/img/admin-ui-incident-priorities.png differ
diff --git a/docs/static/img/admin-ui-incident-report-receipt.png b/docs/static/img/admin-ui-incident-report-receipt.png
new file mode 100644
index 000000000000..49c405b2357f
Binary files /dev/null and b/docs/static/img/admin-ui-incident-report-receipt.png differ
diff --git a/docs/static/img/admin-ui-incident-report-resources.png b/docs/static/img/admin-ui-incident-report-resources.png
new file mode 100644
index 000000000000..f36458bb7954
Binary files /dev/null and b/docs/static/img/admin-ui-incident-report-resources.png differ
diff --git a/docs/static/img/admin-ui-incident-report.png b/docs/static/img/admin-ui-incident-report.png
new file mode 100644
index 000000000000..011d78b29bd1
Binary files /dev/null and b/docs/static/img/admin-ui-incident-report.png differ
diff --git a/docs/static/img/admin-ui-incident-types.png b/docs/static/img/admin-ui-incident-types.png
new file mode 100644
index 000000000000..a40778965a9d
Binary files /dev/null and b/docs/static/img/admin-ui-incident-types.png differ
diff --git a/docs/static/img/admin-ui-incident-workflows.png b/docs/static/img/admin-ui-incident-workflows.png
new file mode 100644
index 000000000000..04d29fcbf629
Binary files /dev/null and b/docs/static/img/admin-ui-incident-workflows.png differ
diff --git a/docs/static/img/admin-ui-incidents.png b/docs/static/img/admin-ui-incidents.png
new file mode 100644
index 000000000000..b84f8b4679ad
Binary files /dev/null and b/docs/static/img/admin-ui-incidents.png differ
diff --git a/docs/static/img/admin-ui-knowledge-documents.png b/docs/static/img/admin-ui-knowledge-documents.png
new file mode 100644
index 000000000000..206a2f514450
Binary files /dev/null and b/docs/static/img/admin-ui-knowledge-documents.png differ
diff --git a/docs/static/img/admin-ui-knowledge-tag-types.png b/docs/static/img/admin-ui-knowledge-tag-types.png
new file mode 100644
index 000000000000..cf16fd6f5e25
Binary files /dev/null and b/docs/static/img/admin-ui-knowledge-tag-types.png differ
diff --git a/docs/static/img/admin-ui-knowledge-tags.png b/docs/static/img/admin-ui-knowledge-tags.png
new file mode 100644
index 000000000000..7f61482c4428
Binary files /dev/null and b/docs/static/img/admin-ui-knowledge-tags.png differ
diff --git a/docs/static/img/admin-ui-notifications.png b/docs/static/img/admin-ui-notifications.png
new file mode 100644
index 000000000000..e131ed56fcd2
Binary files /dev/null and b/docs/static/img/admin-ui-notifications.png differ
diff --git a/docs/static/img/admin-ui-project.png b/docs/static/img/admin-ui-project.png
new file mode 100644
index 000000000000..a82161e9c39e
Binary files /dev/null and b/docs/static/img/admin-ui-project.png differ
diff --git a/docs/static/img/admin-ui-settings.png b/docs/static/img/admin-ui-settings.png
new file mode 100644
index 000000000000..9c6ba3d97e63
Binary files /dev/null and b/docs/static/img/admin-ui-settings.png differ
diff --git a/docs/static/img/admin-ui-signal-definition.png b/docs/static/img/admin-ui-signal-definition.png
new file mode 100644
index 000000000000..f130f7f197eb
Binary files /dev/null and b/docs/static/img/admin-ui-signal-definition.png differ
diff --git a/docs/static/img/admin-ui-signal-engagement-filter.png b/docs/static/img/admin-ui-signal-engagement-filter.png
new file mode 100644
index 000000000000..a07e54f2c6e1
Binary files /dev/null and b/docs/static/img/admin-ui-signal-engagement-filter.png differ
diff --git a/docs/static/img/admin-ui-signal-entity-type-name.png b/docs/static/img/admin-ui-signal-entity-type-name.png
new file mode 100644
index 000000000000..63c3b27e5139
Binary files /dev/null and b/docs/static/img/admin-ui-signal-entity-type-name.png differ
diff --git a/docs/static/img/admin-ui-signal-entity-type.png b/docs/static/img/admin-ui-signal-entity-type.png
new file mode 100644
index 000000000000..72de2a42c1db
Binary files /dev/null and b/docs/static/img/admin-ui-signal-entity-type.png differ
diff --git a/docs/static/img/admin-ui-signal-filter-dedupe.png b/docs/static/img/admin-ui-signal-filter-dedupe.png
new file mode 100644
index 000000000000..fd0f355a4e65
Binary files /dev/null and b/docs/static/img/admin-ui-signal-filter-dedupe.png differ
diff --git a/docs/static/img/admin-ui-signal-filter-snooze.png b/docs/static/img/admin-ui-signal-filter-snooze.png
new file mode 100644
index 000000000000..cd46fa658d94
Binary files /dev/null and b/docs/static/img/admin-ui-signal-filter-snooze.png differ
diff --git a/docs/static/img/admin-ui-tasks.png b/docs/static/img/admin-ui-tasks.png
new file mode 100644
index 000000000000..2517abb8fcba
Binary files /dev/null and b/docs/static/img/admin-ui-tasks.png differ
diff --git a/docs/static/img/admin-ui-users.png b/docs/static/img/admin-ui-users.png
new file mode 100644
index 000000000000..93b4e30d3c28
Binary files /dev/null and b/docs/static/img/admin-ui-users.png differ
diff --git a/docs/static/img/attorney-section.png b/docs/static/img/attorney-section.png
new file mode 100644
index 000000000000..73b7fdecb33d
Binary files /dev/null and b/docs/static/img/attorney-section.png differ
diff --git a/docs/static/img/docusaurus-social-card.jpg b/docs/static/img/docusaurus-social-card.jpg
new file mode 100644
index 000000000000..ffcb448210e1
Binary files /dev/null and b/docs/static/img/docusaurus-social-card.jpg differ
diff --git a/docs/static/img/docusaurus.png b/docs/static/img/docusaurus.png
new file mode 100644
index 000000000000..f458149e3c8f
Binary files /dev/null and b/docs/static/img/docusaurus.png differ
diff --git a/docs/static/img/email-incident-welcome.png b/docs/static/img/email-incident-welcome.png
new file mode 100644
index 000000000000..a59c092068d2
Binary files /dev/null and b/docs/static/img/email-incident-welcome.png differ
diff --git a/docs/static/img/example-risk-score.png b/docs/static/img/example-risk-score.png
new file mode 100644
index 000000000000..b82070c29d63
Binary files /dev/null and b/docs/static/img/example-risk-score.png differ
diff --git a/docs/static/img/favicon.ico b/docs/static/img/favicon.ico
new file mode 100644
index 000000000000..c01d54bcd39a
Binary files /dev/null and b/docs/static/img/favicon.ico differ
diff --git a/docs/.gitbook/assets/google-docs-task-comment.png b/docs/static/img/google-docs-task-comment.png
similarity index 100%
rename from docs/.gitbook/assets/google-docs-task-comment.png
rename to docs/static/img/google-docs-task-comment.png
diff --git a/docs/.gitbook/assets/google-setup-credentials-0.png b/docs/static/img/google-setup-credentials-0.png
similarity index 100%
rename from docs/.gitbook/assets/google-setup-credentials-0.png
rename to docs/static/img/google-setup-credentials-0.png
diff --git a/docs/.gitbook/assets/incident-flow-0.png b/docs/static/img/incident-flow-0.png
similarity index 100%
rename from docs/.gitbook/assets/incident-flow-0.png
rename to docs/static/img/incident-flow-0.png
diff --git a/docs/static/img/logo.svg b/docs/static/img/logo.svg
new file mode 100644
index 000000000000..9db6d0d066e3
--- /dev/null
+++ b/docs/static/img/logo.svg
@@ -0,0 +1 @@
+[SVG markup omitted in this extract]
\ No newline at end of file
diff --git a/docs/static/img/pagerduty-service-setup.png b/docs/static/img/pagerduty-service-setup.png
new file mode 100644
index 000000000000..d815705adedd
Binary files /dev/null and b/docs/static/img/pagerduty-service-setup.png differ
diff --git a/docs/static/img/slack-conversation-add-timeline-event.png b/docs/static/img/slack-conversation-add-timeline-event.png
new file mode 100644
index 000000000000..46fbe954ef79
Binary files /dev/null and b/docs/static/img/slack-conversation-add-timeline-event.png differ
diff --git a/docs/.gitbook/assets/slack-conversation-assign-role.png b/docs/static/img/slack-conversation-assign-role.png
similarity index 100%
rename from docs/.gitbook/assets/slack-conversation-assign-role.png
rename to docs/static/img/slack-conversation-assign-role.png
diff --git a/docs/static/img/slack-conversation-create-task.png b/docs/static/img/slack-conversation-create-task.png
new file mode 100644
index 000000000000..c352953a9f44
Binary files /dev/null and b/docs/static/img/slack-conversation-create-task.png differ
diff --git a/docs/.gitbook/assets/slack-conversation-edit-incident.png b/docs/static/img/slack-conversation-edit-incident.png
similarity index 100%
rename from docs/.gitbook/assets/slack-conversation-edit-incident.png
rename to docs/static/img/slack-conversation-edit-incident.png
diff --git a/docs/static/img/slack-conversation-engage-oncall.png b/docs/static/img/slack-conversation-engage-oncall.png
new file mode 100644
index 000000000000..c92e2ce6c6f3
Binary files /dev/null and b/docs/static/img/slack-conversation-engage-oncall.png differ
diff --git a/docs/static/img/slack-conversation-list-my-tasks.png b/docs/static/img/slack-conversation-list-my-tasks.png
new file mode 100644
index 000000000000..23f50d0eba5b
Binary files /dev/null and b/docs/static/img/slack-conversation-list-my-tasks.png differ
diff --git a/docs/.gitbook/assets/slack-conversation-list-participants.png b/docs/static/img/slack-conversation-list-participants.png
similarity index 100%
rename from docs/.gitbook/assets/slack-conversation-list-participants.png
rename to docs/static/img/slack-conversation-list-participants.png
diff --git a/docs/static/img/slack-conversation-list-tasks.png b/docs/static/img/slack-conversation-list-tasks.png
new file mode 100644
index 000000000000..06f1a44db02a
Binary files /dev/null and b/docs/static/img/slack-conversation-list-tasks.png differ
diff --git a/docs/static/img/slack-conversation-list-workflows.png b/docs/static/img/slack-conversation-list-workflows.png
new file mode 100644
index 000000000000..4e442cbf313e
Binary files /dev/null and b/docs/static/img/slack-conversation-list-workflows.png differ
diff --git a/docs/static/img/slack-conversation-notifications-group.png b/docs/static/img/slack-conversation-notifications-group.png
new file mode 100644
index 000000000000..0a37c29c771a
Binary files /dev/null and b/docs/static/img/slack-conversation-notifications-group.png differ
diff --git a/docs/.gitbook/assets/slack-conversation-pir.png b/docs/static/img/slack-conversation-pir.png
similarity index 100%
rename from docs/.gitbook/assets/slack-conversation-pir.png
rename to docs/static/img/slack-conversation-pir.png
diff --git a/docs/static/img/slack-conversation-report-executive.png b/docs/static/img/slack-conversation-report-executive.png
new file mode 100644
index 000000000000..753e77f392ee
Binary files /dev/null and b/docs/static/img/slack-conversation-report-executive.png differ
diff --git a/docs/static/img/slack-conversation-report-incident.png b/docs/static/img/slack-conversation-report-incident.png
new file mode 100644
index 000000000000..bb219b66d5aa
Binary files /dev/null and b/docs/static/img/slack-conversation-report-incident.png differ
diff --git a/docs/static/img/slack-conversation-run-workflow.png b/docs/static/img/slack-conversation-run-workflow.png
new file mode 100644
index 000000000000..0e6fd27b3542
Binary files /dev/null and b/docs/static/img/slack-conversation-run-workflow.png differ
diff --git a/docs/.gitbook/assets/slack-conversation-status-report-response.png b/docs/static/img/slack-conversation-status-report-response.png
similarity index 100%
rename from docs/.gitbook/assets/slack-conversation-status-report-response.png
rename to docs/static/img/slack-conversation-status-report-response.png
diff --git a/docs/.gitbook/assets/slack-conversation-status-report.png b/docs/static/img/slack-conversation-status-report.png
similarity index 100%
rename from docs/.gitbook/assets/slack-conversation-status-report.png
rename to docs/static/img/slack-conversation-status-report.png
diff --git a/docs/static/img/slack-conversation-update-participant.png b/docs/static/img/slack-conversation-update-participant.png
new file mode 100644
index 000000000000..08ebfed9607e
Binary files /dev/null and b/docs/static/img/slack-conversation-update-participant.png differ
diff --git a/docs/.gitbook/assets/slack-conversations-incident-notification.png b/docs/static/img/slack-conversations-incident-notification.png
similarity index 100%
rename from docs/.gitbook/assets/slack-conversations-incident-notification.png
rename to docs/static/img/slack-conversations-incident-notification.png
diff --git a/docs/static/img/slack-plugin-contact-information-resolver.png b/docs/static/img/slack-plugin-contact-information-resolver.png
new file mode 100644
index 000000000000..b0331aa145fc
Binary files /dev/null and b/docs/static/img/slack-plugin-contact-information-resolver.png differ
diff --git a/docs/static/img/slack-plugin-conversation-management.png b/docs/static/img/slack-plugin-conversation-management.png
new file mode 100644
index 000000000000..b8fbf1a3276f
Binary files /dev/null and b/docs/static/img/slack-plugin-conversation-management.png differ
diff --git a/docs/.gitbook/assets/slack-setup-commands-0.png b/docs/static/img/slack-setup-commands-0.png
similarity index 100%
rename from docs/.gitbook/assets/slack-setup-commands-0.png
rename to docs/static/img/slack-setup-commands-0.png
diff --git a/docs/.gitbook/assets/slack-setup-commands-1.png b/docs/static/img/slack-setup-commands-1.png
similarity index 100%
rename from docs/.gitbook/assets/slack-setup-commands-1.png
rename to docs/static/img/slack-setup-commands-1.png
diff --git a/docs/static/img/slack-setup-commands-2.png b/docs/static/img/slack-setup-commands-2.png
new file mode 100644
index 000000000000..9301c2b2e700
Binary files /dev/null and b/docs/static/img/slack-setup-commands-2.png differ
diff --git a/docs/.gitbook/assets/slack-setup-dialogs.png b/docs/static/img/slack-setup-dialogs.png
similarity index 100%
rename from docs/.gitbook/assets/slack-setup-dialogs.png
rename to docs/static/img/slack-setup-dialogs.png
diff --git a/docs/.gitbook/assets/slack-setup-events.png b/docs/static/img/slack-setup-events.png
similarity index 100%
rename from docs/.gitbook/assets/slack-setup-events.png
rename to docs/static/img/slack-setup-events.png
diff --git a/docs/static/img/thumb-1.png b/docs/static/img/thumb-1.png
new file mode 100644
index 000000000000..49c405b2357f
Binary files /dev/null and b/docs/static/img/thumb-1.png differ
diff --git a/docs/static/img/thumb-2.png b/docs/static/img/thumb-2.png
new file mode 100644
index 000000000000..f36458bb7954
Binary files /dev/null and b/docs/static/img/thumb-2.png differ
diff --git a/docs/static/img/thumb-3.png b/docs/static/img/thumb-3.png
new file mode 100644
index 000000000000..7dfe62ac4544
Binary files /dev/null and b/docs/static/img/thumb-3.png differ
diff --git a/docs/static/img/thumb-4.png b/docs/static/img/thumb-4.png
new file mode 100644
index 000000000000..a8a2b2b4985a
Binary files /dev/null and b/docs/static/img/thumb-4.png differ
diff --git a/docs/static/img/thumb-5.png b/docs/static/img/thumb-5.png
new file mode 100644
index 000000000000..d318b6b7c72b
Binary files /dev/null and b/docs/static/img/thumb-5.png differ
diff --git a/docs/static/img/thumb-6.png b/docs/static/img/thumb-6.png
new file mode 100644
index 000000000000..011d78b29bd1
Binary files /dev/null and b/docs/static/img/thumb-6.png differ
diff --git a/docs/static/img/undraw_docusaurus_mountain.svg b/docs/static/img/undraw_docusaurus_mountain.svg
new file mode 100644
index 000000000000..af961c49a888
--- /dev/null
+++ b/docs/static/img/undraw_docusaurus_mountain.svg
@@ -0,0 +1,171 @@
+[SVG illustration: "Easy to Use" (undraw mountain illustration); markup omitted in this extract]
diff --git a/docs/static/img/undraw_docusaurus_react.svg b/docs/static/img/undraw_docusaurus_react.svg
new file mode 100644
index 000000000000..94b5cf08f88f
--- /dev/null
+++ b/docs/static/img/undraw_docusaurus_react.svg
@@ -0,0 +1,170 @@
+[SVG illustration: "Powered by React" (undraw React illustration); markup omitted in this extract]
diff --git a/docs/static/img/undraw_docusaurus_tree.svg b/docs/static/img/undraw_docusaurus_tree.svg
new file mode 100644
index 000000000000..d9161d33920c
--- /dev/null
+++ b/docs/static/img/undraw_docusaurus_tree.svg
@@ -0,0 +1,40 @@
+[SVG illustration: "Focus on What Matters" (undraw tree illustration); markup omitted in this extract]
diff --git a/docs/static/img/user-guide-incident-feedback-conversation-direct-message.png b/docs/static/img/user-guide-incident-feedback-conversation-direct-message.png
new file mode 100644
index 000000000000..7c29cc69f4ca
Binary files /dev/null and b/docs/static/img/user-guide-incident-feedback-conversation-direct-message.png differ
diff --git a/docs/static/img/user-guide-incident-feedback-conversation-modal.png b/docs/static/img/user-guide-incident-feedback-conversation-modal.png
new file mode 100644
index 000000000000..2427fe851eb6
Binary files /dev/null and b/docs/static/img/user-guide-incident-feedback-conversation-modal.png differ
diff --git a/docs/upgrading.md b/docs/upgrading.md
deleted file mode 100644
index 223d82e8830d..000000000000
--- a/docs/upgrading.md
+++ /dev/null
@@ -1,45 +0,0 @@
----
-description: Staying up to date.
----
-
-# Upgrading
-
-If you're upgrading to a new major release, it's always recommended to start by generating a new configuration file \(using the new version of Dispatch\). This will ensure that any new settings which may have been added are clearly visable and get configurated correctly.
-
-Beyond that, upgrades are simple as bumping the version of Dispatch \(which will cause any changed dependencies to upgrade\), running data migrations, and restarting all related services.
-
-{% hint style="info" %}
-In some cases you may want to stop services before doing the upgrade process or avoid intermittent errors.
-{% endhint %}
-
-## Upgrading Dispatch
-
-### Upgrading the package
-
-The easiest way to upgrade the Dispatch package using`pip`:
-
-```bash
-pip install --upgrade dispatch
-```
-
-You may prefer to install a fixed version rather than just assuming the latest, as it will allow you to better understand what is changing.
-
-If you're installing from source, you may have additional requirements that are unfulfilled, so take the necessary precautions of testing your environment before committing to the upgrade.
-
-### Running Migrations
-
-Just as during the initial setup, migrations are applied with the upgrade command.
-
-```bash
-dispatch database upgrade
-```
-
-### Restarting services
-
-You'll need to ensure that _all_ services running Dispatch code are restarted after an upgrade. This is important as Python loads modules in memory and code changes will not be reflected until a restart.
-
-These services include:
-
-* server -- `dispatch server start`
-* scheduler -- `dispatch scheduler start`
-
diff --git a/docs/user-guide/README.md b/docs/user-guide/README.md
deleted file mode 100644
index fb74601b5dc4..000000000000
--- a/docs/user-guide/README.md
+++ /dev/null
@@ -1,6 +0,0 @@
-# User Guide
-
-All Dispatch incidents follow the same flow:
-
-
-
diff --git a/docs/user-guide/administration/README.md b/docs/user-guide/administration/README.md
deleted file mode 100644
index a8ece8e4cc46..000000000000
--- a/docs/user-guide/administration/README.md
+++ /dev/null
@@ -1,10 +0,0 @@
----
-description: Setting up dispatch for your organization's needs.
----
-
-# Administration
-
-The Dispatch Web UI is home to all of Dispatch administration abilities, these contain things that are likely to change frequently \(less frequently changed items live in the Dispatch configuration file\).
-
-
-
diff --git a/docs/user-guide/administration/configuration.md b/docs/user-guide/administration/configuration.md
deleted file mode 100644
index aa4189548034..000000000000
--- a/docs/user-guide/administration/configuration.md
+++ /dev/null
@@ -1,12 +0,0 @@
-# Configuration
-
-## Incident Types
-
-Dispatch allows you to define your own incident types. To create a new incident type navigate to: `Dispatch > Incident Types > New`
-
-
-
-Once there, you can create a name for you incident type \(this will be displayed on the report form, among other places\) and give it a short description.
-
-Additionally you can define the service you wish to use as the default incident commander for this incident, and which document you wish to use as the Incident Document.
-
diff --git a/docs/user-guide/administration/contacts.md b/docs/user-guide/administration/contacts.md
deleted file mode 100644
index cd0890553693..000000000000
--- a/docs/user-guide/administration/contacts.md
+++ /dev/null
@@ -1,20 +0,0 @@
----
-description: Configuring Dispatch's contact repository
----
-
-# Contacts
-
-## Individual
-
-In Dispatch, Individuals are either internal or external people identifiers. Typically, an organization will have a robust internal whitepages/phone-book. Dispatch does not expect to replace those data stores, instead it keeps a lightweight notion of identities to associate with incidents.
-
-Everyone has a spreadsheet somewhere of who to contact for a given incident. Dispatch allows the folks to be pulled directly into an incident. By assigning individuals terms, incident types or incident priorities dispatch is able to directly add those the folks \(if internal\) or suggest reaching out \(if external\).
-
-## Team
-
-Like `Individuals`, there are often groups of individuals that need to be engaged and/or notified during an incident. Here we give you a place to manage those team \(typically, team distribution lists\), and have Dispatch help keep those folks up-to-date.
-
-## Service
-
-Similar to `Teams`, there are often groups of individuals responsible for an application or service who need to be involved in an incident. However, in these circumstances you don't want to engage the _whole_ team; you only want to engage the on-call person. Services are the way to resolve things like PagerDuty schedules \(built-in\).
-
diff --git a/docs/user-guide/administration/incidents.md b/docs/user-guide/administration/incidents.md
deleted file mode 100644
index c0726b9f514f..000000000000
--- a/docs/user-guide/administration/incidents.md
+++ /dev/null
@@ -1,18 +0,0 @@
----
-description: View and manage all incidents.
----
-
-# Incidents
-
-## Dashboard
-
-The dashboard should be pretty self-explanatory. It attempts to provide some meaningful insights across all of the incidents managed by Dispatch. Do we have an uptick in a particular type of incident? Are we getting better at responding to incidents?
-
-## Incidents
-
-During an incident, most actions are managed by the incident channel ops commands \(e.g. `/dispatch-edit-incident`\). However, when an incident closes you may need to modify incident details; the Admin Web UI is the place to manage these regardless of the state of the incident. It provides a more holistic view of incidents as well as allowing you to search across them.
-
-## Tasks
-
-Similar to `Incidents`, most incident tasks are managed in the incident channel itself. But when you want to view _all_ incident tasks across the organization, this admin view gives you that ability.
-
diff --git a/docs/user-guide/administration/knowledge.md b/docs/user-guide/administration/knowledge.md
deleted file mode 100644
index 3471299d3322..000000000000
--- a/docs/user-guide/administration/knowledge.md
+++ /dev/null
@@ -1,30 +0,0 @@
----
-description: Configure Dispatch's incident knowledge base
----
-
-# Knowledge
-
-## Applications
-
-Applications, like Terms, only have meaning in particular contexts. Within Dispatch, application names are important enough to be separated from other terms and given additional metadata surrounding them.
-
-## Documents
-
-Documents are links to external sources \(Web Pages, Google Documents, etc.\). These documents can be associated with Terms, allowing them to be recommended reading for incident participants.
-
-The best use case for documents is tracking, managing, and recommending incident runbooks.
-
-## Terms
-
-Terms are words or phrases that may not have any meaning to an outside observer but have deep organizational meaning, things like acronyms or phrases.
-
-Terms are also used for explicit association of incidents. For instance, take the term "PCI": it has deep organizational meaning, and thus is a term that should be tracked. Moreover, it's important enough that Teams and Individuals may want to be pulled into any incident that contains this key term.
-
-Any defined term can be associated with Teams, Services, or Individuals for incident inclusion.
-
-## Definitions
-
-Definitions are used to collect and manage term definitions from various sources; this enables incident participants to quickly understand the language and terms being used throughout an incident.
-
-A definition can be associated with one or more terms \(in case of term overload\).
-
diff --git a/docs/user-guide/incident-commander.md b/docs/user-guide/incident-commander.md
deleted file mode 100644
index 5e3dff17a4d2..000000000000
--- a/docs/user-guide/incident-commander.md
+++ /dev/null
@@ -1,66 +0,0 @@
----
-description: What to expect as an incident commander.
----
-
-# Incident Commander
-
-## Reporting
-
-Within Dispatch, Incident Commanders \(ICs\) are also participants and will receive all of the messaging that a participant does. When resolved as incident commander, you are assigned that role by Dispatch and your identity is propagated via various messaging. Additionally, for `High` severity events, the Incident Commander is automatically paged \(via PagerDuty\).
-
-## During
-
-During an incident, the IC is responsible for pushing the incident forward and keeping the group going in the right direction. To help them with this, Dispatch provides a few useful commands:
-
-### Roles
-
-`/dispatch-assign-role`
-
-You're not always incident commander forever, sometimes a situation changes or you just need a break. The `dispatch-assign-role` command makes it easy and more importantly _clear_ to everyone involved in the incident who the current incident commander is.
-
-
-
-### Status
-
-`/dispatch-list-participants`
-
-This command is most useful for determining which teams are already engaged and who the IC might want to task or ask questions of.
-
-
-
-`/dispatch-edit-incident`
-
-This command allows the IC to modify several aspects of the incident without ever leaving the conversation interface.
-
-
-
-`/dispatch-status-report`
-
-One of the most important aspects of an IC's job is notifying external stakeholders of the current state of the incident. These stakeholders should never be down in the weeds of incident management but do need a way to view concise and up-to-date incident information. The `status-report` command allows the IC to do so in a standardized way.
-
-
-
-
-
-### Tasks
-
-During an incident, you may want to assign tasks or ask questions. Dispatch provides a lightweight tasking mechanism by integrating with the Google Docs commenting system.
-
-In the Incident Document:
-
-
-
-## After
-
-Once the incident is stable or can be closed, Dispatch again provides a few helpful commands to the IC:
-
-`/dispatch-mark-stable`
-
-This command marks the incident as stable, updating external tickets as necessary.
-
-It also creates a Post Incident Review \(PIR\) document to help you track the post incident review process.
-
-`/dispatch-mark-closed`
-
-This command marks the incident as closed and begins cleaning up incident resources, including archiving incident documents, conversation channels, and groups.
-
diff --git a/docs/user-guide/incident-participant.md b/docs/user-guide/incident-participant.md
deleted file mode 100644
index 3cf1c75ae406..000000000000
--- a/docs/user-guide/incident-participant.md
+++ /dev/null
@@ -1,44 +0,0 @@
----
-description: What to expect as an incident participant.
----
-
-# Incident Participant
-
-## Reporting
-
-Dispatch attempts to make reporting incidents as easy as possible. To that end, Dispatch provides a dedicated incident report form that users throughout the organization can submit to engage incident-related resources.
-
-Located at: `https:///incidents/report`
-
-
-
-Once submitted the user is then presented with all of the incident resources they need to start managing the incident.
-
-
-
-
-
-## During
-
-After an incident is created, Dispatch will start to automatically pull new participants into the incident. Who it pulls in is determined by rules that have been set up in the Dispatch Admin UI.
-
-Each new participant receives a welcome message \(Email + Slack\) providing them resources and information to help orient them for this given incident.
-
-
-
-
-
-From there, use the resources as you normally would to run your investigation; Dispatch will be there managing permissions, providing reminders, and helping track the incident as it progresses to resolution.
-
-## After
-
-After an incident has been marked stable, Dispatch is still there to help create the resources necessary to run Post Incident Reviews \(PIRs\) and to help track any associated action items.
-
-## Notifications
-
-In addition to Dispatch pulling in individuals that will be directly responsible for managing the incident, it provides notifications for general awareness throughout the organization.
-
-{% hint style="info" %}
-The incident notification message includes a "Get Involved" button; this allows individuals to add themselves to the incident \(and its resources\) without involvement from the incident commander.
-{% endhint %}
-
diff --git a/package-lock.json b/package-lock.json
index 18e243e7b444..9b00efdee37b 100644
--- a/package-lock.json
+++ b/package-lock.json
@@ -1,19 +1,96 @@
{
+ "name": "dispatch-e2e",
+ "version": "1.0.0",
+ "lockfileVersion": 3,
"requires": true,
- "lockfileVersion": 1,
- "dependencies": {
- "good-env": {
- "version": "5.1.1",
- "resolved": "https://registry.npmjs.org/good-env/-/good-env-5.1.1.tgz",
- "integrity": "sha512-WI01feY8/OIVyajYNeXF+urIK5UzuGtVldz2c7/dCW8kePVvX/nKnQ7tbL+Bz3Rqk84BTbHbMYT+nV+ARxjAUA==",
- "requires": {
- "is_js": "^0.9.0"
+ "packages": {
+ "": {
+ "name": "dispatch-e2e",
+ "version": "1.0.0",
+ "devDependencies": {
+ "@playwright/test": "^1.40.0",
+ "@types/node": "^20.0.0"
}
},
- "is_js": {
- "version": "0.9.0",
- "resolved": "https://registry.npmjs.org/is_js/-/is_js-0.9.0.tgz",
- "integrity": "sha1-CrlFQFArp6+iTIVqqYVWFmnpxS0="
+ "node_modules/@playwright/test": {
+ "version": "1.54.1",
+ "resolved": "https://registry.npmjs.org/@playwright/test/-/test-1.54.1.tgz",
+ "integrity": "sha512-FS8hQ12acieG2dYSksmLOF7BNxnVf2afRJdCuM1eMSxj6QTSE6G4InGF7oApGgDb65MX7AwMVlIkpru0yZA4Xw==",
+ "dev": true,
+ "license": "Apache-2.0",
+ "dependencies": {
+ "playwright": "1.54.1"
+ },
+ "bin": {
+ "playwright": "cli.js"
+ },
+ "engines": {
+ "node": ">=18"
+ }
+ },
+ "node_modules/@types/node": {
+ "version": "20.19.9",
+ "resolved": "https://registry.npmjs.org/@types/node/-/node-20.19.9.tgz",
+ "integrity": "sha512-cuVNgarYWZqxRJDQHEB58GEONhOK79QVR/qYx4S7kcUObQvUwvFnYxJuuHUKm2aieN9X3yZB4LZsuYNU1Qphsw==",
+ "dev": true,
+ "license": "MIT",
+ "dependencies": {
+ "undici-types": "~6.21.0"
+ }
+ },
+ "node_modules/fsevents": {
+ "version": "2.3.2",
+ "resolved": "https://registry.npmjs.org/fsevents/-/fsevents-2.3.2.tgz",
+ "integrity": "sha512-xiqMQR4xAeHTuB9uWm+fFRcIOgKBMiOBP+eXiyT7jsgVCq1bkVygt00oASowB7EdtpOHaaPgKt812P9ab+DDKA==",
+ "dev": true,
+ "hasInstallScript": true,
+ "license": "MIT",
+ "optional": true,
+ "os": [
+ "darwin"
+ ],
+ "engines": {
+ "node": "^8.16.0 || ^10.6.0 || >=11.0.0"
+ }
+ },
+ "node_modules/playwright": {
+ "version": "1.54.1",
+ "resolved": "https://registry.npmjs.org/playwright/-/playwright-1.54.1.tgz",
+ "integrity": "sha512-peWpSwIBmSLi6aW2auvrUtf2DqY16YYcCMO8rTVx486jKmDTJg7UAhyrraP98GB8BoPURZP8+nxO7TSd4cPr5g==",
+ "dev": true,
+ "license": "Apache-2.0",
+ "dependencies": {
+ "playwright-core": "1.54.1"
+ },
+ "bin": {
+ "playwright": "cli.js"
+ },
+ "engines": {
+ "node": ">=18"
+ },
+ "optionalDependencies": {
+ "fsevents": "2.3.2"
+ }
+ },
+ "node_modules/playwright-core": {
+ "version": "1.54.1",
+ "resolved": "https://registry.npmjs.org/playwright-core/-/playwright-core-1.54.1.tgz",
+ "integrity": "sha512-Nbjs2zjj0htNhzgiy5wu+3w09YetDx5pkrpI/kZotDlDUaYk0HVA5xrBVPdow4SAUIlhgKcJeJg4GRKW6xHusA==",
+ "dev": true,
+ "license": "Apache-2.0",
+ "bin": {
+ "playwright-core": "cli.js"
+ },
+ "engines": {
+ "node": ">=18"
+ }
+ },
+ "node_modules/undici-types": {
+ "version": "6.21.0",
+ "resolved": "https://registry.npmjs.org/undici-types/-/undici-types-6.21.0.tgz",
+ "integrity": "sha512-iwDZqg0QAGrg9Rav5H4n0M64c3mkR59cJ6wQp+7C4nI0gsmExaedaYLNO44eT4AtBBwjbTiGPMlt2Md0T9H9JQ==",
+ "dev": true,
+ "license": "MIT"
}
}
}
diff --git a/package.json b/package.json
new file mode 100644
index 000000000000..fa238809d0da
--- /dev/null
+++ b/package.json
@@ -0,0 +1,15 @@
+{
+ "name": "dispatch-e2e",
+ "version": "1.0.0",
+ "private": true,
+ "description": "End-to-end tests for Dispatch",
+ "scripts": {
+ "test:e2e": "playwright test",
+ "test:e2e:ui": "playwright test --ui",
+ "test:e2e:debug": "playwright test --debug"
+ },
+ "devDependencies": {
+ "@playwright/test": "^1.40.0",
+ "@types/node": "^20.0.0"
+ }
+}
diff --git a/playwright.config.ts b/playwright.config.ts
new file mode 100644
index 000000000000..493bf1f9e58a
--- /dev/null
+++ b/playwright.config.ts
@@ -0,0 +1,84 @@
+import type { PlaywrightTestConfig } from "@playwright/test"
+import { devices } from "@playwright/test"
+
+/**
+ * @see https://playwright.dev/docs/test-configuration
+ */
+const config: PlaywrightTestConfig = {
+ testDir: "./tests/static/e2e",
+ outputDir: "./tests/static/e2e/artifacts/test-failures",
+ use: {
+ /* Maximum time each action such as `click()` can take. Defaults to 0 (no limit). */
+ actionTimeout: 0,
+ /* Base URL to use in actions like `await page.goto('/')`. */
+ baseURL: "http://localhost:8080/",
+ /* Collect trace when retrying the failed test. See https://playwright.dev/docs/trace-viewer */
+ trace: "retain-on-failure",
+ video: "retain-on-failure",
+ screenshot: "only-on-failure",
+ /* Navigation timeout - increase for slower CI environments */
+ navigationTimeout: process.env.CI ? 60000 : 30000,
+ },
+ /* Maximum time one test can run for. */
+ timeout: process.env.CI ? 200 * 1000 : 60 * 1000,
+ expect: {
+ /**
+ * Maximum time expect() should wait for the condition to be met.
+ * For example in `await expect(locator).toHaveText();`
+ */
+ timeout: process.env.CI ? 20000 : 10000,
+ },
+ /* Run tests in files in parallel */
+ fullyParallel: true,
+ /* Fail the build on CI if you accidentally left test.only in the source code. */
+ forbidOnly: !!process.env.CI,
+ /* Retry on CI only */
+ retries: process.env.CI ? 2 : 0,
+ /* Optimize workers for CI - use more workers for faster execution */
+ workers: process.env.CI ? 4 : undefined,
+ /* Reporter to use. See https://playwright.dev/docs/test-reporters */
+ reporter: process.env.CI ? [["html"], ["github"]] : "html",
+ /* Configure projects for major browsers */
+ projects: process.env.CI
+ ? [
+ {
+ name: "chromium",
+ use: {
+ ...devices["Desktop Chrome"],
+ },
+ },
+ ]
+ : [
+ {
+ name: "chromium",
+ use: {
+ ...devices["Desktop Chrome"],
+ },
+ },
+ {
+ name: "firefox",
+ use: {
+ ...devices["Desktop Firefox"],
+ },
+ },
+ {
+ name: "webkit",
+ use: {
+ ...devices["Desktop Safari"],
+ },
+ },
+ ],
+ /* Folder for test artifacts such as screenshots, videos, traces, etc. */
+ // outputDir: 'test-results/',
+
+ /* Run your local dev server before starting the tests */
+ webServer: {
+ command: "dispatch server develop",
+ url: "http://localhost:8080/",
+ reuseExistingServer: !process.env.CI,
+ /* Increase timeout to allow server to fully start and settle */
+ timeout: 240 * 1000, // 4 minutes to ensure server is fully ready
+ },
+}
+
+export default config
diff --git a/pyproject.toml b/pyproject.toml
index 1510d32a5974..4473417ab0a0 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -1,4 +1,296 @@
+[build-system]
+requires = ["hatchling", "versioningit"]
+build-backend = "hatchling.build"
+
+[project]
+name = "dispatch"
+dynamic = ["version"]
+description = "Dispatch is an incident management and orchestration platform"
+readme = "README.md"
+license = {text = "Apache-2.0"}
+authors = [
+ {name = "Netflix, Inc.", email = "oss@netflix.com"}
+]
+maintainers = [
+ {name = "Netflix OSS", email = "oss@netflix.com"}
+]
+requires-python = ">=3.11"
+classifiers = [
+ "Development Status :: 5 - Production/Stable",
+ "Intended Audience :: Developers",
+ "Programming Language :: Python :: 3.11",
+ "Programming Language :: Python :: 3.12",
+ "Programming Language :: Python :: 3.13",
+ "Topic :: Internet :: WWW/HTTP :: HTTP Servers",
+ "Topic :: Internet :: WWW/HTTP :: Dynamic Content",
+ "Topic :: Software Development :: Libraries :: Python Modules",
+ "Topic :: System :: Systems Administration",
+]
+keywords = ["incident", "management", "orchestration", "response", "security"]
+
+dependencies = [
+ "aiocache",
+ "aiofiles",
+ "aiohttp",
+ "alembic",
+ "atlassian-python-api",
+ "attrs>=22.2.0",
+ "bcrypt",
+ "blockkit==1.9.2",
+ "boto3",
+ "cachetools",
+ "chardet",
+ "click",
+ "cryptography<42,>=38.0.0",
+ "duo-client",
+ "email-validator",
+ "emails",
+ "fastapi==0.115.12",
+ "google-api-python-client",
+ "google-auth-oauthlib",
+ "h11",
+ "httpx",
+ "jinja2",
+ "jira",
+ "joblib",
+ "jsonpath_ng",
+ "lxml==5.3.0",
+ "markdown",
+ "msal",
+ "numpy",
+ "oauth2client",
+ "openai==1.77.0",
+ "pandas",
+ "pdpyras",
+ "protobuf<5.0dev,>=4.21.6",
+ "psycopg2-binary",
+ "pyarrow",
+ "pydantic==2.11.4",
+ "pydantic-extra-types==2.10.4",
+ "pyparsing",
+ "python-dateutil",
+ "python-jose",
+ "python-multipart",
+ "python-slugify",
+ "pytz",
+ "requests",
+ "schedule",
+ "schemathesis",
+ "sentry-asgi",
+ "sentry-sdk==1.45.0",
+ "sh",
+ "slack-bolt",
+ "slack_sdk",
+ "slowapi",
+ "spacy==3.8.7",
+ "sqlalchemy-filters",
+ "sqlalchemy-utils",
+ "sqlalchemy==2.0.8",
+ "statsmodels",
+ "tabulate",
+ "tenacity",
+ "thinc==8.3.4",
+ "tiktoken",
+ "typing-extensions==4.13.2",
+ "uvicorn",
+ "uvloop",
+ "validators==0.18.2",
+]
+
+[project.optional-dependencies]
+dev = [
+ "attrs>=22.2.0",
+ "black",
+ "click",
+ "coverage",
+ "devtools",
+ "easydict",
+ "factory-boy",
+ "faker",
+ "ipython",
+ "pre-commit",
+ "pytest==7.4.4",
+ "pytest-mock",
+ "ruff",
+ "typing-extensions==4.13.2",
+]
+netflix = [
+ "dispatch-internal-plugins",
+]
+
+[project.scripts]
+dispatch = "dispatch.cli:entrypoint"
+
+[project.entry-points."dispatch.plugins"]
+dispatch_atlassian_confluence = "dispatch.plugins.dispatch_atlassian_confluence.plugin:ConfluencePagePlugin"
+dispatch_atlassian_confluence_document = "dispatch.plugins.dispatch_atlassian_confluence.docs.plugin:ConfluencePageDocPlugin"
+dispatch_auth_mfa = "dispatch.plugins.dispatch_core.plugin:DispatchMfaPlugin"
+dispatch_aws_alb_auth = "dispatch.plugins.dispatch_core.plugin:AwsAlbAuthProviderPlugin"
+dispatch_aws_sqs = "dispatch.plugins.dispatch_aws.plugin:AWSSQSSignalConsumerPlugin"
+dispatch_basic_auth = "dispatch.plugins.dispatch_core.plugin:BasicAuthProviderPlugin"
+dispatch_contact = "dispatch.plugins.dispatch_core.plugin:DispatchContactPlugin"
+dispatch_header_auth = "dispatch.plugins.dispatch_core.plugin:HeaderAuthProviderPlugin"
+dispatch_participant_resolver = "dispatch.plugins.dispatch_core.plugin:DispatchParticipantResolverPlugin"
+dispatch_pkce_auth = "dispatch.plugins.dispatch_core.plugin:PKCEAuthProviderPlugin"
+dispatch_ticket = "dispatch.plugins.dispatch_core.plugin:DispatchTicketPlugin"
+duo_auth_mfa = "dispatch.plugins.dispatch_duo.plugin:DuoMfaPlugin"
+generic_workflow = "dispatch.plugins.generic_workflow.plugin:GenericWorkflowPlugin"
+github_monitor = "dispatch.plugins.dispatch_github.plugin:GithubMonitorPlugin"
+google_calendar_conference = "dispatch.plugins.dispatch_google.calendar.plugin:GoogleCalendarConferencePlugin"
+google_docs_document = "dispatch.plugins.dispatch_google.docs.plugin:GoogleDocsDocumentPlugin"
+google_drive_storage = "dispatch.plugins.dispatch_google.drive.plugin:GoogleDriveStoragePlugin"
+google_drive_task = "dispatch.plugins.dispatch_google.drive.plugin:GoogleDriveTaskPlugin"
+google_gmail_email = "dispatch.plugins.dispatch_google.gmail.plugin:GoogleGmailEmailPlugin"
+google_groups_participants = "dispatch.plugins.dispatch_google.groups.plugin:GoogleGroupParticipantGroupPlugin"
+jira_ticket = "dispatch.plugins.dispatch_jira.plugin:JiraTicketPlugin"
+microsoft_teams_conference = "dispatch.plugins.dispatch_microsoft_teams.conference.plugin:MicrosoftTeamsConferencePlugin"
+openai_artificial_intelligence = "dispatch.plugins.dispatch_openai.plugin:OpenAIPlugin"
+opsgenie_oncall = "dispatch.plugins.dispatch_opsgenie.plugin:OpsGenieOncallPlugin"
+pagerduty_oncall = "dispatch.plugins.dispatch_pagerduty.plugin:PagerDutyOncallPlugin"
+slack_contact = "dispatch.plugins.dispatch_slack.plugin:SlackContactPlugin"
+slack_conversation = "dispatch.plugins.dispatch_slack.plugin:SlackConversationPlugin"
+zoom_conference = "dispatch.plugins.dispatch_zoom.plugin:ZoomConferencePlugin"
+
+[project.urls]
+Homepage = "https://dispatch.io"
+Documentation = "https://dispatch.io/docs"
+Repository = "https://github.com/netflix/dispatch"
+Issues = "https://github.com/netflix/dispatch/issues"
+Changelog = "https://github.com/netflix/dispatch/releases"
+
+[tool.hatch.version]
+source = "versioningit"
+
+[tool.hatch.metadata]
+allow-direct-references = true
+
+[tool.hatch.build]
+packages = ["src/dispatch"]
+include = [
+ "/src/dispatch/static/**/*",
+ "/src/dispatch/**/*.py",
+]
+
+[tool.hatch.build.targets.wheel]
+packages = ["src/dispatch"]
+
+[tool.hatch.build.targets.sdist]
+include = [
+ "/src/dispatch",
+ "/tests",
+ "/src/dispatch/static/**/*",
+ "README.md",
+ "LICENSE",
+]
+
+[tool.versioningit]
+default-version = "0.1.0"
+
+[tool.versioningit.vcs]
+method = "git"
+match = ["v*"]
+
+[tool.versioningit.format]
+distance = "{base_version}.{distance}+{vcs}{rev}"
+dirty = "{base_version}.{distance}+d{build_date:%Y%m%d}"
+distance-dirty = "{base_version}.{distance}+{vcs}{rev}.d{build_date:%Y%m%d}"
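These versioningit templates can be previewed with plain string formatting. As a sketch with hypothetical field values (a repo 5 commits past the `v1.2.0` tag at commit `abc1234`; none of these values come from a real checkout):

```python
# Hypothetical versioningit fields for a non-dirty build with distance > 0.
fields = {"base_version": "1.2.0", "distance": 5, "vcs": "g", "rev": "abc1234"}

# The "distance" template from the [tool.versioningit.format] table above.
distance_fmt = "{base_version}.{distance}+{vcs}{rev}"
print(distance_fmt.format(**fields))  # → 1.2.0.5+gabc1234
```

The `dirty` and `distance-dirty` templates work the same way, additionally appending a `d{build_date}` local-version segment.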
+
+[tool.pytest.ini_options]
+python_files = ["test*.py"]
+addopts = [
+ "--tb=native",
+ "-p", "no:doctest",
+ "-p", "no:warnings"
+]
+norecursedirs = ["bin", "dist", "docs", "htmlcov", "script", "hooks", "node_modules", ".*"]
+testpaths = ["tests"]
+
+[tool.coverage.run]
+source = ["src", "tests"]
+omit = ["dispatch/migrations/*"]
+
+[tool.coverage.report]
+exclude_lines = [
+ "pragma: no cover",
+ "def __repr__",
+ "raise AssertionError",
+ "raise NotImplementedError",
+ "if __name__ == .__main__.:",
+ "if TYPE_CHECKING:",
+]
+
[tool.black]
line-length = 100
-target_version = ['py37']
+target-version = ['py311']
include = '\.pyi?$'
+
+[tool.ruff]
+# Exclude a variety of commonly ignored directories.
+exclude = [
+ ".bzr",
+ ".direnv",
+ ".eggs",
+ ".git",
+ ".hg",
+ ".mypy_cache",
+ ".nox",
+ ".pants.d",
+ ".ruff_cache",
+ ".svn",
+ ".tox",
+ ".venv",
+ "__pypackages__",
+ "_build",
+ "buck-out",
+ "build",
+ "dist",
+ "node_modules",
+ "venv",
+]
+
+# Same as Black.
+line-length = 100
+
+# Assume Python 3.11
+target-version = "py311"
+
+[tool.ruff.lint]
+select = [
+ "E", # pycodestyle errors
+ "W", # pycodestyle warnings
+ "F", # pyflakes
+ # "I", # isort
+ "C", # flake8-comprehensions
+ "B", # flake8-bugbear
+]
+ignore = [
+ "E501", # line too long, handled by black
+ "B008", # do not perform function calls in argument defaults
+ "C901", # complexity
+]
+
+# Allow autofix for all enabled rules (when `--fix`) is provided.
+fixable = ["A", "B", "C", "D", "E", "F"]
+unfixable = []
+
+# Allow unused variables when underscore-prefixed.
+dummy-variable-rgx = "^(_+|(_+[a-zA-Z0-9_]*[a-zA-Z0-9]+?))$"
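The `dummy-variable-rgx` pattern above only treats underscore-prefixed names as intentionally unused. A quick check of which names it accepts (same regex, exercised directly):

```python
import re

# The same pattern configured as dummy-variable-rgx above.
pattern = re.compile(r"^(_+|(_+[a-zA-Z0-9_]*[a-zA-Z0-9]+?))$")

assert pattern.match("_")          # bare underscore placeholder
assert pattern.match("__")         # runs of underscores also pass
assert pattern.match("_unused")    # underscore-prefixed names pass
assert not pattern.match("tmp")    # ordinary unused names are still flagged
```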
+
+[tool.ruff.lint.mccabe]
+# Unlike Flake8, default to a complexity level of 10.
+max-complexity = 10
+
+[tool.ruff.lint.isort]
+known-third-party = ["fastapi", "pydantic", "starlette"]
+
+[tool.ruff.lint.per-file-ignores]
+"tests/conftest.py" = ["E402"]
+"src/dispatch/entity/service.py" = ["W605"]
+
+[dependency-groups]
+netflix = [
+ "dispatch-internal-plugins",
+]
+
+[tool.uv.sources]
+dispatch-internal-plugins = { path = "../dispatch-internal-plugins", editable = true }
diff --git a/requirements-base.txt b/requirements-base.txt
deleted file mode 100644
index 78c3552cf9de..000000000000
--- a/requirements-base.txt
+++ /dev/null
@@ -1,35 +0,0 @@
-SQLAlchemy-Searchable
-aiofiles
-alembic
-arrow
-cachetools==3.1.1 # NOTE pinning for google-auth
-click
-email-validator
-emails
-fastapi
-google-api-python-client
-google-auth-oauthlib
-httpx
-jinja2
-jira
-joblib
-oauth2client
-psycopg2-binary
-pyparsing
-pypd # pagerduty plugin
-python-dateutil
-python-jose
-python-multipart
-pytz
-requests
-schedule
-sentry-asgi
-sentry_sdk
-sh
-slackclient
-spacy
-sqlalchemy
-sqlalchemy-filters
-tabulate
-tenacity
-uvicorn
diff --git a/requirements-dev.txt b/requirements-dev.txt
deleted file mode 100644
index 89818122ad41..000000000000
--- a/requirements-dev.txt
+++ /dev/null
@@ -1,9 +0,0 @@
-IPython
-black
-coverage
-devtools
-factory-boy
-faker
-flake8
-pytest
-vulture
diff --git a/requirements-metrics.txt b/requirements-metrics.txt
deleted file mode 100644
index 97259fddc1e3..000000000000
--- a/requirements-metrics.txt
+++ /dev/null
@@ -1,3 +0,0 @@
-numpy
-pystan
-fbprophet
diff --git a/scripts/verify-uv-setup.sh b/scripts/verify-uv-setup.sh
new file mode 100755
index 000000000000..fd26294412fe
--- /dev/null
+++ b/scripts/verify-uv-setup.sh
@@ -0,0 +1,113 @@
+#!/bin/bash
+# Verification script for the complete uv + pyproject.toml setup
+
+set -e
+
+echo "🧪 Verifying complete uv setup for Dispatch..."
+
+# Check if uv is installed
+if ! command -v uv &> /dev/null; then
+ echo "❌ uv is not installed. Please install it first:"
+ echo " curl -LsSf https://astral.sh/uv/install.sh | sh"
+ exit 1
+fi
+
+echo "✅ uv is installed: $(uv --version)"
+
+# Check if we're in the dispatch directory
+if [ ! -f "pyproject.toml" ]; then
+ echo "❌ Please run this script from the dispatch repository root"
+ exit 1
+fi
+
+echo "✅ Running from dispatch repository root"
+
+# Check pyproject.toml has project table
+echo "🔍 Checking pyproject.toml configuration..."
+if grep -q "^\[project\]" pyproject.toml; then
+ echo "✅ pyproject.toml has [project] table"
+else
+ echo "❌ pyproject.toml missing [project] table"
+ exit 1
+fi
+
+# Test modern uv commands
+echo "🔍 Testing modern uv commands..."
+
+# Test uv sync (dry run)
+echo " Testing uv sync..."
+if DISPATCH_LIGHT_BUILD=1 uv sync --dev --no-install-project --python 3.11 > /dev/null 2>&1; then
+ echo "✅ uv sync works correctly"
+else
+ echo "❌ uv sync failed"
+ exit 1
+fi
+
+# Test uv add (dry run with remove)
+echo " Testing uv add/remove..."
+if uv add --dev pytest-timeout --python 3.11 > /dev/null 2>&1; then
+ if uv remove --dev pytest-timeout > /dev/null 2>&1; then
+ echo "✅ uv add/remove works correctly"
+ else
+ echo "❌ uv remove failed"
+ exit 1
+ fi
+else
+ echo "❌ uv add failed"
+ exit 1
+fi
+
+# Test legacy pip install still works
+echo "🔍 Testing legacy pip install..."
+if DISPATCH_LIGHT_BUILD=1 uv pip install --dry-run -e . --python 3.11 > /dev/null 2>&1; then
+ echo "✅ Legacy uv pip install still works"
+else
+ echo "❌ Legacy uv pip install failed"
+ exit 1
+fi
+
+# Check if lock file exists or can be created
+echo "🔍 Checking lock file..."
+if [ -f "uv.lock" ]; then
+ echo "✅ uv.lock file exists"
+else
+ echo " Generating uv.lock..."
+ if DISPATCH_LIGHT_BUILD=1 uv lock --python 3.11 > /dev/null 2>&1; then
+ echo "✅ uv.lock generated successfully"
+ else
+ echo "❌ Failed to generate uv.lock"
+ exit 1
+ fi
+fi
+
+# Test dynamic versioning
+echo "🔍 Testing dynamic versioning..."
+if python -c "import importlib.metadata; version = importlib.metadata.version('dispatch'); print(f'Version: {version}'); assert version != 'unknown'" 2>/dev/null; then
+ echo "✅ Dynamic versioning works correctly"
+else
+ echo "❌ Dynamic versioning failed"
+ exit 1
+fi
+
+# Check if setup.py is disabled
+if [ -f "setup.py" ]; then
+ echo "⚠️ setup.py still exists (should be setup.py.bak)"
+ echo " Consider running: mv setup.py setup.py.bak"
+else
+ echo "✅ setup.py is disabled/removed"
+fi
+
+echo ""
+echo "🎉 All checks passed! Full uv migration is working correctly."
+echo ""
+echo "Modern workflow:"
+echo "1. Setup: DISPATCH_LIGHT_BUILD=1 uv sync --dev"
+echo "2. Add deps: uv add package-name"
+echo "3. Remove deps: uv remove package-name"
+echo "4. Update: uv sync --upgrade"
+echo "5. Lock: uv lock --upgrade"
+echo ""
+echo "Legacy workflow (still works):"
+echo "1. Create venv: uv venv --python 3.11"
+echo "2. Activate: source .venv/bin/activate"
+echo "3. Install: DISPATCH_LIGHT_BUILD=1 uv pip install -e \".[dev]\""
diff --git a/setup.cfg b/setup.cfg
deleted file mode 100644
index 64e3208e23bc..000000000000
--- a/setup.cfg
+++ /dev/null
@@ -1,45 +0,0 @@
-[tool:pytest]
-python_files = test*.py
-addopts = --tb=native -p no:doctest -p no:warnings
-norecursedirs = bin dist docs htmlcov script hooks node_modules .* {args}
-looponfailroots = src tests
-selenium_driver = chrome
-self-contained-html = true
-
-[flake8]
-# E203 false positive, see https://github.com/PyCQA/pycodestyle/issues/373
-ignore = F999,E203,E266,W403,F401,E501,E128,E124,E402,W503,W504,E731,C901,B007,B306,B009,B010
-exclude = .venv/.git,*/migrations/*,node_modules/*,docs/*
-max-line-length=100
-max-complexity = 18
-select = B,C,E,F,W,T4,B9
-
-# XXX: E501 is ignored, which disables line length checking.
-# Currently, the black formatter doesn't wrap long strings: https://github.com/psf/black/issues/182#issuecomment-385325274
-# We already have a lot of E501's - these are lines black didn't wrap.
-# But rather than append # noqa: E501 to all of them, we just ignore E501 for now.
-
-[bdist_wheel]
-python-tag = py38
-
-[coverage:run]
-omit =
- dispatch/migrations/*
-source =
- src
- tests
-
-[isort]
-line_length=100
-lines_between_sections=1
-multi_line_output=3
-include_trailing_comma=True
-force_grid_wrap=0
-use_parentheses=True
-known_first_party=dispatch
-default_section=THIRDPARTY
-indent=' '
-skip=setup.py,src/dispatch/models.py
-
-[black]
-line_length=100
diff --git a/setup.py b/setup.py
deleted file mode 100644
index 0d528155a563..000000000000
--- a/setup.py
+++ /dev/null
@@ -1,423 +0,0 @@
-#!/usr/bin/env python
-import datetime
-import json
-import os
-import os.path
-import shutil
-import sys
-import traceback
-from distutils import log
-from distutils.command.build import build as BuildCommand
-from distutils.core import Command
-from subprocess import check_output
-
-from setuptools import find_packages, setup
-from setuptools.command.develop import develop as DevelopCommand
-from setuptools.command.sdist import sdist as SDistCommand
-
-ROOT_PATH = os.path.abspath(os.path.dirname(__file__))
-
-
-# modified from:
-# https://raw.githubusercontent.com/getsentry/sentry/055cfe74bb88bbb2083f37f5df21b91d0ef4f9a7/src/sentry/utils/distutils/commands/base.py
-class BaseBuildCommand(Command):
- user_options = [
- ("work-path=", "w", "The working directory for source files. Defaults to ."),
- ("build-lib=", "b", "directory for script runtime modules"),
- (
- "inplace",
- "i",
- "ignore build-lib and put compiled javascript files into the source "
- + "directory alongside your pure Python modules",
- ),
- (
- "force",
- "f",
- "Force rebuilding of static content. Defaults to rebuilding on version "
- "change detection.",
- ),
- ]
-
- boolean_options = ["force"]
-
- def initialize_options(self):
- self.build_lib = None
- self.force = None
- self.work_path = os.path.join(ROOT_PATH, "src/dispatch/static/dispatch")
- self.inplace = None
-
- def get_root_path(self):
- return os.path.abspath(os.path.dirname(sys.modules["__main__"].__file__))
-
- def get_dist_paths(self):
- return []
-
- def get_manifest_additions(self):
- return []
-
- def finalize_options(self):
- # This requires some explanation. Basically what we want to do
- # here is to control if we want to build in-place or into the
- # build-lib folder. Traditionally this is set by the `inplace`
- # command line flag for build_ext. However as we are a subcommand
- # we need to grab this information from elsewhere.
- #
- # An in-place build puts the files generated into the source
- # folder, a regular build puts the files into the build-lib
- # folder.
- #
- # The following situations we need to cover:
- #
- # command default in-place
- # setup.py build_js 0
- # setup.py build_ext value of in-place for build_ext
- # setup.py build_ext --inplace 1
- # pip install --editable . 1
- # setup.py install 0
- # setup.py sdist 0
- # setup.py bdist_wheel 0
- #
- # The way this is achieved is that build_js is invoked by two
- # subcommands: bdist_ext (which is in our case always executed
- # due to a custom distribution) or sdist.
- #
-    # Note: at one point install was an in-place build, but it's not
-    # quite clear why. In case a version of install breaks again:
- # installations via pip from git URLs definitely require the
- # in-place flag to be disabled. So we might need to detect
- # that separately.
- #
- # To find the default value of the inplace flag we inspect the
- # sdist and build_ext commands.
- sdist = self.distribution.get_command_obj("sdist")
- build_ext = self.get_finalized_command("build_ext")
-
- # If we are not decided on in-place we are inplace if either
- # build_ext is inplace or we are invoked through the install
- # command (easiest check is to see if it's finalized).
- if self.inplace is None:
- self.inplace = (build_ext.inplace or sdist.finalized) and 1 or 0
-
- # If we're coming from sdist, clear the hell out of the dist
- # folder first.
- if sdist.finalized:
- for path in self.get_dist_paths():
- try:
- shutil.rmtree(path)
- except (OSError, IOError):
- pass
-
- # In place means build_lib is src. We also log this.
- if self.inplace:
- log.debug("in-place js building enabled")
- self.build_lib = "src"
- # Otherwise we fetch build_lib from the build command.
- else:
- self.set_undefined_options("build", ("build_lib", "build_lib"))
- log.debug("regular js build: build path is %s" % self.build_lib)
-
- if self.work_path is None:
- self.work_path = self.get_root_path()
-
- def _needs_built(self):
- for path in self.get_dist_paths():
- if not os.path.isdir(path):
- return True
- return False
-
- def _setup_git(self):
- work_path = self.work_path
-
- if os.path.exists(os.path.join(work_path, ".git")):
- log.info("initializing git submodules")
- self._run_command(["git", "submodule", "init"])
- self._run_command(["git", "submodule", "update"])
-
- def _setup_js_deps(self):
- node_version = None
- try:
- node_version = self._run_command(["node", "--version"]).decode("utf-8").rstrip()
- except OSError:
- log.fatal("Cannot find node executable. Please install node" " and try again.")
- sys.exit(1)
-
- if node_version[2] is not None:
- log.info("using node ({0})".format(node_version))
- self._run_npm_command(["install"])
- self._run_npm_command(["run", "build", "--quiet"])
-
- def _run_command(self, cmd, env=None):
- cmd_str = " ".join(cmd)
- log.debug(f"running [{cmd_str}]")
- try:
- return check_output(cmd, cwd=self.work_path, env=env)
- except Exception:
- log.error(f"command failed [{cmd_str}] via [{self.work_path}]")
- raise
-
- def _run_npm_command(self, cmd, env=None):
- self._run_command(["npm"] + cmd, env=env)
-
- def update_manifests(self):
- # if we were invoked from sdist, we need to inform sdist about
- # which files we just generated. Otherwise they will be missing
- # in the manifest. This adds the files for what webpack generates
- # plus our own assets.json file.
- sdist = self.distribution.get_command_obj("sdist")
- if not sdist.finalized:
- return
-
- # The path down from here only works for sdist:
-
- # Use the underlying file list so that we skip the file-exists
- # check which we do not want here.
- files = sdist.filelist.files
- base = os.path.abspath(".")
-
- # We need to split off the local parts of the files relative to
- # the current folder. This will chop off the right path for the
- # manifest.
- for path in self.get_dist_paths():
- for dirname, _, filenames in os.walk(os.path.abspath(path)):
- for filename in filenames:
- filename = os.path.join(dirname, filename)
- files.append(filename[len(base) :].lstrip(os.path.sep))
-
- for file in self.get_manifest_additions():
- files.append(file)
-
- def run(self):
- if self.force or self._needs_built():
- self._setup_git()
- self._setup_js_deps()
- self._build()
- self.update_manifests()
-
-
-class BuildAssetsCommand(BaseBuildCommand):
- user_options = BaseBuildCommand.user_options + [
- (
- "asset-json-path=",
- None,
- "Relative path for JSON manifest. Defaults to {dist_name}/assets.json",
- ),
- (
- "inplace",
- "i",
- "ignore build-lib and put compiled javascript files into the source "
- + "directory alongside your pure Python modules",
- ),
- (
- "force",
- "f",
- "Force rebuilding of static content. Defaults to rebuilding on version "
- "change detection.",
- ),
- ]
-
- description = "build static media assets"
-
- def initialize_options(self):
- self.work_path = os.path.join(ROOT_PATH, "src/dispatch/static/dispatch")
- self.asset_json_path = os.path.join(self.work_path, "assets.json")
- BaseBuildCommand.initialize_options(self)
-
- def get_dist_paths(self):
- return [os.path.join(self.work_path, "/dist")]
-
- def get_manifest_additions(self):
- return (self.asset_json_path,)
-
- def _get_package_version(self):
- """
- Attempt to get the most correct current version of Dispatch.
- """
- pkg_path = os.path.join(ROOT_PATH, "src")
-
- sys.path.insert(0, pkg_path)
- try:
- import dispatch
- except Exception:
- version = None
- build = None
- else:
- log.info(f"pulled version information from 'dispatch' module. {dispatch.__file__}")
- version = self.distribution.get_version()
- build = dispatch.__build__
- finally:
- sys.path.pop(0)
-
- if not (version and build):
- json_path = self.get_asset_json_path()
- try:
- with open(json_path) as fp:
- data = json.loads(fp.read())
- except Exception:
- pass
- else:
- log.info("pulled version information from '{}'".format(json_path))
- version, build = data["version"], data["build"]
-
- return {"version": version, "build": build}
-
- def _needs_static(self, version_info):
- json_path = self.get_asset_json_path()
- if not os.path.exists(json_path):
- return True
-
- with open(json_path) as fp:
- data = json.load(fp)
- if data.get("version") != version_info.get("version"):
- return True
- if data.get("build") != version_info.get("build"):
- return True
- return False
-
- def _needs_built(self):
- if BaseBuildCommand._needs_built(self):
- return True
- version_info = self._get_package_version()
- return self._needs_static(version_info)
-
- def _build(self):
- version_info = self._get_package_version()
- log.info(
- "building assets for {} v{} (build {})".format(
- self.distribution.get_name(),
- version_info["version"] or "UNKNOWN",
- version_info["build"] or "UNKNOWN",
- )
- )
- if not version_info["version"] or not version_info["build"]:
- log.fatal("Could not determine dispatch version or build")
- sys.exit(1)
-
- try:
- self._build_static()
- except Exception:
- traceback.print_exc()
- log.fatal("unable to build Dispatch's static assets!")
- sys.exit(1)
-
- log.info("writing version manifest")
- manifest = self._write_version_file(version_info)
- log.info("recorded manifest\n{}".format(json.dumps(manifest, indent=2)))
-
- def _build_static(self):
- # By setting NODE_ENV=production, a few things happen
- # * Vue optimizes out certain code paths
- # * Webpack will add version strings to built/referenced assets
- env = dict(os.environ)
- env["DISPATCH_STATIC_DIST_PATH"] = self.dispatch_static_dist_path
- env["NODE_ENV"] = "production"
- # TODO: Our JS builds should not require 4GB heap space
- env["NODE_OPTIONS"] = (
- (env.get("NODE_OPTIONS", "") + " --max-old-space-size=4096")
- ).lstrip()
- # self._run_npm_command(["webpack", "--bail"], env=env)
-
- def _write_version_file(self, version_info):
- manifest = {
- "createdAt": datetime.datetime.utcnow().isoformat() + "Z",
- "version": version_info["version"],
- "build": version_info["build"],
- }
- with open(self.get_asset_json_path(), "w") as fp:
- json.dump(manifest, fp)
- return manifest
-
- @property
- def dispatch_static_dist_path(self):
- return os.path.abspath(os.path.join(self.build_lib, "src/static/dispatch/dist"))
-
- def get_asset_json_path(self):
- return os.path.abspath(os.path.join(self.build_lib, self.asset_json_path))
-
-
-VERSION = "0.1.0.dev0"
-IS_LIGHT_BUILD = os.environ.get("DISPATCH_LIGHT_BUILD") == "1"
-
-
-def get_requirements(env):
- with open("requirements-{}.txt".format(env)) as fp:
- return [x.strip() for x in fp.read().split("\n") if not x.startswith("#")]
-
-
-install_requires = get_requirements("base")
-dev_requires = get_requirements("dev")
-metrics_requires = get_requirements("metrics")
-
-
-class DispatchSDistCommand(SDistCommand):
- # If we are not a light build we want to also execute build_assets as
- # part of our source build pipeline.
- if not IS_LIGHT_BUILD:
- sub_commands = SDistCommand.sub_commands + [("build_assets", None)]
-
-
-class DispatchBuildCommand(BuildCommand):
- def run(self):
- if not IS_LIGHT_BUILD:
- self.run_command("build_assets")
- BuildCommand.run(self)
-
-
-class DispatchDevelopCommand(DevelopCommand):
- def run(self):
- DevelopCommand.run(self)
- if not IS_LIGHT_BUILD:
- self.run_command("build_assets")
-
-
-cmdclass = {
- "sdist": DispatchSDistCommand,
- "develop": DispatchDevelopCommand,
- "build": DispatchBuildCommand,
- "build_assets": BuildAssetsCommand,
-}
-
-# Get the long description from the README file
-with open(os.path.join(ROOT_PATH, "README.md"), encoding="utf-8") as f:
- long_description = f.read()
-
-setup(
- name="dispatch",
- version=VERSION,
- long_description=long_description,
- long_description_content_type="text/markdown",
- author="Netflix, Inc.",
- classifiers=[ # Optional
- "Development Status :: 3 - Alpha",
- "Intended Audience :: Developers",
- "License :: OSI Approved :: Apache",
- "Programming Language :: Python :: 3.7",
- "Programming Language :: Python :: 3.8",
- ],
- package_dir={"": "src"},
- packages=find_packages("src"),
- python_requires=">=3.7",
- install_requires=install_requires,
- extras_require={"dev": dev_requires, "metrics": metrics_requires},
- cmdclass=cmdclass,
- zip_save=False,
- include_package_data=True,
- entry_points={
- "console_scripts": ["dispatch = dispatch.cli:entrypoint"],
- "dispatch.plugins": [
- "dispatch_participants = dispatch.plugins.dispatch_core.plugin:DispatchParticipantPlugin",
- "dispatch_document_resolver = dispatch.plugins.dispatch_core.plugin:DispatchDocumentResolverPlugin",
- "dispatch_pkce_auth = dispatch.plugins.dispatch_core.plugin:PKCEAuthProviderPlugin",
- "google_docs_document = dispatch.plugins.dispatch_google.docs.plugin:GoogleDocsDocumentPlugin",
- "google_drive_storage = dispatch.plugins.dispatch_google.drive.plugin:GoogleDriveStoragePlugin",
- "google_drive_task = dispatch.plugins.dispatch_google.drive.plugin:GoogleDriveTaskPlugin",
- "google_gmail_conversation = dispatch.plugins.dispatch_google.gmail.plugin:GoogleGmailConversationPlugin",
- "google_groups_participants = dispatch.plugins.dispatch_google.groups.plugin:GoogleGroupParticipantGroupPlugin",
- "google_calendar_conference = dispatch.plugins.dispatch_google.calendar.plugin:GoogleCalendarConferencePlugin",
- "zoom_conference = dispatch.plugins.dispatch_zoom.plugin:ZoomConferencePlugin",
- "jira_ticket = dispatch.plugins.dispatch_jira.plugin:JiraTicketPlugin",
- "pagerduty_oncall = dispatch.plugins.dispatch_pagerduty.plugin:PagerDutyOncallPlugin",
- "slack_conversation = dispatch.plugins.dispatch_slack.plugin:SlackConversationPlugin",
- "slack_contact = dispatch.plugins.dispatch_slack.plugin:SlackContactPlugin",
- ],
- },
-)
diff --git a/src/dispatch/__init__.py b/src/dispatch/__init__.py
index e6ccd5afedd6..aef1a7b3550d 100644
--- a/src/dispatch/__init__.py
+++ b/src/dispatch/__init__.py
@@ -1,12 +1,90 @@
import os
import os.path
+import traceback
from subprocess import check_output
try:
- VERSION = __import__("pkg_resources").get_distribution("dispatch").version
+ from importlib.metadata import version
+
+ VERSION = version("dispatch")
except Exception:
VERSION = "unknown"
+# fix is in the works see: https://github.com/mpdavis/python-jose/pull/207
+import warnings
+
+warnings.filterwarnings("ignore", message="int_from_bytes is deprecated")
+
+# sometimes we pull version info before dispatch is totally installed
+try:
+ from dispatch.ai.prompt.models import Prompt # noqa lgtm[py/unused-import]
+ from dispatch.organization.models import Organization # noqa lgtm[py/unused-import]
+ from dispatch.project.models import Project # noqa lgtm[py/unused-import]
+ from dispatch.route.models import Recommendation # noqa lgtm[py/unused-import]
+ from dispatch.conference.models import Conference # noqa lgtm[py/unused-import]
+ from dispatch.conversation.models import Conversation # noqa lgtm[py/unused-import]
+ from dispatch.cost_model.models import (
+ CostModel, # noqa lgtm[py/unused-import]
+ CostModelActivity, # noqa lgtm[py/unused-import]
+ )
+ from dispatch.definition.models import Definition # noqa lgtm[py/unused-import]
+ from dispatch.document.models import Document # noqa lgtm[py/unused-import]
+ from dispatch.event.models import Event # noqa lgtm[py/unused-import]
+ from dispatch.incident.models import Incident # noqa lgtm[py/unused-import]
+ from dispatch.monitor.models import Monitor # noqa lgtm[py/unused-import]
+ from dispatch.feedback.incident.models import Feedback # noqa lgtm[py/unused-import]
+ from dispatch.feedback.service.models import ServiceFeedback # noqa lgtm[py/unused-import]
+ from dispatch.group.models import Group # noqa lgtm[py/unused-import]
+ from dispatch.incident_cost.models import IncidentCost # noqa lgtm[py/unused-import]
+ from dispatch.incident_cost_type.models import IncidentCostType # noqa lgtm[py/unused-import]
+ from dispatch.incident_role.models import IncidentRole # noqa lgtm[py/unused-import]
+ from dispatch.incident.priority.models import IncidentPriority # noqa lgtm[py/unused-import]
+ from dispatch.incident.severity.models import IncidentSeverity # noqa lgtm[py/unused-import]
+ from dispatch.incident.type.models import IncidentType # noqa lgtm[py/unused-import]
+ from dispatch.individual.models import IndividualContact # noqa lgtm[py/unused-import]
+ from dispatch.notification.models import Notification # noqa lgtm[py/unused-import]
+ from dispatch.participant.models import Participant # noqa lgtm[py/unused-import]
+ from dispatch.participant_activity.models import (
+ ParticipantActivity, # noqa lgtm[py/unused-import]
+ )
+ from dispatch.participant_role.models import ParticipantRole # noqa lgtm[py/unused-import]
+ from dispatch.plugin.models import Plugin, PluginEvent # noqa lgtm[py/unused-import]
+ from dispatch.report.models import Report # noqa lgtm[py/unused-import]
+ from dispatch.service.models import Service # noqa lgtm[py/unused-import]
+ from dispatch.storage.models import Storage # noqa lgtm[py/unused-import]
+ from dispatch.tag.models import Tag # noqa lgtm[py/unused-import]
+ from dispatch.tag_type.models import TagType # noqa lgtm[py/unused-import]
+ from dispatch.task.models import Task # noqa lgtm[py/unused-import]
+ from dispatch.team.models import TeamContact # noqa lgtm[py/unused-import]
+ from dispatch.term.models import Term # noqa lgtm[py/unused-import]
+ from dispatch.ticket.models import Ticket # noqa lgtm[py/unused-import]
+ from dispatch.workflow.models import Workflow # noqa lgtm[py/unused-import]
+ from dispatch.data.source.status.models import SourceStatus # noqa lgtm[py/unused-import]
+ from dispatch.data.source.transport.models import SourceTransport # noqa lgtm[py/unused-import]
+ from dispatch.data.source.type.models import SourceType # noqa lgtm[py/unused-import]
+ from dispatch.data.alert.models import Alert # noqa lgtm[py/unused-import]
+ from dispatch.data.query.models import Query # noqa lgtm[py/unused-import]
+ from dispatch.data.source.models import Source # noqa lgtm[py/unused-import]
+ from dispatch.search_filter.models import SearchFilter # noqa lgtm[py/unused-import]
+ from dispatch.case.models import Case # noqa lgtm[py/unused-import]
+ from dispatch.case.priority.models import CasePriority # noqa lgtm[py/unused-import]
+ from dispatch.case.severity.models import CaseSeverity # noqa lgtm[py/unused-import]
+ from dispatch.case.type.models import CaseType # noqa lgtm[py/unused-import]
+ from dispatch.case_cost.models import CaseCost # noqa lgtm[py/unused-import]
+ from dispatch.case_cost_type.models import CaseCostType # noqa lgtm[py/unused-import]
+ from dispatch.signal.models import Signal # noqa lgtm[py/unused-import]
+ from dispatch.feedback.service.reminder.models import (
+ ServiceFeedbackReminder, # noqa lgtm[py/unused-import]
+ )
+ from dispatch.forms.type.models import FormsType # noqa lgtm[py/unused-import]
+ from dispatch.forms.models import Forms # noqa lgtm[py/unused-import]
+ from dispatch.email_templates.models import EmailTemplates # noqa lgtm[py/unused-import]
+ from dispatch.canvas.models import Canvas # noqa lgtm[py/unused-import]
+
+
+except Exception:
+ traceback.print_exc()
+
def _get_git_revision(path):
if not os.path.exists(os.path.join(path, ".git")):
diff --git a/src/dispatch/incident_priority/__init__.py b/src/dispatch/ai/__init__.py
similarity index 100%
rename from src/dispatch/incident_priority/__init__.py
rename to src/dispatch/ai/__init__.py
diff --git a/src/dispatch/ai/constants.py b/src/dispatch/ai/constants.py
new file mode 100644
index 000000000000..72508dc00afb
--- /dev/null
+++ b/src/dispatch/ai/constants.py
@@ -0,0 +1,5 @@
+# Cache duration for AI-generated read-in summaries (in seconds)
+READ_IN_SUMMARY_CACHE_DURATION = 120 # 2 minutes
+
+# Tactical report generation reference in Slack
+TACTICAL_REPORT_SLACK_ACTION = "tactical_report_genai"
diff --git a/src/dispatch/ai/enums.py b/src/dispatch/ai/enums.py
new file mode 100644
index 000000000000..530b19d7d509
--- /dev/null
+++ b/src/dispatch/ai/enums.py
@@ -0,0 +1,43 @@
+from dispatch.enums import DispatchEnum
+from enum import IntEnum
+
+
+class AIEventSource(DispatchEnum):
+ """Source identifiers for AI-generated events."""
+
+ dispatch_genai = "Dispatch GenAI"
+
+
+class AIEventDescription(DispatchEnum):
+ """Description templates for AI-generated events."""
+
+ read_in_summary_created = "AI-generated read-in summary created for {participant_email}"
+
+ tactical_report_created = "AI-generated tactical report created for incident {incident_name}"
+
+
+class GenAIType(IntEnum):
+ """GenAI prompt types for different AI operations."""
+
+ TAG_RECOMMENDATION = 1
+ INCIDENT_SUMMARY = 2
+ SIGNAL_ANALYSIS = 3
+ CONVERSATION_SUMMARY = 4
+ TACTICAL_REPORT_SUMMARY = 5
+
+ @property
+ def display_name(self) -> str:
+ """Get the human-friendly display name for the type."""
+ display_names = {
+ self.TAG_RECOMMENDATION: "Tag Recommendation",
+ self.INCIDENT_SUMMARY: "Incident Summary",
+ self.SIGNAL_ANALYSIS: "Signal Analysis",
+ self.CONVERSATION_SUMMARY: "Conversation Summary",
+ self.TACTICAL_REPORT_SUMMARY: "Tactical Report Summary",
+ }
+ return display_names.get(self, f"Unknown Type ({self.value})")
+
+ @classmethod
+ def get_all_types(cls) -> list[dict]:
+ """Get all types with their IDs and display names."""
+ return [{"id": type_enum.value, "name": type_enum.display_name} for type_enum in cls]
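The `GenAIType` enum above maps stable integer IDs (the values stored in the `Prompt.genai_type` column) to human-friendly names for the UI. A trimmed, standalone sketch of the same pattern (two members only, without the `DispatchEnum` dependency):

```python
from enum import IntEnum


# Standalone sketch of the GenAIType pattern above, trimmed to two members;
# the real enum lives in dispatch.ai.enums.
class GenAIType(IntEnum):
    TAG_RECOMMENDATION = 1
    INCIDENT_SUMMARY = 2

    @property
    def display_name(self) -> str:
        # Fall back to a labeled unknown rather than raising on new values.
        names = {
            self.TAG_RECOMMENDATION: "Tag Recommendation",
            self.INCIDENT_SUMMARY: "Incident Summary",
        }
        return names.get(self, f"Unknown Type ({self.value})")

    @classmethod
    def get_all_types(cls) -> list[dict]:
        return [{"id": t.value, "name": t.display_name} for t in cls]


print(GenAIType(2).display_name)  # Incident Summary
print(GenAIType.get_all_types())
```

Because the member values are plain ints, they round-trip cleanly through the database column and the `/genai-types` endpoint payload.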
diff --git a/src/dispatch/ai/exceptions.py b/src/dispatch/ai/exceptions.py
new file mode 100644
index 000000000000..3e2dbc7b91e3
--- /dev/null
+++ b/src/dispatch/ai/exceptions.py
@@ -0,0 +1,5 @@
+from dispatch.exceptions import DispatchException
+
+
+class GenAIException(DispatchException):
+ pass
diff --git a/src/dispatch/ai/models.py b/src/dispatch/ai/models.py
new file mode 100644
index 000000000000..976b57df796b
--- /dev/null
+++ b/src/dispatch/ai/models.py
@@ -0,0 +1,116 @@
+from pydantic import Field
+
+from dispatch.models import DispatchBase
+from dispatch.tag.models import TagTypeRecommendation
+
+
+class TagRecommendations(DispatchBase):
+ """
+ Model for structured tag recommendations output from AI analysis.
+
+ This model ensures the AI response contains properly structured tag recommendations
+ grouped by tag type.
+ """
+
+ recommendations: list[TagTypeRecommendation] = Field(
+ description="List of tag recommendations grouped by tag type", default_factory=list
+ )
+
+
+class ReadInSummary(DispatchBase):
+ """
+ Model for structured read-in summary output from AI analysis.
+
+ This model ensures the AI response is properly structured with timeline,
+ actions taken, and current status sections.
+ """
+
+ timeline: list[str] = Field(
+ description="Chronological list of key events and decisions", default_factory=list
+ )
+ actions_taken: list[str] = Field(
+ description="List of actions that were taken to address the security event",
+ default_factory=list,
+ )
+ current_status: str = Field(
+ description="Current status of the security event and any unresolved issues", default=""
+ )
+ summary: str = Field(description="Overall summary of the security event", default="")
+
+
+class ReadInSummaryResponse(DispatchBase):
+ """
+ Response model for read-in summary generation.
+
+ Includes the structured summary and any error messages.
+ """
+
+ summary: ReadInSummary | None = None
+ error_message: str | None = None
+
+
+class TacticalReport(DispatchBase):
+ """
+ Model for structured tactical report output from AI analysis. Enforces the presence of fields
+ dedicated to the incident's conditions, actions, and needs.
+ """
+
+ conditions: str = Field(
+ description="Summary of incident circumstances, with focus on scope and impact", default=""
+ )
+ actions: list[str] = Field(
+ description=(
+ "Chronological list of actions and analysis by both the party instigating "
+ "the incident and the response team"
+ ),
+ default_factory=list,
+ )
+ needs: list[str] = Field(
+ description=(
+ "Identified and unresolved action items from the incident, or an indication "
+ "that the incident is at resolution"
+ ),
+ default_factory=list,
+ )
+
+
+class TacticalReportResponse(DispatchBase):
+ """
+ Response model for tactical report generation. Includes the structured summary and any error messages.
+ """
+
+ tactical_report: TacticalReport | None = None
+ error_message: str | None = None
+
+
+class CaseSignalSummary(DispatchBase):
+ """
+ Model for structured case signal summary output from AI analysis.
+
+ This model represents the specific structure expected from the GenAI signal analysis prompt.
+ """
+
+ summary: str = Field(
+ description="4-5 sentence summary of the security event using precise, factual language",
+ default="",
+ )
+ historical_summary: str = Field(
+ description="2-3 sentence summary of historical cases for this signal", default=""
+ )
+ critical_analysis: str = Field(
+ description="Critical analysis considering false positive scenarios", default=""
+ )
+ recommendation: str = Field(
+ description="Recommended next steps based on the analysis", default=""
+ )
+
+
+class CaseSignalSummaryResponse(DispatchBase):
+ """
+ Response model for case signal summary generation.
+
+ Includes the structured summary and any error messages.
+ """
+
+ summary: CaseSignalSummary | None = None
+ error_message: str | None = None
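These models exist to coerce free-form LLM output into a fixed schema with safe defaults. A dependency-free sketch of that round trip, with a dataclass standing in for the pydantic-based `ReadInSummary` and a hypothetical LLM response:

```python
import json
from dataclasses import dataclass, field


# Dataclass stand-in for the ReadInSummary model above (the real model is
# pydantic-based via DispatchBase); field names and defaults mirror the model.
@dataclass
class ReadInSummary:
    timeline: list = field(default_factory=list)
    actions_taken: list = field(default_factory=list)
    current_status: str = ""
    summary: str = ""


# Hypothetical LLM response, already serialized as JSON.
raw = """{
    "timeline": ["10:02 alert fired", "10:05 case opened"],
    "actions_taken": ["isolated the host"],
    "current_status": "contained",
    "summary": "Suspected credential dumping on host1."
}"""

summary = ReadInSummary(**json.loads(raw))
print(summary.current_status)  # contained
print(len(summary.timeline))   # 2
```

The defaults matter: if the model omits a section, downstream consumers still see empty lists and strings rather than missing attributes.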
diff --git a/src/dispatch/ai/prompt/__init__.py b/src/dispatch/ai/prompt/__init__.py
new file mode 100644
index 000000000000..755204fd7c62
--- /dev/null
+++ b/src/dispatch/ai/prompt/__init__.py
@@ -0,0 +1 @@
+# This file makes the prompt directory a Python package
diff --git a/src/dispatch/ai/prompt/models.py b/src/dispatch/ai/prompt/models.py
new file mode 100644
index 000000000000..d277c17b5c80
--- /dev/null
+++ b/src/dispatch/ai/prompt/models.py
@@ -0,0 +1,65 @@
+from datetime import datetime
+from sqlalchemy import Column, Integer, String, Boolean, UniqueConstraint
+
+from dispatch.models import DispatchBase
+from dispatch.database.core import Base
+from dispatch.models import TimeStampMixin, ProjectMixin, Pagination, PrimaryKey
+from dispatch.project.models import ProjectRead
+
+
+class Prompt(Base, TimeStampMixin, ProjectMixin):
+ """
+ SQLAlchemy model for AI prompts.
+
+ This model stores AI prompts that can be used for various GenAI operations
+ like tag recommendations, incident summaries, etc.
+ """
+
+ # Columns
+ id = Column(Integer, primary_key=True)
+ genai_type = Column(Integer, nullable=False)
+ genai_prompt = Column(String, nullable=False)
+ genai_system_message = Column(String, nullable=True)
+ enabled = Column(Boolean, default=False, nullable=False)
+
+ # Constraints
+ __table_args__ = (
+ UniqueConstraint(
+ "genai_type",
+ "project_id",
+ "enabled",
+ name="uq_prompt_type_project_enabled",
+ deferrable=True,
+ initially="DEFERRED",
+ ),
+ )
+
+
+# AI Prompt Models
+class PromptBase(DispatchBase):
+ genai_type: int | None = None
+ genai_prompt: str | None = None
+ genai_system_message: str | None = None
+ enabled: bool | None = None
+
+
+class PromptCreate(PromptBase):
+ project: ProjectRead | None = None
+
+
+class PromptUpdate(DispatchBase):
+ genai_type: int | None = None
+ genai_prompt: str | None = None
+ genai_system_message: str | None = None
+ enabled: bool | None = None
+
+
+class PromptRead(PromptBase):
+ id: PrimaryKey
+ created_at: datetime | None = None
+ updated_at: datetime | None = None
+ project: ProjectRead | None = None
+
+
+class PromptPagination(Pagination):
+ items: list[PromptRead]
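`PromptUpdate` makes every field optional so callers can send partial updates; the service layer then applies only the fields the caller actually set (pydantic's `dict(exclude_unset=True)`). A minimal sketch of that merge, with a plain dict standing in for the provided fields:

```python
# Sketch of the partial-update merge enabled by PromptUpdate's all-optional
# fields; update_data stands in for prompt_in.dict(exclude_unset=True).
prompt = {"genai_type": 2, "genai_prompt": "Summarize the incident.", "enabled": False}

update_data = {"enabled": True}  # the caller only sent "enabled"

# Apply only the provided fields, leaving everything else untouched.
for field_name, value in update_data.items():
    prompt[field_name] = value

print(prompt["enabled"])     # True
print(prompt["genai_type"])  # 2
```

This is why `PromptUpdate` cannot distinguish "set to None" from "not provided" for any field without `exclude_unset`: both would look identical in a plain `dict()` dump.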
diff --git a/src/dispatch/ai/prompt/service.py b/src/dispatch/ai/prompt/service.py
new file mode 100644
index 000000000000..ed4d2b947440
--- /dev/null
+++ b/src/dispatch/ai/prompt/service.py
@@ -0,0 +1,99 @@
+import logging
+from sqlalchemy.orm import Session
+
+from dispatch.project import service as project_service
+from dispatch.ai.enums import GenAIType
+from .models import Prompt, PromptCreate, PromptUpdate
+
+log = logging.getLogger(__name__)
+
+
+def get(*, prompt_id: int, db_session: Session) -> Prompt | None:
+ """Gets a prompt by its id."""
+ return db_session.query(Prompt).filter(Prompt.id == prompt_id).one_or_none()
+
+
+def get_by_type(*, genai_type: int, project_id: int, db_session: Session) -> Prompt | None:
+ """Gets an enabled prompt by its type."""
+ return (
+ db_session.query(Prompt)
+ .filter(Prompt.project_id == project_id)
+ .filter(Prompt.genai_type == genai_type)
+ .filter(Prompt.enabled)
+ .first()
+ )
+
+
+def get_all(*, db_session: Session) -> list[Prompt]:
+ """Gets all prompts."""
+ return db_session.query(Prompt).all()
+
+
+def create(*, prompt_in: PromptCreate, db_session: Session) -> Prompt:
+ """Creates prompt data."""
+ project = project_service.get_by_name_or_raise(
+ db_session=db_session, project_in=prompt_in.project
+ )
+
+ # If this prompt is being enabled, check if another enabled prompt of the same type exists
+ if prompt_in.enabled:
+ existing_enabled = (
+ db_session.query(Prompt)
+ .filter(Prompt.project_id == project.id)
+ .filter(Prompt.genai_type == prompt_in.genai_type)
+ .filter(Prompt.enabled)
+ .first()
+ )
+ if existing_enabled:
+ type_name = GenAIType(prompt_in.genai_type).display_name
+ raise ValueError(
+ f"Another prompt of type '{type_name}' is already enabled for this project. "
+ "Only one prompt per type can be enabled."
+ )
+
+ prompt = Prompt(**prompt_in.dict(exclude={"project"}), project=project)
+
+ db_session.add(prompt)
+ db_session.commit()
+ return prompt
+
+
+def update(
+ *,
+ prompt: Prompt,
+ prompt_in: PromptUpdate,
+ db_session: Session,
+) -> Prompt:
+ """Updates a prompt."""
+ update_data = prompt_in.dict(exclude_unset=True)
+
+ # If this prompt is being enabled, check if another enabled prompt of the same type exists
+ if update_data.get("enabled", False):
+ existing_enabled = (
+ db_session.query(Prompt)
+ .filter(Prompt.project_id == prompt.project_id)
+ .filter(Prompt.genai_type == prompt.genai_type)
+ .filter(Prompt.enabled)
+ .filter(Prompt.id != prompt.id) # Exclude current prompt
+ .first()
+ )
+ if existing_enabled:
+ type_name = GenAIType(prompt.genai_type).display_name
+ raise ValueError(
+ f"Another prompt of type '{type_name}' is already enabled for this project. "
+ "Only one prompt per type can be enabled."
+ )
+
+ # Update only the fields that were provided in the update data
+ for field, value in update_data.items():
+ setattr(prompt, field, value)
+
+ db_session.commit()
+ return prompt
+
+
+def delete(*, db_session, prompt_id: int):
+ """Deletes a prompt."""
+ prompt = db_session.query(Prompt).filter(Prompt.id == prompt_id).one_or_none()
+ db_session.delete(prompt)
+ db_session.commit()
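The enabled-prompt guard in `create()` and `update()` reduces to one invariant: at most one enabled prompt per `(project_id, genai_type)` pair. An in-memory sketch of that check (the real code runs it as a SQLAlchemy query against the `Prompt` table):

```python
# In-memory sketch of the "only one enabled prompt per type" invariant
# enforced above; each dict stands in for a Prompt row.
prompts = [
    {"id": 1, "project_id": 10, "genai_type": 2, "enabled": True},
    {"id": 2, "project_id": 10, "genai_type": 2, "enabled": False},
]


def can_enable(prompt_id: int, project_id: int, genai_type: int) -> bool:
    """True if no *other* prompt of this type is already enabled in the project."""
    return not any(
        p["enabled"]
        and p["project_id"] == project_id
        and p["genai_type"] == genai_type
        and p["id"] != prompt_id
        for p in prompts
    )


print(can_enable(2, 10, 2))  # False: prompt 1 already holds the enabled slot
print(can_enable(1, 10, 2))  # True: re-enabling the same prompt is a no-op
```

Excluding the current prompt's own id (the `Prompt.id != prompt.id` filter in `update()`) is what makes re-saving an already-enabled prompt succeed instead of tripping its own guard.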
diff --git a/src/dispatch/ai/prompt/views.py b/src/dispatch/ai/prompt/views.py
new file mode 100644
index 000000000000..1f3d011e239e
--- /dev/null
+++ b/src/dispatch/ai/prompt/views.py
@@ -0,0 +1,140 @@
+import logging
+from fastapi import APIRouter, HTTPException, status, Depends
+
+from sqlalchemy.exc import IntegrityError
+
+from dispatch.auth.permissions import (
+ SensitiveProjectActionPermission,
+ PermissionsDependency,
+)
+from dispatch.database.core import DbSession
+from dispatch.auth.service import CurrentUser
+from dispatch.database.service import search_filter_sort_paginate, CommonParameters
+from dispatch.models import PrimaryKey
+
+from .models import (
+ PromptRead,
+ PromptUpdate,
+ PromptPagination,
+ PromptCreate,
+)
+from .service import get, create, update, delete
+from dispatch.ai.strings import DEFAULT_PROMPTS, DEFAULT_SYSTEM_MESSAGES
+from dispatch.ai.enums import GenAIType
+
+log = logging.getLogger(__name__)
+router = APIRouter()
+
+
+@router.get("", response_model=PromptPagination)
+def get_prompts(commons: CommonParameters):
+ """Get all AI prompts, or only those matching a given search term."""
+ return search_filter_sort_paginate(model="Prompt", **commons)
+
+
+@router.get("/genai-types")
+def get_genai_types():
+ """Get all available GenAI types from the backend enum."""
+ return {"types": GenAIType.get_all_types()}
+
+
+@router.get("/defaults")
+def get_default_prompts():
+ """Get default prompts and system messages for different GenAI types."""
+ return {
+ "prompts": DEFAULT_PROMPTS,
+ "system_messages": DEFAULT_SYSTEM_MESSAGES,
+ }
+
+
+@router.get("/{prompt_id}", response_model=PromptRead)
+def get_prompt(db_session: DbSession, prompt_id: PrimaryKey):
+ """Get an AI prompt by its id."""
+ prompt = get(db_session=db_session, prompt_id=prompt_id)
+ if not prompt:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "An AI prompt with this id does not exist."}],
+ )
+ return prompt
+
+
+@router.post(
+ "",
+ response_model=PromptRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def create_prompt(
+ db_session: DbSession,
+ prompt_in: PromptCreate,
+ current_user: CurrentUser,
+):
+ """Create a new AI prompt."""
+ try:
+ return create(db_session=db_session, prompt_in=prompt_in)
+ except ValueError as e:
+ raise HTTPException(
+ status_code=status.HTTP_400_BAD_REQUEST,
+ detail=[{"msg": str(e)}],
+ ) from None
+ except IntegrityError:
+ raise HTTPException(
+ status_code=status.HTTP_400_BAD_REQUEST,
+ detail=[
+ {"msg": "An AI prompt with this configuration already exists.", "loc": "genai_type"}
+ ],
+ ) from None
+
+
+@router.put(
+ "/{prompt_id}",
+ response_model=PromptRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def update_prompt(
+ db_session: DbSession,
+ prompt_id: PrimaryKey,
+ prompt_in: PromptUpdate,
+):
+ """Update an AI prompt."""
+ prompt = get(db_session=db_session, prompt_id=prompt_id)
+ if not prompt:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "An AI prompt with this id does not exist."}],
+ )
+ try:
+ prompt = update(
+ db_session=db_session,
+ prompt=prompt,
+ prompt_in=prompt_in,
+ )
+ except ValueError as e:
+ raise HTTPException(
+ status_code=status.HTTP_400_BAD_REQUEST,
+ detail=[{"msg": str(e)}],
+ ) from None
+ except IntegrityError:
+ raise HTTPException(
+ status_code=status.HTTP_400_BAD_REQUEST,
+ detail=[
+ {"msg": "An AI prompt with this configuration already exists.", "loc": "genai_type"}
+ ],
+ ) from None
+ return prompt
+
+
+@router.delete(
+ "/{prompt_id}",
+ response_model=None,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def delete_prompt(db_session: DbSession, prompt_id: PrimaryKey):
+ """Delete an AI prompt, returning only an HTTP 200 OK if successful."""
+ prompt = get(db_session=db_session, prompt_id=prompt_id)
+ if not prompt:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "An AI prompt with this id does not exist."}],
+ )
+ delete(db_session=db_session, prompt_id=prompt_id)
diff --git a/src/dispatch/ai/service.py b/src/dispatch/ai/service.py
new file mode 100644
index 000000000000..490046bfbcca
--- /dev/null
+++ b/src/dispatch/ai/service.py
@@ -0,0 +1,785 @@
+import logging
+
+import tiktoken
+from sqlalchemy.orm import Session, aliased
+
+from dispatch.ai.constants import READ_IN_SUMMARY_CACHE_DURATION
+from dispatch.case import service as case_service
+from dispatch.case.enums import CaseResolutionReason
+from dispatch.case.models import Case
+from dispatch.enums import EventType, Visibility
+from dispatch.event import service as event_service
+from dispatch.incident import service as incident_service
+from dispatch.incident.models import Incident
+from dispatch.plugin import service as plugin_service
+from dispatch.plugins.dispatch_slack.models import IncidentSubjects
+from dispatch.project.models import Project
+from dispatch.signal import service as signal_service
+from dispatch.tag.models import Tag, TagRecommendationResponse
+from dispatch.tag_type.models import TagType
+from dispatch.types import Subject
+
+from .exceptions import GenAIException
+from .models import (
+ ReadInSummary,
+ ReadInSummaryResponse,
+ CaseSignalSummary,
+ CaseSignalSummaryResponse,
+ TacticalReport,
+ TacticalReportResponse,
+ TagRecommendations,
+)
+from .prompt.service import get_by_type
+from .enums import AIEventSource, AIEventDescription, GenAIType
+from .strings import (
+ TAG_RECOMMENDATION_PROMPT,
+ TAG_RECOMMENDATION_SYSTEM_MESSAGE,
+ INCIDENT_SUMMARY_PROMPT,
+ INCIDENT_SUMMARY_SYSTEM_MESSAGE,
+ READ_IN_SUMMARY_PROMPT,
+ READ_IN_SUMMARY_SYSTEM_MESSAGE,
+ SIGNAL_ANALYSIS_PROMPT,
+ SIGNAL_ANALYSIS_SYSTEM_MESSAGE,
+ STRUCTURED_OUTPUT,
+ TACTICAL_REPORT_PROMPT,
+ TACTICAL_REPORT_SYSTEM_MESSAGE,
+)
+
+log = logging.getLogger(__name__)
+
+
+def get_model_token_limit(model_name: str, buffer_percentage: float = 0.05) -> int:
+ """
+ Returns the maximum token limit for a given LLM model with a safety buffer.
+
+ Args:
+ model_name (str): The name of the LLM model.
+ buffer_percentage (float): Percentage of tokens to reserve as buffer (default: 5%).
+
+ Returns:
+ int: The maximum number of tokens allowed in the context window for the specified model,
+ with a safety buffer applied.
+ """
+ default_max_tokens = 128000
+
+ model_token_limits = {
+ # OpenAI models (most recent)
+ "gpt-4o": 128000,
+ # Anthropic models (Claude 3.5 and 3.7 Sonnet variants)
+ "claude-3-5-sonnet-20241022": 200000,
+ "claude-3-7-sonnet-20250219": 200000,
+ }
+
+ # Get the raw token limit for the model
+ raw_limit = model_token_limits.get(model_name.lower(), default_max_tokens)
+
+ # Apply safety buffer
+ safe_limit = int(raw_limit * (1 - buffer_percentage))
+
+ return safe_limit
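+
+The buffer arithmetic above can be sketched in isolation; the function name below is illustrative and not part of the module:
+
+```python
+def safe_token_limit(raw_limit: int, buffer_percentage: float = 0.05) -> int:
+    """Reserve a fraction of the context window as safety head-room."""
+    return int(raw_limit * (1 - buffer_percentage))
+
+print(safe_token_limit(128000))  # 121600 (gpt-4o with a 5% buffer)
+print(safe_token_limit(200000))  # 190000 (Claude Sonnet with a 5% buffer)
+```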
+
+
+def num_tokens_from_string(message: str, model: str) -> tuple[list[int], int, tiktoken.Encoding]:
+ """
+ Calculate the number of tokens in a given string for a specified model.
+
+ Args:
+ message (str): The input string to be tokenized.
+ model (str): The model name to use for tokenization.
+
+ Returns:
+ tuple: A tuple containing a list of token integers, the number of tokens, and the encoding object.
+ """
+ try:
+ encoding = tiktoken.encoding_for_model(model)
+ except KeyError:
+ log.warning(
+ f"We could not automatically map {model} to a tokeniser. Using o200k_base encoding."
+ )
+ # defaults to o200k_base encoding used in gpt-4o, gpt-4o-mini models
+ encoding = tiktoken.get_encoding("o200k_base")
+
+ tokenized_message = encoding.encode(message)
+ num_tokens = len(tokenized_message)
+
+ return tokenized_message, num_tokens, encoding
+
+
+def truncate_prompt(
+ tokenized_prompt: list[int],
+ num_tokens: int,
+ encoding: tiktoken.Encoding,
+ model_token_limit: int,
+) -> str:
+ """
+ Truncate the tokenized prompt to ensure it does not exceed the maximum number of tokens.
+
+ Args:
+ tokenized_prompt (list[int]): The tokenized input prompt to be truncated.
+ num_tokens (int): The number of tokens in the input prompt.
+ encoding (tiktoken.Encoding): The encoding object used for tokenization.
+ model_token_limit (int): The maximum number of tokens allowed for the model.
+
+ Returns:
+ str: The truncated prompt as a string.
+ """
+ excess_tokens = num_tokens - model_token_limit
+ truncated_tokenized_prompt = tokenized_prompt[:-excess_tokens]
+ truncated_prompt = encoding.decode(truncated_tokenized_prompt)
+ log.warning(f"GenAI prompt truncated to fit within {model_token_limit} tokens.")
+ return truncated_prompt
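+
+A character-level stand-in for the tail truncation above (the real code slices tiktoken token ids; characters keep this sketch dependency-free):
+
+```python
+tokens = list("The quick brown fox jumps")
+limit = 15
+excess = len(tokens) - limit  # 10 "tokens" over the limit
+truncated = tokens[:-excess]  # drop the excess from the tail
+print("".join(truncated))     # The quick brown
+```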
+
+
+def prepare_prompt_for_model(prompt: str, model_name: str) -> str:
+ """
+ Tokenizes and truncates the prompt if it exceeds the model's token limit.
+ Returns a prompt string that is safe to send to the model.
+ """
+ tokenized_prompt, num_tokens, encoding = num_tokens_from_string(prompt, model_name)
+ model_token_limit = get_model_token_limit(model_name)
+ if num_tokens > model_token_limit:
+ prompt = truncate_prompt(tokenized_prompt, num_tokens, encoding, model_token_limit)
+ return prompt
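+
+The tokenize, check, and truncate flow above can be sketched end-to-end without tiktoken, using whitespace-split words as stand-in tokens (illustrative only):
+
+```python
+def prepare_for_model(prompt: str, limit: int) -> str:
+    """Truncate only when the stand-in token count exceeds the limit."""
+    tokens = prompt.split()
+    if len(tokens) > limit:
+        tokens = tokens[:limit]
+    return " ".join(tokens)
+
+print(prepare_for_model("alpha beta gamma delta epsilon", 3))  # alpha beta gamma
+print(prepare_for_model("one two", 3))                         # one two
+```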
+
+
+def generate_case_signal_historical_context(case: Case, db_session: Session) -> str:
+ """
+ Generate historical context for a case stemming from a signal, including related cases and relevant data.
+
+ Args:
+ case (Case): The case object for which historical context is being generated.
+ db_session (Session): The database session used for querying related data.
+
+ Returns:
+ str: A string containing the historical context for the case, or an error message if context generation fails.
+ """
+ # we fetch the first instance id and signal
+ (first_instance_id, first_instance_signal) = signal_service.get_instances_in_case(
+ db_session=db_session, case_id=case.id
+ ).first()
+
+ signal_instance = signal_service.get_signal_instance(
+ db_session=db_session, signal_instance_id=first_instance_id
+ )
+
+ # Check if the signal instance is valid
+ if not signal_instance:
+ message = "Unable to generate historical context. Signal instance not found."
+ log.warning(message)
+ raise GenAIException(message)
+
+ # Check if the signal is valid
+ if not signal_instance.signal:
+ message = "Unable to generate historical context. Signal not found."
+ log.warning(message)
+ raise GenAIException(message)
+
+ # Check if GenAI is enabled for the signal
+ if not signal_instance.signal.genai_enabled:
+ message = (
+ "Unable to generate historical context. GenAI feature not enabled for this detection."
+ )
+ log.warning(message)
+ raise GenAIException(message)
+
+ # we fetch related cases
+ related_cases = []
+ for resolution_reason in CaseResolutionReason:
+ # Get the query for cases for a specific resolution reason
+ query = signal_service.get_cases_for_signal_by_resolution_reason(
+ db_session=db_session,
+ signal_id=first_instance_signal.id,
+ resolution_reason=resolution_reason,
+ )
+
+ # Create an alias for the subquery
+ subquery = query.subquery()
+ case_alias = aliased(Case, subquery)
+
+ # Filter the cases and extend the related_cases list
+ related_cases.extend(db_session.query(case_alias).filter(case_alias.id != case.id).all())
+
+ # we prepare historical context
+ historical_context = []
+ for related_case in related_cases:
+ historical_context.append("<case>")
+ historical_context.append(f"<case_name>{related_case.name}</case_name>")
+ historical_context.append(
+ f"<case_resolution>{related_case.resolution} {related_case.resolution_reason}</case_resolution>"
+ )
+ historical_context.append(
+ f"<case_alert>{related_case.signal_instances[0].raw}</case_alert>"
+ )
+ conversation_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=case.project.id, plugin_type="conversation"
+ )
+ if conversation_plugin:
+ if related_case.conversation and related_case.conversation.channel_id:
+ # we fetch conversation replies for the related case
+ conversation_replies = conversation_plugin.instance.get_conversation_replies(
+ conversation_id=related_case.conversation.channel_id,
+ thread_ts=related_case.conversation.thread_id,
+ )
+ for reply in conversation_replies:
+ historical_context.append(
+ f"<case_conversation>{reply}</case_conversation>"
+ )
+ else:
+ log.warning(
+ "Conversation replies not included in historical context. No conversation plugin enabled."
+ )
+ historical_context.append("</case>")
+
+ return "\n".join(historical_context)
+
+
+def generate_case_signal_summary(case: Case, db_session: Session) -> CaseSignalSummaryResponse:
+ """
+ Generate an analysis summary of a case stemming from a signal.
+
+ Args:
+ case (Case): The case object for which the analysis summary is being generated.
+ db_session (Session): The database session used for querying related data.
+
+ Returns:
+ CaseSignalSummaryResponse: A structured response containing the analysis summary or error message.
+ """
+ # we generate the historical context
+ try:
+ historical_context = generate_case_signal_historical_context(
+ case=case, db_session=db_session
+ )
+ except GenAIException as e:
+ log.warning(f"Error generating GenAI historical context for {case.name}: {str(e)}")
+ raise e
+
+ # we fetch the artificial intelligence plugin
+ genai_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=case.project.id, plugin_type="artificial-intelligence"
+ )
+
+ # we check if the artificial intelligence plugin is enabled
+ if not genai_plugin:
+ message = (
+ "Unable to generate GenAI signal analysis. No artificial-intelligence plugin enabled."
+ )
+ log.warning(message)
+ raise GenAIException(message)
+
+ # we fetch the first instance id and signal
+ (first_instance_id, first_instance_signal) = signal_service.get_instances_in_case(
+ db_session=db_session, case_id=case.id
+ ).first()
+
+ signal_instance = signal_service.get_signal_instance(
+ db_session=db_session, signal_instance_id=first_instance_id
+ )
+
+ # Check if the signal instance is valid
+ if not signal_instance:
+ message = "Unable to generate GenAI signal analysis. Signal instance not found."
+ log.warning(message)
+ raise GenAIException(message)
+
+ # Check if the signal is valid
+ if not signal_instance.signal:
+ message = "Unable to generate GenAI signal analysis. Signal not found."
+ log.warning(message)
+ raise GenAIException(message)
+
+ # Check if GenAI is enabled for the signal
+ if not signal_instance.signal.genai_enabled:
+ message = f"Unable to generate GenAI signal analysis. GenAI feature not enabled for {signal_instance.signal.name}."
+ log.warning(message)
+ raise GenAIException(message)
+
+ # we check if the signal has a prompt defined
+ if not signal_instance.signal.genai_prompt:
+ message = f"Unable to generate GenAI signal analysis. No GenAI prompt defined for {signal_instance.signal.name}."
+ log.warning(message)
+ raise GenAIException(message)
+
+ # we generate the prompt
+ db_prompt = get_by_type(
+ genai_type=GenAIType.SIGNAL_ANALYSIS,
+ project_id=case.project.id,
+ db_session=db_session,
+ )
+ prompt = f"""
+ <prompt>
+ {signal_instance.signal.genai_prompt
+ if signal_instance.signal.genai_prompt
+ else (db_prompt.genai_prompt if db_prompt and db_prompt.genai_prompt else SIGNAL_ANALYSIS_PROMPT)}
+ </prompt>
+
+ <current_event>
+ {str(signal_instance.raw)}
+ </current_event>
+
+ <runbook>
+ {signal_instance.signal.runbook}
+ </runbook>
+
+ <historical_context>
+ {historical_context}
+ </historical_context>
+ """
+
+ prompt = prepare_prompt_for_model(
+ prompt, genai_plugin.instance.configuration.chat_completion_model
+ )
+
+ # we generate the analysis
+ try:
+ # Use the system message from the signal if available
+ system_message = (
+ signal_instance.signal.genai_system_message
+ if signal_instance.signal.genai_system_message
+ else (
+ db_prompt.genai_system_message
+ if db_prompt and db_prompt.genai_system_message
+ else SIGNAL_ANALYSIS_SYSTEM_MESSAGE
+ )
+ )
+
+ result = genai_plugin.instance.chat_parse(
+ prompt=prompt, response_model=CaseSignalSummary, system_message=system_message
+ )
+
+ return CaseSignalSummaryResponse(summary=result)
+
+ except Exception as e:
+ log.exception(f"Error generating case signal summary: {e}")
+ error_msg = f"Error generating case signal summary: {str(e)}"
+ return CaseSignalSummaryResponse(error_message=error_msg)
+
+
+def generate_incident_summary(incident: Incident, db_session: Session) -> str:
+ """
+ Generate a summary for an incident.
+
+ Args:
+ incident (Incident): The incident object for which the summary is being generated.
+ db_session (Session): The database session used for querying related data.
+
+ Returns:
+ str: A string containing the summary of the incident, or an error message if summary generation fails.
+ """
+ # Skip summary for restricted incidents
+ if incident.visibility == Visibility.restricted:
+ return "Incident summary not generated for restricted incident."
+
+ # Skip if incident is a duplicate
+ if incident.duplicates:
+ return "Incident summary not generated for duplicate incident."
+
+ # Skip if no incident review document
+ if not incident.incident_review_document or not incident.incident_review_document.resource_id:
+ log.info(
+ f"Incident summary not generated for incident {incident.name}. No review document found."
+ )
+ return "Incident summary not generated. No review document found."
+
+ # Don't generate if no enabled ai plugin or storage plugin
+ genai_plugin = plugin_service.get_active_instance(
+ db_session=db_session, plugin_type="artificial-intelligence", project_id=incident.project.id
+ )
+ if not genai_plugin:
+ message = f"Incident summary not generated for incident {incident.name}. No artificial-intelligence plugin enabled."
+ log.warning(message)
+ return "Incident summary not generated. No artificial-intelligence plugin enabled."
+
+ storage_plugin = plugin_service.get_active_instance(
+ db_session=db_session, plugin_type="storage", project_id=incident.project.id
+ )
+
+ if not storage_plugin:
+ log.info(
+ f"Incident summary not generated for incident {incident.name}. No storage plugin enabled."
+ )
+ return "Incident summary not generated. No storage plugin enabled."
+
+ try:
+ pir_doc = storage_plugin.instance.get(
+ file_id=incident.incident_review_document.resource_id,
+ mime_type="text/plain",
+ )
+
+ # Check for enabled prompt in database (type 2 = incident summary)
+ db_prompt = get_by_type(
+ genai_type=GenAIType.INCIDENT_SUMMARY,
+ project_id=incident.project.id,
+ db_session=db_session,
+ )
+ prompt = f"{db_prompt.genai_prompt if db_prompt else INCIDENT_SUMMARY_PROMPT}\n\n{pir_doc}"
+ system_message = (
+ getattr(db_prompt, "genai_system_message", None) or INCIDENT_SUMMARY_SYSTEM_MESSAGE
+ )
+
+ prompt = prepare_prompt_for_model(
+ prompt, genai_plugin.instance.configuration.chat_completion_model
+ )
+
+ summary = genai_plugin.instance.chat_completion(
+ prompt=prompt, system_message=system_message
+ )
+
+ incident.summary = summary
+ db_session.add(incident)
+ db_session.commit()
+
+ # Log the AI summary generation event
+ event_service.log_incident_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description="AI-generated incident summary created",
+ incident_id=incident.id,
+ details={"summary": summary},
+ type=EventType.other,
+ )
+
+ return summary
+
+ except Exception as e:
+ log.exception(f"Error trying to generate summary for incident {incident.name}: {e}")
+ return "Incident summary not generated. An error occurred."
+
+
+def get_tag_recommendations(
+ *, db_session, project_id: int, case_id: int | None = None, incident_id: int | None = None
+) -> TagRecommendationResponse:
+ """Gets tag recommendations for a project."""
+ genai_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=project_id, plugin_type="artificial-intelligence"
+ )
+
+ # we check if the artificial intelligence plugin is enabled
+ if not genai_plugin:
+ message = (
+ "AI tag suggestions are not available. No AI plugin is configured for this project."
+ )
+ log.warning(message)
+ return TagRecommendationResponse(recommendations=[], error_message=message)
+
+ # Check for enabled prompt in database (type 1 = tag recommendation)
+ db_prompt = get_by_type(
+ genai_type=GenAIType.TAG_RECOMMENDATION, project_id=project_id, db_session=db_session
+ )
+ prompt = db_prompt.genai_prompt if db_prompt else TAG_RECOMMENDATION_PROMPT
+ system_message = (
+ getattr(db_prompt, "genai_system_message", None) or TAG_RECOMMENDATION_SYSTEM_MESSAGE
+ )
+
+ storage_plugin = plugin_service.get_active_instance(
+ db_session=db_session, plugin_type="storage", project_id=project_id
+ )
+
+ # get resources from the case or incident
+ resources = ""
+ if case_id:
+ case = case_service.get(db_session=db_session, case_id=case_id)
+ if not case:
+ raise ValueError(f"Case with id {case_id} not found")
+ if case.visibility == Visibility.restricted:
+ message = "AI tag suggestions are not available for restricted cases."
+ return TagRecommendationResponse(recommendations=[], error_message=message)
+
+ resources += f"Case title: {case.name}\n"
+ resources += f"Description: {case.description}\n"
+ resources += f"Resolution: {case.resolution}\n"
+ resources += f"Resolution Reason: {case.resolution_reason}\n"
+ resources += f"Case type: {case.case_type.name}\n"
+
+ if storage_plugin and case.case_document and case.case_document.resource_id:
+ case_doc = storage_plugin.instance.get(
+ file_id=case.case_document.resource_id,
+ mime_type="text/plain",
+ )
+ resources += f"Case document: {case_doc}\n"
+
+ elif incident_id:
+ incident = incident_service.get(db_session=db_session, incident_id=incident_id)
+ if not incident:
+ raise ValueError(f"Incident with id {incident_id} not found")
+ if incident.visibility == Visibility.restricted:
+ message = "AI tag suggestions are not available for restricted incidents."
+ return TagRecommendationResponse(recommendations=[], error_message=message)
+
+ resources += f"Incident: {incident.name}\n"
+ resources += f"Description: {incident.description}\n"
+ resources += f"Resolution: {incident.resolution}\n"
+ resources += f"Incident type: {incident.incident_type.name}\n"
+
+ if storage_plugin and incident.incident_document and incident.incident_document.resource_id:
+ incident_doc = storage_plugin.instance.get(
+ file_id=incident.incident_document.resource_id,
+ mime_type="text/plain",
+ )
+ resources += f"Incident document: {incident_doc}\n"
+
+ if (
+ storage_plugin
+ and incident.incident_review_document
+ and incident.incident_review_document.resource_id
+ ):
+ incident_review_doc = storage_plugin.instance.get(
+ file_id=incident.incident_review_document.resource_id,
+ mime_type="text/plain",
+ )
+ resources += f"Incident review document: {incident_review_doc}\n"
+
+ else:
+ raise ValueError("Either case_id or incident_id must be provided")
+ # get all tags for the project with the tag_type that has genai_suggestions set to True
+ tags: list[Tag] = (
+ db_session.query(Tag)
+ .filter(Tag.project_id == project_id)
+ .filter(Tag.tag_type.has(TagType.genai_suggestions.is_(True)))
+ .all()
+ )
+
+ # Check if there are any tags available for AI suggestions
+ if not tags:
+ message = (
+ "AI tag suggestions are not available. No tag types are configured "
+ "for AI suggestions in this project."
+ )
+ return TagRecommendationResponse(recommendations=[], error_message=message)
+
+ # add to the resources each tag name, id, tag_type_id, and description
+ tag_list = "Tags you can use:\n" + (
+ "\n".join(
+ [
+ f"tag_name: {tag.name}\n"
+ f"tag_id: {tag.id}\n"
+ f"description: {tag.description}\n"
+ f"tag_type_id: {tag.tag_type_id}\n"
+ f"tag_type_name: {tag.tag_type.name}\n"
+ f"tag_type_description: {tag.tag_type.description}\n"
+ for tag in tags
+ ]
+ )
+ + "\n"
+ )
+
+ prompt += f"** Tags you can use: {tag_list} \n ** Security event details: {resources}"
+
+ prompt = prepare_prompt_for_model(
+ prompt, genai_plugin.instance.configuration.chat_completion_model
+ )
+
+ try:
+ result = genai_plugin.instance.chat_parse(
+ prompt=prompt, response_model=TagRecommendations, system_message=system_message
+ )
+ return TagRecommendationResponse(recommendations=result.recommendations, error_message=None)
+ except Exception as e:
+ log.exception(f"Error generating tag recommendations: {e}")
+ message = "AI tag suggestions encountered an error. Please try again later."
+ return TagRecommendationResponse(recommendations=[], error_message=message)
+
+
+def generate_read_in_summary(
+ *,
+ db_session,
+ subject: Subject,
+ project: Project,
+ channel_id: str,
+ important_reaction: str,
+ participant_email: str = "",
+) -> ReadInSummaryResponse:
+ """
+ Generate a read-in summary for a subject.
+
+ Args:
+ db_session: The database session used for querying related data.
+ subject (Subject): The subject object for which the read-in summary is being generated.
+ project (Project): The project context.
+ channel_id (str): The channel ID to get conversation from.
+ important_reaction (str): The reaction to filter important messages.
+ participant_email (str): The email of the participant for whom the summary was generated.
+
+ Returns:
+ ReadInSummaryResponse: A structured response containing the read-in summary or error message.
+ """
+ subject_type = subject.type
+
+ # Check for recent summary event
+ if subject_type == IncidentSubjects.incident:
+ recent_event = event_service.get_recent_summary_event(
+ db_session, incident_id=subject.id, max_age_seconds=READ_IN_SUMMARY_CACHE_DURATION
+ )
+ else:
+ recent_event = event_service.get_recent_summary_event(
+ db_session, case_id=subject.id, max_age_seconds=READ_IN_SUMMARY_CACHE_DURATION
+ )
+
+ if recent_event and recent_event.details:
+ try:
+ summary = ReadInSummary(**recent_event.details)
+ return ReadInSummaryResponse(summary=summary)
+ except Exception as e:
+ log.warning(
+ f"Failed to parse cached summary from event {recent_event.id}: {e}. Generating new summary."
+ )
+
+ # Don't generate if no enabled ai plugin or storage plugin
+ genai_plugin = plugin_service.get_active_instance(
+ db_session=db_session, plugin_type="artificial-intelligence", project_id=project.id
+ )
+ if not genai_plugin:
+ message = f"Read-in summary not generated for {subject.name}. No artificial-intelligence plugin enabled."
+ log.warning(message)
+ return ReadInSummaryResponse(error_message=message)
+
+ conversation_plugin = plugin_service.get_active_instance(
+ db_session=db_session, plugin_type="conversation", project_id=project.id
+ )
+ if not conversation_plugin:
+ message = (
+ f"Read-in summary not generated for {subject.name}. No conversation plugin enabled."
+ )
+ log.warning(message)
+ return ReadInSummaryResponse(error_message=message)
+
+ conversation = conversation_plugin.instance.get_conversation(
+ conversation_id=channel_id, include_user_details=True, important_reaction=important_reaction
+ )
+ if not conversation:
+ message = f"Read-in summary not generated for {subject.name}. No conversation found."
+ log.warning(message)
+ return ReadInSummaryResponse(error_message=message)
+
+ # Check for enabled prompt in database (type 4 = read-in summary)
+ db_prompt = get_by_type(
+ genai_type=GenAIType.CONVERSATION_SUMMARY, project_id=project.id, db_session=db_session
+ )
+ prompt = f"{db_prompt.genai_prompt if db_prompt else READ_IN_SUMMARY_PROMPT}\nConversation messages:\n{conversation}"
+ system_message = (
+ getattr(db_prompt, "genai_system_message", None) or READ_IN_SUMMARY_SYSTEM_MESSAGE
+ ) + STRUCTURED_OUTPUT
+
+ prompt = prepare_prompt_for_model(
+ prompt, genai_plugin.instance.configuration.chat_completion_model
+ )
+
+ try:
+ result = genai_plugin.instance.chat_parse(
+ prompt=prompt, response_model=ReadInSummary, system_message=system_message
+ )
+
+ # Log the AI read-in summary generation event
+ if subject.type == IncidentSubjects.incident:
+ # This is an incident
+ event_service.log_incident_event(
+ db_session=db_session,
+ source=AIEventSource.dispatch_genai,
+ description=AIEventDescription.read_in_summary_created.format(
+ participant_email=participant_email
+ ),
+ incident_id=subject.id,
+ details=result.dict(),
+ type=EventType.other,
+ )
+ else:
+ # This is a case
+ event_service.log_case_event(
+ db_session=db_session,
+ source=AIEventSource.dispatch_genai,
+ description=AIEventDescription.read_in_summary_created.format(
+ participant_email=participant_email
+ ),
+ case_id=subject.id,
+ details=result.dict(),
+ type=EventType.other,
+ )
+
+ return ReadInSummaryResponse(summary=result)
+
+ except Exception as e:
+ log.exception(f"Error generating read-in summary: {e}")
+ error_msg = f"Error generating read-in summary: {str(e)}"
+ return ReadInSummaryResponse(error_message=error_msg)
+
+
+def generate_tactical_report(
+ *,
+ db_session,
+ incident: Incident,
+ project: Project,
+ important_reaction: str | None = None,
+) -> TacticalReportResponse:
+ """
+ Generate a tactical report for a given incident.
+
+ Args:
+ db_session: The database session used for querying related data.
+ incident (Incident): The incident for which the tactical report is generated.
+ project (Project): The project context.
+ important_reaction (str | None): The emoji reaction denoting important messages.
+
+ Returns:
+ TacticalReportResponse: A structured response containing the tactical report or error message.
+ """
+
+ genai_plugin = plugin_service.get_active_instance(
+ db_session=db_session, plugin_type="artificial-intelligence", project_id=project.id
+ )
+ if not genai_plugin:
+ message = f"Tactical report not generated for {incident.name}. No artificial-intelligence plugin enabled."
+ log.warning(message)
+ return TacticalReportResponse(error_message=message)
+
+ conversation_plugin = plugin_service.get_active_instance(
+ db_session=db_session, plugin_type="conversation", project_id=project.id
+ )
+ if not conversation_plugin:
+ message = (
+ f"Tactical report not generated for {incident.name}. No conversation plugin enabled."
+ )
+ log.warning(message)
+ return TacticalReportResponse(error_message=message)
+
+ conversation = conversation_plugin.instance.get_conversation(
+ conversation_id=incident.conversation.channel_id,
+ include_user_details=True,
+ important_reaction=important_reaction,
+ )
+ if not conversation:
+ message = f"Tactical report not generated for {incident.name}. No conversation found."
+ log.warning(message)
+ return TacticalReportResponse(error_message=message)
+
+ # Check for enabled prompt in database (type 5 = tactical report)
+ db_prompt = get_by_type(
+ genai_type=GenAIType.TACTICAL_REPORT_SUMMARY, project_id=project.id, db_session=db_session
+ )
+ raw_prompt = f"{db_prompt.genai_prompt if db_prompt else TACTICAL_REPORT_PROMPT}\nConversation messages:\n{conversation}"
+ system_message = (
+ getattr(db_prompt, "genai_system_message", None) or TACTICAL_REPORT_SYSTEM_MESSAGE
+ ) + STRUCTURED_OUTPUT
+
+ prompt = prepare_prompt_for_model(
+ raw_prompt, genai_plugin.instance.configuration.chat_completion_model
+ )
+
+ try:
+ result = genai_plugin.instance.chat_parse(
+ prompt=prompt, response_model=TacticalReport, system_message=system_message
+ )
+
+ event_service.log_incident_event(
+ db_session=db_session,
+ source=AIEventSource.dispatch_genai,
+ description=AIEventDescription.tactical_report_created.format(
+ incident_name=incident.name
+ ),
+ incident_id=incident.id,
+ details=result.dict(),
+ type=EventType.other,
+ )
+
+ return TacticalReportResponse(tactical_report=result)
+
+ except Exception as e:
+ error_message = f"Error generating tactical report: {str(e)}"
+ log.exception(error_message)
+ return TacticalReportResponse(error_message=error_message)
diff --git a/src/dispatch/ai/strings.py b/src/dispatch/ai/strings.py
new file mode 100644
index 000000000000..5f437d926d6d
--- /dev/null
+++ b/src/dispatch/ai/strings.py
@@ -0,0 +1,99 @@
+"""AI prompt string constants."""
+
+# Tag recommendation
+TAG_RECOMMENDATION_PROMPT = """
+Please recommend the top three tags of each tag_type_id for this event.
+"""
+
+TAG_RECOMMENDATION_SYSTEM_MESSAGE = """
+You are a security professional that helps with tag recommendations.
+You will be given details about a security event and a list of tags with their descriptions.
+Use the tag descriptions to recommend tags for the security event.
+Always identify the top three tags of each tag_type_id that best apply to the event.
+"""
+
+
+# Incident summary
+INCIDENT_SUMMARY_PROMPT = """
+Answer the following questions based on the provided security post-incident review document.
+1. What is the summary of what happened?
+2. What were the overall risk(s)?
+3. How were the risk(s) mitigated?
+4. How was the incident resolved?
+5. What are the follow-up tasks?
+"""
+
+INCIDENT_SUMMARY_SYSTEM_MESSAGE = """
+You are a security professional that helps with incident summaries.
+You will be given a security post-incident review document.
+Use the text to summarize the incident and answer the questions provided.
+Do not include the questions in your response.
+Do not use any of these words in your summary unless they appear in the document: breach, unauthorized, leak, violation, unlawful, illegal.
+"""
+
+# Read-in summary
+READ_IN_SUMMARY_SYSTEM_MESSAGE = """You are a cybersecurity analyst tasked with creating structured read-in summaries.
+Analyze the provided channel messages and extract key information about a security event.
+Focus on identifying:
+1. Timeline: Chronological list of key events and decisions (skip channel join/remove messages)
+ - For all timeline events, format timestamps as YYYY-MM-DD HH:MM (no seconds, no 'T').
+2. Actions taken: List of actions that were taken to address the security event
+3. Current status: Current status of the security event and any unresolved issues
+4. Summary: Overall summary of the security event
+
+Only include the most relevant events and outcomes. Be clear and concise.
+"""
+
+READ_IN_SUMMARY_PROMPT = """Analyze the following channel messages regarding a security event and provide a structured summary."""
+
+# Signal analysis
+SIGNAL_ANALYSIS_SYSTEM_MESSAGE = """
+You are a cybersecurity analyst evaluating potential security incidents.
+Review the current event, historical cases, and runbook details.
+Be factual, concise, and balanced-do not assume every alert is a true positive.
+"""
+
+SIGNAL_ANALYSIS_PROMPT = """
+Given the following information, analyze the security event and provide your response in the required format.
+"""
+
+# Tactical report
+TACTICAL_REPORT_SYSTEM_MESSAGE = """
+You are a cybersecurity analyst tasked with creating structured tactical reports. Analyze the
+provided channel messages and extract these 3 key types of information:
+1. Conditions: the circumstances surrounding the event. For example, initial identification, event description,
+affected parties and systems, the nature of the security flaw or security type, and the observable impact both inside and outside
+the organization.
+2. Actions: the actions performed in response to the event. For example, containment/mitigation steps, investigation or log analysis, internal
+and external communications or notifications, remediation steps (such as policy or configuration changes), and
+vendor or partner engagements. Prioritize executed actions over plans. Include relevant team or individual names.
+3. Needs: unfulfilled requests associated with the event's resolution. For example, information to gather,
+technical remediation steps, process improvements and preventative actions, or alignment/decision making. Include individuals
+or teams as assignees where possible. If the incident is at its resolution with no unresolved needs, this section
+can instead be populated with a note to that effect.
+
+Only include the most impactful events and outcomes. Be clear, professional, and concise. Use complete sentences with clear subjects, including when writing in bullet points.
+"""
+
+TACTICAL_REPORT_PROMPT = """Analyze the following channel messages regarding a security event and provide a structured tactical report."""
+
+# Default prompts for different GenAI types
+DEFAULT_PROMPTS = {
+ 1: TAG_RECOMMENDATION_PROMPT,
+ 2: INCIDENT_SUMMARY_PROMPT,
+ 3: SIGNAL_ANALYSIS_PROMPT,
+ 4: READ_IN_SUMMARY_PROMPT,
+ 5: TACTICAL_REPORT_PROMPT,
+}
+
+DEFAULT_SYSTEM_MESSAGES = {
+ 1: TAG_RECOMMENDATION_SYSTEM_MESSAGE,
+ 2: INCIDENT_SUMMARY_SYSTEM_MESSAGE,
+ 3: SIGNAL_ANALYSIS_SYSTEM_MESSAGE,
+ 4: READ_IN_SUMMARY_SYSTEM_MESSAGE,
+ 5: TACTICAL_REPORT_SYSTEM_MESSAGE,
+}
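+
+An illustrative fallback resolution mirroring how callers prefer a stored prompt and fall back to these module defaults; the dict keys and strings below are toy stand-ins, not the real prompt text:
+
+```python
+defaults = {1: "tag prompt", 2: "summary prompt"}  # toy stand-ins for DEFAULT_PROMPTS
+
+def resolve_prompt(genai_type: int, stored_prompt: str | None) -> str:
+    """Prefer a prompt stored in the database, else use the module default."""
+    return stored_prompt or defaults.get(genai_type, "")
+
+print(resolve_prompt(2, None))      # summary prompt
+print(resolve_prompt(1, "custom"))  # custom
+```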
+
+STRUCTURED_OUTPUT = """
+Return results as structured JSON.
+"""
diff --git a/src/dispatch/alembic.ini b/src/dispatch/alembic.ini
index 723145a01837..c37925f22c80 100644
--- a/src/dispatch/alembic.ini
+++ b/src/dispatch/alembic.ini
@@ -1,11 +1,14 @@
# A generic, single database configuration.
[alembic]
-# path to migration scripts
-script_location = dispatch:alembic
-
# template used to generate migration files
-# file_template = %%(rev)s_%%(slug)s
+file_template = %%(year)d-%%(month).2d-%%(day).2d_%%(rev)s
+
+[core]
+script_location = dispatch:database/revisions/core
+
+[tenant]
+script_location = dispatch:database/revisions/tenant
# timezone to use when rendering the date
# within the migration file as well as the filename.
@@ -35,6 +38,12 @@ script_location = dispatch:alembic
# are written from script.py.mako
# output_encoding = utf-8
+[post_write_hooks]
+# format using black
+hooks = black
+black.type = console_scripts
+black.entrypoint = black
+black.options = -l 79 REVISION_SCRIPT_FILENAME
+
# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic
@@ -56,7 +65,7 @@ handlers =
qualname = sqlalchemy.engine
[logger_alembic]
-level = INFO
+level = WARN
handlers =
qualname = alembic
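The alembic.ini change above replaces the single `script_location` with two named sections, `[core]` and `[tenant]`, each pointing at its own revision directory. A self-contained sketch (inline copy of the section layout from the hunk, parsed with stdlib `configparser`) showing how the per-branch `script_location` resolves; at runtime, alembic's `--name` flag would select one of these sections:

```python
import configparser

# Inline mirror of the multi-section layout introduced above: one ini
# file, with a [core] and a [tenant] section per migration branch.
ALEMBIC_INI = """
[alembic]
file_template = %%(year)d-%%(month).2d-%%(day).2d_%%(rev)s

[core]
script_location = dispatch:database/revisions/core

[tenant]
script_location = dispatch:database/revisions/tenant
"""

config = configparser.ConfigParser()
config.read_string(ALEMBIC_INI)

# Each branch keeps its own revision directory, so core (shared-schema)
# and tenant (per-organization schema) migrations can advance independently.
for name in ("core", "tenant"):
    print(name, config[name]["script_location"])
```

Splitting the revision tree this way is what lets a multi-tenant deployment run the shared-schema migrations once and the tenant migrations per schema.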
diff --git a/src/dispatch/alembic/env.py b/src/dispatch/alembic/env.py
deleted file mode 100644
index 46ab8e2b5bf6..000000000000
--- a/src/dispatch/alembic/env.py
+++ /dev/null
@@ -1,84 +0,0 @@
-from logging.config import fileConfig
-
-from alembic import context
-from sqlalchemy import engine_from_config, pool
-
-from dispatch.config import SQLALCHEMY_DATABASE_URI
-
-from dispatch.models import * # noqa; noqa
-
-# this is the Alembic Config object, which provides
-# access to the values within the .ini file in use.
-config = context.config
-
-# Interpret the config file for Python logging.
-# This line sets up loggers basically.
-fileConfig(config.config_file_name)
-
-
-config.set_main_option("sqlalchemy.url", str(SQLALCHEMY_DATABASE_URI))
-
-
-target_metadata = Base.metadata # noqa
-
-# other values from the config, defined by the needs of env.py,
-# can be acquired:
-# my_important_option = config.get_main_option("my_important_option")
-# ... etc.
-
-
-def run_migrations_offline():
- """Run migrations in 'offline' mode.
-
- This configures the context with just a URL
- and not an Engine, though an Engine is acceptable
- here as well. By skipping the Engine creation
- we don't even need a DBAPI to be available.
-
- Calls to context.execute() here emit the given string to the
- script output.
-
- """
- url = config.get_main_option("sqlalchemy.url")
- context.configure(url=url, target_metadata=target_metadata, literal_binds=True)
-
- def process_revision_directives(context, revision, directives):
- if config.cmd_opts.autogenerate:
- script = directives[0]
- if script.upgrade_ops.is_empty():
- directives[:] = []
-
- # connectable = ...
- with connectable.connect() as connection: # noqa
- context.configure(
- connection=connection,
- target_metadata=target_metadata,
- process_revision_directives=process_revision_directives,
- )
-
- with context.begin_transaction():
- context.run_migrations()
-
-
-def run_migrations_online():
- """Run migrations in 'online' mode.
-
- In this scenario we need to create an Engine
- and associate a connection with the context.
-
- """
- connectable = engine_from_config(
- config.get_section(config.config_ini_section), prefix="sqlalchemy.", poolclass=pool.NullPool
- )
-
- with connectable.connect() as connection:
- context.configure(connection=connection, target_metadata=target_metadata)
-
- with context.begin_transaction():
- context.run_migrations()
-
-
-if context.is_offline_mode():
- run_migrations_offline()
-else:
- run_migrations_online()
diff --git a/src/dispatch/alembic/versions/4691fc21e309_.py b/src/dispatch/alembic/versions/4691fc21e309_.py
deleted file mode 100644
index a36a55587f06..000000000000
--- a/src/dispatch/alembic/versions/4691fc21e309_.py
+++ /dev/null
@@ -1,50 +0,0 @@
-"""Adds additional fields to incident_type model.
-
-Revision ID: 4691fc21e309
-Revises: d0501fc6be89
-Create Date: 2020-01-30 14:56:28.797631
-
-"""
-from alembic import op
-import sqlalchemy as sa
-import sqlalchemy_utils
-
-
-# revision identifiers, used by Alembic.
-revision = "4691fc21e309"
-down_revision = "e3d9f5ca6958"
-branch_labels = None
-depends_on = None
-
-
-def upgrade():
- # ### commands auto generated by Alembic - please adjust! ###
- op.add_column("incident_type", sa.Column("commander_service_id", sa.Integer(), nullable=True))
- op.add_column(
- "incident_type",
- sa.Column("search_vector", sqlalchemy_utils.types.ts_vector.TSVectorType(), nullable=True),
- )
- op.add_column("incident_type", sa.Column("slug", sa.String(), nullable=True))
- op.add_column("incident_type", sa.Column("template_document_id", sa.Integer(), nullable=True))
- op.create_index(
- "ix_incident_type_search_vector",
- "incident_type",
- ["search_vector"],
- unique=False,
- postgresql_using="gin",
- )
- op.create_foreign_key(None, "incident_type", "service", ["commander_service_id"], ["id"])
- op.create_foreign_key(None, "incident_type", "document", ["template_document_id"], ["id"])
- # ### end Alembic commands ###
-
-
-def downgrade():
- # ### commands auto generated by Alembic - please adjust! ###
- op.drop_constraint(None, "incident_type", type_="foreignkey")
- op.drop_constraint(None, "incident_type", type_="foreignkey")
- op.drop_index("ix_incident_type_search_vector", table_name="incident_type")
- op.drop_column("incident_type", "template_document_id")
- op.drop_column("incident_type", "slug")
- op.drop_column("incident_type", "search_vector")
- op.drop_column("incident_type", "commander_service_id")
- # ### end Alembic commands ###
diff --git a/src/dispatch/alembic/versions/50d0688878ae_.py b/src/dispatch/alembic/versions/50d0688878ae_.py
deleted file mode 100644
index 2c95a84b9029..000000000000
--- a/src/dispatch/alembic/versions/50d0688878ae_.py
+++ /dev/null
@@ -1,28 +0,0 @@
-"""Adds visibility column to incident_type model
-
-Revision ID: 50d0688878ae
-Revises: 4691fc21e309
-Create Date: 2020-03-17 15:07:08.261027
-
-"""
-from alembic import op
-import sqlalchemy as sa
-
-
-# revision identifiers, used by Alembic.
-revision = "50d0688878ae"
-down_revision = "fe8c213f2c54"
-branch_labels = None
-depends_on = None
-
-
-def upgrade():
- # ### commands auto generated by Alembic - please adjust! ###
- op.add_column("incident_type", sa.Column("visibility", sa.String(), nullable=True))
- # ### end Alembic commands ###
-
-
-def downgrade():
- # ### commands auto generated by Alembic - please adjust! ###
- op.drop_column("incident_type", "visibility")
- # ### end Alembic commands ###
diff --git a/src/dispatch/alembic/versions/5d4dee3e24fc_.py b/src/dispatch/alembic/versions/5d4dee3e24fc_.py
deleted file mode 100644
index 46861f396895..000000000000
--- a/src/dispatch/alembic/versions/5d4dee3e24fc_.py
+++ /dev/null
@@ -1,28 +0,0 @@
-"""Adds column for conference challenge
-
-Revision ID: 5d4dee3e24fc
-Revises: 8b67c774279d
-Create Date: 2020-03-27 10:47:57.672426
-
-"""
-from alembic import op
-import sqlalchemy as sa
-
-
-# revision identifiers, used by Alembic.
-revision = '5d4dee3e24fc'
-down_revision = '8b67c774279d'
-branch_labels = None
-depends_on = None
-
-
-def upgrade():
- # ### commands auto generated by Alembic - please adjust! ###
- op.add_column('conference', sa.Column('conference_challenge', sa.String(), server_default='N/A', nullable=False))
- # ### end Alembic commands ###
-
-
-def downgrade():
- # ### commands auto generated by Alembic - please adjust! ###
- op.drop_column('conference', 'conference_challenge')
- # ### end Alembic commands ###
diff --git a/src/dispatch/alembic/versions/81038e961414_.py b/src/dispatch/alembic/versions/81038e961414_.py
deleted file mode 100644
index 2a26ad0bc855..000000000000
--- a/src/dispatch/alembic/versions/81038e961414_.py
+++ /dev/null
@@ -1,38 +0,0 @@
-"""Adds team and department columns as strings.
-
-Revision ID: 81038e961414
-Revises: 50d0688878ae
-Create Date: 2020-03-18 17:23:18.417762
-
-"""
-from alembic import op
-import sqlalchemy as sa
-
-
-# revision identifiers, used by Alembic.
-revision = "81038e961414"
-down_revision = "50d0688878ae"
-branch_labels = None
-depends_on = None
-
-
-def upgrade():
- # ### commands auto generated by Alembic - please adjust! ###
- op.add_column("participant", sa.Column("department", sa.String(), nullable=True))
- op.add_column("participant", sa.Column("team", sa.String(), nullable=True))
- op.drop_constraint("participant_team_id_fkey", "participant", type_="foreignkey")
- op.drop_column("participant", "team_id")
- # ### end Alembic commands ###
-
-
-def downgrade():
- # ### commands auto generated by Alembic - please adjust! ###
- op.add_column(
- "participant", sa.Column("team_id", sa.INTEGER(), autoincrement=False, nullable=True)
- )
- op.create_foreign_key(
- "participant_team_id_fkey", "participant", "team_contact", ["team_id"], ["id"]
- )
- op.drop_column("participant", "team")
- op.drop_column("participant", "department")
- # ### end Alembic commands ###
diff --git a/src/dispatch/alembic/versions/8b67c774279d_.py b/src/dispatch/alembic/versions/8b67c774279d_.py
deleted file mode 100644
index d413913fbe14..000000000000
--- a/src/dispatch/alembic/versions/8b67c774279d_.py
+++ /dev/null
@@ -1,42 +0,0 @@
-"""empty message
-
-Revision ID: 8b67c774279d
-Revises: e7c696ffd69a
-Create Date: 2020-03-24 14:08:47.756920
-
-"""
-from alembic import op
-import sqlalchemy as sa
-
-
-# revision identifiers, used by Alembic.
-revision = '8b67c774279d'
-down_revision = 'e7c696ffd69a'
-branch_labels = None
-depends_on = None
-
-
-def upgrade():
- # ### commands auto generated by Alembic - please adjust! ###
- op.create_table('assoc_incident_tags',
- sa.Column('incident_id', sa.Integer(), nullable=False),
- sa.Column('tag_id', sa.Integer(), nullable=False),
- sa.ForeignKeyConstraint(['incident_id'], ['incident.id'], ),
- sa.ForeignKeyConstraint(['tag_id'], ['tag.id'], ),
- sa.PrimaryKeyConstraint('incident_id', 'tag_id')
- )
- op.drop_table('tags_incidents')
- # ### end Alembic commands ###
-
-
-def downgrade():
- # ### commands auto generated by Alembic - please adjust! ###
- op.create_table('tags_incidents',
- sa.Column('incident_id', sa.INTEGER(), autoincrement=False, nullable=False),
- sa.Column('tag_id', sa.INTEGER(), autoincrement=False, nullable=False),
- sa.ForeignKeyConstraint(['incident_id'], ['incident.id'], name='tags_incidents_incident_id_fkey'),
- sa.ForeignKeyConstraint(['tag_id'], ['tag.id'], name='tags_incidents_tag_id_fkey'),
- sa.PrimaryKeyConstraint('incident_id', 'tag_id', name='tags_incidents_pkey')
- )
- op.drop_table('assoc_incident_tags')
- # ### end Alembic commands ###
diff --git a/src/dispatch/alembic/versions/b12f7a59ced9_.py b/src/dispatch/alembic/versions/b12f7a59ced9_.py
deleted file mode 100644
index e12e90122b72..000000000000
--- a/src/dispatch/alembic/versions/b12f7a59ced9_.py
+++ /dev/null
@@ -1,39 +0,0 @@
-"""empty message
-
-Revision ID: b12f7a59ced9
-Revises: 81038e961414
-Create Date: 2020-03-20 09:05:21.184011
-
-"""
-from alembic import op
-import sqlalchemy as sa
-
-
-# revision identifiers, used by Alembic.
-revision = 'b12f7a59ced9'
-down_revision = '81038e961414'
-branch_labels = None
-depends_on = None
-
-
-def upgrade():
- # ### commands auto generated by Alembic - please adjust! ###
- op.create_table('conference',
- sa.Column('resource_type', sa.String(), nullable=True),
- sa.Column('resource_id', sa.String(), nullable=True),
- sa.Column('weblink', sa.String(), nullable=True),
- sa.Column('id', sa.Integer(), nullable=False),
- sa.Column('conference_id', sa.String(), nullable=True),
- sa.Column('incident_id', sa.Integer(), nullable=True),
- sa.Column('created_at', sa.DateTime(), nullable=True),
- sa.Column('updated_at', sa.DateTime(), nullable=True),
- sa.ForeignKeyConstraint(['incident_id'], ['incident.id'], ),
- sa.PrimaryKeyConstraint('id')
- )
- # ### end Alembic commands ###
-
-
-def downgrade():
- # ### commands auto generated by Alembic - please adjust! ###
- op.drop_table('conference')
- # ### end Alembic commands ###
diff --git a/src/dispatch/alembic/versions/d0501fc6be89_.py b/src/dispatch/alembic/versions/d0501fc6be89_.py
deleted file mode 100644
index 26a9696122af..000000000000
--- a/src/dispatch/alembic/versions/d0501fc6be89_.py
+++ /dev/null
@@ -1,38 +0,0 @@
-"""empty message
-
-Revision ID: d0501fc6be89
-Revises: e75e103693f2
-Create Date: 2020-01-17 09:46:07.966965
-
-"""
-from alembic import op
-import sqlalchemy as sa
-
-
-# revision identifiers, used by Alembic.
-revision = "d0501fc6be89"
-down_revision = "e75e103693f2"
-branch_labels = None
-depends_on = None
-
-
-def upgrade():
- # ### commands auto generated by Alembic - please adjust! ###
- op.add_column("application", sa.Column("created_at", sa.DateTime(), nullable=True))
- op.add_column("application", sa.Column("description", sa.String(), nullable=True))
- op.add_column("application", sa.Column("source", sa.String(), nullable=True))
- op.add_column("application", sa.Column("updated_at", sa.DateTime(), nullable=True))
- op.drop_column("application", "uri_source")
- # ### end Alembic commands ###
-
-
-def downgrade():
- # ### commands auto generated by Alembic - please adjust! ###
- op.add_column(
- "application", sa.Column("uri_source", sa.VARCHAR(), autoincrement=False, nullable=True)
- )
- op.drop_column("application", "updated_at")
- op.drop_column("application", "source")
- op.drop_column("application", "description")
- op.drop_column("application", "created_at")
- # ### end Alembic commands ###
diff --git a/src/dispatch/alembic/versions/e3d9f5ca6958_.py b/src/dispatch/alembic/versions/e3d9f5ca6958_.py
deleted file mode 100644
index 54c8435df073..000000000000
--- a/src/dispatch/alembic/versions/e3d9f5ca6958_.py
+++ /dev/null
@@ -1,28 +0,0 @@
-"""Adding reported_at timestamp
-
-Revision ID: e3d9f5ca6958
-Revises: d0501fc6be89
-Create Date: 2020-02-04 10:40:49.342897
-
-"""
-from alembic import op
-import sqlalchemy as sa
-
-
-# revision identifiers, used by Alembic.
-revision = "e3d9f5ca6958"
-down_revision = "d0501fc6be89"
-branch_labels = None
-depends_on = None
-
-
-def upgrade():
- # ### commands auto generated by Alembic - please adjust! ###
- op.add_column("incident", sa.Column("reported_at", sa.DateTime(), nullable=True))
- # ### end Alembic commands ###
-
-
-def downgrade():
- # ### commands auto generated by Alembic - please adjust! ###
- op.drop_column("incident", "reported_at")
- # ### end Alembic commands ###
diff --git a/src/dispatch/alembic/versions/e75e103693f2_.py b/src/dispatch/alembic/versions/e75e103693f2_.py
deleted file mode 100644
index cf3cfa90db3f..000000000000
--- a/src/dispatch/alembic/versions/e75e103693f2_.py
+++ /dev/null
@@ -1,22 +0,0 @@
-"""Initial migration
-
-Revision ID: e75e103693f2
-Revises: 87988e60fba3
-Create Date: 2020-01-07 10:47:47.327777
-
-"""
-
-
-# revision identifiers, used by Alembic.
-revision = "e75e103693f2"
-down_revision = None
-branch_labels = None
-depends_on = None
-
-
-def upgrade():
- pass
-
-
-def downgrade():
- pass
diff --git a/src/dispatch/alembic/versions/e7c696ffd69a_.py b/src/dispatch/alembic/versions/e7c696ffd69a_.py
deleted file mode 100644
index c739f7c8a013..000000000000
--- a/src/dispatch/alembic/versions/e7c696ffd69a_.py
+++ /dev/null
@@ -1,95 +0,0 @@
-"""Replaces application table with tag table.
-
-Revision ID: e7c696ffd69a
-Revises: b12f7a59ced9
-Create Date: 2020-03-21 18:15:33.609234
-
-"""
-from alembic import op
-import sqlalchemy as sa
-from sqlalchemy.dialects import postgresql
-import sqlalchemy_utils
-
-# revision identifiers, used by Alembic.
-revision = "e7c696ffd69a"
-down_revision = "b12f7a59ced9"
-branch_labels = None
-depends_on = None
-
-
-def upgrade():
- # ### commands auto generated by Alembic - please adjust! ###
- op.create_table(
- "tag",
- sa.Column("id", sa.Integer(), nullable=False),
- sa.Column("name", sa.String(), nullable=True),
- sa.Column("description", sa.String(), nullable=True),
- sa.Column("uri", sa.String(), nullable=True),
- sa.Column("source", sa.String(), nullable=True),
- sa.Column("type", sa.String(), nullable=True),
- sa.Column("search_vector", sqlalchemy_utils.types.ts_vector.TSVectorType(), nullable=True),
- sa.Column("created_at", sa.DateTime(), nullable=True),
- sa.Column("updated_at", sa.DateTime(), nullable=True),
- sa.PrimaryKeyConstraint("id"),
- sa.UniqueConstraint("name"),
- )
- op.create_index(
- "ix_tag_search_vector", "tag", ["search_vector"], unique=False, postgresql_using="gin"
- )
- op.create_table(
- "tags_incidents",
- sa.Column("incident_id", sa.Integer(), nullable=False),
- sa.Column("tag_id", sa.Integer(), nullable=False),
- sa.ForeignKeyConstraint(["incident_id"], ["incident.id"],),
- sa.ForeignKeyConstraint(["tag_id"], ["tag.id"],),
- sa.PrimaryKeyConstraint("incident_id", "tag_id"),
- )
- op.drop_table("applications_incidents")
- op.drop_index("ix_application_search_vector", table_name="application")
- op.drop_table("application")
- # ### end Alembic commands ###
-
-
-def downgrade():
- # ### commands auto generated by Alembic - please adjust! ###
- op.create_table(
- "application",
- sa.Column(
- "id",
- sa.INTEGER(),
- server_default=sa.text("nextval('application_id_seq'::regclass)"),
- autoincrement=True,
- nullable=False,
- ),
- sa.Column("name", sa.VARCHAR(), autoincrement=False, nullable=True),
- sa.Column("description", sa.VARCHAR(), autoincrement=False, nullable=True),
- sa.Column("uri", sa.VARCHAR(), autoincrement=False, nullable=True),
- sa.Column("source", sa.VARCHAR(), autoincrement=False, nullable=True),
- sa.Column("search_vector", postgresql.TSVECTOR(), autoincrement=False, nullable=True),
- sa.Column("created_at", postgresql.TIMESTAMP(), autoincrement=False, nullable=True),
- sa.Column("updated_at", postgresql.TIMESTAMP(), autoincrement=False, nullable=True),
- sa.PrimaryKeyConstraint("id", name="application_pkey"),
- sa.UniqueConstraint("name", name="application_name_key"),
- postgresql_ignore_search_path=False,
- )
- op.create_index("ix_application_search_vector", "application", ["search_vector"], unique=False)
- op.create_table(
- "applications_incidents",
- sa.Column("incident_id", sa.INTEGER(), autoincrement=False, nullable=False),
- sa.Column("application_id", sa.INTEGER(), autoincrement=False, nullable=False),
- sa.ForeignKeyConstraint(
- ["application_id"],
- ["application.id"],
- name="applications_incidents_application_id_fkey",
- ),
- sa.ForeignKeyConstraint(
- ["incident_id"], ["incident.id"], name="applications_incidents_incident_id_fkey"
- ),
- sa.PrimaryKeyConstraint(
- "incident_id", "application_id", name="applications_incidents_pkey"
- ),
- )
- op.drop_table("tags_incidents")
- op.drop_index("ix_tag_search_vector", table_name="tag")
- op.drop_table("tag")
- # ### end Alembic commands ###
diff --git a/src/dispatch/alembic/versions/fe8c213f2c54_.py b/src/dispatch/alembic/versions/fe8c213f2c54_.py
deleted file mode 100644
index d70c86e6ea2d..000000000000
--- a/src/dispatch/alembic/versions/fe8c213f2c54_.py
+++ /dev/null
@@ -1,28 +0,0 @@
-"""Adds participant location column
-
-Revision ID: fe8c213f2c54
-Revises: 4691fc21e309
-Create Date: 2020-03-18 15:02:02.866154
-
-"""
-from alembic import op
-import sqlalchemy as sa
-
-
-# revision identifiers, used by Alembic.
-revision = "fe8c213f2c54"
-down_revision = "4691fc21e309"
-branch_labels = None
-depends_on = None
-
-
-def upgrade():
- # ### commands auto generated by Alembic - please adjust! ###
- op.add_column("participant", sa.Column("location", sa.String(), nullable=True))
- # ### end Alembic commands ###
-
-
-def downgrade():
- # ### commands auto generated by Alembic - please adjust! ###
- op.drop_column("participant", "location")
- # ### end Alembic commands ###
diff --git a/src/dispatch/api.py b/src/dispatch/api.py
index abd893a6c8c8..22ce233c5bac 100644
--- a/src/dispatch/api.py
+++ b/src/dispatch/api.py
@@ -1,77 +1,269 @@
+"""This module defines the main Dispatch API endpoints."""
+
from fastapi import APIRouter, Depends
-from fastapi.openapi.docs import get_redoc_html
-from fastapi.openapi.utils import get_openapi
+from pydantic import BaseModel
from starlette.responses import JSONResponse
-from dispatch.tag.views import router as tag_router
from dispatch.auth.service import get_current_user
+from dispatch.auth.views import user_router, auth_router
+from dispatch.case.priority.views import router as case_priority_router
+from dispatch.case.severity.views import router as case_severity_router
+from dispatch.case.type.views import router as case_type_router
+from dispatch.case.views import router as case_router
+from dispatch.case_cost.views import router as case_cost_router
+from dispatch.case_cost_type.views import router as case_cost_type_router
+from dispatch.cost_model.views import router as cost_model_router
+from dispatch.data.alert.views import router as alert_router
+from dispatch.data.query.views import router as query_router
+from dispatch.data.source.data_format.views import router as source_data_format_router
+from dispatch.data.source.environment.views import router as source_environment_router
+from dispatch.data.source.status.views import router as source_status_router
+from dispatch.data.source.transport.views import router as source_transport_router
+from dispatch.data.source.type.views import router as source_type_router
+from dispatch.data.source.views import router as source_router
from dispatch.definition.views import router as definition_router
+from dispatch.document.views import router as document_router
+from dispatch.email_templates.views import router as email_template_router
+from dispatch.ai.prompt.views import router as ai_router
+from dispatch.entity.views import router as entity_router
+from dispatch.entity_type.views import router as entity_type_router
+from dispatch.feedback.incident.views import router as feedback_router
+from dispatch.feedback.service.views import router as service_feedback_router
+from dispatch.forms.type.views import router as forms_type_router
+from dispatch.forms.views import router as forms_router
+from dispatch.incident.priority.views import router as incident_priority_router
+from dispatch.incident.severity.views import router as incident_severity_router
+from dispatch.incident.type.views import router as incident_type_router
from dispatch.incident.views import router as incident_router
-from dispatch.incident_priority.views import router as incident_priority_router
-from dispatch.incident_type.views import router as incident_type_router
+from dispatch.incident_cost.views import router as incident_cost_router
+from dispatch.incident_cost_type.views import router as incident_cost_type_router
+from dispatch.incident_role.views import router as incident_role_router
from dispatch.individual.views import router as individual_contact_router
-
-from dispatch.policy.views import router as policy_router
-from dispatch.route.views import router as route_router
+from dispatch.models import OrganizationSlug
+from dispatch.notification.views import router as notification_router
+from dispatch.organization.views import router as organization_router
+from dispatch.plugin.views import router as plugin_router
+from dispatch.project.views import router as project_router
from dispatch.search.views import router as search_router
+from dispatch.search_filter.views import router as search_filter_router
from dispatch.service.views import router as service_router
-from dispatch.team.views import router as team_contact_router
-from dispatch.term.views import router as team_router
-from dispatch.document.views import router as document_router
+from dispatch.signal.views import router as signal_router
+from dispatch.tag.views import router as tag_router
+from dispatch.tag_type.views import router as tag_type_router
from dispatch.task.views import router as task_router
+from dispatch.team.views import router as team_contact_router
+from dispatch.term.views import router as term_router
+from dispatch.workflow.views import router as workflow_router
+
-from .common.utils.cli import install_plugins, install_plugin_events
+class ErrorMessage(BaseModel):
+ """Represents a single error message."""
-api_router = APIRouter() # WARNING: Don't use this unless you want unauthenticated routes
+ msg: str
+
+
+class ErrorResponse(BaseModel):
+ """Defines the structure for API error responses."""
+
+ detail: list[ErrorMessage] | None = None
+
+
+api_router = APIRouter(
+ default_response_class=JSONResponse,
+ responses={
+ 400: {"model": ErrorResponse},
+ 401: {"model": ErrorResponse},
+ 403: {"model": ErrorResponse},
+ 404: {"model": ErrorResponse},
+ 500: {"model": ErrorResponse},
+ },
+)
+
+# WARNING: Don't use this unless you want unauthenticated routes
authenticated_api_router = APIRouter()
+def get_organization_path(organization: OrganizationSlug):
+ """Dependency for validating organization slug in path."""
+ pass
+
+
+api_router.include_router(auth_router, prefix="/{organization}/auth", tags=["auth"])
+
# NOTE: All api routes should be authenticated by default
-authenticated_api_router.include_router(document_router, prefix="/documents", tags=["documents"])
-authenticated_api_router.include_router(tag_router, prefix="/tags", tags=["Tags"])
-authenticated_api_router.include_router(service_router, prefix="/services", tags=["services"])
-authenticated_api_router.include_router(team_contact_router, prefix="/teams", tags=["teams"])
authenticated_api_router.include_router(
- individual_contact_router, prefix="/individuals", tags=["individuals"]
-)
-authenticated_api_router.include_router(route_router, prefix="/route", tags=["route"])
-authenticated_api_router.include_router(policy_router, prefix="/policies", tags=["policies"])
-authenticated_api_router.include_router(
- definition_router, prefix="/definitions", tags=["definitions"]
+ organization_router, prefix="/organizations", tags=["organizations"]
)
-authenticated_api_router.include_router(team_router, prefix="/terms", tags=["terms"])
-authenticated_api_router.include_router(task_router, prefix="/tasks", tags=["tags"])
-authenticated_api_router.include_router(search_router, prefix="/search", tags=["search"])
-authenticated_api_router.include_router(incident_router, prefix="/incidents", tags=["incidents"])
-authenticated_api_router.include_router(
- incident_type_router, prefix="/incident_types", tags=["incident_types"]
+
+authenticated_organization_api_router = APIRouter(
+ prefix="/{organization}", dependencies=[Depends(get_organization_path)]
)
-authenticated_api_router.include_router(
- incident_priority_router, prefix="/incident_priorities", tags=["incident_priorities"]
+
+authenticated_organization_api_router.include_router(
+ project_router, prefix="/projects", tags=["projects"]
)
-doc_router = APIRouter()
+# Order matters for path evaluation
+authenticated_organization_api_router.include_router(
+ source_type_router, prefix="/data/sources/types", tags=["source_types"]
+)
+authenticated_organization_api_router.include_router(
+ source_transport_router, prefix="/data/sources/transports", tags=["source_transports"]
+)
+authenticated_organization_api_router.include_router(
+ source_status_router, prefix="/data/sources/statuses", tags=["source_statuses"]
+)
+authenticated_organization_api_router.include_router(
+ source_data_format_router, prefix="/data/sources/dataFormats", tags=["source_data_formats"]
+)
-@doc_router.get("/openapi.json")
-async def get_open_api_endpoint():
- return JSONResponse(get_openapi(title="Dispatch Docs", version=1, routes=api_router.routes))
+authenticated_organization_api_router.include_router(
+ source_environment_router, prefix="/data/sources/environments", tags=["source_environments"]
+)
+authenticated_organization_api_router.include_router(
+ source_router, prefix="/data/sources", tags=["sources"]
+)
-@doc_router.get("/")
-async def get_documentation():
- return get_redoc_html(openapi_url="/api/v1/docs/openapi.json", title="Dispatch Docs")
+authenticated_organization_api_router.include_router(
+ query_router, prefix="/data/queries", tags=["queries"]
+)
+authenticated_organization_api_router.include_router(
+ alert_router, prefix="/data/alerts", tags=["alerts"]
+)
+authenticated_organization_api_router.include_router(
+ signal_router, prefix="/signals", tags=["signals"]
+)
-authenticated_api_router.include_router(doc_router, prefix="/docs")
+authenticated_organization_api_router.include_router(user_router, prefix="/users", tags=["users"])
+authenticated_organization_api_router.include_router(
+ document_router, prefix="/documents", tags=["documents"]
+)
+authenticated_organization_api_router.include_router(
+ entity_router, prefix="/entity", tags=["entities"]
+)
+authenticated_organization_api_router.include_router(
+ entity_type_router, prefix="/entity_type", tags=["entity_types"]
+)
+authenticated_organization_api_router.include_router(tag_router, prefix="/tags", tags=["tags"])
+authenticated_organization_api_router.include_router(
+ tag_type_router, prefix="/tag_types", tags=["tag_types"]
+)
+authenticated_organization_api_router.include_router(
+ service_router, prefix="/services", tags=["services"]
+)
+authenticated_organization_api_router.include_router(
+ team_contact_router, prefix="/teams", tags=["teams"]
+)
+authenticated_organization_api_router.include_router(
+ individual_contact_router, prefix="/individuals", tags=["individuals"]
+)
+# authenticated_api_router.include_router(route_router, prefix="/route", tags=["route"])
+authenticated_organization_api_router.include_router(
+ definition_router, prefix="/definitions", tags=["definitions"]
+)
+authenticated_organization_api_router.include_router(term_router, prefix="/terms", tags=["terms"])
+authenticated_organization_api_router.include_router(task_router, prefix="/tasks", tags=["tasks"])
+authenticated_organization_api_router.include_router(
+ search_router, prefix="/search", tags=["search"]
+)
+authenticated_organization_api_router.include_router(
+ search_filter_router, prefix="/search/filters", tags=["search_filters"]
+)
+authenticated_organization_api_router.include_router(
+ incident_router, prefix="/incidents", tags=["incidents"]
+)
+authenticated_organization_api_router.include_router(
+ incident_priority_router,
+ prefix="/incident_priorities",
+ tags=["incident_priorities"],
+)
+authenticated_organization_api_router.include_router(
+ incident_severity_router,
+ prefix="/incident_severities",
+ tags=["incident_severities"],
+)
+authenticated_organization_api_router.include_router(
+ incident_type_router, prefix="/incident_types", tags=["incident_types"]
+)
+authenticated_organization_api_router.include_router(case_router, prefix="/cases", tags=["cases"])
+authenticated_organization_api_router.include_router(
+ case_type_router, prefix="/case_types", tags=["case_types"]
+)
+authenticated_organization_api_router.include_router(
+ case_priority_router,
+ prefix="/case_priorities",
+ tags=["case_priorities"],
+)
+authenticated_organization_api_router.include_router(
+ case_severity_router,
+ prefix="/case_severities",
+ tags=["case_severities"],
+)
+authenticated_organization_api_router.include_router(
+ case_cost_router,
+ prefix="/case_costs",
+ tags=["case_costs"],
+)
+authenticated_organization_api_router.include_router(
+ case_cost_type_router,
+ prefix="/case_cost_types",
+ tags=["case_cost_types"],
+)
+authenticated_organization_api_router.include_router(
+ cost_model_router,
+ prefix="/cost_models",
+ tags=["cost_models"],
+)
+authenticated_organization_api_router.include_router(
+ workflow_router, prefix="/workflows", tags=["workflows"]
+)
+authenticated_organization_api_router.include_router(
+ plugin_router, prefix="/plugins", tags=["plugins"]
+)
+authenticated_organization_api_router.include_router(
+ feedback_router, prefix="/feedback", tags=["feedback"]
+)
+authenticated_organization_api_router.include_router(
+ service_feedback_router, prefix="/service_feedback", tags=["service_feedback"]
+)
+authenticated_organization_api_router.include_router(
+ notification_router, prefix="/notifications", tags=["notifications"]
+)
+authenticated_organization_api_router.include_router(
+ incident_cost_router, prefix="/incident_costs", tags=["incident_costs"]
+)
+authenticated_organization_api_router.include_router(
+ incident_cost_type_router,
+ prefix="/incident_cost_types",
+ tags=["incident_cost_types"],
+)
+authenticated_organization_api_router.include_router(
+ incident_role_router, prefix="/incident_roles", tags=["role"]
+)
+authenticated_organization_api_router.include_router(forms_router, prefix="/forms", tags=["forms"])
+authenticated_organization_api_router.include_router(
+ forms_type_router, prefix="/forms_type", tags=["forms_type"]
+)
+authenticated_organization_api_router.include_router(
+ email_template_router, prefix="/email_template", tags=["email_template"]
+)
+authenticated_organization_api_router.include_router(ai_router, prefix="/ai", tags=["ai"])
-@api_router.get("/healthcheck")
+@api_router.get("/healthcheck", include_in_schema=False)
def healthcheck():
+ """Simple healthcheck endpoint."""
return {"status": "ok"}
-install_plugins()
-install_plugin_events(api_router)
+api_router.include_router(
+ authenticated_organization_api_router, dependencies=[Depends(get_current_user)]
+)
-api_router.include_router(authenticated_api_router, dependencies=[Depends(get_current_user)])
+api_router.include_router(
+ authenticated_api_router,
+ dependencies=[Depends(get_current_user)],
+)
diff --git a/src/dispatch/auth/models.py b/src/dispatch/auth/models.py
new file mode 100644
index 000000000000..db34de39e265
--- /dev/null
+++ b/src/dispatch/auth/models.py
@@ -0,0 +1,361 @@
+"""This module defines the models for the Dispatch authentication system."""
+
+import string
+import secrets
+from datetime import datetime, timedelta
+from uuid import uuid4
+
+import bcrypt
+from jose import jwt
+from pydantic import field_validator
+from pydantic import EmailStr
+
+from sqlalchemy import DateTime, Column, String, LargeBinary, Integer, Boolean
+from sqlalchemy.dialects.postgresql import UUID
+from sqlalchemy.orm import relationship
+from sqlalchemy.sql.schema import ForeignKey
+from sqlalchemy_utils import TSVectorType
+
+from dispatch.config import (
+ DISPATCH_JWT_SECRET,
+ DISPATCH_JWT_ALG,
+ DISPATCH_JWT_EXP,
+)
+from dispatch.database.core import Base
+from dispatch.enums import DispatchEnum, UserRoles
+from dispatch.models import OrganizationSlug, PrimaryKey, TimeStampMixin, DispatchBase, Pagination
+from dispatch.organization.models import Organization, OrganizationRead
+from dispatch.project.models import Project, ProjectRead
+
+
+def generate_password():
+ """Generate a random, strong password with at least one lowercase, one uppercase, and three digits."""
+ alphanumeric = string.ascii_letters + string.digits
+ while True:
+ password = "".join(secrets.choice(alphanumeric) for _ in range(10))
+ # Ensure password meets complexity requirements
+ if (
+ any(c.islower() for c in password)
+ and any(c.isupper() for c in password)
+ and sum(c.isdigit() for c in password) >= 3
+ ):
+ break
+ return password
+
+
+def hash_password(password: str):
+ """Hash a password using bcrypt."""
+ pw = bytes(password, "utf-8")
+ salt = bcrypt.gensalt()
+ return bcrypt.hashpw(pw, salt)
+
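As a standalone sketch (stdlib only, no Dispatch imports), the retry loop in `generate_password` above can be exercised like this; the `length` parameter is an addition here, mirroring the hard-coded `range(10)`:

```python
import secrets
import string


def generate_password(length: int = 10) -> str:
    """Mirror of the helper above: retry until the complexity rules hold."""
    alphanumeric = string.ascii_letters + string.digits
    while True:
        password = "".join(secrets.choice(alphanumeric) for _ in range(length))
        # same complexity check as the model helper
        if (
            any(c.islower() for c in password)
            and any(c.isupper() for c in password)
            and sum(c.isdigit() for c in password) >= 3
        ):
            return password


pw = generate_password()
print(len(pw))  # → 10
```

Because each candidate is drawn uniformly and only accepted when it satisfies all three rules, every returned password is guaranteed to contain mixed case and at least three digits.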
+
+class DispatchUser(Base, TimeStampMixin):
+ """SQLAlchemy model for a Dispatch user."""
+
+ __table_args__ = {"schema": "dispatch_core"}
+
+ id = Column(Integer, primary_key=True)
+ email = Column(String, unique=True)
+ password = Column(LargeBinary, nullable=False)
+ last_mfa_time = Column(DateTime, nullable=True)
+ experimental_features = Column(Boolean, default=False)
+
+ # relationships
+ events = relationship("Event", backref="dispatch_user")
+
+ search_vector = Column(
+ TSVectorType("email", regconfig="pg_catalog.simple", weights={"email": "A"})
+ )
+
+ def verify_password(self, password: str) -> bool:
+ """Check if the provided password matches the stored hash."""
+ if not password or not self.password:
+ return False
+ return bcrypt.checkpw(password.encode("utf-8"), self.password)
+
+ def set_password(self, password: str) -> None:
+ """Set a new password for the user."""
+ if not password:
+ raise ValueError("Password cannot be empty")
+ self.password = hash_password(password)
+
+ def is_owner(self, organization_slug: str) -> bool:
+ """Return True if the user is an owner in the given organization."""
+ role = self.get_organization_role(organization_slug)
+ return role == UserRoles.owner
+
+ @property
+ def token(self):
+ """Generate a JWT token for the user."""
+ now = datetime.utcnow()
+ exp = (now + timedelta(seconds=DISPATCH_JWT_EXP)).timestamp()
+ data = {
+ "exp": exp,
+ "email": self.email,
+ }
+ return jwt.encode(data, DISPATCH_JWT_SECRET, algorithm=DISPATCH_JWT_ALG)
+
+ def get_organization_role(self, organization_slug: OrganizationSlug):
+ """Get the user's role for a given organization slug."""
+ for o in self.organizations:
+ if o.organization.slug == organization_slug:
+ return o.role
+
+
+class DispatchUserOrganization(Base, TimeStampMixin):
+ """SQLAlchemy model for the relationship between users and organizations."""
+
+ __table_args__ = {"schema": "dispatch_core"}
+ dispatch_user_id = Column(Integer, ForeignKey(DispatchUser.id), primary_key=True)
+ dispatch_user = relationship(DispatchUser, backref="organizations")
+
+ organization_id = Column(Integer, ForeignKey(Organization.id), primary_key=True)
+ organization = relationship(Organization, backref="users")
+
+ role = Column(String, default=UserRoles.member)
+
+
+class DispatchUserProject(Base, TimeStampMixin):
+ """SQLAlchemy model for the relationship between users and projects."""
+
+ dispatch_user_id = Column(Integer, ForeignKey(DispatchUser.id), primary_key=True)
+ dispatch_user = relationship(DispatchUser, backref="projects")
+
+ project_id = Column(Integer, ForeignKey(Project.id), primary_key=True)
+ project = relationship(Project, backref="users", overlaps="dispatch_user_project")
+
+ default = Column(Boolean, default=False)
+
+ role = Column(String, nullable=False, default=UserRoles.member)
+
+
+class DispatchUserSettings(Base, TimeStampMixin):
+ """SQLAlchemy model for user settings."""
+
+ __table_args__ = {"schema": "dispatch_core"}
+
+ id = Column(Integer, primary_key=True)
+ dispatch_user_id = Column(Integer, ForeignKey(DispatchUser.id), unique=True)
+ dispatch_user = relationship(DispatchUser, backref="settings")
+
+ auto_add_to_incident_bridges = Column(Boolean, default=True)
+
+
+class UserProject(DispatchBase):
+ """Pydantic model for a user's project membership."""
+
+ project: ProjectRead
+ default: bool | None = False
+ role: str | None = None
+
+
+class UserOrganization(DispatchBase):
+ """Pydantic model for a user's organization membership."""
+
+ organization: OrganizationRead
+ default: bool | None = False
+ role: str | None = None
+
+
+class UserBase(DispatchBase):
+ """Base Pydantic model for user data."""
+
+ email: EmailStr
+ projects: list[UserProject] | None = []
+ organizations: list[UserOrganization] | None = []
+
+ @field_validator("email")
+ @classmethod
+ def email_required(cls, v):
+ """Ensure the email field is not empty."""
+ if not v:
+ raise ValueError("Must not be an empty string and must be a valid email")
+ return v
+
+
+class UserLogin(UserBase):
+ """Pydantic model for user login data."""
+
+ password: str
+
+ @field_validator("password")
+ @classmethod
+ def password_required(cls, v):
+ """Ensure the password field is not empty."""
+ if not v:
+ raise ValueError("Must not be an empty string")
+ return v
+
+
+class UserRegister(UserLogin):
+ """Pydantic model for user registration data."""
+
+ password: str = ""
+
+ @field_validator("password", mode="before")
+ @classmethod
+ def password_required(cls, v):
+ """Generate and hash a password if not provided."""
+ password = v or generate_password()
+ return hash_password(password)
+
+
+class UserLoginResponse(DispatchBase):
+ """Pydantic model for the response after user login."""
+
+ projects: list[UserProject] | None = None
+ token: str | None = None
+
+
+class UserRead(UserBase):
+ """Pydantic model for reading user data."""
+
+ id: PrimaryKey
+ role: str | None = None
+ experimental_features: bool | None = None
+ settings: "UserSettingsRead | None" = None
+
+
+class UserUpdate(DispatchBase):
+ """Pydantic model for updating user data."""
+
+ id: PrimaryKey
+ projects: list[UserProject] | None = None
+ organizations: list[UserOrganization] | None = None
+ experimental_features: bool | None = None
+ role: str | None = None
+
+
+class UserPasswordUpdate(DispatchBase):
+ """Pydantic model for password updates only."""
+
+ current_password: str
+ new_password: str
+
+ @field_validator("new_password")
+ @classmethod
+ def validate_password(cls, v):
+ """Validate the new password for length and complexity."""
+ if not v or len(v) < 8:
+ raise ValueError("Password must be at least 8 characters long")
+ if not any(c.isdigit() for c in v):
+ raise ValueError("Password must contain at least one number")
+ if not (any(c.isupper() for c in v) and any(c.islower() for c in v)):
+ raise ValueError("Password must contain both uppercase and lowercase characters")
+ return v
+
+ @field_validator("current_password")
+ @classmethod
+ def password_required(cls, v):
+ """Ensure the current password is provided."""
+ if not v:
+ raise ValueError("Current password is required")
+ return v
+
+
+class AdminPasswordReset(DispatchBase):
+ """Pydantic model for admin password resets."""
+
+ new_password: str
+
+ @field_validator("new_password")
+ @classmethod
+ def validate_password(cls, v):
+ """Validate the new password for length and complexity."""
+ if not v or len(v) < 8:
+ raise ValueError("Password must be at least 8 characters long")
+ if not any(c.isdigit() for c in v):
+ raise ValueError("Password must contain at least one number")
+ if not (any(c.isupper() for c in v) and any(c.islower() for c in v)):
+ raise ValueError("Password must contain both uppercase and lowercase characters")
+ return v
+
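The password policy shared by `UserPasswordUpdate` and `AdminPasswordReset` can be sketched as a plain function with the same checks (no Pydantic), which makes the rules easy to test in isolation:

```python
def validate_password(v: str) -> str:
    """At least 8 characters, one digit, and mixed case; returns the value unchanged."""
    if not v or len(v) < 8:
        raise ValueError("Password must be at least 8 characters long")
    if not any(c.isdigit() for c in v):
        raise ValueError("Password must contain at least one number")
    if not (any(c.isupper() for c in v) and any(c.islower() for c in v)):
        raise ValueError("Password must contain both uppercase and lowercase characters")
    return v


print(validate_password("Str0ngPass"))  # → Str0ngPass
try:
    validate_password("weakpassword")
except ValueError as e:
    print(e)  # → Password must contain at least one number
```

The checks fire in order, so a short password reports the length error before any complexity errors.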
+
+class UserCreate(DispatchBase):
+ """Pydantic model for creating a new user."""
+
+ email: EmailStr
+ password: str | None = None
+ projects: list[UserProject] | None = None
+ organizations: list[UserOrganization] | None = None
+ role: str | None = None
+
+ @field_validator("password", mode="before")
+ @classmethod
+ def hash(cls, v):
+ """Hash the password before storing, generating one if none was provided."""
+ password = v or generate_password()
+ return hash_password(password)
+
+
+class UserRegisterResponse(DispatchBase):
+ """Pydantic model for the response after user registration."""
+
+ token: str | None = None
+
+
+class UserPagination(Pagination):
+ """Pydantic model for paginated user results."""
+
+ items: list[UserRead] = []
+
+
+class MfaChallengeStatus(DispatchEnum):
+ """Enumeration of possible MFA challenge statuses."""
+
+ APPROVED = "approved"
+ DENIED = "denied"
+ EXPIRED = "expired"
+ PENDING = "pending"
+
+
+class MfaChallenge(Base, TimeStampMixin):
+ """SQLAlchemy model for an MFA challenge event."""
+
+ id = Column(Integer, primary_key=True, autoincrement=True)
+ valid = Column(Boolean, default=False)
+ reason = Column(String, nullable=True)
+ action = Column(String)
+ status = Column(String, default=MfaChallengeStatus.PENDING)
+ challenge_id = Column(UUID(as_uuid=True), default=uuid4, unique=True)
+ dispatch_user_id = Column(Integer, ForeignKey(DispatchUser.id), nullable=False)
+ dispatch_user = relationship(DispatchUser, backref="mfa_challenges")
+
+
+class MfaPayloadResponse(DispatchBase):
+ """Pydantic model for the response to an MFA challenge payload."""
+
+ status: str
+
+
+class MfaPayload(DispatchBase):
+ """Pydantic model for an MFA challenge payload."""
+
+ action: str
+ project_id: int
+ challenge_id: str
+
+
+class UserSettingsBase(DispatchBase):
+ """Base Pydantic model for user settings."""
+
+ auto_add_to_incident_bridges: bool = True
+
+
+class UserSettingsRead(UserSettingsBase):
+ """Pydantic model for reading user settings."""
+
+ id: PrimaryKey | None = None
+ dispatch_user_id: PrimaryKey | None = None
+
+
+class UserSettingsUpdate(UserSettingsBase):
+ """Pydantic model for updating user settings."""
+
+ pass
+
+
+class UserSettingsCreate(UserSettingsBase):
+ """Pydantic model for creating user settings."""
+
+ dispatch_user_id: PrimaryKey
diff --git a/src/dispatch/auth/permissions.py b/src/dispatch/auth/permissions.py
new file mode 100644
index 000000000000..9f68dbb1e06f
--- /dev/null
+++ b/src/dispatch/auth/permissions.py
@@ -0,0 +1,631 @@
+import logging
+from abc import ABC, abstractmethod
+import json
+
+from fastapi import HTTPException
+from starlette.requests import Request
+from starlette.status import HTTP_403_FORBIDDEN, HTTP_404_NOT_FOUND
+
+from dispatch.auth.service import get_current_user
+from dispatch.case import service as case_service
+from dispatch.case.models import Case
+from dispatch.incident.models import Incident
+from dispatch.enums import UserRoles, Visibility
+from dispatch.incident import service as incident_service
+from dispatch.individual import service as individual_contact_service
+from dispatch.models import PrimaryKeyModel
+from dispatch.organization import service as organization_service
+from dispatch.organization.models import OrganizationRead
+from dispatch.participant_role.enums import ParticipantRoleType
+from dispatch.task import service as task_service
+
+log = logging.getLogger(__name__)
+
+
+def any_permission(permissions: list, request: Request) -> bool:
+ for p in permissions:
+ try:
+ p(request=request)
+ return True
+ except HTTPException:
+ pass
+ return False
+
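A toy sketch of `any_permission`'s short-circuit semantics, with a hypothetical `Denied` exception standing in for the `HTTPException` that real permission classes raise on construction:

```python
class Denied(Exception):
    """Stand-in for HTTPException in this sketch."""


def make_permission(allowed: bool):
    # permission classes raise during __init__ when the check fails;
    # a plain callable models that here
    def check(request):
        if not allowed:
            raise Denied()
    return check


def any_permission(permissions: list, request) -> bool:
    # returns True on the first permission that does not raise
    for p in permissions:
        try:
            p(request)
            return True
        except Denied:
            pass
    return False


print(any_permission([make_permission(False), make_permission(True)], None))  # → True
print(any_permission([make_permission(False)], None))  # → False
```

Note the OR semantics: one satisfied permission is enough, failures are swallowed, and an empty list denies access.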
+
+class BasePermission(ABC):
+ """
+ Abstract permission that all other Permissions must be inherited from.
+
+ Defines basic error message, status & error codes.
+
+ Upon initialization, calls abstract method `has_required_permissions`
+ which will be specific to concrete implementation of Permission class.
+
+ You would write your permissions like this:
+
+ .. code-block:: python
+
+ class TeapotUserAgentPermission(BasePermission):
+
+ def has_required_permissions(self, request: Request) -> bool:
+ return request.headers.get('User-Agent') == "Teapot v1.0"
+
+ """
+
+ org_error_msg = [{"msg": "Organization not found. Please contact your Dispatch admin."}]
+ org_error_code = HTTP_404_NOT_FOUND
+
+ user_error_msg = [{"msg": "User not found. Please contact your Dispatch admin."}]
+ user_error_code = HTTP_404_NOT_FOUND
+
+ user_role_error_msg = [
+ {
+ "msg": "Your user doesn't have permissions to create, update, or delete this resource. Please contact your Dispatch admin."
+ }
+ ]
+ user_role_error_code = HTTP_403_FORBIDDEN
+
+ role = None
+
+ @abstractmethod
+ def has_required_permissions(self, request: Request) -> bool: ...
+
+ def __init__(self, request: Request):
+ organization = None
+ if request.path_params.get("organization"):
+ organization = organization_service.get_by_slug_or_raise(
+ db_session=request.state.db,
+ organization_in=OrganizationRead(
+ slug=request.path_params["organization"],
+ name=request.path_params["organization"],
+ ),
+ )
+ elif request.path_params.get("organization_id"):
+ organization = organization_service.get(
+ db_session=request.state.db, organization_id=request.path_params["organization_id"]
+ )
+
+ if not organization:
+ raise HTTPException(status_code=self.org_error_code, detail=self.org_error_msg)
+
+ org_check = organization_service.get_by_slug(
+ db_session=request.state.db, slug=organization.slug
+ )
+
+ if not org_check or org_check.id != organization.id:
+ raise HTTPException(status_code=self.org_error_code, detail=self.org_error_msg)
+
+ user = get_current_user(request=request)
+ if not user:
+ raise HTTPException(status_code=self.user_error_code, detail=self.user_error_msg)
+
+ self.role = user.get_organization_role(organization.slug)
+ if not self.has_required_permissions(request):
+ raise HTTPException(
+ status_code=self.user_role_error_code, detail=self.user_role_error_msg
+ )
+
+
+class PermissionsDependency(object):
+ """
+ Permission dependency that is used to define and check all the permission
+ classes from one place inside route definition.
+
+ Use it as an argument to FastAPI's `Depends` as follows:
+
+ .. code-block:: python
+
+ app = FastAPI()
+
+ @app.get(
+ "/teapot/",
+ dependencies=[Depends(
+ PermissionsDependency([TeapotUserAgentPermission]))]
+ )
+ async def teapot() -> dict:
+ return {"teapot": True}
+ """
+
+ def __init__(self, permissions_classes: list):
+ self.permissions_classes = permissions_classes
+
+ def __call__(self, request: Request):
+ for permission_class in self.permissions_classes:
+ permission_class(request=request)
+
+
+class OrganizationOwnerPermission(BasePermission):
+ def has_required_permissions(self, request: Request) -> bool:
+ return self.role == UserRoles.owner
+
+
+class OrganizationManagerPermission(BasePermission):
+ def has_required_permissions(self, request: Request) -> bool:
+ permission = any_permission(
+ permissions=[
+ OrganizationOwnerPermission,
+ ],
+ request=request,
+ )
+ if not permission:
+ if self.role == UserRoles.manager:
+ return True
+ return permission
+
+
+class OrganizationAdminPermission(BasePermission):
+ def has_required_permissions(self, request: Request) -> bool:
+ permission = any_permission(
+ permissions=[
+ OrganizationOwnerPermission,
+ OrganizationManagerPermission,
+ ],
+ request=request,
+ )
+ if not permission:
+ if self.role == UserRoles.admin:
+ return True
+ return permission
+
+
+class OrganizationMemberPermission(BasePermission):
+ def has_required_permissions(
+ self,
+ request: Request,
+ ) -> bool:
+ permission = any_permission(
+ permissions=[
+ OrganizationOwnerPermission,
+ OrganizationManagerPermission,
+ OrganizationAdminPermission,
+ ],
+ request=request,
+ )
+ if not permission:
+ if self.role == UserRoles.member:
+ return True
+ return permission
+
+
+class SensitiveProjectActionPermission(BasePermission):
+ def has_required_permissions(
+ self,
+ request: Request,
+ ) -> bool:
+ return any_permission(
+ permissions=[
+ OrganizationOwnerPermission,
+ OrganizationManagerPermission,
+ OrganizationAdminPermission,
+ ],
+ request=request,
+ )
+
+
+class IndividualContactUpdatePermission(BasePermission):
+ def has_required_permissions(
+ self,
+ request: Request,
+ ) -> bool:
+ permission = any_permission(
+ permissions=[
+ SensitiveProjectActionPermission,
+ ],
+ request=request,
+ )
+ if not permission:
+ pk = PrimaryKeyModel(id=request.path_params["individual_contact_id"])
+ individual_contact = individual_contact_service.get(
+ db_session=request.state.db, individual_contact_id=pk.id
+ )
+
+ if not individual_contact:
+ return False
+
+ current_user = get_current_user(request=request)
+ if individual_contact.email == current_user.email:
+ return True
+
+ return permission
+
+
+class ProjectCreatePermission(BasePermission):
+ def has_required_permissions(
+ self,
+ request: Request,
+ ) -> bool:
+ return any_permission(
+ permissions=[OrganizationOwnerPermission, OrganizationManagerPermission],
+ request=request,
+ )
+
+
+class ProjectUpdatePermission(BasePermission):
+ def has_required_permissions(
+ self,
+ request: Request,
+ ) -> bool:
+ return any_permission(
+ permissions=[
+ OrganizationOwnerPermission,
+ OrganizationManagerPermission,
+ OrganizationAdminPermission,
+ ],
+ request=request,
+ )
+
+
+class IncidentJoinOrSubscribePermission(BasePermission):
+ def has_required_permissions(
+ self,
+ request: Request,
+ ) -> bool:
+ pk = PrimaryKeyModel(id=request.path_params["incident_id"])
+ current_incident = incident_service.get(db_session=request.state.db, incident_id=pk.id)
+
+ if not current_incident:
+ return False
+
+ # Check if incident is restricted - only admins can join restricted incidents
+ if current_incident.visibility == Visibility.restricted:
+ return any_permission(
+ permissions=[OrganizationAdminPermission],
+ request=request,
+ )
+
+ # Check project's allow_self_join setting - only admins can override
+ if not current_incident.project.allow_self_join:
+ return any_permission(
+ permissions=[OrganizationAdminPermission],
+ request=request,
+ )
+
+ return True
+
+
+class IncidentViewPermission(BasePermission):
+ def has_required_permissions(
+ self,
+ request: Request,
+ ) -> bool:
+ pk = PrimaryKeyModel(id=request.path_params["incident_id"])
+ current_incident = incident_service.get(db_session=request.state.db, incident_id=pk.id)
+
+ if not current_incident:
+ return False
+
+ if current_incident.visibility == Visibility.restricted:
+ return any_permission(
+ permissions=[
+ OrganizationAdminPermission,
+ IncidentCommanderPermission,
+ IncidentReporterPermission,
+ IncidentParticipantPermission,
+ ],
+ request=request,
+ )
+ return True
+
+
+class IncidentEditPermission(BasePermission):
+ def has_required_permissions(
+ self,
+ request: Request,
+ ) -> bool:
+ return any_permission(
+ permissions=[
+ OrganizationAdminPermission,
+ IncidentCommanderPermission,
+ IncidentReporterPermission,
+ ],
+ request=request,
+ )
+
+
+class IncidentEventPermission(BasePermission):
+ def has_required_permissions(
+ self,
+ request: Request,
+ ) -> bool:
+ return any_permission(
+ permissions=[
+ OrganizationAdminPermission,
+ IncidentCommanderOrScribePermission,
+ IncidentReporterPermission,
+ ],
+ request=request,
+ )
+
+
+class IncidentTaskCreateEditPermission(BasePermission):
+ """
+ Permissions dependency to apply incident edit permissions to task-based requests.
+ """
+
+ def has_required_permissions(self, request: Request) -> bool:
+ incident_id = None
+ # for task creation, retrieve the incident id from the payload
+ if request.method == "POST" and hasattr(request, "_body"):
+ try:
+ body = json.loads(request._body.decode())
+ incident_id = body["incident"]["id"]
+ except (json.JSONDecodeError, KeyError, AttributeError):
+ log.error(
+ "Encountered create_task request without expected incident ID. Cannot properly ascertain incident permissions."
+ )
+ return False
+ else: # otherwise, retrieve via the task id
+ pk = PrimaryKeyModel(id=request.path_params["task_id"])
+ current_task = task_service.get(db_session=request.state.db, task_id=pk.id)
+ if not current_task or not current_task.incident:
+ return False
+ incident_id = current_task.incident.id
+
+ # minimal object with the attributes required for IncidentViewPermission
+ incident_request = type(
+ "IncidentRequest",
+ (),
+ {
+ "path_params": {**request.path_params, "incident_id": incident_id},
+ "state": request.state,
+ },
+ )()
+
+ # copy necessary request attributes
+ for attr in ["headers", "method", "url", "query_params"]:
+ if hasattr(request, attr):
+ setattr(incident_request, attr, getattr(request, attr))
+
+ return any_permission(
+ permissions=[IncidentEditPermission],
+ request=incident_request,
+ )
+
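The three-argument `type(...)` call above creates a class on the fly and instantiates it, producing a duck-typed stand-in that carries only the attributes downstream permission checks read. A minimal sketch of the same trick (attribute values here are hypothetical):

```python
# type(name, bases, namespace) builds a class; the trailing () instantiates it.
# path_params and state become class attributes, readable on the instance.
incident_request = type(
    "IncidentRequest",
    (),
    {
        "path_params": {"incident_id": 7},  # hypothetical incident id
        "state": None,
    },
)()

print(incident_request.path_params["incident_id"])  # → 7
print(type(incident_request).__name__)  # → IncidentRequest
```

This avoids constructing a full Starlette `Request` just to satisfy `IncidentViewPermission`'s attribute lookups.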
+
+class IncidentReporterPermission(BasePermission):
+ def has_required_permissions(
+ self,
+ request: Request,
+ ) -> bool:
+ current_user = get_current_user(request=request)
+ pk = PrimaryKeyModel(id=request.path_params["incident_id"])
+ current_incident = incident_service.get(db_session=request.state.db, incident_id=pk.id)
+
+ if not current_incident:
+ return False
+
+ if current_incident.reporter:
+ if current_incident.reporter.individual.email == current_user.email:
+ return True
+
+ return False
+
+
+class IncidentCommanderPermission(BasePermission):
+ def has_required_permissions(
+ self,
+ request: Request,
+ ) -> bool:
+ current_user = get_current_user(request=request)
+ pk = PrimaryKeyModel(id=request.path_params["incident_id"])
+ current_incident = incident_service.get(db_session=request.state.db, incident_id=pk.id)
+ if not current_incident:
+ return False
+
+ if current_incident.commander:
+ if current_incident.commander.individual.email == current_user.email:
+ return True
+
+ return False
+
+
+class IncidentCommanderOrScribePermission(BasePermission):
+ def has_required_permissions(
+ self,
+ request: Request,
+ ) -> bool:
+ current_user = get_current_user(request=request)
+ pk = PrimaryKeyModel(id=request.path_params["incident_id"])
+ current_incident = incident_service.get(db_session=request.state.db, incident_id=pk.id)
+ if not current_incident:
+ return False
+
+ if (
+ current_incident.commander
+ and current_incident.commander.individual.email == current_user.email
+ ):
+ return True
+
+ scribes = [
+ participant.individual.email
+ for participant in current_incident.participants
+ if ParticipantRoleType.scribe in [role.role for role in participant.participant_roles]
+ ]
+ if current_user.email in scribes:
+ return True
+
+ return False
+
+
+class IncidentParticipantPermission(BasePermission):
+ def has_required_permissions(
+ self,
+ request: Request,
+ ) -> bool:
+ current_user = get_current_user(request=request)
+ pk = PrimaryKeyModel(id=request.path_params["incident_id"])
+ current_incident: Incident = incident_service.get(
+ db_session=request.state.db, incident_id=pk.id
+ )
+ if not current_incident:
+ return False
+
+ participant_emails: list[str] = [
+ participant.individual.email for participant in current_incident.participants
+ ]
+ return current_user.email in participant_emails
+
+
+# Cases
+
+
+class CaseViewPermission(BasePermission):
+ def has_required_permissions(
+ self,
+ request: Request,
+ ) -> bool:
+ pk = PrimaryKeyModel(id=request.path_params["case_id"])
+
+ current_case = case_service.get(db_session=request.state.db, case_id=pk.id)
+
+ if not current_case:
+ return False
+
+ if current_case.visibility == Visibility.restricted:
+ return any_permission(
+ permissions=[
+ OrganizationAdminPermission,
+ CaseParticipantPermission,
+ ],
+ request=request,
+ )
+ return True
+
+
+class CaseReporterPermission(BasePermission):
+ def has_required_permissions(
+ self,
+ request: Request,
+ ) -> bool:
+ current_user = get_current_user(request=request)
+ pk = PrimaryKeyModel(id=request.path_params["case_id"])
+ current_case = case_service.get(db_session=request.state.db, case_id=pk.id)
+
+ if not current_case:
+ return False
+
+ if current_case.reporter:
+ if current_case.reporter.individual.email == current_user.email:
+ return True
+
+ return False
+
+
+class CaseAssigneePermission(BasePermission):
+ def has_required_permissions(
+ self,
+ request: Request,
+ ) -> bool:
+ current_user = get_current_user(request=request)
+ pk = PrimaryKeyModel(id=request.path_params["case_id"])
+ current_case = case_service.get(db_session=request.state.db, case_id=pk.id)
+
+ if not current_case:
+ return False
+
+ if current_case.assignee:
+ if current_case.assignee.individual.email == current_user.email:
+ return True
+
+ return False
+
+
+class CaseEditPermission(BasePermission):
+ def has_required_permissions(
+ self,
+ request: Request,
+ ) -> bool:
+ return any_permission(
+ permissions=[
+ OrganizationAdminPermission,
+ CaseReporterPermission,
+ CaseAssigneePermission,
+ ],
+ request=request,
+ )
+
+
+class CaseParticipantPermission(BasePermission):
+ def has_required_permissions(
+ self,
+ request: Request,
+ ) -> bool:
+ current_user = get_current_user(request=request)
+ pk = PrimaryKeyModel(id=request.path_params["case_id"])
+ current_case: Case = case_service.get(db_session=request.state.db, case_id=pk.id)
+
+ if not current_case:
+ return False
+
+ participant_emails: list[str] = [
+ participant.individual.email for participant in current_case.participants
+ ]
+ return current_user.email in participant_emails
+
+
+class CaseJoinPermission(BasePermission):
+ def has_required_permissions(
+ self,
+ request: Request,
+ ) -> bool:
+ pk = PrimaryKeyModel(id=request.path_params["case_id"])
+ current_case = case_service.get(db_session=request.state.db, case_id=pk.id)
+
+ if not current_case:
+ return False
+
+ # Check if case is restricted - only admins can join restricted cases
+ if current_case.visibility == Visibility.restricted:
+ return any_permission(
+ permissions=[OrganizationAdminPermission],
+ request=request,
+ )
+
+ # Check project's allow_self_join setting - only admins can override
+ if not current_case.project.allow_self_join:
+ return any_permission(
+ permissions=[OrganizationAdminPermission],
+ request=request,
+ )
+
+ return True
+
+
+class CaseEventPermission(BasePermission):
+ def has_required_permissions(
+ self,
+ request: Request,
+ ) -> bool:
+ return any_permission(
+ permissions=[
+ OrganizationAdminPermission,
+ CaseParticipantPermission,
+ ],
+ request=request,
+ )
+
+
+class FeedbackDeletePermission(BasePermission):
+ def has_required_permissions(
+ self,
+ request: Request,
+ ) -> bool:
+ permission = any_permission(
+ permissions=[
+ SensitiveProjectActionPermission,
+ ],
+ request=request,
+ )
+ if not permission:
+ individual_contact_id = request.path_params.get("individual_contact_id", "0")
+ # "0" is passed if the feedback is anonymous
+ if individual_contact_id != "0":
+ pk = PrimaryKeyModel(id=individual_contact_id)
+ individual_contact = individual_contact_service.get(
+ db_session=request.state.db, individual_contact_id=pk.id
+ )
+
+ if not individual_contact:
+ return False
+
+ current_user = get_current_user(request=request)
+ if individual_contact.email == current_user.email:
+ return True
+
+ return permission
diff --git a/src/dispatch/auth/service.py b/src/dispatch/auth/service.py
index b275b6f91a50..eb3366e63994 100644
--- a/src/dispatch/auth/service.py
+++ b/src/dispatch/auth/service.py
@@ -4,25 +4,330 @@
:copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
:license: Apache, see LICENSE for more details.
"""
+
import logging
+from typing import Annotated
+from fastapi import HTTPException, Depends
from starlette.requests import Request
-from dispatch.plugins.base import plugins
+from starlette.status import HTTP_401_UNAUTHORIZED
+from sqlalchemy.exc import IntegrityError
+
from dispatch.config import (
DISPATCH_AUTHENTICATION_PROVIDER_SLUG,
DISPATCH_AUTHENTICATION_DEFAULT_USER,
)
+from dispatch.enums import UserRoles
+from dispatch.organization import service as organization_service
+from dispatch.organization.models import OrganizationRead
+from dispatch.plugins.base import plugins
+from dispatch.project import service as project_service
+
+from dispatch.project.models import ProjectBase
+
+from .models import (
+ DispatchUser,
+ DispatchUserOrganization,
+ DispatchUserProject,
+ DispatchUserSettings,
+ UserOrganization,
+ UserProject,
+ UserRegister,
+ UserUpdate,
+ UserCreate,
+ UserSettingsCreate,
+ UserSettingsUpdate,
+)
+
log = logging.getLogger(__name__)
+InvalidCredentialException = HTTPException(
+ status_code=HTTP_401_UNAUTHORIZED, detail=[{"msg": "Could not validate credentials"}]
+)
+
+
+def get(*, db_session, user_id: int) -> DispatchUser | None:
+ """Returns a user based on the given user id."""
+ return db_session.query(DispatchUser).filter(DispatchUser.id == user_id).one_or_none()
+
+
+def get_by_email(*, db_session, email: str) -> DispatchUser | None:
+ """Returns a user object based on user email."""
+ return db_session.query(DispatchUser).filter(DispatchUser.email == email).one_or_none()
+
+
+def create_or_update_project_role(*, db_session, user: DispatchUser, role_in: UserProject):
+ """Creates a new project role or updates an existing role."""
+ if not role_in.project.id:
+ project = project_service.get_by_name(db_session=db_session, name=role_in.project.name)
+ project_id = project.id
+ else:
+ project_id = role_in.project.id
+
+ project_role = (
+ db_session.query(DispatchUserProject)
+ .filter(
+ DispatchUserProject.dispatch_user_id == user.id,
+ )
+ .filter(DispatchUserProject.project_id == project_id)
+ .one_or_none()
+ )
+
+ if not project_role:
+ return DispatchUserProject(
+ project_id=project_id,
+ role=role_in.role,
+ )
+ project_role.role = role_in.role
+ return project_role
+
+
+def create_or_update_project_default(
+ *, db_session, user: DispatchUser, user_project_in: UserProject
+):
+ """Creates a new user project or updates an existing one."""
+ if user_project_in.project.id:
+ project_id = user_project_in.project.id
+ else:
+ project = project_service.get_by_name(
+ db_session=db_session, name=user_project_in.project.name
+ )
+ project_id = project.id
+
+ user_project = (
+ db_session.query(DispatchUserProject)
+ .filter(
+ DispatchUserProject.dispatch_user_id == user.id,
+ )
+ .filter(DispatchUserProject.project_id == project_id)
+ .one_or_none()
+ )
+
+ if not user_project:
+ user_project = DispatchUserProject(
+ dispatch_user_id=user.id,
+ project_id=project_id,
+ default=True,
+ )
+ db_session.add(user_project)
+ return user_project
+
+ user_project.default = user_project_in.default
+ return user_project
+
+
+def create_or_update_organization_role(
+ *, db_session, user: DispatchUser, role_in: UserOrganization
+):
+ """Creates a new organization role or updates an existing role."""
+ if not role_in.organization.id:
+ organization = organization_service.get_by_name(
+ db_session=db_session, name=role_in.organization.name
+ )
+ organization_id = organization.id
+ else:
+ organization_id = role_in.organization.id
+
+ organization_role = (
+ db_session.query(DispatchUserOrganization)
+ .filter(
+ DispatchUserOrganization.dispatch_user_id == user.id,
+ )
+ .filter(DispatchUserOrganization.organization_id == organization_id)
+ .one_or_none()
+ )
+
+ if not organization_role:
+ return DispatchUserOrganization(
+ dispatch_user_id=user.id,
+ organization_id=organization_id,
+ role=role_in.role,
+ )
+
+ organization_role.role = role_in.role
+ return organization_role
+
+
+def create(*, db_session, organization: str, user_in: UserRegister | UserCreate) -> DispatchUser:
+ """Creates a new dispatch user."""
+ # pydantic forces a string password, but we really want bytes
+ password = bytes(user_in.password, "utf-8")
+
+ # create the user
+ user = DispatchUser(
+ **user_in.model_dump(exclude={"password", "organizations", "projects", "role"}),
+ password=password,
+ )
+
+ org = organization_service.get_by_slug_or_raise(
+ db_session=db_session,
+ organization_in=OrganizationRead(name=organization, slug=organization),
+ )
-def get_current_user(*, request: Request):
+ # add user to the current organization
+ role = UserRoles.member
+ if getattr(user_in, "role", None):
+ role = user_in.role
+
+ user.organizations.append(DispatchUserOrganization(organization=org, role=role))
+
+ projects = []
+ if user_in.projects:
+ # we reset the default value for all user projects
+ for user_project in user.projects:
+ user_project.default = False
+
+ for user_project in user_in.projects:
+ projects.append(
+ create_or_update_project_default(
+ db_session=db_session, user=user, user_project_in=user_project
+ )
+ )
+ else:
+ # get the default project
+ default_project = project_service.get_default_or_raise(db_session=db_session)
+ projects.append(
+ create_or_update_project_default(
+ db_session=db_session,
+ user=user,
+ user_project_in=UserProject(project=ProjectBase(**default_project.dict())),
+ )
+ )
+ user.projects = projects
+
+ db_session.add(user)
+ db_session.commit()
+ return user
+
+
+def get_or_create(*, db_session, organization: str, user_in: UserRegister) -> DispatchUser:
+ """Gets an existing user or creates a new one."""
+ user = get_by_email(db_session=db_session, email=user_in.email)
+
+ if not user:
+ try:
+ user = create(db_session=db_session, organization=organization, user_in=user_in)
+ except IntegrityError:
+ db_session.rollback()
+ log.exception(f"Unable to create user with email address {user_in.email}.")
+
+ return user
+
+
+def update(*, db_session, user: DispatchUser, user_in: UserUpdate) -> DispatchUser:
+ """Updates a user."""
+ user_data = user.dict()
+
+ update_data = user_in.model_dump(
+ exclude={"password", "organizations", "projects"}, exclude_unset=True
+ )
+ for field in user_data:
+ if field in update_data:
+ setattr(user, field, update_data[field])
+
+ if user_in.organizations:
+ roles = []
+
+ for role in user_in.organizations:
+ roles.append(
+ create_or_update_organization_role(db_session=db_session, user=user, role_in=role)
+ )
+
+ if user_in.projects:
+ # we reset the default value for all user projects
+ for user_project in user.projects:
+ user_project.default = False
+
+ projects = []
+ for user_project in user_in.projects:
+ projects.append(
+ create_or_update_project_default(
+ db_session=db_session, user=user, user_project_in=user_project
+ )
+ )
+
+ if experimental_features := user_in.experimental_features:
+ user.experimental_features = experimental_features
+
+ db_session.commit()
+ return user
+
+
+def get_current_user(request: Request) -> DispatchUser:
"""Attempts to get the current user depending on the configured authentication provider."""
if DISPATCH_AUTHENTICATION_PROVIDER_SLUG:
auth_plugin = plugins.get(DISPATCH_AUTHENTICATION_PROVIDER_SLUG)
- return auth_plugin.get_current_user(request)
+ user_email = auth_plugin.get_current_user(request)
else:
- log.warning(
- "No authentication provider has been provided. There is currently no user authentication."
+ log.debug("No authentication provider. Default user will be used")
+ user_email = DISPATCH_AUTHENTICATION_DEFAULT_USER
+
+ if not user_email:
+ log.error(
+ f"Unable to determine user email: the configured auth provider returned none and no default auth user email is defined. Provider: {DISPATCH_AUTHENTICATION_PROVIDER_SLUG}"
)
- return DISPATCH_AUTHENTICATION_DEFAULT_USER
+ raise InvalidCredentialException
+
+ user = get_or_create(
+ db_session=request.state.db,
+ organization=request.state.organization,
+ user_in=UserRegister(email=user_email),
+ )
+
+ return user
+
+
+CurrentUser = Annotated[DispatchUser, Depends(get_current_user)]
+
+
+def get_current_role(
+ request: Request, current_user: DispatchUser = Depends(get_current_user)
+) -> UserRoles:
+ """Attempts to get the current user depending on the configured authentication provider."""
+ return current_user.get_organization_role(organization_slug=request.state.organization)
+
+
+def get_user_settings(*, db_session, user_id: int) -> DispatchUserSettings | None:
+ """Get user settings for a specific user."""
+ return (
+ db_session.query(DispatchUserSettings)
+ .filter(DispatchUserSettings.dispatch_user_id == user_id)
+ .one_or_none()
+ )
+
+
+def get_or_create_user_settings(*, db_session, user_id: int) -> DispatchUserSettings:
+ """Get or create user settings for a specific user."""
+ settings = get_user_settings(db_session=db_session, user_id=user_id)
+
+ if not settings:
+ settings_in = UserSettingsCreate(dispatch_user_id=user_id)
+ settings = create_user_settings(db_session=db_session, settings_in=settings_in)
+
+ return settings
+
+
+def create_user_settings(*, db_session, settings_in: UserSettingsCreate) -> DispatchUserSettings:
+ """Create user settings."""
+ settings = DispatchUserSettings(**settings_in.model_dump())
+ db_session.add(settings)
+ db_session.commit()
+ db_session.refresh(settings)
+ return settings
+
+
+def update_user_settings(
+ *,
+ db_session,
+ settings: DispatchUserSettings,
+ settings_in: UserSettingsUpdate,
+) -> DispatchUserSettings:
+ """Update user settings."""
+ settings_data = settings_in.model_dump(exclude_unset=True)
+
+ for field, value in settings_data.items():
+ setattr(settings, field, value)
+
+ db_session.commit()
+ db_session.refresh(settings)
+ return settings
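The `create_or_update_*` and `get_or_create_*` helpers above all follow the same lookup-then-mutate upsert shape. A minimal standalone sketch of that shape, with a plain dict standing in for the SQLAlchemy session and table (all names here are illustrative, not part of Dispatch):

```python
# Sketch of the "query .one_or_none(), then create or mutate" upsert pattern
# used by create_or_update_project_default and friends. A dict keyed by
# (user_id, project_id) stands in for the DispatchUserProject table.

def upsert_role(store: dict, user_id: int, project_id: int, role: str) -> dict:
    """Return the existing (user, project) row updated in place, or a new one."""
    key = (user_id, project_id)
    row = store.get(key)  # .one_or_none() analogue
    if row is None:
        row = {"user_id": user_id, "project_id": project_id, "role": role}
        store[key] = row  # db_session.add() analogue
        return row
    row["role"] = role  # mutate the already-persistent row
    return row
```

Returning the same row object on update is what lets callers such as `update()` collect the results in a list without re-querying, since the foreign keys on the row tie it back to the user.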
diff --git a/src/dispatch/auth/views.py b/src/dispatch/auth/views.py
new file mode 100644
index 000000000000..4718165bb481
--- /dev/null
+++ b/src/dispatch/auth/views.py
@@ -0,0 +1,446 @@
+import logging
+
+from fastapi import APIRouter, Depends, HTTPException, status
+from pydantic import ValidationError
+
+from dispatch.config import DISPATCH_AUTH_REGISTRATION_ENABLED
+
+from dispatch.auth.permissions import (
+ OrganizationMemberPermission,
+ PermissionsDependency,
+)
+from dispatch.auth.service import CurrentUser
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.enums import UserRoles
+from dispatch.models import OrganizationSlug, PrimaryKey
+from dispatch.plugin import service as plugin_service
+from dispatch.plugins.dispatch_core.exceptions import MfaException
+from dispatch.organization.models import OrganizationRead
+
+from .models import (
+ MfaPayload,
+ MfaPayloadResponse,
+ UserLogin,
+ UserLoginResponse,
+ UserOrganization,
+ UserPagination,
+ UserRead,
+ UserRegister,
+ UserRegisterResponse,
+ UserCreate,
+ UserUpdate,
+ UserPasswordUpdate,
+ AdminPasswordReset,
+ UserSettingsRead,
+ UserSettingsUpdate,
+)
+from .service import (
+ get,
+ get_by_email,
+ update,
+ create,
+ get_or_create_user_settings,
+ update_user_settings,
+)
+
+
+log = logging.getLogger(__name__)
+
+auth_router = APIRouter()
+user_router = APIRouter()
+
+
+@user_router.get(
+ "",
+ dependencies=[
+ Depends(
+ PermissionsDependency(
+ [
+ OrganizationMemberPermission,
+ ]
+ )
+ )
+ ],
+ response_model=UserPagination,
+)
+def get_users(organization: OrganizationSlug, common: CommonParameters):
+ """Gets all organization users."""
+ common["filter_spec"] = {
+ "and": [{"model": "Organization", "op": "==", "field": "slug", "value": organization}]
+ }
+
+ items = search_filter_sort_paginate(model="DispatchUser", **common)
+
+ return {
+ "items": [
+ {
+ "id": u.id,
+ "email": u.email,
+ "projects": u.projects,
+ "role": u.get_organization_role(organization),
+ }
+ for u in items["items"]
+ ],
+ "itemsPerPage": items["itemsPerPage"],
+ "page": items["page"],
+ "total": items["total"],
+ }
+
+
+@user_router.post(
+ "",
+ response_model=UserRead,
+)
+def create_user(
+ user_in: UserCreate,
+ organization: OrganizationSlug,
+ db_session: DbSession,
+ current_user: CurrentUser,
+):
+ """Creates a new user."""
+ user = get_by_email(db_session=db_session, email=user_in.email)
+ if user:
+ raise HTTPException(
+ status_code=status.HTTP_422_UNPROCESSABLE_ENTITY,
+ detail=[
+ {
+ "msg": "A user with this email already exists.",
+ "loc": ["email"],
+ "type": "value_error",
+ }
+ ],
+ )
+
+ current_user_organization_role = current_user.get_organization_role(organization)
+ if current_user_organization_role != UserRoles.owner:
+ raise HTTPException(
+ status_code=status.HTTP_403_FORBIDDEN,
+ detail=[
+ {
+ "msg": "You don't have permissions to create a new user for this organization. Please, contact the organization's owner."
+ }
+ ],
+ )
+
+ user = create(db_session=db_session, organization=organization, user_in=user_in)
+ return user
+
+
+@user_router.get(
+ "/{user_id}",
+ dependencies=[
+ Depends(
+ PermissionsDependency(
+ [
+ OrganizationMemberPermission,
+ ]
+ )
+ )
+ ],
+ response_model=UserRead,
+)
+def get_user(db_session: DbSession, user_id: PrimaryKey):
+ """Get a user."""
+ user = get(db_session=db_session, user_id=user_id)
+ if not user:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A user with this id does not exist."}],
+ )
+
+ return user
+
+
+@user_router.put(
+ "/{user_id}",
+ response_model=UserRead,
+)
+def update_user(
+ db_session: DbSession,
+ user_id: PrimaryKey,
+ organization: OrganizationSlug,
+ user_in: UserUpdate,
+ current_user: CurrentUser,
+):
+ """Check if Current_user is Owner and is trying to edit another user"""
+ current_user_organization_role = current_user.get_organization_role(organization)
+ if current_user_organization_role != UserRoles.owner and current_user.id != user_id:
+ raise HTTPException(
+ status_code=status.HTTP_403_FORBIDDEN,
+ detail=[{"msg": "A user that is not an Owner is trying to update another user."}],
+ )
+ """Update a user."""
+ user = get(db_session=db_session, user_id=user_id)
+ if not user:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A user with this id does not exist."}],
+ )
+
+ if user_in.role:
+ # New user role is provided
+ user_organization_role = user.get_organization_role(organization)
+ if user_organization_role != user_in.role:
+ # New user role provided is different than current user role
+ current_user_organization_role = current_user.get_organization_role(organization)
+ if current_user_organization_role != UserRoles.owner:
+ raise HTTPException(
+ status_code=status.HTTP_403_FORBIDDEN,
+ detail=[
+ {
+ "msg": "You don't have permissions to update the user's role. Please, contact the organization's owner."
+ }
+ ],
+ )
+
+ # add organization information
+ user_in.organizations = [
+ UserOrganization(role=user_in.role, organization=OrganizationRead(name=organization))
+ ]
+
+ return update(db_session=db_session, user=user, user_in=user_in)
+
+
+@user_router.post("/{user_id}/change-password", response_model=UserRead)
+def change_password(
+ db_session: DbSession,
+ user_id: PrimaryKey,
+ password_update: UserPasswordUpdate,
+ current_user: CurrentUser,
+ organization: OrganizationSlug,
+):
+ """Change user password with proper validation"""
+ user = get(db_session=db_session, user_id=user_id)
+ if not user:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A user with this id does not exist."}],
+ )
+
+ # Only allow users to change their own password or owners to reset
+ if user.id != current_user.id and not current_user.is_owner(organization):
+ raise HTTPException(
+ status_code=status.HTTP_403_FORBIDDEN,
+ detail=[{"msg": "Not authorized to change other user passwords"}],
+ )
+
+ # Validate current password if user is changing their own password
+ if user.id == current_user.id:
+ if not user.verify_password(password_update.current_password):
+ raise HTTPException(
+ status_code=status.HTTP_400_BAD_REQUEST,
+ detail=[{"msg": "Invalid current password"}],
+ )
+
+ # Set new password
+ try:
+ user.set_password(password_update.new_password)
+ db_session.commit()
+ except ValueError as e:
+ raise HTTPException(
+ status_code=status.HTTP_400_BAD_REQUEST,
+ detail=[{"msg": str(e)}],
+ ) from e
+
+ return user
+
+
+@user_router.post("/{user_id}/reset-password", response_model=UserRead)
+def admin_reset_password(
+ db_session: DbSession,
+ user_id: PrimaryKey,
+ password_reset: AdminPasswordReset,
+ current_user: CurrentUser,
+ organization: OrganizationSlug,
+):
+ """Admin endpoint to reset user password"""
+ # Verify current user is an owner
+ if not current_user.is_owner(organization):
+ raise HTTPException(
+ status_code=status.HTTP_403_FORBIDDEN,
+ detail=[{"msg": "Only owners can reset passwords"}],
+ )
+
+ user = get(db_session=db_session, user_id=user_id)
+ if not user:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A user with this id does not exist."}],
+ )
+
+ try:
+ user.set_password(password_reset.new_password)
+ db_session.commit()
+ except ValueError as e:
+ raise HTTPException(
+ status_code=status.HTTP_400_BAD_REQUEST,
+ detail=[{"msg": str(e)}],
+ ) from e
+
+ return user
+
+
+@auth_router.get("/me", response_model=UserRead)
+def get_me(
+ *,
+ db_session: DbSession,
+ current_user: CurrentUser,
+):
+ """Returns the current user, including settings."""
+ # Get user settings and include in response
+ user_settings = get_or_create_user_settings(db_session=db_session, user_id=current_user.id)
+
+ # Create a response dict that includes settings
+ response_data = {
+ "id": current_user.id,
+ "email": current_user.email,
+ "projects": current_user.projects,
+ "organizations": current_user.organizations,
+ "experimental_features": current_user.experimental_features,
+ "settings": user_settings,
+ }
+
+ return response_data
+
+
+@auth_router.get("/myrole")
+def get_my_role(
+ *,
+ db_session: DbSession,
+ current_user: CurrentUser,
+ organization: OrganizationSlug,
+):
+ return current_user.get_organization_role(organization)
+
+
+@auth_router.post("/login", response_model=UserLoginResponse)
+def login_user(
+ user_in: UserLogin,
+ organization: OrganizationSlug,
+ db_session: DbSession,
+):
+ user = get_by_email(db_session=db_session, email=user_in.email)
+ if user and user.verify_password(user_in.password):
+ projects = []
+ for user_project in user.projects:
+ projects.append(
+ {
+ "project": user_project.project,
+ "default": user_project.default,
+ "role": user_project.role,
+ }
+ )
+ return {"projects": projects, "token": user.token}
+
+ # Pydantic v2 compatible error handling
+ raise HTTPException(
+ status_code=status.HTTP_422_UNPROCESSABLE_ENTITY,
+ detail=[
+ {
+ "msg": "Invalid username.",
+ "loc": ["username"],
+ "type": "value_error",
+ },
+ {
+ "msg": "Invalid password.",
+ "loc": ["password"],
+ "type": "value_error",
+ },
+ ],
+ )
+
+
+def register_user(
+ user_in: UserRegister,
+ organization: OrganizationSlug,
+ db_session: DbSession,
+):
+ user = get_by_email(db_session=db_session, email=user_in.email)
+ if user:
+ # Pydantic v2 compatible error handling
+ raise HTTPException(
+ status_code=status.HTTP_422_UNPROCESSABLE_ENTITY,
+ detail=[
+ {
+ "msg": "A user with this email already exists.",
+ "loc": ["email"],
+ "type": "value_error",
+ }
+ ],
+ )
+
+ user = create(db_session=db_session, organization=organization, user_in=user_in)
+ return user
+
+
+@auth_router.post("/mfa", response_model=MfaPayloadResponse)
+def mfa_check(
+ payload_in: MfaPayload,
+ current_user: CurrentUser,
+ db_session: DbSession,
+):
+ log.info(f"MFA check initiated for user: {current_user.email}")
+ log.debug(f"Payload received: {payload_in.dict()}")
+
+ try:
+ log.info(f"Attempting to get active MFA plugin for project: {payload_in.project_id}")
+ mfa_auth_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=payload_in.project_id, plugin_type="auth-mfa"
+ )
+
+ if not mfa_auth_plugin:
+ log.error(f"MFA plugin not enabled for project: {payload_in.project_id}")
+ raise HTTPException(
+ status_code=400, detail="MFA plugin is not enabled for the project."
+ )
+
+ log.info(f"MFA plugin found: {mfa_auth_plugin.__class__.__name__}")
+
+ log.info("Validating MFA token")
+ mfa_status = mfa_auth_plugin.instance.validate_mfa_token(payload_in, current_user, db_session)
+
+ log.info("MFA token validation successful")
+ return MfaPayloadResponse(status=mfa_status)
+
+ except MfaException as e:
+ log.error(f"MFA Exception occurred: {str(e)}")
+ log.debug(f"MFA Exception details: {type(e).__name__}", exc_info=True)
+ raise HTTPException(status_code=400, detail=str(e)) from e
+
+ except HTTPException:
+ # re-raise HTTP errors (e.g. the 400 above) instead of masking them as 500s
+ raise
+
+ except Exception as e:
+ log.critical(f"Unexpected error in MFA check: {str(e)}")
+ log.exception("Full traceback:")
+ raise HTTPException(status_code=500, detail="An unexpected error occurred") from e
+
+ finally:
+ log.info("MFA check completed")
+
+
+@auth_router.get("/me/settings", response_model=UserSettingsRead)
+def get_my_settings(
+ *,
+ db_session: DbSession,
+ current_user: CurrentUser,
+):
+ """Get current user's settings."""
+ settings = get_or_create_user_settings(db_session=db_session, user_id=int(current_user.id))
+ return settings
+
+
+@auth_router.put("/me/settings", response_model=UserSettingsRead)
+def update_my_settings(
+ *,
+ db_session: DbSession,
+ current_user: CurrentUser,
+ settings_in: UserSettingsUpdate,
+):
+ """Update current user's settings."""
+ settings = get_or_create_user_settings(db_session=db_session, user_id=int(current_user.id))
+ updated_settings = update_user_settings(
+ db_session=db_session, settings=settings, settings_in=settings_in
+ )
+ return updated_settings
+
+
+if DISPATCH_AUTH_REGISTRATION_ENABLED:
+ register_user = auth_router.post("/register", response_model=UserRegisterResponse)(
+ register_user
+ )
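The conditional `/register` registration above works because `router.post(path)` returns a plain decorator, which can be applied by hand behind a feature flag. A toy sketch of the idiom (`TinyRouter` is a stand-in for FastAPI's `APIRouter`; the flag name is illustrative):

```python
# Applying a route decorator manually: router.post(path)(fn) registers fn
# only when the surrounding condition holds, leaving fn importable either way.

class TinyRouter:
    def __init__(self):
        self.routes = {}

    def post(self, path):
        def decorator(fn):
            self.routes[("POST", path)] = fn
            return fn
        return decorator

def register_user(email: str) -> dict:
    return {"email": email}

REGISTRATION_ENABLED = True  # stand-in for DISPATCH_AUTH_REGISTRATION_ENABLED

router = TinyRouter()
if REGISTRATION_ENABLED:
    register_user = router.post("/register")(register_user)
```

Defining the handler unconditionally and registering it conditionally keeps the function importable and testable even when the flag is off.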
diff --git a/src/dispatch/canvas/__init__.py b/src/dispatch/canvas/__init__.py
new file mode 100644
index 000000000000..4fbaf14b3c76
--- /dev/null
+++ b/src/dispatch/canvas/__init__.py
@@ -0,0 +1,39 @@
+"""Canvas management module for Dispatch."""
+
+from .enums import CanvasType
+from .models import Canvas, CanvasBase, CanvasCreate, CanvasRead, CanvasUpdate
+from .service import (
+ create,
+ delete,
+ delete_by_slack_canvas_id,
+ get,
+ get_by_canvas_id,
+ get_by_case,
+ get_by_incident,
+ get_by_project,
+ get_by_type,
+ get_or_create_by_case,
+ get_or_create_by_incident,
+ update,
+)
+
+__all__ = [
+ "Canvas",
+ "CanvasBase",
+ "CanvasCreate",
+ "CanvasRead",
+ "CanvasType",
+ "CanvasUpdate",
+ "create",
+ "delete",
+ "delete_by_slack_canvas_id",
+ "get",
+ "get_by_canvas_id",
+ "get_by_case",
+ "get_by_incident",
+ "get_by_project",
+ "get_by_type",
+ "get_or_create_by_case",
+ "get_or_create_by_incident",
+ "update",
+]
diff --git a/src/dispatch/canvas/enums.py b/src/dispatch/canvas/enums.py
new file mode 100644
index 000000000000..0489e17ca47e
--- /dev/null
+++ b/src/dispatch/canvas/enums.py
@@ -0,0 +1,10 @@
+from dispatch.enums import DispatchEnum
+
+
+class CanvasType(DispatchEnum):
+ """Types of canvases that can be created."""
+
+ summary = "summary"
+ tactical_reports = "tactical_reports"
+ participants = "participants"
+ tasks = "tasks"
diff --git a/src/dispatch/canvas/flows.py b/src/dispatch/canvas/flows.py
new file mode 100644
index 000000000000..4717ba5e38de
--- /dev/null
+++ b/src/dispatch/canvas/flows.py
@@ -0,0 +1,528 @@
+"""Canvas flows for managing incident and case-related canvases."""
+
+import logging
+from typing import Optional
+
+from sqlalchemy.orm import Session
+
+from .models import Canvas, CanvasCreate
+from .enums import CanvasType
+from .service import create
+from dispatch.incident.models import Incident
+from dispatch.case.models import Case
+from dispatch.participant.models import Participant
+from dispatch.plugin import service as plugin_service
+
+
+log = logging.getLogger(__name__)
+
+
+def create_participants_canvas(
+ incident: Optional[Incident] = None,
+ case: Optional[Case] = None,
+ db_session: Optional[Session] = None,
+) -> Optional[str]:
+ """
+ Creates a new participants canvas in the incident's or case's Slack channel.
+
+ Args:
+ incident: The incident to create the canvas for (mutually exclusive with case)
+ case: The case to create the canvas for (mutually exclusive with incident)
+ db_session: Database session
+
+ Returns:
+ The canvas ID if successful, None if failed
+ """
+ if incident and case:
+ raise ValueError("Cannot specify both incident and case")
+ if not incident and not case:
+ raise ValueError("Must specify either incident or case")
+
+ if incident:
+ return _create_incident_participants_canvas(incident, db_session)
+ else:
+ return _create_case_participants_canvas(case, db_session)
+
+
+def _create_incident_participants_canvas(incident: Incident, db_session: Session) -> Optional[str]:
+ """
+ Creates a new participants canvas in the incident's Slack channel.
+
+ Args:
+ incident: The incident to create the canvas for
+ db_session: Database session
+
+ Returns:
+ The canvas ID if successful, None if failed
+ """
+ # Check if incident has a conversation
+ if not incident.conversation:
+ log.debug(f"Skipping canvas creation for incident {incident.id} - no conversation")
+ return None
+
+ # Check if conversation has a channel_id
+ if not incident.conversation.channel_id:
+ log.debug(f"Skipping canvas creation for incident {incident.id} - no channel_id")
+ return None
+
+ try:
+ # Get the Slack plugin instance
+ slack_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="conversation"
+ )
+
+ if not slack_plugin:
+ log.error(f"No conversation plugin found for incident {incident.id}")
+ return None
+
+ # Create the canvas in Slack
+ canvas_id = slack_plugin.instance.create_canvas(
+ conversation_id=incident.conversation.channel_id,
+ title="Participants",
+ user_emails=(
+ [incident.commander.individual.email] if incident.commander else []
+ ), # Give commander edit permissions
+ content=_build_participants_table(incident, db_session),
+ )
+
+ if canvas_id:
+ # Store the canvas record in the database
+ create(
+ db_session=db_session,
+ canvas_in=CanvasCreate(
+ canvas_id=canvas_id,
+ incident_id=incident.id,
+ case_id=None,
+ type=CanvasType.participants,
+ project_id=incident.project_id,
+ ),
+ )
+ return canvas_id
+ else:
+ log.error(f"Failed to create participants canvas for incident {incident.id}")
+ return None
+
+ except Exception as e:
+ log.exception(f"Error creating participants canvas for incident {incident.id}: {e}")
+ return None
+
+
+def _create_case_participants_canvas(case: Case, db_session: Session) -> Optional[str]:
+ """
+ Creates a new participants canvas in the case's Slack channel.
+
+ Args:
+ case: The case to create the canvas for
+ db_session: Database session
+
+ Returns:
+ The canvas ID if successful, None if failed
+ """
+ # Only create canvas for cases with dedicated channels
+ if not case.dedicated_channel:
+ log.debug(f"Skipping canvas creation for case {case.id} - no dedicated channel")
+ return None
+
+ # Check if case has a conversation
+ if not case.conversation:
+ log.debug(f"Skipping canvas creation for case {case.id} - no conversation")
+ return None
+
+ # Check if conversation has a channel_id
+ if not case.conversation.channel_id:
+ log.debug(f"Skipping canvas creation for case {case.id} - no channel_id")
+ return None
+
+ try:
+ # Get the Slack plugin instance
+ slack_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=case.project.id, plugin_type="conversation"
+ )
+
+ if not slack_plugin:
+ log.error(f"No conversation plugin found for case {case.id}")
+ return None
+
+ # Build the participants table content
+ table_content = _build_case_participants_table(case, db_session)
+ log.debug(f"Built participants table for case {case.id}: {table_content[:100]}...")
+
+ # Create the canvas in Slack
+ canvas_id = slack_plugin.instance.create_canvas(
+ conversation_id=case.conversation.channel_id,
+ title="Participants",
+ user_emails=(
+ [case.assignee.individual.email] if case.assignee else []
+ ), # Give assignee edit permissions
+ content=table_content,
+ )
+
+ if canvas_id:
+ # Store the canvas record in the database
+ create(
+ db_session=db_session,
+ canvas_in=CanvasCreate(
+ canvas_id=canvas_id,
+ incident_id=None,
+ case_id=case.id,
+ type=CanvasType.participants,
+ project_id=case.project_id,
+ ),
+ )
+ log.info(f"Successfully created participants canvas {canvas_id} for case {case.id}")
+ return canvas_id
+ else:
+ log.error(f"Failed to create participants canvas for case {case.id}")
+ return None
+
+ except Exception as e:
+ log.exception(f"Error creating participants canvas for case {case.id}: {e}")
+ return None
+
+
+def update_participants_canvas(
+ incident: Optional[Incident] = None,
+ case: Optional[Case] = None,
+ db_session: Optional[Session] = None,
+) -> bool:
+ """
+ Updates the participants canvas with current participant information.
+
+ Args:
+ incident: The incident to update the canvas for (mutually exclusive with case)
+ case: The case to update the canvas for (mutually exclusive with incident)
+ db_session: Database session
+
+ Returns:
+ True if successful, False if failed
+ """
+ if incident and case:
+ raise ValueError("Cannot specify both incident and case")
+ if not incident and not case:
+ raise ValueError("Must specify either incident or case")
+
+ if incident:
+ return _update_incident_participants_canvas(incident, db_session)
+ else:
+ return _update_case_participants_canvas(case, db_session)
+
+
+def _update_incident_participants_canvas(incident: Incident, db_session: Session) -> bool:
+ """
+ Updates the participants canvas with current participant information.
+
+ Args:
+ incident: The incident to update the canvas for
+ db_session: Database session
+
+ Returns:
+ True if successful, False if failed
+ """
+ # Check if incident has a conversation
+ if not incident.conversation:
+ log.debug(f"Skipping canvas update for incident {incident.id} - no conversation")
+ return False
+
+ # Check if conversation has a channel_id
+ if not incident.conversation.channel_id:
+ log.debug(f"Skipping canvas update for incident {incident.id} - no channel_id")
+ return False
+
+ try:
+ # Get the existing canvas record by incident and type
+ canvas = (
+ db_session.query(Canvas)
+ .filter(Canvas.incident_id == incident.id, Canvas.type == CanvasType.participants)
+ .first()
+ )
+
+ if not canvas:
+ log.warning(
+ f"No participants canvas found for incident {incident.id}; skipping update"
+ )
+ return False
+
+ # Get the Slack plugin instance
+ slack_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="conversation"
+ )
+
+ if not slack_plugin:
+ log.error(f"No conversation plugin found for incident {incident.id}")
+ return False
+
+ # Build the updated table content
+ table_content = _build_participants_table(incident, db_session)
+
+ # Update the canvas
+ success = slack_plugin.instance.edit_canvas(
+ canvas_id=canvas.canvas_id, content=table_content
+ )
+
+ if success:
+ log.info(f"Updated participants canvas {canvas.canvas_id} for incident {incident.id}")
+ else:
+ log.error(
+ f"Failed to update participants canvas {canvas.canvas_id} for incident {incident.id}"
+ )
+
+ return success
+
+ except Exception as e:
+ log.exception(f"Error updating participants canvas for incident {incident.id}: {e}")
+ return False
+
+
+def _build_participants_table(incident: Incident, db_session: Session) -> str:
+ """
+ Builds a markdown summary of participants for the canvas, grouped by role
+ in priority order (commander first).
+
+ Args:
+ incident: The incident to build the table for
+ db_session: Database session
+
+ Returns:
+ Markdown string
+ """
+ # Get all participants for the incident
+ participants = (
+ db_session.query(Participant).filter(Participant.incident_id == incident.id).all()
+ )
+
+ if not participants:
+ return "# Participants\n\nNo participants have been added to this incident yet."
+
+ # Define role priority for sorting (lower number = higher priority)
+ role_priority = {
+ "Incident Commander": 1,
+ "Scribe": 2,
+ "Reporter": 3,
+ "Participant": 4,
+ "Observer": 5,
+ }
+
+ # Filter out inactive participants and sort by role priority
+ active_participants = []
+ for participant in participants:
+ if participant.active_roles:
+ # Get the highest priority role for this participant
+ highest_priority = float("inf")
+ primary_role = "Other"
+
+ for role in participant.active_roles:
+ # role.role is already a string (role name), not an object
+ role_name = role.role if role.role else "Other"
+ priority = role_priority.get(role_name, 999) # Default to low priority
+ if priority < highest_priority:
+ highest_priority = priority
+ primary_role = role_name
+
+ active_participants.append((participant, highest_priority, primary_role))
+
+ # Sort by priority, then by name
+ active_participants.sort(
+ key=lambda x: (x[1], x[0].individual.name if x[0].individual else "Unknown")
+ )
+
+ # Extract just the participants in sorted order
+ sorted_participants = [p[0] for p in active_participants]
+
+ if not sorted_participants:
+ return "# Participants\n\nNo active participants found for this incident."
+
+ # Build the content
+ content = f"# Participants ({len(participants)} total)\n\n"
+
+ # Group participants by their primary role
+ participants_by_role = {}
+ for participant in sorted_participants:
+ # Get the highest priority role for this participant
+ highest_priority = float("inf")
+ primary_role = "Other"
+
+ for role in participant.active_roles:
+ # role.role is already a string (role name), not an object
+ role_name = role.role if role.role else "Other"
+ priority = role_priority.get(role_name, 999) # Default to low priority
+ if priority < highest_priority:
+ highest_priority = priority
+ primary_role = role_name
+
+ if primary_role not in participants_by_role:
+ participants_by_role[primary_role] = []
+ participants_by_role[primary_role].append(participant)
+
+ # Add participants grouped by role
+ for role_name in [
+ "Incident Commander",
+ "Scribe",
+ "Reporter",
+ "Participant",
+ "Observer",
+ "Other",
+ ]:
+ if role_name in participants_by_role:
+ participants_count = len(participants_by_role[role_name])
+ # Add "s" only if there are multiple participants in this role
+ heading = f"## {role_name}{'s' if participants_count > 1 else ''}\n\n"
+ content += heading
+ for participant in participants_by_role[role_name]:
+ name = participant.individual.name if participant.individual else "Unknown"
+ team = participant.team or "Unknown"
+ location = participant.location or "Unknown"
+ content += f"* **{name}** - {team} - {location}\n"
+ content += "\n"
+
+ return content
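The primary-role selection that `_build_participants_table` performs twice can be isolated as a small pure function; a sketch with illustrative data (plain tuples, not Dispatch models):

```python
# Each participant is assigned the highest-priority role they hold; unknown
# roles get a low default priority (999) so they sink below the known ones.

ROLE_PRIORITY = {
    "Incident Commander": 1,
    "Scribe": 2,
    "Reporter": 3,
    "Participant": 4,
    "Observer": 5,
}

def primary_role(roles: list[str]) -> tuple[int, str]:
    """Return (priority, role_name) for the highest-priority role held."""
    best = (float("inf"), "Other")
    for name in roles:
        best = min(best, (ROLE_PRIORITY.get(name, 999), name))
    return best

# Sort by role priority, then by name, as the canvas builder does.
people = [
    ("alice", ["Participant", "Scribe"]),
    ("bob", ["Observer"]),
    ("carol", ["Incident Commander"]),
]
ordered = sorted(people, key=lambda p: (primary_role(p[1])[0], p[0]))
```

Factoring this out would also remove the duplicated priority loop between the grouping and sorting passes above.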
+
+
+def _update_case_participants_canvas(case: Case, db_session: Session) -> bool:
+ """
+ Updates the participants canvas with current participant information.
+
+ Args:
+ case: The case to update the canvas for
+ db_session: Database session
+
+ Returns:
+ True if successful, False if failed
+ """
+ # Only update canvas for cases with dedicated channels
+ if not case.dedicated_channel:
+ log.debug(f"Skipping canvas update for case {case.id} - no dedicated channel")
+ return False
+
+ # Check if case has a conversation
+ if not case.conversation:
+ log.debug(f"Skipping canvas update for case {case.id} - no conversation")
+ return False
+
+ # Check if conversation has a channel_id
+ if not case.conversation.channel_id:
+ log.debug(f"Skipping canvas update for case {case.id} - no channel_id")
+ return False
+
+ try:
+ # Get the existing canvas record by case and type
+ canvas = (
+ db_session.query(Canvas)
+ .filter(Canvas.case_id == case.id, Canvas.type == CanvasType.participants)
+ .first()
+ )
+
+ if not canvas:
+ log.warning(f"No participants canvas found for case {case.id}, skipping update")
+ return False
+
+ # Get the Slack plugin instance
+ slack_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=case.project.id, plugin_type="conversation"
+ )
+ if not slack_plugin:
+ log.warning(f"Participants canvas for case {case.id} not updated. No conversation plugin enabled.")
+ return False
+
+ # Build the updated table content
+ table_content = _build_case_participants_table(case, db_session)
+
+ # Update the canvas
+ success = slack_plugin.instance.edit_canvas(
+ canvas_id=canvas.canvas_id, content=table_content
+ )
+
+ if success:
+ log.info(f"Updated participants canvas {canvas.canvas_id} for case {case.id}")
+ else:
+ log.error(f"Failed to update participants canvas {canvas.canvas_id} for case {case.id}")
+
+ return success
+
+ except Exception as e:
+ log.exception(f"Error updating participants canvas for case {case.id}: {e}")
+ return False
+
+
+def _build_case_participants_table(case: Case, db_session: Session) -> str:
+ """
+ Builds markdown content listing participants for the canvas, grouped by primary
+ role and sorted by role priority.
+
+ Args:
+ case: The case to build the table for
+ db_session: Database session
+
+ Returns:
+ Markdown content string
+ """
+ # Get all participants for the case
+ participants = db_session.query(Participant).filter(Participant.case_id == case.id).all()
+
+ if not participants:
+ return "# Participants\n\nNo participants have been added to this case yet."
+
+ # Define role priority for sorting (lower number = higher priority)
+ role_priority = {
+ "Assignee": 1,
+ "Reporter": 2,
+ "Participant": 3,
+ "Observer": 4,
+ }
+
+ # Filter out inactive participants and sort by role priority
+ active_participants = []
+ for participant in participants:
+ if participant.active_roles:
+ # Get the highest priority role for this participant
+ highest_priority = float("inf")
+ primary_role = "Other"
+
+ for role in participant.active_roles:
+ # role.role is already a string (role name), not an object
+ role_name = role.role if role.role else "Other"
+ priority = role_priority.get(role_name, 999) # Default to low priority
+ if priority < highest_priority:
+ highest_priority = priority
+ primary_role = role_name
+
+ active_participants.append((participant, highest_priority, primary_role))
+
+ # Sort by priority, then by name
+ active_participants.sort(
+ key=lambda x: (x[1], x[0].individual.name if x[0].individual else "Unknown")
+ )
+
+ # Extract just the participants in sorted order
+ sorted_participants = [p[0] for p in active_participants]
+
+ if not sorted_participants:
+ return "# Participants\n\nNo active participants found for this case."
+
+ # Build the content
+ content = f"# Participants ({len(participants)} total)\n\n"
+
+ # Group participants by their primary role (already computed while sorting above)
+ participants_by_role = {}
+ for participant, _priority, primary_role in active_participants:
+ participants_by_role.setdefault(primary_role, []).append(participant)
+
+ # Add participants grouped by role
+ for role_name in [
+ "Assignee",
+ "Reporter",
+ "Participant",
+ "Observer",
+ "Other",
+ ]:
+ if role_name in participants_by_role:
+ participants_count = len(participants_by_role[role_name])
+ # Add "s" only if there are multiple participants in this role
+ heading = f"## {role_name}{'s' if participants_count > 1 else ''}\n\n"
+ content += heading
+ for participant in participants_by_role[role_name]:
+ name = participant.individual.name if participant.individual else "Unknown"
+ team = participant.team or "Unknown"
+ location = participant.location or "Unknown"
+ content += f"* **{name}** - {team} - {location}\n"
+ content += "\n"
+
+ return content
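
The role-grouping and pluralization logic in `_build_case_participants_table` can be sketched in isolation. The participant data below is hypothetical, and `defaultdict` stands in for the explicit dict membership checks:

```python
from collections import defaultdict

# Hypothetical stand-in data: (name, active role names) pairs.
participants = [
    ("alice", ["Participant", "Assignee"]),
    ("bob", ["Observer"]),
    ("carol", ["Reporter"]),
    ("dave", []),  # no active roles -> excluded
]

# Lower number = higher priority, mirroring the role_priority dict above.
role_priority = {"Assignee": 1, "Reporter": 2, "Participant": 3, "Observer": 4}

grouped = defaultdict(list)
for name, roles in participants:
    if not roles:
        continue  # skip inactive participants
    # the highest-priority (lowest-numbered) role becomes the primary role
    primary = min(roles, key=lambda r: role_priority.get(r, 999))
    grouped[primary].append(name)

# headings pluralize only when a role has multiple members
for role in ["Assignee", "Reporter", "Participant", "Observer", "Other"]:
    members = grouped.get(role)
    if members:
        print(f"## {role}{'s' if len(members) > 1 else ''}: {', '.join(members)}")
```

Note that a participant holding both "Participant" and "Assignee" is listed only once, under "Assignee".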
diff --git a/src/dispatch/canvas/models.py b/src/dispatch/canvas/models.py
new file mode 100644
index 000000000000..e1bfcd126410
--- /dev/null
+++ b/src/dispatch/canvas/models.py
@@ -0,0 +1,55 @@
+"""Models and schemas for the Dispatch canvas management system."""
+
+from datetime import datetime
+from typing import Optional
+from pydantic import Field
+from sqlalchemy import Column, ForeignKey, Integer, String
+from sqlalchemy.orm import relationship
+
+from dispatch.database.core import Base
+from dispatch.models import DispatchBase, PrimaryKey, ProjectMixin, TimeStampMixin
+
+
+class Canvas(Base, TimeStampMixin, ProjectMixin):
+ """SQLAlchemy model for a Canvas, representing a Slack canvas in the system."""
+
+ id = Column(Integer, primary_key=True)
+ canvas_id = Column(String, nullable=False) # Slack canvas ID
+ incident_id = Column(Integer, ForeignKey("incident.id", ondelete="CASCADE"), nullable=True)
+ case_id = Column(Integer, ForeignKey("case.id", ondelete="CASCADE"), nullable=True)
+ type = Column(String, nullable=False) # CanvasType enum value
+
+ # Relationships
+ incident = relationship("Incident", back_populates="canvases")
+ case = relationship("Case", back_populates="canvases")
+
+
+# Pydantic models
+class CanvasBase(DispatchBase):
+ """Base Pydantic model for canvas-related fields."""
+
+ canvas_id: str = Field(..., description="The Slack canvas ID")
+ incident_id: Optional[int] = Field(None, description="The associated incident ID")
+ case_id: Optional[int] = Field(None, description="The associated case ID")
+ type: str = Field(..., description="The type of canvas")
+
+
+class CanvasCreate(CanvasBase):
+ """Pydantic model for creating a new canvas."""
+
+ project_id: int = Field(..., description="The project ID")
+
+
+class CanvasUpdate(CanvasBase):
+ """Pydantic model for updating an existing canvas."""
+
+
+class CanvasRead(CanvasBase):
+ """Pydantic model for reading canvas data."""
+
+ id: PrimaryKey
+ created_at: datetime
+ updated_at: datetime
+ project_id: int
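
The `update` service below relies on dumping only explicitly-set fields so a partial update doesn't clobber the rest of the record. A minimal stdlib sketch of that idea, using a sentinel in place of pydantic's unset-field tracking (names here are illustrative, not the real schema):

```python
from dataclasses import dataclass

_UNSET = object()  # sentinel standing in for pydantic's "unset" tracking


@dataclass
class CanvasUpdateSketch:
    canvas_id: object = _UNSET
    type: object = _UNSET


def apply_update(record: dict, update: CanvasUpdateSketch) -> dict:
    # copy only the fields the caller explicitly set,
    # analogous to model_dump(exclude_unset=True)
    for field in ("canvas_id", "type"):
        value = getattr(update, field)
        if value is not _UNSET:
            record[field] = value
    return record


record = {"canvas_id": "F123", "type": "participants"}
apply_update(record, CanvasUpdateSketch(type="status"))
```

Only `type` changes; `canvas_id` survives because it was never set on the update object.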
diff --git a/src/dispatch/canvas/service.py b/src/dispatch/canvas/service.py
new file mode 100644
index 000000000000..b93b120be638
--- /dev/null
+++ b/src/dispatch/canvas/service.py
@@ -0,0 +1,149 @@
+"""Service functions for canvas management."""
+
+import logging
+from typing import Optional
+from sqlalchemy.orm import Session
+
+from dispatch.case.models import Case
+from dispatch.incident.models import Incident
+
+from .models import Canvas, CanvasCreate, CanvasUpdate
+
+log = logging.getLogger(__name__)
+
+
+def get(*, db_session: Session, canvas_id: int) -> Optional[Canvas]:
+ """Returns a canvas based on the given id."""
+ return db_session.query(Canvas).filter(Canvas.id == canvas_id).first()
+
+
+def get_by_canvas_id(*, db_session: Session, slack_canvas_id: str) -> Optional[Canvas]:
+ """Returns a canvas based on the Slack canvas ID."""
+ return db_session.query(Canvas).filter(Canvas.canvas_id == slack_canvas_id).first()
+
+
+def get_by_incident(*, db_session: Session, incident_id: int) -> list[Canvas]:
+ """Returns all canvases associated with an incident."""
+ return db_session.query(Canvas).filter(Canvas.incident_id == incident_id).all()
+
+
+def get_by_case(*, db_session: Session, case_id: int) -> list[Canvas]:
+ """Returns all canvases associated with a case."""
+ return db_session.query(Canvas).filter(Canvas.case_id == case_id).all()
+
+
+def get_by_project(*, db_session: Session, project_id: int) -> list[Canvas]:
+ """Returns all canvases for a project."""
+ return db_session.query(Canvas).filter(Canvas.project_id == project_id).all()
+
+
+def get_by_type(*, db_session: Session, project_id: int, canvas_type: str) -> list[Canvas]:
+ """Returns all canvases of a specific type for a project."""
+ return (
+ db_session.query(Canvas)
+ .filter(Canvas.project_id == project_id)
+ .filter(Canvas.type == canvas_type)
+ .all()
+ )
+
+
+def create(*, db_session: Session, canvas_in: CanvasCreate) -> Canvas:
+ """Creates a new canvas."""
+ canvas = Canvas(
+ canvas_id=canvas_in.canvas_id,
+ incident_id=canvas_in.incident_id,
+ case_id=canvas_in.case_id,
+ type=canvas_in.type,
+ project_id=canvas_in.project_id,
+ )
+ db_session.add(canvas)
+ db_session.commit()
+ return canvas
+
+
+def update(*, db_session: Session, canvas_id: int, canvas_in: CanvasUpdate) -> Optional[Canvas]:
+ """Updates an existing canvas."""
+ canvas = get(db_session=db_session, canvas_id=canvas_id)
+ if not canvas:
+ log.error(f"Canvas with id {canvas_id} not found")
+ return None
+
+ update_data = canvas_in.model_dump(exclude_unset=True)
+
+ for field, value in update_data.items():
+ setattr(canvas, field, value)
+
+ db_session.add(canvas)
+ db_session.commit()
+ db_session.refresh(canvas)
+ return canvas
+
+
+def delete(*, db_session: Session, canvas_id: int) -> bool:
+ """Deletes a canvas."""
+ canvas = get(db_session=db_session, canvas_id=canvas_id)
+ if not canvas:
+ return False
+
+ db_session.delete(canvas)
+ db_session.commit()
+ return True
+
+
+def delete_by_slack_canvas_id(*, db_session: Session, slack_canvas_id: str) -> bool:
+ """Deletes a canvas by its Slack canvas ID."""
+ canvas = get_by_canvas_id(db_session=db_session, slack_canvas_id=slack_canvas_id)
+ if not canvas:
+ return False
+
+ db_session.delete(canvas)
+ db_session.commit()
+ return True
+
+
+def get_or_create_by_incident(
+ *, db_session: Session, incident: Incident, canvas_type: str, slack_canvas_id: str
+) -> Canvas:
+ """Gets an existing canvas for an incident and type, or creates a new one."""
+ canvas = (
+ db_session.query(Canvas)
+ .filter(Canvas.incident_id == incident.id)
+ .filter(Canvas.type == canvas_type)
+ .first()
+ )
+
+ if not canvas:
+ canvas_in = CanvasCreate(
+ canvas_id=slack_canvas_id,
+ incident_id=incident.id,
+ case_id=None,
+ type=canvas_type,
+ project_id=incident.project_id,
+ )
+ canvas = create(db_session=db_session, canvas_in=canvas_in)
+
+ return canvas
+
+
+def get_or_create_by_case(
+ *, db_session: Session, case: Case, canvas_type: str, slack_canvas_id: str
+) -> Canvas:
+ """Gets an existing canvas for a case and type, or creates a new one."""
+ canvas = (
+ db_session.query(Canvas)
+ .filter(Canvas.case_id == case.id)
+ .filter(Canvas.type == canvas_type)
+ .first()
+ )
+
+ if not canvas:
+ canvas_in = CanvasCreate(
+ canvas_id=slack_canvas_id,
+ incident_id=None,
+ case_id=case.id,
+ type=canvas_type,
+ project_id=case.project_id,
+ )
+ canvas = create(db_session=db_session, canvas_in=canvas_in)
+
+ return canvas
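
The `get_or_create_by_*` helpers are idempotent per (subject, type) pair: repeated calls return the existing row rather than inserting duplicates. A rough stdlib `sqlite3` sketch of the same pattern (table and column names are illustrative, not the real schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE canvas (id INTEGER PRIMARY KEY, canvas_id TEXT, case_id INTEGER, type TEXT)"
)


def get_or_create(case_id: int, canvas_type: str, slack_canvas_id: str) -> int:
    # look up an existing canvas for this case and type first
    row = conn.execute(
        "SELECT id FROM canvas WHERE case_id = ? AND type = ?", (case_id, canvas_type)
    ).fetchone()
    if row:
        return row[0]
    # otherwise create one, mirroring get_or_create_by_case
    cur = conn.execute(
        "INSERT INTO canvas (canvas_id, case_id, type) VALUES (?, ?, ?)",
        (slack_canvas_id, case_id, canvas_type),
    )
    conn.commit()
    return cur.lastrowid


first = get_or_create(1, "participants", "F123")
second = get_or_create(1, "participants", "F999")  # same (case, type) -> same row
```

The second call returns the first row's id and the new Slack canvas id is ignored, which matches the service's behavior of keeping the originally stored `canvas_id`.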
diff --git a/src/dispatch/incident_type/__init__.py b/src/dispatch/case/__init__.py
similarity index 100%
rename from src/dispatch/incident_type/__init__.py
rename to src/dispatch/case/__init__.py
diff --git a/src/dispatch/case/enums.py b/src/dispatch/case/enums.py
new file mode 100644
index 000000000000..14e03c83e3f4
--- /dev/null
+++ b/src/dispatch/case/enums.py
@@ -0,0 +1,65 @@
+from dispatch.enums import DispatchEnum
+
+
+class CaseStatus(DispatchEnum):
+ new = "New"
+ triage = "Triage"
+ escalated = "Escalated"
+ stable = "Stable"
+ closed = "Closed"
+
+
+class CaseResolutionReason(DispatchEnum):
+ benign = "Benign"
+ contained = "Contained"
+ escalated = "Escalated"
+ false_positive = "False Positive"
+ information_gathered = "Information Gathered"
+ insufficient_information = "Insufficient Information"
+ mitigated = "Mitigated"
+ operational_error = "Operational Error"
+ policy_violation = "Policy Violation"
+ user_acknowledged = "User Acknowledged"
+
+
+class CaseResolutionReasonDescription(DispatchEnum):
+ """Descriptions for case resolution reasons."""
+
+ benign = (
+ "The event was legitimate but posed no security threat, such as expected behavior "
+ "from a known application or user."
+ )
+ contained = (
+ "(True positive) The event was a legitimate threat but was contained to prevent "
+ "further spread or damage."
+ )
+ escalated = "There was enough information to create an incident based on the security event."
+ false_positive = "The event was incorrectly flagged as a security event."
+ information_gathered = (
+ "Used when a case was opened with the primary purpose of collecting information."
+ )
+ insufficient_information = (
+ "There was not enough information to determine the nature of the event conclusively."
+ )
+ mitigated = (
+ "(True Positive) The event was a legitimate security threat and was successfully "
+ "mitigated before causing harm."
+ )
+ operational_error = (
+ "The event was caused by a mistake in system configuration or user operation, "
+ "not malicious activity."
+ )
+ policy_violation = (
+ "The event was a breach of internal security policies but did not result in a "
+ "security incident."
+ )
+ user_acknowledged = (
+ "While the event was suspicious it was confirmed by the actor to be intentional."
+ )
+
+
+class CostModelType(DispatchEnum):
+ """Type of cost model used to calculate costs."""
+
+ new = "New"
+ classic = "Classic"
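
These enums carry human-readable string values, which the flows below compare against directly (e.g. `case.status != CaseStatus.closed`). A small sketch of that behavior, assuming `DispatchEnum` subclasses both `str` and `Enum`:

```python
from enum import Enum


class CaseStatus(str, Enum):  # stand-in for DispatchEnum
    new = "New"
    triage = "Triage"
    escalated = "Escalated"
    stable = "Stable"
    closed = "Closed"


# direct string comparison works because the enum subclasses str
status = CaseStatus.closed
is_closed = status == "Closed"

# members can also be looked up by their stored value
from_value = CaseStatus("Triage")
```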
diff --git a/src/dispatch/case/flows.py b/src/dispatch/case/flows.py
new file mode 100644
index 000000000000..56b5bbe0fd2a
--- /dev/null
+++ b/src/dispatch/case/flows.py
@@ -0,0 +1,1449 @@
+import logging
+from datetime import datetime
+
+from sqlalchemy.orm import Session
+
+from dispatch.case import service as case_service
+from dispatch.case.messaging import send_case_welcome_participant_message
+from dispatch.case.models import CaseRead
+from dispatch.conversation import flows as conversation_flows
+from dispatch.decorators import background_task
+from dispatch.document import flows as document_flows
+from dispatch.email_templates import service as email_template_service
+from dispatch.email_templates.enums import EmailTemplateTypes
+from dispatch.enums import DocumentResourceTypes, EventType, Visibility
+from dispatch.event import service as event_service
+from dispatch.group import flows as group_flows
+from dispatch.group.enums import GroupAction, GroupType
+from dispatch.incident import flows as incident_flows
+from dispatch.incident import service as incident_service
+from dispatch.incident.enums import IncidentStatus
+from dispatch.incident.messaging import send_participant_announcement_message
+from dispatch.incident.models import Incident, IncidentCreate
+from dispatch.incident.priority.models import IncidentPriority
+from dispatch.incident.type.models import IncidentType
+from dispatch.individual import service as individual_service
+from dispatch.individual.models import IndividualContactRead
+from dispatch.models import OrganizationSlug, PrimaryKey
+from dispatch.participant import flows as participant_flows
+from dispatch.participant import service as participant_service
+from dispatch.participant.models import ParticipantUpdate
+from dispatch.participant_role import flows as role_flow
+from dispatch.participant_role.models import ParticipantRole, ParticipantRoleType
+from dispatch.plugin import service as plugin_service
+from dispatch.service import service as service_service
+from dispatch.storage import flows as storage_flows
+from dispatch.storage.enums import StorageAction
+from dispatch.ticket import flows as ticket_flows
+from dispatch.canvas import flows as canvas_flows
+
+from .enums import CaseResolutionReason, CaseStatus
+from .messaging import (
+ send_case_created_notifications,
+ send_case_rating_feedback_message,
+ send_case_update_notifications,
+ send_event_paging_message,
+ send_event_update_prompt_reminder,
+)
+from .models import Case
+from .service import get
+
+log = logging.getLogger(__name__)
+
+
+def get_case_participants_flow(case: Case, db_session: Session):
+ """Get additional case participants based on priority, type and description."""
+ individual_contacts = []
+ team_contacts = []
+
+ if case.visibility == Visibility.open:
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=case.project.id, plugin_type="participant"
+ )
+ if plugin:
+ individual_contacts, team_contacts = plugin.instance.get(
+ class_instance=case,
+ project_id=case.project.id,
+ db_session=db_session,
+ )
+
+ event_service.log_case_event(
+ db_session=db_session,
+ source=plugin.plugin.title,
+ description="Case participants resolved",
+ case_id=case.id,
+ )
+
+ return individual_contacts, team_contacts
+
+
+@background_task
+def case_add_or_reactivate_participant_flow(
+ user_email: str,
+ case_id: int,
+ participant_role: ParticipantRoleType = ParticipantRoleType.observer,
+ service_id: int = 0,
+ add_to_conversation: bool = True,
+ event: dict | None = None,
+ organization_slug: OrganizationSlug | None = None,
+ db_session=None,
+):
+ """Runs the case add or reactivate participant flow."""
+ case = case_service.get(db_session=db_session, case_id=case_id)
+
+ if service_id:
+ # we need to ensure that we don't add another member of a service if one
+ # already exists (e.g. overlapping oncalls, we assume they will hand-off if necessary)
+ participant = participant_service.get_by_case_id_and_service_id(
+ case_id=case_id, service_id=service_id, db_session=db_session
+ )
+
+ if participant:
+ log.debug("Skipping resolved participant. Oncall service member already engaged.")
+ return
+
+ participant = participant_service.get_by_case_id_and_email(
+ db_session=db_session, case_id=case.id, email=user_email
+ )
+ if participant:
+ if participant.active_roles:
+ return participant
+
+ if case.status != CaseStatus.closed:
+ # we reactivate the participant
+ participant_flows.reactivate_participant(
+ user_email, case, db_session, service_id=service_id
+ )
+ else:
+ # we add the participant to the case
+ participant = participant_flows.add_participant(
+ user_email, case, db_session, service_id=service_id, roles=[participant_role]
+ )
+ if case.tactical_group:
+ # we add the participant to the tactical group
+ group_flows.update_group(
+ subject=case,
+ group=case.tactical_group,
+ group_action=GroupAction.add_member,
+ group_member=participant.individual.email,
+ db_session=db_session,
+ )
+
+ if case.status != CaseStatus.closed:
+ # we add the participant to the conversation
+ if add_to_conversation:
+ conversation_flows.add_case_participants(
+ case, [participant.individual.email], db_session
+ )
+
+ # check to see if there is an override welcome message template
+ welcome_template = email_template_service.get_by_type(
+ db_session=db_session,
+ project_id=case.project_id,
+ email_template_type=EmailTemplateTypes.case_welcome,
+ )
+
+ send_case_welcome_participant_message(
+ participant_email=user_email,
+ case=case,
+ db_session=db_session,
+ welcome_template=welcome_template,
+ )
+
+ # Update the participants canvas since a new participant was added
+ try:
+ canvas_flows.update_participants_canvas(case=case, db_session=db_session)
+ log.info(f"Updated participants canvas for case {case.id} after adding {user_email}")
+ except Exception as e:
+ log.exception(f"Failed to update participants canvas for case {case.id}: {e}")
+
+ return participant
+
+
+@background_task
+def case_remove_participant_flow(
+ user_email: str,
+ case_id: int,
+ db_session: Session,
+):
+ """Runs the remove participant flow."""
+ case = case_service.get(db_session=db_session, case_id=case_id)
+
+ if not case:
+ log.warning(
+ f"Unable to remove participant from case with id {case_id}. A case with this id does not exist."
+ )
+ return
+
+ # we remove the participant from the incident
+ participant_flows.remove_case_participant(
+ user_email=user_email,
+ case=case,
+ db_session=db_session,
+ )
+
+ # we remove the participant from the tactical group
+ group_flows.update_group(
+ subject=case,
+ group=case.tactical_group,
+ group_action=GroupAction.remove_member,
+ group_member=user_email,
+ db_session=db_session,
+ )
+
+ # Update the participants canvas since a participant was removed
+ try:
+ canvas_flows.update_participants_canvas(case=case, db_session=db_session)
+ log.info(f"Updated participants canvas for case {case.id} after removing {user_email}")
+ except Exception as e:
+ log.exception(f"Failed to update participants canvas for case {case.id}: {e}")
+
+ # we also try to remove the user from the Slack conversation
+ slack_conversation_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=case.project.id, plugin_type="conversation"
+ )
+
+ if not slack_conversation_plugin:
+ log.warning(f"{user_email} not removed from conversation. No conversation plugin enabled.")
+ return
+
+ if not case.conversation:
+ log.warning("No conversation enabled for this case.")
+ return
+
+ try:
+ slack_conversation_plugin.instance.remove_user(
+ conversation_id=case.conversation.channel_id, user_email=user_email
+ )
+
+ event_service.log_case_event(
+ db_session=db_session,
+ source=slack_conversation_plugin.plugin.title,
+ description=f"{user_email} removed from conversation (channel ID: {case.conversation.channel_id})",
+ case_id=case.id,
+ type=EventType.participant_updated,
+ )
+
+ log.info(
+ f"Removed {user_email} from conversation in channel {case.conversation.channel_id}"
+ )
+
+ except Exception as e:
+ log.exception(f"Failed to remove user from Slack conversation: {e}")
+
+
+def update_conversation(case: Case, db_session: Session) -> None:
+ """Updates external communication conversation."""
+
+ # if no case conversation or case has dedicated channel, there's no thread to update
+ if case.conversation is None or case.conversation.thread_id is None:
+ return
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=case.project.id, plugin_type="conversation"
+ )
+ if not plugin:
+ log.warning("Case conversation not updated. No conversation plugin enabled.")
+ return
+
+ plugin.instance.update_thread(
+ case=case, conversation_id=case.conversation.channel_id, ts=case.conversation.thread_id
+ )
+
+ event_service.log_case_event(
+ db_session=db_session,
+ source=plugin.plugin.title,
+ description="Case conversation updated.",
+ case_id=case.id,
+ )
+
+
+def case_auto_close_flow(case: Case, db_session: Session):
+ """Runs the case auto close flow."""
+ # we mark the case as closed
+ case.resolution = "Auto closed via case type auto close configuration."
+ case.resolution_reason = CaseResolutionReason.user_acknowledged
+ case.status = CaseStatus.closed
+ db_session.add(case)
+ db_session.commit()
+
+ # we transition the case from the new to the closed state
+ case_triage_status_flow(
+ case=case,
+ db_session=db_session,
+ )
+ case_closed_status_flow(
+ case=case,
+ db_session=db_session,
+ )
+
+ if case.conversation and case.has_thread:
+ # we update the case conversation
+ update_conversation(case=case, db_session=db_session)
+
+
+def case_new_create_flow(
+ *,
+ case_id: int,
+ organization_slug: str | None = None,
+ conversation_target: str | None = None,
+ service_id: int | None = None,
+ db_session: Session,
+ create_all_resources: bool = True,
+):
+ """Runs the case new creation flow."""
+ # we get the case
+ case = get(db_session=db_session, case_id=case_id)
+
+ # we create the ticket
+ ticket_flows.create_case_ticket(case=case, db_session=db_session)
+
+ # we resolve participants
+ individual_participants, team_participants = get_case_participants_flow(
+ case=case, db_session=db_session
+ )
+
+ # NOTE: we create all external resources for a Case unless it's
+ # created from a Signal, as it gets expensive when we have lots of them.
+ case_create_resources_flow(
+ db_session=db_session,
+ case_id=case.id,
+ individual_participants=individual_participants,
+ team_participants=team_participants,
+ conversation_target=conversation_target or "",
+ create_all_resources=create_all_resources,
+ )
+
+ db_session.add(case)
+ db_session.commit()
+
+ if case.dedicated_channel:
+ send_case_created_notifications(case, db_session)
+
+ if case.case_priority.page_assignee:
+ if not service_id:
+ if case.case_type.oncall_service:
+ service_id = case.case_type.oncall_service.external_id
+ else:
+ log.warning(
+ "Case assignee not paged. No relationship between case type and an oncall service."
+ )
+ return case
+
+ oncall_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=case.project.id, plugin_type="oncall"
+ )
+ if oncall_plugin:
+ oncall_plugin.instance.page(
+ service_id=service_id,
+ incident_name=case.name,
+ incident_title=case.title,
+ incident_description=case.description,
+ )
+ else:
+ log.warning("Case assignee not paged. No plugin of type oncall enabled.")
+ return case
+ elif case.event:
+ # no one has been paged, inform the channel that they can
+ # engage the oncall if the priority changes
+ oncall_name = "the relevant team"
+
+ try:
+ if case.case_type and case.case_type.oncall_service:
+ oncall_service = service_service.get_by_external_id(
+ db_session=db_session,
+ external_id=case.case_type.oncall_service.external_id,
+ )
+ oncall_name = oncall_service.name
+ except Exception as e:
+ log.error(
+ f"Failed to get oncall service: {e}. Falling back to default oncall_name string."
+ )
+
+ # send a message to the channel to inform them that they can engage the oncall
+ send_event_paging_message(case, db_session, oncall_name)
+
+ # send ephemeral message to assignee to update the security event
+ send_event_update_prompt_reminder(case, db_session)
+
+ if case and case.case_type.auto_close:
+ # we transition the case to the closed state if its case type has auto close enabled
+ case_auto_close_flow(case=case, db_session=db_session)
+
+ if case.assignee and case.assignee.individual:
+ group_flows.update_group(
+ subject=case,
+ group=case.tactical_group,
+ group_action=GroupAction.add_member,
+ group_member=case.assignee.individual.email,
+ db_session=db_session,
+ )
+
+ if case.reporter and case.reporter.individual:
+ group_flows.update_group(
+ subject=case,
+ group=case.tactical_group,
+ group_action=GroupAction.add_member,
+ group_member=case.reporter.individual.email,
+ db_session=db_session,
+ )
+
+ return case
+
+
+@background_task
+def case_triage_create_flow(*, case_id: int, organization_slug: OrganizationSlug, db_session=None):
+ """Runs the case triage creation flow."""
+ # we run the case new creation flow
+ case_new_create_flow(
+ case_id=case_id, organization_slug=organization_slug, db_session=db_session
+ )
+
+ # we get the case
+ case = get(db_session=db_session, case_id=case_id)
+
+ # we transition the case to the triage state
+ case_triage_status_flow(case=case, db_session=db_session)
+
+
+@background_task
+def case_stable_create_flow(*, case_id: int, organization_slug: OrganizationSlug, db_session=None):
+ """Runs the case stable create flow."""
+ # we run the case new creation flow
+ case_new_create_flow(
+ case_id=case_id, organization_slug=organization_slug, db_session=db_session
+ )
+
+ # we get the case
+ case = get(db_session=db_session, case_id=case_id)
+
+ # we transition the case to the triage state
+ case_triage_status_flow(case=case, db_session=db_session)
+
+ # then to the stable state
+ case_stable_status_flow(case=case, db_session=db_session)
+
+
+@background_task
+def case_escalated_create_flow(
+ *, case_id: int, organization_slug: OrganizationSlug, db_session=None
+):
+ """Runs the case escalated create flow."""
+ # we run the case new creation flow
+ case_new_create_flow(
+ case_id=case_id, organization_slug=organization_slug, db_session=db_session
+ )
+
+ # we get the case
+ case = get(db_session=db_session, case_id=case_id)
+
+ # we transition the case to the triage state
+ case_triage_status_flow(case=case, db_session=db_session)
+
+ # then to the stable state
+ case_stable_status_flow(case=case, db_session=db_session)
+
+ case_escalated_status_flow(
+ case=case,
+ organization_slug=organization_slug,
+ db_session=db_session,
+ )
+
+
+@background_task
+def case_closed_create_flow(*, case_id: int, organization_slug: OrganizationSlug, db_session=None):
+ """Runs the case closed creation flow."""
+ # we run the case new creation flow
+ case_new_create_flow(
+ case_id=case_id, organization_slug=organization_slug, db_session=db_session
+ )
+
+ # we get the case
+ case = get(db_session=db_session, case_id=case_id)
+
+ # we transition the case to the triage state
+ case_triage_status_flow(case=case, db_session=db_session)
+
+ # we transition the case to the closed state
+ case_closed_status_flow(case=case, db_session=db_session)
+
+
+def case_details_changed(case: Case, previous_case: CaseRead) -> bool:
+ """Checks if the case details have changed."""
+ return (
+ case.case_type.name != previous_case.case_type.name
+ or case.case_severity.name != previous_case.case_severity.name
+ or case.case_priority.name != previous_case.case_priority.name
+ or case.status != previous_case.status
+ )
+
+
+@background_task
+def case_update_flow(
+ *,
+ case_id: int,
+ previous_case: CaseRead,
+ reporter_email: str | None,
+ assignee_email: str | None,
+ organization_slug: OrganizationSlug,
+ db_session=None,
+):
+ """Runs the case update flow."""
+ # we get the case
+ case = get(db_session=db_session, case_id=case_id)
+
+ if not case:
+ log.warning(f"Case with id {case_id} not found.")
+ return
+
+ if reporter_email and case.reporter and reporter_email != case.reporter.individual.email:
+ # we run the case assign role flow for the reporter if it changed
+ case_assign_role_flow(
+ case_id=case.id,
+ participant_email=reporter_email,
+ participant_role=ParticipantRoleType.reporter,
+ db_session=db_session,
+ )
+
+ if assignee_email and case.assignee and assignee_email != case.assignee.individual.email:
+ # we run the case assign role flow for the assignee if it changed
+ case_assign_role_flow(
+ case_id=case.id,
+ participant_email=assignee_email,
+ participant_role=ParticipantRoleType.assignee,
+ db_session=db_session,
+ )
+
+ # we run the transition flow based on the current and previous status of the case
+ case_status_transition_flow_dispatcher(
+ case=case,
+ current_status=case.status,
+ previous_status=previous_case.status,
+ organization_slug=organization_slug,
+ db_session=db_session,
+ )
+
+ # we update the ticket
+ ticket_flows.update_case_ticket(case=case, db_session=db_session)
+
+ if case.status in [CaseStatus.escalated, CaseStatus.closed] and case.case_document:
+ # we update the document
+ document_flows.update_document(
+ document=case.case_document, project_id=case.project.id, db_session=db_session
+ )
+
+ if case.tactical_group:
+ # we update the tactical group
+ if reporter_email and case.reporter and reporter_email != case.reporter.individual.email:
+ group_flows.update_group(
+ subject=case,
+ group=case.tactical_group,
+ group_action=GroupAction.add_member,
+ group_member=reporter_email,
+ db_session=db_session,
+ )
+ if assignee_email and case.assignee and assignee_email != case.assignee.individual.email:
+ group_flows.update_group(
+ subject=case,
+ group=case.tactical_group,
+ group_action=GroupAction.add_member,
+ group_member=assignee_email,
+ db_session=db_session,
+ )
+
+ if case.conversation and case.has_thread:
+ # we send the case updated notification
+ update_conversation(case, db_session)
+
+ if (
+ case.has_channel
+ and not case.has_thread
+ and case.status not in [CaseStatus.escalated, CaseStatus.closed]
+ ):
+ # determine if case channel topic needs to be updated
+ if case_details_changed(case, previous_case):
+ conversation_flows.set_conversation_topic(case, db_session)
+
+ # we send the case update notifications
+ if case.dedicated_channel:
+ send_case_update_notifications(case, previous_case, db_session)
+
+
+def case_delete_flow(case: Case, db_session: Session):
+ """Runs the case delete flow."""
+ # we delete the external ticket
+ if case.ticket:
+ ticket_flows.delete_ticket(
+ ticket=case.ticket, project_id=case.project.id, db_session=db_session
+ )
+
+ # we delete the external groups
+ if case.groups:
+ for group in case.groups:
+ group_flows.delete_group(group=group, project_id=case.project.id, db_session=db_session)
+
+ # we delete the external storage
+ if case.storage:
+ storage_flows.delete_storage(
+ storage=case.storage, project_id=case.project.id, db_session=db_session
+ )
+
+
+def case_new_status_flow(case: Case, db_session=None):
+ """Runs the case new transition flow."""
+ pass
+
+
+def case_triage_status_flow(case: Case, db_session=None):
+ """Runs the case triage transition flow."""
+ # we set the triage_at time during transitions if not already set
+ if not case.triage_at:
+ case.triage_at = datetime.utcnow()
+ db_session.add(case)
+ db_session.commit()
+
+
+def case_escalated_status_flow(
+ case: Case,
+ organization_slug: OrganizationSlug,
+ db_session: Session,
+ title: str | None = None,
+ incident_priority: IncidentPriority | None = None,
+ incident_type: IncidentType | None = None,
+ incident_description: str | None = None,
+):
+ """Runs the case escalated status flow."""
+ # we set the escalated at time
+ case.escalated_at = datetime.utcnow()
+ db_session.add(case)
+ db_session.commit()
+
+ event_service.log_case_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description="Case escalated",
+ case_id=case.id,
+ )
+
+ case_to_incident_escalate_flow(
+ case=case,
+ organization_slug=organization_slug,
+ db_session=db_session,
+ title=title,
+ incident_priority=incident_priority,
+ incident_type=incident_type,
+ incident_description=incident_description,
+ )
+
+
+def case_stable_status_flow(case: Case, db_session=None):
+ """Runs the case stable status flow."""
+ # we set the stable at time
+ case.stable_at = datetime.utcnow()
+ db_session.add(case)
+ db_session.commit()
+
+ event_service.log_case_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description="Case marked as stable",
+ case_id=case.id,
+ )
+
+
+def case_closed_status_flow(case: Case, db_session=None):
+ """Runs the case closed transition flow."""
+ # we set the closed_at time
+ case.closed_at = datetime.utcnow()
+ db_session.add(case)
+ db_session.commit()
+
+ # Archive the conversation if there is a dedicated channel
+ if case.dedicated_channel:
+ conversation_flows.archive_conversation(subject=case, db_session=db_session)
+
+ # Check if the case visibility is open
+ if case.visibility != Visibility.open:
+ return
+
+ # Get the active storage plugin for the case's project
+ storage_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=case.project.id, plugin_type="storage"
+ )
+
+ if not storage_plugin:
+ return
+
+ # we update the ticket
+ ticket_flows.update_case_ticket(case=case, db_session=db_session)
+
+ # Open document access if configured
+ if storage_plugin.configuration.open_on_close:
+ for document in case.documents:
+ document_flows.open_document_access(document=document, db_session=db_session)
+
+ # Mark documents as read-only if configured
+ if storage_plugin.configuration.read_only:
+ for document in case.documents:
+ document_flows.mark_document_as_readonly(document=document, db_session=db_session)
+
+ if case.dedicated_channel:
+ # we send a direct message to all participants asking them
+ # to rate and provide feedback about the case
+ send_case_rating_feedback_message(case, db_session)
+
+
+def reactivate_case_participants(case: Case, db_session: Session):
+ """Reactivates all case participants."""
+ for participant in case.participants:
+ try:
+ case_add_or_reactivate_participant_flow(
+ participant.individual.email, case.id, db_session=db_session
+ )
+ except Exception as e:
+ # don't fail to reactivate all participants if one fails
+ event_service.log_case_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"Unable to reactivate participant with email {participant.individual.email}",
+ case_id=case.id,
+ type=EventType.participant_updated,
+ )
+ log.exception(e)
+
+ event_service.log_case_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description="Case participants reactivated",
+ case_id=case.id,
+ type=EventType.participant_updated,
+ )
+
+
+def case_active_status_flow(case: Case, db_session: Session) -> None:
+ """Runs the case active flow."""
+ # we un-archive the conversation
+ if case.dedicated_channel:
+ conversation_flows.unarchive_conversation(subject=case, db_session=db_session)
+ reactivate_case_participants(case, db_session)
+
+
+def case_status_transition_flow_dispatcher(
+ case: Case,
+ current_status: CaseStatus,
+ previous_status: CaseStatus,
+ organization_slug: OrganizationSlug,
+ db_session: Session,
+):
+ """Runs the correct flows based on the current and previous status of the case."""
+ log.info(
+ "Transitioning Case status",
+ extra={
+ "case_id": case.id,
+ "previous_status": previous_status,
+ "current_status": current_status,
+ },
+ )
+ match (previous_status, current_status):
+ case (CaseStatus.closed, CaseStatus.new):
+ # Closed -> New
+ case_active_status_flow(case, db_session)
+
+ case (_, CaseStatus.new):
+ # Any -> New
+ pass
+
+ case (CaseStatus.new, CaseStatus.triage):
+ # New -> Triage
+ case_triage_status_flow(
+ case=case,
+ db_session=db_session,
+ )
+
+ case (CaseStatus.closed, CaseStatus.triage):
+ # Closed -> Triage
+ case_active_status_flow(case, db_session)
+ case_triage_status_flow(
+ case=case,
+ db_session=db_session,
+ )
+
+ case (_, CaseStatus.triage):
+ # Any -> Triage
+ log.warning(
+ "Unexpected previous state for Case transition to Triage state.",
+ extra={
+ "case_id": case.id,
+ "previous_status": previous_status,
+ "current_status": current_status,
+ },
+ )
+
+ case (CaseStatus.new, CaseStatus.escalated):
+ # New -> Escalated
+ case_triage_status_flow(
+ case=case,
+ db_session=db_session,
+ )
+ case_escalated_status_flow(
+ case=case,
+ organization_slug=organization_slug,
+ db_session=db_session,
+ )
+
+ case (CaseStatus.triage, CaseStatus.escalated):
+ # Triage -> Escalated
+ case_escalated_status_flow(
+ case=case,
+ organization_slug=organization_slug,
+ db_session=db_session,
+ )
+
+ case (CaseStatus.closed, CaseStatus.escalated):
+ # Closed -> Escalated
+ case_active_status_flow(case, db_session)
+ case_triage_status_flow(
+ case=case,
+ db_session=db_session,
+ )
+ case_escalated_status_flow(
+ case=case,
+ organization_slug=organization_slug,
+ db_session=db_session,
+ )
+
+ case (_, CaseStatus.escalated):
+ # Any -> Escalated
+ pass
+
+ case (CaseStatus.new, CaseStatus.closed):
+ # New -> Closed
+ case_triage_status_flow(
+ case=case,
+ db_session=db_session,
+ )
+ case_closed_status_flow(
+ case=case,
+ db_session=db_session,
+ )
+
+ case (CaseStatus.triage, CaseStatus.closed):
+ # Triage -> Closed
+ case_closed_status_flow(
+ case=case,
+ db_session=db_session,
+ )
+
+ case (CaseStatus.escalated, CaseStatus.closed):
+ # Escalated -> Closed
+ case_closed_status_flow(
+ case=case,
+ db_session=db_session,
+ )
+
+ case (CaseStatus.escalated, CaseStatus.stable):
+ # Escalated -> Stable
+ case_stable_status_flow(
+ case=case,
+ db_session=db_session,
+ )
+
+ case (CaseStatus.triage, CaseStatus.stable):
+ # Triage -> Stable
+ case_stable_status_flow(
+ case=case,
+ db_session=db_session,
+ )
+
+ case (CaseStatus.new, CaseStatus.stable):
+ # New -> Stable
+ case_triage_status_flow(
+ case=case,
+ db_session=db_session,
+ )
+ case_stable_status_flow(
+ case=case,
+ db_session=db_session,
+ )
+
+ case (CaseStatus.stable, CaseStatus.closed):
+ # Stable -> Closed
+ case_closed_status_flow(
+ case=case,
+ db_session=db_session,
+ )
+
+ case (CaseStatus.closed, CaseStatus.stable):
+ # Closed -> Stable
+ case_active_status_flow(case, db_session)
+ case_stable_status_flow(
+ case=case,
+ db_session=db_session,
+ )
+
+ case (_, _):
+ pass
+
+
+def send_escalation_messages_for_channel_case(
+ case: Case,
+ db_session: Session,
+ incident: Incident,
+):
+ from dispatch.plugins.dispatch_slack.incident import messages
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=case.project.id, plugin_type="conversation"
+ )
+ if plugin is None:
+ log.warning("Case escalation message not sent. No conversation plugin enabled.")
+ return
+
+ plugin.instance.send_message(
+ conversation_id=incident.conversation.channel_id,
+ blocks=messages.create_incident_channel_escalate_message(),
+ )
+
+
+def map_case_roles_to_incident_roles(
+ participant_roles: list[ParticipantRole], incident: Incident, db_session: Session
+ ) -> list[ParticipantRoleType] | None:
+ # Map the case role to an incident role
+ incident_roles = set()
+ for role in participant_roles:
+ if role.role == ParticipantRoleType.assignee:
+ # If the incident commander role is already assigned, assign as participant
+ if participant_service.get_by_incident_id_and_role(
+ db_session=db_session,
+ incident_id=incident.id,
+ role=ParticipantRoleType.incident_commander,
+ ):
+ incident_roles.add(ParticipantRoleType.participant)
+ else:
+ incident_roles.add(ParticipantRoleType.incident_commander)
+ else:
+ incident_roles.add(role.role)
+ return list(incident_roles) or None
+
+
+def copy_case_events_to_incident(
+ case: Case,
+ incident: Incident,
+ db_session: Session,
+):
+ """Copies all timeline events from a case to an incident."""
+ # Get all events from the case
+ case_events = event_service.get_by_case_id(db_session=db_session, case_id=case.id).all()
+
+ if not case_events:
+ log.info(f"No events to copy from case {case.id} to incident {incident.id}")
+ return
+
+ log.info(f"Copying {len(case_events)} events from case {case.id} to incident {incident.id}")
+
+ for case_event in case_events:
+ # Create a new event for the incident with the same data
+ copied_source = f"{case_event.source} (copied from {case.name})"
+ event_service.log_incident_event(
+ db_session=db_session,
+ source=copied_source,
+ description=case_event.description,
+ incident_id=incident.id,
+ individual_id=case_event.individual_id,
+ started_at=case_event.started_at,
+ ended_at=case_event.ended_at,
+ details=case_event.details,
+ type=case_event.type,
+ owner=case_event.owner,
+ pinned=case_event.pinned,
+ )
+
+ log.info(
+ f"Successfully copied {len(case_events)} events from case {case.id} to incident {incident.id}"
+ )
+
+
+def common_escalate_flow(
+ case: Case,
+ incident: Incident,
+ organization_slug: OrganizationSlug,
+ db_session: Session,
+):
+ # This is a channel based Case, so we reuse the case conversation for the incident
+ if case.has_channel:
+ incident.conversation = case.conversation
+ db_session.add(incident)
+ db_session.commit()
+
+ # we run the incident create flow in a background task
+ incident = incident_flows.incident_create_flow(
+ incident_id=incident.id,
+ organization_slug=organization_slug,
+ db_session=db_session,
+ case_id=case.id,
+ )
+
+ # we link the case to the incident
+ case.incidents.append(incident)
+ db_session.add(case)
+ db_session.commit()
+
+ # Copy timeline events from case to incident
+ copy_case_events_to_incident(case=case, incident=incident, db_session=db_session)
+
+ event_service.log_case_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"The case has been linked to incident {incident.name} in the {incident.project.name} project",
+ case_id=case.id,
+ )
+
+ # we add the case participants to the incident
+ for participant in case.participants:
+ # check to see if already a participant in the incident
+ incident_participant = participant_service.get_by_incident_id_and_email(
+ db_session=db_session, incident_id=incident.id, email=participant.individual.email
+ )
+
+ if not incident_participant:
+ log.info(
+ f"Adding participant {participant.individual.email} from Case {case.id} to Incident {incident.id}"
+ )
+ # Get the roles for this participant
+ incident_roles = map_case_roles_to_incident_roles(
+ participant_roles=participant.participant_roles,
+ incident=incident,
+ db_session=db_session,
+ )
+
+ participant_flows.add_participant(
+ user_email=participant.individual.email,
+ subject=incident,
+ db_session=db_session,
+ roles=incident_roles,
+ )
+
+ # We add the participants to the conversation
+ conversation_flows.add_incident_participants_to_conversation(
+ db_session=db_session,
+ incident=incident,
+ participant_emails=[participant.individual.email],
+ )
+
+ # Add the participant to the incident tactical group if active
+ if participant.active_roles:
+ group_flows.update_group(
+ subject=incident,
+ group=incident.tactical_group,
+ group_action=GroupAction.add_member,
+ group_member=participant.individual.email,
+ db_session=db_session,
+ )
+
+ if case.has_channel:
+ # depends on `incident_create_flow()` (we need incident.name), so we invoke it only after that call
+ send_escalation_messages_for_channel_case(
+ case=case,
+ db_session=db_session,
+ incident=incident,
+ )
+
+ if case.storage and incident.tactical_group:
+ storage_members = [incident.tactical_group.email]
+ storage_flows.update_storage(
+ subject=case,
+ storage_action=StorageAction.add_members,
+ storage_members=storage_members,
+ db_session=db_session,
+ )
+
+ event_service.log_case_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=(
+ f"The members of the incident's tactical group {incident.tactical_group.email} "
+ f"have been given permission to access the case's storage folder"
+ ),
+ case_id=case.id,
+ )
+
+
+def case_to_incident_escalate_flow(
+ case: Case,
+ organization_slug: OrganizationSlug,
+ db_session: Session,
+ title: str | None,
+ incident_priority: IncidentPriority | None,
+ incident_type: IncidentType | None,
+ incident_description: str | None,
+):
+ if case.incidents:
+ return
+
+ # use the case reporter, or fall back to the assignee if there is no reporter
+ reporter = (
+ ParticipantUpdate(individual=IndividualContactRead(email=case.reporter.individual.email))
+ if case.reporter
+ else ParticipantUpdate(
+ individual=IndividualContactRead(email=case.assignee.individual.email)
+ )
+ )
+
+ title = title if title else case.title
+
+ description = (
+ f"{incident_description if incident_description else case.description}\n\n"
+ f"This incident was the result of escalating case {case.name} "
+ f"in the {case.project.name} project. Check out the case in the Dispatch Web UI for additional context."
+ )
+
+ incident_priority = incident_priority if incident_priority else case.case_priority
+
+ incident_in = IncidentCreate(
+ title=title,
+ description=description,
+ status=IncidentStatus.active,
+ incident_type=incident_type,
+ incident_priority=incident_priority,
+ project=case.project,
+ reporter=reporter,
+ )
+ incident = incident_service.create(db_session=db_session, incident_in=incident_in)
+
+ common_escalate_flow(
+ case=case,
+ incident=incident,
+ organization_slug=organization_slug,
+ db_session=db_session,
+ )
+
+
+@background_task
+def case_to_incident_endpoint_escalate_flow(
+ case_id: PrimaryKey,
+ incident_id: PrimaryKey,
+ organization_slug: OrganizationSlug,
+ db_session: Session = None,
+):
+ case = get(case_id=case_id, db_session=db_session)
+
+ case_triage_status_flow(case=case, db_session=db_session)
+ case.escalated_at = datetime.utcnow()
+ case.status = CaseStatus.escalated
+ db_session.commit()
+
+ incident = incident_service.get(db_session=db_session, incident_id=incident_id)
+
+ common_escalate_flow(
+ case=case,
+ incident=incident,
+ organization_slug=organization_slug,
+ db_session=db_session,
+ )
+
+
+def case_assign_role_flow(
+ case_id: int,
+ participant_email: str,
+ participant_role: str,
+ db_session: Session,
+):
+ """Runs the case participant role assignment flow."""
+ # we get the case
+ case = get(case_id=case_id, db_session=db_session)
+
+ # we add the participant to the case if they're not a member already
+ case_add_or_reactivate_participant_flow(participant_email, case.id, db_session=db_session)
+
+ # we run the assign role flow
+ result = role_flow.assign_role_flow(case, participant_email, participant_role, db_session)
+
+ if result in ["assignee_has_role", "role_not_assigned"]:
+ return
+
+ # we stop here if this is not a dedicated channel case
+ if not case.dedicated_channel:
+ return
+
+ if case.status != CaseStatus.closed and participant_role == ParticipantRoleType.assignee:
+ # update the conversation topic
+ conversation_flows.set_conversation_topic(case, db_session)
+
+ # Update the participants canvas since a role was assigned
+ try:
+ canvas_flows.update_participants_canvas(case=case, db_session=db_session)
+ log.info(
+ f"Updated participants canvas for case {case.id} after assigning {participant_role} to {participant_email}"
+ )
+ except Exception as e:
+ log.exception(f"Failed to update participants canvas for case {case.id}: {e}")
+
+
+def case_create_conversation_flow(
+ db_session: Session,
+ case: Case,
+ participant_emails: list[str],
+ conversation_target: str | None = None,
+) -> None:
+ """Runs the case conversation creation flow."""
+
+ conversation_flows.create_case_conversation(case, conversation_target, db_session)
+
+ event_service.log_case_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description="Conversation added to case",
+ case_id=case.id,
+ )
+
+ for email in participant_emails:
+ # we don't rely on this flow to add folks to the conversation because in this case
+ # we want to do it in bulk
+ try:
+ case_add_or_reactivate_participant_flow(
+ db_session=db_session,
+ user_email=email,
+ case_id=case.id,
+ add_to_conversation=False,
+ )
+ except Exception as e:
+ log.warning(
+ f"Failed to add participant {email} to case {case.id}: {e}. "
+ f"Continuing with other participants..."
+ )
+ # Log the event but don't fail the case creation
+ event_service.log_case_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"Failed to add participant {email}: {e}",
+ case_id=case.id,
+ )
+
+ # we add the participant to the conversation
+ try:
+ conversation_flows.add_case_participants(
+ case=case,
+ participant_emails=participant_emails,
+ db_session=db_session,
+ )
+ except Exception as e:
+ log.warning(
+ f"Failed to add participants to conversation for case {case.id}: {e}. "
+ f"Continuing with case creation..."
+ )
+ # Log the event but don't fail the case creation
+ event_service.log_case_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"Failed to add participants to conversation: {e}",
+ case_id=case.id,
+ )
+
+
+def case_create_resources_flow(
+ db_session: Session,
+ case_id: int,
+ individual_participants: list[tuple],
+ team_participants: list,
+ conversation_target: str | None = None,
+ create_all_resources: bool = True,
+) -> None:
+ """Runs the case resource creation flow."""
+ case = get(db_session=db_session, case_id=case_id)
+
+ if case.assignee:
+ individual_participants.append((case.assignee.individual, None))
+
+ if case.reporter:
+ individual_participants.append((case.reporter.individual, None))
+
+ if create_all_resources:
+ # we create the tactical group
+ direct_participant_emails = [i.email for i, _ in individual_participants]
+
+ indirect_participant_emails = [t.email for t in team_participants]
+
+ if not case.groups:
+ group_flows.create_group(
+ subject=case,
+ group_type=GroupType.tactical,
+ group_participants=list(
+ set(direct_participant_emails + indirect_participant_emails)
+ ),
+ db_session=db_session,
+ )
+
+ # we create the storage folder
+ storage_members = []
+ if case.tactical_group:
+ storage_members = [case.tactical_group.email]
+ # add members directly if no group exists
+ else:
+ storage_members = direct_participant_emails
+
+ if not case.storage:
+ storage_flows.create_storage(
+ subject=case, storage_members=storage_members, db_session=db_session
+ )
+
+ # we create the investigation document
+ if not case.case_document:
+ document_flows.create_document(
+ subject=case,
+ document_type=DocumentResourceTypes.case,
+ document_template=case.case_type.case_template_document,
+ db_session=db_session,
+ )
+
+ # we update the case document
+ document_flows.update_document(
+ document=case.case_document, project_id=case.project.id, db_session=db_session
+ )
+
+ try:
+ # wait until all resources are created before adding suggested participants
+ individual_participants = [x.email for x, _ in individual_participants]
+
+ # we create the conversation and add participants to the thread
+ case_create_conversation_flow(
+ db_session=db_session,
+ case=case,
+ participant_emails=individual_participants,
+ conversation_target=conversation_target,
+ )
+
+ # check to see if there is an override welcome message template
+ welcome_template = email_template_service.get_by_type(
+ db_session=db_session,
+ project_id=case.project_id,
+ email_template_type=EmailTemplateTypes.case_welcome,
+ )
+
+ for user_email in set(individual_participants):
+ send_participant_announcement_message(
+ db_session=db_session,
+ participant_email=user_email,
+ subject=case,
+ )
+
+ send_case_welcome_participant_message(
+ participant_email=user_email,
+ case=case,
+ db_session=db_session,
+ welcome_template=welcome_template,
+ )
+
+ event_service.log_case_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description="Case participants added to conversation.",
+ case_id=case.id,
+ )
+
+ except Exception as e:
+ event_service.log_case_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"Creation of case conversation failed. Reason: {e}",
+ case_id=case.id,
+ )
+ log.exception(e)
+
+ if case.has_channel:
+ try:
+ canvas_flows.create_participants_canvas(case=case, db_session=db_session)
+ except Exception as e:
+ log.exception(f"Failed to create participants canvas for case {case.id}: {e}")
+
+ bookmarks = [
+ # resource, title
+ (case.case_document, None),
+ (case.ticket, "Case Ticket"),
+ (case.storage, "Case Storage"),
+ ]
+ for resource, title in bookmarks:
+ conversation_flows.add_conversation_bookmark(
+ subject=case,
+ resource=resource,
+ db_session=db_session,
+ title=title,
+ )
+ conversation_flows.set_conversation_topic(case, db_session)
+
+ # we update the ticket
+ ticket_flows.update_case_ticket(case=case, db_session=db_session)
+
+
+@background_task
+def case_engage_oncall_flow(
+ user_email: str,
+ case_id: int,
+ oncall_service_external_id: str,
+ page=None,
+ organization_slug: str = None,
+ db_session=None,
+):
+ """Runs the case engage oncall flow."""
+ # we load the case instance
+ case = case_service.get(db_session=db_session, case_id=case_id)
+
+ # we resolve the oncall service
+ oncall_service = service_service.get_by_external_id_and_project_id(
+ db_session=db_session,
+ external_id=oncall_service_external_id,
+ project_id=case.project.id,
+ )
+
+ # we get the active oncall plugin
+ oncall_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=case.project.id, plugin_type="oncall"
+ )
+
+ if oncall_plugin:
+ if oncall_plugin.plugin.slug != oncall_service.type:
+ log.warning(
+ f"Unable to engage the oncall. The enabled oncall plugin ({oncall_plugin.plugin.slug}) does not match the service type ({oncall_service.type})."  # noqa
+ )
+ return None, None
+ else:
+ log.warning("Unable to engage the oncall. No oncall plugins enabled.")
+ return None, None
+
+ oncall_email = oncall_plugin.instance.get(service_id=oncall_service_external_id)
+
+ # we attempt to add the oncall to the case
+ oncall_participant_added = case_add_or_reactivate_participant_flow(
+ user_email=oncall_email,
+ case_id=case.id,
+ service_id=oncall_service.id,
+ db_session=db_session,
+ )
+
+ if not oncall_participant_added:
+ # we already have the oncall for the service in the case
+ return None, oncall_service
+
+ individual = individual_service.get_by_email_and_project(
+ db_session=db_session, email=user_email, project_id=case.project.id
+ )
+
+ event_service.log_case_event(
+ db_session=db_session,
+ source=oncall_plugin.plugin.title,
+ description=f"{individual.name} engages oncall service {oncall_service.name}",
+ case_id=case.id,
+ )
+
+ if page == "Yes":
+ # we page the oncall
+ oncall_plugin.instance.page(
+ service_id=oncall_service_external_id,
+ incident_name=case.name,
+ incident_title=case.title,
+ incident_description=case.description,
+ event_type="case",
+ )
+
+ event_service.log_case_event(
+ db_session=db_session,
+ source=oncall_plugin.plugin.title,
+ description=f"{oncall_service.name} on-call paged",
+ case_id=case.id,
+ )
+
+ return oncall_participant_added.individual, oncall_service
diff --git a/src/dispatch/case/messaging.py b/src/dispatch/case/messaging.py
new file mode 100644
index 000000000000..28ccb3ca091b
--- /dev/null
+++ b/src/dispatch/case/messaging.py
@@ -0,0 +1,556 @@
+"""
+.. module: dispatch.case.messaging
+ :platform: Unix
+ :copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
+ :license: Apache, see LICENSE for more details.
+"""
+
+import logging
+
+
+from sqlalchemy.orm import Session
+
+from dispatch.database.core import resolve_attr
+from dispatch.document import service as document_service
+from dispatch.case.models import Case, CaseRead
+from dispatch.conversation.enums import ConversationCommands
+from dispatch.entity_type.models import EntityType
+from dispatch.messaging.strings import (
+ CASE_CLOSE_REMINDER,
+ CASE_TRIAGE_REMINDER,
+ CASE_NOTIFICATION,
+ CASE_NOTIFICATION_COMMON,
+ CASE_NAME_WITH_ENGAGEMENT,
+ CASE_NAME_WITH_ENGAGEMENT_NO_SELF_JOIN,
+ CASE_NAME,
+ CASE_ASSIGNEE,
+ CASE_STATUS_CHANGE,
+ CASE_TYPE_CHANGE,
+ CASE_SEVERITY_CHANGE,
+ CASE_PRIORITY_CHANGE,
+ CASE_VISIBILITY_CHANGE,
+ CASE_CLOSED_RATING_FEEDBACK_NOTIFICATION,
+ MessageType,
+ generate_welcome_message,
+)
+from dispatch.config import DISPATCH_UI_URL
+from dispatch.email_templates.models import EmailTemplates
+from dispatch.plugin import service as plugin_service
+from dispatch.plugins.dispatch_slack.models import SubjectMetadata
+from dispatch.plugins.dispatch_slack.case.enums import CaseNotificationActions
+from dispatch.event import service as event_service
+from dispatch.notification import service as notification_service
+
+from .enums import CaseStatus
+
+log = logging.getLogger(__name__)
+
+
+def send_case_close_reminder(case: Case, db_session: Session) -> None:
+ """
+ Sends a direct message to the assignee reminding them to close the case if possible.
+ """
+ message_text = "Case Close Reminder"
+ message_template = CASE_CLOSE_REMINDER
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=case.project.id, plugin_type="conversation"
+ )
+ if plugin is None:
+ log.warning("Case close reminder message not sent. No conversation plugin enabled.")
+ return
+ if case.assignee is None:
+ log.warning(f"Case close reminder message not sent. No assignee for {case.name}.")
+ return
+
+ items = [
+ {
+ "name": case.name,
+ "dispatch_ui_case_url": f"{DISPATCH_UI_URL}/{case.project.organization.name}/cases/{case.name}", # noqa
+ "title": case.title,
+ "status": case.status,
+ }
+ ]
+
+ plugin.instance.send_direct(
+ case.assignee.individual.email,
+ message_text,
+ message_template,
+ MessageType.case_status_reminder,
+ items=items,
+ )
+
+ log.debug(f"Case close reminder sent to {case.assignee.individual.email}.")
+
+
+def send_case_triage_reminder(case: Case, db_session: Session) -> None:
+ """
+ Sends a direct message to the assignee reminding them to triage the case if possible.
+ """
+ message_text = "Case Triage Reminder"
+ message_template = CASE_TRIAGE_REMINDER
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=case.project.id, plugin_type="conversation"
+ )
+ if plugin is None:
+ log.warning("Case triage reminder message not sent. No conversation plugin enabled.")
+ return
+
+ if case.assignee is None:
+ log.warning(f"Case triage reminder message not sent. No assignee for {case.name}.")
+ return
+
+ items = [
+ {
+ "name": case.name,
+ "title": case.title,
+ "status": case.status,
+ "dispatch_ui_case_url": f"{DISPATCH_UI_URL}/{case.project.organization.name}/cases/{case.name}", # noqa
+ }
+ ]
+
+ plugin.instance.send_direct(
+ case.assignee.individual.email,
+ message_text,
+ message_template,
+ MessageType.case_status_reminder,
+ items=items,
+ )
+
+ log.debug(f"Case triage reminder sent to {case.assignee.individual.email}.")
+
+
+def send_case_created_notifications(case: Case, db_session: Session):
+ """Sends case created notifications."""
+ notification_template = CASE_NOTIFICATION.copy()
+
+ if case.status != CaseStatus.closed:
+ if case.project.allow_self_join:
+ notification_template.insert(0, CASE_NAME_WITH_ENGAGEMENT)
+ else:
+ notification_template.insert(0, CASE_NAME_WITH_ENGAGEMENT_NO_SELF_JOIN)
+ else:
+ notification_template.insert(0, CASE_NAME)
+
+ case_description = (
+ case.description if len(case.description) <= 500 else f"{case.description[:500]}..."
+ )
+
+ notification_kwargs = {
+ "name": case.name,
+ "title": case.title,
+ "description": case_description,
+ "visibility": case.visibility,
+ "status": case.status,
+ "type": case.case_type.name,
+ "type_description": case.case_type.description,
+ "severity": case.case_severity.name,
+ "severity_description": case.case_severity.description,
+ "priority": case.case_priority.name,
+ "priority_description": case.case_priority.description,
+ "reporter_fullname": case.reporter.individual.name,
+ "reporter_team": case.reporter.team,
+ "reporter_weblink": case.reporter.individual.weblink,
+ "assignee_fullname": case.assignee.individual.name,
+ "assignee_team": case.assignee.team,
+ "assignee_weblink": case.assignee.individual.weblink,
+ "document_weblink": resolve_attr(case, "case_document.weblink"),
+ "storage_weblink": resolve_attr(case, "storage.weblink"),
+ "ticket_weblink": resolve_attr(case, "ticket.weblink"),
+ "contact_fullname": case.assignee.individual.name,
+ "contact_weblink": case.assignee.individual.weblink,
+ "case_id": case.id,
+ "organization_slug": case.project.organization.slug,
+ }
+
+ notification_params = {
+ "text": "Case Notification",
+ "type": MessageType.case_notification,
+ "template": notification_template,
+ "kwargs": notification_kwargs,
+ }
+
+ notification_service.filter_and_send(
+ db_session=db_session,
+ project_id=case.project.id,
+ class_instance=case,
+ notification_params=notification_params,
+ )
+
+ event_service.log_case_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description="Case notifications sent",
+ case_id=case.id,
+ )
+
+ log.debug("Case created notifications sent.")
+
+
+def send_case_update_notifications(case: Case, previous_case: CaseRead, db_session: Session):
+ """Sends notifications about case changes."""
+ notification_text = "Case Notification"
+ notification_type = MessageType.case_notification
+ notification_template = CASE_NOTIFICATION_COMMON.copy()
+
+ change = False
+ if previous_case.status != case.status:
+ change = True
+ notification_template.append(CASE_STATUS_CHANGE)
+ if previous_case.case_type.name != case.case_type.name:
+ notification_template.append(CASE_TYPE_CHANGE)
+
+ if previous_case.case_severity.name != case.case_severity.name:
+ notification_template.append(CASE_SEVERITY_CHANGE)
+
+ if previous_case.case_priority.name != case.case_priority.name:
+ notification_template.append(CASE_PRIORITY_CHANGE)
+ else:
+ if case.status != CaseStatus.closed:
+ if previous_case.case_type.name != case.case_type.name:
+ change = True
+ notification_template.append(CASE_TYPE_CHANGE)
+
+ if previous_case.case_severity.name != case.case_severity.name:
+ change = True
+ notification_template.append(CASE_SEVERITY_CHANGE)
+
+ if previous_case.case_priority.name != case.case_priority.name:
+ change = True
+ notification_template.append(CASE_PRIORITY_CHANGE)
+
+ if previous_case.visibility != case.visibility:
+ change = True
+ notification_template.append(CASE_VISIBILITY_CHANGE)
+
+ if not change:
+ # we don't need to send notifications
+ log.debug("Case updated notifications not sent. No changes were made.")
+ return
+
+ notification_template.append(CASE_ASSIGNEE)
+
+ # we send an update to the case conversation if the case is not closed
+ if case.status != CaseStatus.closed:
+ case_conversation_notification_template = notification_template.copy()
+ case_conversation_notification_template.insert(0, CASE_NAME)
+
+ convo_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=case.project.id, plugin_type="conversation"
+ )
+ if convo_plugin:
+ convo_plugin.instance.send(
+ case.conversation.channel_id,
+ notification_text,
+ case_conversation_notification_template,
+ notification_type,
+ assignee_fullname=case.assignee.individual.name,
+ assignee_team=case.assignee.team,
+ assignee_weblink=case.assignee.individual.weblink,
+ case_priority_new=case.case_priority,
+ case_priority_old=previous_case.case_priority,
+ case_severity_new=case.case_severity,
+ case_severity_old=previous_case.case_severity,
+ case_status_new=case.status,
+ case_status_old=previous_case.status,
+ case_type_new=case.case_type.name,
+ case_type_old=previous_case.case_type.name,
+ case_visibility_new=case.visibility,
+ case_visibility_old=previous_case.visibility,
+ name=case.name,
+ ticket_weblink=case.ticket.weblink,
+ title=case.title,
+ escalated_to_incident=case.incidents[0] if case.incidents else None,
+ )
+ else:
+ log.debug(
+ "Case updated notification not sent to case conversation. No conversation plugin enabled." # noqa
+ )
+
+ # we send a notification to the notification conversations and emails
+ fyi_notification_template = notification_template.copy()
+ if case.status != CaseStatus.closed:
+ if case.project.allow_self_join:
+ fyi_notification_template.insert(0, CASE_NAME_WITH_ENGAGEMENT)
+ else:
+ fyi_notification_template.insert(0, CASE_NAME_WITH_ENGAGEMENT_NO_SELF_JOIN)
+ else:
+ fyi_notification_template.insert(0, CASE_NAME)
+
+ notification_kwargs = {
+ "assignee_fullname": case.assignee.individual.name,
+ "assignee_team": case.assignee.team,
+ "assignee_weblink": case.assignee.individual.weblink,
+ "contact_fullname": case.assignee.individual.name,
+ "contact_weblink": case.assignee.individual.weblink,
+ "case_id": case.id,
+ "case_priority_new": case.case_priority,
+ "case_priority_old": previous_case.case_priority,
+ "case_severity_new": case.case_severity,
+ "case_severity_old": previous_case.case_severity,
+ "case_status_new": case.status,
+ "case_status_old": previous_case.status,
+ "case_type_new": case.case_type.name,
+ "case_type_old": previous_case.case_type.name,
+ "case_visibility_new": case.visibility,
+ "case_visibility_old": previous_case.visibility,
+ "name": case.name,
+ "organization_slug": case.project.organization.slug,
+ "ticket_weblink": resolve_attr(case, "ticket.weblink"),
+ "title": case.title,
+ "escalated_to_incident": case.incidents[0] if case.incidents else None,
+ }
+
+ notification_params = {
+ "text": notification_text,
+ "type": notification_type,
+ "template": fyi_notification_template,
+ "kwargs": notification_kwargs,
+ }
+
+ notification_service.filter_and_send(
+ db_session=db_session,
+ project_id=case.project.id,
+ class_instance=case,
+ notification_params=notification_params,
+ )
+
+ log.debug("Case updated notifications sent.")
+
+
+def send_case_welcome_participant_message(
+ *,
+ participant_email: str,
+ case: Case,
+ db_session: Session,
+ welcome_template: EmailTemplates | None = None,
+):
+ if not case.dedicated_channel:
+ return
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=case.project.id, plugin_type="conversation"
+ )
+
+ if not plugin:
+ log.warning("Case participant welcome message not sent. No conversation plugin enabled.")
+ return
+
+ # we send the ephemeral message
+ message_kwargs = {
+ "name": case.name,
+ "title": case.title,
+ "description": case.description,
+ "visibility": case.visibility,
+ "status": case.status,
+ "type": case.case_type.name,
+ "type_description": case.case_type.description,
+ "severity": case.case_severity.name,
+ "severity_description": case.case_severity.description,
+ "priority": case.case_priority.name,
+ "priority_description": case.case_priority.description,
+ "assignee_fullname": case.assignee.individual.name,
+ "assignee_team": case.assignee.team,
+ "assignee_weblink": case.assignee.individual.weblink,
+ "reporter_fullname": case.reporter.individual.name if case.reporter else None,
+ "reporter_team": case.reporter.team if case.reporter else None,
+ "reporter_weblink": case.reporter.individual.weblink if case.reporter else None,
+ "document_weblink": resolve_attr(case, "case_document.weblink"),
+ "storage_weblink": resolve_attr(case, "storage.weblink"),
+ "ticket_weblink": resolve_attr(case, "ticket.weblink"),
+ "conference_weblink": resolve_attr(case, "conference.weblink"),
+ "conference_challenge": resolve_attr(case, "conference.conference_challenge"),
+ }
+ faq_doc = document_service.get_incident_faq_document(
+ db_session=db_session, project_id=case.project_id
+ )
+ if faq_doc:
+ message_kwargs.update({"faq_weblink": faq_doc.weblink})
+
+ conversation_reference = document_service.get_conversation_reference_document(
+ db_session=db_session, project_id=case.project_id
+ )
+ if conversation_reference:
+ message_kwargs.update(
+ {"conversation_commands_reference_document_weblink": conversation_reference.weblink}
+ )
+
+ plugin.instance.send_ephemeral(
+ conversation_id=case.conversation.channel_id,
+ user=participant_email,
+ text=f"Welcome to {case.name}",
+ message_template=generate_welcome_message(welcome_template, is_incident=False),
+ notification_type=MessageType.case_participant_welcome,
+ **message_kwargs,
+ )
+
+ log.debug(f"Welcome ephemeral message sent to {participant_email}.")
+
+
+def send_event_update_prompt_reminder(case: Case, db_session: Session) -> None:
+ """
+ Sends an ephemeral message to the assignee reminding them to update the visibility, title, priority
+ """
+ message_text = "Event Triage Reminder"
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=case.project.id, plugin_type="conversation"
+ )
+ if plugin is None:
+ log.warning("Event update prompt message not sent. No conversation plugin enabled.")
+ return
+ if case.assignee is None:
+ log.warning(f"Event update prompt message not sent. No assignee for {case.name}.")
+ return
+
+ button_metadata = SubjectMetadata(
+ type="case",
+ organization_slug=case.project.organization.slug,
+ id=case.id,
+ ).json()
+
+ plugin.instance.send_ephemeral(
+ conversation_id=case.conversation.channel_id,
+ user=case.assignee.individual.email,
+ text=message_text,
+ blocks=[
+ {
+ "type": "section",
+ "text": {
+ "type": "plain_text",
+ "text": f"Update the title, priority, case type and visibility during triage of this security event.", # noqa
+ },
+ },
+ {
+ "type": "actions",
+ "elements": [
+ {
+ "type": "button",
+ "text": {"type": "plain_text", "text": "Update Case"},
+ "action_id": CaseNotificationActions.update,
+ "style": "primary",
+ "value": button_metadata
+ }
+ ],
+ },
+ ],
+ )
+
+ log.debug(f"Security Event update reminder sent to {case.assignee.individual.email}.")
+
+
+def send_event_paging_message(case: Case, db_session: Session, oncall_name: str) -> None:
+ """
+ Sends a message to the case conversation channel to notify the reporter that they can engage
+ with oncall if they need immediate assistance.
+ """
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=case.project.id, plugin_type="conversation"
+ )
+ if plugin is None:
+ log.warning("Event paging message not sent, no conversation plugin enabled.")
+ return
+
+ notification_text = "Case can be engaged with oncall."
+ notification_type = MessageType.incident_notification
+
+ engage_oncall_command = plugin.instance.get_command_name(ConversationCommands.engage_oncall)
+
+ blocks = [
+ {"type": "section", "text": {"type": "mrkdwn", "text": "*Response Expectation*"}},
+ {
+ "type": "section",
+ "text": {
+ "type": "mrkdwn",
+ "text": f"""Event reported. Team will respond during business hours. For urgent assistance, type `{engage_oncall_command}` in this channel and select "Page" to contact `{oncall_name}`.""",
+ },
+ },
+ ]
+
+ try:
+ plugin.instance.send(
+ case.conversation.channel_id,
+ notification_text,
+ [],
+ notification_type,
+ blocks=blocks,
+ )
+ except Exception as e:
+ log.error(f"Error sending event paging message: {e}")
+
+
+def send_case_rating_feedback_message(case: Case, db_session: Session):
+ """
+ Sends a direct message to all case participants asking
+ them to rate and provide feedback about the case.
+ """
+ notification_text = "Case Rating and Feedback"
+ notification_template = CASE_CLOSED_RATING_FEEDBACK_NOTIFICATION
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=case.project.id, plugin_type="conversation"
+ )
+ if not plugin:
+ log.warning("Case rating and feedback message not sent, no conversation plugin enabled.")
+ return
+
+ items = [
+ {
+ "case_id": case.id,
+ "organization_slug": case.project.organization.slug,
+ "name": case.name,
+ "title": case.title,
+ "ticket_weblink": case.ticket.weblink,
+ }
+ ]
+
+ for participant in case.participants:
+ try:
+ plugin.instance.send_direct(
+ participant.individual.email,
+ notification_text,
+ notification_template,
+ MessageType.case_rating_feedback,
+ items=items,
+ )
+ except Exception as e:
+ # if one fails we don't want all to fail
+ log.exception(e)
+
+ log.debug("Case rating and feedback message sent to all participants.")
+
+
+def send_entity_update_notification(*, db_session: Session, entity_type: EntityType, case: Case):
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=case.project.id, plugin_type="conversation"
+ )
+ if not plugin:
+ log.warning("Entity update notification not sent, no conversation plugin enabled.")
+ return
+
+ if case.dedicated_channel or not case.has_thread:
+ log.warning("Entity update notification not sent, can only send to case threads.")
+ return
+
+ notification_text = "Entity Update Notification"
+
+ blocks = [
+ {
+ "type": "section",
+ "text": {
+ "type": "mrkdwn",
+ "text": f"{entity_type.name} created. You can now create a snooze for entities of this type.",
+ },
+ },
+ ]
+
+ plugin.instance.send(
+ conversation_id=case.conversation.channel_id,
+ text=notification_text,
+ message_template=[],
+ notification_type=MessageType.entity_update,
+ blocks=blocks,
+ ts=case.conversation.thread_id,
+ )
+
+ log.debug("Entity update notification sent to all participants.")
diff --git a/src/dispatch/case/models.py b/src/dispatch/case/models.py
new file mode 100644
index 000000000000..fccd9e71d2f4
--- /dev/null
+++ b/src/dispatch/case/models.py
@@ -0,0 +1,531 @@
+"""Models and schemas for the Dispatch case management system."""
+
+from collections import Counter, defaultdict
+from datetime import datetime
+from typing import Any
+from pydantic import field_validator, Field
+from sqlalchemy import (
+ Boolean,
+ Column,
+ DateTime,
+ ForeignKey,
+ Integer,
+ PrimaryKeyConstraint,
+ String,
+ Table,
+ UniqueConstraint,
+)
+from sqlalchemy.dialects.postgresql import JSONB
+from sqlalchemy.ext.hybrid import hybrid_property
+from sqlalchemy.orm import relationship
+from sqlalchemy_utils import TSVectorType, observes
+
+from dispatch.case.enums import CostModelType
+from dispatch.case.priority.models import CasePriorityBase, CasePriorityCreate, CasePriorityRead
+from dispatch.case.severity.models import CaseSeverityBase, CaseSeverityCreate, CaseSeverityRead
+from dispatch.case.type.models import CaseTypeBase, CaseTypeCreate, CaseTypeRead
+from dispatch.case_cost.models import (
+ CaseCostRead,
+ CaseCostReadMinimal,
+ CaseCostUpdate,
+)
+from dispatch.conversation.models import ConversationRead
+from dispatch.database.core import Base
+from dispatch.document.models import Document, DocumentRead
+from dispatch.entity.models import EntityRead
+from dispatch.enums import Visibility
+from dispatch.event.models import EventRead
+from dispatch.group.models import Group, GroupRead
+from dispatch.individual.models import IndividualContactRead
+from dispatch.messaging.strings import CASE_RESOLUTION_DEFAULT
+from dispatch.models import (
+ DispatchBase,
+ NameStr,
+ Pagination,
+ PrimaryKey,
+ ProjectMixin,
+ TimeStampMixin,
+)
+from dispatch.participant.models import (
+ Participant,
+ ParticipantRead,
+ ParticipantReadMinimal,
+ ParticipantUpdate,
+)
+from dispatch.storage.models import StorageRead
+from dispatch.tag.models import TagRead
+from dispatch.ticket.models import TicketRead
+from dispatch.workflow.models import WorkflowInstanceRead
+
+from .enums import CaseResolutionReason, CaseStatus
+
+# Assoc table for case and tags
+assoc_case_tags = Table(
+ "assoc_case_tags",
+ Base.metadata,
+ Column("case_id", Integer, ForeignKey("case.id", ondelete="CASCADE")),
+ Column("tag_id", Integer, ForeignKey("tag.id", ondelete="CASCADE")),
+ PrimaryKeyConstraint("case_id", "tag_id"),
+)
+
+# Assoc table for cases and incidents
+assoc_cases_incidents = Table(
+ "assoc_case_incidents",
+ Base.metadata,
+ Column("case_id", Integer, ForeignKey("case.id", ondelete="CASCADE")),
+ Column("incident_id", Integer, ForeignKey("incident.id", ondelete="CASCADE")),
+ PrimaryKeyConstraint("case_id", "incident_id"),
+)
+
+
+class Case(Base, TimeStampMixin, ProjectMixin):
+ """SQLAlchemy model for a Case, representing an incident or issue in the system."""
+
+ __table_args__ = (UniqueConstraint("name", "project_id"),)
+
+ id = Column(Integer, primary_key=True)
+ name = Column(String)
+ title = Column(String, nullable=False)
+ description = Column(String, nullable=False)
+ resolution = Column(String, default=CASE_RESOLUTION_DEFAULT, nullable=False)
+ resolution_reason = Column(String)
+ status = Column(String, default=CaseStatus.new, nullable=False)
+ visibility = Column(String, default=Visibility.open, nullable=False)
+ participants_team = Column(String)
+ participants_location = Column(String)
+ reported_at = Column(DateTime, default=datetime.utcnow)
+ triage_at = Column(DateTime)
+ escalated_at = Column(DateTime)
+ closed_at = Column(DateTime)
+ dedicated_channel = Column(Boolean, default=False)
+ genai_analysis = Column(JSONB, default={}, nullable=False, server_default="{}")
+ event = Column(Boolean, default=False)
+ stable_at = Column(DateTime)
+
+ search_vector = Column(
+ TSVectorType(
+ "name", "title", "description", weights={"name": "A", "title": "B", "description": "C"}
+ )
+ )
+
+ # relationships
+ assignee_id = Column(Integer, ForeignKey("participant.id", ondelete="CASCADE"))
+ assignee = relationship(Participant, foreign_keys=[assignee_id], post_update=True)
+
+ reporter_id = Column(Integer, ForeignKey("participant.id", ondelete="CASCADE"))
+ reporter = relationship(Participant, foreign_keys=[reporter_id], post_update=True)
+
+ case_type = relationship("CaseType", backref="case")
+ case_type_id = Column(Integer, ForeignKey("case_type.id"))
+
+ case_severity = relationship("CaseSeverity", backref="case")
+ case_severity_id = Column(Integer, ForeignKey("case_severity.id"))
+
+ case_priority = relationship("CasePriority", backref="case")
+ case_priority_id = Column(Integer, ForeignKey("case_priority.id"))
+
+ case_document_id = Column(Integer, ForeignKey("document.id"))
+ case_document = relationship("Document", foreign_keys=[case_document_id])
+ documents = relationship(
+ "Document", backref="case", cascade="all, delete-orphan", foreign_keys=[Document.case_id]
+ )
+
+ duplicate_id = Column(Integer, ForeignKey("case.id"))
+ duplicates = relationship("Case", remote_side=[id], uselist=True, foreign_keys=[duplicate_id])
+
+ events = relationship("Event", backref="case", cascade="all, delete-orphan")
+
+ feedback = relationship("Feedback", backref="case", cascade="all, delete-orphan")
+
+ groups = relationship(
+ "Group", backref="case", cascade="all, delete-orphan", foreign_keys=[Group.case_id]
+ )
+
+ participants = relationship(
+ Participant,
+ backref="case",
+ cascade="all, delete-orphan",
+ foreign_keys=[Participant.case_id],
+ )
+
+ incidents = relationship("Incident", secondary=assoc_cases_incidents, backref="cases")
+
+ tactical_group_id = Column(Integer, ForeignKey("group.id"))
+ tactical_group = relationship("Group", foreign_keys=[tactical_group_id])
+
+ workflow_instances = relationship(
+ "WorkflowInstance", backref="case", cascade="all, delete-orphan"
+ )
+ canvases = relationship("Canvas", back_populates="case", cascade="all, delete-orphan")
+
+ conversation = relationship(
+ "Conversation", uselist=False, backref="case", cascade="all, delete-orphan"
+ )
+
+ related_id = Column(Integer, ForeignKey("case.id"))
+ related = relationship("Case", remote_side=[id], uselist=True, foreign_keys=[related_id])
+
+ signal_thread_ts = Column(String, nullable=True)
+
+ storage = relationship("Storage", uselist=False, backref="case", cascade="all, delete-orphan")
+
+ tags = relationship(
+ "Tag",
+ secondary=assoc_case_tags,
+ backref="cases",
+ )
+
+ ticket = relationship("Ticket", uselist=False, backref="case", cascade="all, delete-orphan")
+
+ # Foreign key to individual who resolved
+ resolved_by_id = Column(Integer, ForeignKey("individual_contact.id"))
+ resolved_by = relationship("IndividualContact", foreign_keys=[resolved_by_id])
+
+ # resources
+ case_costs = relationship(
+ "CaseCost",
+ backref="case",
+ cascade="all, delete-orphan",
+ lazy="subquery",
+ order_by="CaseCost.created_at",
+ )
+
+ case_notes = relationship(
+ "CaseNotes",
+ back_populates="case",
+ cascade="all, delete-orphan",
+ uselist=False,
+ )
+
+ @observes("participants")
+ def participant_observer(self, participants):
+ """Update team and location fields based on the most common values among participants."""
+ self.participants_team = Counter(p.team for p in participants).most_common(1)[0][0]
+ self.participants_location = Counter(p.location for p in participants).most_common(1)[0][0]
+
+ @property
+ def has_channel(self) -> bool:
+ """Return True if the case has a conversation channel but not a thread."""
+ if not self.conversation:
+ return False
+ return not self.conversation.thread_id
+
+ @property
+ def has_thread(self) -> bool:
+ """Return True if the case has a conversation thread."""
+ if not self.conversation:
+ return False
+ return bool(self.conversation.thread_id)
+
+ @property
+ def participant_emails(self) -> list:
+ """Return a list of emails for all participants in the case."""
+ return [participant.individual.email for participant in self.participants]
+
+ @hybrid_property
+ def total_cost_classic(self):
+ """Calculate the total cost for classic cost model types."""
+ total_cost = 0
+ if self.case_costs:
+ for cost in self.case_costs:
+ if cost.case_cost_type.model_type == CostModelType.new:
+ continue
+ total_cost += cost.amount
+ return total_cost
+
+ @hybrid_property
+ def total_cost_new(self):
+ """Calculate the total cost for new cost model types."""
+ total_cost = 0
+ if self.case_costs:
+ for cost in self.case_costs:
+ if cost.case_cost_type.model_type == CostModelType.classic:
+ continue
+ total_cost += cost.amount
+ return total_cost
+
+
+class CaseNotes(Base, TimeStampMixin):
+ """SQLAlchemy model for case investigation notes."""
+
+ id = Column(Integer, primary_key=True)
+ content = Column(String)
+
+ # Foreign key to case
+ case_id = Column(Integer, ForeignKey("case.id", ondelete="CASCADE"))
+ case = relationship("Case", back_populates="case_notes")
+
+ # Foreign key to individual who last updated
+ last_updated_by_id = Column(Integer, ForeignKey("individual_contact.id"))
+ last_updated_by = relationship("IndividualContact", foreign_keys=[last_updated_by_id])
+
+
+class SignalRead(DispatchBase):
+ """Pydantic model for reading signal data."""
+
+ id: PrimaryKey
+ name: str
+ owner: str
+ description: str | None = None
+ variant: str | None = None
+ external_id: str
+ external_url: str | None = None
+ workflow_instances: list[WorkflowInstanceRead] | None = []
+ tags: list[TagRead] | None = []
+
+
+class SignalInstanceRead(DispatchBase):
+ """Pydantic model for reading signal instance data."""
+
+ created_at: datetime
+ entities: list[EntityRead] | None = []
+ raw: Any
+ signal: SignalRead
+ tags: list[TagRead] | None = []
+
+
+class ProjectRead(DispatchBase):
+ """Pydantic model for reading project data."""
+
+ id: PrimaryKey | None = None
+ name: NameStr
+ display_name: str | None = None
+ color: str | None = None
+ allow_self_join: bool | None = Field(default=True)
+
+
+# CaseNotes Pydantic models
+class CaseNotesBase(DispatchBase):
+ """Base Pydantic model for case notes data."""
+
+ content: str | None = None
+
+
+class CaseNotesCreate(CaseNotesBase):
+ """Pydantic model for creating case notes."""
+
+ pass
+
+
+class CaseNotesUpdate(CaseNotesBase):
+ """Pydantic model for updating case notes."""
+
+ pass
+
+
+class CaseNotesRead(CaseNotesBase):
+ """Pydantic model for reading case notes data."""
+
+ id: PrimaryKey
+ created_at: datetime | None = None
+ updated_at: datetime | None = None
+ last_updated_by: IndividualContactRead | None = None
+
+
+# Pydantic models...
+class CaseBase(DispatchBase):
+ """Base Pydantic model for case data."""
+
+ title: str
+ description: str | None = None
+ resolution: str | None = None
+ resolution_reason: CaseResolutionReason | None = None
+ resolved_by: IndividualContactRead | None = None
+ status: CaseStatus | None = None
+ visibility: Visibility | None = None
+
+ @field_validator("title")
+ @classmethod
+ def title_required(cls, v):
+ """Ensure the title field is not empty."""
+ if not v:
+ raise ValueError("must not be empty string")
+ return v
+
+ @field_validator("description")
+ @classmethod
+ def description_required(cls, v):
+ """Ensure the description field is not empty."""
+ if not v:
+ raise ValueError("must not be empty string")
+ return v
+
+
+class CaseCreate(CaseBase):
+ """Pydantic model for creating a new case."""
+
+ assignee: ParticipantUpdate | None = None
+ case_priority: CasePriorityCreate | None = None
+ case_severity: CaseSeverityCreate | None = None
+ case_type: CaseTypeCreate | None = None
+ dedicated_channel: bool | None = None
+ project: ProjectRead | None = None
+ reporter: ParticipantUpdate | None = None
+ tags: list[TagRead] | None = []
+ event: bool | None = False
+
+
+class CaseReadBasic(DispatchBase):
+ """Pydantic model for reading basic case data."""
+
+ id: PrimaryKey
+ name: NameStr | None = None
+
+
+class IncidentReadBasic(DispatchBase):
+ """Pydantic model for reading basic incident data."""
+
+ id: PrimaryKey
+ name: NameStr | None = None
+
+
+class CaseReadMinimal(CaseBase):
+ """Pydantic model for reading minimal case data."""
+
+ id: PrimaryKey
+ name: NameStr | None = None
+ status: CaseStatus | None = None # Used in table and for action disabling
+ closed_at: datetime | None = None
+ reported_at: datetime | None = None
+ stable_at: datetime | None = None
+ dedicated_channel: bool | None = None # Used by CaseStatus component
+ case_type: CaseTypeRead
+ case_severity: CaseSeverityRead
+ case_priority: CasePriorityRead
+ project: ProjectRead
+ assignee: ParticipantReadMinimal | None = None
+ case_costs: list[CaseCostReadMinimal] = []
+ incidents: list[IncidentReadBasic] | None = []
+
+
+class CaseReadMinimalWithExtras(CaseBase):
+ """Pydantic model for reading minimal case data."""
+
+ id: PrimaryKey
+ title: str
+ name: NameStr | None = None
+ description: str | None = None
+ resolution: str | None = None
+ resolution_reason: CaseResolutionReason | None = None
+ visibility: Visibility | None = None
+ status: CaseStatus | None = None # Used in table and for action disabling
+ reported_at: datetime | None = None
+ triage_at: datetime | None = None
+ stable_at: datetime | None = None
+ escalated_at: datetime | None = None
+ closed_at: datetime | None = None
+ dedicated_channel: bool | None = None # Used by CaseStatus component
+ case_type: CaseTypeRead
+ case_severity: CaseSeverityRead
+ case_priority: CasePriorityRead
+ project: ProjectRead
+ reporter: ParticipantReadMinimal | None = None
+ assignee: ParticipantReadMinimal | None = None
+ case_costs: list[CaseCostReadMinimal] = []
+ participants: list[ParticipantReadMinimal] = []
+ tags: list[TagRead] = []
+ ticket: TicketRead | None = None
+
+
+CaseReadMinimal.model_rebuild()
+
+
+class CaseRead(CaseBase):
+ """Pydantic model for reading detailed case data."""
+
+ id: PrimaryKey
+ assignee: ParticipantRead | None = None
+ case_costs: list[CaseCostRead] = []
+ case_priority: CasePriorityRead
+ case_severity: CaseSeverityRead
+ case_type: CaseTypeRead
+ closed_at: datetime | None = None
+ conversation: ConversationRead | None = None
+ created_at: datetime | None = None
+ stable_at: datetime | None = None
+ documents: list[DocumentRead] | None = []
+ duplicates: list[CaseReadBasic] | None = []
+ escalated_at: datetime | None = None
+ events: list[EventRead] | None = []
+ genai_analysis: dict[str, Any] | None = {}
+ groups: list[GroupRead] | None = []
+ incidents: list[IncidentReadBasic] | None = []
+ name: NameStr | None = None
+ participants: list[ParticipantRead] | None = []
+ project: ProjectRead
+ related: list[CaseReadMinimal] | None = []
+ reported_at: datetime | None = None
+ reporter: ParticipantRead | None = None
+ signal_instances: list[SignalInstanceRead] | None = []
+ storage: StorageRead | None = None
+ tags: list[TagRead] | None = []
+ ticket: TicketRead | None = None
+ total_cost_classic: float | None = None
+ total_cost_new: float | None = None
+ triage_at: datetime | None = None
+ updated_at: datetime | None = None
+ workflow_instances: list[WorkflowInstanceRead] | None = []
+ event: bool | None = False
+ case_notes: CaseNotesRead | None = None
+
+
+class CaseUpdate(CaseBase):
+ """Pydantic model for updating case data."""
+
+ assignee: ParticipantUpdate | None = None
+ case_costs: list[CaseCostUpdate] = []
+ case_priority: CasePriorityBase | None = None
+ case_severity: CaseSeverityBase | None = None
+ case_type: CaseTypeBase | None = None
+ closed_at: datetime | None = None
+ stable_at: datetime | None = None
+ duplicates: list[CaseReadBasic] | None = []
+ related: list[CaseRead] | None = []
+ reporter: ParticipantUpdate | None = None
+ escalated_at: datetime | None = None
+ incidents: list[IncidentReadBasic] | None = []
+ reported_at: datetime | None = None
+ tags: list[TagRead] | None = []
+ triage_at: datetime | None = None
+ case_notes: CaseNotesUpdate | None = None
+
+ @field_validator("tags")
+ @classmethod
+ def find_exclusive(cls, tags: list[TagRead] | None) -> list[TagRead] | None:
+ """Ensure only one exclusive tag per tag type is present."""
+ if not tags:
+ return tags
+
+ # Group tags by tag_type.id
+ exclusive_tags = defaultdict(list)
+ for t in tags:
+ if t.tag_type and t.tag_type.exclusive:
+ exclusive_tags[t.tag_type.id].append(t)
+
+ # Check for multiple exclusive tags of the same type
+ for tag_list in exclusive_tags.values():
+ if len(tag_list) > 1:
+ raise ValueError(
+ "Found multiple exclusive tags. Please ensure that only one tag of a given "
+ f"type is applied. Tags: {','.join([t.name for t in tag_list])}"
+ )
+
+ return tags
+
+
+class CasePagination(Pagination):
+ """Pydantic model for paginated minimal case results."""
+
+ items: list[CaseReadMinimal] = []
+
+
+class CasePaginationMinimalWithExtras(Pagination):
+ """Pydantic model for paginated minimal case results."""
+
+ items: list[CaseReadMinimalWithExtras] = []
+
+
+class CaseExpandedPagination(Pagination):
+ """Pydantic model for paginated expanded case results."""
+
+ items: list[CaseRead] = []
diff --git a/src/dispatch/plugins/dispatch_slack/tests/__init__.py b/src/dispatch/case/priority/__init__.py
similarity index 100%
rename from src/dispatch/plugins/dispatch_slack/tests/__init__.py
rename to src/dispatch/case/priority/__init__.py
diff --git a/src/dispatch/case/priority/config.py b/src/dispatch/case/priority/config.py
new file mode 100644
index 000000000000..04c7f959f475
--- /dev/null
+++ b/src/dispatch/case/priority/config.py
@@ -0,0 +1,52 @@
+default_case_priorities = [
+ {
+ "name": "Low",
+ "description": "This case should be triaged on a best-effort basis.",
+ "view_order": 1,
+ "color": "#8bc34a",
+ "page_assignee": False,
+ "default": True,
+ "enabled": True,
+ "disable_delayed_message_warning": False,
+ },
+ {
+ "name": "Medium",
+ "description": "This case should be triaged within 24hrs of case creation.",
+ "view_order": 2,
+ "color": "#ffeb3b",
+ "page_assignee": False,
+ "default": False,
+ "enabled": True,
+ "disable_delayed_message_warning": False,
+ },
+ {
+ "name": "High",
+ "description": "This case should be triaged within 8hrs of case creation.",
+ "view_order": 3,
+ "color": "#ff9800",
+ "page_assignee": False,
+ "default": False,
+ "enabled": True,
+ "disable_delayed_message_warning": False,
+ },
+ {
+ "name": "Critical",
+ "description": "This case should be triaged immediately.",
+ "view_order": 4,
+ "color": "#e53935",
+ "page_assignee": True,
+ "default": False,
+ "enabled": True,
+ "disable_delayed_message_warning": True,
+ },
+ {
+ "name": "Optional",
+ "description": "Triage of this case is optional.",
+ "view_order": 5,
+ "color": "#9e9e9e",
+ "page_assignee": False,
+ "default": False,
+ "enabled": True,
+ "disable_delayed_message_warning": False,
+ },
+]
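The seed data above flags exactly one priority as `default`; the priority service resolves it by filtering on that column. A stdlib sketch of the same lookup over a trimmed copy of the data (field names mirror the dicts above, the rest is illustrative):

```python
priorities = [
    {"name": "Low", "default": True, "enabled": True, "view_order": 1},
    {"name": "Critical", "default": False, "enabled": True, "view_order": 4},
]


def get_default_priority(priorities: list[dict]) -> dict:
    """Return the single enabled priority flagged as default, mirroring
    how get_default()/get_default_or_raise() behave in the service."""
    matches = [p for p in priorities if p["default"] and p["enabled"]]
    if len(matches) != 1:
        raise ValueError("exactly one enabled default case priority expected")
    return matches[0]


print(get_default_priority(priorities)["name"])  # Low
```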
diff --git a/src/dispatch/case/priority/models.py b/src/dispatch/case/priority/models.py
new file mode 100644
index 000000000000..ec4db44cad78
--- /dev/null
+++ b/src/dispatch/case/priority/models.py
@@ -0,0 +1,76 @@
+"""Models and schemas for the Dispatch case priority system."""
+
+from sqlalchemy import Column, Integer, String, Boolean
+from sqlalchemy.sql.schema import UniqueConstraint
+from sqlalchemy.event import listen
+from sqlalchemy_utils import TSVectorType
+
+from dispatch.database.core import Base, ensure_unique_default_per_project
+from dispatch.models import DispatchBase, NameStr, ProjectMixin, PrimaryKey, Pagination
+from dispatch.project.models import ProjectRead
+
+
+class CasePriority(Base, ProjectMixin):
+ """SQLAlchemy model for a case priority, representing the priority level of a case."""
+
+ __table_args__ = (UniqueConstraint("name", "project_id"),)
+ id = Column(Integer, primary_key=True)
+ name = Column(String)
+ description = Column(String)
+ page_assignee = Column(Boolean, default=False)
+ color = Column(String)
+ enabled = Column(Boolean, default=True)
+ default = Column(Boolean, default=False)
+ disable_delayed_message_warning = Column(Boolean, default=False)
+
+ # This column is used to control how priorities should be displayed
+ # Lower numbers will be shown first.
+ view_order = Column(Integer, default=9999)
+
+ search_vector = Column(TSVectorType("name", "description"))
+
+
+# Ensure only one default priority per project by listening to the 'default' field.
+
+listen(CasePriority.default, "set", ensure_unique_default_per_project)
+
+
+# Pydantic models
+class CasePriorityBase(DispatchBase):
+ """Base Pydantic model for case priority data."""
+
+ color: str | None = None
+ default: bool | None = None
+ page_assignee: bool | None = None
+ description: str | None = None
+ enabled: bool | None = None
+ name: NameStr
+ project: ProjectRead | None = None
+ view_order: int | None = None
+ disable_delayed_message_warning: bool | None = None
+
+
+class CasePriorityCreate(CasePriorityBase):
+ """Pydantic model for creating a new case priority."""
+
+ pass
+
+
+class CasePriorityUpdate(CasePriorityBase):
+ """Pydantic model for updating a case priority."""
+
+ pass
+
+
+class CasePriorityRead(CasePriorityBase):
+ """Pydantic model for reading case priority data."""
+
+ id: PrimaryKey | None = None
+
+
+class CasePriorityPagination(Pagination):
+ """Pydantic model for paginated case priority results."""
+
+ items: list[CasePriorityRead] = []
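The `listen(CasePriority.default, "set", ...)` hook keeps the at-most-one-default invariant per project. The sketch below shows the assumed behavior of `ensure_unique_default_per_project` (its real implementation lives in `dispatch.database.core` and is not shown in this diff): when one priority becomes the default, any other default in the same project is cleared.

```python
def set_default(priorities: list[dict], new_default_name: str) -> list[dict]:
    """Assumed invariant of the 'set' listener: exactly one priority in
    the project remains flagged as default after any change."""
    for p in priorities:
        p["default"] = p["name"] == new_default_name
    return priorities


prios = [{"name": "Low", "default": True}, {"name": "High", "default": False}]
set_default(prios, "High")
print([p["name"] for p in prios if p["default"]])  # ['High']
```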
diff --git a/src/dispatch/case/priority/service.py b/src/dispatch/case/priority/service.py
new file mode 100644
index 000000000000..76dc99474110
--- /dev/null
+++ b/src/dispatch/case/priority/service.py
@@ -0,0 +1,155 @@
+from pydantic import ValidationError
+
+from sqlalchemy.sql.expression import true
+
+from dispatch.project import service as project_service
+
+from .models import (
+ CasePriority,
+ CasePriorityCreate,
+ CasePriorityRead,
+ CasePriorityUpdate,
+)
+
+
+def get(*, db_session, case_priority_id: int) -> CasePriority | None:
+ """Returns a case priority based on the given priority id."""
+ return db_session.query(CasePriority).filter(CasePriority.id == case_priority_id).one_or_none()
+
+
+def get_default(*, db_session, project_id: int):
+ """Returns the default case priority."""
+ return (
+ db_session.query(CasePriority)
+ .filter(CasePriority.default == true())
+ .filter(CasePriority.project_id == project_id)
+ .one_or_none()
+ )
+
+
+def get_default_or_raise(*, db_session, project_id: int) -> CasePriority:
+ """Returns the default case priority or raises a ValidationError if one doesn't exist."""
+ case_priority = get_default(db_session=db_session, project_id=project_id)
+
+ if not case_priority:
+ raise ValidationError.from_exception_data(
+ "CasePriority",
+ [
+ {
+ "type": "value_error",
+ "loc": ("case_priority",),
+ "input": None,
+ "ctx": {"error": ValueError("No default case priority defined.")},
+ }
+ ],
+ )
+ return case_priority
+
+
+def get_by_name(*, db_session, project_id: int, name: str) -> CasePriority | None:
+ """Returns a case priority based on the given priority name."""
+ return (
+ db_session.query(CasePriority)
+ .filter(CasePriority.name == name)
+ .filter(CasePriority.project_id == project_id)
+ .one_or_none()
+ )
+
+
+def get_by_name_or_raise(
+    *, db_session, project_id: int, case_priority_in: CasePriorityRead
+) -> CasePriority:
+ """Returns the case priority specified or raises ValidationError."""
+ case_priority = get_by_name(
+ db_session=db_session, project_id=project_id, name=case_priority_in.name
+ )
+
+ if not case_priority:
+        raise ValidationError.from_exception_data(
+            "CasePriority",
+            [
+                {
+                    "type": "value_error",
+                    "loc": ("case_priority",),
+                    "input": case_priority_in.name,
+                    "ctx": {
+                        "error": ValueError(f"Case priority not found: {case_priority_in.name}")
+                    },
+                }
+            ],
+        )
+
+ return case_priority
+
+
+def get_by_name_or_default(
+    *, db_session, project_id: int, case_priority_in: CasePriorityRead
+) -> CasePriority:
+ """Returns a case priority based on a name or the default if not specified."""
+ if case_priority_in and case_priority_in.name:
+ case_priority = get_by_name(
+ db_session=db_session, project_id=project_id, name=case_priority_in.name
+ )
+ if case_priority:
+ return case_priority
+ return get_default_or_raise(db_session=db_session, project_id=project_id)
+
+
+def get_all(*, db_session, project_id: int | None = None) -> list[CasePriority | None]:
+ """Returns all case priorities."""
+ if project_id is not None:
+ return db_session.query(CasePriority).filter(CasePriority.project_id == project_id)
+ return db_session.query(CasePriority)
+
+
+def get_all_enabled(*, db_session, project_id: int | None = None) -> list[CasePriority | None]:
+ """Returns all enabled case priorities."""
+ if project_id is not None:
+ return (
+ db_session.query(CasePriority)
+ .filter(CasePriority.project_id == project_id)
+ .filter(CasePriority.enabled == true())
+ )
+ return db_session.query(CasePriority).filter(CasePriority.enabled == true())
+
+
+def create(*, db_session, case_priority_in: CasePriorityCreate) -> CasePriority:
+ """Creates a case priority."""
+ project = project_service.get_by_name_or_raise(
+ db_session=db_session, project_in=case_priority_in.project
+ )
+ case_priority = CasePriority(
+ **case_priority_in.dict(exclude={"project", "color"}), project=project
+ )
+ if case_priority_in.color:
+ case_priority.color = case_priority_in.color
+
+ db_session.add(case_priority)
+ db_session.commit()
+ return case_priority
+
+
+def update(
+ *, db_session, case_priority: CasePriority, case_priority_in: CasePriorityUpdate
+) -> CasePriority:
+ """Updates a case priority."""
+ case_priority_data = case_priority.dict()
+
+ update_data = case_priority_in.dict(exclude_unset=True, exclude={"project", "color"})
+
+ for field in case_priority_data:
+ if field in update_data:
+ setattr(case_priority, field, update_data[field])
+
+ if case_priority_in.color:
+ case_priority.color = case_priority_in.color
+
+ db_session.commit()
+ return case_priority
+
+
+def delete(*, db_session, case_priority_id: int):
+ """Deletes a case priority."""
+ db_session.query(CasePriority).filter(CasePriority.id == case_priority_id).delete()
+ db_session.commit()
diff --git a/src/dispatch/case/priority/views.py b/src/dispatch/case/priority/views.py
new file mode 100644
index 000000000000..981ae524a507
--- /dev/null
+++ b/src/dispatch/case/priority/views.py
@@ -0,0 +1,76 @@
+from fastapi import APIRouter, Depends, HTTPException, status
+
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.auth.permissions import SensitiveProjectActionPermission, PermissionsDependency
+from dispatch.models import PrimaryKey
+
+from .models import (
+ CasePriorityCreate,
+ CasePriorityPagination,
+ CasePriorityRead,
+ CasePriorityUpdate,
+)
+from .service import create, get, update
+
+
+router = APIRouter()
+
+
+@router.get("", response_model=CasePriorityPagination, tags=["case_priorities"])
+def get_case_priorities(common: CommonParameters):
+ """Returns all case priorities."""
+ return search_filter_sort_paginate(model="CasePriority", **common)
+
+
+@router.post(
+ "",
+ response_model=CasePriorityRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def create_case_priority(
+ db_session: DbSession,
+ case_priority_in: CasePriorityCreate,
+):
+ """Creates a new case priority."""
+ case_priority = create(db_session=db_session, case_priority_in=case_priority_in)
+ return case_priority
+
+
+@router.put(
+ "/{case_priority_id}",
+ response_model=CasePriorityRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def update_case_priority(
+ *,
+ db_session: DbSession,
+ case_priority_id: PrimaryKey,
+ case_priority_in: CasePriorityUpdate,
+):
+ """Updates an existing case priority."""
+ case_priority = get(db_session=db_session, case_priority_id=case_priority_id)
+ if not case_priority:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A case priority with this id does not exist."}],
+ )
+
+ case_priority = update(
+ db_session=db_session,
+ case_priority=case_priority,
+ case_priority_in=case_priority_in,
+ )
+ return case_priority
+
+
+@router.get("/{case_priority_id}", response_model=CasePriorityRead)
+def get_case_priority(db_session: DbSession, case_priority_id: PrimaryKey):
+ """Gets a case priority."""
+ case_priority = get(db_session=db_session, case_priority_id=case_priority_id)
+ if not case_priority:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A case priority with this id does not exist."}],
+ )
+ return case_priority
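Both the `PUT` and `GET` handlers above repeat the same fetch-then-404 step before acting on a record. The shape of that pattern, sketched without FastAPI (the exception class here is an illustrative stand-in for `HTTPException` with status 404):

```python
class NotFoundError(Exception):
    """Illustrative stand-in for FastAPI's HTTPException(status_code=404)."""


def get_or_404(store: dict, key: int) -> dict:
    """Fetch a record by id or raise, mirroring the handlers' fetch-then-check step."""
    record = store.get(key)
    if record is None:
        raise NotFoundError("A case priority with this id does not exist.")
    return record


store = {1: {"name": "P1"}}
print(get_or_404(store, 1))
```

Raising at the lookup site keeps each route handler focused on its happy path and guarantees a consistent error payload for missing ids.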
diff --git a/src/dispatch/case/scheduled.py b/src/dispatch/case/scheduled.py
new file mode 100644
index 000000000000..2c20a32b1e72
--- /dev/null
+++ b/src/dispatch/case/scheduled.py
@@ -0,0 +1,95 @@
+"""
+.. module: dispatch.case.scheduled
+ :platform: Unix
+ :copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
+ :license: Apache, see LICENSE for more details.
+"""
+
+import logging
+
+from datetime import datetime, date
+from schedule import every
+from sqlalchemy.orm import Session
+from sqlalchemy import or_
+
+from dispatch.decorators import scheduled_project_task, timer
+from dispatch.project.models import Project
+from dispatch.scheduler import scheduler
+
+from .enums import CaseStatus
+from .messaging import send_case_close_reminder, send_case_triage_reminder
+from .models import Case
+from .service import (
+ get_all_by_status,
+)
+
+
+log = logging.getLogger(__name__)
+
+
+@scheduler.add(every(1).day.at("18:00"), name="case-close-reminder")
+@timer
+@scheduled_project_task
+def case_close_reminder(db_session: Session, project: Project):
+ """Sends a reminder to the case assignee to close out their case."""
+ cases = get_all_by_status(
+ db_session=db_session, project_id=project.id, statuses=[CaseStatus.triage]
+ )
+
+ for case in cases:
+ try:
+            span = datetime.utcnow() - case.triage_at
+            if span.days >= 7 and date.today().isoweekday() == 1:
+                # we only send the reminder for cases that have been triaging
+                # longer than a week and only on Mondays
+                send_case_close_reminder(case, db_session)
+ except Exception as e:
+ # if one fails we don't want all to fail
+ log.exception(e)
+
+
+@scheduler.add(every(1).day.at("18:00"), name="case-triage-reminder")
+@timer
+@scheduled_project_task
+def case_triage_reminder(db_session: Session, project: Project):
+ """Sends a reminder to the case assignee to triage their case."""
+
+ cases = (
+ db_session.query(Case)
+ .filter(Case.project_id == project.id)
+ .filter(Case.status != CaseStatus.closed)
+ .filter(or_(Case.title == "Security Event Triage", Case.status == CaseStatus.new))
+ .all()
+ )
+
+ # if we want more specific SLA reminders, we would need to add additional data model
+ for case in cases:
+        span = datetime.utcnow() - case.created_at
+        if span.days >= 1:
+            # the daily schedule means each case gets at most one reminder per day
+            send_case_triage_reminder(case, db_session)
+
+
+@scheduler.add(every(1).day.at("18:00"), name="case-stable-reminder")
+@timer
+@scheduled_project_task
+def case_stable_reminder(db_session: Session, project: Project):
+ """Sends a reminder to the case assignee to close their stable case."""
+ cases = get_all_by_status(
+ db_session=db_session, project_id=project.id, statuses=[CaseStatus.stable]
+ )
+
+ for case in cases:
+ try:
+            if case.stable_at:
+                span = datetime.utcnow() - case.stable_at
+                if span.days >= 7 and date.today().isoweekday() == 1:
+                    # we only send the reminder for cases that have been stable
+                    # longer than a week and only on Mondays
+                    send_case_close_reminder(case, db_session)
+ except Exception as e:
+ # if one fails we don't want all to fail
+ log.exception(e)
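The close and stable reminders above both gate on "in this state at least a week, and today is Monday". That gate, isolated from the scheduler and the ORM (the function name is illustrative):

```python
from datetime import date, datetime


def should_remind(since: datetime, now: datetime, today: date) -> bool:
    """True when the case has been in its state for at least a week and it is Monday."""
    weeks, _ = divmod((now - since).days, 7)
    return weeks >= 1 and today.isoweekday() == 1


# A case triaging for 10 days, checked on a Monday (2024-01-08):
print(should_remind(datetime(2023, 12, 29), datetime(2024, 1, 8), date(2024, 1, 8)))
```

Because the job itself runs daily, the Monday check is what collapses the cadence to one reminder per case per week rather than one per day.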
diff --git a/src/dispatch/case/scheduled_internal.py b/src/dispatch/case/scheduled_internal.py
new file mode 100644
index 000000000000..036dd344cbe3
--- /dev/null
+++ b/src/dispatch/case/scheduled_internal.py
@@ -0,0 +1,2 @@
+def schedule_placeholder():
+ pass
diff --git a/src/dispatch/case/service.py b/src/dispatch/case/service.py
new file mode 100644
index 000000000000..509deff240fb
--- /dev/null
+++ b/src/dispatch/case/service.py
@@ -0,0 +1,493 @@
+import logging
+
+from datetime import datetime, timedelta
+
+from pydantic import ValidationError
+from sqlalchemy.orm import Session, joinedload, load_only
+
+from dispatch.auth.models import DispatchUser
+from dispatch.case.priority import service as case_priority_service
+from dispatch.case.severity import service as case_severity_service
+from dispatch.case.type import service as case_type_service
+from dispatch.case_cost import service as case_cost_service
+from dispatch.event import service as event_service
+from dispatch.incident import service as incident_service
+from dispatch.individual import service as individual_service
+from dispatch.participant.models import Participant
+from dispatch.participant import flows as participant_flows
+from dispatch.participant_role.models import ParticipantRoleType
+from dispatch.project import service as project_service
+from dispatch.service import flows as service_flows
+from dispatch.tag import service as tag_service
+
+from .enums import CaseStatus
+from .models import (
+ Case,
+ CaseCreate,
+ CaseNotes,
+ CaseRead,
+ CaseUpdate,
+)
+
+
+log = logging.getLogger(__name__)
+
+
+def get(*, db_session, case_id: int) -> Case | None:
+ """Returns a case based on the given id."""
+ return db_session.query(Case).filter(Case.id == case_id).first()
+
+
+def get_by_name(*, db_session, project_id: int, name: str) -> Case | None:
+ """Returns a case based on the given name."""
+ return (
+ db_session.query(Case)
+ .filter(Case.project_id == project_id)
+ .filter(Case.name == name)
+ .first()
+ )
+
+
+def get_by_name_or_raise(*, db_session, project_id: int, case_in: CaseRead) -> Case:
+ """Returns a case based on a given name or raises ValidationError"""
+ case = get_by_name(db_session=db_session, project_id=project_id, name=case_in.name)
+
+ if not case:
+        raise ValidationError.from_exception_data(
+            "Case",
+            [
+                {
+                    "type": "value_error",
+                    "loc": ("case",),
+                    "input": case_in.name,
+                    "ctx": {"error": ValueError(f"Case not found: {case_in.name}")},
+                }
+            ],
+        )
+ return case
+
+
+def get_all(*, db_session, project_id: int) -> list[Case | None]:
+ """Returns all cases."""
+ return db_session.query(Case).filter(Case.project_id == project_id)
+
+
+def get_all_open_by_case_type(*, db_session, case_type_id: int) -> list[Case | None]:
+ """Returns all non-closed cases based on the given case type."""
+ return (
+ db_session.query(Case)
+ .filter(Case.status != CaseStatus.closed)
+ .filter(Case.status != CaseStatus.escalated)
+ .filter(Case.case_type_id == case_type_id)
+ .all()
+ )
+
+
+def get_all_by_status(
+ *, db_session: Session, project_id: int, statuses: list[str]
+) -> list[Case | None]:
+ """Returns all cases based on a given list of statuses."""
+ return (
+ db_session.query(Case)
+ .options(
+ load_only(
+ Case.id,
+ Case.status,
+ Case.reported_at,
+ Case.created_at,
+ Case.updated_at,
+ Case.triage_at,
+ Case.escalated_at,
+ Case.stable_at,
+ Case.closed_at,
+ ),
+ )
+ .filter(Case.project_id == project_id)
+ .filter(Case.status.in_(statuses))
+ .all()
+ )
+
+
+def get_all_last_x_hours(*, db_session, hours: int) -> list[Case | None]:
+ """Returns all cases in the last x hours."""
+ now = datetime.utcnow()
+ return db_session.query(Case).filter(Case.created_at >= now - timedelta(hours=hours)).all()
+
+
+def get_all_last_x_hours_by_status(
+ *, db_session, project_id: int, status: str, hours: int
+) -> list[Case | None]:
+ """Returns all cases of a given status in the last x hours."""
+ now = datetime.utcnow()
+
+ if status == CaseStatus.new:
+ return (
+ db_session.query(Case)
+ .filter(Case.project_id == project_id)
+ .filter(Case.status == CaseStatus.new)
+ .filter(Case.created_at >= now - timedelta(hours=hours))
+ .all()
+ )
+
+ if status == CaseStatus.triage:
+ return (
+ db_session.query(Case)
+ .filter(Case.project_id == project_id)
+ .filter(Case.status == CaseStatus.triage)
+ .filter(Case.triage_at >= now - timedelta(hours=hours))
+ .all()
+ )
+
+ if status == CaseStatus.escalated:
+ return (
+ db_session.query(Case)
+ .filter(Case.project_id == project_id)
+ .filter(Case.status == CaseStatus.escalated)
+ .filter(Case.escalated_at >= now - timedelta(hours=hours))
+ .all()
+ )
+
+ if status == CaseStatus.stable:
+ return (
+ db_session.query(Case)
+ .filter(Case.project_id == project_id)
+ .filter(Case.status == CaseStatus.stable)
+ .filter(Case.stable_at >= now - timedelta(hours=hours))
+ .all()
+ )
+
+ if status == CaseStatus.closed:
+ return (
+ db_session.query(Case)
+ .filter(Case.project_id == project_id)
+ .filter(Case.status == CaseStatus.closed)
+ .filter(Case.closed_at >= now - timedelta(hours=hours))
+ .all()
+ )
+
+    return []
+
+
+def create(*, db_session, case_in: CaseCreate, current_user: DispatchUser = None) -> Case:
+ """
+ Creates a new case.
+
+ Returns:
+ The created case.
+
+ Raises:
+        ValueError: If the case is not being created with a dedicated channel
+        and its case type does not have a conversation target; in that
+        situation, the case is not created.
+ """
+ project = project_service.get_by_name_or_default(
+ db_session=db_session, project_in=case_in.project
+ )
+
+ tag_objs = []
+ for t in case_in.tags:
+ tag_objs.append(tag_service.get_or_create(db_session=db_session, tag_in=t))
+
+ # TODO(mvilanova): allow to provide related cases and incidents, and duplicated cases
+ case_type = case_type_service.get_by_name_or_default(
+ db_session=db_session, project_id=project.id, case_type_in=case_in.case_type
+ )
+
+ # Cases with dedicated channels do not require a conversation target.
+ if not case_in.dedicated_channel:
+ if not case_type or not case_type.conversation_target:
+ raise ValueError(
+ f"Cases without dedicated channels require a conversation target. "
+ f"Case type with name {case_in.case_type.name} does not have a "
+ f"conversation target. The case will not be created."
+ )
+
+ case = Case(
+ title=case_in.title,
+ description=case_in.description,
+ project=project,
+ status=case_in.status,
+ dedicated_channel=case_in.dedicated_channel,
+ tags=tag_objs,
+ case_type=case_type,
+ event=case_in.event,
+ )
+
+ case.visibility = case_type.visibility
+ if case_in.visibility:
+ case.visibility = case_in.visibility
+
+ case_severity = case_severity_service.get_by_name_or_default(
+ db_session=db_session, project_id=project.id, case_severity_in=case_in.case_severity
+ )
+ case.case_severity = case_severity
+
+ case_priority = case_priority_service.get_by_name_or_default(
+ db_session=db_session, project_id=project.id, case_priority_in=case_in.case_priority
+ )
+ case.case_priority = case_priority
+
+ db_session.add(case)
+ db_session.commit()
+
+ event_service.log_case_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description="Case created",
+ details={
+ "title": case.title,
+ "description": case.description,
+ "type": case.case_type.name,
+ "severity": case.case_severity.name,
+ "priority": case.case_priority.name,
+ "status": case.status,
+ "visibility": case.visibility,
+ },
+ case_id=case.id,
+ pinned=True,
+ )
+
+ assignee_email = None
+ if case_in.assignee:
+ # we assign the case to the assignee provided
+ assignee_email = case_in.assignee.individual.email
+ elif case_type.oncall_service:
+ # we assign the case to the oncall person for the given case type
+ assignee_email = service_flows.resolve_oncall(
+ service=case_type.oncall_service, db_session=db_session
+ )
+ elif current_user:
+ # we fall back to assign the case to the current user
+ assignee_email = current_user.email
+
+ # add assignee
+ if assignee_email:
+ participant_flows.add_participant(
+ assignee_email,
+ case,
+ db_session,
+ roles=[ParticipantRoleType.assignee],
+ )
+
+ # add reporter
+ if case_in.reporter:
+ participant_flows.add_participant(
+ case_in.reporter.individual.email,
+ case,
+ db_session,
+ roles=[ParticipantRoleType.reporter],
+ )
+
+ return case
+
+
+def update(*, db_session, case: Case, case_in: CaseUpdate, current_user: DispatchUser) -> Case:
+ """Updates an existing case."""
+ update_data = case_in.dict(
+ exclude_unset=True,
+ exclude={
+ "assignee",
+ "case_costs",
+ "case_notes",
+ "case_priority",
+ "case_severity",
+ "case_type",
+ "duplicates",
+ "incidents",
+ "project",
+ "related",
+ "reporter",
+ "resolved_by",
+ "status",
+ "tags",
+ "visibility",
+ },
+ )
+
+ for field in update_data.keys():
+ setattr(case, field, update_data[field])
+
+ if case_in.case_type:
+ if case.case_type.name != case_in.case_type.name:
+ case_type = case_type_service.get_by_name(
+ db_session=db_session,
+ project_id=case.project.id,
+ name=case_in.case_type.name,
+ )
+ if case_type:
+ case.case_type = case_type
+
+ event_service.log_case_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=(
+ f"Case type changed to {case_in.case_type.name.lower()} "
+ f"by {current_user.email}"
+ ),
+ dispatch_user_id=current_user.id,
+ case_id=case.id,
+ )
+ else:
+ log.warning(f"Case type with name {case_in.case_type.name.lower()} not found.")
+
+ if case_in.case_severity:
+ if case.case_severity.name != case_in.case_severity.name:
+ case_severity = case_severity_service.get_by_name(
+ db_session=db_session,
+ project_id=case.project.id,
+ name=case_in.case_severity.name,
+ )
+ if case_severity:
+ case.case_severity = case_severity
+
+ event_service.log_case_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=(
+ f"Case severity changed to {case_in.case_severity.name.lower()} "
+ f"by {current_user.email}"
+ ),
+ dispatch_user_id=current_user.id,
+ case_id=case.id,
+ )
+ else:
+ log.warning(
+ f"Case severity with name {case_in.case_severity.name.lower()} not found."
+ )
+
+ if case_in.case_priority:
+ if case.case_priority.name != case_in.case_priority.name:
+ case_priority = case_priority_service.get_by_name(
+ db_session=db_session,
+ project_id=case.project.id,
+ name=case_in.case_priority.name,
+ )
+ if case_priority:
+ case.case_priority = case_priority
+
+ event_service.log_case_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=(
+ f"Case priority changed to {case_in.case_priority.name.lower()} "
+ f"by {current_user.email}"
+ ),
+ dispatch_user_id=current_user.id,
+ case_id=case.id,
+ )
+ else:
+ log.warning(
+ f"Case priority with name {case_in.case_priority.name.lower()} not found."
+ )
+
+ if case.status != case_in.status:
+ case.status = case_in.status
+
+ event_service.log_case_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=(
+ f"Case status changed to {case_in.status.lower()} by {current_user.email}"
+ ),
+ dispatch_user_id=current_user.id,
+ case_id=case.id,
+ )
+
+ if case.status == CaseStatus.closed:
+ individual = individual_service.get_or_create(
+ db_session=db_session,
+ email=current_user.email,
+ project=case.project,
+ )
+ case.resolved_by = individual
+
+ if case.visibility != case_in.visibility:
+ case.visibility = case_in.visibility
+
+ event_service.log_case_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=(
+ f"Case visibility changed to {case_in.visibility.lower()} by {current_user.email}"
+ ),
+ dispatch_user_id=current_user.id,
+ case_id=case.id,
+ )
+
+ case_costs = []
+ for case_cost in case_in.case_costs:
+ case_costs.append(
+ case_cost_service.get_or_create(db_session=db_session, case_cost_in=case_cost)
+ )
+ case.case_costs = case_costs
+
+ tags = []
+ for t in case_in.tags:
+ tags.append(tag_service.get_or_create(db_session=db_session, tag_in=t))
+ case.tags = tags
+
+ related = []
+ for r in case_in.related:
+ related.append(get(db_session=db_session, case_id=r.id))
+ case.related = related
+
+ duplicates = []
+ for d in case_in.duplicates:
+ duplicates.append(get(db_session=db_session, case_id=d.id))
+ case.duplicates = duplicates
+
+ incidents = []
+ for i in case_in.incidents:
+ incidents.append(incident_service.get(db_session=db_session, incident_id=i.id))
+ case.incidents = incidents
+
+ # Handle case notes update
+ if case_in.case_notes is not None:
+ # Get or create the individual contact
+ individual = individual_service.get_or_create(
+ db_session=db_session,
+ email=current_user.email,
+ project=case.project,
+ )
+
+ if case.case_notes:
+ # Update existing notes
+ case.case_notes.content = case_in.case_notes.content
+ case.case_notes.last_updated_by_id = individual.id
+ else:
+ # Create new notes
+ notes = CaseNotes(
+ content=case_in.case_notes.content,
+ last_updated_by_id=individual.id,
+ case_id=case.id,
+ )
+ db_session.add(notes)
+ case.case_notes = notes
+
+ db_session.commit()
+
+ return case
+
+
+def delete(*, db_session, case_id: int):
+ """Deletes an existing case."""
+ db_session.query(Case).filter(Case.id == case_id).delete()
+ db_session.commit()
+
+
+def get_participants(
+ *, db_session: Session, case_id: int, minimal: bool = False
+) -> list[Participant]:
+ """Returns a list of participants based on the given case id."""
+ if minimal:
+ case = (
+ db_session.query(Case)
+ .join(Case.participants) # Use join for minimal
+ .filter(Case.id == case_id)
+ .first()
+ )
+ else:
+ case = (
+ db_session.query(Case)
+ .options(joinedload(Case.participants)) # Use joinedload for full objects
+ .filter(Case.id == case_id)
+ .first()
+ )
+
+ return [] if case is None or case.participants is None else case.participants
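The `update` flow above leans on Pydantic's `exclude_unset=True` so that only fields the client explicitly sent overwrite the stored record. The merge semantics of that loop, sketched with plain dicts (no ORM or Pydantic; names are illustrative):

```python
def apply_partial_update(current: dict, update_data: dict) -> dict:
    """Copy only the keys present in update_data onto a copy of the current record."""
    merged = dict(current)
    for field in current:
        if field in update_data:
            merged[field] = update_data[field]
    return merged


case = {"title": "Suspicious login", "description": "tbd", "resolution": None}
# The client only sent a new description; title and resolution are untouched.
print(apply_partial_update(case, {"description": "Confirmed benign"}))
```

Without `exclude_unset`, every omitted optional field would serialize as `None` and silently blank out existing values, which is why the service builds `update_data` that way before iterating.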
diff --git a/src/dispatch/policy/__init__.py b/src/dispatch/case/severity/__init__.py
similarity index 100%
rename from src/dispatch/policy/__init__.py
rename to src/dispatch/case/severity/__init__.py
diff --git a/src/dispatch/case/severity/config.py b/src/dispatch/case/severity/config.py
new file mode 100644
index 000000000000..8b1172fd9323
--- /dev/null
+++ b/src/dispatch/case/severity/config.py
@@ -0,0 +1,42 @@
+default_case_severities = [
+ {
+ "name": "Undetermined",
+ "description": "The severity of the case has not yet been determined.",
+ "view_order": 1,
+ "color": "#9e9e9e",
+ "default": True,
+ "enabled": True,
+ },
+ {
+ "name": "Low",
+ "description": "Low severity.",
+ "view_order": 2,
+ "color": "#8bc34a",
+ "default": False,
+ "enabled": True,
+ },
+ {
+ "name": "Medium",
+ "description": "Medium severity.",
+ "view_order": 3,
+ "color": "#ffeb3b",
+ "default": False,
+ "enabled": True,
+ },
+ {
+ "name": "High",
+ "description": "High severity.",
+ "view_order": 4,
+ "color": "#ff9800",
+ "default": False,
+ "enabled": True,
+ },
+ {
+ "name": "Critical",
+ "description": "Critical severity.",
+ "view_order": 5,
+ "color": "#e53935",
+ "default": False,
+ "enabled": True,
+ },
+]
diff --git a/src/dispatch/case/severity/models.py b/src/dispatch/case/severity/models.py
new file mode 100644
index 000000000000..91a3ed12a0df
--- /dev/null
+++ b/src/dispatch/case/severity/models.py
@@ -0,0 +1,74 @@
+"""Models and schemas for the Dispatch case severity system."""
+
+from sqlalchemy import Column, Integer, String, Boolean
+from sqlalchemy.sql.schema import UniqueConstraint
+from sqlalchemy.event import listen
+from sqlalchemy_utils import TSVectorType
+
+from dispatch.database.core import Base, ensure_unique_default_per_project
+from dispatch.models import DispatchBase, NameStr, ProjectMixin, PrimaryKey, Pagination
+from dispatch.project.models import ProjectRead
+
+
+class CaseSeverity(Base, ProjectMixin):
+ """SQLAlchemy model for a case severity, representing the severity level of a case."""
+
+ __table_args__ = (UniqueConstraint("name", "project_id"),)
+ id = Column(Integer, primary_key=True)
+ name = Column(String)
+ description = Column(String)
+ color = Column(String)
+ enabled = Column(Boolean, default=True)
+ default = Column(Boolean, default=False)
+
+ # This column is used to control how severities should be displayed
+ # Lower numbers will be shown first.
+ view_order = Column(Integer, default=9999)
+
+ search_vector = Column(
+ TSVectorType(
+ "name",
+ "description",
+ regconfig="pg_catalog.simple",
+ )
+ )
+
+
+listen(CaseSeverity.default, "set", ensure_unique_default_per_project)
+
+
+# Pydantic models
+class CaseSeverityBase(DispatchBase):
+ """Base Pydantic model for case severity data."""
+
+ color: str | None = None
+ default: bool | None = None
+ description: str | None = None
+ enabled: bool | None = None
+ name: NameStr
+ project: ProjectRead | None = None
+ view_order: int | None = None
+
+
+class CaseSeverityCreate(CaseSeverityBase):
+ """Pydantic model for creating a new case severity."""
+
+ pass
+
+
+class CaseSeverityUpdate(CaseSeverityBase):
+ """Pydantic model for updating a case severity."""
+
+ pass
+
+
+class CaseSeverityRead(CaseSeverityBase):
+ """Pydantic model for reading case severity data."""
+
+ id: PrimaryKey
+
+
+class CaseSeverityPagination(Pagination):
+ """Pydantic model for paginated case severity results."""
+
+ items: list[CaseSeverityRead] = []
diff --git a/src/dispatch/case/severity/service.py b/src/dispatch/case/severity/service.py
new file mode 100644
index 000000000000..b5420f2937d2
--- /dev/null
+++ b/src/dispatch/case/severity/service.py
@@ -0,0 +1,149 @@
+from pydantic import ValidationError
+
+from sqlalchemy.sql.expression import true
+
+from dispatch.project import service as project_service
+
+from .models import (
+ CaseSeverity,
+ CaseSeverityCreate,
+ CaseSeverityRead,
+ CaseSeverityUpdate,
+)
+
+
+def get(*, db_session, case_severity_id: int) -> CaseSeverity | None:
+ """Returns a case severity based on the given severity id."""
+ return db_session.query(CaseSeverity).filter(CaseSeverity.id == case_severity_id).one_or_none()
+
+
+def get_default(*, db_session, project_id: int):
+ """Returns the default case severity."""
+ return (
+ db_session.query(CaseSeverity)
+ .filter(CaseSeverity.default == true())
+ .filter(CaseSeverity.project_id == project_id)
+ .one_or_none()
+ )
+
+
+def get_default_or_raise(*, db_session, project_id: int) -> CaseSeverity:
+ """Returns the default case severity or raises a ValidationError if one doesn't exist."""
+ case_severity = get_default(db_session=db_session, project_id=project_id)
+
+ if not case_severity:
+        raise ValidationError.from_exception_data(
+            "CaseSeverity",
+            [
+                {
+                    "type": "value_error",
+                    "loc": ("case_severity",),
+                    "input": None,
+                    "ctx": {"error": ValueError("No default case severity defined.")},
+                }
+            ],
+        )
+ return case_severity
+
+
+def get_by_name(*, db_session, project_id: int, name: str) -> CaseSeverity | None:
+ """Returns a case severity based on the given severity name."""
+ return (
+ db_session.query(CaseSeverity)
+ .filter(CaseSeverity.name == name)
+ .filter(CaseSeverity.project_id == project_id)
+ .one_or_none()
+ )
+
+
+def get_by_name_or_raise(
+    *, db_session, project_id: int, case_severity_in: CaseSeverityRead
+) -> CaseSeverity:
+ """Returns the case severity specified or raises ValidationError."""
+ case_severity = get_by_name(
+ db_session=db_session, project_id=project_id, name=case_severity_in.name
+ )
+
+ if not case_severity:
+        raise ValidationError.from_exception_data(
+            "CaseSeverity",
+            [
+                {
+                    "type": "value_error",
+                    "loc": ("case_severity",),
+                    "input": case_severity_in.name,
+                    "ctx": {
+                        "error": ValueError(f"Case severity not found: {case_severity_in.name}")
+                    },
+                }
+            ],
+        )
+
+ return case_severity
+
+
+def get_by_name_or_default(
+    *, db_session, project_id: int, case_severity_in: CaseSeverityRead
+) -> CaseSeverity:
+ """Returns a case severity based on a name or the default if not specified."""
+ if case_severity_in and case_severity_in.name:
+ case_severity = get_by_name(
+ db_session=db_session, project_id=project_id, name=case_severity_in.name
+ )
+ if case_severity:
+ return case_severity
+ return get_default_or_raise(db_session=db_session, project_id=project_id)
+
+
+def get_all(*, db_session, project_id: int | None = None) -> list[CaseSeverity | None]:
+ """Returns all case severities."""
+ if project_id:
+ return db_session.query(CaseSeverity).filter(CaseSeverity.project_id == project_id)
+ return db_session.query(CaseSeverity)
+
+
+def get_all_enabled(*, db_session, project_id: int | None = None) -> list[CaseSeverity | None]:
+ """Returns all enabled case severities."""
+ if project_id:
+ return (
+ db_session.query(CaseSeverity)
+ .filter(CaseSeverity.project_id == project_id)
+ .filter(CaseSeverity.enabled == true())
+ )
+ return db_session.query(CaseSeverity).filter(CaseSeverity.enabled == true())
+
+
+def create(*, db_session, case_severity_in: CaseSeverityCreate) -> CaseSeverity:
+ """Creates a case severity."""
+ project = project_service.get_by_name_or_raise(
+ db_session=db_session, project_in=case_severity_in.project
+ )
+ case_severity = CaseSeverity(
+ **case_severity_in.dict(exclude={"project", "color"}), project=project
+ )
+ if case_severity_in.color:
+ case_severity.color = case_severity_in.color
+
+ db_session.add(case_severity)
+ db_session.commit()
+ return case_severity
+
+
+def update(
+ *, db_session, case_severity: CaseSeverity, case_severity_in: CaseSeverityUpdate
+) -> CaseSeverity:
+ """Updates a case severity."""
+ case_severity_data = case_severity.dict()
+
+ update_data = case_severity_in.dict(exclude_unset=True, exclude={"project", "color"})
+
+ for field in case_severity_data:
+ if field in update_data:
+ setattr(case_severity, field, update_data[field])
+
+ if case_severity_in.color:
+ case_severity.color = case_severity_in.color
+
+ db_session.commit()
+ return case_severity
+
+
+def delete(*, db_session, case_severity_id: int):
+ """Deletes a case severity."""
+ db_session.query(CaseSeverity).filter(CaseSeverity.id == case_severity_id).delete()
+ db_session.commit()
diff --git a/src/dispatch/case/severity/views.py b/src/dispatch/case/severity/views.py
new file mode 100644
index 000000000000..5e1718b80f94
--- /dev/null
+++ b/src/dispatch/case/severity/views.py
@@ -0,0 +1,77 @@
+from fastapi import APIRouter, Depends, HTTPException, status
+
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.auth.permissions import SensitiveProjectActionPermission, PermissionsDependency
+from dispatch.models import PrimaryKey
+
+from .models import (
+ CaseSeverityCreate,
+ CaseSeverityPagination,
+ CaseSeverityRead,
+ CaseSeverityUpdate,
+)
+from .service import create, get, update
+
+
+router = APIRouter()
+
+
+@router.get("", response_model=CaseSeverityPagination, tags=["case_severities"])
+def get_case_severities(common: CommonParameters):
+ """Returns all case severities."""
+ return search_filter_sort_paginate(model="CaseSeverity", **common)
+
+
+@router.post(
+ "",
+ response_model=CaseSeverityRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def create_case_severity(
+ *,
+ db_session: DbSession,
+ case_severity_in: CaseSeverityCreate,
+):
+ """Creates a new case severity."""
+ case_severity = create(db_session=db_session, case_severity_in=case_severity_in)
+ return case_severity
+
+
+@router.put(
+ "/{case_severity_id}",
+ response_model=CaseSeverityRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def update_case_severity(
+ *,
+ db_session: DbSession,
+ case_severity_id: PrimaryKey,
+ case_severity_in: CaseSeverityUpdate,
+):
+ """Updates an existing case severity."""
+ case_severity = get(db_session=db_session, case_severity_id=case_severity_id)
+ if not case_severity:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A case severity with this id does not exist."}],
+ )
+
+ case_severity = update(
+ db_session=db_session,
+ case_severity=case_severity,
+ case_severity_in=case_severity_in,
+ )
+ return case_severity
+
+
+@router.get("/{case_severity_id}", response_model=CaseSeverityRead)
+def get_case_severity(db_session: DbSession, case_severity_id: PrimaryKey):
+ """Gets a case severity."""
+ case_severity = get(db_session=db_session, case_severity_id=case_severity_id)
+ if not case_severity:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A case severity with this id does not exist."}],
+ )
+ return case_severity
diff --git a/src/dispatch/status_report/__init__.py b/src/dispatch/case/type/__init__.py
similarity index 100%
rename from src/dispatch/status_report/__init__.py
rename to src/dispatch/case/type/__init__.py
diff --git a/src/dispatch/case/type/config.py b/src/dispatch/case/type/config.py
new file mode 100644
index 000000000000..222f1df196f9
--- /dev/null
+++ b/src/dispatch/case/type/config.py
@@ -0,0 +1,10 @@
+from dispatch.enums import Visibility
+
+default_case_type = {
+ "name": "Default",
+ "description": "This is the default case type.",
+ "visibility": Visibility.open,
+ "exclude_from_metrics": False,
+ "default": True,
+ "enabled": True,
+}
diff --git a/src/dispatch/case/type/models.py b/src/dispatch/case/type/models.py
new file mode 100644
index 000000000000..c3b44114ba9d
--- /dev/null
+++ b/src/dispatch/case/type/models.py
@@ -0,0 +1,148 @@
+"""Models for case types and related entities in the Dispatch application."""
+
+from pydantic import field_validator, AnyHttpUrl
+
+from sqlalchemy import JSON, Boolean, Column, ForeignKey, Integer, String
+from sqlalchemy.event import listen
+from sqlalchemy.ext.hybrid import hybrid_method
+from sqlalchemy.orm import relationship
+from sqlalchemy.sql import false
+from sqlalchemy.sql.schema import UniqueConstraint
+from sqlalchemy_utils import TSVectorType
+
+from dispatch.cost_model.models import CostModelRead
+from dispatch.database.core import Base, ensure_unique_default_per_project
+from dispatch.enums import Visibility
+from dispatch.models import DispatchBase, NameStr, Pagination, PrimaryKey, ProjectMixin
+from dispatch.plugin.models import PluginMetadata
+from dispatch.project.models import ProjectRead
+
+
+class CaseType(ProjectMixin, Base):
+ """SQLAlchemy model for case types, representing different types of cases in the system."""
+
+ __table_args__ = (UniqueConstraint("name", "project_id"),)
+ id = Column(Integer, primary_key=True)
+ name = Column(String)
+ description = Column(String)
+ visibility = Column(String, default=Visibility.open)
+ default = Column(Boolean, default=False)
+ enabled = Column(Boolean, default=True)
+ exclude_from_metrics = Column(Boolean, default=False)
+ plugin_metadata = Column(JSON, default=[])
+ conversation_target = Column(String)
+ auto_close = Column(Boolean, default=False, server_default=false())
+ generate_read_in_summary = Column(Boolean, default=False, server_default=false())
+
+ # the catalog here is simple to help matching "named entities"
+ search_vector = Column(TSVectorType("name", regconfig="pg_catalog.simple"))
+
+ # relationships
+ case_template_document_id = Column(Integer, ForeignKey("document.id"))
+ case_template_document = relationship("Document")
+
+ oncall_service_id = Column(Integer, ForeignKey("service.id"))
+ oncall_service = relationship("Service")
+
+ incident_type_id = Column(Integer, ForeignKey("incident_type.id"))
+ incident_type = relationship("IncidentType")
+
+ cost_model_id = Column(Integer, ForeignKey("cost_model.id"), nullable=True, default=None)
+ cost_model = relationship(
+ "CostModel",
+ )
+
+ @hybrid_method
+ def get_meta(self, slug):
+ """Retrieve plugin metadata by slug."""
+ if not self.plugin_metadata:
+ return
+
+ for m in self.plugin_metadata:
+ if m["slug"] == slug:
+ return m
+
+
+listen(CaseType.default, "set", ensure_unique_default_per_project)
+
+
+# Pydantic models
+class Document(DispatchBase):
+ """Pydantic model for a document related to a case type."""
+
+ id: PrimaryKey
+ description: str | None = None
+ name: NameStr
+ resource_id: str | None = None
+ resource_type: str | None = None
+ weblink: AnyHttpUrl | None = None
+
+
+class IncidentType(DispatchBase):
+ """Pydantic model for an incident type related to a case type."""
+
+ id: PrimaryKey
+ description: str | None = None
+ name: NameStr
+ visibility: str | None = None
+
+
+class Service(DispatchBase):
+ """Pydantic model for a service related to a case type."""
+
+ id: PrimaryKey
+ description: str | None = None
+ external_id: str
+ is_active: bool | None = None
+ name: NameStr
+ type: str | None = None
+
+
+class CaseTypeBase(DispatchBase):
+ """Base Pydantic model for case types, used for shared fields."""
+
+ case_template_document: Document | None = None
+ conversation_target: str | None = None
+ default: bool | None = False
+ description: str | None = None
+ enabled: bool | None = True
+ exclude_from_metrics: bool | None = False
+ incident_type: IncidentType | None = None
+ name: NameStr
+ oncall_service: Service | None = None
+ plugin_metadata: list[PluginMetadata] = []
+ project: ProjectRead | None = None
+ visibility: str | None = None
+ cost_model: CostModelRead | None = None
+ auto_close: bool | None = False
+ generate_read_in_summary: bool | None = False
+
+ @field_validator("plugin_metadata", mode="before")
+ @classmethod
+ def replace_none_with_empty_list(cls, value):
+ """Ensure plugin_metadata is always a list, replacing None with an empty list."""
+ return [] if value is None else value
+
+
+class CaseTypeCreate(CaseTypeBase):
+ """Pydantic model for creating a new case type."""
+
+ pass
+
+
+class CaseTypeUpdate(CaseTypeBase):
+ """Pydantic model for updating an existing case type."""
+
+ id: PrimaryKey | None = None
+
+
+class CaseTypeRead(CaseTypeBase):
+ """Pydantic model for reading a case type from the database."""
+
+ id: PrimaryKey
+
+
+class CaseTypePagination(Pagination):
+ """Pydantic model for paginated case type results."""
+
+ items: list[CaseTypeRead] = []
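`CaseType.get_meta` above does a linear scan over the JSON `plugin_metadata` list and returns the first entry whose slug matches. As a standalone sketch of that lookup outside SQLAlchemy (the sample entry is illustrative):

```python
def get_meta(plugin_metadata, slug):
    """Return the first plugin metadata entry matching slug, or None."""
    for m in plugin_metadata or []:
        if m.get("slug") == slug:
            return m
    return None
```

The `or []` guard mirrors the model's early return when `plugin_metadata` is empty or None.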
diff --git a/src/dispatch/case/type/service.py b/src/dispatch/case/type/service.py
new file mode 100644
index 000000000000..4468a5027db4
--- /dev/null
+++ b/src/dispatch/case/type/service.py
@@ -0,0 +1,194 @@
+from sqlalchemy.sql.expression import true
+
+from dispatch.case import service as case_service
+from dispatch.case_cost import service as case_cost_service
+from dispatch.cost_model import service as cost_model_service
+from dispatch.document import service as document_service
+from dispatch.incident.type import service as incident_type_service
+from dispatch.project import service as project_service
+from dispatch.service import service as service_service
+
+from .models import CaseType, CaseTypeCreate, CaseTypeRead, CaseTypeUpdate
+
+
+def get(*, db_session, case_type_id: int) -> CaseType | None:
+ """Returns a case type based on the given type id."""
+ return db_session.query(CaseType).filter(CaseType.id == case_type_id).one_or_none()
+
+
+def get_default(*, db_session, project_id: int):
+ """Returns the default case type."""
+ return (
+ db_session.query(CaseType)
+ .filter(CaseType.default == true())
+ .filter(CaseType.project_id == project_id)
+ .one_or_none()
+ )
+
+
+def get_default_or_raise(*, db_session, project_id: int) -> CaseType:
+ """Returns the default case type or raises a ValueError if one doesn't exist."""
+ case_type = get_default(db_session=db_session, project_id=project_id)
+
+ if not case_type:
+ raise ValueError("No default case type defined.")
+ return case_type
+
+
+def get_by_name(*, db_session, project_id: int, name: str) -> CaseType | None:
+ """Returns a case type based on the given type name."""
+ return (
+ db_session.query(CaseType)
+ .filter(CaseType.name == name)
+ .filter(CaseType.project_id == project_id)
+ .one_or_none()
+ )
+
+
+def get_by_name_or_raise(*, db_session, project_id: int, case_type_in: CaseTypeRead) -> CaseType:
+ """Returns the case type specified or raises a ValueError."""
+ case_type = get_by_name(db_session=db_session, project_id=project_id, name=case_type_in.name)
+
+ if not case_type:
+ raise ValueError(f"Case type not found: {case_type_in.name}")
+
+ return case_type
+
+
+def get_by_name_or_default(*, db_session, project_id: int, case_type_in: CaseTypeRead) -> CaseType:
+ """Returns a case type based on a name or the default if not specified."""
+ if case_type_in and case_type_in.name:
+ case_type = get_by_name(
+ db_session=db_session, project_id=project_id, name=case_type_in.name
+ )
+ if case_type:
+ return case_type
+ return get_default_or_raise(db_session=db_session, project_id=project_id)
+
+
+def get_by_slug(*, db_session, project_id: int, slug: str) -> CaseType | None:
+ """Returns a case type based on the given type slug."""
+ return (
+ db_session.query(CaseType)
+ .filter(CaseType.slug == slug)
+ .filter(CaseType.project_id == project_id)
+ .one_or_none()
+ )
+
+
+def get_all(*, db_session, project_id: int | None = None) -> list[CaseType]:
+ """Returns all case types."""
+ if project_id:
+ return db_session.query(CaseType).filter(CaseType.project_id == project_id)
+ return db_session.query(CaseType)
+
+
+def get_all_enabled(*, db_session, project_id: int | None = None) -> list[CaseType]:
+ """Returns all enabled case types."""
+ if project_id:
+ return (
+ db_session.query(CaseType)
+ .filter(CaseType.project_id == project_id)
+ .filter(CaseType.enabled == true())
+ )
+ return db_session.query(CaseType).filter(CaseType.enabled == true())
+
+
+def create(*, db_session, case_type_in: CaseTypeCreate) -> CaseType:
+ """Creates a case type."""
+ project = project_service.get_by_name_or_raise(
+ db_session=db_session, project_in=case_type_in.project
+ )
+
+ case_type = CaseType(
+ **case_type_in.dict(
+ exclude={"case_template_document", "oncall_service", "incident_type", "project"}
+ ),
+ project=project,
+ )
+ if case_type_in.cost_model:
+ cost_model = cost_model_service.get_cost_model_by_id(
+ db_session=db_session, cost_model_id=case_type_in.cost_model.id
+ )
+ case_type.cost_model = cost_model
+
+ if case_type_in.case_template_document:
+ case_template_document = document_service.get(
+ db_session=db_session, document_id=case_type_in.case_template_document.id
+ )
+ case_type.case_template_document = case_template_document
+
+ if case_type_in.oncall_service:
+ oncall_service = service_service.get(
+ db_session=db_session, service_id=case_type_in.oncall_service.id
+ )
+ case_type.oncall_service = oncall_service
+
+ if case_type_in.incident_type:
+ incident_type = incident_type_service.get(
+ db_session=db_session, incident_type_id=case_type_in.incident_type.id
+ )
+ case_type.incident_type = incident_type
+
+ db_session.add(case_type)
+ db_session.commit()
+ return case_type
+
+
+def update(*, db_session, case_type: CaseType, case_type_in: CaseTypeUpdate) -> CaseType:
+ """Updates a case type."""
+ cost_model = None
+ if case_type_in.cost_model:
+ cost_model = cost_model_service.get_cost_model_by_id(
+ db_session=db_session, cost_model_id=case_type_in.cost_model.id
+ )
+ should_update_case_cost = case_type.cost_model != cost_model
+ case_type.cost_model = cost_model
+
+ # Calculate the cost of all non-closed cases associated with this case type
+ cases = case_service.get_all_open_by_case_type(db_session=db_session, case_type_id=case_type.id)
+ for case in cases:
+ case_cost_service.calculate_case_response_cost(case=case, db_session=db_session)
+
+ if case_type_in.case_template_document:
+ case_template_document = document_service.get(
+ db_session=db_session, document_id=case_type_in.case_template_document.id
+ )
+ case_type.case_template_document = case_template_document
+
+ if case_type_in.oncall_service:
+ oncall_service = service_service.get(
+ db_session=db_session, service_id=case_type_in.oncall_service.id
+ )
+ case_type.oncall_service = oncall_service
+
+ if case_type_in.incident_type:
+ incident_type = incident_type_service.get(
+ db_session=db_session, incident_type_id=case_type_in.incident_type.id
+ )
+ case_type.incident_type = incident_type
+
+ case_type_data = case_type.dict()
+
+ update_data = case_type_in.dict(
+ exclude_unset=True, exclude={"case_template_document", "oncall_service", "incident_type"}
+ )
+
+ for field in case_type_data:
+ if field in update_data:
+ setattr(case_type, field, update_data[field])
+
+ db_session.commit()
+
+ if should_update_case_cost:
+ case_cost_service.update_case_response_cost_for_case_type(
+ db_session=db_session, case_type=case_type
+ )
+
+ return case_type
+
+
+def delete(*, db_session, case_type_id: int):
+ """Deletes a case type."""
+ db_session.query(CaseType).filter(CaseType.id == case_type_id).delete()
+ db_session.commit()
diff --git a/src/dispatch/case/type/views.py b/src/dispatch/case/type/views.py
new file mode 100644
index 000000000000..482ffb109e88
--- /dev/null
+++ b/src/dispatch/case/type/views.py
@@ -0,0 +1,65 @@
+from fastapi import APIRouter, Depends, HTTPException, status
+
+from dispatch.auth.permissions import SensitiveProjectActionPermission, PermissionsDependency
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.models import PrimaryKey
+
+from .models import CaseTypeCreate, CaseTypePagination, CaseTypeRead, CaseTypeUpdate
+from .service import create, get, update
+
+
+router = APIRouter()
+
+
+@router.get("", response_model=CaseTypePagination, tags=["case_types"])
+def get_case_types(common: CommonParameters):
+ """Returns all case types."""
+ return search_filter_sort_paginate(model="CaseType", **common)
+
+
+@router.post(
+ "",
+ response_model=CaseTypeRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def create_case_type(
+ *,
+ db_session: DbSession,
+ case_type_in: CaseTypeCreate,
+):
+ """Creates a new case type."""
+ return create(db_session=db_session, case_type_in=case_type_in)
+
+
+@router.put(
+ "/{case_type_id}",
+ response_model=CaseTypeRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def update_case_type(
+ *,
+ db_session: DbSession,
+ case_type_id: PrimaryKey,
+ case_type_in: CaseTypeUpdate,
+):
+ """Updates an existing case type."""
+ case_type = get(db_session=db_session, case_type_id=case_type_id)
+ if not case_type:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A case type with this id does not exist."}],
+ )
+ return update(db_session=db_session, case_type=case_type, case_type_in=case_type_in)
+
+
+@router.get("/{case_type_id}", response_model=CaseTypeRead)
+def get_case_type(db_session: DbSession, case_type_id: PrimaryKey):
+ """Gets a case type."""
+ case_type = get(db_session=db_session, case_type_id=case_type_id)
+ if not case_type:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A case type with this id does not exist."}],
+ )
+ return case_type
diff --git a/src/dispatch/case/views.py b/src/dispatch/case/views.py
new file mode 100644
index 000000000000..284327fa62dd
--- /dev/null
+++ b/src/dispatch/case/views.py
@@ -0,0 +1,593 @@
+import json
+import logging
+from datetime import datetime
+from typing import Annotated
+
+from fastapi import APIRouter, BackgroundTasks, Depends, HTTPException, Query, status
+from sqlalchemy.exc import IntegrityError
+from starlette.requests import Request
+
+from dispatch.auth.permissions import (
+ CaseEditPermission,
+ CaseJoinPermission,
+ CaseViewPermission,
+ CaseEventPermission,
+ PermissionsDependency,
+)
+from dispatch.auth.service import CurrentUser
+from dispatch.case.enums import CaseStatus
+from dispatch.common.utils.views import create_pydantic_include
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.event import flows as event_flows
+from dispatch.event.models import EventCreateMinimal, EventUpdate
+from dispatch.incident import service as incident_service
+from dispatch.incident.models import IncidentCreate, IncidentRead
+from dispatch.individual.models import IndividualContactRead
+from dispatch.individual.service import get_or_create
+from dispatch.models import OrganizationSlug, PrimaryKey
+from dispatch.participant.models import ParticipantRead, ParticipantReadMinimal, ParticipantUpdate
+from dispatch.project import service as project_service
+
+from .flows import (
+ case_add_or_reactivate_participant_flow,
+ case_closed_create_flow,
+ case_create_conversation_flow,
+ case_create_resources_flow,
+ case_delete_flow,
+ case_escalated_create_flow,
+ case_new_create_flow,
+ case_remove_participant_flow,
+ case_stable_create_flow,
+ case_to_incident_endpoint_escalate_flow,
+ case_triage_create_flow,
+ case_update_flow,
+ get_case_participants_flow,
+)
+from .models import (
+ Case,
+ CaseCreate,
+ CaseExpandedPagination,
+ CasePagination,
+ CasePaginationMinimalWithExtras,
+ CaseRead,
+ CaseUpdate,
+)
+from .service import create, delete, get, get_participants, update
+
+log = logging.getLogger(__name__)
+
+router = APIRouter()
+
+
+def get_current_case(db_session: DbSession, request: Request) -> Case:
+ """Fetches a case or returns an HTTP 404."""
+ case = get(db_session=db_session, case_id=request.path_params["case_id"])
+ if not case:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "The requested case does not exist."}],
+ )
+ return case
+
+
+CurrentCase = Annotated[Case, Depends(get_current_case)]
+
+
+@router.get(
+ "/{case_id}",
+ response_model=CaseRead,
+ summary="Retrieves a single case.",
+ dependencies=[Depends(PermissionsDependency([CaseViewPermission]))],
+)
+def get_case(
+ case_id: PrimaryKey,
+ db_session: DbSession,
+ current_case: CurrentCase,
+):
+ """Retrieves the details of a single case."""
+ return current_case
+
+
+@router.get(
+ "/{case_id}/participants/minimal",
+ response_model=list[ParticipantReadMinimal],
+ summary="Retrieves a minimal list of case participants.",
+ dependencies=[Depends(PermissionsDependency([CaseViewPermission]))],
+)
+def get_case_participants_minimal(
+ case_id: PrimaryKey,
+ db_session: DbSession,
+):
+ """Retrieves the details of a single case."""
+ return get_participants(case_id=case_id, db_session=db_session)
+
+
+@router.get(
+ "/{case_id}/participants",
+ summary="Retrieves a list of case participants.",
+ dependencies=[Depends(PermissionsDependency([CaseViewPermission]))],
+)
+def get_case_participants(
+ case_id: PrimaryKey,
+ db_session: DbSession,
+ minimal: bool = Query(default=False),
+):
+ """Retrieves the details of a single case."""
+ participants = get_participants(case_id=case_id, db_session=db_session, minimal=minimal)
+
+ if minimal:
+ return [ParticipantReadMinimal.from_orm(p) for p in participants]
+ else:
+ return [ParticipantRead.from_orm(p) for p in participants]
+
+
+@router.get("", summary="Retrieves a list of cases.")
+def get_cases(
+ common: CommonParameters,
+ include: list[str] = Query([], alias="include[]"),
+ expand: bool = Query(default=False),
+):
+ """Retrieves all cases."""
+ pagination = search_filter_sort_paginate(model="Case", **common)
+
+ if expand:
+ return json.loads(CaseExpandedPagination(**pagination).json())
+
+ if include:
+ # only allow two levels for now
+ include_sets = create_pydantic_include(include)
+
+ include_fields = {
+ "items": {"__all__": include_sets},
+ "itemsPerPage": ...,
+ "page": ...,
+ "total": ...,
+ }
+ return json.loads(CaseExpandedPagination(**pagination).json(include=include_fields))
+ return json.loads(CasePagination(**pagination).json())
+
+
+@router.get("/minimal", summary="Retrieves a list of cases with minimal data.")
+def get_cases_minimal(
+ common: CommonParameters,
+):
+ """Retrieves all cases with minimal data."""
+ pagination = search_filter_sort_paginate(model="Case", **common)
+
+ return json.loads(CasePaginationMinimalWithExtras(**pagination).json())
+
+
+@router.post("", response_model=CaseRead, summary="Creates a new case.")
+def create_case(
+ db_session: DbSession,
+ organization: OrganizationSlug,
+ case_in: CaseCreate,
+ current_user: CurrentUser,
+ background_tasks: BackgroundTasks,
+):
+ """Creates a new case."""
+ # TODO: (wshel) this conditional always happens in the UI flow since
+ # reporter is not available to be set.
+ if not case_in.reporter:
+ # Ensure the individual exists, create if not
+ if case_in.project is None:
+ raise HTTPException(
+ status.HTTP_422_UNPROCESSABLE_ENTITY,
+ detail=[{"msg": "Project must be set to create reporter individual."}],
+ )
+ # Fetch the full DB project instance
+ project = project_service.get_by_name_or_default(
+ db_session=db_session, project_in=case_in.project
+ )
+ individual = get_or_create(
+ db_session=db_session,
+ email=current_user.email,
+ project=project,
+ )
+ case_in.reporter = ParticipantUpdate(
+ individual=IndividualContactRead(id=individual.id, email=individual.email)
+ )
+
+ try:
+ case = create(db_session=db_session, case_in=case_in, current_user=current_user)
+ except ValueError as e:
+ log.exception(e)
+ raise HTTPException(
+ status.HTTP_422_UNPROCESSABLE_ENTITY, detail=[{"msg": e.args[0]}]
+ ) from e
+
+ if case.status == CaseStatus.triage:
+ background_tasks.add_task(
+ case_triage_create_flow,
+ case_id=case.id,
+ organization_slug=organization,
+ )
+ elif case.status == CaseStatus.escalated:
+ background_tasks.add_task(
+ case_escalated_create_flow,
+ case_id=case.id,
+ organization_slug=organization,
+ )
+ elif case.status == CaseStatus.closed:
+ background_tasks.add_task(
+ case_closed_create_flow,
+ case_id=case.id,
+ organization_slug=organization,
+ )
+ elif case.status == CaseStatus.stable:
+ background_tasks.add_task(
+ case_stable_create_flow,
+ case_id=case.id,
+ organization_slug=organization,
+ )
+ else:
+ background_tasks.add_task(
+ case_new_create_flow,
+ case_id=case.id,
+ db_session=db_session,
+ organization_slug=organization,
+ )
+
+ return case
+
+
+@router.post(
+ "/{case_id}/resources/conversation",
+ response_model=CaseRead,
+ summary="Creates conversation channel for an existing case.",
+)
+def create_case_channel(
+ db_session: DbSession,
+ current_case: CurrentCase,
+):
+ """Creates conversation channel for an existing case."""
+
+ current_case.dedicated_channel = True
+
+ # Add all case participants to the case channel
+ case_create_conversation_flow(
+ db_session=db_session,
+ case=current_case,
+ participant_emails=current_case.participant_emails,
+ conversation_target=None,
+ )
+
+ return current_case
+
+
+@router.post(
+ "/{case_id}/resources",
+ response_model=CaseRead,
+ summary="Creates resources for an existing case.",
+)
+def create_case_resources(
+ db_session: DbSession,
+ case_id: PrimaryKey,
+ current_case: CurrentCase,
+ background_tasks: BackgroundTasks,
+):
+ """Creates resources for an existing case."""
+ individual_participants, team_participants = get_case_participants_flow(
+ case=current_case, db_session=db_session
+ )
+ background_tasks.add_task(
+ case_create_resources_flow,
+ db_session=db_session,
+ case_id=case_id,
+ individual_participants=individual_participants,
+ team_participants=team_participants,
+ )
+
+ return current_case
+
+
+@router.put(
+ "/{case_id}",
+ response_model=CaseRead,
+ summary="Updates an existing case.",
+ dependencies=[Depends(PermissionsDependency([CaseEditPermission]))],
+)
+def update_case(
+ db_session: DbSession,
+ current_case: CurrentCase,
+ organization: OrganizationSlug,
+ case_id: PrimaryKey,
+ case_in: CaseUpdate,
+ current_user: CurrentUser,
+ background_tasks: BackgroundTasks,
+):
+ """Updates an existing case."""
+ reporter_email = None
+ if case_in.reporter:
+ # we assign the case to the reporter provided
+ reporter_email = case_in.reporter.individual.email
+ elif current_user:
+ # we fall back to assign the case to the current user
+ reporter_email = current_user.email
+
+ assignee_email = None
+ if case_in.assignee:
+ # we assign the case to the assignee provided
+ assignee_email = case_in.assignee.individual.email
+ elif current_user:
+ # we fall back to assign the case to the current user
+ assignee_email = current_user.email
+
+ # we store the previous state of the case in order to be able to detect changes
+ previous_case = CaseRead.from_orm(current_case)
+
+ # we update the case
+ case = update(
+ db_session=db_session, case=current_case, case_in=case_in, current_user=current_user
+ )
+
+ # we run the case update flow
+ background_tasks.add_task(
+ case_update_flow,
+ case_id=case_id,
+ previous_case=previous_case,
+ reporter_email=reporter_email,
+ assignee_email=assignee_email,
+ organization_slug=organization,
+ )
+
+ return case
+
+
+@router.put(
+ "/{case_id}/escalate",
+ response_model=IncidentRead,
+ summary="Escalates an existing case.",
+ dependencies=[Depends(PermissionsDependency([CaseEditPermission]))],
+)
+def escalate_case(
+ db_session: DbSession,
+ current_case: CurrentCase,
+ organization: OrganizationSlug,
+ incident_in: IncidentCreate,
+ current_user: CurrentUser,
+ background_tasks: BackgroundTasks,
+):
+ """Escalates an existing case."""
+ # use existing reporter or current user if not provided
+ if not incident_in.reporter:
+ incident_in.reporter = (
+ ParticipantUpdate(
+ individual=IndividualContactRead(email=current_case.reporter.individual.email)
+ )
+ if current_case.reporter
+ else ParticipantUpdate(individual=IndividualContactRead(email=current_user.email))
+ )
+
+ # allow for default values
+ if not incident_in.incident_type:
+ if current_case.case_type.incident_type:
+ incident_in.incident_type = {"name": current_case.case_type.incident_type.name}
+
+ if not incident_in.project:
+ if current_case.case_type.incident_type:
+ incident_in.project = {"name": current_case.case_type.incident_type.project.name}
+
+ incident = incident_service.create(db_session=db_session, incident_in=incident_in)
+ background_tasks.add_task(
+ case_to_incident_endpoint_escalate_flow,
+ case_id=current_case.id,
+ incident_id=incident.id,
+ organization_slug=organization,
+ )
+
+ return incident
+
+
+@router.delete(
+ "/{case_id}",
+ response_model=None,
+ summary="Deletes an existing case and its external resources.",
+ dependencies=[Depends(PermissionsDependency([CaseEditPermission]))],
+)
+def delete_case(
+ case_id: PrimaryKey,
+ db_session: DbSession,
+ current_case: CurrentCase,
+):
+ """Deletes an existing case and its external resources."""
+ # we run the case delete flow
+ case_delete_flow(case=current_case, db_session=db_session)
+
+ # we delete the internal case
+ try:
+ delete(db_session=db_session, case_id=case_id)
+ except IntegrityError as e:
+ log.exception(e)
+ raise HTTPException(
+ status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
+ detail=[
+ {
+ "msg": (
+ f"Case {current_case.name} could not be deleted. Make sure the case has no "
+ "relationships to other cases or incidents before deleting it.",
+ )
+ }
+ ],
+ ) from None
+
+
+@router.post(
+ "/{case_id}/join",
+ summary="Adds an individual to a case.",
+ dependencies=[Depends(PermissionsDependency([CaseJoinPermission]))],
+)
+def join_case(
+ db_session: DbSession,
+ organization: OrganizationSlug,
+ case_id: PrimaryKey,
+ current_case: CurrentCase,
+ current_user: CurrentUser,
+ background_tasks: BackgroundTasks,
+):
+ """Adds an individual to a case."""
+ background_tasks.add_task(
+ case_add_or_reactivate_participant_flow,
+ current_user.email,
+ case_id=current_case.id,
+ organization_slug=organization,
+ )
+
+
+@router.delete(
+ "/{case_id}/remove/{email}",
+ summary="Removes an individual from a case.",
+ dependencies=[Depends(PermissionsDependency([CaseEditPermission]))],
+)
+def remove_participant_from_case(
+ db_session: DbSession,
+ organization: OrganizationSlug,
+ case_id: PrimaryKey,
+ email: str,
+ current_case: CurrentCase,
+ current_user: CurrentUser,
+ background_tasks: BackgroundTasks,
+):
+ """Removes an individual from a case."""
+ background_tasks.add_task(
+ case_remove_participant_flow,
+ email,
+ case_id=current_case.id,
+ db_session=db_session,
+ )
+
+
+@router.post(
+ "/{case_id}/add/{email}",
+ summary="Adds an individual to a case.",
+ dependencies=[Depends(PermissionsDependency([CaseEditPermission]))],
+)
+def add_participant_to_case(
+ db_session: DbSession,
+ organization: OrganizationSlug,
+ case_id: PrimaryKey,
+ email: str,
+ current_case: CurrentCase,
+ current_user: CurrentUser,
+ background_tasks: BackgroundTasks,
+):
+ """Adds an individual to a case."""
+ background_tasks.add_task(
+ case_add_or_reactivate_participant_flow,
+ email,
+ case_id=current_case.id,
+ organization_slug=organization,
+ db_session=db_session,
+ )
+
+
+@router.post(
+ "/{case_id}/event",
+ summary="Creates a custom event.",
+ dependencies=[Depends(PermissionsDependency([CaseEventPermission]))],
+)
+def create_custom_event(
+ db_session: DbSession,
+ organization: OrganizationSlug,
+ case_id: PrimaryKey,
+ current_case: CurrentCase,
+ event_in: EventCreateMinimal,
+ current_user: CurrentUser,
+ background_tasks: BackgroundTasks,
+):
+    """Creates a custom event."""
+    if event_in.details is None:
+        event_in.details = {}
+    event_in.details.update({"created_by": current_user.email, "added_on": str(datetime.utcnow())})
+ background_tasks.add_task(
+ event_flows.log_case_event,
+ user_email=current_user.email,
+ case_id=current_case.id,
+ event_in=event_in,
+ organization_slug=organization,
+ )
+
+
+@router.patch(
+ "/{case_id}/event",
+ summary="Updates a custom event.",
+ dependencies=[Depends(PermissionsDependency([CaseEventPermission]))],
+)
+def update_custom_event(
+ db_session: DbSession,
+ organization: OrganizationSlug,
+ case_id: PrimaryKey,
+ current_case: CurrentCase,
+ event_in: EventUpdate,
+ current_user: CurrentUser,
+ background_tasks: BackgroundTasks,
+):
+    """Updates a custom event."""
+    if event_in.details:
+        event_in.details.update(
+            {
+                "updated_by": current_user.email,
+                "updated_on": str(datetime.utcnow()),
+            }
+        )
+    else:
+        event_in.details = {"updated_by": current_user.email, "updated_on": str(datetime.utcnow())}
+ background_tasks.add_task(
+ event_flows.update_case_event,
+ event_in=event_in,
+ organization_slug=organization,
+ )
+
+
+@router.post(
+ "/{case_id}/exportTimeline",
+ summary="Exports timeline events.",
+ dependencies=[Depends(PermissionsDependency([CaseEventPermission]))],
+)
+def export_timeline_event(
+ db_session: DbSession,
+ organization: OrganizationSlug,
+ case_id: PrimaryKey,
+ current_case: CurrentCase,
+ timeline_filters: dict,
+ current_user: CurrentUser,
+ background_tasks: BackgroundTasks,
+):
+    """Exports timeline events for the case."""
+    try:
+ event_flows.export_case_timeline(
+ timeline_filters=timeline_filters,
+ case_id=case_id,
+ organization_slug=organization,
+ db_session=db_session,
+ )
+ except Exception as e:
+ log.exception(e)
+ raise HTTPException(
+ status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
+            detail=[{"msg": f"{str(e)}."}],
+ ) from e
+
+
+@router.delete(
+ "/{case_id}/event/{event_uuid}",
+ summary="Deletes a custom event.",
+ dependencies=[Depends(PermissionsDependency([CaseEventPermission]))],
+)
+def delete_custom_event(
+ db_session: DbSession,
+ organization: OrganizationSlug,
+ case_id: PrimaryKey,
+ current_case: CurrentCase,
+ event_uuid: str,
+ current_user: CurrentUser,
+ background_tasks: BackgroundTasks,
+):
+ """Deletes a custom event."""
+ background_tasks.add_task(
+ event_flows.delete_case_event,
+ event_uuid=event_uuid,
+ organization_slug=organization,
+ )
diff --git a/src/dispatch/case_cost/models.py b/src/dispatch/case_cost/models.py
new file mode 100644
index 000000000000..7e29efe365c2
--- /dev/null
+++ b/src/dispatch/case_cost/models.py
@@ -0,0 +1,52 @@
+from datetime import datetime
+
+from sqlalchemy import Column, ForeignKey, Integer, Numeric
+from sqlalchemy.ext.associationproxy import association_proxy
+from sqlalchemy.orm import relationship
+
+from dispatch.database.core import Base
+from dispatch.case_cost_type.models import CaseCostTypeRead
+from dispatch.models import DispatchBase, Pagination, PrimaryKey, ProjectMixin, TimeStampMixin
+from dispatch.project.models import ProjectRead
+
+
+# SQLAlchemy Model
+class CaseCost(Base, TimeStampMixin, ProjectMixin):
+ # columns
+ id = Column(Integer, primary_key=True)
+ amount = Column(Numeric(precision=10, scale=2), nullable=True)
+
+ # relationships
+ case_cost_type = relationship("CaseCostType", backref="case_cost")
+ case_cost_type_id = Column(Integer, ForeignKey("case_cost_type.id", ondelete="CASCADE"))
+ case_id = Column(Integer, ForeignKey("case.id", ondelete="CASCADE"))
+ search_vector = association_proxy("case_cost_type", "search_vector")
+
+
+# Pydantic Models
+class CaseCostBase(DispatchBase):
+ amount: float = 0
+
+
+class CaseCostCreate(CaseCostBase):
+ case_cost_type: CaseCostTypeRead
+ project: ProjectRead
+
+
+class CaseCostUpdate(CaseCostBase):
+ id: PrimaryKey | None = None
+ case_cost_type: CaseCostTypeRead
+
+
+class CaseCostReadMinimal(DispatchBase):
+ amount: float = 0
+
+
+class CaseCostRead(CaseCostBase):
+ id: PrimaryKey
+ case_cost_type: CaseCostTypeRead
+ updated_at: datetime | None = None
+
+
+class CaseCostPagination(Pagination):
+ items: list[CaseCostRead] = []
diff --git a/src/dispatch/case_cost/scheduled.py b/src/dispatch/case_cost/scheduled.py
new file mode 100644
index 000000000000..0d769b5e0ac2
--- /dev/null
+++ b/src/dispatch/case_cost/scheduled.py
@@ -0,0 +1,83 @@
+import logging
+
+from schedule import every
+
+from sqlalchemy.orm import Session
+
+from dispatch.decorators import scheduled_project_task, timer
+from dispatch.case import service as case_service
+from dispatch.case.enums import CaseStatus, CostModelType
+from dispatch.project.models import Project
+from dispatch.scheduler import scheduler
+
+from .service import (
+ update_case_response_cost,
+ get_or_create_case_response_cost_by_model_type,
+)
+
+
+log = logging.getLogger(__name__)
+
+
+@scheduler.add(every(1).hour, name="calculate-cases-response-cost")
+@timer
+@scheduled_project_task
+def calculate_cases_response_cost(db_session: Session, project: Project):
+ """Calculates and saves the response cost for all cases."""
+ cases = case_service.get_all_by_status(
+ db_session=db_session,
+ project_id=project.id,
+ statuses=[CaseStatus.new, CaseStatus.triage, CaseStatus.stable],
+ )
+
+ for case in cases:
+ try:
+ # we get the response cost for the given case
+ case_response_cost_classic = get_or_create_case_response_cost_by_model_type(
+ case=case, db_session=db_session, model_type=CostModelType.classic
+ )
+            case_response_cost_new = get_or_create_case_response_cost_by_model_type(
+                case=case, db_session=db_session, model_type=CostModelType.new
+            )
+
+            # skip this case if either response cost could not be created
+            if case_response_cost_classic is None or case_response_cost_new is None:
+                continue
+
+ # we don't need to update the cost of closed cases if they already have a response
+ # cost and this was updated after the case was closed
+ if case.status == CaseStatus.closed:
+ if case_response_cost_classic:
+ if case_response_cost_classic.updated_at > case.closed_at:
+ continue
+ # we don't need to update the cost of escalated cases if they already have a response
+ # cost and this was updated after the case was escalated
+ if case.status == CaseStatus.escalated:
+ if case_response_cost_classic:
+ if case_response_cost_classic.updated_at > case.escalated_at:
+ continue
+ # we don't need to update the cost of stable cases if they already have a response
+ # cost and this was updated after the case was marked as stable
+ if case.status == CaseStatus.stable:
+ if case_response_cost_classic:
+ if case.stable_at and case_response_cost_classic.updated_at > case.stable_at:
+ continue
+
+ # we calculate the response cost amount
+ results = update_case_response_cost(case, db_session)
+
+ # we don't need to update the cost amount if it hasn't changed
+ if (
+ case_response_cost_classic.amount == results[CostModelType.classic]
+ and case_response_cost_new.amount == results[CostModelType.new]
+ ):
+ continue
+
+ # we save the new case cost amount
+ case_response_cost_classic.amount = results[CostModelType.classic]
+ case.case_costs.append(case_response_cost_classic)
+
+ case_response_cost_new.amount = results[CostModelType.new]
+ case.case_costs.append(case_response_cost_new)
+
+ db_session.add(case)
+ db_session.commit()
+ except Exception as e:
+ # we shouldn't fail to update all cases when one fails
+ log.exception(e)
diff --git a/src/dispatch/case_cost/service.py b/src/dispatch/case_cost/service.py
new file mode 100644
index 000000000000..e9dd6a087436
--- /dev/null
+++ b/src/dispatch/case_cost/service.py
@@ -0,0 +1,521 @@
+from datetime import datetime, timedelta, timezone
+import logging
+import math
+
+from sqlalchemy.orm import Session
+
+from dispatch.cost_model.models import CostModelActivity
+from dispatch.case import service as case_service
+from dispatch.case.models import Case, CaseStatus
+from dispatch.case.type.models import CaseType
+from dispatch.case_cost_type import service as case_cost_type_service
+from dispatch.case.enums import CostModelType
+from dispatch.cost_model import service as cost_model_service
+from dispatch.participant import service as participant_service
+from dispatch.participant.models import ParticipantRead
+from dispatch.participant_activity import service as participant_activity_service
+from dispatch.participant_activity.models import ParticipantActivityCreate, ParticipantActivity
+from dispatch.participant_role.models import ParticipantRoleType, ParticipantRole
+from dispatch.plugin import service as plugin_service
+
+from .models import CaseCost, CaseCostCreate, CaseCostUpdate
+
+
+HOURS_IN_DAY = 24
+SECONDS_IN_HOUR = 3600
+log = logging.getLogger(__name__)
+
+
+def get(*, db_session: Session, case_cost_id: int) -> CaseCost | None:
+ """Gets a case cost by its id."""
+ return db_session.query(CaseCost).filter(CaseCost.id == case_cost_id).one_or_none()
+
+
+def get_by_case_id(*, db_session: Session, case_id: int) -> list[CaseCost]:
+ """Gets case costs by their case id."""
+ return db_session.query(CaseCost).filter(CaseCost.case_id == case_id).all()
+
+
+def get_by_case_id_and_case_cost_type_id(
+ *, db_session: Session, case_id: int, case_cost_type_id: int
+) -> CaseCost | None:
+ """Gets case costs by their case id and case cost type id."""
+ return (
+ db_session.query(CaseCost)
+ .filter(CaseCost.case_id == case_id)
+ .filter(CaseCost.case_cost_type_id == case_cost_type_id)
+ .order_by(CaseCost.id.asc())
+ .first()
+ )
+
+
+def get_all(*, db_session: Session) -> list[CaseCost]:
+    """Gets all case costs."""
+    return db_session.query(CaseCost).all()
+
+
+def get_or_create(
+ *, db_session: Session, case_cost_in: CaseCostCreate | CaseCostUpdate
+) -> CaseCost:
+ """Gets or creates a case cost object."""
+ if type(case_cost_in) is CaseCostUpdate and case_cost_in.id:
+ case_cost = get(db_session=db_session, case_cost_id=case_cost_in.id)
+ else:
+ case_cost = create(db_session=db_session, case_cost_in=case_cost_in)
+
+ return case_cost
+
+
+def create(*, db_session: Session, case_cost_in: CaseCostCreate) -> CaseCost:
+ """Creates a new case cost."""
+ case_cost_type = case_cost_type_service.get(
+ db_session=db_session, case_cost_type_id=case_cost_in.case_cost_type.id
+ )
+ case_cost = CaseCost(
+ **case_cost_in.dict(exclude={"case_cost_type", "project"}),
+ case_cost_type=case_cost_type,
+ project=case_cost_type.project,
+ )
+ db_session.add(case_cost)
+ db_session.commit()
+
+ return case_cost
+
+
+def update(*, db_session: Session, case_cost: CaseCost, case_cost_in: CaseCostUpdate) -> CaseCost:
+ """Updates a case cost."""
+ case_cost_data = case_cost.dict()
+ update_data = case_cost_in.dict(exclude_unset=True)
+
+ for field in case_cost_data:
+ if field in update_data:
+ setattr(case_cost, field, update_data[field])
+
+ db_session.commit()
+ return case_cost
+
+
+def delete(*, db_session: Session, case_cost_id: int):
+ """Deletes an existing case cost."""
+ db_session.query(CaseCost).filter(CaseCost.id == case_cost_id).delete()
+ db_session.commit()
+
+
+def get_engagement_multiplier(participant_role: str):
+ """Returns an engagement multiplier for a given case role."""
+ engagement_mappings = {
+ ParticipantRoleType.assignee: 1, # Case assignee has full engagement like incident commander
+ ParticipantRoleType.reporter: 0.5, # Same as incident reporter
+ ParticipantRoleType.participant: 0.5, # Same as incident participant
+ ParticipantRoleType.observer: 0, # Same as incident observer
+ }
+
+ return engagement_mappings.get(participant_role, 0.5) # Default to participant level
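As a quick check on the mapping above, roles outside the table fall back to participant-level engagement. A small self-contained sketch; the dict keys mirror `ParticipantRoleType` values but are plain strings here for illustration:

```python
# Illustrative stand-in for the ParticipantRoleType-keyed mapping.
engagement_mappings = {
    "Assignee": 1,
    "Reporter": 0.5,
    "Participant": 0.5,
    "Observer": 0,
}

def get_engagement_multiplier(participant_role: str) -> float:
    # Unknown or custom roles default to participant-level engagement.
    return engagement_mappings.get(participant_role, 0.5)

print(get_engagement_multiplier("Observer"))  # 0
print(get_engagement_multiplier("Liaison"))   # 0.5 (fallback)
```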
+
+
+def get_hourly_rate(project) -> int:
+ """Calculates and rounds up the employee hourly rate within a project."""
+ return math.ceil(project.annual_employee_cost / project.business_year_hours)
+
+
+def update_case_response_cost_for_case_type(db_session: Session, case_type: CaseType) -> None:
+ """Calculate the response cost of all non-closed cases associated with this case type."""
+ cases = case_service.get_all_open_by_case_type(db_session=db_session, case_type_id=case_type.id)
+ for case in cases:
+ update_case_response_cost(case=case, db_session=db_session)
+
+
+def calculate_response_cost(hourly_rate, total_response_time_seconds) -> int:
+ """Calculates and rounds up the case response cost."""
+ return math.ceil((total_response_time_seconds / SECONDS_IN_HOUR) * hourly_rate)
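Taken together, `get_hourly_rate` and `calculate_response_cost` round up at both steps, so the final amount is always a whole-dollar ceiling. A minimal standalone sketch; the annual cost and business-hours figures below are made up for illustration, not project defaults:

```python
import math

SECONDS_IN_HOUR = 3600

def get_hourly_rate(annual_employee_cost: int, business_year_hours: int) -> int:
    # Ceiling of the per-hour employee cost.
    return math.ceil(annual_employee_cost / business_year_hours)

def calculate_response_cost(hourly_rate: int, total_response_time_seconds: float) -> int:
    # Ceiling of hours spent multiplied by the hourly rate.
    return math.ceil((total_response_time_seconds / SECONDS_IN_HOUR) * hourly_rate)

rate = get_hourly_rate(annual_employee_cost=50_000, business_year_hours=2_080)
cost = calculate_response_cost(hourly_rate=rate, total_response_time_seconds=2 * 3600)
print(rate, cost)  # 25 50
```

Because both steps take the ceiling, any nonzero response time yields a cost of at least one dollar.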
+
+
+def get_or_create_case_response_cost_by_model_type(
+ case: Case, model_type: str, db_session: Session
+) -> CaseCost | None:
+ """Gets a case response cost for a specific model type."""
+ # Find the cost type matching the requested model type for the project
+ response_cost_type = case_cost_type_service.get_or_create_response_cost_type(
+ db_session=db_session, project_id=case.project.id, model_type=model_type
+ )
+
+ if not response_cost_type:
+ log.warning(
+ f"A default cost type for model type {model_type} doesn't exist and could not be "
+ f"created in the {case.project.name} project. Response costs for case {case.name} "
+ "won't be calculated for this model."
+ )
+ return None
+
+ # Retrieve or create the case cost for the given case and cost type
+ case_cost = get_by_case_id_and_case_cost_type_id(
+ db_session=db_session, case_id=case.id, case_cost_type_id=response_cost_type.id
+ )
+ if not case_cost:
+ case_cost = CaseCostCreate(
+ case=case, case_cost_type=response_cost_type, amount=0, project=case.project
+ )
+ case_cost = create(db_session=db_session, case_cost_in=case_cost)
+ case.case_costs.append(case_cost)
+ db_session.add(case)
+ db_session.commit()
+
+ return case_cost
+
+
+def fetch_case_events(
+    case: Case, activity: CostModelActivity, oldest: float, db_session: Session
+) -> list[tuple[datetime, str | None]]:
+ """Fetches case events for a given case and cost model activity.
+
+ Args:
+ case: The case to fetch events for.
+ activity: The activity to fetch events for. This defines the plugin event to fetch and
+ how much response effort each event requires.
+ oldest: The timestamp to start fetching events from.
+ db_session: The database session.
+
+ Returns:
+        list[tuple[datetime, str | None]]: A list of tuples containing the timestamp and
+ user_id of each event.
+ """
+
+ plugin_instance = plugin_service.get_active_instance_by_slug(
+ db_session=db_session,
+ slug=activity.plugin_event.plugin.slug,
+ project_id=case.project.id,
+ )
+ if not plugin_instance:
+ log.warning(
+ f"Cannot fetch cost model activity. Its associated plugin "
+ f"{activity.plugin_event.plugin.title} is not enabled."
+ )
+ return []
+
+ return plugin_instance.instance.fetch_events(
+ db_session=db_session,
+ subject=case,
+ plugin_event_id=activity.plugin_event.id,
+ oldest=oldest,
+ )
+
+
+def update_case_participant_activities(
+ case: Case, db_session: Session
+) -> ParticipantActivity | None:
+ """Records or updates case participant activities using the case's cost model.
+
+ This function records and/or updates all new case participant activities since the last case cost update.
+
+ Args:
+ case: The case to calculate the case response cost for.
+ db_session: The database session.
+
+ Returns:
+ ParticipantActivity | None: The most recent participant activity created or updated.
+ """
+ if not case:
+        log.warning("Case not found.")
+ return
+
+ case_type = case.case_type
+ if not case_type:
+ log.debug(f"Case type for case {case.name} not found.")
+ return
+
+ if not case_type.cost_model:
+ log.debug("No case cost model found. Skipping this case.")
+ return
+
+ if not case_type.cost_model.enabled:
+ log.debug("Case cost model is not enabled. Skipping this case.")
+ return
+
+ log.debug(f"Calculating {case.name} case cost with model {case_type.cost_model}.")
+ oldest = case.created_at.replace(tzinfo=timezone.utc).timestamp()
+
+ # Used for determining whether we've previously calculated the case cost.
+ current_time = datetime.now(tz=timezone.utc).replace(tzinfo=None)
+
+ case_response_cost = get_or_create_case_response_cost_by_model_type(
+ case=case, db_session=db_session, model_type=CostModelType.new
+ )
+ if not case_response_cost:
+ log.warning(
+ f"Cannot calculate case response cost for case {case.name}. No default case response "
+ "cost type created or found."
+ )
+ return
+
+ most_recent_activity = None
+ # Ignore events that happened before the last case cost update.
+ if case_response_cost.updated_at < current_time:
+ oldest = case_response_cost.updated_at.replace(tzinfo=timezone.utc).timestamp()
+
+ case_events = []
+ # Get the cost model. Iterate through all the listed activities we want to record.
+ cost_model = case.case_type.cost_model
+ if not cost_model:
+ cost_model = cost_model_service.get_default(
+ db_session=db_session, project_id=case.project.id
+ )
+
+    if cost_model:
+        for activity in cost_model.activities:
+            # Array of (timestamp, user_id, activity) tuples, so each event keeps
+            # a reference to the cost model activity that produced it.
+            case_events.extend(
+                (ts, user_id, activity)
+                for ts, user_id in fetch_case_events(
+                    case=case, activity=activity, oldest=oldest, db_session=db_session
+                )
+            )
+
+    # Sort case_events by timestamp
+    case_events.sort(key=lambda x: x[0])
+
+    for ts, user_id, activity in case_events:
+ participant = participant_service.get_by_case_id_and_conversation_id(
+ db_session=db_session,
+ case_id=case.id,
+ user_conversation_id=user_id,
+ )
+ if not participant:
+ log.warning("Cannot resolve participant.")
+ continue
+
+ activity_in = ParticipantActivityCreate(
+ plugin_event=activity.plugin_event,
+ started_at=ts,
+ ended_at=ts + timedelta(seconds=activity.response_time_seconds),
+ participant=ParticipantRead(id=participant.id),
+ case=case,
+ )
+
+ most_recent_activity = participant_activity_service.create_or_update(
+ db_session=db_session, activity_in=activity_in
+ )
+ return most_recent_activity
+
+
+def calculate_case_response_cost(case: Case, db_session: Session) -> dict[str, int]:
+ """Calculates the response cost of a given case.
+
+ Args:
+ case: The case to calculate costs for.
+ db_session: The database session.
+
+ Returns:
+ dict[str, int]: Dictionary containing costs from both models {'new': new_cost, 'classic': classic_cost}
+ """
+ results = {}
+
+ # Calculate new model cost
+ new_amount = calculate_case_response_cost_new(case=case, db_session=db_session)
+ results[CostModelType.new] = new_amount
+
+ # Calculate classic model cost
+ classic_amount = calculate_case_response_cost_classic(case=case, db_session=db_session)
+ results[CostModelType.classic] = classic_amount
+
+ return results
+
+
+def get_participant_role_time_seconds(case: Case, participant_role: ParticipantRole) -> float:
+ """Returns the time a participant has spent in a given role in seconds."""
+ # Skip calculating cost for participants with the observer role
+ if participant_role.role == ParticipantRoleType.observer:
+ return 0
+
+ # we get the time the participant has spent in the role so far
+ participant_role_renounced_at = datetime.now(tz=timezone.utc)
+
+ if case.status not in [CaseStatus.new, CaseStatus.triage]:
+ # Determine the earliest relevant timestamp for cost calculation cut-off
+ timestamps = []
+ if case.stable_at:
+ # Ensure stable_at is timezone-aware
+ stable_at = case.stable_at
+ if not stable_at.tzinfo:
+ stable_at = stable_at.replace(tzinfo=timezone.utc)
+ timestamps.append(stable_at)
+ if case.escalated_at:
+ # Ensure escalated_at is timezone-aware
+ escalated_at = case.escalated_at
+ if not escalated_at.tzinfo:
+ escalated_at = escalated_at.replace(tzinfo=timezone.utc)
+ timestamps.append(escalated_at)
+ if case.closed_at:
+ # Ensure closed_at is timezone-aware
+ closed_at = case.closed_at
+ if not closed_at.tzinfo:
+ closed_at = closed_at.replace(tzinfo=timezone.utc)
+ timestamps.append(closed_at)
+ if timestamps:
+ participant_role_renounced_at = min(timestamps)
+
+ if participant_role.renounced_at:
+ # the participant left the conversation or got assigned another role
+ # Ensure renounced_at is timezone-aware for comparison
+ renounced_at = participant_role.renounced_at
+ if not renounced_at.tzinfo:
+ renounced_at = renounced_at.replace(tzinfo=timezone.utc)
+
+ if renounced_at < participant_role_renounced_at:
+ # we use the role's renounced_at time if it happened before the
+ # case was marked as stable or closed
+ participant_role_renounced_at = renounced_at
+
+ # Ensure assumed_at is timezone-aware
+ assumed_at = participant_role.assumed_at
+ if not assumed_at.tzinfo:
+ assumed_at = assumed_at.replace(tzinfo=timezone.utc)
+
+ # we calculate the time the participant has spent in the role
+ participant_role_time = participant_role_renounced_at - assumed_at
+ if participant_role_time.total_seconds() < 0:
+ return 0
+
+    # we calculate the number of hours the participant has spent in the case role
+    participant_role_time_hours = participant_role_time.total_seconds() / SECONDS_IN_HOUR
+
+    # we make the assumption that participants only spend 8 hours a day working on the case,
+    # if the case goes past 24hrs
+    # TODO(mvilanova): adjust based on case priority
+ if participant_role_time_hours > HOURS_IN_DAY:
+ days, hours = divmod(participant_role_time_hours, HOURS_IN_DAY)
+ participant_role_time_hours = ((days * HOURS_IN_DAY) / 3) + hours
+
+ # we make the assumption that participants spend more or less time based on their role
+ # and we adjust the time spent based on that
+ return (
+ participant_role_time_hours
+ * SECONDS_IN_HOUR
+ * get_engagement_multiplier(participant_role.role)
+ )
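The day-capping rule above can be checked in isolation: once a role spans more than 24 hours, each full day contributes only a third of its hours (8 of 24), and the remainder counts in full. A hedged sketch of just that adjustment; `effective_hours` is an illustrative helper, not part of this module:

```python
HOURS_IN_DAY = 24

def effective_hours(raw_hours: float) -> float:
    # Each full 24-hour day counts at one third (8 of 24 hours);
    # leftover hours count in full.
    if raw_hours > HOURS_IN_DAY:
        days, hours = divmod(raw_hours, HOURS_IN_DAY)
        return ((days * HOURS_IN_DAY) / 3) + hours
    return raw_hours

print(effective_hours(30))  # 14.0: one full day (8h) plus 6 leftover hours
```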
+
+
+def get_total_participant_roles_time_seconds(case: Case) -> float:
+    """Calculates the total time spent by all participants in this case.
+
+    The participant hours are adjusted based on each role's engagement multiplier.
+
+    Args:
+        case: The case the participants are part of.
+
+    Returns:
+        float: The total time spent by all participants in the case roles, in seconds.
+    """
+ total_participants_roles_time_seconds = 0
+ for participant in case.participants:
+ for participant_role in participant.participant_roles:
+ total_participants_roles_time_seconds += get_participant_role_time_seconds(
+ case=case,
+ participant_role=participant_role,
+ )
+ return total_participants_roles_time_seconds
+
+
+def calculate_case_response_cost_classic(case: Case, db_session: Session) -> int:
+ """Calculates case response cost using classic cost model.
+
+ This function aggregates all new case costs since the last case cost update.
+ If this is the first time performing cost calculation for this case,
+ it computes the total costs from the case's creation.
+
+ Args:
+ case: The case to calculate the case response cost for.
+ db_session: The database session.
+
+ Returns:
+ int: The case response cost in dollars.
+ """
+
+ case_response_cost = get_or_create_case_response_cost_by_model_type(
+ case=case, model_type=CostModelType.classic, db_session=db_session
+ )
+ if not case_response_cost:
+ return 0
+
+ # Aggregates the case response costs accumulated since the last case cost update
+ total_participants_roles_time_seconds = get_total_participant_roles_time_seconds(case)
+
+ # Calculates and rounds up the case cost
+ hourly_rate = get_hourly_rate(case.project)
+ amount = calculate_response_cost(
+ hourly_rate=hourly_rate,
+ total_response_time_seconds=total_participants_roles_time_seconds,
+ )
+
+ # Ensure we return an integer by rounding up the sum
+ return math.ceil(amount)
+
+
+def calculate_case_response_cost_new(case: Case, db_session: Session) -> int:
+ """Calculates case response cost using new cost model."""
+ participants_total_response_time = timedelta(0)
+ participant_activities = (
+ participant_activity_service.get_all_case_participant_activities_for_case(
+ db_session=db_session, case_id=case.id
+ )
+ )
+ for participant_activity in participant_activities:
+ participants_total_response_time += (
+ participant_activity.ended_at - participant_activity.started_at
+ )
+
+ hourly_rate = get_hourly_rate(case.project)
+ amount = calculate_response_cost(
+ hourly_rate=hourly_rate,
+ total_response_time_seconds=participants_total_response_time.total_seconds(),
+ )
+ return amount
+
+
+def update_case_response_cost(case: Case, db_session: Session) -> dict[str, int]:
+ """Updates the response cost of a given case.
+
+ This function:
+ 1. Updates case participant activities
+ 2. Calculates costs using both models
+ 3. Updates the stored costs if they've changed
+
+ Args:
+ case: The case to update costs for.
+ db_session: The database session.
+
+ Returns:
+ dict[str, int]: Dictionary containing costs from both models {'new': new_cost, 'classic': classic_cost}
+ """
+ # Update case participant activities before calculating costs
+ update_case_participant_activities(case=case, db_session=db_session)
+
+ # Calculate costs using both models
+ costs = calculate_case_response_cost(case=case, db_session=db_session)
+
+ results = {}
+
+ # Update new model cost if needed
+ if new_cost := get_or_create_case_response_cost_by_model_type(
+ case=case, model_type=CostModelType.new, db_session=db_session
+ ):
+ new_amount = costs[CostModelType.new]
+ if new_cost.amount != new_amount:
+ new_cost.amount = new_amount
+ case.case_costs.append(new_cost)
+ db_session.add(case)
+ db_session.commit()
+ results[CostModelType.new] = new_cost.amount
+
+ # Update classic model cost if needed
+ if classic_cost := get_or_create_case_response_cost_by_model_type(
+ case=case, model_type=CostModelType.classic, db_session=db_session
+ ):
+ classic_amount = costs[CostModelType.classic]
+ if classic_cost.amount != classic_amount:
+ classic_cost.amount = classic_amount
+ case.case_costs.append(classic_cost)
+ db_session.add(case)
+ db_session.commit()
+ results[CostModelType.classic] = classic_cost.amount
+
+ return results
diff --git a/src/dispatch/case_cost/views.py b/src/dispatch/case_cost/views.py
new file mode 100644
index 000000000000..bd6d184463a7
--- /dev/null
+++ b/src/dispatch/case_cost/views.py
@@ -0,0 +1,87 @@
+from fastapi import APIRouter, Depends, HTTPException, status
+
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.auth.permissions import SensitiveProjectActionPermission, PermissionsDependency
+from dispatch.models import PrimaryKey
+
+from .models import (
+ CaseCostCreate,
+ CaseCostPagination,
+ CaseCostRead,
+ CaseCostUpdate,
+)
+from .service import create, delete, get, update
+
+
+router = APIRouter()
+
+
+@router.get("", response_model=CaseCostPagination)
+def get_case_costs(common: CommonParameters):
+ """Get all case costs, or only those matching a given search term."""
+ return search_filter_sort_paginate(model="CaseCost", **common)
+
+
+@router.get("/{case_cost_id}", response_model=CaseCostRead)
+def get_case_cost(db_session: DbSession, case_cost_id: PrimaryKey):
+ """Get a case cost by its id."""
+ case_cost = get(db_session=db_session, case_cost_id=case_cost_id)
+ if not case_cost:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A case cost with this id does not exist."}],
+ )
+ return case_cost
+
+
+@router.post(
+ "",
+ response_model=CaseCostRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def create_case_cost(db_session: DbSession, case_cost_in: CaseCostCreate):
+ """Create a case cost."""
+ case_cost = create(db_session=db_session, case_cost_in=case_cost_in)
+ return case_cost
+
+
+@router.put(
+ "/{case_cost_id}",
+ response_model=CaseCostRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def update_case_cost(
+ db_session: DbSession,
+ case_cost_id: PrimaryKey,
+ case_cost_in: CaseCostUpdate,
+):
+ """Update a case cost by its id."""
+ case_cost = get(db_session=db_session, case_cost_id=case_cost_id)
+ if not case_cost:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A case cost with this id does not exist."}],
+ )
+ case_cost = update(
+ db_session=db_session,
+ case_cost=case_cost,
+ case_cost_in=case_cost_in,
+ )
+ return case_cost
+
+
+@router.delete(
+ "/{case_cost_id}",
+ response_model=None,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def delete_case_cost(db_session: DbSession, case_cost_id: PrimaryKey):
+ """Delete a case cost, returning only an HTTP 200 OK if successful."""
+ case_cost = get(db_session=db_session, case_cost_id=case_cost_id)
+ if not case_cost:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A case cost with this id does not exist."}],
+ )
+ delete(db_session=db_session, case_cost_id=case_cost_id)
diff --git a/src/dispatch/case_cost_type/config.py b/src/dispatch/case_cost_type/config.py
new file mode 100644
index 000000000000..322582d95e20
--- /dev/null
+++ b/src/dispatch/case_cost_type/config.py
@@ -0,0 +1,8 @@
+default_case_cost_type = {
+ "name": "Response Cost",
+ "description": "Cost associated with handling a case.",
+ "category": "Primary",
+ "details": {},
+ "default": True,
+ "editable": False,
+}
diff --git a/src/dispatch/case_cost_type/models.py b/src/dispatch/case_cost_type/models.py
new file mode 100644
index 000000000000..59299e8a9342
--- /dev/null
+++ b/src/dispatch/case_cost_type/models.py
@@ -0,0 +1,61 @@
+from datetime import datetime
+from pydantic import Field
+
+from sqlalchemy import Column, Integer, String, Boolean
+
+from sqlalchemy_utils import TSVectorType, JSONType
+
+from dispatch.database.core import Base
+from dispatch.models import (
+ DispatchBase,
+ NameStr,
+ ProjectMixin,
+ TimeStampMixin,
+ Pagination,
+ PrimaryKey,
+)
+from dispatch.project.models import ProjectRead
+
+
+# SQLAlchemy Model
+class CaseCostType(Base, TimeStampMixin, ProjectMixin):
+ # columns
+ id = Column(Integer, primary_key=True)
+ name = Column(String)
+ description = Column(String)
+ category = Column(String)
+ details = Column(JSONType, nullable=True)
+ editable = Column(Boolean, default=True)
+ model_type = Column(String, nullable=True)
+
+ # full text search capabilities
+ search_vector = Column(
+ TSVectorType("name", "description", weights={"name": "A", "description": "B"})
+ )
+
+
+# Pydantic Models
+class CaseCostTypeBase(DispatchBase):
+ name: NameStr
+ description: str | None = None
+ category: str | None = None
+ details: dict | None = {}
+ created_at: datetime | None = None
+ editable: bool | None = None
+    model_type: str | None = None
+
+
+class CaseCostTypeCreate(CaseCostTypeBase):
+ project: ProjectRead
+
+
+class CaseCostTypeUpdate(CaseCostTypeBase):
+    id: PrimaryKey | None = None
+
+
+class CaseCostTypeRead(CaseCostTypeBase):
+ id: PrimaryKey
+
+
+class CaseCostTypePagination(Pagination):
+ items: list[CaseCostTypeRead] = []
diff --git a/src/dispatch/case_cost_type/service.py b/src/dispatch/case_cost_type/service.py
new file mode 100644
index 000000000000..96fd2f278925
--- /dev/null
+++ b/src/dispatch/case_cost_type/service.py
@@ -0,0 +1,126 @@
+from datetime import datetime, timezone
+
+from dispatch.case.enums import CostModelType
+from dispatch.project import service as project_service
+
+from .config import default_case_cost_type
+from .models import (
+ CaseCostType,
+ CaseCostTypeCreate,
+ CaseCostTypeUpdate,
+)
+
+
+def get(*, db_session, case_cost_type_id: int) -> CaseCostType | None:
+ """Gets a case cost type by its id."""
+ return db_session.query(CaseCostType).filter(CaseCostType.id == case_cost_type_id).one_or_none()
+
+
+def get_response_cost_type(
+ *, db_session, project_id: int, model_type: str
+) -> CaseCostType | None:
+    """Gets the response cost type for the given cost model type."""
+ return (
+ db_session.query(CaseCostType)
+ .filter(CaseCostType.project_id == project_id)
+ .filter(CaseCostType.model_type == model_type)
+ .one_or_none()
+ )
+
+
+def get_or_create_response_cost_type(
+ *, db_session, project_id: int, model_type: str = CostModelType.new
+) -> CaseCostType:
+ """Gets or creates the response case cost type."""
+ case_cost_type = get_response_cost_type(
+ db_session=db_session, project_id=project_id, model_type=model_type
+ )
+
+ if not case_cost_type:
+ case_cost_type_in = CaseCostTypeCreate(
+            name=default_case_cost_type["name"],
+ description=f"{default_case_cost_type['description']} ({model_type} Cost Model)",
+ category=default_case_cost_type["category"],
+ details=default_case_cost_type["details"],
+ editable=default_case_cost_type["editable"],
+ project=project_service.get(db_session=db_session, project_id=project_id),
+ model_type=model_type,
+ created_at=datetime.now(timezone.utc),
+ )
+ case_cost_type = create(db_session=db_session, case_cost_type_in=case_cost_type_in)
+
+ return case_cost_type
+
+
+def get_all_response_case_cost_types(
+ *, db_session, project_id: int
+) -> list[CaseCostType | None]:
+    """Returns all response case cost types.
+
+    This function queries the database for all case cost types that are marked
+    as response cost types:
+        - CaseCostType with model_type CLASSIC
+        - CaseCostType with model_type NEW
+
+    All other case cost types are not tied to the default response cost type.
+    """
+    return (
+        db_session.query(CaseCostType)
+        .filter(CaseCostType.project_id == project_id)
+        .filter(CaseCostType.model_type.in_([CostModelType.classic, CostModelType.new]))
+        .all()
+    )
+
+
+def get_by_name(*, db_session, project_id: int, case_cost_type_name: str) -> CaseCostType | None:
+ """Gets a case cost type by its name."""
+ return (
+ db_session.query(CaseCostType)
+ .filter(CaseCostType.name == case_cost_type_name)
+ .filter(CaseCostType.project_id == project_id)
+ .first()
+ )
+
+
+def get_all(*, db_session) -> list[CaseCostType]:
+ """Gets all case cost types."""
+ return db_session.query(CaseCostType).all()
+
+
+def create(*, db_session, case_cost_type_in: CaseCostTypeCreate) -> CaseCostType:
+ """Creates a new case cost type."""
+ project = project_service.get_by_name_or_raise(
+ db_session=db_session, project_in=case_cost_type_in.project
+ )
+ case_cost_type = CaseCostType(**case_cost_type_in.dict(exclude={"project"}), project=project)
+ db_session.add(case_cost_type)
+ db_session.commit()
+ return case_cost_type
+
+
+def update(
+ *,
+ db_session,
+ case_cost_type: CaseCostType,
+ case_cost_type_in: CaseCostTypeUpdate,
+) -> CaseCostType:
+ """Updates a case cost type."""
+ case_cost_data = case_cost_type.dict()
+ update_data = case_cost_type_in.dict(exclude_unset=True)
+
+ for field in case_cost_data:
+ if field in update_data:
+ setattr(case_cost_type, field, update_data[field])
+
+ db_session.commit()
+ return case_cost_type
+
+
+def delete(*, db_session, case_cost_type_id: int):
+ """Deletes an existing case cost type."""
+ db_session.query(CaseCostType).filter(CaseCostType.id == case_cost_type_id).delete()
+ db_session.commit()
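
The `update()` service above copies only the fields present in the `exclude_unset=True` payload onto the model. The same pattern in isolation, with a hypothetical stub standing in for the SQLAlchemy model:

```python
class CaseCostTypeStub:
    """Hypothetical stand-in for the SQLAlchemy CaseCostType model."""

    def __init__(self, name: str, description: str, editable: bool):
        self.name = name
        self.description = description
        self.editable = editable

    def dict(self) -> dict:
        return {"name": self.name, "description": self.description, "editable": self.editable}


def apply_partial_update(model: CaseCostTypeStub, update_data: dict) -> CaseCostTypeStub:
    # Mirrors update(): only fields present in the payload are copied,
    # so an omitted field is left untouched rather than nulled out.
    for field in model.dict():
        if field in update_data:
            setattr(model, field, update_data[field])
    return model


stub = CaseCostTypeStub("Response Cost", "old description", editable=True)
apply_partial_update(stub, {"description": "new description"})
```

This is why the payload is built with `exclude_unset=True`: fields the client never sent are absent from `update_data` and survive the update.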
diff --git a/src/dispatch/case_cost_type/views.py b/src/dispatch/case_cost_type/views.py
new file mode 100644
index 000000000000..25f900288cc5
--- /dev/null
+++ b/src/dispatch/case_cost_type/views.py
@@ -0,0 +1,105 @@
+from fastapi import APIRouter, Depends, HTTPException, status
+
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.auth.permissions import SensitiveProjectActionPermission, PermissionsDependency
+from dispatch.models import PrimaryKey
+
+from .models import (
+ CaseCostTypeCreate,
+ CaseCostTypePagination,
+ CaseCostTypeRead,
+ CaseCostTypeUpdate,
+)
+from .service import create, delete, get, update
+
+
+router = APIRouter()
+
+
+@router.get("", response_model=CaseCostTypePagination)
+def get_case_cost_types(common: CommonParameters):
+ """Get all case cost types, or only those matching a given search term."""
+ return search_filter_sort_paginate(model="CaseCostType", **common)
+
+
+@router.get("/{case_cost_type_id}", response_model=CaseCostTypeRead)
+def get_case_cost_type(db_session: DbSession, case_cost_type_id: PrimaryKey):
+ """Get a case cost type by its id."""
+ case_cost_type = get(db_session=db_session, case_cost_type_id=case_cost_type_id)
+ if not case_cost_type:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A case cost type with this id does not exist."}],
+ )
+ return case_cost_type
+
+
+@router.post(
+ "",
+ response_model=CaseCostTypeRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def create_case_cost_type(db_session: DbSession, case_cost_type_in: CaseCostTypeCreate):
+ """Create a case cost type."""
+ case_cost_type = create(db_session=db_session, case_cost_type_in=case_cost_type_in)
+ return case_cost_type
+
+
+@router.put(
+ "/{case_cost_type_id}",
+ response_model=CaseCostTypeRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def update_case_cost_type(
+ db_session: DbSession,
+ case_cost_type_id: PrimaryKey,
+ case_cost_type_in: CaseCostTypeUpdate,
+):
+ """Update a case cost type by its id."""
+ case_cost_type = get(db_session=db_session, case_cost_type_id=case_cost_type_id)
+ if not case_cost_type:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A case cost type with this id does not exist."}],
+ )
+
+ if not case_cost_type.editable:
+        raise HTTPException(
+            status_code=status.HTTP_403_FORBIDDEN,
+            detail=[{"msg": "You are not allowed to update this case cost type."}],
+        )
+
+ case_cost_type = update(
+ db_session=db_session,
+ case_cost_type=case_cost_type,
+ case_cost_type_in=case_cost_type_in,
+ )
+ return case_cost_type
+
+
+@router.delete(
+ "/{case_cost_type_id}",
+ response_model=None,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def delete_case_cost_type(
+ db_session: DbSession,
+ case_cost_type_id: PrimaryKey,
+):
+ """Delete a case cost type, returning only an HTTP 200 OK if successful."""
+ case_cost_type = get(db_session=db_session, case_cost_type_id=case_cost_type_id)
+
+ if not case_cost_type:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A case cost type with this id does not exist."}],
+ )
+
+ if not case_cost_type.editable:
+        raise HTTPException(
+            status_code=status.HTTP_403_FORBIDDEN,
+            detail=[{"msg": "You are not allowed to delete this case cost type."}],
+        )
+
+ delete(db_session=db_session, case_cost_type_id=case_cost_type_id)
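
Both mutating routes above apply the same two guards in order: a 404 when the record is missing, then a rejection when the record is not editable. A framework-free sketch of that ordering (all names here are hypothetical, `HTTPError` stands in for FastAPI's `HTTPException`, and the status code for the non-editable case is a choice for illustration, not taken from the source):

```python
HTTP_404_NOT_FOUND = 404
HTTP_403_FORBIDDEN = 403


class HTTPError(Exception):
    """Hypothetical stand-in for FastAPI's HTTPException."""

    def __init__(self, status_code: int, msg: str):
        self.status_code = status_code
        super().__init__(msg)


# Fake table keyed by id; mirrors the editable flag on CaseCostType.
FAKE_DB = {1: {"name": "Response Cost", "editable": False}}


def delete_case_cost_type(case_cost_type_id: int) -> None:
    record = FAKE_DB.get(case_cost_type_id)
    if not record:
        # Existence is checked first, exactly as in the route above.
        raise HTTPError(HTTP_404_NOT_FOUND, "A case cost type with this id does not exist.")
    if not record["editable"]:
        # Only then is the editable flag consulted.
        raise HTTPError(HTTP_403_FORBIDDEN, "You are not allowed to delete this case cost type.")
    del FAKE_DB[case_cost_type_id]
```

Checking existence first keeps the error messages unambiguous: a missing id never leaks the "not allowed" message, and a protected record is never reported as absent.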
diff --git a/src/dispatch/cli.py b/src/dispatch/cli.py
index c337781b773b..44cd993c8dcd 100644
--- a/src/dispatch/cli.py
+++ b/src/dispatch/cli.py
@@ -1,475 +1,263 @@
import logging
import os
-import sys
import click
import uvicorn
-from alembic import command as alembic_command
-from alembic.config import Config as AlembicConfig
-from tabulate import tabulate
-from uvicorn import main as uvicorn_main
from dispatch import __version__, config
-from dispatch.tag.models import * # noqa
-from dispatch.common.utils.cli import install_plugin_events, install_plugins
+from dispatch.config import DISPATCH_UI_URL
+from dispatch.enums import UserRoles
+from dispatch.plugin.models import PluginInstance
-from .database import Base, engine
-from .exceptions import DispatchException
-from .plugins.base import plugins
+from .extensions import configure_extensions
from .scheduler import scheduler
-from dispatch.models import * # noqa; noqa
-
os.environ["OAUTHLIB_INSECURE_TRANSPORT"] = "1"
log = logging.getLogger(__name__)
-def abort_if_false(ctx, param, value):
- if not value:
- ctx.abort()
-
-
-def insert_newlines(string, every=64):
- return "\n".join(string[i : i + every] for i in range(0, len(string), every))
-
-
@click.group()
@click.version_option(version=__version__)
def dispatch_cli():
"""Command-line interface to Dispatch."""
- pass
+ from .logging import configure_logging
+
+ configure_logging()
+
+ configure_extensions()
@dispatch_cli.group("plugins")
def plugins_group():
"""All commands for plugin manipulation."""
- install_plugins()
+ pass
@plugins_group.command("list")
def list_plugins():
- """Shows all available plugins"""
- table = []
- for p in plugins.all():
- table.append([p.title, p.slug, p.version, p.type, p.author, p.description])
- click.secho(
- tabulate(table, headers=["Title", "Slug", "Version", "Type", "Author", "Description"]),
- fg="blue",
- )
+ """Shows all available plugins."""
+ from tabulate import tabulate
-
-@dispatch_cli.group("term")
-def term_command_group():
- """All commands for term manipulation."""
- pass
-
-
-@dispatch_cli.group("contact")
-def contact_command_group():
- """All commands for contact manipulation."""
- pass
-
-
-@contact_command_group.group("load")
-def contact_load_group():
- """All contact load commands."""
- pass
-
-
-@contact_load_group.command("csv")
-@click.argument("input", type=click.File("r"))
-@click.option("--first-row-is-header", is_flag=True, default=True)
-def contact_load_csv_command(input, first_row_is_header):
- """Load contacts via CSV."""
- import csv
- from pydantic import ValidationError
- from dispatch.individual import service as individual_service
- from dispatch.team import service as team_service
- from dispatch.database import SessionLocal
+ from dispatch.database.core import SessionLocal
+ from dispatch.plugin import service as plugin_service
db_session = SessionLocal()
+ table = []
+ for record in plugin_service.get_all(db_session=db_session):
+ table.append(
+ [
+ record.title,
+ record.slug,
+ record.version,
+ record.type,
+ record.author,
+ record.description,
+ ]
+ )
- individual_contacts = []
- team_contacts = []
- if first_row_is_header:
- reader = csv.DictReader(input)
- for row in reader:
- row = {k.lower(): v for k, v in row.items()}
- if not row.get("email"):
- continue
-
- individual_contacts.append(row)
-
- for i in individual_contacts:
- i["is_external"] = True
- try:
- click.secho(f"Adding new individual contact. Email: {i['email']}", fg="blue")
- individual_service.get_or_create(db_session=db_session, **i)
- except ValidationError as e:
- click.secho(f"Failed to add individual contact. {e} {row}", fg="red")
-
- for t in team_contacts:
- i["is_external"] = True
- try:
- click.secho(f"Adding new team contact. Email: {t['email']}", fg="blue")
- team_service.get_or_create(db_session=db_session, **t)
- except ValidationError as e:
- click.secho(f"Failed to add team contact. {e} {row}", fg="red")
-
-
-@dispatch_cli.group("incident")
-def incident_command_group():
- """All commands for incident manipulation."""
- pass
+ click.secho(
+ tabulate(
+ table,
+ headers=[
+ "Title",
+ "Slug",
+ "Version",
+ "Type",
+ "Author",
+ "Description",
+ ],
+ ),
+ fg="blue",
+ )
-@incident_command_group.group("load")
-def incident_load_group():
- """All incient load commands."""
- pass
-
+@plugins_group.command("install")
+@click.option(
+ "-f",
+ "--force",
+ is_flag=True,
+    help="Force a plugin to update all details about itself; this will overwrite the current database entry.",
+)
+def install_plugins(force):
+ """Installs all plugins, or only one."""
+ from dispatch.common.utils.cli import install_plugins
+ from dispatch.database.core import SessionLocal
+ from dispatch.plugin import service as plugin_service
+ from dispatch.plugin.models import Plugin, PluginEvent
+ from dispatch.plugins.base import plugins
-@incident_load_group.command("csv")
-@click.argument("input", type=click.File("r"))
-@click.option("--first-row-is-header", is_flag=True, default=True)
-def incident_load_csv_command(input, first_row_is_header):
- """Load incidents via CSV."""
- import csv
- from dispatch.database import SessionLocal
- from datetime import datetime
- from dispatch.incident import service as incident_service
+ install_plugins()
db_session = SessionLocal()
-
- if first_row_is_header:
- reader = csv.DictReader(input)
- for row in reader:
- incident = incident_service.get_by_name(
- db_session=db_session, incident_name=row["name"]
+ for p in plugins.all():
+ record = plugin_service.get_by_slug(db_session=db_session, slug=p.slug)
+ if not record:
+ click.secho(f"Installing plugin... Slug: {p.slug} Version: {p.version}", fg="blue")
+ plugin = Plugin(
+ title=p.title,
+ slug=p.slug,
+ type=p.type,
+ version=p.version,
+ author=p.author,
+ author_url=p.author_url,
+ multiple=p.multiple,
+ description=p.description,
)
- if incident:
- incident.created_at = datetime.fromisoformat(row["created"])
+ db_session.add(plugin)
+ record = plugin
+ else:
+ if force:
+ click.secho(f"Updating plugin... Slug: {p.slug} Version: {p.version}", fg="blue")
+ # we only update values that should change
+ record.title = p.title
+ record.version = p.version
+ record.author = p.author
+ record.author_url = p.author_url
+ record.description = p.description
+ record.type = p.type
+
+        # Register the plugin's events, or update them if they already exist.
+ for plugin_event_in in p.plugin_events:
+ click.secho(f" Registering plugin event... Slug: {plugin_event_in.slug}", fg="blue")
+ if plugin_event := plugin_service.get_plugin_event_by_slug(
+ db_session=db_session, slug=plugin_event_in.slug
+ ):
+ plugin_event.name = plugin_event_in.name
+ plugin_event.description = plugin_event_in.description
+ plugin_event.plugin = record
else:
- click.secho(f"No incident found. Name: {row['name']}", fg="red")
-
-
-# This has been left as an example of how to import a jira issue
-# @incident_load_group.command("jira")
-# @click.argument("query")
-# @click.option("--url", help="Jira instance url.", default=JIRA_URL)
-# @click.option("--username", help="Jira username.", default=JIRA_USERNAME)
-# @click.option("--password", help="Jira password.", default=JIRA_PASSWORD)
-# def incident_load_jira(query, url, username, password):
-# """Loads incident data from jira."""
-# install_plugins()
-# import re
-# from jira import JIRA
-# from dispatch.incident.models import Incident
-# from dispatch.database import SessionLocal
-# from dispatch.incident_priority import service as incident_priority_service
-# from dispatch.incident_type import service as incident_type_service
-# from dispatch.individual import service as individual_service
-# from dispatch.participant import service as participant_service
-# from dispatch.participant_role import service as participant_role_service
-# from dispatch.participant_role.models import ParticipantRoleType
-# from dispatch.ticket import service as ticket_service
-# from dispatch.conversation import service as conversation_service
-# from dispatch.config import (
-# INCIDENT_RESOURCE_INVESTIGATION_DOCUMENT,
-# INCIDENT_PLUGIN_CONVERSATION_SLUG,
-# INCIDENT_PLUGIN_TICKET_SLUG,
-# )
-# from dispatch.document import service as document_service
-# from dispatch.document.models import DocumentCreate
-#
-# db_session = SessionLocal()
-#
-# client = JIRA(str(JIRA_URL), basic_auth=(JIRA_USERNAME, str(JIRA_PASSWORD)))
-#
-# block_size = 100
-# block_num = 0
-#
-#
-# while True:
-# start_idx = block_num * block_size
-# issues = client.search_issues(query, start_idx, block_size)
-#
-# click.secho(f"Collecting. PageSize: {block_size} PageNum: {block_num}", fg="blue")
-# if not issues:
-# # Retrieve issues until there are no more to come
-# break
-#
-# block_num += 1
-#
-# for issue in issues:
-# try:
-# participants = []
-# incident_name = issue.key
-# created_at = issue.fields.created
-#
-# # older tickets don't have a component
-# if not issue.fields.components:
-# incident_type = "Other"
-# else:
-# incident_type = issue.fields.components[0].name
-#
-# title = issue.fields.summary
-#
-# if issue.fields.reporter:
-# reporter_email = issue.fields.reporter.emailAddress
-# else:
-# reporter_email = "joe@example.com"
-#
-# status = issue.fields.status.name
-#
-# # older tickets don't have priority
-# if not issue.fields.customfield_10551:
-# incident_priority = "Low"
-# else:
-# incident_priority = issue.fields.customfield_10551.value
-#
-# incident_cost = issue.fields.customfield_20250
-# if incident_cost:
-# incident_cost = incident_cost.replace("$", "")
-# incident_cost = incident_cost.replace(",", "")
-# incident_cost = float(incident_cost)
-#
-# if issue.fields.assignee:
-# commander_email = issue.fields.assignee.emailAddress
-# else:
-# commander_email = "joe@example.com"
-#
-# resolved_at = issue.fields.resolutiondate
-#
-# description = issue.fields.description or "No Description"
-#
-#             match = re.findall(r"\[(?P<m_type>.*?)\|(?P<m_link>.*?)\]", description)
-#
-# conversation_weblink = None
-# incident_document_weblink = None
-# for m_type, m_link in match:
-# if "conversation" in m_type.lower():
-# conversation_weblink = m_link
-#
-# if "document" in m_type.lower():
-# incident_document_weblink = m_link
-#
-# ticket = {
-# "resource_type": INCIDENT_PLUGIN_TICKET_SLUG,
-# "weblink": f"{JIRA_URL}/projects/SEC/{incident_name}",
-# }
-# ticket_obj = ticket_service.create(db_session=db_session, **ticket)
-#
-# documents = []
-# if incident_document_weblink:
-# document_in = DocumentCreate(
-# name=f"{incident_name} - Investigation Document",
-# resource_id=incident_document_weblink.split("/")[-2],
-# resource_type=INCIDENT_RESOURCE_INVESTIGATION_DOCUMENT,
-# weblink=incident_document_weblink,
-# )
-#
-# document_obj = document_service.create(
-# db_session=db_session, document_in=document_in
-# )
-#
-# documents.append(document_obj)
-#
-# conversation_obj = None
-# if conversation_weblink:
-# conversation_obj = conversation_service.create(
-# db_session=db_session,
-# resource_id=incident_name.lower(),
-# resource_type=INCIDENT_PLUGIN_CONVERSATION_SLUG,
-# weblink=conversation_weblink,
-# channel_id=incident_name.lower(),
-# )
-#
-# # TODO should some of this logic be in the incident_create_flow_instead? (kglisson)
-# incident_priority = incident_priority_service.get_by_name(
-# db_session=db_session, name=incident_priority
-# )
-#
-# incident_type = incident_type_service.get_by_name(
-# db_session=db_session, name=incident_type
-# )
-#
-# try:
-# commander_info = individual_service.resolve_user_by_email(commander_email)
-# except KeyError:
-# commander_info = {"email": commander_email, "fullname": "", "weblink": ""}
-#
-# incident_commander_role = participant_role_service.create(
-# db_session=db_session, role=ParticipantRoleType.incident_commander
-# )
-#
-# commander_participant = participant_service.create(
-# db_session=db_session, participant_role=[incident_commander_role]
-# )
-#
-# commander = individual_service.get_or_create(
-# db_session=db_session,
-# email=commander_info["email"],
-# name=commander_info["fullname"],
-# weblink=commander_info["weblink"],
-# )
-#
-# incident_reporter_role = participant_role_service.create(
-# db_session=db_session, role=ParticipantRoleType.reporter
-# )
-#
-# if reporter_email == commander_email:
-# commander_participant.participant_role.append(incident_reporter_role)
-# else:
-# reporter_participant = participant_service.create(
-# db_session=db_session, participant_role=[incident_reporter_role]
-# )
-#
-# try:
-# reporter_info = individual_service.resolve_user_by_email(reporter_email)
-# except KeyError:
-# reporter_info = {"email": reporter_email, "fullname": "", "weblink": ""}
-#
-# reporter = individual_service.get_or_create(
-# db_session=db_session,
-# email=reporter_info["email"],
-# name=reporter_info["fullname"],
-# weblink=commander_info["weblink"],
-# )
-# reporter.participant.append(reporter_participant)
-# db_session.add(reporter)
-# participants.append(reporter_participant)
-#
-# participants.append(commander_participant)
-# incident = Incident(
-# title=title,
-# description=description,
-# status=status,
-# name=incident_name,
-# cost=incident_cost,
-# created_at=created_at,
-# closed_at=resolved_at,
-# incident_priority=incident_priority,
-# incident_type=incident_type,
-# participants=participants,
-# conversation=conversation_obj,
-# documents=documents,
-# ticket=ticket_obj,
-# )
-#
-# commander.participant.append(commander_participant)
-# db_session.add(commander)
-# db_session.add(incident)
-# db_session.commit()
-# click.secho(
-# f"Imported Issue. Key: {issue.key} Reporter: {incident.reporter.email}, Commander: {incident.commander.email}",
-# fg="blue",
-# )
-# except Exception as e:
-# click.secho(f"Error importing issue. Key: {issue.key} Reason: {e}", fg="red")
-#
-
-
-@incident_command_group.command("close")
-@click.argument("username")
-@click.argument("name", nargs=-1)
-def close_incidents(name, username):
- """This command will close a specific incident (running the close flow or all open incidents). Useful for development."""
- from dispatch.incident.flows import incident_closed_flow
- from dispatch.incident.models import Incident
- from dispatch.database import SessionLocal
-
- install_plugins()
+ plugin_event = PluginEvent(
+ name=plugin_event_in.name,
+ slug=plugin_event_in.slug,
+ description=plugin_event_in.description,
+ plugin=record,
+ )
+ db_session.add(plugin_event)
+ db_session.commit()
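
The install loop above is an upsert keyed on the plugin slug: create a `Plugin` row when none exists, and overwrite its metadata only when `--force` is passed. The same shape with a plain in-memory dict, purely illustrative:

```python
# In-memory registry keyed by slug; stands in for the plugin table.
registry: dict[str, dict] = {}


def upsert_plugin(slug: str, version: str, force: bool = False) -> dict:
    """Create the record if missing; overwrite it only when force is set."""
    record = registry.get(slug)
    if record is None:
        record = {"slug": slug, "version": version}
        registry[slug] = record
    elif force:
        record["version"] = version
    return record


upsert_plugin("dispatch-slack", "1.0")
upsert_plugin("dispatch-slack", "2.0")              # no force: existing version kept
upsert_plugin("dispatch-slack", "2.0", force=True)  # force: version overwritten
```

Defaulting to "keep the existing row" means a routine `plugins install` never clobbers operator edits; `--force` is the explicit opt-in to resync from the code.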
+
+
+@plugins_group.command("uninstall")
+@click.argument("plugins", nargs=-1)
+def uninstall_plugins(plugins):
+ """Uninstalls all plugins, or only one."""
+ from dispatch.database.core import SessionLocal
+ from dispatch.plugin import service as plugin_service
- incidents = []
db_session = SessionLocal()
- if not name:
- incidents = db_session.query(Incident).all()
- else:
- incidents = [db_session.query(Incident).filter(Incident.name == x).first() for x in name]
-
- for i in incidents:
- if i.conversation:
- if i.status == "Active":
- command = {"channel_id": i.conversation.channel_id, "user_id": username}
- try:
- incident_closed_flow(command=command, db_session=db_session, incident_id=i.id)
- except Exception:
- click.echo("Incident close failed.")
-
-
-@incident_command_group.command("clean")
-@click.argument("pattern", nargs=-1)
-def clean_incident_artifacts(pattern):
- """This command will clean up incident artifacts. Useful for development."""
- import re
- from dispatch.plugins.dispatch_google.drive.config import GOOGLE_DOMAIN
- from dispatch.plugins.dispatch_google.common import get_service
- from dispatch.plugins.dispatch_google.drive.drive import delete_team_drive, list_team_drives
-
- from dispatch.plugins.dispatch_slack.service import (
- slack,
- list_conversations,
- archive_conversation,
- )
- from dispatch.plugins.dispatch_slack.config import SLACK_API_BOT_TOKEN
+ for plugin_slug in plugins:
+ plugin = plugin_service.get_by_slug(db_session=db_session, slug=plugin_slug)
+ if not plugin:
+            click.secho(
+                f"Plugin slug {plugin_slug} does not exist. Make sure you're passing the plugin's slug.",
+                fg="red",
+            )
+            continue
- from dispatch.plugins.dispatch_google.groups.plugin import delete_group, list_groups
+ plugin_service.delete(db_session=db_session, plugin_id=plugin.id)
- install_plugins()
- patterns = [re.compile(p) for p in pattern]
+@dispatch_cli.group("user")
+def dispatch_user():
+ """Container for all user commands."""
+ pass
- click.secho("Deleting google groups...", fg="red")
- scopes = [
- "https://www.googleapis.com/auth/admin.directory.group",
- "https://www.googleapis.com/auth/apps.groups.settings",
- ]
- client = get_service("admin", "directory_v1", scopes)
+@dispatch_user.command("register")
+@click.argument("email")
+@click.option(
+ "--organization",
+ "-o",
+ required=True,
+ help="Organization to set role for.",
+)
+@click.password_option()
+@click.option(
+ "--role",
+ "-r",
+ required=True,
+ type=click.Choice(UserRoles),
+ help="Role to be assigned to the user.",
+)
+def register_user(email: str, role: str, password: str, organization: str):
+ """Registers a new user."""
+ from dispatch.auth import service as user_service
+ from dispatch.auth.models import UserOrganization, UserRegister
+ from dispatch.database.core import refetch_db_session
+
+ db_session = refetch_db_session(organization_slug=organization)
+ user = user_service.get_by_email(email=email, db_session=db_session)
+ if user:
+ click.secho(f"User already exists. Email: {email}", fg="red")
+ return
+
+ user_organization = UserOrganization(role=role, organization={"name": organization})
+ user_service.create(
+ user_in=UserRegister(email=email, password=password, organizations=[user_organization]),
+ db_session=db_session,
+ organization=organization,
+ )
+ click.secho("User registered successfully.", fg="green")
- for group in list_groups(client, query="email:sec-test*", domain=GOOGLE_DOMAIN)["groups"]:
- for p in patterns:
- if p.match(group["name"]):
- click.secho(group["name"], fg="red")
- delete_group(client, group_key=group["email"])
- click.secho("Archiving slack channels...", fg="red")
- client = slack.WebClient(token=SLACK_API_BOT_TOKEN)
- for c in list_conversations(client):
- for p in patterns:
- if p.match(c["name"]):
- archive_conversation(client, c["id"])
+@dispatch_user.command("update")
+@click.argument("email")
+@click.option(
+ "--organization",
+ "-o",
+ required=True,
+ help="Organization to set role for.",
+)
+@click.option(
+ "--role",
+ "-r",
+ required=True,
+ type=click.Choice(UserRoles),
+ help="Role to be assigned to the user.",
+)
+def update_user(email: str, role: str, organization: str):
+ """Updates a user's roles."""
+ from dispatch.auth import service as user_service
+ from dispatch.auth.models import UserOrganization, UserUpdate
+ from dispatch.database.core import SessionLocal
- click.secho("Deleting google drives...", fg="red")
- scopes = ["https://www.googleapis.com/auth/drive"]
- client = get_service("drive", "v3", scopes)
+ db_session = SessionLocal()
+ user = user_service.get_by_email(email=email, db_session=db_session)
+ if not user:
+ click.secho(f"No user found. Email: {email}", fg="red")
+ return
+
+    user_organization = UserOrganization(role=role, organization={"name": organization})
+    user_service.update(
+        user=user,
+        user_in=UserUpdate(id=user.id, organizations=[user_organization]),
+        db_session=db_session,
+    )
+ click.secho("User successfully updated.", fg="green")
- for drive in list_team_drives(client):
- for p in patterns:
- if p.match(drive["name"]):
- click.secho(f"Deleting drive: {drive['name']}", fg="red")
- delete_team_drive(client, drive["id"], empty=True)
+@dispatch_user.command("reset")
+@click.argument("email")
+@click.password_option()
+def reset_user_password(email: str, password: str):
+ """Resets a user's password."""
+ from dispatch.auth import service as user_service
+ from dispatch.database.core import SessionLocal
-def sync_triggers():
- from sqlalchemy_searchable import sync_trigger
+ db_session = SessionLocal()
+ user = user_service.get_by_email(email=email, db_session=db_session)
+ if not user:
+ click.secho(f"No user found. Email: {email}", fg="red")
+ return
- sync_trigger(engine, "tag", "search_vector", ["name"])
- sync_trigger(engine, "definition", "search_vector", ["text"])
- sync_trigger(engine, "incident", "search_vector", ["name", "title", "description"])
- sync_trigger(
- engine, "individual_contact", "search_vector", ["name", "title", "company", "notes"]
- )
- sync_trigger(engine, "team_contact", "search_vector", ["name", "company", "notes"])
- sync_trigger(engine, "term", "search_vector", ["text"])
- sync_trigger(engine, "document", "search_vector", ["name"])
- sync_trigger(engine, "incident_type", "search_vector", ["name", "description"])
- sync_trigger(engine, "policy", "search_vector", ["name", "description"])
- sync_trigger(engine, "service", "search_vector", ["name"])
- sync_trigger(engine, "task", "search_vector", ["description"])
+ try:
+ # Use the new set_password method which includes validation
+ user.set_password(password)
+ db_session.commit()
+ click.secho("User password successfully updated.", fg="green")
+ except ValueError as e:
+ click.secho(f"Failed to update password: {str(e)}", fg="red")
+ return
@dispatch_cli.group("database")
@@ -478,44 +266,85 @@ def dispatch_database():
pass
-@dispatch_database.command("sync-triggers")
-def database_trigger_sync():
- """Ensures that all database triggers have been installed."""
- sync_triggers()
+def prompt_for_confirmation(command: str) -> bool:
+ """Prompts the user for database details."""
+ from dispatch.config import DATABASE_HOSTNAME, DATABASE_NAME
+ from sqlalchemy_utils import database_exists
- click.secho("Success.", fg="green")
+ database_hostname = click.prompt(
+ f"Please enter the database hostname (env = {DATABASE_HOSTNAME})"
+ )
+ if database_hostname != DATABASE_HOSTNAME:
+ click.secho(
+ f"ERROR: You cannot {command} a database with a different hostname.",
+ fg="red",
+ )
+ return False
+ if database_hostname != "localhost":
+ click.secho(
+ f"Warning: You are about to {command} a remote database.",
+ fg="yellow",
+ )
+
+ database_name = click.prompt(f"Please enter the database name (env = {DATABASE_NAME})")
+ if database_name != DATABASE_NAME:
+ click.secho(
+ f"ERROR: You cannot {command} a database with a different name.",
+ fg="red",
+ )
+ return False
+
+ if command != "drop":
+ return True
+
+ sqlalchemy_database_uri = f"postgresql+psycopg2://{config._DATABASE_CREDENTIAL_USER}:{config._QUOTED_DATABASE_PASSWORD}@{database_hostname}:{config.DATABASE_PORT}/{database_name}"
+ if database_exists(str(sqlalchemy_database_uri)):
+ if click.confirm(
+ f"Are you sure you want to {command} database: '{database_hostname}:{database_name}'?"
+ ):
+ return True
+ else:
+        click.secho(f"Database '{database_hostname}:{database_name}' does not exist.", fg="red")
+ return False
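
`prompt_for_confirmation` above refuses destructive commands unless the operator retypes the hostname and database name the environment is actually configured with. The core check is plain string equality, sketched here with injected values instead of `click.prompt` (the function name is hypothetical):

```python
def confirm_database_target(
    entered_host: str, entered_name: str, env_host: str, env_name: str
) -> bool:
    """Return True only when the operator retyped the configured target."""
    if entered_host != env_host:
        # Refuse to act on a database at a different hostname.
        return False
    if entered_name != env_name:
        # Refuse to act on a differently named database.
        return False
    return True


ok = confirm_database_target("localhost", "dispatch", "localhost", "dispatch")
typo = confirm_database_target("prod-db", "dispatch", "localhost", "dispatch")
```

Forcing the operator to type the values, rather than confirm a default, guards against running `drop` or `restore` against the wrong target when stale environment variables are in effect.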
@dispatch_database.command("init")
-def init_database():
+def database_init():
"""Initializes a new database."""
- from sqlalchemy_utils import create_database, database_exists
+ click.echo("Initializing new database...")
+ from .database.core import engine
+ from .database.manage import init_database
- if not database_exists(str(config.SQLALCHEMY_DATABASE_URI)):
- create_database(str(config.SQLALCHEMY_DATABASE_URI))
- Base.metadata.create_all(engine)
- alembic_path = os.path.join(os.path.dirname(os.path.realpath(__file__)), "alembic.ini")
- alembic_cfg = AlembicConfig(alembic_path)
- alembic_command.stamp(alembic_cfg, "head")
+ if not prompt_for_confirmation("init"):
+ click.secho("Aborting database initialization.", fg="red")
+ return
- sync_triggers()
+ init_database(engine)
click.secho("Success.", fg="green")
@dispatch_database.command("restore")
@click.option(
- "--dump-file", default='dispatch-backup.dump', help="Path to a PostgreSQL dump file.")
-def restore_database(dump_file):
- """Restores the database via pg_restore."""
- import sh
- from sh import psql, createdb
+ "--dump-file",
+ default="dispatch-backup.dump",
+ help="Path to a PostgreSQL text format dump file.",
+)
+@click.option("--skip-check", is_flag=True, help="Skip confirmation check if flag is set.")
+def restore_database(dump_file, skip_check):
+ """Restores the database via psql."""
+ from sh import ErrorReturnCode_1, createdb, psql
+
from dispatch.config import (
+ DATABASE_CREDENTIALS,
DATABASE_HOSTNAME,
DATABASE_NAME,
DATABASE_PORT,
- DATABASE_CREDENTIALS
)
+ if not skip_check and not prompt_for_confirmation("restore"):
+ click.secho("Aborting database restore.", fg="red")
+ return
+
username, password = str(DATABASE_CREDENTIALS).split(":")
try:
@@ -531,7 +360,7 @@ def restore_database(dump_file):
_env={"PGPASSWORD": password},
)
)
- except sh.ErrorReturnCode_1:
+ except ErrorReturnCode_1:
print("Database already exists.")
print(
@@ -542,6 +371,8 @@ def restore_database(dump_file):
DATABASE_PORT,
"-U",
username,
+ "-d",
+ DATABASE_NAME,
"-f",
dump_file,
_env={"PGPASSWORD": password},
@@ -551,21 +382,27 @@ def restore_database(dump_file):
@dispatch_database.command("dump")
-def dump_database():
+@click.option(
+ "--dump-file",
+ default="dispatch-backup.dump",
+ help="Path to a PostgreSQL text format dump file.",
+)
+def dump_database(dump_file):
"""Dumps the database via pg_dump."""
from sh import pg_dump
+
from dispatch.config import (
+ DATABASE_CREDENTIALS,
DATABASE_HOSTNAME,
DATABASE_NAME,
DATABASE_PORT,
- DATABASE_CREDENTIALS,
)
username, password = str(DATABASE_CREDENTIALS).split(":")
pg_dump(
"-f",
- "dispatch-backup.dump",
+ dump_file,
"-h",
DATABASE_HOSTNAME,
"-p",
@@ -578,18 +415,21 @@ def dump_database():
@dispatch_database.command("drop")
-@click.option(
- "--yes",
- is_flag=True,
- callback=abort_if_false,
- expose_value=False,
- prompt="Are you sure you want to drop the database?",
-)
def drop_database():
"""Drops all data in database."""
from sqlalchemy_utils import drop_database
- drop_database(str(config.SQLALCHEMY_DATABASE_URI))
+ if not prompt_for_confirmation("drop"):
+ click.secho("Aborting database drop.", fg="red")
+ return
+
+ sqlalchemy_database_uri = (
+ f"postgresql+psycopg2://{config._DATABASE_CREDENTIAL_USER}:"
+ f"{config._QUOTED_DATABASE_PASSWORD}@{config.DATABASE_HOSTNAME}:"
+ f"{config.DATABASE_PORT}/{config.DATABASE_NAME}"
+ )
+
+ drop_database(str(sqlalchemy_database_uri))
click.secho("Success.", fg="green")
@@ -604,36 +444,102 @@ def drop_database():
help="Don't emit SQL to database - dump to standard output instead.",
)
@click.option("--revision", nargs=1, default="head", help="Revision identifier.")
-def upgrade_database(tag, sql, revision):
+@click.option("--revision-type", type=click.Choice(["core", "tenant"]))
+def upgrade_database(tag, sql, revision, revision_type):
"""Upgrades database schema to newest version."""
- from sqlalchemy_utils import database_exists, create_database
+ import sqlalchemy
+ from alembic import command as alembic_command
+ from alembic.config import Config as AlembicConfig
+ from sqlalchemy import inspect
+ from sqlalchemy_utils import database_exists
+
+ from .database.core import engine
+ from .database.manage import init_database
+
+ alembic_cfg = AlembicConfig(config.ALEMBIC_INI_PATH)
- alembic_path = os.path.join(os.path.dirname(os.path.realpath(__file__)), "alembic.ini")
- alembic_cfg = AlembicConfig(alembic_path)
if not database_exists(str(config.SQLALCHEMY_DATABASE_URI)):
- create_database(str(config.SQLALCHEMY_DATABASE_URI))
- Base.metadata.create_all(engine)
- alembic_command.stamp(alembic_cfg, "head")
+ click.secho("Found no database to upgrade, initializing new database...")
+ init_database(engine)
else:
- alembic_path = os.path.join(os.path.dirname(os.path.realpath(__file__)), "alembic.ini")
- alembic_cfg = AlembicConfig(alembic_path)
- alembic_command.upgrade(alembic_cfg, revision, sql=sql, tag=tag)
+ conn = engine.connect()
+
+ # detect if we need to convert to a multi-tenant schema structure
+ schema_names = inspect(engine).get_schema_names()
+ if "dispatch_core" not in schema_names:
+ click.secho("Detected single tenant database, converting to multi-tenant...")
+ conn.execute(sqlalchemy.text(open(config.ALEMBIC_MULTI_TENANT_MIGRATION_PATH).read()))
+
+ if revision_type:
+ if revision_type == "core":
+ path = config.ALEMBIC_CORE_REVISION_PATH
+
+ elif revision_type == "tenant":
+ path = config.ALEMBIC_TENANT_REVISION_PATH
+
+ alembic_cfg.set_main_option("script_location", path)
+ alembic_command.upgrade(alembic_cfg, revision, sql=sql, tag=tag)
+ else:
+ for path in [config.ALEMBIC_CORE_REVISION_PATH, config.ALEMBIC_TENANT_REVISION_PATH]:
+ alembic_cfg.set_main_option("script_location", path)
+ alembic_command.upgrade(alembic_cfg, revision, sql=sql, tag=tag)
+
click.secho("Success.", fg="green")
+@dispatch_database.command("merge")
+@click.argument("revisions", nargs=-1)
+@click.option("--revision-type", type=click.Choice(["core", "tenant"]), default="core")
+@click.option("--message")
+def merge_revisions(revisions, revision_type, message):
+ """Combines two revisions."""
+ from alembic import command as alembic_command
+ from alembic.config import Config as AlembicConfig
+
+ alembic_cfg = AlembicConfig(config.ALEMBIC_INI_PATH)
+ if revision_type == "core":
+ path = config.ALEMBIC_CORE_REVISION_PATH
+
+ elif revision_type == "tenant":
+ path = config.ALEMBIC_TENANT_REVISION_PATH
+
+ alembic_cfg.set_main_option("script_location", path)
+ alembic_command.merge(alembic_cfg, revisions, message=message)
+
+
@dispatch_database.command("heads")
-def head_database():
+@click.option("--revision-type", type=click.Choice(["core", "tenant"]), default="core")
+def head_database(revision_type):
"""Shows the heads of the database."""
- alembic_path = os.path.join(os.path.dirname(os.path.realpath(__file__)), "alembic.ini")
- alembic_cfg = AlembicConfig(alembic_path)
+ from alembic import command as alembic_command
+ from alembic.config import Config as AlembicConfig
+
+ alembic_cfg = AlembicConfig(config.ALEMBIC_INI_PATH)
+ if revision_type == "core":
+ path = config.ALEMBIC_CORE_REVISION_PATH
+
+ elif revision_type == "tenant":
+ path = config.ALEMBIC_TENANT_REVISION_PATH
+
+ alembic_cfg.set_main_option("script_location", path)
alembic_command.heads(alembic_cfg)
@dispatch_database.command("history")
-def history_database():
+@click.option("--revision-type", type=click.Choice(["core", "tenant"]), default="core")
+def history_database(revision_type):
"""Shows the history of the database."""
- alembic_path = os.path.join(os.path.dirname(os.path.realpath(__file__)), "alembic.ini")
- alembic_cfg = AlembicConfig(alembic_path)
+ from alembic import command as alembic_command
+ from alembic.config import Config as AlembicConfig
+
+ alembic_cfg = AlembicConfig(config.ALEMBIC_INI_PATH)
+ if revision_type == "core":
+ path = config.ALEMBIC_CORE_REVISION_PATH
+
+ elif revision_type == "tenant":
+ path = config.ALEMBIC_TENANT_REVISION_PATH
+
+ alembic_cfg.set_main_option("script_location", path)
alembic_command.history(alembic_cfg)
@@ -648,22 +554,57 @@ def history_database():
help="Don't emit SQL to database - dump to standard output instead.",
)
@click.option("--revision", nargs=1, default="head", help="Revision identifier.")
-def downgrade_database(tag, sql, revision):
+@click.option("--revision-type", type=click.Choice(["core", "tenant"]), default="core")
+def downgrade_database(tag, sql, revision, revision_type):
"""Downgrades database schema to next newest version."""
- alembic_path = os.path.join(os.path.dirname(os.path.realpath(__file__)), "alembic.ini")
- alembic_cfg = AlembicConfig(alembic_path)
+ from alembic import command as alembic_command
+ from alembic.config import Config as AlembicConfig
if sql and revision == "-1":
revision = "head:-1"
+ alembic_cfg = AlembicConfig(config.ALEMBIC_INI_PATH)
+ if revision_type == "core":
+ path = config.ALEMBIC_CORE_REVISION_PATH
+
+ elif revision_type == "tenant":
+ path = config.ALEMBIC_TENANT_REVISION_PATH
+
+ alembic_cfg.set_main_option("script_location", path)
alembic_command.downgrade(alembic_cfg, revision, sql=sql, tag=tag)
click.secho("Success.", fg="green")
-@dispatch_database.command("revision")
+@dispatch_database.command("stamp")
+@click.argument("revision", nargs=1, default="head")
+@click.option("--revision-type", type=click.Choice(["core", "tenant"]), default="core")
+@click.option(
+ "--tag", default=None, help="Arbitrary 'tag' name - can be used by custom env.py scripts."
+)
@click.option(
- "-d", "--directory", default=None, help=('migration script directory (default is "migrations")')
+ "--sql",
+ is_flag=True,
+ default=False,
+ help="Don't emit SQL to database - dump to standard output instead.",
)
+def stamp_database(revision, revision_type, tag, sql):
+ """Forces the database to a given revision."""
+ from alembic import command as alembic_command
+ from alembic.config import Config as AlembicConfig
+
+ alembic_cfg = AlembicConfig(config.ALEMBIC_INI_PATH)
+
+ if revision_type == "core":
+ path = config.ALEMBIC_CORE_REVISION_PATH
+
+ elif revision_type == "tenant":
+ path = config.ALEMBIC_TENANT_REVISION_PATH
+
+ alembic_cfg.set_main_option("script_location", path)
+ alembic_command.stamp(alembic_cfg, revision, sql=sql, tag=tag)
+
+
+@dispatch_database.command("revision")
@click.option("-m", "--message", default=None, help="Revision message")
@click.option(
"--autogenerate",
@@ -673,13 +614,14 @@ def downgrade_database(tag, sql, revision):
"operations, based on comparison of database to model"
),
)
+@click.option("--revision-type", type=click.Choice(["core", "tenant"]))
@click.option(
- "--sql", is_flag=True, help=("Don't emit SQL to database - dump to standard output " "instead")
+ "--sql", is_flag=True, help=("Don't emit SQL to database - dump to standard output instead")
)
@click.option(
"--head",
default="head",
- help=("Specify head revision or @head to base new " "revision on"),
+ help=("Specify head revision or @head to base new revision on"),
)
@click.option(
"--splice", is_flag=True, help=('Allow a non-head revision as the "head" to splice onto')
@@ -691,43 +633,91 @@ def downgrade_database(tag, sql, revision):
"--version-path", default=None, help=("Specify specific path from config for version file")
)
@click.option(
- "--rev-id", default=None, help=("Specify a hardcoded revision id instead of generating " "one")
+ "--rev-id", default=None, help=("Specify a hardcoded revision id instead of generating one")
)
def revision_database(
- directory, message, autogenerate, sql, head, splice, branch_label, version_path, rev_id
+ message, autogenerate, revision_type, sql, head, splice, branch_label, version_path, rev_id
):
"""Create new database revision."""
- alembic_path = os.path.join(os.path.dirname(os.path.realpath(__file__)), "alembic.ini")
- alembic_cfg = AlembicConfig(alembic_path)
- alembic_command.revision(
- alembic_cfg,
- message,
- autogenerate=autogenerate,
- sql=sql,
- head=head,
- splice=splice,
- branch_label=branch_label,
- version_path=version_path,
- rev_id=rev_id,
- )
+ import types
+
+ from alembic import command as alembic_command
+ from alembic.config import Config as AlembicConfig
+
+ alembic_cfg = AlembicConfig(config.ALEMBIC_INI_PATH)
+
+ if revision_type:
+ if revision_type == "core":
+ path = config.ALEMBIC_CORE_REVISION_PATH
+ elif revision_type == "tenant":
+ path = config.ALEMBIC_TENANT_REVISION_PATH
+
+ alembic_cfg.set_main_option("script_location", path)
+ alembic_cfg.cmd_opts = types.SimpleNamespace(cmd="revision")
+ alembic_command.revision(
+ alembic_cfg,
+ message,
+ autogenerate=autogenerate,
+ sql=sql,
+ head=head,
+ splice=splice,
+ branch_label=branch_label,
+ version_path=version_path,
+ rev_id=rev_id,
+ )
+ else:
+ for path in [
+ config.ALEMBIC_CORE_REVISION_PATH,
+ config.ALEMBIC_TENANT_REVISION_PATH,
+ ]:
+ alembic_cfg.set_main_option("script_location", path)
+ alembic_cfg.cmd_opts = types.SimpleNamespace(cmd="revision")
+ alembic_command.revision(
+ alembic_cfg,
+ message,
+ autogenerate=autogenerate,
+ sql=sql,
+ head=head,
+ splice=splice,
+ branch_label=branch_label,
+ version_path=version_path,
+ rev_id=rev_id,
+ )
@dispatch_cli.group("scheduler")
def dispatch_scheduler():
"""Container for all dispatch scheduler commands."""
# we need scheduled tasks to be imported
- from .incident.scheduled import daily_summary # noqa
- from .task.scheduled import sync_tasks, create_task_reminders # noqa
- from .term.scheduled import sync_terms # noqa
+ from .case.scheduled import case_close_reminder, case_triage_reminder # noqa
+ from .case.scheduled_internal import schedule_placeholder # noqa
+ from .case_cost.scheduled import (
+ calculate_cases_response_cost, # noqa
+ )
+ from .data.source.scheduled import sync_sources # noqa
from .document.scheduled import sync_document_terms # noqa
- from .tag.scheduled import sync_tags # noqa
-
- install_plugins()
+ from .evergreen.scheduled import create_evergreen_reminders # noqa
+ from .feedback.incident.scheduled import feedback_report_daily # noqa
+ from .feedback.service.scheduled import oncall_shift_feedback # noqa
+ from .incident.scheduled import (
+ incident_auto_tagger, # noqa
+ )
+ from .incident_cost.scheduled import calculate_incidents_response_cost # noqa
+ from .monitor.scheduled import sync_active_stable_monitors # noqa
+ from .report.scheduled import incident_report_reminders # noqa
+ from .tag.scheduled import build_tag_models, sync_tags # noqa
+ from .task.scheduled import (
+ create_incident_tasks_reminders, # noqa
+ )
+ from .term.scheduled import sync_terms # noqa
+ from .workflow.scheduled import sync_workflows # noqa
@dispatch_scheduler.command("list")
def list_tasks():
- """Prints and runs all currently configured periodic tasks, in seperate event loop."""
+ """Prints all currently configured periodic tasks."""
+ from tabulate import tabulate
+
table = []
for task in scheduler.registered_tasks:
table.append([task["name"], task["job"].period, task["job"].at_time])
@@ -737,14 +727,27 @@ def list_tasks():
@dispatch_scheduler.command("start")
@click.argument("tasks", nargs=-1)
+@click.option("--exclude", multiple=True, help="Specifically exclude tasks you do not wish to run.")
@click.option("--eager", is_flag=True, default=False, help="Run the tasks immediately.")
-def start_tasks(tasks, eager):
+def start_tasks(tasks, exclude, eager):
"""Starts the scheduler."""
+ import signal
+
+ from dispatch.common.utils.cli import install_plugins
+ from dispatch.scheduler import stop_scheduler
+
+ install_plugins()
+
if tasks:
for task in scheduler.registered_tasks:
if task["name"] not in tasks:
scheduler.remove(task)
+ if exclude:
+ for task in scheduler.registered_tasks:
+ if task["name"] in exclude:
+ scheduler.remove(task)
+
if eager:
for task in tasks:
for r_task in scheduler.registered_tasks:
@@ -753,7 +756,12 @@ def start_tasks(tasks, eager):
r_task["func"]()
break
else:
- click.secho(f"Task not found. TaskName: {task}", fg="red")
+ click.secho(f"A scheduled task/job named {task} does not exist", fg="red")
+
+ # registers a handler to stop future scheduling when encountering sigterm
+ signals = (signal.SIGHUP, signal.SIGTERM, signal.SIGINT)
+ for s in signals:
+ signal.signal(s, stop_scheduler)
click.secho("Starting scheduler...", fg="blue")
scheduler.start()
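The shutdown wiring above registers one handler for several termination signals before the scheduler blocks. A self-contained sketch of the same pattern — the `stop` function here is a placeholder standing in for Dispatch's `stop_scheduler`:

```python
import signal

stopped = {"flag": False}

def stop(signum, frame):
    """Placeholder shutdown hook; Dispatch installs stop_scheduler here."""
    stopped["flag"] = True

# Register the same handler for each signal that should trigger shutdown.
for s in (signal.SIGHUP, signal.SIGTERM, signal.SIGINT):
    signal.signal(s, stop)

# Raising the signal in-process invokes the handler on the main thread.
signal.raise_signal(signal.SIGTERM)
assert stopped["flag"] is True
```

Note `signal.SIGHUP` is POSIX-only; a portable variant would filter the tuple against `hasattr(signal, ...)`.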
@@ -762,23 +770,19 @@ def start_tasks(tasks, eager):
@dispatch_cli.group("server")
def dispatch_server():
"""Container for all dispatch server commands."""
- install_plugins()
+ pass
@dispatch_server.command("routes")
def show_routes():
"""Prints all available routes."""
- from dispatch.api import api_router
+ from tabulate import tabulate
- install_plugin_events(api_router)
+ from dispatch.main import api_router
table = []
for r in api_router.routes:
- auth = False
- for d in r.dependencies:
- if d.dependency.__name__ == "get_current_user": # TODO this is fragile
- auth = True
- table.append([r.path, auth, ",".join(r.methods)])
+ table.append([r.path, ",".join(r.methods)])
- click.secho(tabulate(table, headers=["Path", "Authenticated", "Methods"]), fg="blue")
+ click.secho(tabulate(table, headers=["Path", "Methods"]), fg="blue")
@@ -786,11 +790,19 @@ def show_routes():
@dispatch_server.command("config")
def show_config():
"""Prints the current config as dispatch sees it."""
- from dispatch.config import config
+ import inspect
+ import sys
+
+ from tabulate import tabulate
+
+ from dispatch import config
+
+ func_members = inspect.getmembers(sys.modules[config.__name__])
table = []
- for k, v in config.file_values.items():
- table.append([k, v])
+ for key, value in func_members:
+ if key.isupper():
+ table.append([key, value])
click.secho(tabulate(table, headers=["Key", "Value"]), fg="blue")
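The rewritten `show_config` walks the config module's members and treats all-uppercase names as settings. The same filtering can be sketched against a throwaway module, with no Dispatch imports assumed:

```python
import inspect
import types

# Build a throwaway module mixing settings and helpers.
mod = types.ModuleType("fake_config")
mod.DATABASE_PORT = 5432
mod.helper = lambda: None

# Uppercase attributes are treated as configuration values; dunder
# names like __name__ contain lowercase letters, so isupper() skips them.
settings = {k: v for k, v in inspect.getmembers(mod) if k.isupper()}
assert settings == {"DATABASE_PORT": 5432}
```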
@@ -806,26 +818,390 @@ def run_server(log_level):
"""Runs a simple server for development."""
# Uvicorn expects lowercase logging levels; the logging package expects upper.
os.environ["LOG_LEVEL"] = log_level.upper()
- if not config.STATIC_DIR:
+ if not os.path.isdir(config.STATIC_DIR):
import atexit
+ import subprocess
from subprocess import Popen
# take our frontend vars and export them for the frontend to consume
envvars = os.environ.copy()
- envvars.update({x: getattr(config, x) for x in dir(config) if x.startswith("VUE_APP_")})
+ envvars.update({x: getattr(config, x) for x in dir(config) if x.startswith("VITE_")})
- p = Popen(["npm", "run", "serve"], cwd="src/dispatch/static/dispatch", env=envvars)
+ # Add git commit information for development
+ try:
+ commit_hash = subprocess.check_output(
+ ["git", "rev-parse", "HEAD"], cwd=".", stderr=subprocess.DEVNULL
+ ).decode("utf-8").strip()
+ envvars["VITE_DISPATCH_COMMIT_HASH"] = commit_hash
+ except (subprocess.CalledProcessError, FileNotFoundError):
+ # If git is not available or not in a git repo, use a default value
+ envvars["VITE_DISPATCH_COMMIT_HASH"] = "dev-local"
+
+ try:
+ commit_message = subprocess.check_output(
+ ["git", "log", "-1", "--pretty=%B"], cwd=".", stderr=subprocess.DEVNULL
+ ).decode("utf-8").strip()
+ envvars["VITE_DISPATCH_COMMIT_MESSAGE"] = commit_message
+ except (subprocess.CalledProcessError, FileNotFoundError):
+ # If git is not available or not in a git repo, use a default value
+ envvars["VITE_DISPATCH_COMMIT_MESSAGE"] = "Development build"
+
+ try:
+ commit_date = subprocess.check_output(
+ ["git", "log", "-1", "--pretty=%cd", "--date=short"], cwd=".", stderr=subprocess.DEVNULL
+ ).decode("utf-8").strip()
+ envvars["VITE_DISPATCH_COMMIT_DATE"] = commit_date
+ except (subprocess.CalledProcessError, FileNotFoundError):
+ # If git is not available or not in a git repo, use a default value
+ envvars["VITE_DISPATCH_COMMIT_DATE"] = "Unknown"
+ is_windows = os.name == "nt"
+ windows_cmds = ["cmd", "/c"]
+ default_cmds = ["npm", "run", "serve"]
+ cmds = windows_cmds + default_cmds if is_windows else default_cmds
+ p = Popen(
+ cmds,
+ cwd=os.path.join("src", "dispatch", "static", "dispatch"),
+ env=envvars,
+ )
atexit.register(p.terminate)
- uvicorn.run("dispatch.main:app", debug=True, log_level=log_level)
+ uvicorn.run("dispatch.main:app", reload=True, log_level=log_level)
+
+
+dispatch_server.add_command(uvicorn.main, name="start")
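The three git lookups in `run_server` repeat the same try/except shape; they could be factored into one helper. A sketch under the same assumptions (git may be missing or the cwd may not be a repository), with a hypothetical helper name:

```python
import subprocess

def git_output(args: list[str], default: str) -> str:
    """Run a git command, falling back to a default when git is
    unavailable or the command fails (e.g. not a git repository)."""
    try:
        return (
            subprocess.check_output(["git", *args], stderr=subprocess.DEVNULL)
            .decode("utf-8")
            .strip()
        )
    except (subprocess.CalledProcessError, FileNotFoundError):
        return default

# An unknown subcommand exits nonzero, so the default is returned.
assert git_output(["definitely-not-a-subcommand"], "dev-local") == "dev-local"
```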
+
+
+@dispatch_cli.group("signals")
+def signals_group():
+ """All commands for signal consumer manipulation."""
+ pass
+
+
+@signals_group.command("consume")
+def consume_signals():
+ """
+ Runs a continuous process that consumes signals from the specified plugins.
+
+ This function sets up consumer threads for all active signal-consumer plugins
+ across all organizations and projects. It monitors these threads and restarts
+ them if they die. The process can be terminated using SIGINT or SIGTERM.
+
+ Returns:
+ None
+ """
+ from dispatch.common.utils.cli import install_plugins
+ from dispatch.database.core import get_organization_session, get_session
+ from dispatch.organization.service import get_all as get_all_organizations
+ from dispatch.plugin import service as plugin_service
+ from dispatch.project import service as project_service
+
+ install_plugins()
+
+ try:
+ with get_session() as session:
+ organizations = get_all_organizations(db_session=session)
+ except Exception as e:
+ log.exception(f"Error fetching organizations: {e}")
+ return
+
+ for organization in organizations:
+ try:
+ with get_organization_session(organization.slug) as session:
+ projects = project_service.get_all(db_session=session)
+
+ for project in projects:
+ try:
+ plugins = plugin_service.get_active_instances(
+ db_session=session, plugin_type="signal-consumer", project_id=project.id
+ )
+
+ if not plugins:
+ log.warning(
+ f"No signals consumed. No signal-consumer plugins enabled. Project: {project.name}. Organization: {project.organization.name}"
+ )
+ continue
+
+ for plugin in plugins:
+ log.debug(f"Consuming signals for plugin: {plugin.plugin.slug}")
+ try:
+ plugin.instance.consume(db_session=session, project=project)
+ except Exception as e:
+ log.error(
+ f"Error consuming signals for plugin: {plugin.plugin.slug}. Error: {e}"
+ )
+ except Exception as e:
+ log.exception(f"Error processing project {project.name}: {e}")
+ except Exception as e:
+ log.exception(f"Error processing organization {organization.slug}: {e}")
+
+
+@signals_group.command("process")
+def process_signals():
+ """
+ Runs a continuous process that does additional processing on newly created signals.
+
+ This function processes signal instances across all organizations by:
+ 1. Creating a session for each organization using a context manager
+ 2. Fetching unprocessed signal instances (with no filter_action or case_id)
+ 3. Processing each instance with proper error handling
+ 4. Ensuring proper session cleanup even if exceptions occur
+
+ Returns:
+ None
+ """
+ from contextlib import contextmanager
+
+ from sqlalchemy import asc
+ from sqlalchemy.orm import sessionmaker
+
+ from dispatch.common.utils.cli import install_plugins
+ from dispatch.database.core import SessionLocal, engine
+ from dispatch.organization.service import get_all as get_all_organizations
+ from dispatch.signal import flows as signal_flows
+ from dispatch.signal.models import SignalInstance
+
+ install_plugins()
+
+ @contextmanager
+ def session_scope(schema_engine):
+ """Provide a transactional scope around a series of operations."""
+ session = sessionmaker(bind=schema_engine)()
+ try:
+ yield session
+ session.commit()
+ except Exception:
+ session.rollback()
+ raise
+ finally:
+ session.close()
+
+ organizations = get_all_organizations(db_session=SessionLocal())
+
+ while True:
+ for organization in organizations:
+ schema_engine = engine.execution_options(
+ schema_translate_map={
+ None: f"dispatch_organization_{organization.slug}",
+ }
+ )
+ try:
+ with session_scope(schema_engine) as db_session:
+ # Get IDs first rather than full instances
+ signal_instance_ids = (
+ db_session.query(SignalInstance.id)
+ .filter(SignalInstance.filter_action == None) # noqa
+ .filter(SignalInstance.case_id == None) # noqa
+ .order_by(asc(SignalInstance.created_at))
+ .limit(500)
+ .all()
+ )
+
+ # Process each instance with its own transaction
+ for (instance_id,) in signal_instance_ids:
+ try:
+ # Process each signal instance in its own transaction
+ # This ensures each instance is fresh and attached to the session
+ signal_flows.signal_instance_create_flow(
+ db_session=db_session,
+ signal_instance_id=instance_id,
+ )
+ # Commit after each successful processing to avoid
+ # accumulating too many objects in the session
+ db_session.commit()
+ except Exception as e:
+ log.exception(f"Error processing signal instance {instance_id}: {e}")
+ # Rollback this specific transaction but continue with others
+ db_session.rollback()
+ except Exception as e:
+ log.exception(f"Error processing signals for organization {organization.slug}: {e}")
+ # No need to close the session here as it's handled by the context manager
+
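The `session_scope` helper above follows the standard commit-on-success, rollback-on-error, always-close contract. A self-contained sketch of the same pattern, using a stand-in session since no database is assumed here:

```python
from contextlib import contextmanager

class FakeSession:
    """Stand-in for a SQLAlchemy session, recording calls for illustration."""
    def __init__(self):
        self.calls = []
    def commit(self):
        self.calls.append("commit")
    def rollback(self):
        self.calls.append("rollback")
    def close(self):
        self.calls.append("close")

@contextmanager
def session_scope(session):
    """Commit on success, roll back on error, always close."""
    try:
        yield session
        session.commit()
    except Exception:
        session.rollback()
        raise
    finally:
        session.close()

# The success path commits, then closes.
ok = FakeSession()
with session_scope(ok):
    pass
assert ok.calls == ["commit", "close"]
```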
+
+@signals_group.command("perf-test")
+@click.option("--num-instances", default=1, help="Number of signal instances to send.")
+@click.option("--num-workers", default=1, help="Number of threads to use.")
+@click.option(
+ "--api-endpoint",
+ default=f"{DISPATCH_UI_URL}/api/v1/default/signals/instances",
+ required=True,
+ help="API endpoint to send the signal instances to.",
+)
+@click.option(
+ "--api-token",
+ required=True,
+ help="API token to use.",
+)
+@click.option(
+ "--project",
+ default="Test",
+ required=True,
+ help="The Dispatch project to send the instances to.",
+)
+def perf_test(
+ num_instances: int, num_workers: int, api_endpoint: str, api_token: str, project: str
+) -> None:
+ """Performance testing utility for creating signal instances."""
+
+ import concurrent.futures
+ import time
+ import uuid
+
+ import requests
+ from fastapi import status
+
+ NUM_SIGNAL_INSTANCES = num_instances
+ NUM_WORKERS = num_workers
+
+ session = requests.Session()
+ session.headers.update(
+ {
+ "Content-Type": "application/json",
+ "Authorization": f"Bearer {api_token}",
+ }
+ )
+ start_time = time.time()
+
+ def _send_signal_instance(
+ api_endpoint: str,
+ api_token: str,
+ session: requests.Session,
+ signal_instance: dict[str, str],
+ ) -> None:
+ r = None
+ try:
+ r = session.post(
+ api_endpoint,
+ json=signal_instance,
+ headers={
+ "Content-Type": "application/json",
+ "Authorization": f"Bearer {api_token}",
+ },
+ )
+ log.info(f"Response: {r.json()}")
+ if r.status_code == status.HTTP_401_UNAUTHORIZED:
+ raise PermissionError(
+ "Unauthorized. Please check your bearer token. You can find it in the Dev Tools under Request Headers -> Authorization."
+ )
+
+ r.raise_for_status()
+
+ except requests.exceptions.RequestException as e:
+ log.error(f"Unable to send signal instance. Reason: {e} Response: {r.json() if r else 'N/A'}")
+ else:
+ log.info(f"{signal_instance.get('raw', {}).get('id')} created successfully")
+
+ def send_signal_instances(
+ api_endpoint: str, api_token: str, signal_instances: list[dict[str, str]]
+ ):
+ with concurrent.futures.ThreadPoolExecutor(max_workers=NUM_WORKERS) as executor:
+ futures = [
+ executor.submit(
+ _send_signal_instance,
+ api_endpoint=api_endpoint,
+ api_token=api_token,
+ session=session,
+ signal_instance=signal_instance,
+ )
+ for signal_instance in signal_instances
+ ]
+ results = [future.result() for future in concurrent.futures.as_completed(futures)]
+
+ log.info(f"\nSent {len(results)} of {NUM_SIGNAL_INSTANCES} signal instances")
+
+ signal_instances = [
+ {
+ "project": {"name": project},
+ "raw": {
+ "id": str(uuid.uuid4()),
+ "name": "Test Signal",
+ "slug": "test-signal",
+ "canary": False,
+ "events": [
+ {
+ "original": {
+ "dateint": 20240930,
+ "distinct_lookupkey_count": 95,
+ },
+ },
+ ],
+ "created_at": "2024-09-18T19:47:15Z",
+ "quiet_mode": False,
+ "external_id": "4ebbab36-c703-495f-ae47-7051bdc8b3ef",
+ },
+ },
+ ] * NUM_SIGNAL_INSTANCES
+
+ send_signal_instances(api_endpoint, api_token, signal_instances)
+
+ elapsed_time = time.time() - start_time
+ click.echo(f"Elapsed time: {elapsed_time:.2f} seconds")
+
+
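`perf_test` fans requests out over a thread pool and collects results as they complete. The same submit/as_completed pattern in isolation, with a trivial function in place of the HTTP call:

```python
import concurrent.futures

def fan_out(func, items, max_workers=4):
    """Apply func to each item on a small thread pool, collecting
    results in completion order (not submission order)."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=max_workers) as executor:
        futures = [executor.submit(func, item) for item in items]
        return [f.result() for f in concurrent.futures.as_completed(futures)]

# Squares of 0..4; sorted because completion order is not deterministic.
results = sorted(fan_out(lambda n: n * n, range(5)))
assert results == [0, 1, 4, 9, 16]
```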
+@dispatch_server.command("slack")
+@click.argument("organization")
+@click.argument("project")
+def run_slack_websocket(organization: str, project: str):
+ """Runs the slack websocket process."""
+ from slack_bolt.adapter.socket_mode import SocketModeHandler
+ from sqlalchemy import true
+
+ from dispatch.common.utils.cli import install_plugins
+ from dispatch.database.core import refetch_db_session
+ from dispatch.plugins.dispatch_slack.bolt import app
+ from dispatch.plugins.dispatch_slack.case.interactive import configure as case_configure
+ from dispatch.plugins.dispatch_slack.incident.interactive import configure as incident_configure
+ from dispatch.plugins.dispatch_slack.workflow import configure as workflow_configure
+ from dispatch.project import service as project_service
+ from dispatch.project.models import ProjectRead
+
+ install_plugins()
-dispatch_server.add_command(uvicorn_main, name="start")
+ session = refetch_db_session(organization)
+
+ project = project_service.get_by_name_or_raise(
+ db_session=session, project_in=ProjectRead(name=project)
+ )
+
+ instances = (
+ session.query(PluginInstance)
+ .filter(PluginInstance.enabled == true())
+ .filter(PluginInstance.project_id == project.id)
+ .all()
+ )
+
+ instance = None
+ for i in instances:
+ if i.plugin.slug == "slack-conversation":
+ instance: PluginInstance = i
+ break
+
+ if not instance:
+ click.secho(
+ f"No slack plugin has been configured for this organization/project. Organization: {organization} Project: {project.name}",
+ fg="red",
+ )
+ return
+
+ session.close()
+
+ click.secho("Slack websocket process started...", fg="blue")
+ incident_configure(instance.configuration)
+ workflow_configure(instance.configuration)
+ case_configure(instance.configuration)
+
+ app._token = instance.configuration.api_bot_token.get_secret_value()
+
+ handler = SocketModeHandler(
+ app, instance.configuration.socket_mode_app_token.get_secret_value()
+ )
+ handler.start()
@dispatch_server.command("shell")
@click.argument("ipython_args", nargs=-1, type=click.UNPROCESSED)
def shell(ipython_args):
"""Starts an ipython shell importing our app. Useful for debugging."""
+ import sys
+
import IPython
from IPython.terminal.ipapp import load_default_config
@@ -839,6 +1215,8 @@ def shell(ipython_args):
def entrypoint():
"""The entry that the CLI is executed from"""
+ from .exceptions import DispatchException
+
try:
dispatch_cli()
except DispatchException as e:
diff --git a/src/dispatch/common/managers.py b/src/dispatch/common/managers.py
index 719b27a6fe84..7a629e00edb8 100644
--- a/src/dispatch/common/managers.py
+++ b/src/dispatch/common/managers.py
@@ -6,8 +6,9 @@
.. moduleauthor:: Kevin Glisson
"""
+
import logging
-from dispatch.exceptions import InvalidConfiguration
+from dispatch.exceptions import InvalidConfigurationError
logger = logging.getLogger(__name__)
@@ -62,7 +63,7 @@ def all(self):
else:
results.append(cls)
- except InvalidConfiguration as e:
+ except InvalidConfigurationError as e:
logger.warning(f"Plugin '{class_name}' may not work correctly. {e}")
except Exception as e:
diff --git a/src/dispatch/common/utils/cli.py b/src/dispatch/common/utils/cli.py
index 27163964f0b5..0c2341177e13 100644
--- a/src/dispatch/common/utils/cli.py
+++ b/src/dispatch/common/utils/cli.py
@@ -1,158 +1,37 @@
-import os
-import sys
import traceback
import logging
-import pkg_resources
-
-import click
-
-from dispatch.plugins.base import plugins
-from .dynamic_click import params_factory
+from importlib.metadata import entry_points
+from sqlalchemy.exc import SQLAlchemyError
+from dispatch.plugins.base import plugins, register
logger = logging.getLogger(__name__)
-def chunk(l, n):
- """Chunk a list to sublists."""
- for i in range(0, len(l), n):
- yield l[i : i + n]
-
-
# Plugin endpoints should determine authentication # TODO allow them to specify (kglisson)
def install_plugin_events(api):
"""Adds plugin endpoints to the event router."""
for plugin in plugins.all():
if plugin.events:
- api.include_router(plugin.events, prefix="/events", tags=["events"])
+ api.include_router(plugin.events, prefix="/{organization}/events", tags=["events"])
def install_plugins():
- """
- Installs plugins associated with dispatch
- :return:
- """
- from dispatch.plugins.base import register
+ """Installs plugins associated with dispatch"""
+ dispatch_plugins = entry_points().select(group="dispatch.plugins")
- for ep in pkg_resources.iter_entry_points("dispatch.plugins"):
- logger.debug(f"Loading plugin {ep.name}")
+ for ep in dispatch_plugins:
+ logger.info(f"Attempting to load plugin: {ep.name}")
try:
plugin = ep.load()
+ register(plugin)
+ logger.info(f"Successfully loaded plugin: {ep.name}")
+ except SQLAlchemyError:
+ logger.error(
+ "Something went wrong with creating plugin rows, is the database set up correctly?"
+ )
+ logger.error(f"Failed to load plugin {ep.name}:{traceback.format_exc()}")
except KeyError as e:
- logger.warning(f"Failed to load plugin {ep.name}. Reason: {e}")
+ logger.info(f"Failed to load plugin {ep.name} due to missing configuration items. {e}")
except Exception:
- import traceback
-
logger.error(f"Failed to load plugin {ep.name}:{traceback.format_exc()}")
- else:
- register(plugin)
-
-
-def with_plugins(plugin_type: str):
-
- """
- A decorator to register external CLI commands to an instance of
- `click.Group()`.
- Parameters
- ----------
- plugin_type : str
- Plugin type to create subcommands for.
- Returns
- -------
- click.Group()
- """
-
- def decorator(group):
- if not isinstance(group, click.Group):
- raise TypeError("Plugins can only be attached to an instance of click.Group()")
-
- for p in plugins.all(plugin_type=plugin_type) or ():
- # create a new subgroup for each plugin
- name = p.slug.split("-")[0]
- plugin_group = click.Group(name)
- try:
- for command in p.commands:
- command_func = getattr(p, command)
- props = get_plugin_properties(p.schema)
- params = params_factory([props])
- command_obj = click.Command(
- command, params=params, callback=command_func, help=command_func.__doc__
- )
- plugin_group.add_command(command_obj)
- except Exception:
- # Catch this so a busted plugin doesn't take down the CLI.
- # Handled by registering a dummy command that does nothing
- # other than explain the error.
- plugin_group.add_command(BrokenCommand(p.slug, plugin_type))
-
- group.add_command(plugin_group)
- return group
-
- return decorator
-
-
-class BrokenCommand(click.Command):
-
- """
- Rather than completely crash the CLI when a broken plugin is loaded, this
- class provides a modified help message informing the user that the plugin is
- broken and they should contact the owner. If the user executes the plugin
- or specifies `--help` a traceback is reported showing the exception the
- plugin loader encountered.
- """
-
- def __init__(self, name, plugin_type):
-
- """
- Define the special help messages after instantiating a `click.Command()`.
- """
-
- click.Command.__init__(self, name)
-
- util_name = os.path.basename(sys.argv and sys.argv[0] or __file__)
-
- if os.environ.get("CLICK_PLUGINS_HONESTLY"): # pragma no cover
- icon = "\U0001F4A9"
- else:
- icon = "\u2020"
-
- self.help = (
- f"\nWarning: plugin could not be loaded. Contact "
- f"its author for help.\n\n\b\n {traceback.format_exc()}"
- )
- self.short_help = f"{icon} Warning: could not load plugin. See `{util_name} {plugin_type} {self.name} --help`."
-
- def invoke(self, ctx):
-
- """
- Print the traceback instead of doing nothing.
- """
-
- click.echo(self.help, color=ctx.color)
- ctx.exit(1)
-
- def parse_args(self, ctx, args):
- return args
-
-
-def get_plugin_properties(json_schema):
- for _, v in json_schema["definitions"].items():
- return v["properties"]
-
-
-def add_plugins_args(f):
- """Adds installed plugin options."""
- schemas = []
- if isinstance(f, click.Command):
- for p in plugins.all():
- schemas.append(get_plugin_properties(p.schema))
- f.params.extend(params_factory(schemas))
- else:
- if not hasattr(f, "__click_params__"):
- f.__click_params__ = []
-
- for p in plugins.all():
- schemas.append(get_plugin_properties(p.schema))
- f.__click_params__.extend(params_factory(schemas))
-
- return f
diff --git a/src/dispatch/common/utils/dynamic_click.py b/src/dispatch/common/utils/dynamic_click.py
deleted file mode 100644
index ae2f3d30f28e..000000000000
--- a/src/dispatch/common/utils/dynamic_click.py
+++ /dev/null
@@ -1,122 +0,0 @@
-import json
-from functools import partial
-from typing import List
-
-import click
-
-
-from .json_schema import COMPLEX_TYPES, json_schema_to_click_type, handle_oneof
-
-
-CORE_COMMANDS = {
- "required": "'{}' required schema",
- "schema": "'{}' full schema",
- "metadata": "'{}' metadata",
- "defaults": "'{}' default values",
-}
-
-
-# TODO figure out how to validate across opts
-def validate_schema_callback(ctx, param, value):
- """Ensures options passed fulfill what plugins are expecting."""
- return value
-
-
-def params_factory(schemas: List[dict]) -> list:
- """
- Generates list of :class:`click.Option` based on a JSON schema
-
- :param schemas: JSON schemas to operate on
- :return: Lists of created :class:`click.Option` object to be added to a :class:`click.Command`
- """
- params = []
- unique_decls = []
- for schema in schemas:
- for prpty, prpty_schema in schema.items():
- multiple = False
- choices = None
-
- if any(char in prpty for char in ["@"]):
- continue
-
- if prpty_schema.get("type") in COMPLEX_TYPES:
- continue
-
- if prpty_schema.get("duplicate"):
- continue
-
- elif not prpty_schema.get("oneOf"):
- click_type, description, choices = json_schema_to_click_type(prpty_schema)
- else:
- click_type, multiple, description = handle_oneof(prpty_schema["oneOf"])
- # Not all oneOf schema can be handled by click
- if not click_type:
- continue
-
- # Convert bool values into flags
- if click_type == click.BOOL:
- param_decls = [get_flag_param_decals_from_bool(prpty)]
- click_type = None
- else:
- param_decls = [get_param_decals_from_name(prpty)]
-
- if description:
- description = description.capitalize() + "."
- default = prpty_schema.get("default")
-
- if default:
- description += f" [Default: {default}]"
-
- if multiple:
- if not description.endswith("."):
- description += "."
- description += " Multiple usages of this option are allowed"
-
- param_decls = [x for x in param_decls if x not in unique_decls]
- if not param_decls:
- continue
-
- unique_decls += param_decls
- option = partial(
- click.Option,
- param_decls=param_decls,
- help=description,
- default=prpty_schema.get("default"),
- callback=validate_schema_callback,
- multiple=multiple,
- )
-
- if choices:
- option = option(type=choices)
- elif click_type:
- option = option(type=click_type)
- else:
- option = option()
-
- params.append(option)
- return params
-
-
-def func_factory(p, method: str) -> callable:
- """
- Dynamically generates callback commands to correlate to provider public methods
- """
-
- def callback(pretty: bool = False):
- res = getattr(p, method)
- dump = partial(json.dumps, indent=4) if pretty else partial(json.dumps)
- click.echo(dump(res))
-
- return callback
-
-
-def get_param_decals_from_name(option_name: str) -> str:
- """Converts a name to a param name"""
- name = option_name.replace("_", "-")
- return f"--{name}"
-
-
-def get_flag_param_decals_from_bool(option_name: str) -> str:
- """Return a '--do/not-do' style flag param"""
- name = option_name.replace("_", "-")
- return f"--{name}/--no-{name}"
diff --git a/src/dispatch/common/utils/json_schema.py b/src/dispatch/common/utils/json_schema.py
deleted file mode 100644
index 8fd5c02887fb..000000000000
--- a/src/dispatch/common/utils/json_schema.py
+++ /dev/null
@@ -1,53 +0,0 @@
-import click
-
-SCHEMA_BASE_MAP = {
- "string": click.STRING,
- "integer": click.INT,
- "number": click.FLOAT,
- "boolean": click.BOOL,
-}
-COMPLEX_TYPES = ["object", "array"]
-
-
-def handle_oneof(oneof_schema: list) -> tuple:
- """
- Custom handle of `oneOf` JSON schema validator. Tried to match primitive type and see if it should be allowed
- to be passed multiple timns into a command
-
- :param oneof_schema: `oneOf` JSON schema
- :return: Tuple of :class:`click.ParamType`, ``multiple`` flag and ``description`` of option
- """
- oneof_dict = {schema["type"]: schema for schema in oneof_schema}
- click_type = None
- multiple = False
- description = None
- for key, value in oneof_dict.items():
- if key == "array":
- continue
- elif key in SCHEMA_BASE_MAP:
- if oneof_dict.get("array") and oneof_dict["array"]["items"]["type"] == key:
- multiple = True
- # Found a match to a primitive type
- click_type = SCHEMA_BASE_MAP[key]
- description = value.get("title")
- break
- return click_type, multiple, description
-
-
-def json_schema_to_click_type(schema: dict) -> tuple:
- """
- A generic handler of a single property JSON schema to :class:`click.ParamType` converter
-
- :param schema: JSON schema property to operate on
- :return: Tuple of :class:`click.ParamType`, `description`` of option and optionally a :class:`click.Choice`
- if the allowed values are a closed list (JSON schema ``enum``)
- """
- choices = None
- if isinstance(schema["type"], list):
- if "string" in schema["type"]:
- schema["type"] = "string"
- click_type = SCHEMA_BASE_MAP[schema["type"]]
- description = schema.get("title")
- if schema.get("enum"):
- choices = click.Choice(schema["enum"])
- return click_type, description, choices
diff --git a/src/dispatch/common/utils/views.py b/src/dispatch/common/utils/views.py
new file mode 100644
index 000000000000..6eb684cd3428
--- /dev/null
+++ b/src/dispatch/common/utils/views.py
@@ -0,0 +1,17 @@
+def create_pydantic_include(include):
+ """Creates pydantic include sets based on dotted notation."""
+ include_sets = {}
+ for i in include:
+ keyset = None
+ for key in reversed(i.split(".")):
+ if keyset:
+ if key.endswith("[]"):
+ key = key.strip("[]")
+ keyset = {key: {"__all__": keyset}}
+ else:
+ keyset = {key: keyset}
+ else:
+ keyset = {key: ...}
+ include_sets.update(keyset)
+
+ return include_sets
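The new helper above can be exercised standalone. This sketch copies the function body for illustration and shows how `[]` in a path expands to pydantic's `__all__` marker for list elements:

```python
def create_pydantic_include(include):
    """Build nested pydantic include sets from dotted notation."""
    include_sets = {}
    for i in include:
        keyset = None
        # Walk each path right-to-left, wrapping the inner set at every step.
        for key in reversed(i.split(".")):
            if keyset:
                if key.endswith("[]"):
                    key = key.strip("[]")
                    keyset = {key: {"__all__": keyset}}
                else:
                    keyset = {key: keyset}
            else:
                keyset = {key: ...}
        include_sets.update(keyset)
    return include_sets

# "items[].id" applies the include to every element of the "items" list
print(create_pydantic_include(["incident.name", "items[].id"]))
# {'incident': {'name': Ellipsis}, 'items': {'__all__': {'id': Ellipsis}}}
```

One observable limitation: because each path is merged with a top-level `dict.update`, two paths sharing a prefix (e.g. `a.b` and `a.c`) overwrite each other rather than merging.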
diff --git a/src/dispatch/conference/flows.py b/src/dispatch/conference/flows.py
new file mode 100644
index 000000000000..dcb039ac93a1
--- /dev/null
+++ b/src/dispatch/conference/flows.py
@@ -0,0 +1,67 @@
+import logging
+
+from dispatch.database.core import SessionLocal
+from dispatch.event import service as event_service
+from dispatch.incident.models import Incident
+from dispatch.plugin import service as plugin_service
+
+from .models import ConferenceCreate
+from .service import create
+
+log = logging.getLogger(__name__)
+
+
+def create_conference(incident: Incident, participants: list[str], db_session: SessionLocal):
+ """Creates a conference room."""
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="conference"
+ )
+ if not plugin:
+ log.warning("Conference room not created. No conference plugin enabled.")
+ return
+
+ # we create the external conference room
+ try:
+ external_conference = plugin.instance.create(
+ incident.name, title=incident.title, participants=participants
+ )
+ except Exception as e:
+ event_service.log_incident_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"Creating the incident conference room failed. Reason: {e}",
+ incident_id=incident.id,
+ )
+ log.exception(e)
+ return
+
+ if not external_conference:
+ log.error(f"Conference not created. Plugin {plugin.plugin.slug} encountered an error.")
+ return
+
+ external_conference.update(
+ {"resource_type": plugin.plugin.slug, "resource_id": external_conference["id"]}
+ )
+
+ # we create the internal conference room
+ conference_in = ConferenceCreate(
+ resource_id=external_conference["resource_id"],
+ resource_type=external_conference["resource_type"],
+ weblink=external_conference["weblink"],
+ conference_id=external_conference["id"],
+ conference_challenge=external_conference["challenge"],
+ )
+ conference = create(conference_in=conference_in, db_session=db_session)
+ incident.conference = conference
+
+ db_session.add(incident)
+ db_session.commit()
+
+ event_service.log_incident_event(
+ db_session=db_session,
+ source=plugin.plugin.title,
+ description="Incident conference created",
+ incident_id=incident.id,
+ )
+
+ return conference
diff --git a/src/dispatch/conference/models.py b/src/dispatch/conference/models.py
index 751a340ab703..60747d26b24e 100644
--- a/src/dispatch/conference/models.py
+++ b/src/dispatch/conference/models.py
@@ -1,49 +1,54 @@
-from typing import Optional
+"""Models for conference resources in the Dispatch application."""
-from pydantic import validator
-from sqlalchemy import Column, Integer, String
+from jinja2 import Template
-from dispatch.database import Base
-from dispatch.messaging import INCIDENT_CONFERENCE, render_message_template
-from dispatch.models import DispatchBase, ResourceMixin
+from pydantic import field_validator, ValidationInfo
+from sqlalchemy import Column, Integer, String, ForeignKey
+
+from dispatch.database.core import Base
+from dispatch.messaging.strings import INCIDENT_CONFERENCE_DESCRIPTION
+from dispatch.models import ResourceBase, ResourceMixin
class Conference(Base, ResourceMixin):
+ """SQLAlchemy model for conference resources."""
+
id = Column(Integer, primary_key=True)
conference_id = Column(String)
conference_challenge = Column(String, nullable=False, server_default="N/A")
+ incident_id = Column(Integer, ForeignKey("incident.id", ondelete="CASCADE"))
# Pydantic models...
-class ConferenceBase(DispatchBase):
- resource_id: str
- resource_type: str
- weblink: str
- conference_id: str
- conference_challenge: str
+class ConferenceBase(ResourceBase):
+ """Base Pydantic model for conference resources."""
+
+ conference_id: str | None = None
+ conference_challenge: str | None = None
class ConferenceCreate(ConferenceBase):
+ """Pydantic model for creating a conference resource."""
+
pass
class ConferenceUpdate(ConferenceBase):
+ """Pydantic model for updating a conference resource."""
+
pass
class ConferenceRead(ConferenceBase):
- weblink: str
- conference_challenge: str
- description: Optional[str]
-
- @validator("description", pre=True, always=True)
- def set_description(cls, v, values):
- """Sets the description"""
- description = render_message_template(
- [INCIDENT_CONFERENCE], conference_challenge=values["conference_challenge"]
- )[0]["text"]
- return description
-
-
-class ConferenceNested(ConferenceBase):
- pass
+ """Pydantic model for reading a conference resource."""
+
+ description: str | None = None
+
+ @field_validator("description", mode="before")
+ @classmethod
+ def set_description(cls, v, info: ValidationInfo):
+ """Sets the description using a Jinja2 template and the conference challenge."""
+ conference_challenge = info.data.get("conference_challenge")
+ return Template(INCIDENT_CONFERENCE_DESCRIPTION).render(
+ conference_challenge=conference_challenge
+ )
diff --git a/src/dispatch/conference/service.py b/src/dispatch/conference/service.py
index aa4d5e8e263a..a65b8eb2591c 100644
--- a/src/dispatch/conference/service.py
+++ b/src/dispatch/conference/service.py
@@ -1,44 +1,30 @@
-from typing import Optional
from .models import Conference, ConferenceCreate
-def get(*, db_session, conference_id: int) -> Optional[Conference]:
+def get(*, db_session, conference_id: int) -> Conference | None:
+ """Get a conference by its id."""
return db_session.query(Conference).filter(Conference.id == conference_id).one()
-def get_by_resource_id(*, db_session, resource_id: str) -> Optional[Conference]:
+def get_by_resource_id(*, db_session, resource_id: str) -> Conference | None:
+ """Get a conference by its resource id."""
return db_session.query(Conference).filter(Conference.resource_id == resource_id).one_or_none()
-def get_by_resource_type(*, db_session, resource_type: str) -> Optional[Conference]:
- return (
- db_session.query(Conference).filter(Conference.resource_type == resource_type).one_or_none()
- )
-
-
-def get_by_conference_id(db_session, conference_id: str) -> Optional[Conference]:
- return (
- db_session.query(Conference).filter(Conference.conference_id == conference_id).one_or_none()
- )
-
-
-def get_by_incident_id(*, db_session, incident_id: str) -> Optional[Conference]:
- return (
- db_session.query(Conference)
- .filter(Conference.incident_id == incident_id)
- .one()
- )
+def get_by_incident_id(*, db_session, incident_id: str) -> Conference | None:
+ """Get a conference by its associated incident id."""
+ return db_session.query(Conference).filter(Conference.incident_id == incident_id).one()
def get_all(*, db_session):
+ """Get all conferences."""
return db_session.query(Conference)
def create(*, db_session, conference_in: ConferenceCreate) -> Conference:
- contact = Conference(**conference_in.dict())
- db_session.add(contact)
+ """Create a new conference."""
+ conference = Conference(**conference_in.dict())
+ db_session.add(conference)
db_session.commit()
- db_session.flush(contact)
-
- return contact
+ return conference
diff --git a/src/dispatch/config.py b/src/dispatch/config.py
index 1d171c61f997..754ae727be4b 100644
--- a/src/dispatch/config.py
+++ b/src/dispatch/config.py
@@ -1,12 +1,37 @@
+import base64
import logging
import os
-import base64
+from urllib import parse
+from pydantic import BaseModel
from starlette.config import Config
from starlette.datastructures import CommaSeparatedStrings
-# if we have metatron available to us, lets use it to decrypt our secrets in memory
-try:
+log = logging.getLogger(__name__)
+
+
+class BaseConfigurationModel(BaseModel):
+ pass
+
+
+def get_env_tags(tag_list: list[str]) -> dict:
+ """Create dictionary of available env tags."""
+ tags = {}
+ for t in tag_list:
+ tag_key, env_key = t.split(":")
+
+ env_value = os.environ.get(env_key)
+
+ if env_value:
+ tags.update({tag_key: env_value})
+
+ return tags
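The new `get_env_tags` helper maps `"tag_key:ENV_VAR"` entries to a tag dictionary, silently skipping variables that are unset. A standalone sketch (the environment variable names here are examples):

```python
import os


def get_env_tags(tag_list):
    """Create a dictionary of available env tags from "tag_key:ENV_VAR" entries."""
    tags = {}
    for t in tag_list:
        tag_key, env_key = t.split(":")
        env_value = os.environ.get(env_key)
        if env_value:
            tags.update({tag_key: env_value})
    return tags


os.environ["EXAMPLE_REGION"] = "us-west-2"
# EXAMPLE_UNSET_ZONE is not set, so the "zone" tag is dropped.
print(get_env_tags(["region:EXAMPLE_REGION", "zone:EXAMPLE_UNSET_ZONE"]))
# {'region': 'us-west-2'}
```

In the config module this is fed from `ENV_TAGS` cast as `CommaSeparatedStrings`, e.g. `ENV_TAGS="region:AWS_REGION,zone:AWS_ZONE"`.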
+
+
+config = Config(".env")
+
+SECRET_PROVIDER = config("SECRET_PROVIDER", default=None)
+if SECRET_PROVIDER == "metatron-secret":
import metatron.decrypt
class Secret:
@@ -30,166 +55,181 @@ def __repr__(self) -> str:
def __str__(self) -> str:
return self._decrypted_value
+elif SECRET_PROVIDER == "kms-secret":
+ import boto3
-except Exception:
- # Let's see if we have boto3 for KMS available to us, let's use it to decrypt our secrets in memory
- try:
- import boto3
-
- class Secret:
- """
- Holds a string value that should not be revealed in tracebacks etc.
- You should cast the value to `str` at the point it is required.
- """
-
- def __init__(self, value: str):
- self._value = value
- self._decrypted_value = (
- boto3.client("kms")
- .decrypt(CiphertextBlob=base64.b64decode(value))["Plaintext"]
- .decode("utf-8")
- )
+ class Secret:
+ """
+ Holds a string value that should not be revealed in tracebacks etc.
+ You should cast the value to `str` at the point it is required.
+ """
- def __repr__(self) -> str:
- class_name = self.__class__.__name__
- return f"{class_name}('**********')"
+ def __init__(self, value: str):
+ self._value = value
+ self._decrypted_value = (
+ boto3.client("kms")
+ .decrypt(CiphertextBlob=base64.b64decode(value))["Plaintext"]
+ .decode("utf-8")
+ )
- def __str__(self) -> str:
- return self._decrypted_value
+ def __repr__(self) -> str:
+ class_name = self.__class__.__name__
+ return f"{class_name}('**********')"
- except Exception:
- from starlette.datastructures import Secret
+ def __str__(self) -> str:
+ return self._decrypted_value
+else:
+ from starlette.datastructures import Secret
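All three `Secret` variants (Metatron, KMS, and the starlette fallback) share the same contract: `repr()` masks the value so it cannot leak into tracebacks or logs, while an explicit `str()` cast reveals it. A dependency-free sketch of that contract, with the provider-specific decryption step stubbed out:

```python
class Secret:
    """Holds a string value that should not be revealed in tracebacks etc."""

    def __init__(self, value: str):
        self._value = value
        # Real providers decrypt here (Metatron or KMS); this sketch
        # stores the value as-is for illustration.
        self._decrypted_value = value

    def __repr__(self) -> str:
        return f"{self.__class__.__name__}('**********')"

    def __str__(self) -> str:
        return self._decrypted_value


s = Secret("hunter2")
print(repr(s))  # Secret('**********')
print(str(s))   # hunter2
```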
-config = Config(".env")
LOG_LEVEL = config("LOG_LEVEL", default=logging.WARNING)
ENV = config("ENV", default="local")
-DISPATCH_UI_URL = config("DISPATCH_UI_URL")
-DISPATCH_HELP_EMAIL = config("DISPATCH_HELP_EMAIL")
-DISPATCH_HELP_SLACK_CHANNEL = config("DISPATCH_HELP_SLACK_CHANNEL")
+ENV_TAG_LIST = config("ENV_TAGS", cast=CommaSeparatedStrings, default="")
+ENV_TAGS = get_env_tags(ENV_TAG_LIST)
+
+DISPATCH_UI_URL = config("DISPATCH_UI_URL", default="http://localhost:8080")
+DISPATCH_ENCRYPTION_KEY = config("DISPATCH_ENCRYPTION_KEY", cast=Secret)
# authentication
-DISPATCH_AUTHENTICATION_PROVIDER_SLUG = config(
- "DISPATCH_AUTHENTICATION_PROVIDER_SLUG", default="dispatch-auth-provider-pkce"
+VITE_DISPATCH_AUTH_REGISTRATION_ENABLED = config(
+ "VITE_DISPATCH_AUTH_REGISTRATION_ENABLED", default="true"
)
-VUE_APP_DISPATCH_AUTHENTICATION_PROVIDER_SLUG = DISPATCH_AUTHENTICATION_PROVIDER_SLUG
+DISPATCH_AUTH_REGISTRATION_ENABLED = VITE_DISPATCH_AUTH_REGISTRATION_ENABLED != "false"
-DISPATCH_AUTHENTICATION_DEFAULT_USER = config(
- "DISPATCH_AUTHENTICATION_DEFAULT_USER", default="dispatch@example.com"
-)
-DISPATCH_AUTHENTICATION_PROVIDER_PKCE_JWKS = config(
- "DISPATCH_AUTHENTICATION_PROVIDER_PKCE_JWKS", default=None
+DISPATCH_AUTHENTICATION_PROVIDER_SLUG = config(
+ "DISPATCH_AUTHENTICATION_PROVIDER_SLUG", default="dispatch-auth-provider-basic"
)
-VUE_APP_DISPATCH_AUTHENTICATION_PROVIDER_PKCE_OPEN_ID_CONNECT_URL = config(
- "VUE_APP_DISPATCH_AUTHENTICATION_PROVIDER_PKCE_OPEN_ID_CONNECT_URL", default=None
+
+MJML_PATH = config(
+ "MJML_PATH",
+ default=f"{os.path.dirname(os.path.realpath(__file__))}/static/dispatch/node_modules/.bin",
)
-VUE_APP_DISPATCH_AUTHENTICATION_PROVIDER_PKCE_CLIENT_ID = config(
- "VUE_APP_DISPATCH_AUTHENTICATION_PROVIDER_PKCE_CLIENT_ID", default=None
+DISPATCH_MARKDOWN_IN_INCIDENT_DESC = config(
+ "DISPATCH_MARKDOWN_IN_INCIDENT_DESC", cast=bool, default=False
)
+DISPATCH_ESCAPE_HTML = config("DISPATCH_ESCAPE_HTML", cast=bool, default=None)
+if DISPATCH_ESCAPE_HTML and DISPATCH_MARKDOWN_IN_INCIDENT_DESC:
+ log.warning(
+ "HTML escape and Markdown are both explicitly enabled, this may cause unexpected notification markup."
+ )
+elif DISPATCH_ESCAPE_HTML is None and DISPATCH_MARKDOWN_IN_INCIDENT_DESC:
+ log.info("Disabling HTML escaping because Markdown was enabled explicitly.")
+ DISPATCH_ESCAPE_HTML = False
-# static files
-DEFAULT_STATIC_DIR = os.path.join(
- os.path.abspath(os.path.dirname(__file__)), "static/dispatch/dist"
-)
-STATIC_DIR = config("STATIC_DIR", default=DEFAULT_STATIC_DIR)
+DISPATCH_JWT_AUDIENCE = config("DISPATCH_JWT_AUDIENCE", default=None)
+DISPATCH_JWT_EMAIL_OVERRIDE = config("DISPATCH_JWT_EMAIL_OVERRIDE", default=None)
-# metrics
-METRIC_PROVIDERS = config("METRIC_PROVIDERS", cast=CommaSeparatedStrings, default="")
+if DISPATCH_AUTHENTICATION_PROVIDER_SLUG == "dispatch-auth-provider-pkce":
+ if not DISPATCH_JWT_AUDIENCE:
+ log.warn("No JWT Audience specified. This is required for IdPs like Okta")
+ if not DISPATCH_JWT_EMAIL_OVERRIDE:
+ log.warn("No JWT Email Override specified. 'email' is expected in the idtoken.")
-# sentry middleware
-SENTRY_DSN = config("SENTRY_DSN", cast=Secret, default=None)
+DISPATCH_JWT_SECRET = config("DISPATCH_JWT_SECRET", default=None)
+DISPATCH_JWT_ALG = config("DISPATCH_JWT_ALG", default="HS256")
+DISPATCH_JWT_EXP = config("DISPATCH_JWT_EXP", cast=int, default=86400) # Seconds
-# database
-DATABASE_HOSTNAME = config("DATABASE_HOSTNAME")
-DATABASE_CREDENTIALS = config("DATABASE_CREDENTIALS", cast=Secret)
-DATABASE_NAME = config("DATABASE_NAME", default="dispatch")
-DATABASE_PORT = config("DATABASE_PORT", default="5432")
-SQLALCHEMY_DATABASE_URI = f"postgresql+psycopg2://{DATABASE_CREDENTIALS}@{DATABASE_HOSTNAME}:{DATABASE_PORT}/{DATABASE_NAME}"
+if DISPATCH_AUTHENTICATION_PROVIDER_SLUG == "dispatch-auth-provider-basic":
+ if not DISPATCH_JWT_SECRET:
+ log.warn("No JWT secret specified, this is required if you are using basic authentication.")
-# incident plugins
-INCIDENT_PLUGIN_CONTACT_SLUG = config("INCIDENT_PLUGIN_CONTACT_SLUG", default="slack-contact")
-INCIDENT_PLUGIN_CONVERSATION_SLUG = config(
- "INCIDENT_PLUGIN_CONVERSATION_SLUG", default="slack-conversation"
-)
-INCIDENT_PLUGIN_DOCUMENT_SLUG = config(
- "INCIDENT_PLUGIN_DOCUMENT_SLUG", default="google-docs-document"
-)
-INCIDENT_PLUGIN_DOCUMENT_RESOLVER_SLUG = config(
- "INCIDENT_PLUGIN_DOCUMENT_RESOLVER_SLUG", default="dispatch-document-resolver"
-)
-INCIDENT_PLUGIN_EMAIL_SLUG = config(
- "INCIDENT_PLUGIN_EMAIL_SLUG", default="google-gmail-conversation"
-)
-INCIDENT_PLUGIN_GROUP_SLUG = config(
- "INCIDENT_PLUGIN_GROUP_SLUG", default="google-group-participant-group"
-)
-INCIDENT_PLUGIN_PARTICIPANT_SLUG = config(
- "INCIDENT_PLUGIN_PARTICIPANT_SLUG", default="dispatch-participants"
-)
-INCIDENT_PLUGIN_STORAGE_SLUG = config(
- "INCIDENT_PLUGIN_STORAGE_SLUG", default="google-drive-storage"
+DISPATCH_AUTHENTICATION_DEFAULT_USER = config(
+ "DISPATCH_AUTHENTICATION_DEFAULT_USER", default="dispatch@example.com"
)
-INCIDENT_PLUGIN_CONFERENCE_SLUG = config(
- "INCIDENT_PLUGIN_CONFERENCE_SLUG", default="google-calendar-conference"
+DISPATCH_AUTHENTICATION_PROVIDER_PKCE_JWKS = config(
+ "DISPATCH_AUTHENTICATION_PROVIDER_PKCE_JWKS", default=None
)
-INCIDENT_PLUGIN_TICKET_SLUG = config("INCIDENT_PLUGIN_TICKET_SLUG", default="jira-ticket")
-INCIDENT_PLUGIN_TASK_SLUG = config("INCIDENT_PLUGIN_TASK_SLUG", default="google-drive-task")
-# incident resources
-INCIDENT_CONVERSATION_COMMANDS_REFERENCE_DOCUMENT_ID = config(
- "INCIDENT_CONVERSATION_COMMANDS_REFERENCE_DOCUMENT_ID"
-)
-INCIDENT_DOCUMENT_INVESTIGATION_SHEET_ID = config("INCIDENT_DOCUMENT_INVESTIGATION_SHEET_ID")
-INCIDENT_FAQ_DOCUMENT_ID = config("INCIDENT_FAQ_DOCUMENT_ID")
-INCIDENT_STORAGE_ARCHIVAL_FOLDER_ID = config("INCIDENT_STORAGE_ARCHIVAL_FOLDER_ID")
-INCIDENT_STORAGE_INCIDENT_REVIEW_FILE_ID = config("INCIDENT_STORAGE_INCIDENT_REVIEW_FILE_ID")
-INCIDENT_STORAGE_RESTRICTED = config("INCIDENT_STORAGE_RESTRICTED", cast=bool, default=True)
+DISPATCH_PKCE_DONT_VERIFY_AT_HASH = config("DISPATCH_PKCE_DONT_VERIFY_AT_HASH", default=False)
-INCIDENT_NOTIFICATION_CONVERSATIONS = config(
- "INCIDENT_NOTIFICATION_CONVERSATIONS", cast=CommaSeparatedStrings, default=""
-)
+if DISPATCH_AUTHENTICATION_PROVIDER_SLUG == "dispatch-auth-provider-pkce":
+ if not DISPATCH_AUTHENTICATION_PROVIDER_PKCE_JWKS:
+ log.warn(
+ "No PKCE JWKS url provided, this is required if you are using PKCE authentication."
+ )
-INCIDENT_NOTIFICATION_DISTRIBUTION_LISTS = config(
- "INCIDENT_NOTIFICATION_DISTRIBUTION_LISTS", cast=CommaSeparatedStrings, default=""
+DISPATCH_AUTHENTICATION_PROVIDER_HEADER_NAME = config(
+ "DISPATCH_AUTHENTICATION_PROVIDER_HEADER_NAME", default="remote-user"
)
-INCIDENT_DAILY_SUMMARY_ONCALL_SERVICE_ID = config(
- "INCIDENT_DAILY_SUMMARY_ONCALL_SERVICE_ID", default=None
+DISPATCH_AUTHENTICATION_PROVIDER_AWS_ALB_ARN = config(
+ "DISPATCH_AUTHENTICATION_PROVIDER_AWS_ALB_ARN", default=None
)
-
-INCIDENT_RESOURCE_TACTICAL_GROUP = config(
- "INCIDENT_RESOURCE_TACTICAL_GROUP", default="google-group-participant-tactical-group"
+DISPATCH_AUTHENTICATION_PROVIDER_AWS_ALB_EMAIL_CLAIM = config(
+ "DISPATCH_AUTHENTICATION_PROVIDER_AWS_ALB_EMAIL_CLAIM", default="email"
)
-INCIDENT_RESOURCE_NOTIFICATIONS_GROUP = config(
- "INCIDENT_RESOURCE_NOTIFICATIONS_GROUP", default="google-group-participant-notifications-group"
+DISPATCH_AUTHENTICATION_PROVIDER_AWS_ALB_PUBLIC_KEY_CACHE_SECONDS = config(
+ "DISPATCH_AUTHENTICATION_PROVIDER_AWS_ALB_PUBLIC_KEY_CACHE_SECONDS", cast=int, default=300
)
-INCIDENT_RESOURCE_INVESTIGATION_DOCUMENT = config(
- "INCIDENT_RESOURCE_INVESTIGATION_DOCUMENT", default="google-docs-investigation-document"
-)
-INCIDENT_RESOURCE_INVESTIGATION_SHEET = config(
- "INCIDENT_RESOURCE_INVESTIGATION_SHEET", default="google-docs-investigation-sheet"
-)
-INCIDENT_RESOURCE_INCIDENT_REVIEW_DOCUMENT = config(
- "INCIDENT_RESOURCE_INCIDENT_REVIEW_DOCUMENT", default="google-docs-incident-review-document"
+# sentry middleware
+SENTRY_ENABLED = config("SENTRY_ENABLED", default="")
+SENTRY_DSN = config("SENTRY_DSN", default="")
+SENTRY_APP_KEY = config("SENTRY_APP_KEY", default="")
+SENTRY_TAGS = config("SENTRY_TAGS", default="")
+
+# Frontend configuration
+VITE_DISPATCH_AUTHENTICATION_PROVIDER_SLUG = DISPATCH_AUTHENTICATION_PROVIDER_SLUG
+
+VITE_SENTRY_ENABLED = SENTRY_ENABLED
+VITE_SENTRY_DSN = SENTRY_DSN
+VITE_SENTRY_APP_KEY = SENTRY_APP_KEY
+VITE_SENTRY_TAGS = SENTRY_TAGS
+
+# used by pkce authprovider
+VITE_DISPATCH_AUTHENTICATION_PROVIDER_PKCE_OPEN_ID_CONNECT_URL = config(
+ "VITE_DISPATCH_AUTHENTICATION_PROVIDER_PKCE_OPEN_ID_CONNECT_URL", default=""
)
-INCIDENT_RESOURCE_CONVERSATION_COMMANDS_REFERENCE_DOCUMENT = config(
- "INCIDENT_RESOURCE_CONVERSATION_COMMANDS_REFERENCE_DOCUMENT",
- default="google-docs-conversation-commands-reference-document",
+VITE_DISPATCH_AUTHENTICATION_PROVIDER_PKCE_CLIENT_ID = config(
+ "DISPATCH_AUTHENTICATION_PROVIDER_PKCE_CLIENT_ID", default=""
)
-INCIDENT_RESOURCE_FAQ_DOCUMENT = config(
- "INCIDENT_RESOURCE_FAQ_DOCUMENT", default="google-docs-faq-document"
+VITE_DISPATCH_AUTHENTICATION_PROVIDER_USE_ID_TOKEN = config(
+ "DISPATCH_AUTHENTICATION_PROVIDER_USE_ID_TOKEN", default=""
)
-INCIDENT_RESOURCE_INCIDENT_TASK = config(
- "INCIDENT_RESOURCE_INCIDENT_TASK", default="google-docs-incident-task"
+
+# static files
+DEFAULT_STATIC_DIR = os.path.join(
+ os.path.abspath(os.path.dirname(__file__)), os.path.join("static", "dispatch", "dist")
)
+STATIC_DIR = config("STATIC_DIR", default=DEFAULT_STATIC_DIR)
-INCIDENT_METRIC_FORECAST_REGRESSIONS = config("INCIDENT_METRIC_FORECAST_REGRESSIONS", default=None)
+# metrics
+METRIC_PROVIDERS = config("METRIC_PROVIDERS", cast=CommaSeparatedStrings, default="")
-# Incident Cost Configuration
-ANNUAL_COST_EMPLOYEE = config("ANNUAL_COST_EMPLOYEE", cast=int, default="650000")
-BUSINESS_HOURS_YEAR = config("BUSINESS_HOURS_YEAR", cast=int, default="2080")
+# database
+DATABASE_HOSTNAME = config("DATABASE_HOSTNAME")
+DATABASE_CREDENTIALS = config("DATABASE_CREDENTIALS", cast=Secret)
+# this will support special chars for credentials
+_DATABASE_CREDENTIAL_USER, _DATABASE_CREDENTIAL_PASSWORD = str(DATABASE_CREDENTIALS).split(":")
+_QUOTED_DATABASE_PASSWORD = parse.quote(str(_DATABASE_CREDENTIAL_PASSWORD))
+DATABASE_NAME = config("DATABASE_NAME", default="dispatch")
+DATABASE_PORT = config("DATABASE_PORT", default="5432")
+DATABASE_ENGINE_MAX_OVERFLOW = config("DATABASE_ENGINE_MAX_OVERFLOW", cast=int, default=10)
+# Deal with DB disconnects
+# https://docs.sqlalchemy.org/en/20/core/pooling.html#pool-disconnects
+DATABASE_ENGINE_POOL_PING = config("DATABASE_ENGINE_POOL_PING", default=False)
+DATABASE_ENGINE_POOL_RECYCLE = config("DATABASE_ENGINE_POOL_RECYCLE", cast=int, default=3600)
+DATABASE_ENGINE_POOL_SIZE = config("DATABASE_ENGINE_POOL_SIZE", cast=int, default=20)
+DATABASE_ENGINE_POOL_TIMEOUT = config("DATABASE_ENGINE_POOL_TIMEOUT", cast=int, default=30)
+SQLALCHEMY_DATABASE_URI = f"postgresql+psycopg2://{_DATABASE_CREDENTIAL_USER}:{_QUOTED_DATABASE_PASSWORD}@{DATABASE_HOSTNAME}:{DATABASE_PORT}/{DATABASE_NAME}"
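The credential handling above exists so that special characters in the database password survive URI construction. A small sketch of the same split-and-quote step (hostname and credentials are placeholders):

```python
from urllib import parse

# Credentials arrive as a single "user:password" string; the password may
# contain characters that are illegal in a URL, such as "@" or spaces.
credentials = "dispatch:p@ss w0rd"
user, password = credentials.split(":")
quoted = parse.quote(password)

uri = f"postgresql+psycopg2://{user}:{quoted}@localhost:5432/dispatch"
print(uri)
# postgresql+psycopg2://dispatch:p%40ss%20w0rd@localhost:5432/dispatch
```

Note that the two-way unpack assumes exactly one colon in the credential string; a password containing a literal `:` would raise a `ValueError` here.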
+
+ALEMBIC_CORE_REVISION_PATH = config(
+ "ALEMBIC_CORE_REVISION_PATH",
+ default=f"{os.path.dirname(os.path.realpath(__file__))}/database/revisions/core",
+)
+ALEMBIC_TENANT_REVISION_PATH = config(
+ "ALEMBIC_TENANT_REVISION_PATH",
+ default=f"{os.path.dirname(os.path.realpath(__file__))}/database/revisions/tenant",
+)
+ALEMBIC_INI_PATH = config(
+ "ALEMBIC_INI_PATH",
+ default=f"{os.path.dirname(os.path.realpath(__file__))}/alembic.ini",
+)
+ALEMBIC_MULTI_TENANT_MIGRATION_PATH = config(
+ "ALEMBIC_MULTI_TENANT_MIGRATION_PATH",
+ default=f"{os.path.dirname(os.path.realpath(__file__))}/database/revisions/multi-tenant-migration.sql",
+)
diff --git a/src/dispatch/conversation/enums.py b/src/dispatch/conversation/enums.py
index 61d7c360c08f..7906545515ab 100644
--- a/src/dispatch/conversation/enums.py
+++ b/src/dispatch/conversation/enums.py
@@ -1,18 +1,30 @@
-from enum import Enum
+from dispatch.enums import DispatchEnum
-class ConversationCommands(str, Enum):
- mark_active = "mark-active"
- mark_stable = "mark-stable"
- mark_closed = "mark-closed"
- status_report = "status-report"
- list_tasks = "list-tasks"
- list_participants = "list-participants"
+class ConversationCommands(DispatchEnum):
assign_role = "assign-role"
- edit_incident = "edit-incident"
engage_oncall = "engage-oncall"
- list_resources = "list-resources"
+ executive_report = "executive-report"
+ list_participants = "list-participants"
+ list_tasks = "list-tasks"
+ report_incident = "report-incident"
+ tactical_report = "tactical-report"
+ update_incident = "update-incident"
+ escalate_case = "escalate-case"
-class ConversationButtonActions(str, Enum):
+class ConversationButtonActions(DispatchEnum):
+ feedback_notification_provide = "feedback-notification-provide"
+ case_feedback_notification_provide = "case-feedback-notification-provide"
invite_user = "invite-user"
+ invite_user_case = "invite-user-case"
+ monitor_link = "monitor-link"
+ remind_again = "remind-again"
+ service_feedback = "service-feedback"
+ subscribe_user = "subscribe-user"
+ update_task_status = "update-task-status"
+
+
+class ConversationFilters(DispatchEnum):
+ exclude_bots = "exclude-bots"
+ exclude_channel_join = "exclude-channel-join"
diff --git a/src/dispatch/conversation/flows.py b/src/dispatch/conversation/flows.py
new file mode 100644
index 000000000000..bbc41ad21bf4
--- /dev/null
+++ b/src/dispatch/conversation/flows.py
@@ -0,0 +1,584 @@
+import logging
+
+from sqlalchemy.orm import Session
+
+from dispatch.case.models import Case
+from dispatch.conference.models import Conference
+from dispatch.document.models import Document
+from dispatch.enums import EventType
+from dispatch.event import service as event_service
+from dispatch.incident.models import Incident
+from dispatch.messaging.strings import MessageType
+from dispatch.plugin import service as plugin_service
+from dispatch.plugins.dispatch_slack.case import messages
+from dispatch.project.models import Project
+from dispatch.service.models import Service
+from dispatch.storage.models import Storage
+from dispatch.ticket.models import Ticket
+from dispatch.types import Subject
+from dispatch.utils import deslug_and_capitalize_resource_type
+
+from .models import Conversation, ConversationCreate, ConversationUpdate
+from .service import create, update
+
+log = logging.getLogger(__name__)
+
+
+Resource = Document | Conference | Storage | Ticket
+
+
+def create_case_conversation(
+ case: Case,
+ conversation_target: str,
+ db_session: Session,
+):
+ """Create external communication conversation."""
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=case.project.id, plugin_type="conversation"
+ )
+ if not plugin:
+ log.warning("Conversation not created. No conversation plugin enabled.")
+ return
+
+ if not conversation_target:
+ conversation_target = case.case_type.conversation_target
+
+ conversation = None
+
+ # Do not overwrite a case conversation with one of the same type (thread, channel)
+ if case.conversation:
+ if case.has_channel:
+ log.warning(
+ f"Trying to create case conversation but case {case.id} already has a dedicated channel conversation."
+ )
+ return
+ if case.has_thread and not case.dedicated_channel:
+ log.warning(
+ f"Trying to create case conversation but case {case.id} already has a thread conversation."
+ )
+ return
+
+ # This case is a thread version; we send a new threaded message to the conversation target
+ # for the configured case type
+ if conversation_target and not case.dedicated_channel:
+ try:
+ conversation = plugin.instance.create_threaded(
+ case=case,
+ conversation_id=conversation_target,
+ db_session=db_session,
+ )
+ except Exception as e:
+ # TODO: consistency across exceptions
+ log.exception(e)
+
+ # otherwise, it must be a channel based case.
+ if case.dedicated_channel:
+ try:
+ conversation = plugin.instance.create(
+ name=f"case-{case.name}",
+ )
+ except Exception as e:
+ # TODO: consistency across exceptions
+ log.exception(e)
+
+ if not conversation:
+ log.error(f"Conversation not created. Plugin {plugin.plugin.slug} encountered an error.")
+ return
+
+ conversation.update({"resource_type": plugin.plugin.slug, "resource_id": conversation["id"]})
+
+ if not case.conversation:
+ conversation_in = ConversationCreate(
+ resource_id=conversation["resource_id"],
+ resource_type=conversation["resource_type"],
+ weblink=conversation["weblink"],
+ thread_id=conversation.get("timestamp"),
+ channel_id=conversation["id"],
+ )
+ case.conversation = create(db_session=db_session, conversation_in=conversation_in)
+
+ event_service.log_case_event(
+ db_session=db_session,
+ source=plugin.plugin.title,
+ description="Case conversation created",
+ case_id=case.id,
+ )
+ elif case.conversation.thread_id and case.dedicated_channel:
+ thread_conversation_channel_id = case.conversation.channel_id
+ thread_conversation_thread_id = case.conversation.thread_id
+ thread_conversation_weblink = case.conversation.weblink
+
+ conversation_in = ConversationUpdate(
+ resource_id=conversation.get("resource_id"),
+ resource_type=conversation.get("resource_type"),
+ weblink=conversation.get("weblink"),
+ thread_id=conversation.get("timestamp"),
+ channel_id=conversation.get("id"),
+ )
+
+ update(
+ db_session=db_session, conversation=case.conversation, conversation_in=conversation_in
+ )
+
+ event_service.log_case_event(
+ db_session=db_session,
+ source=plugin.plugin.title,
+ description=f"Case conversation has migrated from thread [{thread_conversation_weblink}] to channel[{case.conversation.weblink}].",
+ case_id=case.id,
+ )
+
+ try:
+ plugin.instance.update_thread(
+ case=case,
+ conversation_id=thread_conversation_channel_id,
+ ts=thread_conversation_thread_id,
+ )
+ except Exception as e:
+ event_service.log_subject_event(
+ subject=case,
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"Updating thread message failed. Reason: {e}",
+ )
+ log.exception(e)
+
+ # Inform users in the case thread that the conversation has migrated to a channel
+ try:
+ plugin.instance.send(
+ thread_conversation_channel_id,
+ "Notify Case conversation migration",
+ [],
+ MessageType.case_notification,
+ blocks=messages.create_case_thread_migration_message(
+ channel_weblink=conversation.get("weblink")
+ ),
+ ts=thread_conversation_thread_id,
+ )
+ except Exception as e:
+ event_service.log_subject_event(
+ subject=case,
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"Failed to send message to original Case thread. Reason: {e}",
+ )
+ log.exception(e)
+
+ # Provide users in the case channel which thread the conversation originated from.
+ try:
+ plugin.instance.send(
+ case.conversation.channel_id,
+ "Maintain Case conversation context",
+ [],
+ MessageType.case_notification,
+ blocks=messages.create_case_channel_migration_message(
+ thread_weblink=thread_conversation_weblink
+ ),
+ )
+ except Exception as e:
+ event_service.log_subject_event(
+ subject=case,
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"Failed to send message to dedicated Case channel. Reason: {e}",
+ )
+ log.exception(e)
+
+ db_session.add(case)
+ db_session.commit()
+
+ return case.conversation
+
+
+def create_incident_conversation(incident: Incident, db_session: Session):
+ """Creates a conversation."""
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="conversation"
+ )
+ if not plugin:
+ log.warning("Conversation not created. No conversation plugin enabled.")
+ return
+
+ # we create the external conversation
+ try:
+ external_conversation = plugin.instance.create(incident.name)
+ except Exception as e:
+ event_service.log_incident_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"Creating the incident conversation failed. Reason: {e}",
+ incident_id=incident.id,
+ )
+ log.exception(e)
+ return
+
+ if not external_conversation:
+ log.error(f"Conversation not created. Plugin {plugin.plugin.slug} encountered an error.")
+ return
+
+ external_conversation.update(
+ {"resource_type": plugin.plugin.slug, "resource_id": external_conversation["id"]}
+ )
+
+ # we create the internal conversation room
+ conversation_in = ConversationCreate(
+ resource_id=external_conversation["resource_id"],
+ resource_type=external_conversation["resource_type"],
+ weblink=external_conversation["weblink"],
+ channel_id=external_conversation["id"],
+ )
+ incident.conversation = create(conversation_in=conversation_in, db_session=db_session)
+
+ db_session.add(incident)
+ db_session.commit()
+
+ event_service.log_incident_event(
+ db_session=db_session,
+ source=plugin.plugin.title,
+ description="Incident conversation created",
+ incident_id=incident.id,
+ )
+
+ return incident.conversation
+
+
+def archive_conversation(subject: Subject, db_session: Session) -> None:
+ """Archives a conversation."""
+ if not subject.conversation:
+ log.warning("Conversation not archived. No conversation available for this subject.")
+ return
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=subject.project.id, plugin_type="conversation"
+ )
+ if not plugin:
+ log.warning("Conversation not archived. No conversation plugin enabled.")
+ return
+
+ try:
+ plugin.instance.archive(subject.conversation.channel_id)
+ except Exception as e:
+ event_service.log_subject_event(
+ subject=subject,
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"Archiving conversation failed. Reason: {e}",
+ )
+ log.exception(e)
+
+
+def unarchive_conversation(subject: Subject, db_session: Session) -> None:
+ """Unarchives a conversation."""
+ if not subject.conversation:
+ log.warning("Conversation not unarchived. No conversation available for this subject.")
+ return
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=subject.project.id, plugin_type="conversation"
+ )
+ if not plugin:
+ log.warning("Conversation not unarchived. No conversation plugin enabled.")
+ return
+
+ try:
+ plugin.instance.unarchive(subject.conversation.channel_id)
+ except Exception as e:
+ event_service.log_subject_event(
+ subject=subject,
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"Unarchiving conversation failed. Reason: {e}",
+ )
+ log.exception(e)
+
+
+def get_topic_text(subject: Subject) -> str:
+ """Returns the topic details based on subject"""
+ if isinstance(subject, Incident):
+ return (
+ f"ℹ️ {subject.commander.individual.name}, {subject.commander.team} | "
+ f"Status: {subject.status} | "
+ f"Type: {subject.incident_type.name} | "
+ f"Severity: {subject.incident_severity.name} | "
+ f"Priority: {subject.incident_priority.name}"
+ )
+ return (
+ f"ℹ️ {subject.assignee.individual.name}, {subject.assignee.team} | "
+ f"Status: {subject.status} | "
+ f"Type: {subject.case_type.name} | "
+ f"Severity: {subject.case_severity.name} | "
+ f"Priority: {subject.case_priority.name}"
+ )
+
+
+def set_conversation_topic(subject: Subject, db_session: Session) -> None:
+ """Sets the conversation topic."""
+ if not subject.conversation:
+ log.warning("Conversation topic not set. No conversation available for this incident/case.")
+ return
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=subject.project.id, plugin_type="conversation"
+ )
+ if not plugin:
+ log.warning("Conversation topic not set. No conversation plugin enabled.")
+ return
+
+ conversation_topic = get_topic_text(subject)
+
+ try:
+ plugin.instance.set_topic(subject.conversation.channel_id, conversation_topic)
+ except Exception as e:
+ event_service.log_subject_event(
+ subject=subject,
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"Setting the incident/case conversation topic failed. Reason: {e}",
+ )
+ log.exception(e)
+
+
+def get_current_oncall_email(project: Project, service: Service, db_session: Session) -> str | None:
+ """Returns the current oncall email for the given service, if the oncall plugin is active."""
+ oncall_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=project.id, plugin_type="oncall"
+ )
+ if not oncall_plugin:
+ log.debug("Unable to send email since oncall plugin is not active.")
+ else:
+ return oncall_plugin.instance.get(service.external_id)
+
+
+def get_description_text(subject: Subject, db_session: Session) -> str | None:
+ """Returns the description details based on the subject"""
+ if not isinstance(subject, Incident):
+ return
+
+ incident_type = subject.incident_type
+ if not incident_type.channel_description:
+ return
+
+ description_service = incident_type.description_service
+ if description_service:
+ oncall_email = get_current_oncall_email(
+ project=subject.project, service=description_service, db_session=db_session
+ )
+ if oncall_email:
+ return incident_type.channel_description.replace("{oncall_email}", oncall_email)
+
+ return incident_type.channel_description
+
+
+def set_conversation_description(subject: Subject, db_session: Session) -> None:
+ """Sets the conversation description."""
+ if not subject.conversation:
+ log.warning("Conversation description not set. No conversation available for this incident/case.")
+ return
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=subject.project.id, plugin_type="conversation"
+ )
+ if not plugin:
+ log.warning("Conversation description not set. No conversation plugin enabled.")
+ return
+
+ conversation_description = get_description_text(subject, db_session)
+ if not conversation_description:
+ return
+
+ try:
+ plugin.instance.set_description(subject.conversation.channel_id, conversation_description)
+ except Exception as e:
+ event_service.log_subject_event(
+ subject=subject,
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"Setting the incident/case conversation description failed. Reason: {e}",
+ )
+ log.exception(e)
+
+
+def add_conversation_bookmark(
+ db_session: Session,
+ subject: Subject,
+ resource: Resource,
+ title: str | None = None,
+):
+ """Adds a conversation bookmark."""
+ if not resource:
+ log.warning("No conversation bookmark added since no resource available for subject.")
+ return
+
+ resource_name = (
+ resource.name.lower()
+ if hasattr(resource, "name")
+ else (
+ deslug_and_capitalize_resource_type(resource.resource_type)
+ if hasattr(resource, "resource_type")
+ else title
+ if title
+ else "untitled resource"
+ )
+ )
+
+ if not subject.conversation:
+ log.warning(f"Conversation bookmark {resource_name} not added. No conversation available.")
+ return
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session,
+ project_id=subject.project.id,
+ plugin_type="conversation",
+ )
+ if not plugin:
+ log.warning(
+ f"Conversation bookmark {resource_name} not added. No conversation plugin enabled."
+ )
+ return
+
+ try:
+ if not title:
+ title = deslug_and_capitalize_resource_type(resource.resource_type)
+ (
+ plugin.instance.add_bookmark(
+ subject.conversation.channel_id,
+ resource.weblink,
+ title=title,
+ )
+ if resource
+ else log.warning(
+ f"{resource_name} bookmark not added. No {resource_name} available for subject.."
+ )
+ )
+ except Exception as e:
+ event_service.log_subject_event(
+ subject=subject,
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"Adding the {resource_name} bookmark failed. Reason: {e}",
+ )
+ log.exception(e)
+
+
+def add_case_participants(
+ case: Case,
+ participant_emails: list[str],
+ db_session: Session,
+):
+ """Adds one or more participants to the case conversation."""
+ if not case.conversation:
+ log.warning(
+ "Case participant(s) not added to conversation. No conversation available for this case."
+ )
+ return
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=case.project.id, plugin_type="conversation"
+ )
+ if not plugin:
+ log.warning(
+ "Case participant(s) not added to conversation. No conversation plugin enabled."
+ )
+ return
+
+ try:
+ if case.has_thread:
+ if case.signal_instances:
+ if case.signal_instances[0].signal.lifecycle == "production":
+ # we only add participants to case threads that originate from signals in production
+ plugin.instance.add_to_thread(
+ case.conversation.channel_id,
+ case.conversation.thread_id,
+ participant_emails,
+ )
+
+ # log event for adding participants
+ event_service.log_case_event(
+ db_session=db_session,
+ source=plugin.plugin.title,
+ description=f"{', '.join(participant_emails)} added to conversation (channel ID: {case.conversation.channel_id}, thread ID: {case.conversation.thread_id})",
+ case_id=case.id,
+ type=EventType.participant_updated,
+ )
+ log.info(f"{', '.join(participant_emails)} added to conversation (channel ID: {case.conversation.channel_id}, thread ID: {case.conversation.thread_id})")
+ elif case.has_channel:
+ plugin.instance.add(case.conversation.channel_id, participant_emails)
+
+ # log event for adding participants
+ event_service.log_case_event(
+ db_session=db_session,
+ source=plugin.plugin.title,
+ description=f"{', '.join(participant_emails)} added to conversation (channel ID: {case.conversation.channel_id})",
+ case_id=case.id,
+ type=EventType.participant_updated,
+ )
+ log.info(f"{', '.join(participant_emails)} added to conversation (channel ID: {case.conversation.channel_id})")
+ except Exception as e:
+ event_service.log_case_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"Adding participant(s) to case conversation failed. Reason: {e}",
+ case_id=case.id,
+ )
+ log.exception(e)
+ raise e
+
+
+def add_incident_participants_to_conversation(
+ incident: Incident,
+ participant_emails: list[str],
+ db_session: Session,
+):
+ """Adds one or more participants to the incident conversation."""
+ if not incident.conversation:
+ log.warning(
+ "Incident participant(s) not added to conversation. No conversation available for this incident."
+ )
+ return
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="conversation"
+ )
+ if not plugin:
+ log.warning(
+ "Incident participant(s) not added to conversation. No conversation plugin enabled."
+ )
+ return
+
+ try:
+ plugin.instance.add(incident.conversation.channel_id, participant_emails)
+ except Exception as e:
+ event_service.log_incident_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"Adding participant(s) to incident conversation failed. Reason: {e}",
+ incident_id=incident.id,
+ )
+ log.exception(e)
+ else:
+ log.info(
+ f"Added participants {participant_emails} to incident {incident.id} successfully."
+ )
+
+
+def delete_conversation(conversation: Conversation, project_id: int, db_session: Session):
+ """
+ Renames and archives an existing conversation. Deleting a conversation
+ requires admin permissions in some SaaS products.
+ """
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=project_id, plugin_type="conversation"
+ )
+ if not plugin:
+ log.warning("Conversation not renamed and archived. No conversation plugin enabled.")
+ return
+
+ try:
+ # we rename the conversation to avoid future naming collisions
+ plugin.instance.rename(
+ conversation.channel_id, f"{conversation.incident.name.lower()}-deleted"
+ )
+ # we archive the conversation
+ plugin.instance.archive(conversation.channel_id)
+ except Exception as e:
+ log.exception(e)
diff --git a/src/dispatch/conversation/messaging.py b/src/dispatch/conversation/messaging.py
new file mode 100644
index 000000000000..2a663d30ae26
--- /dev/null
+++ b/src/dispatch/conversation/messaging.py
@@ -0,0 +1,34 @@
+"""
+.. module: dispatch.conversation.messaging
+ :platform: Unix
+ :copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
+ :license: Apache, see LICENSE for more details.
+"""
+
+import logging
+
+from sqlalchemy.orm import Session
+
+from dispatch.plugin import service as plugin_service
+
+
+log = logging.getLogger(__name__)
+
+
+def send_conversation_message(
+ channel_id: str, project_id: int, user_id: str, message: str, db_session: Session
+):
+ """Sends feedback to the user using an ephemeral message."""
+ blocks = [{"type": "section", "text": {"type": "mrkdwn", "text": message}}]
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=project_id, plugin_type="conversation"
+ )
+
+ if not plugin:
+ log.warning("Conversation message not sent. No conversation plugin enabled.")
+ return
+
+ plugin.instance.send_ephemeral(
+ channel_id, user_id, "Conversation Command Feedback", blocks=blocks
+ )
diff --git a/src/dispatch/conversation/models.py b/src/dispatch/conversation/models.py
index 1593a81fc9e9..e1e93e2b73e1 100644
--- a/src/dispatch/conversation/models.py
+++ b/src/dispatch/conversation/models.py
@@ -1,43 +1,53 @@
-from pydantic import validator
+"""Models for conversation resources in the Dispatch application."""
-from typing import Optional
+from pydantic import field_validator
-from sqlalchemy import Column, String, Integer
+from sqlalchemy import Column, String, Integer, ForeignKey
-from dispatch.database import Base
-from dispatch.messaging import INCIDENT_CONVERSATION_DESCRIPTION
-from dispatch.models import DispatchBase, ResourceMixin
+from dispatch.database.core import Base
+from dispatch.messaging.strings import INCIDENT_CONVERSATION_DESCRIPTION
+from dispatch.models import ResourceBase, ResourceMixin, PrimaryKey
class Conversation(Base, ResourceMixin):
+ """SQLAlchemy model for conversation resources."""
id = Column(Integer, primary_key=True)
channel_id = Column(String)
+ thread_id = Column(String)
+
+ incident_id = Column(Integer, ForeignKey("incident.id", ondelete="CASCADE"))
+ case_id = Column(Integer, ForeignKey("case.id", ondelete="CASCADE"))
# Pydantic models...
-class ConversationBase(DispatchBase):
- resource_id: str
- resource_type: str
- weblink: str
- channel_id: str
+class ConversationBase(ResourceBase):
+ """Base Pydantic model for conversation resources."""
+ channel_id: str | None = None
+ thread_id: str | None = None
class ConversationCreate(ConversationBase):
+ """Pydantic model for creating a conversation resource."""
pass
class ConversationUpdate(ConversationBase):
+ """Pydantic model for updating a conversation resource."""
pass
class ConversationRead(ConversationBase):
- description: Optional[str]
-
- @validator("description", pre=True, always=True)
- def set_description(cls, v):
- """Sets the description"""
+ """Pydantic model for reading a conversation resource."""
+ id: PrimaryKey
+ description: str | None = None
+
+ @field_validator("description", mode="before")
+ @classmethod
+ def set_description(cls, _):
+ """Sets the description for the conversation resource."""
return INCIDENT_CONVERSATION_DESCRIPTION
class ConversationNested(ConversationBase):
+ """Pydantic model for a nested conversation resource."""
pass
diff --git a/src/dispatch/conversation/service.py b/src/dispatch/conversation/service.py
index db8c464542c4..f0e1bae7e249 100644
--- a/src/dispatch/conversation/service.py
+++ b/src/dispatch/conversation/service.py
@@ -1,24 +1,62 @@
-from typing import Optional
-
-from fastapi.encoders import jsonable_encoder
-
+import logging
from .models import Conversation, ConversationCreate, ConversationUpdate
+log = logging.getLogger(__name__)
+
-def get(*, db_session, conversation_id: int) -> Optional[Conversation]:
- """Returns a conversation based on the given conversation id."""
+def get(*, db_session, conversation_id: int) -> Conversation | None:
+ """Gets a conversation by its id."""
return db_session.query(Conversation).filter(Conversation.id == conversation_id).one_or_none()
-def get_by_channel_id(db_session, channel_id: str) -> Optional[Conversation]:
- """Returns a conversation based on the given channel id."""
- return (
- db_session.query(Conversation).filter(Conversation.channel_id == channel_id).one_or_none()
- )
+def get_by_channel_id_ignoring_channel_type(
+ db_session, channel_id: str, thread_id: str | None = None
+) -> Conversation | None:
+ """
+ Gets a conversation by its channel id, ignoring the channel type, and
+ updates the stored channel id if the channel type has changed.
+ """
+ conversation = None
+
+ conversations = db_session.query(Conversation).filter(Conversation.channel_id == channel_id)
+
+ # The code below disambiguates between incident threads, case threads, and incident messages
+ if not thread_id:
+ # assume incident message
+ conversation = conversations.first()
+
+ if not conversation:
+ conversation = conversations.filter(Conversation.thread_id == thread_id).one_or_none()
+
+ if not conversation:
+ # No conversations with that thread_id, check all conversations without thread filter
+ conversation_count = conversations.count()
+ if conversation_count > 1:
+ # this happens when a user posts in the main thread of a triage channel since
+ # there are multiple cases in the channel with that channel_id
+ # so we log a warning and return None
+ log.warning(
+ f"Multiple conversations found for channel_id: {channel_id}, thread_id: {thread_id}"
+ )
+ conversation = None
+ else:
+ conversation = conversations.one_or_none()
+
+ if conversation:
+ if channel_id[0] != conversation.channel_id[0]:
+ # We update the channel id if the channel type has changed (public <-> private)
+ conversation_in = ConversationUpdate(channel_id=channel_id)
+ update(
+ db_session=db_session,
+ conversation=conversation,
+ conversation_in=conversation_in,
+ )
+
+ return conversation
def get_all(*, db_session):
- """Returns all conversations."""
+ """Fetches all conversations."""
return db_session.query(Conversation)
@@ -34,14 +72,13 @@ def update(
*, db_session, conversation: Conversation, conversation_in: ConversationUpdate
) -> Conversation:
"""Updates a conversation."""
- conversation_data = jsonable_encoder(conversation)
- update_data = conversation_in.dict(skip_defaults=True)
+ conversation_data = conversation.dict()
+ update_data = conversation_in.dict(exclude_unset=True)
for field in conversation_data:
if field in update_data:
setattr(conversation, field, update_data[field])
- db_session.add(conversation)
db_session.commit()
return conversation
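The `update` service above switches from `skip_defaults=True` to `exclude_unset=True`, so only fields the caller explicitly set are copied onto the existing record. A minimal sketch of that partial-update pattern, with plain dicts standing in for the SQLAlchemy model and the Pydantic dump:

```python
def apply_partial_update(record: dict, update_data: dict) -> dict:
    """Copy only explicitly-provided fields onto the record, leaving the rest untouched."""
    for field in record:
        if field in update_data:
            record[field] = update_data[field]
    return record
```

With `exclude_unset=True`, a `ConversationUpdate(channel_id="C2")` dumps to `{"channel_id": "C2"}`, so `thread_id` and `weblink` on the stored conversation survive the update.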
diff --git a/tests/utils/__init__.py b/src/dispatch/cost_model/__init__.py
similarity index 100%
rename from tests/utils/__init__.py
rename to src/dispatch/cost_model/__init__.py
diff --git a/src/dispatch/cost_model/models.py b/src/dispatch/cost_model/models.py
new file mode 100644
index 000000000000..69e15f7417f2
--- /dev/null
+++ b/src/dispatch/cost_model/models.py
@@ -0,0 +1,112 @@
+from datetime import datetime
+from pydantic import Field
+from sqlalchemy import (
+ Column,
+ Integer,
+ String,
+ Boolean,
+ ForeignKey,
+ PrimaryKeyConstraint,
+ Table,
+)
+from sqlalchemy.orm import relationship
+from sqlalchemy.sql.schema import UniqueConstraint
+from sqlalchemy_utils import TSVectorType
+
+from dispatch.database.core import Base
+from dispatch.models import (
+ DispatchBase,
+ NameStr,
+ Pagination,
+ PrimaryKey,
+ ProjectMixin,
+ TimeStampMixin,
+)
+from dispatch.plugin.models import PluginEventRead, PluginEvent
+from dispatch.project.models import ProjectRead
+
+assoc_cost_model_activities = Table(
+ "assoc_cost_model_activities",
+ Base.metadata,
+ Column("cost_model_id", Integer, ForeignKey("cost_model.id", ondelete="CASCADE")),
+ Column(
+ "cost_model_activity_id",
+ Integer,
+ ForeignKey("cost_model_activity.id", ondelete="CASCADE"),
+ ),
+ PrimaryKeyConstraint("cost_model_id", "cost_model_activity_id"),
+)
+
+
+# SQLAlchemy Model
+class CostModelActivity(Base):
+ id = Column(Integer, primary_key=True)
+ plugin_event_id = Column(Integer, ForeignKey(PluginEvent.id, ondelete="CASCADE"))
+ plugin_event = relationship(PluginEvent, backref="plugin_event")
+ response_time_seconds = Column(Integer, default=300)
+ enabled = Column(Boolean, default=True)
+
+
+class CostModel(Base, TimeStampMixin, ProjectMixin):
+ __table_args__ = (UniqueConstraint("name", "project_id"),)
+ id = Column(Integer, primary_key=True)
+ name = Column(String)
+ description = Column(String)
+ enabled = Column(Boolean)
+ activities = relationship(
+ "CostModelActivity",
+ secondary=assoc_cost_model_activities,
+ lazy="subquery",
+ backref="cost_model",
+ )
+ search_vector = Column(
+ TSVectorType("name", "description", weights={"name": "A", "description": "B"})
+ )
+
+
+# Pydantic Models
+class CostModelActivityBase(DispatchBase):
+ """Base class for cost model activity resources"""
+
+ plugin_event: PluginEventRead
+ response_time_seconds: int | None = 300
+ enabled: bool | None = Field(True, nullable=True)
+
+
+class CostModelActivityCreate(CostModelActivityBase):
+ id: PrimaryKey | None = None
+
+
+class CostModelActivityRead(CostModelActivityBase):
+ id: PrimaryKey
+
+
+class CostModelActivityUpdate(CostModelActivityBase):
+ id: PrimaryKey | None = None
+
+
+class CostModelBase(DispatchBase):
+ name: NameStr
+ description: str | None = None
+ enabled: bool | None = Field(True, nullable=True)
+ created_at: datetime | None = None
+ updated_at: datetime | None = None
+ project: ProjectRead
+
+
+class CostModelUpdate(CostModelBase):
+ id: PrimaryKey
+ activities: list[CostModelActivityUpdate] | None = []
+
+
+class CostModelCreate(CostModelBase):
+ activities: list[CostModelActivityCreate] | None = []
+
+
+class CostModelRead(CostModelBase):
+ id: PrimaryKey
+ activities: list[CostModelActivityRead] | None = []
+
+
+class CostModelPagination(Pagination):
+ items: list[CostModelRead] = []
diff --git a/src/dispatch/cost_model/service.py b/src/dispatch/cost_model/service.py
new file mode 100644
index 000000000000..50568f68b79e
--- /dev/null
+++ b/src/dispatch/cost_model/service.py
@@ -0,0 +1,207 @@
+from datetime import datetime
+import logging
+
+from .models import (
+ CostModel,
+ CostModelCreate,
+ CostModelRead,
+ CostModelUpdate,
+ CostModelActivity,
+ CostModelActivityCreate,
+ CostModelActivityUpdate,
+)
+from dispatch.cost_model import service as cost_model_service
+from dispatch.plugin import service as plugin_service
+from dispatch.project import service as project_service
+
+log = logging.getLogger(__name__)
+
+
+def has_unique_plugin_event(cost_model_in: CostModelRead) -> bool:
+ seen = set()
+ for activity in cost_model_in.activities:
+ if activity.plugin_event.id in seen:
+ log.warning(
+ f"Duplicate plugin event id detected. Please ensure all plugin events are unique for each cost model. Duplicate id: {activity.plugin_event.id}"
+ )
+ return False
+ seen.add(activity.plugin_event.id)
+ return True
+
+
+def get_default(*, db_session, project_id: int) -> CostModel:
+ """Returns the default cost model."""
+ return (
+ db_session.query(CostModel)
+ .filter(CostModel.project_id == project_id)
+ .order_by(CostModel.created_at.desc())
+ .first()
+ )
+
+
+def get_all(*, db_session, project_id: int) -> list[CostModel]:
+ """Returns all cost models."""
+ if project_id:
+ return db_session.query(CostModel).filter(CostModel.project_id == project_id)
+ return db_session.query(CostModel)
+
+
+def get_cost_model_activity_by_id(*, db_session, cost_model_activity_id: int) -> CostModelActivity:
+ """Returns a cost model activity based on the given cost model activity id."""
+ return (
+ db_session.query(CostModelActivity)
+ .filter(CostModelActivity.id == cost_model_activity_id)
+ .one()
+ )
+
+
+def delete_cost_model_activity(*, db_session, cost_model_activity_id: int):
+ """Deletes a cost model activity."""
+ cost_model_activity = get_cost_model_activity_by_id(
+ db_session=db_session, cost_model_activity_id=cost_model_activity_id
+ )
+ db_session.delete(cost_model_activity)
+ db_session.commit()
+
+
+def update_cost_model_activity(*, db_session, cost_model_activity_in: CostModelActivityUpdate):
+ """Updates a cost model activity."""
+ cost_model_activity = get_cost_model_activity_by_id(
+ db_session=db_session, cost_model_activity_id=cost_model_activity_in.id
+ )
+
+ cost_model_activity.response_time_seconds = cost_model_activity_in.response_time_seconds
+ cost_model_activity.enabled = cost_model_activity_in.enabled
+ cost_model_activity.plugin_event_id = cost_model_activity_in.plugin_event.id
+
+ db_session.commit()
+ return cost_model_activity
+
+
+def create_cost_model_activity(
+ *, db_session, cost_model_activity_in: CostModelActivityCreate
+) -> CostModelActivity:
+ cost_model_activity = CostModelActivity(
+ response_time_seconds=cost_model_activity_in.response_time_seconds,
+ enabled=cost_model_activity_in.enabled,
+ plugin_event_id=cost_model_activity_in.plugin_event.id,
+ )
+
+ db_session.add(cost_model_activity)
+ db_session.commit()
+ return cost_model_activity
+
+
+def delete(*, db_session, cost_model_id: int):
+ """Deletes a cost model."""
+ cost_model = get_cost_model_by_id(db_session=db_session, cost_model_id=cost_model_id)
+ if not cost_model:
+ raise ValueError(
+ f"Unable to delete cost model. No cost model found with id {cost_model_id}."
+ )
+
+ db_session.delete(cost_model)
+ db_session.commit()
+
+
+def update(*, db_session, cost_model_in: CostModelUpdate) -> CostModel:
+ """Updates a cost model."""
+ if not has_unique_plugin_event(cost_model_in):
+ raise KeyError("Unable to update cost model. Duplicate plugin event ids detected.")
+
+ cost_model = get_cost_model_by_id(db_session=db_session, cost_model_id=cost_model_in.id)
+ if not cost_model:
+ raise ValueError("Unable to update cost model. No cost model found with that id.")
+
+ cost_model.name = cost_model_in.name
+ cost_model.description = cost_model_in.description
+ cost_model.enabled = cost_model_in.enabled
+ cost_model.created_at = cost_model_in.created_at
+ cost_model.updated_at = (
+ cost_model_in.updated_at if cost_model_in.updated_at else datetime.utcnow()
+ )
+
+ # Update all recognized activities. Delete all removed activities.
+ update_activities = []
+ delete_activities = []
+
+ for activity in cost_model.activities:
+ updated = False
+ for idx_in, activity_in in enumerate(cost_model_in.activities):
+ if activity.plugin_event.id == activity_in.plugin_event.id:
+ update_activities.append((activity, activity_in))
+ cost_model_in.activities.pop(idx_in)
+ updated = True
+ break
+ if updated:
+ continue
+
+ # Delete activities that have been removed from the cost model.
+ delete_activities.append(activity)
+
+ for activity, activity_in in update_activities:
+ activity.response_time_seconds = activity_in.response_time_seconds
+ activity.enabled = activity_in.enabled
+ activity.plugin_event = plugin_service.get_plugin_event_by_id(
+ db_session=db_session, plugin_event_id=activity_in.plugin_event.id
+ )
+
+ for activity in delete_activities:
+ cost_model_service.delete_cost_model_activity(
+ db_session=db_session, cost_model_activity_id=activity.id
+ )
+
+ # Create new activities.
+ for activity_in in cost_model_in.activities:
+ activity_out = cost_model_service.create_cost_model_activity(
+ db_session=db_session, cost_model_activity_in=activity_in
+ )
+
+ if not activity_out:
+ log.error("Failed to create cost model activity. Continuing.")
+ continue
+
+ cost_model.activities.append(activity_out)
+
+ db_session.commit()
+ return cost_model
+
+
+def create(*, db_session, cost_model_in: CostModelCreate) -> CostModel:
+ """Creates a new cost model."""
+ if not has_unique_plugin_event(cost_model_in):
+ raise KeyError("Unable to create cost model. Duplicate plugin event ids detected.")
+
+ project = project_service.get_by_name_or_raise(
+ db_session=db_session, project_in=cost_model_in.project
+ )
+
+ cost_model = CostModel(
+ **cost_model_in.dict(exclude={"activities", "project"}),
+ activities=[],
+ project=project,
+ )
+
+ db_session.add(cost_model)
+ db_session.commit()
+
+ # Create activities after the cost model is created.
+ # We need the cost model id to map to the activity.
+ if cost_model and cost_model_in.activities:
+ for activity_in in cost_model_in.activities:
+ activity_out = cost_model_service.create_cost_model_activity(
+ db_session=db_session, cost_model_activity_in=activity_in
+ )
+ if not activity_out:
+ log.error("Failed to create cost model activity. Continuing.")
+ continue
+
+ cost_model.activities.append(activity_out)
+
+ db_session.commit()
+ return cost_model
+
+
+def get_cost_model_by_id(*, db_session, cost_model_id: int) -> CostModel:
+ """Returns a cost model based on the given cost model id."""
+ return db_session.query(CostModel).filter(CostModel.id == cost_model_id).one()
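The `update` flow above reconciles the cost model's stored activities against the incoming payload, keyed by plugin event id: matches are updated in place, leftovers are deleted, and anything remaining in the payload is created. Stripped of the ORM, that bookkeeping reduces to a three-way set split (an illustrative sketch only; `reconcile` and its ids are ours, not part of Dispatch):

```python
def reconcile(existing_ids: set[int], incoming_ids: set[int]):
    """Split activity ids into update/delete/create buckets,
    mirroring the service's update() loop."""
    to_update = existing_ids & incoming_ids  # present on both sides
    to_delete = existing_ids - incoming_ids  # removed from the payload
    to_create = incoming_ids - existing_ids  # new in the payload
    return to_update, to_delete, to_create


# Activities 2 and 3 survive, 1 was removed, 4 is new.
buckets = reconcile({1, 2, 3}, {2, 3, 4})
```

Here activity 1 would be deleted, 2 and 3 updated in place, and 4 created, matching the three loops in the service.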
diff --git a/src/dispatch/cost_model/views.py b/src/dispatch/cost_model/views.py
new file mode 100644
index 000000000000..8d41b8e310a7
--- /dev/null
+++ b/src/dispatch/cost_model/views.py
@@ -0,0 +1,76 @@
+from fastapi import APIRouter, Depends, HTTPException, status
+import logging
+from sqlalchemy.exc import IntegrityError
+
+from dispatch.auth.permissions import SensitiveProjectActionPermission, PermissionsDependency
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.models import PrimaryKey
+
+from .models import (
+ CostModelCreate,
+ CostModelPagination,
+ CostModelRead,
+ CostModelUpdate,
+)
+from .service import create, update, delete
+
+log = logging.getLogger(__name__)
+
+router = APIRouter()
+
+
+@router.get("", response_model=CostModelPagination)
+def get_cost_models(common: CommonParameters):
+ """Get all cost models, or only those matching a given search term."""
+ return search_filter_sort_paginate(model="CostModel", **common)
+
+
+@router.post(
+ "",
+ summary="Creates a new cost model.",
+ response_model=CostModelRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def create_cost_model(
+ db_session: DbSession,
+ cost_model_in: CostModelCreate,
+):
+ """Create a cost model."""
+ return create(db_session=db_session, cost_model_in=cost_model_in)
+
+
+@router.put(
+ "/{cost_model_id}",
+ summary="Modifies an existing cost model.",
+ response_model=CostModelRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def update_cost_model(
+ cost_model_id: PrimaryKey,
+ db_session: DbSession,
+ cost_model_in: CostModelUpdate,
+):
+ """Modifies an existing cost model."""
+ return update(db_session=db_session, cost_model_in=cost_model_in)
+
+
+@router.delete(
+ "/{cost_model_id}",
+ response_model=None,
+ summary="Deletes a cost model and its activities.",
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def delete_cost_model(
+ cost_model_id: PrimaryKey,
+ db_session: DbSession,
+):
+ """Deletes a cost model and its activities."""
+ try:
+ delete(cost_model_id=cost_model_id, db_session=db_session)
+ except IntegrityError as e:
+ log.exception(e)
+ raise HTTPException(
+ status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
+ detail=[{"msg": f"Cost Model with id {cost_model_id} could not be deleted."}],
+ ) from None
diff --git a/src/dispatch/constants.py b/src/dispatch/data/__init__.py
similarity index 100%
rename from src/dispatch/constants.py
rename to src/dispatch/data/__init__.py
diff --git a/src/dispatch/data/alert/models.py b/src/dispatch/data/alert/models.py
new file mode 100644
index 000000000000..db2221b23d77
--- /dev/null
+++ b/src/dispatch/data/alert/models.py
@@ -0,0 +1,41 @@
+from pydantic import Field
+
+from sqlalchemy import Column, ForeignKey, Integer, String
+from sqlalchemy_utils import TSVectorType
+
+from dispatch.database.core import Base
+from dispatch.models import DispatchBase, TimeStampMixin, PrimaryKey, Pagination
+
+
+class Alert(Base, TimeStampMixin):
+ id = Column(Integer, primary_key=True)
+ name = Column(String)
+ description = Column(String)
+ originator = Column(String)
+ external_link = Column(String)
+ source_id = Column(Integer, ForeignKey("source.id"))
+ search_vector = Column(TSVectorType("name", regconfig="pg_catalog.simple"))
+
+
+# Pydantic models
+class AlertBase(DispatchBase):
+ name: str | None = Field(None, nullable=False)
+ description: str | None = None
+ originator: str | None = None
+ external_link: str | None = None
+
+
+class AlertCreate(AlertBase):
+ id: PrimaryKey | None = None
+
+
+class AlertUpdate(AlertBase):
+ id: PrimaryKey | None = None
+
+
+class AlertRead(AlertBase):
+ id: PrimaryKey
+
+
+class AlertPagination(Pagination):
+ items: list[AlertRead]
diff --git a/src/dispatch/data/alert/service.py b/src/dispatch/data/alert/service.py
new file mode 100644
index 000000000000..f144f187223a
--- /dev/null
+++ b/src/dispatch/data/alert/service.py
@@ -0,0 +1,81 @@
+
+from pydantic import ValidationError
+
+
+from .models import Alert, AlertCreate, AlertRead, AlertUpdate
+
+
+def get(*, db_session, alert_id: int) -> Alert | None:
+ """Gets an alert by its id."""
+ return db_session.query(Alert).filter(Alert.id == alert_id).one_or_none()
+
+
+def get_by_name(*, db_session, name: str) -> Alert | None:
+ """Gets an alert by its name."""
+ return db_session.query(Alert).filter(Alert.name == name).one_or_none()
+
+
+def get_by_name_or_raise(*, db_session, alert_in: AlertRead) -> AlertRead:
+ """Returns the alert specified or raises ValidationError."""
+ alert = get_by_name(db_session=db_session, name=alert_in.name)
+
+ if not alert:
+ raise ValidationError([
+ {
+ "loc": ("alert",),
+ "msg": f"Alert not found: {alert_in.name}",
+ "type": "value_error",
+ "input": alert_in.name,
+ }
+ ])
+
+ return alert
+
+
+def get_all(*, db_session):
+ """Gets all alerts."""
+ return db_session.query(Alert)
+
+
+def create(*, db_session, alert_in: AlertCreate) -> Alert:
+ """Creates a new alert."""
+ alert = Alert(**alert_in.dict(exclude={}))
+ db_session.add(alert)
+ db_session.commit()
+ return alert
+
+
+def get_or_create(*, db_session, alert_in: AlertCreate) -> Alert:
+ """Gets or creates a new alert."""
+ # prefer the alert id if available
+ if alert_in.id:
+ q = db_session.query(Alert).filter(Alert.id == alert_in.id)
+ else:
+ q = db_session.query(Alert).filter_by(name=alert_in.name)
+
+ instance = q.first()
+ if instance:
+ return instance
+
+ return create(db_session=db_session, alert_in=alert_in)
+
+
+def update(*, db_session, alert: Alert, alert_in: AlertUpdate) -> Alert:
+ """Updates an existing alert."""
+ alert_data = alert.dict()
+ update_data = alert_in.dict(exclude_unset=True, exclude={})
+
+ for field in alert_data:
+ if field in update_data:
+ setattr(alert, field, update_data[field])
+
+ db_session.commit()
+ return alert
+
+
+def delete(*, db_session, alert_id: int):
+ """Deletes an existing alert."""
+ alert = db_session.query(Alert).filter(Alert.id == alert_id).one_or_none()
+ db_session.delete(alert)
+ db_session.commit()
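The alert `update` helper copies only the fields the caller actually supplied (`exclude_unset=True`) onto the ORM object, leaving everything else untouched. A dependency-free sketch of that pattern, using a hypothetical `AlertStub` dataclass in place of the Dispatch model:

```python
from dataclasses import dataclass, asdict


@dataclass
class AlertStub:
    # Hypothetical stand-in for the ORM model; not the Dispatch class.
    name: str = ""
    description: str = ""
    originator: str = ""


def apply_update(obj: AlertStub, update_data: dict) -> AlertStub:
    # Copy only the keys the caller supplied, mirroring the service's
    # exclude_unset=True + setattr loop.
    for field in asdict(obj):
        if field in update_data:
            setattr(obj, field, update_data[field])
    return obj


alert = AlertStub(name="keychain-dump", originator="osquery")
apply_update(alert, {"description": "Keychain dump detected"})
```

Because unset fields never appear in `update_data`, a partial payload cannot accidentally blank out `name` or `originator`.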
diff --git a/src/dispatch/data/alert/views.py b/src/dispatch/data/alert/views.py
new file mode 100644
index 000000000000..aac5e869e227
--- /dev/null
+++ b/src/dispatch/data/alert/views.py
@@ -0,0 +1,55 @@
+from fastapi import APIRouter, HTTPException, status
+
+from dispatch.database.core import DbSession
+from dispatch.models import PrimaryKey
+
+from .models import (
+ AlertCreate,
+ AlertRead,
+ AlertUpdate,
+)
+from .service import create, delete, get, update
+
+router = APIRouter()
+
+
+@router.get("/{alert_id}", response_model=AlertRead)
+def get_alert(db_session: DbSession, alert_id: PrimaryKey):
+ """Given its unique id, retrieve details about a single alert."""
+ alert = get(db_session=db_session, alert_id=alert_id)
+ if not alert:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "The requested alert does not exist."}],
+ )
+ return alert
+
+
+@router.post("", response_model=AlertRead)
+def create_alert(db_session: DbSession, alert_in: AlertCreate):
+ """Creates a new alert."""
+ return create(db_session=db_session, alert_in=alert_in)
+
+
+@router.put("/{alert_id}", response_model=AlertRead)
+def update_alert(db_session: DbSession, alert_id: PrimaryKey, alert_in: AlertUpdate):
+ """Updates an alert."""
+ alert = get(db_session=db_session, alert_id=alert_id)
+ if not alert:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "An alert with this id does not exist."}],
+ )
+ return update(db_session=db_session, alert=alert, alert_in=alert_in)
+
+
+@router.delete("/{alert_id}", response_model=None)
+def delete_alert(db_session: DbSession, alert_id: PrimaryKey):
+ """Deletes an alert, returning only an HTTP 200 OK if successful."""
+ alert = get(db_session=db_session, alert_id=alert_id)
+ if not alert:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "An alert with this id does not exist."}],
+ )
+ delete(db_session=db_session, alert_id=alert_id)
diff --git a/src/dispatch/static/dispatch/mock/post.js b/src/dispatch/data/query/__init__.py
similarity index 100%
rename from src/dispatch/static/dispatch/mock/post.js
rename to src/dispatch/data/query/__init__.py
diff --git a/src/dispatch/data/query/models.py b/src/dispatch/data/query/models.py
new file mode 100644
index 000000000000..210acc70df60
--- /dev/null
+++ b/src/dispatch/data/query/models.py
@@ -0,0 +1,74 @@
+from pydantic import Field
+
+from sqlalchemy import Column, Integer, String, Table, ForeignKey, PrimaryKeyConstraint
+from sqlalchemy.orm import relationship
+from sqlalchemy_utils import TSVectorType
+
+from dispatch.database.core import Base
+from dispatch.models import (
+ DispatchBase,
+ ProjectMixin,
+ Pagination,
+ TimeStampMixin,
+ PrimaryKey,
+)
+from dispatch.project.models import ProjectRead
+from dispatch.data.source.models import SourceRead
+
+from dispatch.tag.models import TagRead
+
+
+assoc_query_tags = Table(
+ "assoc_query_tags",
+ Base.metadata,
+ Column("query_id", Integer, ForeignKey("query.id", ondelete="CASCADE")),
+ Column("tag_id", Integer, ForeignKey("tag.id", ondelete="CASCADE")),
+ PrimaryKeyConstraint("query_id", "tag_id"),
+)
+
+assoc_query_incidents = Table(
+ "assoc_query_incidents",
+ Base.metadata,
+ Column("query_id", Integer, ForeignKey("query.id", ondelete="CASCADE")),
+ Column("incident_id", Integer, ForeignKey("incident.id", ondelete="CASCADE")),
+ PrimaryKeyConstraint("query_id", "incident_id"),
+)
+
+
+class Query(Base, TimeStampMixin, ProjectMixin):
+ id = Column(Integer, primary_key=True)
+ name = Column(String)
+ description = Column(String)
+ text = Column(String)
+ language = Column(String)
+ source_id = Column(Integer, ForeignKey("source.id"))
+ source = relationship("Source", backref="queries")
+ tags = relationship("Tag", secondary=assoc_query_tags, backref="queries")
+ search_vector = Column(TSVectorType("name", regconfig="pg_catalog.simple"))
+
+
+# Pydantic models
+class QueryBase(DispatchBase):
+ name: str | None = Field(None, nullable=False)
+ description: str | None = None
+ language: str | None = None
+ text: str | None = None
+ tags: list[TagRead | None] = []
+ source: SourceRead
+ project: ProjectRead
+
+
+class QueryCreate(QueryBase):
+ pass
+
+
+class QueryUpdate(QueryBase):
+ id: PrimaryKey | None = None
+
+
+class QueryRead(QueryBase):
+ id: PrimaryKey
+
+
+class QueryPagination(Pagination):
+ items: list[QueryRead]
diff --git a/src/dispatch/data/query/service.py b/src/dispatch/data/query/service.py
new file mode 100644
index 000000000000..8fe6c9f554a6
--- /dev/null
+++ b/src/dispatch/data/query/service.py
@@ -0,0 +1,114 @@
+from pydantic import ValidationError
+
+from dispatch.project import service as project_service
+from dispatch.tag import service as tag_service
+from dispatch.data.source import service as source_service
+
+from .models import Query, QueryCreate, QueryUpdate, QueryRead
+
+
+def get(*, db_session, query_id: int) -> Query | None:
+ """Gets a query by its id."""
+ return db_session.query(Query).filter(Query.id == query_id).one_or_none()
+
+
+def get_by_name(*, db_session, project_id: int, name: str) -> Query | None:
+ """Gets a query by its name."""
+ return (
+ db_session.query(Query)
+ .filter(Query.name == name)
+ .filter(Query.project_id == project_id)
+ .one_or_none()
+ )
+
+
+def get_by_name_or_raise(*, db_session, query_in: QueryRead, project_id: int) -> QueryRead:
+ """Returns the query specified or raises ValidationError."""
+ query = get_by_name(db_session=db_session, name=query_in.name, project_id=project_id)
+
+ if not query:
+ raise ValidationError([
+ {
+ "loc": ("query",),
+ "msg": f"Query not found: {query_in.name}",
+ "type": "value_error",
+ "input": query_in.name,
+ }
+ ])
+
+ return query
+
+
+def get_all(*, db_session):
+ """Gets all queries."""
+ return db_session.query(Query)
+
+
+def create(*, db_session, query_in: QueryCreate) -> Query:
+ """Creates a new query."""
+ project = project_service.get_by_name_or_raise(
+ db_session=db_session, project_in=query_in.project
+ )
+
+ source = source_service.get_by_name_or_raise(
+ db_session=db_session, project_id=project.id, source_in=query_in.source
+ )
+
+ tags = []
+ for t in query_in.tags:
+ tags.append(tag_service.get_or_create(db_session=db_session, tag_in=t))
+
+ query = Query(
+ **query_in.dict(exclude={"project", "tags", "source"}),
+ source=source,
+ tags=tags,
+ project=project,
+ )
+ db_session.add(query)
+ db_session.commit()
+ return query
+
+
+def get_or_create(*, db_session, query_in: QueryCreate) -> Query:
+ """Gets or creates a new query."""
+ # prefer the query id if available
+ if query_in.id:
+ q = db_session.query(Query).filter(Query.id == query_in.id)
+ else:
+ q = db_session.query(Query).filter_by(name=query_in.name)
+
+ instance = q.first()
+ if instance:
+ return instance
+
+ return create(db_session=db_session, query_in=query_in)
+
+
+def update(*, db_session, query: Query, query_in: QueryUpdate) -> Query:
+ """Updates an existing query."""
+ query_data = query.dict()
+ update_data = query_in.dict(exclude_unset=True, exclude={})
+
+ source = source_service.get_by_name_or_raise(
+ db_session=db_session, project_id=query.project.id, source_in=query_in.source
+ )
+
+ tags = []
+ for t in query_in.tags:
+ tags.append(tag_service.get_or_create(db_session=db_session, tag_in=t))
+
+ for field in query_data:
+ if field in update_data:
+ setattr(query, field, update_data[field])
+
+ query.tags = tags
+ query.source = source
+ db_session.commit()
+ return query
+
+
+def delete(*, db_session, query_id: int):
+ """Deletes an existing query."""
+ query = db_session.query(Query).filter(Query.id == query_id).one_or_none()
+ db_session.delete(query)
+ db_session.commit()
diff --git a/src/dispatch/data/query/views.py b/src/dispatch/data/query/views.py
new file mode 100644
index 000000000000..6195ebfa7fcc
--- /dev/null
+++ b/src/dispatch/data/query/views.py
@@ -0,0 +1,63 @@
+from fastapi import APIRouter, HTTPException, status
+
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.models import PrimaryKey
+
+from .models import (
+ QueryCreate,
+ QueryPagination,
+ QueryRead,
+ QueryUpdate,
+)
+from .service import create, delete, get, update
+
+router = APIRouter()
+
+
+@router.get("", response_model=QueryPagination)
+def get_queries(common: CommonParameters):
+ """Get all queries, or only those matching a given search term."""
+ return search_filter_sort_paginate(model="Query", **common)
+
+
+@router.get("/{query_id}", response_model=QueryRead)
+def get_query(db_session: DbSession, query_id: PrimaryKey):
+ """Given its unique ID, retrieve details about a single query."""
+ query = get(db_session=db_session, query_id=query_id)
+ if not query:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "The requested query does not exist."}],
+ )
+ return query
+
+
+@router.post("", response_model=QueryRead)
+def create_query(db_session: DbSession, query_in: QueryCreate):
+ """Creates a new data query."""
+ return create(db_session=db_session, query_in=query_in)
+
+
+@router.put("/{query_id}", response_model=QueryRead)
+def update_query(db_session: DbSession, query_id: PrimaryKey, query_in: QueryUpdate):
+ """Updates a data query."""
+ query = get(db_session=db_session, query_id=query_id)
+ if not query:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A data query with this id does not exist."}],
+ )
+ return update(db_session=db_session, query=query, query_in=query_in)
+
+
+@router.delete("/{query_id}", response_model=None)
+def delete_query(db_session: DbSession, query_id: PrimaryKey):
+ """Deletes a data query, returning only an HTTP 200 OK if successful."""
+ query = get(db_session=db_session, query_id=query_id)
+ if not query:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A data query with this id does not exist."}],
+ )
+ delete(db_session=db_session, query_id=query_id)
diff --git a/src/dispatch/static/dispatch/src/vuetify/config.js b/src/dispatch/data/source/data_format/__init__.py
similarity index 100%
rename from src/dispatch/static/dispatch/src/vuetify/config.js
rename to src/dispatch/data/source/data_format/__init__.py
diff --git a/src/dispatch/data/source/data_format/models.py b/src/dispatch/data/source/data_format/models.py
new file mode 100644
index 000000000000..2706ffb0acd1
--- /dev/null
+++ b/src/dispatch/data/source/data_format/models.py
@@ -0,0 +1,44 @@
+from pydantic import Field
+
+from sqlalchemy import (
+ Column,
+ Integer,
+ String,
+)
+from sqlalchemy import UniqueConstraint
+
+from sqlalchemy_utils import TSVectorType
+
+from dispatch.database.core import Base
+from dispatch.models import DispatchBase, ProjectMixin, Pagination, PrimaryKey
+from dispatch.project.models import ProjectRead
+
+
+class SourceDataFormat(Base, ProjectMixin):
+ __table_args__ = (UniqueConstraint("name", "project_id"),)
+ id = Column(Integer, primary_key=True)
+ name = Column(String)
+ description = Column(String)
+ search_vector = Column(TSVectorType("name", regconfig="pg_catalog.simple"))
+
+
+class SourceDataFormatBase(DispatchBase):
+ name: str | None = Field(None, nullable=False)
+ description: str | None = None
+
+
+class SourceDataFormatRead(SourceDataFormatBase):
+ id: PrimaryKey
+ project: ProjectRead
+
+
+class SourceDataFormatCreate(SourceDataFormatBase):
+ project: ProjectRead
+
+
+class SourceDataFormatUpdate(SourceDataFormatBase):
+ id: PrimaryKey
+
+
+class SourceDataFormatPagination(Pagination):
+ items: list[SourceDataFormatRead]
diff --git a/src/dispatch/data/source/data_format/service.py b/src/dispatch/data/source/data_format/service.py
new file mode 100644
index 000000000000..77a278c4fc3c
--- /dev/null
+++ b/src/dispatch/data/source/data_format/service.py
@@ -0,0 +1,117 @@
+from pydantic import ValidationError
+
+from dispatch.project import service as project_service
+
+from .models import (
+ SourceDataFormat,
+ SourceDataFormatCreate,
+ SourceDataFormatUpdate,
+ SourceDataFormatRead,
+)
+
+
+def get(*, db_session, source_data_format_id: int) -> SourceDataFormat | None:
+ """Gets a source data format by its id."""
+ return (
+ db_session.query(SourceDataFormat)
+ .filter(SourceDataFormat.id == source_data_format_id)
+ .one_or_none()
+ )
+
+
+def get_by_name(*, db_session, project_id: int, name: str) -> SourceDataFormat | None:
+ """Gets a source data format by its name."""
+ return (
+ db_session.query(SourceDataFormat)
+ .filter(SourceDataFormat.name == name)
+ .filter(SourceDataFormat.project_id == project_id)
+ .one_or_none()
+ )
+
+
+def get_by_name_or_raise(
+ *, db_session, project_id, source_data_format_in: SourceDataFormatRead
+) -> SourceDataFormatRead:
+ """Returns the source data format specified or raises ValidationError."""
+ data_format = get_by_name(
+ db_session=db_session, project_id=project_id, name=source_data_format_in.name
+ )
+
+ if not data_format:
+ raise ValidationError([
+ {
+ "loc": ("dataFormat",),
+ "msg": f"SourceDataFormat not found: {source_data_format_in.name}",
+ "type": "value_error",
+ "input": source_data_format_in.name,
+ }
+ ])
+
+ return data_format
+
+
+def get_all(*, db_session, project_id: int) -> list[SourceDataFormat | None]:
+ """Gets all source data formats for a project."""
+ return db_session.query(SourceDataFormat).filter(SourceDataFormat.project_id == project_id)
+
+
+def create(*, db_session, source_data_format_in: SourceDataFormatCreate) -> SourceDataFormat:
+ """Creates a new source data format."""
+ project = project_service.get_by_name_or_raise(
+ db_session=db_session, project_in=source_data_format_in.project
+ )
+ source_data_format = SourceDataFormat(
+ **source_data_format_in.dict(exclude={"project"}), project=project
+ )
+ db_session.add(source_data_format)
+ db_session.commit()
+ return source_data_format
+
+
+def get_or_create(*, db_session, source_data_format_in: SourceDataFormatCreate) -> SourceDataFormat:
+ """Gets or creates a source data format."""
+ # prefer the data format id if available
+ if source_data_format_in.id:
+ q = db_session.query(SourceDataFormat).filter(
+ SourceDataFormat.id == source_data_format_in.id
+ )
+ else:
+ q = db_session.query(SourceDataFormat).filter_by(name=source_data_format_in.name)
+
+ instance = q.first()
+ if instance:
+ return instance
+
+ return create(
+ db_session=db_session,
+ source_data_format_in=source_data_format_in,
+ )
+
+
+def update(
+ *,
+ db_session,
+ source_data_format: SourceDataFormat,
+ source_data_format_in: SourceDataFormatUpdate,
+) -> SourceDataFormat:
+ """Updates an existing source data format."""
+ source_data_format_data = source_data_format.dict()
+ update_data = source_data_format_in.dict(exclude_unset=True, exclude={})
+
+ for field in source_data_format_data:
+ if field in update_data:
+ setattr(source_data_format, field, update_data[field])
+
+ db_session.commit()
+ return source_data_format
+
+
+def delete(*, db_session, source_data_format_id: int):
+ """Deletes an existing source data format."""
+ source_data_format = (
+ db_session.query(SourceDataFormat)
+ .filter(SourceDataFormat.id == source_data_format_id)
+ .one_or_none()
+ )
+ db_session.delete(source_data_format)
+ db_session.commit()
diff --git a/src/dispatch/data/source/data_format/views.py b/src/dispatch/data/source/data_format/views.py
new file mode 100644
index 000000000000..e71049e2d8c3
--- /dev/null
+++ b/src/dispatch/data/source/data_format/views.py
@@ -0,0 +1,73 @@
+from fastapi import APIRouter, HTTPException, status
+
+
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.models import PrimaryKey
+
+from .models import (
+ SourceDataFormatCreate,
+ SourceDataFormatPagination,
+ SourceDataFormatRead,
+ SourceDataFormatUpdate,
+)
+from .service import create, delete, get, update
+
+
+router = APIRouter()
+
+
+@router.get("", response_model=SourceDataFormatPagination)
+def get_source_data_formats(common: CommonParameters):
+ """Get all source data formats, or only those matching a given search term."""
+ return search_filter_sort_paginate(model="SourceDataFormat", **common)
+
+
+@router.get("/{source_data_format_id}", response_model=SourceDataFormatRead)
+def get_source_data_format(db_session: DbSession, source_data_format_id: PrimaryKey):
+ """Given its unique id, retrieve details about a source data format."""
+ source_data_format = get(db_session=db_session, source_data_format_id=source_data_format_id)
+ if not source_data_format:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "The requested data format does not exist."}],
+ )
+ return source_data_format
+
+
+@router.post("", response_model=SourceDataFormatRead)
+def create_source_data_format(db_session: DbSession, source_data_format_in: SourceDataFormatCreate):
+ """Creates a new source data format."""
+ return create(db_session=db_session, source_data_format_in=source_data_format_in)
+
+
+@router.put("/{source_data_format_id}", response_model=SourceDataFormatRead)
+def update_source_data_format(
+ db_session: DbSession,
+ source_data_format_id: PrimaryKey,
+ source_data_format_in: SourceDataFormatUpdate,
+):
+ """Updates a source data format."""
+ source_data_format = get(db_session=db_session, source_data_format_id=source_data_format_id)
+ if not source_data_format:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A source data format with this id does not exist."}],
+ )
+ return update(
+ db_session=db_session,
+ source_data_format=source_data_format,
+ source_data_format_in=source_data_format_in,
+ )
+
+
+@router.delete("/{source_data_format_id}", response_model=None)
+def delete_source_data_format(db_session: DbSession, source_data_format_id: PrimaryKey):
+ """Delete a source data format, returning only an HTTP 200 OK if successful."""
+ source_data_format = get(db_session=db_session, source_data_format_id=source_data_format_id)
+ if not source_data_format:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A source data format with this id does not exist."}],
+ )
+ delete(db_session=db_session, source_data_format_id=source_data_format_id)
diff --git a/tests/service/test_service_views.py b/src/dispatch/data/source/environment/__init__.py
similarity index 100%
rename from tests/service/test_service_views.py
rename to src/dispatch/data/source/environment/__init__.py
diff --git a/src/dispatch/data/source/environment/models.py b/src/dispatch/data/source/environment/models.py
new file mode 100644
index 000000000000..68a57c17837c
--- /dev/null
+++ b/src/dispatch/data/source/environment/models.py
@@ -0,0 +1,44 @@
+from pydantic import Field
+
+from sqlalchemy import (
+ Column,
+ Integer,
+ String,
+)
+from sqlalchemy import UniqueConstraint
+
+from sqlalchemy_utils import TSVectorType
+
+from dispatch.database.core import Base
+from dispatch.models import DispatchBase, ProjectMixin, Pagination, PrimaryKey
+from dispatch.project.models import ProjectRead
+
+
+class SourceEnvironment(Base, ProjectMixin):
+ __table_args__ = (UniqueConstraint("name", "project_id"),)
+ id = Column(Integer, primary_key=True)
+ name = Column(String)
+ description = Column(String)
+ search_vector = Column(TSVectorType("name", regconfig="pg_catalog.simple"))
+
+
+class SourceEnvironmentBase(DispatchBase):
+ name: str | None = Field(None, nullable=False)
+ description: str | None = None
+
+
+class SourceEnvironmentRead(SourceEnvironmentBase):
+ id: PrimaryKey
+ project: ProjectRead
+
+
+class SourceEnvironmentCreate(SourceEnvironmentBase):
+ project: ProjectRead
+
+
+class SourceEnvironmentUpdate(SourceEnvironmentBase):
+ id: PrimaryKey
+
+
+class SourceEnvironmentPagination(Pagination):
+ items: list[SourceEnvironmentRead]
diff --git a/src/dispatch/data/source/environment/service.py b/src/dispatch/data/source/environment/service.py
new file mode 100644
index 000000000000..8c131c5e58bb
--- /dev/null
+++ b/src/dispatch/data/source/environment/service.py
@@ -0,0 +1,121 @@
+from pydantic import ValidationError
+
+from dispatch.project import service as project_service
+
+from .models import (
+ SourceEnvironment,
+ SourceEnvironmentCreate,
+ SourceEnvironmentUpdate,
+ SourceEnvironmentRead,
+)
+
+
+def get(*, db_session, source_environment_id: int) -> SourceEnvironment | None:
+ """Gets a source environment by its id."""
+ return (
+ db_session.query(SourceEnvironment)
+ .filter(SourceEnvironment.id == source_environment_id)
+ .one_or_none()
+ )
+
+
+def get_by_name(*, db_session, project_id: int, name: str) -> SourceEnvironment | None:
+ """Gets a source environment by its name."""
+ return (
+ db_session.query(SourceEnvironment)
+ .filter(SourceEnvironment.name == name)
+ .filter(SourceEnvironment.project_id == project_id)
+ .one_or_none()
+ )
+
+
+def get_by_name_or_raise(
+ *, db_session, project_id, source_environment_in: SourceEnvironmentRead
+) -> SourceEnvironmentRead:
+ """Returns the source environment specified or raises ValidationError."""
+ source = get_by_name(
+ db_session=db_session,
+ project_id=project_id,
+ name=source_environment_in.name,
+ )
+
+ if not source:
+ raise ValidationError([
+ {
+ "loc": ("source",),
+ "msg": f"Source environment not found: {source_environment_in.name}",
+ "type": "value_error",
+ "input": source_environment_in.name,
+ }
+ ])
+
+ return source
+
+
+def get_all(*, db_session, project_id: int) -> list[SourceEnvironment | None]:
+ """Gets all source environments for a project."""
+ return db_session.query(SourceEnvironment).filter(SourceEnvironment.project_id == project_id)
+
+
+def create(*, db_session, source_environment_in: SourceEnvironmentCreate) -> SourceEnvironment:
+ """Creates a new source environment."""
+ project = project_service.get_by_name_or_raise(
+ db_session=db_session, project_in=source_environment_in.project
+ )
+ source_environment = SourceEnvironment(
+ **source_environment_in.dict(exclude={"project"}), project=project
+ )
+ db_session.add(source_environment)
+ db_session.commit()
+ return source_environment
+
+
+def get_or_create(
+ *, db_session, source_environment_in: SourceEnvironmentCreate
+) -> SourceEnvironment:
+ """Gets or creates a source environment."""
+ # prefer the source environment id if available
+ if source_environment_in.id:
+ q = db_session.query(SourceEnvironment).filter(
+ SourceEnvironment.id == source_environment_in.id
+ )
+ else:
+ q = db_session.query(SourceEnvironment).filter_by(name=source_environment_in.name)
+
+ instance = q.first()
+ if instance:
+ return instance
+
+ return create(
+ db_session=db_session,
+ source_environment_in=source_environment_in,
+ )
+
+
+def update(
+ *,
+ db_session,
+ source_environment: SourceEnvironment,
+ source_environment_in: SourceEnvironmentUpdate,
+) -> SourceEnvironment:
+ """Updates an existing source environment."""
+ source_environment_data = source_environment.dict()
+ update_data = source_environment_in.dict(exclude_unset=True, exclude={})
+
+ for field in source_environment_data:
+ if field in update_data:
+ setattr(source_environment, field, update_data[field])
+
+ db_session.commit()
+ return source_environment
+
+
+def delete(*, db_session, source_environment_id: int):
+ """Deletes an existing source environment."""
+ source_environment = (
+ db_session.query(SourceEnvironment)
+ .filter(SourceEnvironment.id == source_environment_id)
+ .one_or_none()
+ )
+ db_session.delete(source_environment)
+ db_session.commit()
diff --git a/src/dispatch/data/source/environment/views.py b/src/dispatch/data/source/environment/views.py
new file mode 100644
index 000000000000..5d0675a2b64c
--- /dev/null
+++ b/src/dispatch/data/source/environment/views.py
@@ -0,0 +1,75 @@
+from fastapi import APIRouter, HTTPException, status
+
+
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.models import PrimaryKey
+
+from .models import (
+ SourceEnvironmentCreate,
+ SourceEnvironmentPagination,
+ SourceEnvironmentRead,
+ SourceEnvironmentUpdate,
+)
+from .service import create, delete, get, update
+
+
+router = APIRouter()
+
+
+@router.get("", response_model=SourceEnvironmentPagination)
+def get_source_environments(common: CommonParameters):
+ """Get all source environments, or only those matching a given search term."""
+ return search_filter_sort_paginate(model="SourceEnvironment", **common)
+
+
+@router.get("/{source_environment_id}", response_model=SourceEnvironmentRead)
+def get_source_environment(db_session: DbSession, source_environment_id: PrimaryKey):
+ """Given its unique id, retrieve details about a single source environment."""
+ source_environment = get(db_session=db_session, source_environment_id=source_environment_id)
+ if not source_environment:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "The requested source environment does not exist."}],
+ )
+ return source_environment
+
+
+@router.post("", response_model=SourceEnvironmentRead)
+def create_source_environment(
+ db_session: DbSession, source_environment_in: SourceEnvironmentCreate
+):
+ """Creates a new source environment."""
+ return create(db_session=db_session, source_environment_in=source_environment_in)
+
+
+@router.put("/{source_environment_id}", response_model=SourceEnvironmentRead)
+def update_source_environment(
+ db_session: DbSession,
+ source_environment_id: PrimaryKey,
+ source_environment_in: SourceEnvironmentUpdate,
+):
+ """Updates a source environment."""
+ source_environment = get(db_session=db_session, source_environment_id=source_environment_id)
+ if not source_environment:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A source environment with this id does not exist."}],
+ )
+ return update(
+ db_session=db_session,
+ source_environment=source_environment,
+ source_environment_in=source_environment_in,
+ )
+
+
+@router.delete("/{source_environment_id}", response_model=None)
+def delete_source_environment(db_session: DbSession, source_environment_id: PrimaryKey):
+ """Delete a source environment, returning only an HTTP 200 OK if successful."""
+ source_environment = get(db_session=db_session, source_environment_id=source_environment_id)
+ if not source_environment:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A source environment with this id does not exist."}],
+ )
+ delete(db_session=db_session, source_environment_id=source_environment_id)
diff --git a/src/dispatch/data/source/models.py b/src/dispatch/data/source/models.py
new file mode 100644
index 000000000000..4da919d110af
--- /dev/null
+++ b/src/dispatch/data/source/models.py
@@ -0,0 +1,163 @@
+from datetime import datetime
+from pydantic import Field, AnyHttpUrl, field_serializer
+
+from sqlalchemy import (
+    JSON,
+    Column,
+    Integer,
+    String,
+    Boolean,
+    Text,
+    DateTime,
+    Table,
+    PrimaryKeyConstraint,
+    ForeignKey,
+    BigInteger,
+)
+from sqlalchemy.orm import relationship
+from sqlalchemy import UniqueConstraint
+
+from sqlalchemy_utils import TSVectorType
+from dispatch.database.core import Base
+from dispatch.models import (
+    DispatchBase,
+    ProjectMixin,
+    Pagination,
+    TimeStampMixin,
+    PrimaryKey,
+)
+from dispatch.project.models import ProjectRead
+from dispatch.data.source.environment.models import SourceEnvironmentRead
+from dispatch.data.source.data_format.models import SourceDataFormatRead
+from dispatch.data.source.status.models import SourceStatusRead
+from dispatch.data.source.transport.models import SourceTransportRead
+from dispatch.data.source.type.models import SourceTypeRead
+from dispatch.tag.models import TagRead
+from dispatch.incident.models import IncidentRead
+
+from dispatch.service.models import ServiceRead
+from dispatch.data.alert.models import AlertRead
+
+assoc_source_tags = Table(
+    "assoc_source_tags",
+    Base.metadata,
+    Column("source_id", Integer, ForeignKey("source.id", ondelete="CASCADE")),
+    Column("tag_id", Integer, ForeignKey("tag.id", ondelete="CASCADE")),
+    PrimaryKeyConstraint("source_id", "tag_id"),
+)
+
+assoc_source_incidents = Table(
+    "assoc_source_incidents",
+    Base.metadata,
+    Column("source_id", Integer, ForeignKey("source.id", ondelete="CASCADE")),
+    Column("incident_id", Integer, ForeignKey("incident.id", ondelete="CASCADE")),
+    PrimaryKeyConstraint("source_id", "incident_id"),
+)
+
+
+class Source(Base, TimeStampMixin, ProjectMixin):
+    __table_args__ = (UniqueConstraint("name", "project_id"),)
+
+    id = Column(Integer, primary_key=True)
+    name = Column(String)
+    description = Column(String)
+    data_last_loaded_at = Column(DateTime)
+    daily_volume = Column(Integer)
+    aggregated = Column(Boolean)
+    retention = Column(Integer)
+    size = Column(BigInteger)
+    cost = Column(Integer)
+    delay = Column(Integer)
+    environment = Column(String)
+    external_id = Column(String)
+    documentation = Column(Text)
+    sampling_rate = Column(Integer)
+    source_schema = Column(Text)
+    links = Column(JSON)
+
+    # relationships
+    source_type = relationship("SourceType", uselist=False, backref="sources")
+    source_type_id = Column(Integer, ForeignKey("source_type.id"))
+    source_status = relationship("SourceStatus", uselist=False, backref="sources")
+    source_status_id = Column(Integer, ForeignKey("source_status.id"))
+    source_environment = relationship("SourceEnvironment", uselist=False, backref="sources")
+    source_environment_id = Column(Integer, ForeignKey("source_environment.id"))
+    source_data_format = relationship("SourceDataFormat", uselist=False, backref="sources")
+    source_data_format_id = Column(Integer, ForeignKey("source_data_format.id"))
+    source_transport_id = Column(Integer, ForeignKey("source_transport.id"))
+    source_transport = relationship("SourceTransport", uselist=False, backref="sources")
+    owner = relationship("Service", uselist=False, backref="sources")
+    owner_id = Column(Integer, ForeignKey("service.id"))
+    incidents = relationship("Incident", secondary=assoc_source_incidents, backref="sources")
+    tags = relationship("Tag", secondary=assoc_source_tags, backref="sources")
+    alerts = relationship("Alert", backref="source", cascade="all, delete-orphan")
+
+    search_vector = Column(TSVectorType("name", regconfig="pg_catalog.simple"))
+
+
+class QueryReadMinimal(DispatchBase):
+    id: PrimaryKey
+    name: str
+    description: str
+
+
+class Link(DispatchBase):
+    id: int | None = None
+    name: str | None = None
+    description: str | None = None
+    href: AnyHttpUrl | None = None
+
+    @field_serializer("href")
+    def serialize_href(self, href: AnyHttpUrl | None, _info):
+        return str(href) if href is not None else None
+
+
+# Pydantic models
+class SourceBase(DispatchBase):
+    name: str | None = Field(None, nullable=False)
+    description: str | None = None
+    data_last_loaded_at: datetime | None = Field(None, nullable=True, title="Last Loaded")
+    sampling_rate: int | None = Field(
+        None,
+        nullable=True,
+        title="Sampling Rate",
+        lt=101,
+        gt=1,
+        description="Rate at which data is sampled (as a percentage); 100% means all data is captured.",
+    )
+    source_schema: str | None = None
+    documentation: str | None = None
+    retention: int | None = None
+    delay: int | None = None
+    size: int | None = None
+    external_id: str | None = None
+    aggregated: bool | None = Field(False, nullable=True)
+    links: list[Link | None] = Field(default_factory=list)
+    tags: list[TagRead | None] = []
+    incidents: list[IncidentRead | None] = []
+    queries: list[QueryReadMinimal | None] = []
+    alerts: list[AlertRead | None] = []
+    cost: float | None = None
+    owner: ServiceRead | None = None
+    source_type: SourceTypeRead | None = None
+    source_environment: SourceEnvironmentRead | None = None
+    source_data_format: SourceDataFormatRead | None = None
+    source_status: SourceStatusRead | None = None
+    source_transport: SourceTransportRead | None = None
+    project: ProjectRead
+
+
+class SourceCreate(SourceBase):
+    pass
+
+
+class SourceUpdate(SourceBase):
+    id: PrimaryKey | None = None
+
+
+class SourceRead(SourceBase):
+    id: PrimaryKey
+
+
+class SourcePagination(Pagination):
+    items: list[SourceRead]
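The `Link` schema above coerces its `href` field back to a plain string on serialization (and with an optional `href`, `None` must pass through untouched). A minimal standalone sketch of that idea with plain dicts — the helper name is illustrative, not part of the diff:

```python
# Stand-in for Link's field_serializer: coerce href to str on output,
# passing None through unchanged instead of producing the string "None".
def serialize_link(link: dict) -> dict:
    out = dict(link)
    href = out.get("href")
    out["href"] = str(href) if href is not None else None
    return out

link = {"id": 1, "name": "runbook", "href": "https://example.com/runbook"}
print(serialize_link(link)["href"])  # https://example.com/runbook
```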
diff --git a/src/dispatch/data/source/scheduled.py b/src/dispatch/data/source/scheduled.py
new file mode 100644
index 000000000000..21793ea93465
--- /dev/null
+++ b/src/dispatch/data/source/scheduled.py
@@ -0,0 +1,52 @@
+"""
+.. module: dispatch.data.source.scheduled
+    :platform: Unix
+    :copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
+    :license: Apache, see LICENSE for more details.
+"""
+
+import logging
+
+from schedule import every
+from sqlalchemy.orm import Session
+
+from dispatch.data.source import service as source_service
+from dispatch.decorators import scheduled_project_task, timer
+from dispatch.plugin import service as plugin_service
+from dispatch.project.models import Project
+from dispatch.scheduler import scheduler
+
+log = logging.getLogger(__name__)
+
+
+@scheduler.add(every(1).hour, name="sync-sources")
+@timer
+@scheduled_project_task
+def sync_sources(db_session: Session, project: Project):
+    """Syncs data sources from the enabled source plugin."""
+    plugin = plugin_service.get_active_instance(
+        db_session=db_session, plugin_type="source", project_id=project.id
+    )
+
+    if not plugin:
+        log.debug(
+            f"Data sources not synced. No source plugin enabled. Project: {project.name}. Organization: {project.organization.name}"
+        )
+        return
+
+    log.debug(f"Getting data source information via plugin {plugin.plugin.slug}.")
+
+    sources = source_service.get_all(db_session=db_session, project_id=project.id)
+
+    for s in sources:
+        log.debug(f"Syncing data source {s}...")
+        if not s.external_id:
+            log.debug(f"Skipping data source. No external id for source {s}.")
+            continue
+        data = plugin.instance.get(external_id=s.external_id)
+
+        if data:
+            for k, v in data.items():
+                setattr(s, k, v)
+
+        db_session.commit()
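The sync loop above copies whatever fields the source plugin returns onto each ORM row with `setattr`. A self-contained sketch of that step, with a plain object standing in for the SQLAlchemy model and a stubbed plugin payload (both hypothetical):

```python
class FakeSource:
    """Hypothetical stand-in for the Source ORM model."""

    def __init__(self, external_id: str):
        self.external_id = external_id
        self.daily_volume = None
        self.size = None


def apply_plugin_data(source, data):
    """Copy each field returned by the plugin onto the source, as sync_sources does."""
    if data:
        for key, value in data.items():
            setattr(source, key, value)
    return source


source = apply_plugin_data(FakeSource("abc123"), {"daily_volume": 10, "size": 2048})
print(source.daily_volume, source.size)  # 10 2048
```

An empty or missing payload leaves the row untouched, mirroring the `if data:` guard in the task.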
diff --git a/src/dispatch/data/source/service.py b/src/dispatch/data/source/service.py
new file mode 100644
index 000000000000..eaf76f7c2ffe
--- /dev/null
+++ b/src/dispatch/data/source/service.py
@@ -0,0 +1,264 @@
+from pydantic import ValidationError
+
+from dispatch.project import service as project_service
+from dispatch.incident import service as incident_service
+from dispatch.service import service as service_service
+from dispatch.tag import service as tag_service
+from dispatch.data.query import service as query_service
+from dispatch.data.source.environment import service as environment_service
+from dispatch.data.source.data_format import service as data_format_service
+from dispatch.data.source.status import service as status_service
+from dispatch.data.source.type import service as type_service
+from dispatch.data.source.transport import service as transport_service
+
+
+from .models import Source, SourceCreate, SourceUpdate, SourceRead
+
+
+def get(*, db_session, source_id: int) -> Source | None:
+    """Gets a source by its id."""
+    return db_session.query(Source).filter(Source.id == source_id).one_or_none()
+
+
+def get_by_name(*, db_session, project_id: int, name: str) -> Source | None:
+    """Gets a source by its name."""
+    return (
+        db_session.query(Source)
+        .filter(Source.name == name)
+        .filter(Source.project_id == project_id)
+        .one_or_none()
+    )
+
+
+def get_by_name_or_raise(*, db_session, project_id, source_in: SourceRead) -> SourceRead:
+    """Returns the source specified or raises ValidationError."""
+    source = get_by_name(db_session=db_session, project_id=project_id, name=source_in.name)
+
+    if not source:
+        error = {
+            "loc": ("source",),
+            "type": "value_error",
+            "input": source_in.name,
+            "ctx": {"error": ValueError(f"Source not found: {source_in.name}")},
+        }
+        # pydantic v2 builds ValidationError via from_exception_data
+        raise ValidationError.from_exception_data("SourceRead", [error])
+
+    return source
+
+
+def get_all(*, db_session, project_id: int) -> list[Source]:
+    """Gets all sources for a project."""
+    return db_session.query(Source).filter(Source.project_id == project_id).all()
+
+
+def create(*, db_session, source_in: SourceCreate) -> Source:
+    """Creates a new source."""
+    project = project_service.get_by_name_or_raise(
+        db_session=db_session, project_in=source_in.project
+    )
+
+    source = Source(
+        **source_in.dict(
+            exclude={
+                "project",
+                "owner",
+                "tags",
+                "incidents",
+                "queries",
+                "source_environment",
+                "source_data_format",
+                "source_transport",
+                "source_status",
+                "source_type",
+            }
+        ),
+        project=project,
+    )
+
+    if source_in.owner:
+        source.owner = service_service.get_by_name_or_raise(
+            db_session=db_session, project_id=project.id, service_in=source_in.owner
+        )
+
+    tags = []
+    for t in source_in.tags:
+        tags.append(
+            tag_service.get_by_name_or_raise(db_session=db_session, tag_in=t, project_id=project.id)
+        )
+    source.tags = tags
+
+    incidents = []
+    for i in source_in.incidents:
+        incidents.append(
+            incident_service.get_by_name_or_raise(
+                db_session=db_session, incident_in=i, project_id=project.id
+            )
+        )
+    source.incidents = incidents
+
+    queries = []
+    for q in source_in.queries:
+        queries.append(
+            query_service.get_by_name_or_raise(
+                db_session=db_session, query_in=q, project_id=project.id
+            )
+        )
+    source.queries = queries
+
+    if source_in.source_environment:
+        source.source_environment = environment_service.get_by_name_or_raise(
+            db_session=db_session,
+            project_id=project.id,
+            source_environment_in=source_in.source_environment,
+        )
+
+    if source_in.source_type:
+        source.source_type = type_service.get_by_name_or_raise(
+            db_session=db_session,
+            project_id=project.id,
+            source_type_in=source_in.source_type,
+        )
+
+    if source_in.source_transport:
+        source.source_transport = transport_service.get_by_name_or_raise(
+            db_session=db_session,
+            project_id=project.id,
+            source_transport_in=source_in.source_transport,
+        )
+
+    if source_in.source_data_format:
+        source.source_data_format = data_format_service.get_by_name_or_raise(
+            db_session=db_session,
+            project_id=project.id,
+            source_data_format_in=source_in.source_data_format,
+        )
+
+    if source_in.source_status:
+        source.source_status = status_service.get_by_name_or_raise(
+            db_session=db_session,
+            project_id=project.id,
+            source_status_in=source_in.source_status,
+        )
+
+    db_session.add(source)
+    db_session.commit()
+    return source
+
+
+def get_or_create(*, db_session, source_in: SourceCreate) -> Source:
+    """Gets or creates a new source."""
+    # prefer the source id if available (SourceCreate itself does not define one)
+    if getattr(source_in, "id", None):
+        q = db_session.query(Source).filter(Source.id == source_in.id)
+    else:
+        q = db_session.query(Source).filter_by(name=source_in.name)
+
+    instance = q.first()
+    if instance:
+        return instance
+
+    return create(db_session=db_session, source_in=source_in)
+
+
+def update(*, db_session, source: Source, source_in: SourceUpdate) -> Source:
+    """Updates an existing source."""
+    source_data = source.dict()
+
+    update_data = source_in.dict(
+        exclude_unset=True,
+        exclude={
+            "project",
+            "owner",
+            "tags",
+            "incidents",
+            "queries",
+            "source_environment",
+            "source_data_format",
+            "source_transport",
+            "source_status",
+            "source_type",
+        },
+    )
+
+    if source_in.owner:
+        source.owner = service_service.get_by_name_or_raise(
+            db_session=db_session, project_id=source.project.id, service_in=source_in.owner
+        )
+
+    tags = []
+    for t in source_in.tags:
+        tags.append(
+            tag_service.get_by_name_or_raise(
+                db_session=db_session, tag_in=t, project_id=source.project.id
+            )
+        )
+
+    source.tags = tags
+
+    incidents = []
+    for i in source_in.incidents:
+        incidents.append(
+            incident_service.get_by_name_or_raise(
+                db_session=db_session, incident_in=i, project_id=source.project.id
+            )
+        )
+    source.incidents = incidents
+
+    queries = []
+    for q in source_in.queries:
+        queries.append(
+            query_service.get_by_name_or_raise(
+                db_session=db_session, query_in=q, project_id=source.project.id
+            )
+        )
+    source.queries = queries
+
+    if source_in.source_environment:
+        source.source_environment = environment_service.get_by_name_or_raise(
+            db_session=db_session,
+            project_id=source.project.id,
+            source_environment_in=source_in.source_environment,
+        )
+
+    if source_in.source_type:
+        source.source_type = type_service.get_by_name_or_raise(
+            db_session=db_session,
+            project_id=source.project.id,
+            source_type_in=source_in.source_type,
+        )
+
+    if source_in.source_transport:
+        source.source_transport = transport_service.get_by_name_or_raise(
+            db_session=db_session,
+            project_id=source.project.id,
+            source_transport_in=source_in.source_transport,
+        )
+
+    if source_in.source_data_format:
+        source.source_data_format = data_format_service.get_by_name_or_raise(
+            db_session=db_session,
+            project_id=source.project.id,
+            source_data_format_in=source_in.source_data_format,
+        )
+
+    if source_in.source_status:
+        source.source_status = status_service.get_by_name_or_raise(
+            db_session=db_session,
+            project_id=source.project.id,
+            source_status_in=source_in.source_status,
+        )
+
+    for field in source_data:
+        if field in update_data:
+            setattr(source, field, update_data[field])
+
+    db_session.commit()
+    return source
+
+
+def delete(*, db_session, source_id: int):
+    """Deletes an existing source."""
+    source = db_session.query(Source).filter(Source.id == source_id).one_or_none()
+    db_session.delete(source)
+    db_session.commit()
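The `update()` service above relies on `exclude_unset=True` so that only fields the caller actually provided overwrite the stored record. Stripped of SQLAlchemy and Pydantic, the merge reduces to this sketch (dicts stand in for the ORM model and the update schema):

```python
def partial_update(current: dict, update_data: dict) -> dict:
    """Apply only the fields present in update_data, leaving the rest intact."""
    updated = dict(current)
    for field in current:
        if field in update_data:
            updated[field] = update_data[field]
    return updated


stored = {"name": "s3-logs", "retention": 30, "delay": 5}
patch = {"retention": 90}  # fields the caller never set are simply absent
print(partial_update(stored, patch))  # {'name': 's3-logs', 'retention': 90, 'delay': 5}
```

Because the loop iterates over the stored record's fields, keys that do not exist on the model are silently ignored, just as `setattr` is only called for known fields in the service.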
diff --git a/tests/test_cli.py b/src/dispatch/data/source/status/__init__.py
similarity index 100%
rename from tests/test_cli.py
rename to src/dispatch/data/source/status/__init__.py
diff --git a/src/dispatch/data/source/status/models.py b/src/dispatch/data/source/status/models.py
new file mode 100644
index 000000000000..4bd72da942ca
--- /dev/null
+++ b/src/dispatch/data/source/status/models.py
@@ -0,0 +1,44 @@
+from pydantic import Field
+
+from sqlalchemy import (
+    Column,
+    Integer,
+    String,
+)
+from sqlalchemy import UniqueConstraint
+
+from sqlalchemy_utils import TSVectorType
+
+from dispatch.database.core import Base
+from dispatch.models import DispatchBase, ProjectMixin, Pagination, PrimaryKey
+from dispatch.project.models import ProjectRead
+
+
+class SourceStatus(Base, ProjectMixin):
+    __table_args__ = (UniqueConstraint("name", "project_id"),)
+    id = Column(Integer, primary_key=True)
+    name = Column(String)
+    description = Column(String)
+    search_vector = Column(TSVectorType("name", regconfig="pg_catalog.simple"))
+
+
+class SourceStatusBase(DispatchBase):
+    name: str | None = Field(None, nullable=False)
+    description: str | None = None
+
+
+class SourceStatusRead(SourceStatusBase):
+    id: PrimaryKey
+    project: ProjectRead
+
+
+class SourceStatusCreate(SourceStatusBase):
+    project: ProjectRead
+
+
+class SourceStatusUpdate(SourceStatusBase):
+    id: PrimaryKey
+
+
+class SourceStatusPagination(Pagination):
+    items: list[SourceStatusRead]
diff --git a/src/dispatch/data/source/status/service.py b/src/dispatch/data/source/status/service.py
new file mode 100644
index 000000000000..114271ea352f
--- /dev/null
+++ b/src/dispatch/data/source/status/service.py
@@ -0,0 +1,105 @@
+from pydantic import ValidationError
+
+from dispatch.project import service as project_service
+
+from .models import (
+    SourceStatus,
+    SourceStatusCreate,
+    SourceStatusUpdate,
+    SourceStatusRead,
+)
+
+
+def get(*, db_session, source_status_id: int) -> SourceStatus | None:
+    """Gets a source status by its id."""
+    return db_session.query(SourceStatus).filter(SourceStatus.id == source_status_id).one_or_none()
+
+
+def get_by_name(*, db_session, project_id: int, name: str) -> SourceStatus | None:
+    """Gets a source status by its name."""
+    return (
+        db_session.query(SourceStatus)
+        .filter(SourceStatus.name == name)
+        .filter(SourceStatus.project_id == project_id)
+        .one_or_none()
+    )
+
+
+def get_by_name_or_raise(
+    *, db_session, project_id, source_status_in: SourceStatusRead
+) -> SourceStatusRead:
+    """Returns the source status specified or raises ValidationError."""
+    status = get_by_name(db_session=db_session, project_id=project_id, name=source_status_in.name)
+
+    if not status:
+        error = {
+            "loc": ("status",),
+            "type": "value_error",
+            "input": source_status_in.name,
+            "ctx": {"error": ValueError(f"SourceStatus not found: {source_status_in.name}")},
+        }
+        # pydantic v2 builds ValidationError via from_exception_data
+        raise ValidationError.from_exception_data("SourceStatusRead", [error])
+
+    return status
+
+
+def get_all(*, db_session, project_id: int) -> list[SourceStatus]:
+    """Gets all source statuses for a project."""
+    return db_session.query(SourceStatus).filter(SourceStatus.project_id == project_id).all()
+
+
+def create(*, db_session, source_status_in: SourceStatusCreate) -> SourceStatus:
+    """Creates a new status."""
+    project = project_service.get_by_name_or_raise(
+        db_session=db_session, project_in=source_status_in.project
+    )
+    source_status = SourceStatus(**source_status_in.dict(exclude={"project"}), project=project)
+    db_session.add(source_status)
+    db_session.commit()
+    return source_status
+
+
+def get_or_create(*, db_session, source_status_in: SourceStatusCreate) -> SourceStatus:
+    """Gets or creates a new status."""
+    # prefer the status id if available (SourceStatusCreate itself does not define one)
+    if getattr(source_status_in, "id", None):
+        q = db_session.query(SourceStatus).filter(SourceStatus.id == source_status_in.id)
+    else:
+        q = db_session.query(SourceStatus).filter_by(name=source_status_in.name)
+
+    instance = q.first()
+    if instance:
+        return instance
+
+    return create(
+        db_session=db_session,
+        source_status_in=source_status_in,
+    )
+
+
+def update(
+    *,
+    db_session,
+    source_status: SourceStatus,
+    source_status_in: SourceStatusUpdate,
+) -> SourceStatus:
+    """Updates an existing status."""
+    source_status_data = source_status.dict()
+    update_data = source_status_in.dict(exclude_unset=True)
+
+    for field in source_status_data:
+        if field in update_data:
+            setattr(source_status, field, update_data[field])
+
+    db_session.commit()
+    return source_status
+
+
+def delete(*, db_session, source_status_id: int):
+    """Deletes an existing status."""
+    source_status = (
+        db_session.query(SourceStatus).filter(SourceStatus.id == source_status_id).one_or_none()
+    )
+    db_session.delete(source_status)
+    db_session.commit()
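`get_or_create()` above prefers an id lookup when one is supplied and falls back to a name lookup before inserting a new row. The same control flow with an in-memory list standing in for the session and query (names are illustrative):

```python
def get_or_create(store: list, item: dict) -> dict:
    """Return the existing record matching item by id (preferred) or name; otherwise insert item."""
    if item.get("id") is not None:
        matches = [r for r in store if r["id"] == item["id"]]
    else:
        matches = [r for r in store if r["name"] == item["name"]]
    if matches:
        return matches[0]  # q.first() equivalent
    store.append(item)
    return item


store = [{"id": 1, "name": "active"}]
print(get_or_create(store, {"id": None, "name": "active"}))  # {'id': 1, 'name': 'active'}
```

Note that, as in the service, the name fallback returns the existing record rather than inserting a duplicate.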
diff --git a/src/dispatch/data/source/status/views.py b/src/dispatch/data/source/status/views.py
new file mode 100644
index 000000000000..ce5620b3ffb8
--- /dev/null
+++ b/src/dispatch/data/source/status/views.py
@@ -0,0 +1,69 @@
+from fastapi import APIRouter, HTTPException, status
+
+
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.models import PrimaryKey
+
+from .models import (
+    SourceStatusCreate,
+    SourceStatusPagination,
+    SourceStatusRead,
+    SourceStatusUpdate,
+)
+from .service import create, delete, get, update
+
+
+router = APIRouter()
+
+
+@router.get("", response_model=SourceStatusPagination)
+def get_source_statuses(common: CommonParameters):
+    """Get all source statuses, or only those matching a given search term."""
+    return search_filter_sort_paginate(model="SourceStatus", **common)
+
+
+@router.get("/{source_status_id}", response_model=SourceStatusRead)
+def get_source_status(db_session: DbSession, source_status_id: PrimaryKey):
+    """Given its unique id, retrieve details about a single source status."""
+    source_status = get(db_session=db_session, source_status_id=source_status_id)
+    if not source_status:
+        raise HTTPException(
+            status_code=status.HTTP_404_NOT_FOUND,
+            detail=[{"msg": "The requested source status does not exist."}],
+        )
+    return source_status
+
+
+@router.post("", response_model=SourceStatusRead)
+def create_source_status(db_session: DbSession, source_status_in: SourceStatusCreate):
+    """Creates a new source status."""
+    return create(db_session=db_session, source_status_in=source_status_in)
+
+
+@router.put("/{source_status_id}", response_model=SourceStatusRead)
+def update_source_status(
+    db_session: DbSession,
+    source_status_id: PrimaryKey,
+    source_status_in: SourceStatusUpdate,
+):
+    """Updates a source status."""
+    source_status = get(db_session=db_session, source_status_id=source_status_id)
+    if not source_status:
+        raise HTTPException(
+            status_code=status.HTTP_404_NOT_FOUND,
+            detail=[{"msg": "A source status with this id does not exist."}],
+        )
+    return update(db_session=db_session, source_status=source_status, source_status_in=source_status_in)
+
+
+@router.delete("/{source_status_id}", response_model=None)
+def delete_source_status(db_session: DbSession, source_status_id: PrimaryKey):
+    """Deletes a source status, returning only an HTTP 200 OK if successful."""
+    source_status = get(db_session=db_session, source_status_id=source_status_id)
+    if not source_status:
+        raise HTTPException(
+            status_code=status.HTTP_404_NOT_FOUND,
+            detail=[{"msg": "A source status with this id does not exist."}],
+        )
+    delete(db_session=db_session, source_status_id=source_status_id)
diff --git a/tests/utils/utils.py b/src/dispatch/data/source/transport/__init__.py
similarity index 100%
rename from tests/utils/utils.py
rename to src/dispatch/data/source/transport/__init__.py
diff --git a/src/dispatch/data/source/transport/models.py b/src/dispatch/data/source/transport/models.py
new file mode 100644
index 000000000000..7d675ae97127
--- /dev/null
+++ b/src/dispatch/data/source/transport/models.py
@@ -0,0 +1,44 @@
+from pydantic import Field
+
+from sqlalchemy import (
+    Column,
+    Integer,
+    String,
+)
+from sqlalchemy import UniqueConstraint
+
+from sqlalchemy_utils import TSVectorType
+
+from dispatch.database.core import Base
+from dispatch.models import DispatchBase, ProjectMixin, Pagination, PrimaryKey
+from dispatch.project.models import ProjectRead
+
+
+class SourceTransport(Base, ProjectMixin):
+    __table_args__ = (UniqueConstraint("name", "project_id"),)
+    id = Column(Integer, primary_key=True)
+    name = Column(String)
+    description = Column(String)
+    search_vector = Column(TSVectorType("name", regconfig="pg_catalog.simple"))
+
+
+class SourceTransportBase(DispatchBase):
+    name: str | None = Field(None, nullable=False)
+    description: str | None = None
+
+
+class SourceTransportRead(SourceTransportBase):
+    id: PrimaryKey
+    project: ProjectRead
+
+
+class SourceTransportCreate(SourceTransportBase):
+    project: ProjectRead
+
+
+class SourceTransportUpdate(SourceTransportBase):
+    id: PrimaryKey
+
+
+class SourceTransportPagination(Pagination):
+    items: list[SourceTransportRead]
diff --git a/src/dispatch/data/source/transport/service.py b/src/dispatch/data/source/transport/service.py
new file mode 100644
index 000000000000..ba5106ccb9fb
--- /dev/null
+++ b/src/dispatch/data/source/transport/service.py
@@ -0,0 +1,115 @@
+from pydantic import ValidationError
+
+from dispatch.project import service as project_service
+
+from .models import (
+    SourceTransport,
+    SourceTransportCreate,
+    SourceTransportUpdate,
+    SourceTransportRead,
+)
+
+
+def get(*, db_session, source_transport_id: int) -> SourceTransport | None:
+    """Gets a source transport by its id."""
+    return (
+        db_session.query(SourceTransport)
+        .filter(SourceTransport.id == source_transport_id)
+        .one_or_none()
+    )
+
+
+def get_by_name(*, db_session, project_id: int, name: str) -> SourceTransport | None:
+    """Gets a source transport by its name."""
+    return (
+        db_session.query(SourceTransport)
+        .filter(SourceTransport.name == name)
+        .filter(SourceTransport.project_id == project_id)
+        .one_or_none()
+    )
+
+
+def get_by_name_or_raise(
+    *, db_session, project_id, source_transport_in: SourceTransportRead
+) -> SourceTransportRead:
+    """Returns the source transport specified or raises ValidationError."""
+    source = get_by_name(
+        db_session=db_session, project_id=project_id, name=source_transport_in.name
+    )
+
+    if not source:
+        error = {
+            "loc": ("source",),
+            "type": "value_error",
+            "input": source_transport_in.name,
+            "ctx": {"error": ValueError(f"SourceTransport not found: {source_transport_in.name}")},
+        }
+        # pydantic v2 builds ValidationError via from_exception_data
+        raise ValidationError.from_exception_data("SourceTransportRead", [error])
+
+    return source
+
+
+def get_all(*, db_session, project_id: int) -> list[SourceTransport]:
+    """Gets all source transports for a project."""
+    return db_session.query(SourceTransport).filter(SourceTransport.project_id == project_id).all()
+
+
+def create(*, db_session, source_transport_in: SourceTransportCreate) -> SourceTransport:
+    """Creates a new source transport."""
+    project = project_service.get_by_name_or_raise(
+        db_session=db_session, project_in=source_transport_in.project
+    )
+    source_transport = SourceTransport(
+        **source_transport_in.dict(exclude={"project"}), project=project
+    )
+    db_session.add(source_transport)
+    db_session.commit()
+    return source_transport
+
+
+def get_or_create(*, db_session, source_transport_in: SourceTransportCreate) -> SourceTransport:
+    """Gets or creates a new source transport."""
+    # prefer the source transport id if available (SourceTransportCreate itself does not define one)
+    if getattr(source_transport_in, "id", None):
+        q = db_session.query(SourceTransport).filter(SourceTransport.id == source_transport_in.id)
+    else:
+        q = db_session.query(SourceTransport).filter_by(name=source_transport_in.name)
+
+    instance = q.first()
+    if instance:
+        return instance
+
+    return create(
+        db_session=db_session,
+        source_transport_in=source_transport_in,
+    )
+
+
+def update(
+    *,
+    db_session,
+    source_transport: SourceTransport,
+    source_transport_in: SourceTransportUpdate,
+) -> SourceTransport:
+    """Updates an existing source transport."""
+    source_transport_data = source_transport.dict()
+    update_data = source_transport_in.dict(exclude_unset=True)
+
+    for field in source_transport_data:
+        if field in update_data:
+            setattr(source_transport, field, update_data[field])
+
+    db_session.commit()
+    return source_transport
+
+
+def delete(*, db_session, source_transport_id: int):
+    """Deletes an existing source transport."""
+    source_transport = (
+        db_session.query(SourceTransport)
+        .filter(SourceTransport.id == source_transport_id)
+        .one_or_none()
+    )
+    db_session.delete(source_transport)
+    db_session.commit()
diff --git a/src/dispatch/data/source/transport/views.py b/src/dispatch/data/source/transport/views.py
new file mode 100644
index 000000000000..f3664619e4bf
--- /dev/null
+++ b/src/dispatch/data/source/transport/views.py
@@ -0,0 +1,71 @@
+from fastapi import APIRouter, HTTPException, status
+
+
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.models import PrimaryKey
+
+from .models import (
+    SourceTransportCreate,
+    SourceTransportPagination,
+    SourceTransportRead,
+    SourceTransportUpdate,
+)
+from .service import create, delete, get, update
+
+
+router = APIRouter()
+
+
+@router.get("", response_model=SourceTransportPagination)
+def get_source_transports(common: CommonParameters):
+    """Get all source transports, or only those matching a given search term."""
+    return search_filter_sort_paginate(model="SourceTransport", **common)
+
+
+@router.get("/{source_transport_id}", response_model=SourceTransportRead)
+def get_source_transport(db_session: DbSession, source_transport_id: PrimaryKey):
+    """Given its unique id, retrieve details about a single source transport."""
+    transport = get(db_session=db_session, source_transport_id=source_transport_id)
+    if not transport:
+        raise HTTPException(
+            status_code=status.HTTP_404_NOT_FOUND,
+            detail=[{"msg": "The requested source transport does not exist."}],
+        )
+    return transport
+
+
+@router.post("", response_model=SourceTransportRead)
+def create_source_transport(db_session: DbSession, source_transport_in: SourceTransportCreate):
+    """Creates a new source transport."""
+    return create(db_session=db_session, source_transport_in=source_transport_in)
+
+
+@router.put("/{source_transport_id}", response_model=SourceTransportRead)
+def update_source_transport(
+    db_session: DbSession,
+    source_transport_id: PrimaryKey,
+    source_transport_in: SourceTransportUpdate,
+):
+    """Updates a source transport."""
+    transport = get(db_session=db_session, source_transport_id=source_transport_id)
+    if not transport:
+        raise HTTPException(
+            status_code=status.HTTP_404_NOT_FOUND,
+            detail=[{"msg": "A source transport with this id does not exist."}],
+        )
+    return update(
+        db_session=db_session, source_transport=transport, source_transport_in=source_transport_in
+    )
+
+
+@router.delete("/{source_transport_id}", response_model=None)
+def delete_source_transport(db_session: DbSession, source_transport_id: PrimaryKey):
+    """Deletes a source transport, returning only an HTTP 200 OK if successful."""
+    transport = get(db_session=db_session, source_transport_id=source_transport_id)
+    if not transport:
+        raise HTTPException(
+            status_code=status.HTTP_404_NOT_FOUND,
+            detail=[{"msg": "A source transport with this id does not exist."}],
+        )
+    delete(db_session=db_session, source_transport_id=source_transport_id)
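Every handler in the views files above follows the same lookup-then-404 guard before acting on a record. Stripped of FastAPI, the pattern is just the following, where `RecordNotFound` is a hypothetical stand-in for `HTTPException(404)`:

```python
class RecordNotFound(Exception):
    """Hypothetical stand-in for fastapi's HTTPException with a 404 status."""


def get_or_404(records: dict, record_id: int) -> dict:
    """Return the record for record_id, or raise if it does not exist."""
    record = records.get(record_id)
    if not record:
        raise RecordNotFound(f"A record with this id does not exist: {record_id}")
    return record


records = {1: {"id": 1, "name": "kafka"}}
print(get_or_404(records, 1)["name"])  # kafka
```

Centralizing the guard this way keeps the create handlers, which have no id to check, free of it, while get/update/delete all share identical failure behavior.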
diff --git a/src/dispatch/data/source/type/__init__.py b/src/dispatch/data/source/type/__init__.py
new file mode 100644
index 000000000000..e69de29bb2d1
diff --git a/src/dispatch/data/source/type/models.py b/src/dispatch/data/source/type/models.py
new file mode 100644
index 000000000000..c69f7e3362ca
--- /dev/null
+++ b/src/dispatch/data/source/type/models.py
@@ -0,0 +1,44 @@
+from pydantic import Field
+
+from sqlalchemy import (
+    Column,
+    Integer,
+    String,
+)
+from sqlalchemy import UniqueConstraint
+
+from sqlalchemy_utils import TSVectorType
+
+from dispatch.database.core import Base
+from dispatch.models import DispatchBase, ProjectMixin, Pagination, PrimaryKey
+from dispatch.project.models import ProjectRead
+
+
+class SourceType(Base, ProjectMixin):
+    __table_args__ = (UniqueConstraint("name", "project_id"),)
+    id = Column(Integer, primary_key=True)
+    name = Column(String)
+    description = Column(String)
+    search_vector = Column(TSVectorType("name", regconfig="pg_catalog.simple"))
+
+
+class SourceTypeBase(DispatchBase):
+    name: str | None = Field(None, nullable=False)
+    description: str | None = None
+
+
+class SourceTypeRead(SourceTypeBase):
+    id: PrimaryKey
+    project: ProjectRead
+
+
+class SourceTypeCreate(SourceTypeBase):
+    project: ProjectRead
+
+
+class SourceTypeUpdate(SourceTypeBase):
+    id: PrimaryKey
+
+
+class SourceTypePagination(Pagination):
+    items: list[SourceTypeRead]
diff --git a/src/dispatch/data/source/type/service.py b/src/dispatch/data/source/type/service.py
new file mode 100644
index 000000000000..34e9bfda7054
--- /dev/null
+++ b/src/dispatch/data/source/type/service.py
@@ -0,0 +1,103 @@
+from pydantic import ValidationError
+
+from dispatch.project import service as project_service
+
+from .models import (
+ SourceType,
+ SourceTypeCreate,
+ SourceTypeUpdate,
+ SourceTypeRead,
+)
+
+
+def get(*, db_session, source_type_id: int) -> SourceType | None:
+    """Gets a source type by its id."""
+ return db_session.query(SourceType).filter(SourceType.id == source_type_id).one_or_none()
+
+
+def get_by_name(*, db_session, project_id: int, name: str) -> SourceType | None:
+    """Gets a source type by its name."""
+ return (
+ db_session.query(SourceType)
+ .filter(SourceType.name == name)
+ .filter(SourceType.project_id == project_id)
+ .one_or_none()
+ )
+
+
+def get_by_name_or_raise(
+    *, db_session, project_id, source_type_in: SourceTypeRead
+) -> SourceTypeRead:
+    """Returns the source type specified or raises a ValidationError."""
+ source = get_by_name(db_session=db_session, project_id=project_id, name=source_type_in.name)
+
+ if not source:
+ raise ValidationError([
+ {
+ "loc": ("source",),
+ "msg": f"SourceType not found: {source_type_in.name}",
+ "type": "value_error",
+ "input": source_type_in.name,
+ }
+ ])
+
+ return source
+
+
+def get_all(*, db_session, project_id: int) -> list[SourceType | None]:
+ """Gets all source types."""
+ return db_session.query(SourceType).filter(SourceType.project_id == project_id)
+
+
+def create(*, db_session, source_type_in: SourceTypeCreate) -> SourceType:
+ """Creates a new source type."""
+ project = project_service.get_by_name_or_raise(
+ db_session=db_session, project_in=source_type_in.project
+ )
+ source_type = SourceType(**source_type_in.dict(exclude={"project"}), project=project)
+ db_session.add(source_type)
+ db_session.commit()
+ return source_type
+
+
+def get_or_create(*, db_session, source_type_in: SourceTypeCreate) -> SourceType:
+ """Gets or creates a new source type."""
+ # prefer the source id if available
+ if source_type_in.id:
+ q = db_session.query(SourceType).filter(SourceType.id == source_type_in.id)
+ else:
+ q = db_session.query(SourceType).filter_by(name=source_type_in.name)
+
+ instance = q.first()
+ if instance:
+ return instance
+
+ return create(
+ db_session=db_session,
+ source_type_in=source_type_in,
+ )
+
+
+def update(
+ *,
+ db_session,
+ source_type: SourceType,
+ source_type_in: SourceTypeUpdate,
+) -> SourceType:
+    """Updates an existing source type."""
+ source_type_data = source_type.dict()
+    update_data = source_type_in.dict(exclude_unset=True)
+
+ for field in source_type_data:
+ if field in update_data:
+ setattr(source_type, field, update_data[field])
+
+ db_session.commit()
+ return source_type
+
+
+def delete(*, db_session, source_type_id: int):
+    """Deletes an existing source type."""
+ source_type = db_session.query(SourceType).filter(SourceType.id == source_type_id).one_or_none()
+ db_session.delete(source_type)
+ db_session.commit()
diff --git a/src/dispatch/data/source/type/views.py b/src/dispatch/data/source/type/views.py
new file mode 100644
index 000000000000..d9b57205d316
--- /dev/null
+++ b/src/dispatch/data/source/type/views.py
@@ -0,0 +1,69 @@
+from fastapi import APIRouter, HTTPException, status
+
+
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.models import PrimaryKey
+
+from .models import (
+ SourceTypeCreate,
+ SourceTypePagination,
+ SourceTypeRead,
+ SourceTypeUpdate,
+)
+from .service import create, delete, get, update
+
+
+router = APIRouter()
+
+
+@router.get("", response_model=SourceTypePagination)
+def get_source_types(common: CommonParameters):
+ """Get all source types, or only those matching a given search term."""
+ return search_filter_sort_paginate(model="SourceType", **common)
+
+
+@router.get("/{source_type_id}", response_model=SourceTypeRead)
+def get_source_type(db_session: DbSession, source_type_id: PrimaryKey):
+ """Given its unique id, retrieve details about a single source type."""
+ source_type = get(db_session=db_session, source_type_id=source_type_id)
+ if not source_type:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "The requested source type does not exist."}],
+ )
+ return source_type
+
+
+@router.post("", response_model=SourceTypeRead)
+def create_source_type(db_session: DbSession, source_type_in: SourceTypeCreate):
+ """Creates a new source type."""
+ return create(db_session=db_session, source_type_in=source_type_in)
+
+
+@router.put("/{source_type_id}", response_model=SourceTypeRead)
+def update_source_type(
+ db_session: DbSession,
+ source_type_id: PrimaryKey,
+ source_type_in: SourceTypeUpdate,
+):
+ """Updates a source type."""
+ source_type = get(db_session=db_session, source_type_id=source_type_id)
+ if not source_type:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+            detail=[{"msg": "A source type with this id does not exist."}],
+ )
+ return update(db_session=db_session, source_type=source_type, source_type_in=source_type_in)
+
+
+@router.delete("/{source_type_id}", response_model=None)
+def delete_source_type(db_session: DbSession, source_type_id: PrimaryKey):
+ """Deletes a source type, returning only an HTTP 200 OK if successful."""
+ source_type = get(db_session=db_session, source_type_id=source_type_id)
+ if not source_type:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+            detail=[{"msg": "A source type with this id does not exist."}],
+ )
+ delete(db_session=db_session, source_type_id=source_type_id)
diff --git a/src/dispatch/data/source/views.py b/src/dispatch/data/source/views.py
new file mode 100644
index 000000000000..c9e7e86dcc13
--- /dev/null
+++ b/src/dispatch/data/source/views.py
@@ -0,0 +1,63 @@
+from fastapi import APIRouter, HTTPException, status
+
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.models import PrimaryKey
+
+from .models import (
+ SourceCreate,
+ SourcePagination,
+ SourceRead,
+ SourceUpdate,
+)
+from .service import create, delete, get, update
+
+router = APIRouter()
+
+
+@router.get("", response_model=SourcePagination)
+def get_sources(common: CommonParameters):
+ """Get all sources, or only those matching a given search term."""
+ return search_filter_sort_paginate(model="Source", **common)
+
+
+@router.get("/{source_id}", response_model=SourceRead)
+def get_source(db_session: DbSession, source_id: PrimaryKey):
+ """Given its unique id, retrieve details about a single source."""
+ source = get(db_session=db_session, source_id=source_id)
+ if not source:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "The requested source does not exist."}],
+ )
+ return source
+
+
+@router.post("", response_model=SourceRead)
+def create_source(db_session: DbSession, source_in: SourceCreate):
+ """Creates a new source."""
+ return create(db_session=db_session, source_in=source_in)
+
+
+@router.put("/{source_id}", response_model=SourceRead)
+def update_source(db_session: DbSession, source_id: PrimaryKey, source_in: SourceUpdate):
+ """Updates a source."""
+ source = get(db_session=db_session, source_id=source_id)
+ if not source:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A source with this id does not exist."}],
+ )
+ return update(db_session=db_session, source=source, source_in=source_in)
+
+
+@router.delete("/{source_id}", response_model=None)
+def delete_source(db_session: DbSession, source_id: PrimaryKey):
+ """Deletes a source, returning only an HTTP 200 OK if successful."""
+ source = get(db_session=db_session, source_id=source_id)
+ if not source:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A source with this id does not exist."}],
+ )
+ delete(db_session=db_session, source_id=source_id)
diff --git a/src/dispatch/database.py b/src/dispatch/database.py
deleted file mode 100644
index 937f1f4eedf8..000000000000
--- a/src/dispatch/database.py
+++ /dev/null
@@ -1,163 +0,0 @@
-import re
-from typing import Any, List
-
-from sqlalchemy import create_engine
-from sqlalchemy.ext.declarative import declarative_base, declared_attr
-from sqlalchemy.orm import Query, sessionmaker
-from sqlalchemy_filters import apply_pagination, apply_sort, apply_filters
-from sqlalchemy_searchable import make_searchable
-from sqlalchemy_searchable import search as search_db
-from starlette.requests import Request
-
-from dispatch.common.utils.composite_search import CompositeSearch
-
-from .config import SQLALCHEMY_DATABASE_URI
-
-engine = create_engine(str(SQLALCHEMY_DATABASE_URI))
-SessionLocal = sessionmaker(bind=engine)
-
-
-def resolve_table_name(name):
- """Resolves table names to their mapped names."""
- names = re.split("(?=[A-Z])", name) # noqa
- return "_".join([x.lower() for x in names if x])
-
-
-class CustomBase:
- @declared_attr
- def __tablename__(self):
- return resolve_table_name(self.__name__)
-
-
-Base = declarative_base(cls=CustomBase)
-
-make_searchable(Base.metadata)
-
-
-def get_db(request: Request):
- return request.state.db
-
-
-def get_model_name_by_tablename(table_fullname: str) -> str:
- """Returns the model name of a given table."""
- return get_class_by_tablename(table_fullname=table_fullname).__name__
-
-
-def get_class_by_tablename(table_fullname: str) -> Any:
- """Return class reference mapped to table."""
- mapped_name = resolve_table_name(table_fullname)
- for c in Base._decl_class_registry.values():
- if hasattr(c, "__table__") and c.__table__.fullname == mapped_name:
- return c
- raise Exception(f"Incorrect tablename '{mapped_name}'. Check the name of your model.")
-
-
-def paginate(query: Query, page: int, items_per_page: int):
- # Never pass a negative OFFSET value to SQL.
- offset_adj = 0 if page <= 0 else page - 1
- items = query.limit(items_per_page).offset(offset_adj * items_per_page).all()
- total = query.order_by(None).count()
- return items, total
-
-
-def composite_search(*, db_session, query_str: str, models: List[Base]):
- """Perform a multi-table search based on the supplied query."""
- s = CompositeSearch(db_session, models)
- q = s.build_query(query_str, sort=True)
- return s.search(query=q)
-
-
-def search(*, db_session, query_str: str, model: str):
- """Perform a search based on the query."""
- q = db_session.query(get_class_by_tablename(model))
- return search_db(q, query_str, sort=True)
-
-
-def create_filter_spec(model, fields, ops, values):
- """Creates a filter spec."""
- filter_spec = []
-
- if fields and ops and values:
- for field, op, value in zip(fields, ops, values):
- # we have a complex field, we may need to join
- if "." in field:
- complex_model, complex_field = field.split(".")
- filter_spec.append(
- {
- "model": get_model_name_by_tablename(complex_model),
- "field": complex_field,
- "op": op,
- "value": value,
- }
- )
- else:
- filter_spec.append({"model": model, "field": field, "op": op, "value": value})
- # NOTE we default to AND filters
- if filter_spec:
- return {"and": filter_spec}
- return filter_spec
-
-
-def create_sort_spec(model, sort_by, descending):
- """Creates sort_spec."""
- sort_spec = []
- if sort_by and descending:
- for field, direction in zip(sort_by, descending):
- direction = "desc" if direction else "asc"
-
- # we have a complex field, we may need to join
- if "." in field:
- complex_model, complex_field = field.split(".")
-
- sort_spec.append(
- {
- "model": get_model_name_by_tablename(complex_model),
- "field": complex_field,
- "direction": direction,
- }
- )
- else:
- sort_spec.append({"model": model, "field": field, "direction": direction})
- return sort_spec
-
-
-def get_all(*, db_session, model):
- """Fetches a query object based on the model class name."""
- return db_session.query(get_class_by_tablename(model))
-
-
-def search_filter_sort_paginate(
- db_session,
- model,
- query_str: str = None,
- page: int = 1,
- items_per_page: int = 5,
- sort_by: List[str] = None,
- descending: List[bool] = None,
- fields: List[str] = None,
- ops: List[str] = None,
- values: List[str] = None,
-):
- """Common functionality for searching, filtering and sorting"""
- if query_str:
- query = search(db_session=db_session, query_str=query_str, model=model)
- else:
- query = get_all(db_session=db_session, model=model)
-
- filter_spec = create_filter_spec(model, fields, ops, values)
- query = apply_filters(query, filter_spec)
-
- sort_spec = create_sort_spec(model, sort_by, descending)
- query = apply_sort(query, sort_spec)
-
- if items_per_page == -1:
- items_per_page = None
-
- query, pagination = apply_pagination(query, page_number=page, page_size=items_per_page)
-
- return {
- "items": query.all(),
- "itemsPerPage": pagination.page_size,
- "page": pagination.page_number,
- "total": pagination.total_results,
- }
diff --git a/src/dispatch/database/__init__.py b/src/dispatch/database/__init__.py
new file mode 100644
index 000000000000..e69de29bb2d1
diff --git a/src/dispatch/database/core.py b/src/dispatch/database/core.py
new file mode 100644
index 000000000000..a68726db472b
--- /dev/null
+++ b/src/dispatch/database/core.py
@@ -0,0 +1,272 @@
+"""
+.. module: dispatch.database.core
+ :platform: Unix
+ :copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
+ :license: Apache, see LICENSE for more details.
+"""
+
+import functools
+import re
+from contextlib import contextmanager
+
+from fastapi import Depends
+from pydantic import BaseModel, ValidationError
+from sqlalchemy import create_engine, inspect
+from sqlalchemy.engine.url import make_url
+from sqlalchemy.orm import Session, object_session, sessionmaker, DeclarativeBase, declared_attr
+from sqlalchemy.sql.expression import true
+from sqlalchemy_utils import get_mapper
+from starlette.requests import Request
+from typing import Annotated, Any
+from dispatch import config
+from dispatch.search.fulltext import make_searchable
+from dispatch.database.logging import SessionTracker
+
+
+def create_db_engine(connection_string: str):
+ """Create a database engine with proper timeout settings.
+
+ Args:
+ connection_string: Database connection string
+ """
+ url = make_url(connection_string)
+
+ # Use existing configuration values with fallbacks
+ timeout_kwargs = {
+ # Connection timeout - how long to wait for a connection from the pool
+ "pool_timeout": config.DATABASE_ENGINE_POOL_TIMEOUT,
+ # Recycle connections after this many seconds
+ "pool_recycle": config.DATABASE_ENGINE_POOL_RECYCLE,
+ # Maximum number of connections to keep in the pool
+ "pool_size": config.DATABASE_ENGINE_POOL_SIZE,
+ # Maximum overflow connections allowed beyond pool_size
+ "max_overflow": config.DATABASE_ENGINE_MAX_OVERFLOW,
+ # Connection pre-ping to verify connection is still alive
+ "pool_pre_ping": config.DATABASE_ENGINE_POOL_PING,
+ }
+ return create_engine(url, **timeout_kwargs)
+
+
+# Create the default engine with standard timeout
+engine = create_db_engine(
+ config.SQLALCHEMY_DATABASE_URI,
+)
+
+# Enable query timing logging
+#
+# Set up logging for query debugging
+# logger = logging.getLogger(__name__)
+#
+# @event.listens_for(Engine, "before_cursor_execute")
+# def before_cursor_execute(conn, cursor, statement, parameters, context, executemany):
+# conn.info.setdefault("query_start_time", []).append(time.time())
+# logger.debug("Start Query: %s", statement)
+
+# @event.listens_for(Engine, "after_cursor_execute")
+# def after_cursor_execute(conn, cursor, statement, parameters, context, executemany):
+# total = time.time() - conn.info["query_start_time"].pop(-1)
+# logger.debug("Query Complete!")
+# logger.debug("Total Time: %f", total)
+# # Log queries that take more than 1 second as warnings
+# if total > 1.0:
+# logger.warning("Slow Query (%.2fs): %s", total, statement)
+
+
+SessionLocal = sessionmaker(bind=engine)
+
+
+def resolve_table_name(name):
+ """Resolves table names to their mapped names."""
+ names = re.split("(?=[A-Z])", name) # noqa
+ return "_".join([x.lower() for x in names if x])
+
+
+raise_attribute_error = object()
+
+
+def resolve_attr(obj, attr, default=None):
+ """Attempts to access attr via dotted notation, returns none if attr does not exist."""
+ try:
+ return functools.reduce(getattr, attr.split("."), obj)
+ except AttributeError:
+ return default
+
+
+class Base(DeclarativeBase):
+ """Base class for all SQLAlchemy models."""
+ __repr_attrs__ = []
+ __repr_max_length__ = 15
+
+ @declared_attr.directive
+ def __tablename__(cls):
+ return resolve_table_name(cls.__name__)
+
+ def dict(self):
+ """Returns a dict representation of a model."""
+ return {c.name: getattr(self, c.name) for c in self.__table__.columns}
+
+ @property
+ def _id_str(self):
+ ids = inspect(self).identity
+ if ids:
+ return "-".join([str(x) for x in ids]) if len(ids) > 1 else str(ids[0])
+ else:
+ return "None"
+
+ @property
+ def _repr_attrs_str(self):
+ max_length = self.__repr_max_length__
+
+ values = []
+ single = len(self.__repr_attrs__) == 1
+ for key in self.__repr_attrs__:
+ if not hasattr(self, key):
+ raise KeyError(
+ "{} has incorrect attribute '{}' in __repr__attrs__".format(self.__class__, key)
+ )
+ value = getattr(self, key)
+ wrap_in_quote = isinstance(value, str)
+
+ value = str(value)
+ if len(value) > max_length:
+ value = value[:max_length] + "..."
+
+ if wrap_in_quote:
+ value = "'{}'".format(value)
+ values.append(value if single else "{}:{}".format(key, value))
+
+ return " ".join(values)
+
+ def __repr__(self):
+ # get id like '#123'
+ id_str = ("#" + self._id_str) if self._id_str else ""
+ # join class name, id and repr_attrs
+ return "<{} {}{}>".format(
+ self.__class__.__name__,
+ id_str,
+ " " + self._repr_attrs_str if self._repr_attrs_str else "",
+ )
+make_searchable(Base.metadata)
+
+
+def get_db(request: Request) -> Session:
+ """Get database session from request state."""
+ session = request.state.db
+ if not hasattr(session, "_dispatch_session_id"):
+ session._dispatch_session_id = SessionTracker.track_session(
+ session, context="fastapi_request"
+ )
+ return session
+
+
+DbSession = Annotated[Session, Depends(get_db)]
+
+
+def get_model_name_by_tablename(table_fullname: str) -> str:
+ """Returns the model name of a given table."""
+ return get_class_by_tablename(table_fullname=table_fullname).__name__
+
+
+def get_class_by_tablename(table_fullname: str) -> Any:
+ """Return class reference mapped to table."""
+
+ def _find_class(name):
+ for mapper in Base.registry.mappers:
+ cls = mapper.class_
+ if hasattr(cls, "__table__"):
+ if cls.__table__.fullname.lower() == name.lower():
+ return cls
+
+ mapped_name = resolve_table_name(table_fullname)
+ mapped_class = _find_class(mapped_name)
+
+ # try looking in the 'dispatch_core' schema
+ if not mapped_class:
+ mapped_class = _find_class(f"dispatch_core.{mapped_name}")
+
+ if not mapped_class:
+ raise ValidationError(
+ [
+ {
+ "type": "value_error",
+ "loc": ("filter",),
+ "msg": "Model not found. Check the name of your model.",
+ }
+ ],
+ model=BaseModel,
+ )
+
+ return mapped_class
+
+
+def get_table_name_by_class_instance(class_instance: Base) -> str:
+ """Returns the name of the table for a given class instance."""
+ return class_instance._sa_instance_state.mapper.mapped_table.name
+
+
+def ensure_unique_default_per_project(target, value, oldvalue, initiator):
+ """Ensures that only one row in table is specified as the default."""
+ session = object_session(target)
+ if session is None:
+ return
+
+ mapped_cls = get_mapper(target)
+
+ if value:
+ previous_default = (
+ session.query(mapped_cls)
+ .filter(mapped_cls.columns.default == true())
+ .filter(mapped_cls.columns.project_id == target.project_id)
+ .one_or_none()
+ )
+ if previous_default:
+            # we want to exclude updating the current default
+ if previous_default.id != target.id:
+ previous_default.default = False
+ session.commit()
+
+
+def refetch_db_session(organization_slug: str) -> Session:
+ """Create a new database session for a specific organization."""
+ schema_engine = engine.execution_options(
+ schema_translate_map={
+ None: f"dispatch_organization_{organization_slug}",
+ }
+ )
+ session = sessionmaker(bind=schema_engine)()
+ session._dispatch_session_id = SessionTracker.track_session(
+ session, context=f"organization_{organization_slug}"
+ )
+ return session
+
+
+@contextmanager
+def get_session() -> Session:
+ """Context manager to ensure the session is closed after use."""
+ session = SessionLocal()
+ session_id = SessionTracker.track_session(session, context="context_manager")
+ try:
+ yield session
+ session.commit()
+    except Exception:
+ session.rollback()
+ raise
+ finally:
+ SessionTracker.untrack_session(session_id)
+ session.close()
+
+
+@contextmanager
+def get_organization_session(organization_slug: str) -> Session:
+ """Context manager to ensure the organization session is closed after use."""
+ session = refetch_db_session(organization_slug)
+ try:
+ yield session
+ session.commit()
+    except Exception:
+ session.rollback()
+ raise
+ finally:
+ if hasattr(session, "_dispatch_session_id"):
+ SessionTracker.untrack_session(session._dispatch_session_id)
+ session.close()
diff --git a/src/dispatch/database/enums.py b/src/dispatch/database/enums.py
new file mode 100644
index 000000000000..81f70c06533f
--- /dev/null
+++ b/src/dispatch/database/enums.py
@@ -0,0 +1 @@
+DISPATCH_ORGANIZATION_SCHEMA_PREFIX = "dispatch_organization"
diff --git a/src/dispatch/database/logging.py b/src/dispatch/database/logging.py
new file mode 100644
index 000000000000..f297696e3f93
--- /dev/null
+++ b/src/dispatch/database/logging.py
@@ -0,0 +1,62 @@
+import logging
+import uuid
+from datetime import datetime
+from typing import Any
+
+from sqlalchemy.orm import Session
+
+logger = logging.getLogger(__name__)
+
+
+class SessionTracker:
+ """Tracks database session lifecycle events."""
+
+ _sessions: dict[str, dict[str, Any]] = {}
+
+ @classmethod
+ def track_session(cls, session: Session, context: str | None = None) -> str:
+ """Tracks a new database session."""
+ session_id = str(uuid.uuid4())
+ cls._sessions[session_id] = {
+ "session": session,
+ "context": context,
+ "created_at": datetime.now().timestamp(),
+ }
+ logger.info(
+ "Database session created",
+ extra={
+ "session_id": session_id,
+ "context": context,
+ "total_active_sessions": len(cls._sessions),
+ },
+ )
+ return session_id
+
+ @classmethod
+ def untrack_session(cls, session_id: str) -> None:
+ """Untracks a database session."""
+ if session_id in cls._sessions:
+ session_info = cls._sessions.pop(session_id)
+ duration = datetime.now().timestamp() - session_info["created_at"]
+ logger.info(
+ "Database session closed",
+ extra={
+ "session_id": session_id,
+ "context": session_info["context"],
+ "duration_seconds": duration,
+ "total_active_sessions": len(cls._sessions),
+ },
+ )
+
+ @classmethod
+ def get_active_sessions(cls) -> list[dict[str, Any]]:
+ """Returns information about all active sessions."""
+ current_time = datetime.now().timestamp()
+ return [
+ {
+ "session_id": session_id,
+ "context": info["context"],
+ "age_seconds": current_time - info["created_at"],
+ }
+ for session_id, info in cls._sessions.items()
+ ]
diff --git a/src/dispatch/database/manage.py b/src/dispatch/database/manage.py
new file mode 100644
index 000000000000..f56d4b5db279
--- /dev/null
+++ b/src/dispatch/database/manage.py
@@ -0,0 +1,200 @@
+import os
+import logging
+
+from alembic import command as alembic_command
+from alembic.config import Config as AlembicConfig
+
+from sqlalchemy import Engine, text
+from sqlalchemy.engine import Connection
+from sqlalchemy.schema import CreateSchema, Table
+from sqlalchemy_utils import create_database, database_exists
+
+from dispatch import config
+from dispatch.organization.models import Organization
+from dispatch.project.models import Project
+from dispatch.plugin.models import Plugin
+from dispatch.search import fulltext
+from dispatch.search.fulltext import (
+ sync_trigger,
+)
+
+from .core import Base, sessionmaker
+from .enums import DISPATCH_ORGANIZATION_SCHEMA_PREFIX
+
+
+log = logging.getLogger(__file__)
+
+
+def version_schema(script_location: str):
+ """Applies alembic versioning to schema."""
+
+ # add it to alembic table
+ alembic_cfg = AlembicConfig(config.ALEMBIC_INI_PATH)
+ alembic_cfg.set_main_option("script_location", script_location)
+ alembic_command.stamp(alembic_cfg, "head")
+
+
+def get_core_tables() -> list[Table]:
+ """Fetches tables that belong to the 'dispatch_core' schema."""
+ core_tables: list[Table] = []
+ for _, table in Base.metadata.tables.items():
+ if table.schema == "dispatch_core":
+ core_tables.append(table)
+ return core_tables
+
+
+def get_tenant_tables() -> list[Table]:
+ """Fetches tables that belong to their own tenant tables."""
+ tenant_tables: list[Table] = []
+ for _, table in Base.metadata.tables.items():
+ if not table.schema:
+ tenant_tables.append(table)
+ return tenant_tables
+
+
+def init_database(engine: Engine):
+ """Initializes the database."""
+ if not database_exists(str(config.SQLALCHEMY_DATABASE_URI)):
+ create_database(str(config.SQLALCHEMY_DATABASE_URI))
+
+ schema_name = "dispatch_core"
+ with engine.begin() as connection:
+ connection.execute(CreateSchema(schema_name, if_not_exists=True))
+
+ tables = get_core_tables()
+
+ Base.metadata.create_all(engine, tables=tables)
+
+ version_schema(script_location=config.ALEMBIC_CORE_REVISION_PATH)
+ with engine.connect() as connection:
+ setup_fulltext_search(connection, tables)
+
+    # set up any required database functions
+ session = sessionmaker(bind=engine)
+ db_session = session()
+
+ # we create the default organization if it doesn't exist
+ organization = (
+ db_session.query(Organization).filter(Organization.name == "default").one_or_none()
+ )
+ if not organization:
+ print("Creating default organization...")
+ organization = Organization(
+ name="default",
+ slug="default",
+ default=True,
+ description="Default Dispatch organization.",
+ )
+
+ db_session.add(organization)
+ db_session.commit()
+
+ # we initialize the database schema
+ init_schema(engine=engine, organization=organization)
+
+ # we install all plugins
+ from dispatch.common.utils.cli import install_plugins
+ from dispatch.plugins.base import plugins
+
+ install_plugins()
+
+ for p in plugins.all():
+ plugin = Plugin(
+ title=p.title,
+ slug=p.slug,
+ type=p.type,
+ version=p.version,
+ author=p.author,
+ author_url=p.author_url,
+ multiple=p.multiple,
+ description=p.description,
+ )
+ db_session.add(plugin)
+ db_session.commit()
+
+ # we create the default project if it doesn't exist
+ project = db_session.query(Project).filter(Project.name == "default").one_or_none()
+ if not project:
+ print("Creating default project...")
+ project = Project(
+ name="default",
+ default=True,
+ description="Default Dispatch project.",
+ organization=organization,
+ )
+ db_session.add(project)
+ db_session.commit()
+
+ # we initialize the project with defaults
+ from dispatch.project import flows as project_flows
+
+ print("Initializing default project...")
+ project_flows.project_init_flow(
+ project_id=project.id, organization_slug=organization.slug, db_session=db_session
+ )
+
+
+def init_schema(*, engine: Engine, organization: Organization) -> Organization:
+ """Initializes a new schema."""
+ schema_name = f"{DISPATCH_ORGANIZATION_SCHEMA_PREFIX}_{organization.slug}"
+
+ with engine.begin() as connection:
+ connection.execute(CreateSchema(schema_name, if_not_exists=True))
+
+ # set the schema for table creation
+ tables = get_tenant_tables()
+
+ # alter each table's schema
+ for t in tables:
+ t.schema = schema_name
+
+ Base.metadata.create_all(engine, tables=tables)
+
+ # put schema under version control
+ version_schema(script_location=config.ALEMBIC_TENANT_REVISION_PATH)
+
+ with engine.connect() as connection:
+ # we need to map this for full text search as it uses sql literal strings
+ # and schema translate map does not apply
+ for t in tables:
+ t.schema = schema_name
+
+ setup_fulltext_search(connection, tables)
+
+ session = sessionmaker(bind=engine)
+ db_session = session()
+
+ organization = db_session.merge(organization)
+ db_session.add(organization)
+ db_session.commit()
+ return organization
+
+
+def setup_fulltext_search(connection: Connection, tables: list[Table]) -> None:
+ """Syncs any required fulltext table triggers and functions."""
+ # parsing functions
+ function_path = os.path.join(
+ os.path.dirname(os.path.abspath(fulltext.__file__)), "expressions.sql"
+ )
+ connection.execute(text(open(function_path).read()))
+
+ for table in tables:
+ table_triggers = []
+ for column in table.columns:
+ if column.name.endswith("search_vector"):
+ if hasattr(column.type, "columns"):
+ table_triggers.append(
+ {
+ "conn": connection,
+ "table": table,
+ "tsvector_column": "search_vector",
+ "indexed_columns": column.type.columns,
+ }
+ )
+ else:
+ log.warning(
+ f"Column search_vector defined but no index columns found. Table: {table.name}"
+ )
+
+ for trigger in table_triggers:
+ sync_trigger(**trigger)
diff --git a/src/dispatch/database/revisions/__init__.py b/src/dispatch/database/revisions/__init__.py
new file mode 100644
index 000000000000..e69de29bb2d1
diff --git a/src/dispatch/alembic/README b/src/dispatch/database/revisions/core/README
similarity index 100%
rename from src/dispatch/alembic/README
rename to src/dispatch/database/revisions/core/README
diff --git a/src/dispatch/database/revisions/core/env.py b/src/dispatch/database/revisions/core/env.py
new file mode 100644
index 000000000000..0ae70a6c01f2
--- /dev/null
+++ b/src/dispatch/database/revisions/core/env.py
@@ -0,0 +1,65 @@
+from alembic import context
+from sqlalchemy import create_engine, text
+
+from dispatch.logging import logging
+from dispatch.config import SQLALCHEMY_DATABASE_URI
+from dispatch.database.core import Base
+
+# this is the Alembic Config object, which provides
+# access to the values within the .ini file in use.
+config = context.config
+
+# Interpret the config file for Python logging.
+# This line sets up loggers basically.
+log = logging.getLogger(__name__)
+
+
+config.set_main_option("sqlalchemy.url", str(SQLALCHEMY_DATABASE_URI))
+
+target_metadata = Base.metadata # noqa
+
+CORE_SCHEMA_NAME = "dispatch_core"
+
+
+def include_object(object, name, type_, reflected, compare_to):
+ if type_ == "table":
+ return object.schema == CORE_SCHEMA_NAME
+ return True
+
+
+def run_migrations_online():
+ """Run migrations in 'online' mode.
+
+ In this scenario we need to create an Engine
+ and associate a connection with the context.
+
+ """
+ def process_revision_directives(context, revision, directives):
+ script = directives[0]
+ if script.upgrade_ops.is_empty():
+ directives[:] = []
+            log.info("No changes found; skipping revision creation.")
+
+ connectable = create_engine(SQLALCHEMY_DATABASE_URI)
+
+ log.info("Migrating dispatch core schema...")
+ # migrate common tables
+ with connectable.connect() as connection:
+ set_search_path = text(f'set search_path to "{CORE_SCHEMA_NAME}"')
+ connection.execute(set_search_path)
+ connection.commit()
+ context.configure(
+ connection=connection,
+ target_metadata=target_metadata,
+ include_object=include_object,
+ process_revision_directives=process_revision_directives,
+ )
+
+ with context.begin_transaction():
+ context.run_migrations()
+
+
+if context.is_offline_mode():
+ log.info("Can't run migrations offline")
+else:
+ run_migrations_online()
diff --git a/src/dispatch/alembic/script.py.mako b/src/dispatch/database/revisions/core/script.py.mako
similarity index 100%
rename from src/dispatch/alembic/script.py.mako
rename to src/dispatch/database/revisions/core/script.py.mako
diff --git a/src/dispatch/database/revisions/core/versions/2021-06-02_f011c050b9ba.py b/src/dispatch/database/revisions/core/versions/2021-06-02_f011c050b9ba.py
new file mode 100644
index 000000000000..29ca485942df
--- /dev/null
+++ b/src/dispatch/database/revisions/core/versions/2021-06-02_f011c050b9ba.py
@@ -0,0 +1,21 @@
+"""Initial core revision
+
+Revision ID: f011c050b9ba
+Revises:
+Create Date: 2021-06-02 14:09:15.220737
+
+"""
+
+# revision identifiers, used by Alembic.
+revision = "f011c050b9ba"
+down_revision = None
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ pass
+
+
+def downgrade():
+ pass
diff --git a/src/dispatch/database/revisions/core/versions/2021-06-15_0ab4f8f54bfa.py b/src/dispatch/database/revisions/core/versions/2021-06-15_0ab4f8f54bfa.py
new file mode 100644
index 000000000000..77acee45d583
--- /dev/null
+++ b/src/dispatch/database/revisions/core/versions/2021-06-15_0ab4f8f54bfa.py
@@ -0,0 +1,105 @@
+"""Tightens dispatch_user_organization constraints and adds organization banner columns
+
+Revision ID: 0ab4f8f54bfa
+Revises: f011c050b9ba
+Create Date: 2021-06-15 16:11:19.703274
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "0ab4f8f54bfa"
+down_revision = "f011c050b9ba"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.alter_column(
+ "dispatch_user_organization", "dispatch_user_id", existing_type=sa.INTEGER(), nullable=False
+ )
+ op.alter_column(
+ "dispatch_user_organization", "organization_id", existing_type=sa.INTEGER(), nullable=False
+ )
+ op.drop_constraint(
+ "dispatch_user_organization_dispatch_user_id_fkey",
+ "dispatch_user_organization",
+ type_="foreignkey",
+ )
+ op.drop_constraint(
+ "dispatch_user_organization_organization_id_fkey",
+ "dispatch_user_organization",
+ type_="foreignkey",
+ )
+ op.create_foreign_key(
+ None,
+ "dispatch_user_organization",
+ "dispatch_user",
+ ["dispatch_user_id"],
+ ["id"],
+ source_schema="dispatch_core",
+ referent_schema="dispatch_core",
+ )
+ op.create_foreign_key(
+ None,
+ "dispatch_user_organization",
+ "organization",
+ ["organization_id"],
+ ["id"],
+ source_schema="dispatch_core",
+ referent_schema="dispatch_core",
+ )
+ op.drop_column("dispatch_user_organization", "id")
+ op.add_column("organization", sa.Column("banner_enabled", sa.Boolean(), nullable=True))
+ op.add_column("organization", sa.Column("banner_color", sa.String(), nullable=True))
+ op.add_column("organization", sa.Column("banner_text", sa.String(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("organization", "banner_text")
+ op.drop_column("organization", "banner_color")
+ op.drop_column("organization", "banner_enabled")
+ op.add_column(
+ "dispatch_user_organization",
+ sa.Column(
+ "id",
+ sa.INTEGER(),
+ server_default=sa.text("nextval('public.dispatch_user_organization_id_seq'::regclass)"),
+ autoincrement=True,
+ nullable=False,
+ ),
+ )
+    op.drop_constraint(
+        "dispatch_user_organization_dispatch_user_id_fkey", "dispatch_user_organization", schema="dispatch_core", type_="foreignkey"
+    )
+    op.drop_constraint(
+        "dispatch_user_organization_organization_id_fkey", "dispatch_user_organization", schema="dispatch_core", type_="foreignkey"
+    )
+ op.create_foreign_key(
+ "dispatch_user_organization_organization_id_fkey",
+ "dispatch_user_organization",
+ "organization",
+ ["organization_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.create_foreign_key(
+ "dispatch_user_organization_dispatch_user_id_fkey",
+ "dispatch_user_organization",
+ "dispatch_user",
+ ["dispatch_user_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.alter_column(
+ "dispatch_user_organization", "organization_id", existing_type=sa.INTEGER(), nullable=True
+ )
+ op.alter_column(
+ "dispatch_user_organization", "dispatch_user_id", existing_type=sa.INTEGER(), nullable=True
+ )
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/core/versions/2021-07-22_8f364cf49a23.py b/src/dispatch/database/revisions/core/versions/2021-07-22_8f364cf49a23.py
new file mode 100644
index 000000000000..fe3ab5751fa2
--- /dev/null
+++ b/src/dispatch/database/revisions/core/versions/2021-07-22_8f364cf49a23.py
@@ -0,0 +1,73 @@
+"""Recreates dispatch_user_organization_dispatch_user_id_fkey and dispatch_user_organization_organization_id_fkey
+
+Revision ID: 8f364cf49a23
+Revises: 0ab4f8f54bfa
+Create Date: 2021-07-22 14:59:12.629986
+
+"""
+from alembic import op
+
+
+# revision identifiers, used by Alembic.
+revision = "8f364cf49a23"
+down_revision = "0ab4f8f54bfa"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint(
+ "dispatch_user_organization_dispatch_user_id_fkey",
+ "dispatch_user_organization",
+ type_="foreignkey",
+ )
+ op.drop_constraint(
+ "dispatch_user_organization_organization_id_fkey",
+ "dispatch_user_organization",
+ type_="foreignkey",
+ )
+ op.create_foreign_key(
+ None,
+ "dispatch_user_organization",
+ "organization",
+ ["organization_id"],
+ ["id"],
+ source_schema="dispatch_core",
+ referent_schema="dispatch_core",
+ )
+ op.create_foreign_key(
+ None,
+ "dispatch_user_organization",
+ "dispatch_user",
+ ["dispatch_user_id"],
+ ["id"],
+ source_schema="dispatch_core",
+ referent_schema="dispatch_core",
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+    op.drop_constraint(
+        "dispatch_user_organization_dispatch_user_id_fkey", "dispatch_user_organization", schema="dispatch_core", type_="foreignkey"
+    )
+    op.drop_constraint(
+        "dispatch_user_organization_organization_id_fkey", "dispatch_user_organization", schema="dispatch_core", type_="foreignkey"
+    )
+ op.create_foreign_key(
+ "dispatch_user_organization_organization_id_fkey",
+ "dispatch_user_organization",
+ "organization",
+ ["organization_id"],
+ ["id"],
+ )
+ op.create_foreign_key(
+ "dispatch_user_organization_dispatch_user_id_fkey",
+ "dispatch_user_organization",
+ "dispatch_user",
+ ["dispatch_user_id"],
+ ["id"],
+ )
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/core/versions/2021-07-22_c0bc938b058e.py b/src/dispatch/database/revisions/core/versions/2021-07-22_c0bc938b058e.py
new file mode 100644
index 000000000000..23078f00312a
--- /dev/null
+++ b/src/dispatch/database/revisions/core/versions/2021-07-22_c0bc938b058e.py
@@ -0,0 +1,41 @@
+"""Adds a slug column to the organization table.
+
+Revision ID: c0bc938b058e
+Revises: 0ab4f8f54bfa
+Create Date: 2021-07-22 09:14:12.411910
+
+"""
+from alembic import op
+from slugify import slugify
+from sqlalchemy import Column, String
+
+
+# revision identifiers, used by Alembic.
+revision = "c0bc938b058e"
+down_revision = "0ab4f8f54bfa"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("organization", Column("slug", String()), schema="dispatch_core")
+ op.create_unique_constraint(None, "organization", ["name"], schema="dispatch_core")
+
+ # generate existing slugs
+    conn = op.get_bind()
+    res = conn.execute("select id, name from dispatch_core.organization")
+
+    for org_id, name in res.fetchall():
+        # slugify output is limited to [a-z0-9_], so interpolating it directly is safe here
+        slug = slugify(name, separator="_")
+        conn.execute(f"update dispatch_core.organization set slug = '{slug}' where id = {org_id}")
+
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("organization", "slug", schema="dispatch_core")
+    op.drop_constraint("organization_name_key", "organization", schema="dispatch_core", type_="unique")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/core/versions/2021-07-23_e0d568f345c9.py b/src/dispatch/database/revisions/core/versions/2021-07-23_e0d568f345c9.py
new file mode 100644
index 000000000000..ca18eade423d
--- /dev/null
+++ b/src/dispatch/database/revisions/core/versions/2021-07-23_e0d568f345c9.py
@@ -0,0 +1,20 @@
+"""Merge revision
+
+Revision ID: e0d568f345c9
+Revises: c0bc938b058e, 8f364cf49a23
+Create Date: 2021-07-23 15:53:45.523064
+
+"""
+# revision identifiers, used by Alembic.
+revision = "e0d568f345c9"
+down_revision = ("c0bc938b058e", "8f364cf49a23")
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ pass
+
+
+def downgrade():
+ pass
diff --git a/src/dispatch/database/revisions/core/versions/2023-04-14_3dd4d12844dc.py b/src/dispatch/database/revisions/core/versions/2023-04-14_3dd4d12844dc.py
new file mode 100644
index 000000000000..2ea99cdb30b4
--- /dev/null
+++ b/src/dispatch/database/revisions/core/versions/2023-04-14_3dd4d12844dc.py
@@ -0,0 +1,28 @@
+"""Adds last_mfa_time to DispatchUser
+
+Revision ID: 3dd4d12844dc
+Revises: e0d568f345c9
+Create Date: 2023-04-14 15:17:00.450716
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "3dd4d12844dc"
+down_revision = "e0d568f345c9"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("dispatch_user", sa.Column("last_mfa_time", sa.DateTime(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("dispatch_user", "last_mfa_time")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/core/versions/2023-09-27_5c60513d6e5e.py b/src/dispatch/database/revisions/core/versions/2023-09-27_5c60513d6e5e.py
new file mode 100644
index 000000000000..c124e41afa3c
--- /dev/null
+++ b/src/dispatch/database/revisions/core/versions/2023-09-27_5c60513d6e5e.py
@@ -0,0 +1,28 @@
+"""Adds experimental_features to DispatchUser
+
+Revision ID: 5c60513d6e5e
+Revises: 3dd4d12844dc
+Create Date: 2023-09-27 15:17:00.450716
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "5c60513d6e5e"
+down_revision = "3dd4d12844dc"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("dispatch_user", sa.Column("experimental_features", sa.Boolean(), default=False))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("dispatch_user", "experimental_features")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/core/versions/2023-12-27_ed0b0388fa3f.py b/src/dispatch/database/revisions/core/versions/2023-12-27_ed0b0388fa3f.py
new file mode 100644
index 000000000000..ee060a1c7ee5
--- /dev/null
+++ b/src/dispatch/database/revisions/core/versions/2023-12-27_ed0b0388fa3f.py
@@ -0,0 +1,57 @@
+"""Adds the plugin_event table.
+
+Revision ID: ed0b0388fa3f
+Revises: 5c60513d6e5e
+Create Date: 2023-12-27 13:44:17.960851
+
+"""
+from alembic import op
+import sqlalchemy as sa
+import sqlalchemy_utils
+
+# revision identifiers, used by Alembic.
+revision = "ed0b0388fa3f"
+down_revision = "5c60513d6e5e"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.create_table(
+ "plugin_event",
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("name", sa.String(), nullable=True),
+ sa.Column("slug", sa.String(), nullable=True),
+ sa.Column("description", sa.String(), nullable=True),
+ sa.Column("plugin_id", sa.Integer(), nullable=True),
+ sa.Column("search_vector", sqlalchemy_utils.types.ts_vector.TSVectorType(), nullable=True),
+ sa.ForeignKeyConstraint(
+ ["plugin_id"],
+ ["dispatch_core.plugin.id"],
+ ),
+ sa.PrimaryKeyConstraint("id"),
+ sa.UniqueConstraint("slug"),
+ schema="dispatch_core",
+ )
+ op.create_index(
+ "plugin_event_search_vector_idx",
+ "plugin_event",
+ ["search_vector"],
+ unique=False,
+ schema="dispatch_core",
+ postgresql_using="gin",
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_index(
+ "plugin_event_search_vector_idx",
+ table_name="plugin_event",
+ schema="dispatch_core",
+ postgresql_using="gin",
+ )
+ op.drop_table("plugin_event", schema="dispatch_core")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/core/versions/2025-06-23_903183fd9aee.py b/src/dispatch/database/revisions/core/versions/2025-06-23_903183fd9aee.py
new file mode 100644
index 000000000000..642462a30295
--- /dev/null
+++ b/src/dispatch/database/revisions/core/versions/2025-06-23_903183fd9aee.py
@@ -0,0 +1,38 @@
+"""Add dispatch_user_settings table
+
+Revision ID: 903183fd9aee
+Revises: ed0b0388fa3f
+Create Date: 2025-06-23 11:27:01.615306
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "903183fd9aee"
+down_revision = "ed0b0388fa3f"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+    op.create_table("dispatch_user_settings",
+        sa.Column("id", sa.Integer(), nullable=False),
+        sa.Column("dispatch_user_id", sa.Integer(), nullable=True),
+        sa.Column("auto_add_to_incident_bridges", sa.Boolean(), nullable=True),
+        sa.Column("created_at", sa.DateTime(), nullable=True),
+        sa.Column("updated_at", sa.DateTime(), nullable=True),
+        sa.ForeignKeyConstraint(["dispatch_user_id"], ["dispatch_core.dispatch_user.id"]),
+        sa.PrimaryKeyConstraint("id"),
+        sa.UniqueConstraint("dispatch_user_id"),
+        schema="dispatch_core",
+    )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+    op.drop_table("dispatch_user_settings", schema="dispatch_core")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/multi-tenant-migration.sql b/src/dispatch/database/revisions/multi-tenant-migration.sql
new file mode 100644
index 000000000000..e05808a6fa3c
--- /dev/null
+++ b/src/dispatch/database/revisions/multi-tenant-migration.sql
@@ -0,0 +1,152 @@
+CREATE OR REPLACE FUNCTION clone_schema(source_schema text, dest_schema text) RETURNS void AS
+$$
+
+DECLARE
+ object text;
+ buffer text;
+ default_ text;
+ column_ text;
+ constraint_name_ text;
+ constraint_def_ text;
+ trigger_name_ text;
+ trigger_timing_ text;
+ trigger_events_ text;
+ trigger_orientation_ text;
+ trigger_action_ text;
+BEGIN
+
+ -- replace existing schema
+ EXECUTE 'DROP SCHEMA IF EXISTS ' || dest_schema || ' CASCADE';
+
+ -- create schema
+    EXECUTE 'CREATE SCHEMA ' || dest_schema;
+
+ -- create sequences
+ FOR object IN
+ SELECT sequence_name::text FROM information_schema.SEQUENCES WHERE sequence_schema = source_schema
+ LOOP
+ EXECUTE 'CREATE SEQUENCE ' || dest_schema || '.' || object;
+ END LOOP;
+
+ -- create tables
+ FOR object IN
+ SELECT table_name::text FROM information_schema.TABLES WHERE table_schema = source_schema
+ LOOP
+ buffer := dest_schema || '.' || object;
+
+ -- create table
+ EXECUTE 'CREATE TABLE ' || buffer || ' (LIKE ' || source_schema || '.' || object || ' INCLUDING CONSTRAINTS INCLUDING INDEXES INCLUDING DEFAULTS)';
+ EXECUTE 'INSERT INTO ' || buffer || '(SELECT * FROM ' || source_schema || '.' || object || ')';
+
+ -- fix sequence defaults
+ FOR column_, default_ IN
+ SELECT column_name::text, REPLACE(column_default::text, source_schema || '.', dest_schema|| '.') FROM information_schema.COLUMNS WHERE table_schema = dest_schema AND table_name = object AND column_default LIKE 'nextval(%' || source_schema || '.%::regclass)'
+ LOOP
+ EXECUTE 'ALTER TABLE ' || buffer || ' ALTER COLUMN ' || column_ || ' SET DEFAULT ' || default_;
+ END LOOP;
+
+ -- create triggers
+ FOR trigger_name_, trigger_timing_, trigger_events_, trigger_orientation_, trigger_action_ IN
+ SELECT trigger_name::text, action_timing::text, string_agg(event_manipulation::text, ' OR '), action_orientation::text, action_statement::text FROM information_schema.TRIGGERS WHERE event_object_schema=source_schema and event_object_table=object GROUP BY trigger_name, action_timing, action_orientation, action_statement
+ LOOP
+ EXECUTE 'CREATE TRIGGER ' || trigger_name_ || ' ' || trigger_timing_ || ' ' || trigger_events_ || ' ON ' || buffer || ' FOR EACH ' || trigger_orientation_ || ' ' || trigger_action_;
+ END LOOP;
+ END LOOP;
+
+ -- reiterate tables and create foreign keys
+ FOR object IN
+ SELECT table_name::text FROM information_schema.TABLES WHERE table_schema = source_schema
+ LOOP
+ buffer := dest_schema || '.' || object;
+
+ -- create foreign keys
+ FOR constraint_name_, constraint_def_ IN
+ SELECT conname::text, REPLACE(pg_get_constraintdef(pg_constraint.oid), source_schema||'.', dest_schema||'.') FROM pg_constraint INNER JOIN pg_class ON conrelid=pg_class.oid INNER JOIN pg_namespace ON pg_namespace.oid=pg_class.relnamespace WHERE contype='f' and relname=object and nspname=source_schema
+ LOOP
+ EXECUTE 'ALTER TABLE '|| buffer ||' ADD CONSTRAINT '|| constraint_name_ ||' '|| constraint_def_;
+ END LOOP;
+ END LOOP;
+
+END;
+
+$$ LANGUAGE plpgsql VOLATILE;
+
+
+select clone_schema('public', 'dispatch_core');
+select clone_schema('public', 'dispatch_organization_default');
+
+-- drop tables that aren't needed
+drop table IF EXISTS dispatch_organization_default.dispatch_user;
+drop table IF EXISTS dispatch_organization_default.organization;
+drop table IF EXISTS dispatch_organization_default.dispatch_user_organization;
+drop table IF EXISTS dispatch_organization_default.plugin;
+drop table IF EXISTS dispatch_core.assoc_document_filters;
+drop table IF EXISTS dispatch_core.assoc_incident_tags;
+drop table IF EXISTS dispatch_core.assoc_incident_terms;
+drop table IF EXISTS dispatch_core.assoc_individual_contact_filters;
+drop table IF EXISTS dispatch_core.assoc_individual_contact_incident_priority;
+drop table IF EXISTS dispatch_core.assoc_individual_contact_incident_type;
+drop table IF EXISTS dispatch_core.assoc_individual_contact_terms;
+drop table IF EXISTS dispatch_core.assoc_notification_filters;
+drop table IF EXISTS dispatch_core.assoc_service_filters;
+drop table IF EXISTS dispatch_core.assoc_team_contact_filters;
+drop table IF EXISTS dispatch_core.conference;
+drop table IF EXISTS dispatch_core.conversation;
+drop table IF EXISTS dispatch_core.definition;
+drop table IF EXISTS dispatch_core.definition_teams;
+drop table IF EXISTS dispatch_core.definition_terms;
+drop table IF EXISTS dispatch_core.dispatch_user_project;
+drop table IF EXISTS dispatch_core.document;
+drop table IF EXISTS dispatch_core.document_incident_priority;
+drop table IF EXISTS dispatch_core.document_incident_type;
+drop table IF EXISTS dispatch_core.document_terms;
+drop table IF EXISTS dispatch_core.event;
+drop table IF EXISTS dispatch_core.feedback;
+drop table IF EXISTS dispatch_core.group;
+drop table IF EXISTS dispatch_core.incident;
+drop table IF EXISTS dispatch_core.incident_cost;
+drop table IF EXISTS dispatch_core.incident_cost_type;
+drop table IF EXISTS dispatch_core.incident_priority;
+drop table IF EXISTS dispatch_core.incident_type;
+drop table IF EXISTS dispatch_core.individual_contact;
+drop table IF EXISTS dispatch_core.notification;
+drop table IF EXISTS dispatch_core.participant;
+drop table IF EXISTS dispatch_core.participant_role;
+drop table IF EXISTS dispatch_core.plugin_instance;
+drop table IF EXISTS dispatch_core.project;
+drop table IF EXISTS dispatch_core.recommendation;
+drop table IF EXISTS dispatch_core.recommendation_accuracy;
+drop table IF EXISTS dispatch_core.recommendation_documents;
+drop table IF EXISTS dispatch_core.recommendation_incident_priorities;
+drop table IF EXISTS dispatch_core.recommendation_incident_types;
+drop table IF EXISTS dispatch_core.recommendation_individual_contacts;
+drop table IF EXISTS dispatch_core.recommendation_match;
+drop table IF EXISTS dispatch_core.recommendation_services;
+drop table IF EXISTS dispatch_core.recommendation_team_contacts;
+drop table IF EXISTS dispatch_core.recommendation_terms;
+drop table IF EXISTS dispatch_core.report;
+drop table IF EXISTS dispatch_core.search_filter;
+drop table IF EXISTS dispatch_core.service;
+drop table IF EXISTS dispatch_core.service_incident;
+drop table IF EXISTS dispatch_core.service_incident_priority;
+drop table IF EXISTS dispatch_core.service_incident_type;
+drop table IF EXISTS dispatch_core.service_terms;
+drop table IF EXISTS dispatch_core.storage;
+drop table IF EXISTS dispatch_core.tag;
+drop table IF EXISTS dispatch_core.tag_type;
+drop table IF EXISTS dispatch_core.task;
+drop table IF EXISTS dispatch_core.task_assignees;
+drop table IF EXISTS dispatch_core.task_tickets;
+drop table IF EXISTS dispatch_core.team_contact;
+drop table IF EXISTS dispatch_core.team_contact_incident;
+drop table IF EXISTS dispatch_core.team_contact_incident_priority;
+drop table IF EXISTS dispatch_core.team_contact_incident_type;
+drop table IF EXISTS dispatch_core.team_contact_terms;
+drop table IF EXISTS dispatch_core.term;
+drop table IF EXISTS dispatch_core.ticket;
+drop table IF EXISTS dispatch_core.workflow;
+drop table IF EXISTS dispatch_core.workflow_incident_priority;
+drop table IF EXISTS dispatch_core.workflow_incident_type;
+drop table IF EXISTS dispatch_core.workflow_instance;
+drop table IF EXISTS dispatch_core.workflow_instance_artifact;
+drop table IF EXISTS dispatch_core.workflow_term;
diff --git a/src/dispatch/database/revisions/tenant/README b/src/dispatch/database/revisions/tenant/README
new file mode 100644
index 000000000000..98e4f9c44eff
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/README
@@ -0,0 +1 @@
+Generic single-database configuration.
\ No newline at end of file
diff --git a/src/dispatch/database/revisions/tenant/env.py b/src/dispatch/database/revisions/tenant/env.py
new file mode 100644
index 000000000000..da533ee9f1ce
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/env.py
@@ -0,0 +1,81 @@
+from alembic import context
+from sqlalchemy import create_engine, inspect, text
+
+
+from dispatch.logging import logging
+from dispatch.config import SQLALCHEMY_DATABASE_URI
+from dispatch.database.core import Base
+
+
+# this is the Alembic Config object, which provides
+# access to the values within the .ini file in use.
+config = context.config
+
+# Set up the module-level logger for this migration environment.
+# (The logging configuration from the .ini file is not applied here.)
+log = logging.getLogger(__name__)
+
+config.set_main_option("sqlalchemy.url", str(SQLALCHEMY_DATABASE_URI))
+
+target_metadata = Base.metadata
+
+
+def get_tenant_schemas(connection):
+ tenant_schemas = []
+ for s in inspect(connection).get_schema_names():
+ if s.startswith("dispatch_organization_"):
+ tenant_schemas.append(s)
+ return tenant_schemas
+
+
+# only include tables that live in the schema set via search_path (i.e. no explicit schema)
+def include_object(object, name, type_, reflected, compare_to):
+ if type_ == "table":
+ if object.schema:
+ return False
+ return True
+
+
+def run_migrations_online():
+ """Run migrations in 'online' mode.
+
+ In this scenario we need to create an Engine
+ and associate a connection with the context.
+
+ """
+ def process_revision_directives(context, revision, directives):
+ script = directives[0]
+ if script.upgrade_ops.is_empty():
+ directives[:] = []
+            log.info("No changes found; skipping revision creation.")
+
+ connectable = create_engine(SQLALCHEMY_DATABASE_URI)
+
+ with connectable.connect() as connection:
+ # get the schema names
+ for schema in get_tenant_schemas(connection):
+ log.info(f"Migrating {schema}...")
+ set_search_path = text(f'set search_path to "{schema}"')
+ connection.execute(set_search_path)
+ connection.commit()
+
+            log.debug(target_metadata)
+ context.configure(
+ connection=connection,
+ target_metadata=target_metadata,
+ include_object=include_object,
+ process_revision_directives=process_revision_directives,
+ )
+
+ with context.begin_transaction():
+ context.run_migrations()
+
+            # when autogenerating a revision, a single schema pass is sufficient;
+            # cmd_opts.cmd is a (fn, positional, kwarg) tuple, so compare the function name
+            if context.config.cmd_opts and context.config.cmd_opts.cmd[0].__name__ == "revision":
+                break
+
+
+if context.is_offline_mode():
+ log.info("Can't run migrations offline")
+else:
+ run_migrations_online()
diff --git a/src/dispatch/database/revisions/tenant/script.py.mako b/src/dispatch/database/revisions/tenant/script.py.mako
new file mode 100644
index 000000000000..2c0156303a8d
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/script.py.mako
@@ -0,0 +1,24 @@
+"""${message}
+
+Revision ID: ${up_revision}
+Revises: ${down_revision | comma,n}
+Create Date: ${create_date}
+
+"""
+from alembic import op
+import sqlalchemy as sa
+${imports if imports else ""}
+
+# revision identifiers, used by Alembic.
+revision = ${repr(up_revision)}
+down_revision = ${repr(down_revision)}
+branch_labels = ${repr(branch_labels)}
+depends_on = ${repr(depends_on)}
+
+
+def upgrade():
+ ${upgrades if upgrades else "pass"}
+
+
+def downgrade():
+ ${downgrades if downgrades else "pass"}
diff --git a/src/dispatch/database/revisions/tenant/versions/2021-06-02_f011c050b9ba.py b/src/dispatch/database/revisions/tenant/versions/2021-06-02_f011c050b9ba.py
new file mode 100644
index 000000000000..0b33b81b4166
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2021-06-02_f011c050b9ba.py
@@ -0,0 +1,21 @@
+"""Initial tenant revision
+
+Revision ID: f011c050b9ba
+Revises:
+Create Date: 2021-06-02 14:09:15.220737
+
+"""
+
+# revision identifiers, used by Alembic.
+revision = "f011c050b9ba"
+down_revision = None
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ pass
+
+
+def downgrade():
+ pass
diff --git a/src/dispatch/database/revisions/tenant/versions/2021-06-15_8a558baeef05.py b/src/dispatch/database/revisions/tenant/versions/2021-06-15_8a558baeef05.py
new file mode 100644
index 000000000000..c80e331788c6
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2021-06-15_8a558baeef05.py
@@ -0,0 +1,1757 @@
+"""Recreates tenant foreign keys with ON DELETE CASCADE and drops unused association tables
+
+Revision ID: 8a558baeef05
+Revises: f011c050b9ba
+Create Date: 2021-06-15 16:12:33.534076
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "8a558baeef05"
+down_revision = "f011c050b9ba"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_table("document_terms")
+ op.drop_table("team_contact_terms")
+ op.drop_table("recommendation_documents")
+ op.drop_table("service_incident_type")
+ op.drop_table("assoc_individual_contact_incident_priority")
+ op.drop_table("recommendation_terms")
+ op.drop_table("team_contact_incident_priority")
+ op.drop_table("recommendation_individual_contacts")
+ op.drop_table("assoc_individual_contact_terms")
+ op.drop_table("recommendation_incident_types")
+ op.drop_table("assoc_individual_contact_incident_type")
+ op.drop_table("recommendation_incident_priorities")
+ op.drop_table("document_incident_priority")
+ op.drop_table("service_incident_priority")
+ op.drop_table("document_incident_type")
+ op.drop_table("recommendation_services")
+ op.drop_table("service_terms")
+ op.drop_table("team_contact_incident_type")
+ op.drop_table("recommendation_team_contacts")
+ op.drop_table("recommendation_accuracy")
+ op.drop_constraint(
+ "assoc_document_filters_document_id_fkey", "assoc_document_filters", type_="foreignkey"
+ )
+ op.drop_constraint(
+ "assoc_document_filters_search_filter_id_fkey", "assoc_document_filters", type_="foreignkey"
+ )
+ op.create_foreign_key(
+ None,
+ "assoc_document_filters",
+ "search_filter",
+ ["search_filter_id"],
+ ["id"],
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ None, "assoc_document_filters", "document", ["document_id"], ["id"], ondelete="CASCADE"
+ )
+ op.drop_constraint(
+ "assoc_incident_tags_incident_id_fkey", "assoc_incident_tags", type_="foreignkey"
+ )
+ op.drop_constraint("assoc_incident_tags_tag_id_fkey", "assoc_incident_tags", type_="foreignkey")
+ op.create_foreign_key(
+ None, "assoc_incident_tags", "incident", ["incident_id"], ["id"], ondelete="CASCADE"
+ )
+ op.create_foreign_key(
+ None, "assoc_incident_tags", "tag", ["tag_id"], ["id"], ondelete="CASCADE"
+ )
+ op.drop_constraint(
+ "assoc_incident_terms_incident_id_fkey", "assoc_incident_terms", type_="foreignkey"
+ )
+ op.drop_constraint(
+ "assoc_incident_terms_term_id_fkey", "assoc_incident_terms", type_="foreignkey"
+ )
+ op.create_foreign_key(
+ None, "assoc_incident_terms", "incident", ["incident_id"], ["id"], ondelete="CASCADE"
+ )
+ op.create_foreign_key(
+ None, "assoc_incident_terms", "term", ["term_id"], ["id"], ondelete="CASCADE"
+ )
+ op.drop_constraint(
+ "assoc_individual_contact_filters_search_filter_id_fkey",
+ "assoc_individual_contact_filters",
+ type_="foreignkey",
+ )
+ op.drop_constraint(
+ "assoc_individual_contact_filters_individual_contact_id_fkey",
+ "assoc_individual_contact_filters",
+ type_="foreignkey",
+ )
+ op.create_foreign_key(
+ None,
+ "assoc_individual_contact_filters",
+ "individual_contact",
+ ["individual_contact_id"],
+ ["id"],
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ None,
+ "assoc_individual_contact_filters",
+ "search_filter",
+ ["search_filter_id"],
+ ["id"],
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(
+ "assoc_notification_filters_search_filter_id_fkey",
+ "assoc_notification_filters",
+ type_="foreignkey",
+ )
+ op.drop_constraint(
+ "assoc_notification_filters_notification_id_fkey",
+ "assoc_notification_filters",
+ type_="foreignkey",
+ )
+ op.create_foreign_key(
+ None,
+ "assoc_notification_filters",
+ "search_filter",
+ ["search_filter_id"],
+ ["id"],
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ None,
+ "assoc_notification_filters",
+ "notification",
+ ["notification_id"],
+ ["id"],
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(
+ "assoc_service_filters_service_id_fkey", "assoc_service_filters", type_="foreignkey"
+ )
+ op.drop_constraint(
+ "assoc_service_filters_search_filter_id_fkey", "assoc_service_filters", type_="foreignkey"
+ )
+ op.create_foreign_key(
+ None,
+ "assoc_service_filters",
+ "search_filter",
+ ["search_filter_id"],
+ ["id"],
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ None, "assoc_service_filters", "service", ["service_id"], ["id"], ondelete="CASCADE"
+ )
+ op.drop_constraint(
+ "assoc_team_contact_filters_search_filter_id_fkey",
+ "assoc_team_contact_filters",
+ type_="foreignkey",
+ )
+ op.drop_constraint(
+ "assoc_team_contact_filters_team_contact_id_fkey",
+ "assoc_team_contact_filters",
+ type_="foreignkey",
+ )
+ op.create_foreign_key(
+ None,
+ "assoc_team_contact_filters",
+ "team_contact",
+ ["team_contact_id"],
+ ["id"],
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ None,
+ "assoc_team_contact_filters",
+ "search_filter",
+ ["search_filter_id"],
+ ["id"],
+ ondelete="CASCADE",
+ )
+ op.drop_constraint("conference_incident_id_fkey", "conference", type_="foreignkey")
+ op.create_foreign_key(
+ None, "conference", "incident", ["incident_id"], ["id"], ondelete="CASCADE"
+ )
+ op.drop_constraint("conversation_incident_id_fkey", "conversation", type_="foreignkey")
+ op.create_foreign_key(
+ None, "conversation", "incident", ["incident_id"], ["id"], ondelete="CASCADE"
+ )
+ op.drop_constraint("definition_project_id_fkey", "definition", type_="foreignkey")
+ op.create_foreign_key(None, "definition", "project", ["project_id"], ["id"], ondelete="CASCADE")
+ op.drop_constraint(
+ "definition_teams_definition_id_fkey", "definition_teams", type_="foreignkey"
+ )
+ op.drop_constraint(
+ "definition_teams_team_contact_id_fkey", "definition_teams", type_="foreignkey"
+ )
+ op.create_foreign_key(None, "definition_teams", "definition", ["definition_id"], ["id"])
+ op.create_foreign_key(None, "definition_teams", "team_contact", ["team_contact_id"], ["id"])
+ op.drop_constraint("definition_terms_term_id_fkey", "definition_terms", type_="foreignkey")
+ op.drop_constraint(
+ "definition_terms_definition_id_fkey", "definition_terms", type_="foreignkey"
+ )
+ op.create_foreign_key(None, "definition_terms", "term", ["term_id"], ["id"])
+ op.create_foreign_key(None, "definition_terms", "definition", ["definition_id"], ["id"])
+ op.alter_column(
+ "dispatch_user_project", "dispatch_user_id", existing_type=sa.INTEGER(), nullable=False
+ )
+ op.alter_column(
+ "dispatch_user_project", "project_id", existing_type=sa.INTEGER(), nullable=False
+ )
+ op.drop_constraint(
+ "dispatch_user_project_project_id_fkey", "dispatch_user_project", type_="foreignkey"
+ )
+ op.drop_constraint(
+ "dispatch_user_project_dispatch_user_id_fkey", "dispatch_user_project", type_="foreignkey"
+ )
+ op.create_foreign_key(None, "dispatch_user_project", "project", ["project_id"], ["id"])
+ op.create_foreign_key(
+ None,
+ "dispatch_user_project",
+ "dispatch_user",
+ ["dispatch_user_id"],
+ ["id"],
+ referent_schema="dispatch_core",
+ )
+ op.drop_column("dispatch_user_project", "id")
+ op.drop_constraint("document_project_id_fkey", "document", type_="foreignkey")
+ op.drop_constraint("document_incident_id_fkey", "document", type_="foreignkey")
+ op.drop_constraint("document_report_id_fkey", "document", type_="foreignkey")
+ op.create_foreign_key(None, "document", "project", ["project_id"], ["id"], ondelete="CASCADE")
+ op.create_foreign_key(None, "document", "report", ["report_id"], ["id"])
+ op.create_foreign_key(
+ None, "document", "incident", ["incident_id"], ["id"], ondelete="CASCADE", use_alter=True
+ )
+ op.drop_constraint("event_incident_id_fkey", "event", type_="foreignkey")
+ op.drop_constraint("event_individual_id_fkey", "event", type_="foreignkey")
+ op.create_foreign_key(
+ None, "event", "individual_contact", ["individual_id"], ["id"], ondelete="CASCADE"
+ )
+ op.create_foreign_key(None, "event", "incident", ["incident_id"], ["id"], ondelete="CASCADE")
+ op.drop_constraint("feedback_incident_id_fkey", "feedback", type_="foreignkey")
+ op.drop_constraint("feedback_participant_id_fkey", "feedback", type_="foreignkey")
+ op.create_foreign_key(None, "feedback", "incident", ["incident_id"], ["id"], ondelete="CASCADE")
+ op.create_foreign_key(None, "feedback", "participant", ["participant_id"], ["id"])
+ op.drop_constraint("group_incident_id_fkey", "group", type_="foreignkey")
+ op.create_foreign_key(None, "group", "incident", ["incident_id"], ["id"], ondelete="CASCADE")
+ op.drop_constraint("incident_project_id_fkey", "incident", type_="foreignkey")
+ op.drop_constraint("incident_incident_type_id_fkey", "incident", type_="foreignkey")
+ op.drop_constraint("incident_duplicate_id_fkey", "incident", type_="foreignkey")
+ op.drop_constraint("incident_incident_priority_id_fkey", "incident", type_="foreignkey")
+ op.create_foreign_key(None, "incident", "project", ["project_id"], ["id"], ondelete="CASCADE")
+ op.create_foreign_key(None, "incident", "incident_priority", ["incident_priority_id"], ["id"])
+ op.create_foreign_key(None, "incident", "incident", ["duplicate_id"], ["id"])
+ op.create_foreign_key(None, "incident", "incident_type", ["incident_type_id"], ["id"])
+ op.drop_constraint("incident_cost_incident_id_fkey", "incident_cost", type_="foreignkey")
+ op.drop_constraint("incident_cost_project_id_fkey", "incident_cost", type_="foreignkey")
+ op.drop_constraint(
+ "incident_cost_incident_cost_type_id_fkey", "incident_cost", type_="foreignkey"
+ )
+ op.create_foreign_key(
+ None, "incident_cost", "incident_cost_type", ["incident_cost_type_id"], ["id"]
+ )
+ op.create_foreign_key(
+ None, "incident_cost", "incident", ["incident_id"], ["id"], ondelete="CASCADE"
+ )
+ op.create_foreign_key(
+ None, "incident_cost", "project", ["project_id"], ["id"], ondelete="CASCADE"
+ )
+ op.drop_constraint(
+ "incident_cost_type_project_id_fkey", "incident_cost_type", type_="foreignkey"
+ )
+ op.create_foreign_key(
+ None, "incident_cost_type", "project", ["project_id"], ["id"], ondelete="CASCADE"
+ )
+ op.drop_constraint("incident_priority_project_id_fkey", "incident_priority", type_="foreignkey")
+ op.create_foreign_key(
+ None, "incident_priority", "project", ["project_id"], ["id"], ondelete="CASCADE"
+ )
+ op.drop_constraint(
+ "incident_type_template_document_id_fkey", "incident_type", type_="foreignkey"
+ )
+ op.drop_constraint("incident_type_project_id_fkey", "incident_type", type_="foreignkey")
+ op.drop_constraint(
+ "incident_type_commander_service_id_fkey", "incident_type", type_="foreignkey"
+ )
+ op.execute(
+ "ALTER TABLE incident_type DROP CONSTRAINT IF EXISTS incident_type_liason_service_id_fkey"
+ )
+ op.create_foreign_key(None, "incident_type", "service", ["commander_service_id"], ["id"])
+ op.create_foreign_key(None, "incident_type", "document", ["template_document_id"], ["id"])
+ op.create_foreign_key(None, "incident_type", "service", ["liaison_service_id"], ["id"])
+ op.create_foreign_key(
+ None, "incident_type", "project", ["project_id"], ["id"], ondelete="CASCADE"
+ )
+ op.drop_constraint(
+ "individual_contact_project_id_fkey", "individual_contact", type_="foreignkey"
+ )
+ op.drop_constraint(
+ "individual_contact_team_contact_id_fkey", "individual_contact", type_="foreignkey"
+ )
+ op.create_foreign_key(None, "individual_contact", "team_contact", ["team_contact_id"], ["id"])
+ op.create_foreign_key(
+ None, "individual_contact", "project", ["project_id"], ["id"], ondelete="CASCADE"
+ )
+ op.drop_constraint("notification_project_id_fkey", "notification", type_="foreignkey")
+ op.create_foreign_key(
+ None, "notification", "project", ["project_id"], ["id"], ondelete="CASCADE"
+ )
+ op.drop_constraint("participant_service_id_fkey", "participant", type_="foreignkey")
+ op.drop_constraint("participant_incident_id_fkey", "participant", type_="foreignkey")
+ op.drop_constraint("participant_added_by_id_fkey", "participant", type_="foreignkey")
+ op.drop_constraint("participant_individual_contact_id_fkey", "participant", type_="foreignkey")
+ op.create_foreign_key(
+ None, "participant", "individual_contact", ["individual_contact_id"], ["id"]
+ )
+ op.create_foreign_key(
+ None, "participant", "service", ["service_id"], ["id"], ondelete="CASCADE"
+ )
+ op.create_foreign_key(None, "participant", "participant", ["added_by_id"], ["id"])
+ op.create_foreign_key(
+ None, "participant", "incident", ["incident_id"], ["id"], ondelete="CASCADE", use_alter=True
+ )
+ op.drop_constraint(
+ "participant_role_participant_id_fkey", "participant_role", type_="foreignkey"
+ )
+ op.create_foreign_key(
+ None, "participant_role", "participant", ["participant_id"], ["id"], ondelete="CASCADE"
+ )
+ op.drop_constraint("plugin_instance_plugin_id_fkey", "plugin_instance", type_="foreignkey")
+ op.drop_constraint("plugin_instance_project_id_fkey", "plugin_instance", type_="foreignkey")
+ op.create_foreign_key(
+ None, "plugin_instance", "project", ["project_id"], ["id"], ondelete="CASCADE"
+ )
+ op.create_foreign_key(
+ None, "plugin_instance", "plugin", ["plugin_id"], ["id"], referent_schema="dispatch_core"
+ )
+ op.drop_constraint("project_organization_id_fkey", "project", type_="foreignkey")
+ op.create_foreign_key(
+ None,
+ "project",
+ "organization",
+ ["organization_id"],
+ ["id"],
+ referent_schema="dispatch_core",
+ )
+ op.drop_constraint("recommendation_incident_id_fkey", "recommendation", type_="foreignkey")
+ op.create_foreign_key(None, "recommendation", "incident", ["incident_id"], ["id"])
+ op.drop_column("recommendation", "text")
+ op.drop_constraint(
+ "recommendation_match_recommendation_id_fkey", "recommendation_match", type_="foreignkey"
+ )
+ op.create_foreign_key(
+ None, "recommendation_match", "recommendation", ["recommendation_id"], ["id"]
+ )
+ op.drop_constraint("report_participant_id_fkey", "report", type_="foreignkey")
+ op.drop_constraint("report_incident_id_fkey", "report", type_="foreignkey")
+ op.create_foreign_key(None, "report", "participant", ["participant_id"], ["id"])
+ op.create_foreign_key(
+ None, "report", "incident", ["incident_id"], ["id"], ondelete="CASCADE", use_alter=True
+ )
+ op.drop_constraint("search_filter_project_id_fkey", "search_filter", type_="foreignkey")
+ op.drop_constraint("search_filter_creator_id_fkey", "search_filter", type_="foreignkey")
+ op.create_foreign_key(
+ None, "search_filter", "project", ["project_id"], ["id"], ondelete="CASCADE"
+ )
+ op.create_foreign_key(
+ None,
+ "search_filter",
+ "dispatch_user",
+ ["creator_id"],
+ ["id"],
+ referent_schema="dispatch_core",
+ )
+ op.drop_constraint("service_project_id_fkey", "service", type_="foreignkey")
+ op.create_foreign_key(None, "service", "project", ["project_id"], ["id"], ondelete="CASCADE")
+ op.drop_constraint("service_incident_service_id_fkey", "service_incident", type_="foreignkey")
+ op.drop_constraint("service_incident_incident_id_fkey", "service_incident", type_="foreignkey")
+ op.create_foreign_key(None, "service_incident", "incident", ["incident_id"], ["id"])
+ op.create_foreign_key(None, "service_incident", "service", ["service_id"], ["id"])
+ op.drop_constraint("storage_incident_id_fkey", "storage", type_="foreignkey")
+ op.create_foreign_key(None, "storage", "incident", ["incident_id"], ["id"], ondelete="CASCADE")
+ op.drop_constraint("tag_project_id_fkey", "tag", type_="foreignkey")
+ op.drop_constraint("tag_tag_type_id_fkey", "tag", type_="foreignkey")
+ op.create_foreign_key(None, "tag", "project", ["project_id"], ["id"], ondelete="CASCADE")
+ op.create_foreign_key(None, "tag", "tag_type", ["tag_type_id"], ["id"])
+ op.drop_constraint("tag_type_project_id_fkey", "tag_type", type_="foreignkey")
+ op.create_foreign_key(None, "tag_type", "project", ["project_id"], ["id"], ondelete="CASCADE")
+ op.drop_constraint("task_incident_id_fkey", "task", type_="foreignkey")
+ op.drop_constraint("task_creator_id_fkey", "task", type_="foreignkey")
+ op.drop_constraint("task_owner_id_fkey", "task", type_="foreignkey")
+ op.create_foreign_key(None, "task", "incident", ["incident_id"], ["id"], ondelete="CASCADE")
+ op.create_foreign_key(None, "task", "participant", ["creator_id"], ["id"], ondelete="CASCADE")
+ op.create_foreign_key(None, "task", "participant", ["owner_id"], ["id"], ondelete="CASCADE")
+ op.drop_constraint("task_assignees_task_id_fkey", "task_assignees", type_="foreignkey")
+ op.drop_constraint("task_assignees_participant_id_fkey", "task_assignees", type_="foreignkey")
+ op.create_foreign_key(
+ None, "task_assignees", "participant", ["participant_id"], ["id"], ondelete="CASCADE"
+ )
+ op.create_foreign_key(None, "task_assignees", "task", ["task_id"], ["id"], ondelete="CASCADE")
+ op.drop_constraint("task_tickets_task_id_fkey", "task_tickets", type_="foreignkey")
+ op.drop_constraint("task_tickets_ticket_id_fkey", "task_tickets", type_="foreignkey")
+ op.create_foreign_key(None, "task_tickets", "task", ["task_id"], ["id"], ondelete="CASCADE")
+ op.create_foreign_key(None, "task_tickets", "ticket", ["ticket_id"], ["id"], ondelete="CASCADE")
+ op.drop_constraint("team_contact_project_id_fkey", "team_contact", type_="foreignkey")
+ op.create_foreign_key(
+ None, "team_contact", "project", ["project_id"], ["id"], ondelete="CASCADE"
+ )
+ op.drop_constraint(
+ "team_contact_incident_incident_id_fkey", "team_contact_incident", type_="foreignkey"
+ )
+ op.drop_constraint(
+ "team_contact_incident_team_contact_id_fkey", "team_contact_incident", type_="foreignkey"
+ )
+ op.create_foreign_key(
+ None, "team_contact_incident", "team_contact", ["team_contact_id"], ["id"]
+ )
+ op.create_foreign_key(None, "team_contact_incident", "incident", ["incident_id"], ["id"])
+ op.drop_constraint("term_project_id_fkey", "term", type_="foreignkey")
+ op.create_foreign_key(None, "term", "project", ["project_id"], ["id"], ondelete="CASCADE")
+ op.drop_constraint("ticket_incident_id_fkey", "ticket", type_="foreignkey")
+ op.create_foreign_key(None, "ticket", "incident", ["incident_id"], ["id"], ondelete="CASCADE")
+ op.drop_constraint("workflow_plugin_id_fkey", "workflow", type_="foreignkey")
+ op.drop_constraint("workflow_project_id_fkey", "workflow", type_="foreignkey")
+ op.create_foreign_key(None, "workflow", "project", ["project_id"], ["id"], ondelete="CASCADE")
+ op.create_foreign_key(
+ None, "workflow", "plugin", ["plugin_id"], ["id"], referent_schema="dispatch_core"
+ )
+ op.drop_constraint(
+ "workflow_incident_priority_incident_priority_id_fkey",
+ "workflow_incident_priority",
+ type_="foreignkey",
+ )
+ op.drop_constraint(
+ "workflow_incident_priority_workflow_id_fkey",
+ "workflow_incident_priority",
+ type_="foreignkey",
+ )
+ op.create_foreign_key(
+ None,
+ "workflow_incident_priority",
+ "incident_priority",
+ ["incident_priority_id"],
+ ["id"],
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ None, "workflow_incident_priority", "workflow", ["workflow_id"], ["id"], ondelete="CASCADE"
+ )
+ op.drop_constraint(
+ "workflow_incident_type_incident_type_id_fkey", "workflow_incident_type", type_="foreignkey"
+ )
+ op.drop_constraint(
+ "workflow_incident_type_workflow_id_fkey", "workflow_incident_type", type_="foreignkey"
+ )
+ op.create_foreign_key(
+ None,
+ "workflow_incident_type",
+ "incident_type",
+ ["incident_type_id"],
+ ["id"],
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ None, "workflow_incident_type", "workflow", ["workflow_id"], ["id"], ondelete="CASCADE"
+ )
+ op.drop_constraint("workflow_instance_creator_id_fkey", "workflow_instance", type_="foreignkey")
+ op.drop_constraint(
+ "workflow_instance_incident_id_fkey", "workflow_instance", type_="foreignkey"
+ )
+ op.drop_constraint(
+ "workflow_instance_workflow_id_fkey", "workflow_instance", type_="foreignkey"
+ )
+ op.create_foreign_key(None, "workflow_instance", "participant", ["creator_id"], ["id"])
+ op.create_foreign_key(None, "workflow_instance", "workflow", ["workflow_id"], ["id"])
+ op.create_foreign_key(
+ None, "workflow_instance", "incident", ["incident_id"], ["id"], ondelete="CASCADE"
+ )
+ op.drop_constraint(
+ "workflow_instance_artifact_document_id_fkey",
+ "workflow_instance_artifact",
+ type_="foreignkey",
+ )
+ op.drop_constraint(
+ "workflow_instance_artifact_workflow_instance_id_fkey",
+ "workflow_instance_artifact",
+ type_="foreignkey",
+ )
+ op.create_foreign_key(
+ None, "workflow_instance_artifact", "document", ["document_id"], ["id"], ondelete="CASCADE"
+ )
+ op.create_foreign_key(
+ None,
+ "workflow_instance_artifact",
+ "workflow_instance",
+ ["workflow_instance_id"],
+ ["id"],
+ ondelete="CASCADE",
+ )
+ op.drop_constraint("workflow_term_workflow_id_fkey", "workflow_term", type_="foreignkey")
+ op.drop_constraint("workflow_term_term_id_fkey", "workflow_term", type_="foreignkey")
+ op.create_foreign_key(None, "workflow_term", "term", ["term_id"], ["id"], ondelete="CASCADE")
+ op.create_foreign_key(
+ None, "workflow_term", "workflow", ["workflow_id"], ["id"], ondelete="CASCADE"
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint(None, "workflow_term", type_="foreignkey")
+ op.drop_constraint(None, "workflow_term", type_="foreignkey")
+ op.create_foreign_key(
+ "workflow_term_term_id_fkey",
+ "workflow_term",
+ "term",
+ ["term_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ "workflow_term_workflow_id_fkey",
+ "workflow_term",
+ "workflow",
+ ["workflow_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "workflow_instance_artifact", type_="foreignkey")
+ op.drop_constraint(None, "workflow_instance_artifact", type_="foreignkey")
+ op.create_foreign_key(
+ "workflow_instance_artifact_workflow_instance_id_fkey",
+ "workflow_instance_artifact",
+ "workflow_instance",
+ ["workflow_instance_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ "workflow_instance_artifact_document_id_fkey",
+ "workflow_instance_artifact",
+ "document",
+ ["document_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "workflow_instance", type_="foreignkey")
+ op.drop_constraint(None, "workflow_instance", type_="foreignkey")
+ op.drop_constraint(None, "workflow_instance", type_="foreignkey")
+ op.create_foreign_key(
+ "workflow_instance_workflow_id_fkey",
+ "workflow_instance",
+ "workflow",
+ ["workflow_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.create_foreign_key(
+ "workflow_instance_incident_id_fkey",
+ "workflow_instance",
+ "incident",
+ ["incident_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ "workflow_instance_creator_id_fkey",
+ "workflow_instance",
+ "participant",
+ ["creator_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.drop_constraint(None, "workflow_incident_type", type_="foreignkey")
+ op.drop_constraint(None, "workflow_incident_type", type_="foreignkey")
+ op.create_foreign_key(
+ "workflow_incident_type_workflow_id_fkey",
+ "workflow_incident_type",
+ "workflow",
+ ["workflow_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ "workflow_incident_type_incident_type_id_fkey",
+ "workflow_incident_type",
+ "incident_type",
+ ["incident_type_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "workflow_incident_priority", type_="foreignkey")
+ op.drop_constraint(None, "workflow_incident_priority", type_="foreignkey")
+ op.create_foreign_key(
+ "workflow_incident_priority_workflow_id_fkey",
+ "workflow_incident_priority",
+ "workflow",
+ ["workflow_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ "workflow_incident_priority_incident_priority_id_fkey",
+ "workflow_incident_priority",
+ "incident_priority",
+ ["incident_priority_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "workflow", type_="foreignkey")
+ op.drop_constraint(None, "workflow", type_="foreignkey")
+ op.create_foreign_key(
+ "workflow_project_id_fkey",
+ "workflow",
+ "project",
+ ["project_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ "workflow_plugin_id_fkey",
+ "workflow",
+ "plugin",
+ ["plugin_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.drop_constraint(None, "ticket", type_="foreignkey")
+ op.create_foreign_key(
+ "ticket_incident_id_fkey",
+ "ticket",
+ "incident",
+ ["incident_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "term", type_="foreignkey")
+ op.create_foreign_key(
+ "term_project_id_fkey",
+ "term",
+ "project",
+ ["project_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "team_contact_incident", type_="foreignkey")
+ op.drop_constraint(None, "team_contact_incident", type_="foreignkey")
+ op.create_foreign_key(
+ "team_contact_incident_team_contact_id_fkey",
+ "team_contact_incident",
+ "team_contact",
+ ["team_contact_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.create_foreign_key(
+ "team_contact_incident_incident_id_fkey",
+ "team_contact_incident",
+ "incident",
+ ["incident_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.drop_constraint(None, "team_contact", type_="foreignkey")
+ op.create_foreign_key(
+ "team_contact_project_id_fkey",
+ "team_contact",
+ "project",
+ ["project_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "task_tickets", type_="foreignkey")
+ op.drop_constraint(None, "task_tickets", type_="foreignkey")
+ op.create_foreign_key(
+ "task_tickets_ticket_id_fkey",
+ "task_tickets",
+ "ticket",
+ ["ticket_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ "task_tickets_task_id_fkey",
+ "task_tickets",
+ "task",
+ ["task_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "task_assignees", type_="foreignkey")
+ op.drop_constraint(None, "task_assignees", type_="foreignkey")
+ op.create_foreign_key(
+ "task_assignees_participant_id_fkey",
+ "task_assignees",
+ "participant",
+ ["participant_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ "task_assignees_task_id_fkey",
+ "task_assignees",
+ "task",
+ ["task_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "task", type_="foreignkey")
+ op.drop_constraint(None, "task", type_="foreignkey")
+ op.drop_constraint(None, "task", type_="foreignkey")
+ op.create_foreign_key(
+ "task_owner_id_fkey",
+ "task",
+ "participant",
+ ["owner_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ "task_creator_id_fkey",
+ "task",
+ "participant",
+ ["creator_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ "task_incident_id_fkey",
+ "task",
+ "incident",
+ ["incident_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "tag_type", type_="foreignkey")
+ op.create_foreign_key(
+ "tag_type_project_id_fkey",
+ "tag_type",
+ "project",
+ ["project_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "tag", type_="foreignkey")
+ op.drop_constraint(None, "tag", type_="foreignkey")
+ op.create_foreign_key(
+ "tag_tag_type_id_fkey", "tag", "tag_type", ["tag_type_id"], ["id"], referent_schema="public"
+ )
+ op.create_foreign_key(
+ "tag_project_id_fkey",
+ "tag",
+ "project",
+ ["project_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "storage", type_="foreignkey")
+ op.create_foreign_key(
+ "storage_incident_id_fkey",
+ "storage",
+ "incident",
+ ["incident_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "service_incident", type_="foreignkey")
+ op.drop_constraint(None, "service_incident", type_="foreignkey")
+ op.create_foreign_key(
+ "service_incident_incident_id_fkey",
+ "service_incident",
+ "incident",
+ ["incident_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.create_foreign_key(
+ "service_incident_service_id_fkey",
+ "service_incident",
+ "service",
+ ["service_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.drop_constraint(None, "service", type_="foreignkey")
+ op.create_foreign_key(
+ "service_project_id_fkey",
+ "service",
+ "project",
+ ["project_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "search_filter", type_="foreignkey")
+ op.drop_constraint(None, "search_filter", type_="foreignkey")
+ op.create_foreign_key(
+ "search_filter_creator_id_fkey",
+ "search_filter",
+ "dispatch_user",
+ ["creator_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.create_foreign_key(
+ "search_filter_project_id_fkey",
+ "search_filter",
+ "project",
+ ["project_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "report", type_="foreignkey")
+ op.drop_constraint(None, "report", type_="foreignkey")
+ op.create_foreign_key(
+ "report_incident_id_fkey",
+ "report",
+ "incident",
+ ["incident_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ "report_participant_id_fkey",
+ "report",
+ "participant",
+ ["participant_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.drop_constraint(None, "recommendation_match", type_="foreignkey")
+ op.create_foreign_key(
+ "recommendation_match_recommendation_id_fkey",
+ "recommendation_match",
+ "recommendation",
+ ["recommendation_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.add_column(
+ "recommendation", sa.Column("text", sa.VARCHAR(), autoincrement=False, nullable=True)
+ )
+ op.drop_constraint(None, "recommendation", type_="foreignkey")
+ op.create_foreign_key(
+ "recommendation_incident_id_fkey",
+ "recommendation",
+ "incident",
+ ["incident_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.drop_constraint(None, "project", type_="foreignkey")
+ op.create_foreign_key(
+ "project_organization_id_fkey",
+ "project",
+ "organization",
+ ["organization_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.drop_constraint(None, "plugin_instance", type_="foreignkey")
+ op.drop_constraint(None, "plugin_instance", type_="foreignkey")
+ op.create_foreign_key(
+ "plugin_instance_project_id_fkey",
+ "plugin_instance",
+ "project",
+ ["project_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ "plugin_instance_plugin_id_fkey",
+ "plugin_instance",
+ "plugin",
+ ["plugin_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.drop_constraint(None, "participant_role", type_="foreignkey")
+ op.create_foreign_key(
+ "participant_role_participant_id_fkey",
+ "participant_role",
+ "participant",
+ ["participant_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "participant", type_="foreignkey")
+ op.drop_constraint(None, "participant", type_="foreignkey")
+ op.drop_constraint(None, "participant", type_="foreignkey")
+ op.drop_constraint(None, "participant", type_="foreignkey")
+ op.create_foreign_key(
+ "participant_individual_contact_id_fkey",
+ "participant",
+ "individual_contact",
+ ["individual_contact_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.create_foreign_key(
+ "participant_added_by_id_fkey",
+ "participant",
+ "participant",
+ ["added_by_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.create_foreign_key(
+ "participant_incident_id_fkey",
+ "participant",
+ "incident",
+ ["incident_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ "participant_service_id_fkey",
+ "participant",
+ "service",
+ ["service_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "notification", type_="foreignkey")
+ op.create_foreign_key(
+ "notification_project_id_fkey",
+ "notification",
+ "project",
+ ["project_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "individual_contact", type_="foreignkey")
+ op.drop_constraint(None, "individual_contact", type_="foreignkey")
+ op.create_foreign_key(
+ "individual_contact_team_contact_id_fkey",
+ "individual_contact",
+ "team_contact",
+ ["team_contact_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.create_foreign_key(
+ "individual_contact_project_id_fkey",
+ "individual_contact",
+ "project",
+ ["project_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "incident_type", type_="foreignkey")
+ op.drop_constraint(None, "incident_type", type_="foreignkey")
+ op.drop_constraint(None, "incident_type", type_="foreignkey")
+ op.drop_constraint(None, "incident_type", type_="foreignkey")
+ op.create_foreign_key(
+ "incident_type_liason_service_id_fkey",
+ "incident_type",
+ "service",
+ ["liaison_service_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.create_foreign_key(
+ "incident_type_commander_service_id_fkey",
+ "incident_type",
+ "service",
+ ["commander_service_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.create_foreign_key(
+ "incident_type_project_id_fkey",
+ "incident_type",
+ "project",
+ ["project_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ "incident_type_template_document_id_fkey",
+ "incident_type",
+ "document",
+ ["template_document_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.drop_constraint(None, "incident_priority", type_="foreignkey")
+ op.create_foreign_key(
+ "incident_priority_project_id_fkey",
+ "incident_priority",
+ "project",
+ ["project_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "incident_cost_type", type_="foreignkey")
+ op.create_foreign_key(
+ "incident_cost_type_project_id_fkey",
+ "incident_cost_type",
+ "project",
+ ["project_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "incident_cost", type_="foreignkey")
+ op.drop_constraint(None, "incident_cost", type_="foreignkey")
+ op.drop_constraint(None, "incident_cost", type_="foreignkey")
+ op.create_foreign_key(
+ "incident_cost_incident_cost_type_id_fkey",
+ "incident_cost",
+ "incident_cost_type",
+ ["incident_cost_type_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.create_foreign_key(
+ "incident_cost_project_id_fkey",
+ "incident_cost",
+ "project",
+ ["project_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ "incident_cost_incident_id_fkey",
+ "incident_cost",
+ "incident",
+ ["incident_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "incident", type_="foreignkey")
+ op.drop_constraint(None, "incident", type_="foreignkey")
+ op.drop_constraint(None, "incident", type_="foreignkey")
+ op.drop_constraint(None, "incident", type_="foreignkey")
+ op.create_foreign_key(
+ "incident_incident_priority_id_fkey",
+ "incident",
+ "incident_priority",
+ ["incident_priority_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.create_foreign_key(
+ "incident_duplicate_id_fkey",
+ "incident",
+ "incident",
+ ["duplicate_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.create_foreign_key(
+ "incident_incident_type_id_fkey",
+ "incident",
+ "incident_type",
+ ["incident_type_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.create_foreign_key(
+ "incident_project_id_fkey",
+ "incident",
+ "project",
+ ["project_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "group", type_="foreignkey")
+ op.create_foreign_key(
+ "group_incident_id_fkey",
+ "group",
+ "incident",
+ ["incident_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "feedback", type_="foreignkey")
+ op.drop_constraint(None, "feedback", type_="foreignkey")
+ op.create_foreign_key(
+ "feedback_participant_id_fkey",
+ "feedback",
+ "participant",
+ ["participant_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.create_foreign_key(
+ "feedback_incident_id_fkey",
+ "feedback",
+ "incident",
+ ["incident_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "event", type_="foreignkey")
+ op.drop_constraint(None, "event", type_="foreignkey")
+ op.create_foreign_key(
+ "event_individual_id_fkey",
+ "event",
+ "individual_contact",
+ ["individual_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ "event_incident_id_fkey",
+ "event",
+ "incident",
+ ["incident_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "document", type_="foreignkey")
+ op.drop_constraint(None, "document", type_="foreignkey")
+ op.drop_constraint(None, "document", type_="foreignkey")
+ op.create_foreign_key(
+ "document_report_id_fkey",
+ "document",
+ "report",
+ ["report_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.create_foreign_key(
+ "document_incident_id_fkey",
+ "document",
+ "incident",
+ ["incident_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ "document_project_id_fkey",
+ "document",
+ "project",
+ ["project_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.add_column(
+ "dispatch_user_project",
+ sa.Column(
+ "id",
+ sa.INTEGER(),
+ server_default=sa.text("nextval('public.dispatch_user_project_id_seq'::regclass)"),
+ autoincrement=True,
+ nullable=False,
+ ),
+ )
+ op.drop_constraint(None, "dispatch_user_project", type_="foreignkey")
+ op.drop_constraint(None, "dispatch_user_project", type_="foreignkey")
+ op.create_foreign_key(
+ "dispatch_user_project_dispatch_user_id_fkey",
+ "dispatch_user_project",
+ "dispatch_user",
+ ["dispatch_user_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.create_foreign_key(
+ "dispatch_user_project_project_id_fkey",
+ "dispatch_user_project",
+ "project",
+ ["project_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.alter_column(
+ "dispatch_user_project", "project_id", existing_type=sa.INTEGER(), nullable=True
+ )
+ op.alter_column(
+ "dispatch_user_project", "dispatch_user_id", existing_type=sa.INTEGER(), nullable=True
+ )
+ op.drop_constraint(None, "definition_terms", type_="foreignkey")
+ op.drop_constraint(None, "definition_terms", type_="foreignkey")
+ op.create_foreign_key(
+ "definition_terms_definition_id_fkey",
+ "definition_terms",
+ "definition",
+ ["definition_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.create_foreign_key(
+ "definition_terms_term_id_fkey",
+ "definition_terms",
+ "term",
+ ["term_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.drop_constraint(None, "definition_teams", type_="foreignkey")
+ op.drop_constraint(None, "definition_teams", type_="foreignkey")
+ op.create_foreign_key(
+ "definition_teams_team_contact_id_fkey",
+ "definition_teams",
+ "team_contact",
+ ["team_contact_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.create_foreign_key(
+ "definition_teams_definition_id_fkey",
+ "definition_teams",
+ "definition",
+ ["definition_id"],
+ ["id"],
+ referent_schema="public",
+ )
+ op.drop_constraint(None, "definition", type_="foreignkey")
+ op.create_foreign_key(
+ "definition_project_id_fkey",
+ "definition",
+ "project",
+ ["project_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "conversation", type_="foreignkey")
+ op.create_foreign_key(
+ "conversation_incident_id_fkey",
+ "conversation",
+ "incident",
+ ["incident_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "conference", type_="foreignkey")
+ op.create_foreign_key(
+ "conference_incident_id_fkey",
+ "conference",
+ "incident",
+ ["incident_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "assoc_team_contact_filters", type_="foreignkey")
+ op.drop_constraint(None, "assoc_team_contact_filters", type_="foreignkey")
+ op.create_foreign_key(
+ "assoc_team_contact_filters_team_contact_id_fkey",
+ "assoc_team_contact_filters",
+ "team_contact",
+ ["team_contact_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ "assoc_team_contact_filters_search_filter_id_fkey",
+ "assoc_team_contact_filters",
+ "search_filter",
+ ["search_filter_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "assoc_service_filters", type_="foreignkey")
+ op.drop_constraint(None, "assoc_service_filters", type_="foreignkey")
+ op.create_foreign_key(
+ "assoc_service_filters_search_filter_id_fkey",
+ "assoc_service_filters",
+ "search_filter",
+ ["search_filter_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ "assoc_service_filters_service_id_fkey",
+ "assoc_service_filters",
+ "service",
+ ["service_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "assoc_notification_filters", type_="foreignkey")
+ op.drop_constraint(None, "assoc_notification_filters", type_="foreignkey")
+ op.create_foreign_key(
+ "assoc_notification_filters_notification_id_fkey",
+ "assoc_notification_filters",
+ "notification",
+ ["notification_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ "assoc_notification_filters_search_filter_id_fkey",
+ "assoc_notification_filters",
+ "search_filter",
+ ["search_filter_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "assoc_individual_contact_filters", type_="foreignkey")
+ op.drop_constraint(None, "assoc_individual_contact_filters", type_="foreignkey")
+ op.create_foreign_key(
+ "assoc_individual_contact_filters_individual_contact_id_fkey",
+ "assoc_individual_contact_filters",
+ "individual_contact",
+ ["individual_contact_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ "assoc_individual_contact_filters_search_filter_id_fkey",
+ "assoc_individual_contact_filters",
+ "search_filter",
+ ["search_filter_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "assoc_incident_terms", type_="foreignkey")
+ op.drop_constraint(None, "assoc_incident_terms", type_="foreignkey")
+ op.create_foreign_key(
+ "assoc_incident_terms_term_id_fkey",
+ "assoc_incident_terms",
+ "term",
+ ["term_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ "assoc_incident_terms_incident_id_fkey",
+ "assoc_incident_terms",
+ "incident",
+ ["incident_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "assoc_incident_tags", type_="foreignkey")
+ op.drop_constraint(None, "assoc_incident_tags", type_="foreignkey")
+ op.create_foreign_key(
+ "assoc_incident_tags_tag_id_fkey",
+ "assoc_incident_tags",
+ "tag",
+ ["tag_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ "assoc_incident_tags_incident_id_fkey",
+ "assoc_incident_tags",
+ "incident",
+ ["incident_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(None, "assoc_document_filters", type_="foreignkey")
+ op.drop_constraint(None, "assoc_document_filters", type_="foreignkey")
+ op.create_foreign_key(
+ "assoc_document_filters_search_filter_id_fkey",
+ "assoc_document_filters",
+ "search_filter",
+ ["search_filter_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ "assoc_document_filters_document_id_fkey",
+ "assoc_document_filters",
+ "document",
+ ["document_id"],
+ ["id"],
+ referent_schema="public",
+ ondelete="CASCADE",
+ )
+ op.create_table(
+ "recommendation_accuracy",
+ sa.Column(
+ "id",
+ sa.INTEGER(),
+ server_default=sa.text("nextval('public.recommendation_accuracy_id_seq'::regclass)"),
+ autoincrement=True,
+ nullable=False,
+ ),
+ sa.Column("recommendation_id", sa.INTEGER(), autoincrement=False, nullable=True),
+ sa.Column("correct", sa.BOOLEAN(), autoincrement=False, nullable=True),
+ sa.Column("resource_id", sa.INTEGER(), autoincrement=False, nullable=True),
+ sa.Column("resource_type", sa.VARCHAR(), autoincrement=False, nullable=True),
+ sa.ForeignKeyConstraint(
+ ["recommendation_id"],
+ ["public.recommendation.id"],
+ name="recommendation_accuracy_recommendation_id_fkey",
+ ),
+ sa.PrimaryKeyConstraint("id", name="recommendation_accuracy_pkey"),
+ )
+ op.create_table(
+ "recommendation_team_contacts",
+ sa.Column("team_contact_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.Column("recommendation_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.ForeignKeyConstraint(
+ ["recommendation_id"],
+ ["public.recommendation.id"],
+ name="recommendation_team_contacts_recommendation_id_fkey",
+ ),
+ sa.ForeignKeyConstraint(
+ ["team_contact_id"],
+ ["public.team_contact.id"],
+ name="recommendation_team_contacts_team_contact_id_fkey",
+ ),
+ sa.PrimaryKeyConstraint(
+ "team_contact_id", "recommendation_id", name="recommendation_team_contacts_pkey"
+ ),
+ )
+ op.create_table(
+ "team_contact_incident_type",
+ sa.Column("incident_type_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.Column("team_contact_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.ForeignKeyConstraint(
+ ["incident_type_id"],
+ ["public.incident_type.id"],
+ name="team_contact_incident_type_incident_type_id_fkey",
+ ),
+ sa.ForeignKeyConstraint(
+ ["team_contact_id"],
+ ["public.team_contact.id"],
+ name="team_contact_incident_type_team_contact_id_fkey",
+ ),
+ sa.PrimaryKeyConstraint(
+ "incident_type_id", "team_contact_id", name="team_contact_incident_type_pkey"
+ ),
+ )
+ op.create_table(
+ "service_terms",
+ sa.Column("term_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.Column("service_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.ForeignKeyConstraint(
+ ["service_id"], ["public.service.id"], name="service_terms_service_id_fkey"
+ ),
+ sa.ForeignKeyConstraint(["term_id"], ["public.term.id"], name="service_terms_term_id_fkey"),
+ sa.PrimaryKeyConstraint("term_id", "service_id", name="service_terms_pkey"),
+ )
+ op.create_table(
+ "recommendation_services",
+ sa.Column("service_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.Column("recommendation_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.ForeignKeyConstraint(
+ ["recommendation_id"],
+ ["public.recommendation.id"],
+ name="recommendation_services_recommendation_id_fkey",
+ ),
+ sa.ForeignKeyConstraint(
+ ["service_id"], ["public.service.id"], name="recommendation_services_service_id_fkey"
+ ),
+ sa.PrimaryKeyConstraint(
+ "service_id", "recommendation_id", name="recommendation_services_pkey"
+ ),
+ )
+ op.create_table(
+ "document_incident_type",
+ sa.Column("incident_type_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.Column("document_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.ForeignKeyConstraint(
+ ["document_id"], ["public.document.id"], name="document_incident_type_document_id_fkey"
+ ),
+ sa.ForeignKeyConstraint(
+ ["incident_type_id"],
+ ["public.incident_type.id"],
+ name="document_incident_type_incident_type_id_fkey",
+ ),
+ sa.PrimaryKeyConstraint(
+ "incident_type_id", "document_id", name="document_incident_type_pkey"
+ ),
+ )
+ op.create_table(
+ "service_incident_priority",
+ sa.Column("incident_priority_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.Column("service_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.ForeignKeyConstraint(
+ ["incident_priority_id"],
+ ["public.incident_priority.id"],
+ name="service_incident_priority_incident_priority_id_fkey",
+ ),
+ sa.ForeignKeyConstraint(
+ ["service_id"], ["public.service.id"], name="service_incident_priority_service_id_fkey"
+ ),
+ sa.PrimaryKeyConstraint(
+ "incident_priority_id", "service_id", name="service_incident_priority_pkey"
+ ),
+ )
+ op.create_table(
+ "document_incident_priority",
+ sa.Column("incident_priority_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.Column("document_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.ForeignKeyConstraint(
+ ["document_id"],
+ ["public.document.id"],
+ name="document_incident_priority_document_id_fkey",
+ ),
+ sa.ForeignKeyConstraint(
+ ["incident_priority_id"],
+ ["public.incident_priority.id"],
+ name="document_incident_priority_incident_priority_id_fkey",
+ ),
+ sa.PrimaryKeyConstraint(
+ "incident_priority_id", "document_id", name="document_incident_priority_pkey"
+ ),
+ )
+ op.create_table(
+ "recommendation_incident_priorities",
+ sa.Column("incident_priority_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.Column("recommendation_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.ForeignKeyConstraint(
+ ["incident_priority_id"],
+ ["public.incident_priority.id"],
+ name="recommendation_incident_priorities_incident_priority_id_fkey",
+ ),
+ sa.ForeignKeyConstraint(
+ ["recommendation_id"],
+ ["public.recommendation.id"],
+ name="recommendation_incident_priorities_recommendation_id_fkey",
+ ),
+ sa.PrimaryKeyConstraint(
+ "incident_priority_id",
+ "recommendation_id",
+ name="recommendation_incident_priorities_pkey",
+ ),
+ )
+ op.create_table(
+ "assoc_individual_contact_incident_type",
+ sa.Column("incident_type_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.Column("individual_contact_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.ForeignKeyConstraint(
+ ["incident_type_id"],
+ ["public.incident_type.id"],
+ name="assoc_individual_contact_incident_type_incident_type_id_fkey",
+ ),
+ sa.ForeignKeyConstraint(
+ ["individual_contact_id"],
+ ["public.individual_contact.id"],
+ name="assoc_individual_contact_incident_ty_individual_contact_id_fkey",
+ ),
+ sa.PrimaryKeyConstraint(
+ "incident_type_id",
+ "individual_contact_id",
+ name="assoc_individual_contact_incident_type_pkey",
+ ),
+ )
+ op.create_table(
+ "recommendation_incident_types",
+ sa.Column("incident_type_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.Column("recommendation_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.ForeignKeyConstraint(
+ ["incident_type_id"],
+ ["public.incident_type.id"],
+ name="recommendation_incident_types_incident_type_id_fkey",
+ ),
+ sa.ForeignKeyConstraint(
+ ["recommendation_id"],
+ ["public.recommendation.id"],
+ name="recommendation_incident_types_recommendation_id_fkey",
+ ),
+ sa.PrimaryKeyConstraint(
+ "incident_type_id", "recommendation_id", name="recommendation_incident_types_pkey"
+ ),
+ )
+ op.create_table(
+ "assoc_individual_contact_terms",
+ sa.Column("term_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.Column("individual_contact_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.ForeignKeyConstraint(
+ ["individual_contact_id"],
+ ["public.individual_contact.id"],
+ name="assoc_individual_contact_terms_individual_contact_id_fkey",
+ ),
+ sa.ForeignKeyConstraint(
+ ["term_id"], ["public.term.id"], name="assoc_individual_contact_terms_term_id_fkey"
+ ),
+ sa.PrimaryKeyConstraint(
+ "term_id", "individual_contact_id", name="assoc_individual_contact_terms_pkey"
+ ),
+ )
+ op.create_table(
+ "recommendation_individual_contacts",
+ sa.Column("individual_contact_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.Column("recommendation_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.ForeignKeyConstraint(
+ ["individual_contact_id"],
+ ["public.individual_contact.id"],
+ name="recommendation_individual_contacts_individual_contact_id_fkey",
+ ),
+ sa.ForeignKeyConstraint(
+ ["recommendation_id"],
+ ["public.recommendation.id"],
+ name="recommendation_individual_contacts_recommendation_id_fkey",
+ ),
+ sa.PrimaryKeyConstraint(
+ "individual_contact_id",
+ "recommendation_id",
+ name="recommendation_individual_contacts_pkey",
+ ),
+ )
+ op.create_table(
+ "team_contact_incident_priority",
+ sa.Column("incident_priority_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.Column("team_contact_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.ForeignKeyConstraint(
+ ["incident_priority_id"],
+ ["public.incident_priority.id"],
+ name="team_contact_incident_priority_incident_priority_id_fkey",
+ ),
+ sa.ForeignKeyConstraint(
+ ["team_contact_id"],
+ ["public.team_contact.id"],
+ name="team_contact_incident_priority_team_contact_id_fkey",
+ ),
+ sa.PrimaryKeyConstraint(
+ "incident_priority_id", "team_contact_id", name="team_contact_incident_priority_pkey"
+ ),
+ )
+ op.create_table(
+ "recommendation_terms",
+ sa.Column("term_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.Column("recommendation_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.ForeignKeyConstraint(
+ ["recommendation_id"],
+ ["public.recommendation.id"],
+ name="recommendation_terms_recommendation_id_fkey",
+ ),
+ sa.ForeignKeyConstraint(
+ ["term_id"], ["public.term.id"], name="recommendation_terms_term_id_fkey"
+ ),
+ sa.PrimaryKeyConstraint("term_id", "recommendation_id", name="recommendation_terms_pkey"),
+ )
+ op.create_table(
+ "assoc_individual_contact_incident_priority",
+ sa.Column("incident_priority_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.Column("individual_contact_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.ForeignKeyConstraint(
+ ["incident_priority_id"],
+ ["public.incident_priority.id"],
+ name="assoc_individual_contact_incident_pri_incident_priority_id_fkey",
+ ),
+ sa.ForeignKeyConstraint(
+ ["individual_contact_id"],
+ ["public.individual_contact.id"],
+ name="assoc_individual_contact_incident_pr_individual_contact_id_fkey",
+ ),
+ sa.PrimaryKeyConstraint(
+ "incident_priority_id",
+ "individual_contact_id",
+ name="assoc_individual_contact_incident_priority_pkey",
+ ),
+ )
+ op.create_table(
+ "service_incident_type",
+ sa.Column("incident_type_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.Column("service_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.ForeignKeyConstraint(
+ ["incident_type_id"],
+ ["public.incident_type.id"],
+ name="service_incident_type_incident_type_id_fkey",
+ ),
+ sa.ForeignKeyConstraint(
+ ["service_id"], ["public.service.id"], name="service_incident_type_service_id_fkey"
+ ),
+ sa.PrimaryKeyConstraint(
+ "incident_type_id", "service_id", name="service_incident_type_pkey"
+ ),
+ )
+ op.create_table(
+ "recommendation_documents",
+ sa.Column("document_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.Column("recommendation_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.ForeignKeyConstraint(
+ ["document_id"],
+ ["public.document.id"],
+ name="recommendation_documents_document_id_fkey",
+ ),
+ sa.ForeignKeyConstraint(
+ ["recommendation_id"],
+ ["public.recommendation.id"],
+ name="recommendation_documents_recommendation_id_fkey",
+ ),
+ sa.PrimaryKeyConstraint(
+ "document_id", "recommendation_id", name="recommendation_documents_pkey"
+ ),
+ )
+ op.create_table(
+ "team_contact_terms",
+ sa.Column("term_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.Column("team_contact_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.ForeignKeyConstraint(
+ ["team_contact_id"],
+ ["public.team_contact.id"],
+ name="team_contact_terms_team_contact_id_fkey",
+ ),
+ sa.ForeignKeyConstraint(
+ ["term_id"], ["public.term.id"], name="team_contact_terms_term_id_fkey"
+ ),
+ sa.PrimaryKeyConstraint("term_id", "team_contact_id", name="team_contact_terms_pkey"),
+ )
+ op.create_table(
+ "document_terms",
+ sa.Column("term_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.Column("document_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.ForeignKeyConstraint(
+ ["document_id"],
+ ["public.document.id"],
+ name="document_terms_document_id_fkey",
+ ondelete="CASCADE",
+ ),
+ sa.ForeignKeyConstraint(
+ ["term_id"], ["public.term.id"], name="document_terms_term_id_fkey", ondelete="CASCADE"
+ ),
+ sa.PrimaryKeyConstraint("term_id", "document_id", name="document_terms_pkey"),
+ )
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2021-06-18_c767b759a508.py b/src/dispatch/database/revisions/tenant/versions/2021-06-18_c767b759a508.py
new file mode 100644
index 000000000000..f9a938d8341a
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2021-06-18_c767b759a508.py
@@ -0,0 +1,29 @@
+"""Marks incident visibility as non-nullable
+
+Revision ID: c767b759a508
+Revises: 8a558baeef05
+Create Date: 2021-06-18 16:12:17.723485
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "c767b759a508"
+down_revision = "8a558baeef05"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.execute("update incident set visibility = 'Open' where visibility is null;")
+ op.alter_column("incident", "visibility", existing_type=sa.VARCHAR(), nullable=False)
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.alter_column("incident", "visibility", existing_type=sa.VARCHAR(), nullable=True)
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2021-06-23_32811f581cb8.py b/src/dispatch/database/revisions/tenant/versions/2021-06-23_32811f581cb8.py
new file mode 100644
index 000000000000..4c8c8ebfed0a
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2021-06-23_32811f581cb8.py
@@ -0,0 +1,38 @@
+"""Renames the plugin_id column to plugin_instance_id
+
+Revision ID: 32811f581cb8
+Revises: c767b759a508
+Create Date: 2021-06-23 10:45:12.839871
+
+"""
+from alembic import op
+
+
+# revision identifiers, used by Alembic.
+revision = "32811f581cb8"
+down_revision = "c767b759a508"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint("workflow_plugin_id_fkey", "workflow", type_="foreignkey")
+ op.alter_column("workflow", "plugin_id", new_column_name="plugin_instance_id")
+ op.create_foreign_key(None, "workflow", "plugin_instance", ["plugin_instance_id"], ["id"])
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint(None, "workflow", type_="foreignkey")
+ op.alter_column("workflow", "plugin_instance_id", new_column_name="plugin_id")
+ op.create_foreign_key(
+ "workflow_plugin_id_fkey",
+ "workflow",
+ "plugin",
+ ["plugin_id"],
+ ["id"],
+ referent_schema="dispatch_core",
+ )
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2021-07-01_875e6b212dc7.py b/src/dispatch/database/revisions/tenant/versions/2021-07-01_875e6b212dc7.py
new file mode 100644
index 000000000000..80830a4fc760
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2021-07-01_875e6b212dc7.py
@@ -0,0 +1,73 @@
+"""Adds incident type specific documents.
+
+Revision ID: 875e6b212dc7
+Revises: 32811f581cb8
+Create Date: 2021-07-01 10:19:24.016062
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "875e6b212dc7"
+down_revision = "32811f581cb8"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.alter_column(
+ "incident_type",
+ "template_document_id",
+ nullable=True,
+ new_column_name="incident_template_document_id",
+ )
+ op.add_column(
+ "incident_type", sa.Column("executive_template_document_id", sa.Integer(), nullable=True)
+ )
+ op.add_column(
+ "incident_type", sa.Column("review_template_document_id", sa.Integer(), nullable=True)
+ )
+ op.add_column(
+ "incident_type", sa.Column("tracking_template_document_id", sa.Integer(), nullable=True)
+ )
+ op.drop_constraint(
+ "incident_type_template_document_id_fkey", "incident_type", type_="foreignkey"
+ )
+ op.create_foreign_key(
+ None, "incident_type", "document", ["review_template_document_id"], ["id"]
+ )
+ op.create_foreign_key(
+ None, "incident_type", "document", ["tracking_template_document_id"], ["id"]
+ )
+ op.create_foreign_key(
+ None, "incident_type", "document", ["incident_template_document_id"], ["id"]
+ )
+ op.create_foreign_key(
+ None, "incident_type", "document", ["executive_template_document_id"], ["id"]
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.alter_column(
+ "incident_type",
+ "incident_template_document_id",
+ new_column_name="template_document_id",
+ nullable=True,
+ )
+ op.create_foreign_key(
+ "incident_type_template_document_id_fkey",
+ "incident_type",
+ "document",
+ ["template_document_id"],
+ ["id"],
+ )
+ op.drop_column("incident_type", "tracking_template_document_id")
+ op.drop_column("incident_type", "review_template_document_id")
+ op.drop_column("incident_type", "executive_template_document_id")
+ op.drop_column("incident_type", "incident_template_document_id")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2021-07-22_2780d28e4f39.py b/src/dispatch/database/revisions/tenant/versions/2021-07-22_2780d28e4f39.py
new file mode 100644
index 000000000000..a824a93a5acd
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2021-07-22_2780d28e4f39.py
@@ -0,0 +1,29 @@
+"""Adds ondelete cascade for report id foreign key
+
+Revision ID: 2780d28e4f39
+Revises: 875e6b212dc7
+Create Date: 2021-07-22 14:59:14.822882
+
+"""
+from alembic import op
+
+
+# revision identifiers, used by Alembic.
+revision = "2780d28e4f39"
+down_revision = "875e6b212dc7"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint("document_report_id_fkey", "document", type_="foreignkey")
+ op.create_foreign_key(None, "document", "report", ["report_id"], ["id"], ondelete="CASCADE")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint(None, "document", type_="foreignkey")
+ op.create_foreign_key("document_report_id_fkey", "document", "report", ["report_id"], ["id"])
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2021-08-03_9fcf205ba6a5.py b/src/dispatch/database/revisions/tenant/versions/2021-08-03_9fcf205ba6a5.py
new file mode 100644
index 000000000000..0da6b9be8578
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2021-08-03_9fcf205ba6a5.py
@@ -0,0 +1,40 @@
+"""Adds missing feedback search_vector
+
+Revision ID: 9fcf205ba6a5
+Revises: b0b00d3e8330
+Create Date: 2021-08-03 10:15:17.529189
+
+"""
+from alembic import op
+import sqlalchemy as sa
+import sqlalchemy_utils
+
+
+# revision identifiers, used by Alembic.
+revision = "9fcf205ba6a5"
+down_revision = "b0b00d3e8330"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column(
+ "feedback",
+ sa.Column("search_vector", sqlalchemy_utils.types.ts_vector.TSVectorType(), nullable=True),
+ )
+ op.create_index(
+ "feedback_search_vector_idx",
+ "feedback",
+ ["search_vector"],
+ unique=False,
+ postgresql_using="gin",
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_index("feedback_search_vector_idx", table_name="feedback", postgresql_using="gin")
+ op.drop_column("feedback", "search_vector")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2021-08-03_b0b00d3e8330.py b/src/dispatch/database/revisions/tenant/versions/2021-08-03_b0b00d3e8330.py
new file mode 100644
index 000000000000..6418298d2d83
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2021-08-03_b0b00d3e8330.py
@@ -0,0 +1,36 @@
+"""Migrates workflow instance enums
+
+Revision ID: b0b00d3e8330
+Revises: 2780d28e4f39
+Create Date: 2021-08-03 10:01:48.442379
+
+"""
+from alembic import op
+
+
+# revision identifiers, used by Alembic.
+revision = "b0b00d3e8330"
+down_revision = "2780d28e4f39"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+    # capitalize existing workflow_instance status values
+ conn = op.get_bind()
+ res = conn.execute("select id, status from workflow_instance")
+ results = res.fetchall()
+
+ for r in results:
+ conn.execute(
+ f"update workflow_instance set status = '{r[1].capitalize()}' where id = {r[0]}"
+ )
+
+
+def downgrade():
+ conn = op.get_bind()
+ res = conn.execute("select id, status from workflow_instance")
+ results = res.fetchall()
+
+ for r in results:
+ conn.execute(f"update workflow_instance set status = '{r[1].lower()}' where id = {r[0]}")
diff --git a/src/dispatch/database/revisions/tenant/versions/2021-08-09_b73416df5744.py b/src/dispatch/database/revisions/tenant/versions/2021-08-09_b73416df5744.py
new file mode 100644
index 000000000000..6be214759ceb
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2021-08-09_b73416df5744.py
@@ -0,0 +1,199 @@
+"""Adds incident role mappings
+
+Revision ID: b73416df5744
+Revises: 9fcf205ba6a5
+Create Date: 2021-08-09 15:29:24.856985
+
+"""
+from alembic import op
+import sqlalchemy as sa
+from sqlalchemy.ext.declarative import declarative_base
+from sqlalchemy import Boolean, Column, Integer, String, PrimaryKeyConstraint, Table, ForeignKey
+from sqlalchemy.orm import relationship, Session
+from collections import defaultdict
+
+
+# revision identifiers, used by Alembic.
+revision = "b73416df5744"
+down_revision = "9fcf205ba6a5"
+branch_labels = None
+depends_on = None
+
+Base = declarative_base()
+
+
+class Project(Base):
+ __tablename__ = "project"
+ id = Column(Integer, primary_key=True)
+
+
+class Service(Base):
+ __tablename__ = "service"
+ id = Column(Integer, primary_key=True)
+
+
+class IncidentType(Base):
+ __tablename__ = "incident_type"
+ id = Column(Integer, primary_key=True)
+ project_id = Column(Integer, ForeignKey("project.id", ondelete="CASCADE"))
+ project = relationship("Project")
+ commander_service_id = Column(Integer, ForeignKey("service.id"))
+ commander_service = relationship("Service", foreign_keys=[commander_service_id])
+
+ liaison_service_id = Column(Integer, ForeignKey("service.id"))
+ liaison_service = relationship("Service", foreign_keys=[liaison_service_id])
+
+
+class IncidentPriority(Base):
+ __tablename__ = "incident_priority"
+ id = Column(Integer, primary_key=True)
+
+
+assoc_incident_roles_incident_types = Table(
+ "incident_role_incident_type",
+ Base.metadata,
+ Column("incident_role_id", Integer, ForeignKey("incident_role.id")),
+ Column("incident_type_id", Integer, ForeignKey("incident_type.id")),
+ PrimaryKeyConstraint("incident_role_id", "incident_type_id"),
+)
+
+assoc_incident_roles_incident_priorities = Table(
+ "incident_role_incident_priority",
+ Base.metadata,
+ Column("incident_role_id", Integer, ForeignKey("incident_role.id")),
+ Column("incident_priority_id", Integer, ForeignKey("incident_priority.id")),
+ PrimaryKeyConstraint("incident_role_id", "incident_priority_id"),
+)
+
+
+class IncidentRole(Base):
+ __tablename__ = "incident_role"
+ # Columns
+ id = Column(Integer, primary_key=True)
+ role = Column(String)
+
+ enabled = Column(Boolean, default=True)
+ order = Column(Integer)
+
+ project_id = Column(Integer, ForeignKey("project.id", ondelete="CASCADE"))
+ project = relationship("Project")
+
+ # Relationships
+ incident_types = relationship("IncidentType", secondary=assoc_incident_roles_incident_types)
+ incident_priorities = relationship(
+ "IncidentPriority", secondary=assoc_incident_roles_incident_priorities
+ )
+
+ service_id = Column(Integer, ForeignKey("service.id"))
+ service = relationship("Service")
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.create_table(
+ "incident_role",
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("role", sa.String(), nullable=True),
+ sa.Column("enabled", sa.Boolean(), nullable=True),
+ sa.Column("order", sa.Integer(), nullable=True),
+ sa.Column("service_id", sa.Integer(), nullable=True),
+ sa.Column("individual_id", sa.Integer(), nullable=True),
+ sa.Column("project_id", sa.Integer(), nullable=True),
+ sa.Column("updated_at", sa.DateTime(), nullable=True),
+ sa.Column("created_at", sa.DateTime(), nullable=True),
+ sa.ForeignKeyConstraint(
+ ["individual_id"],
+ ["individual_contact.id"],
+ ),
+ sa.ForeignKeyConstraint(["project_id"], ["project.id"], ondelete="CASCADE"),
+ sa.ForeignKeyConstraint(
+ ["service_id"],
+ ["service.id"],
+ ),
+ sa.PrimaryKeyConstraint("id"),
+ )
+ op.create_table(
+ "incident_role_incident_priority",
+ sa.Column("incident_role_id", sa.Integer(), nullable=False),
+ sa.Column("incident_priority_id", sa.Integer(), nullable=False),
+ sa.ForeignKeyConstraint(
+ ["incident_priority_id"],
+ ["incident_priority.id"],
+ ),
+ sa.ForeignKeyConstraint(
+ ["incident_role_id"],
+ ["incident_role.id"],
+ ),
+ sa.PrimaryKeyConstraint("incident_role_id", "incident_priority_id"),
+ )
+ op.create_table(
+ "incident_role_tag",
+ sa.Column("incident_role_id", sa.Integer(), nullable=False),
+ sa.Column("tag_id", sa.Integer(), nullable=False),
+ sa.ForeignKeyConstraint(
+ ["incident_role_id"],
+ ["incident_role.id"],
+ ),
+ sa.ForeignKeyConstraint(
+ ["tag_id"],
+ ["tag.id"],
+ ),
+ sa.PrimaryKeyConstraint("incident_role_id", "tag_id"),
+ )
+ op.create_table(
+ "incident_role_incident_type",
+ sa.Column("incident_role_id", sa.Integer(), nullable=False),
+ sa.Column("incident_type_id", sa.Integer(), nullable=False),
+ sa.ForeignKeyConstraint(
+ ["incident_role_id"],
+ ["incident_role.id"],
+ ),
+ sa.ForeignKeyConstraint(
+ ["incident_type_id"],
+ ["incident_type.id"],
+ ),
+ sa.PrimaryKeyConstraint("incident_role_id", "incident_type_id"),
+ )
+
+ # migrate old incident type mappings to incident roles
+ bind = op.get_bind()
+ session = Session(bind=bind)
+
+ roles = defaultdict(list)
+
+ for i_type in session.query(IncidentType).all():
+ # group by types
+ if i_type.commander_service_id:
+ roles[(i_type.project_id, i_type.commander_service_id, "Incident Commander")].append(
+ i_type
+ )
+
+ if i_type.liaison_service_id:
+ roles[(i_type.project_id, i_type.liaison_service_id, "Liaison")].append(i_type)
+
+ incident_priorities = session.query(IncidentPriority).all()
+
+ for k, v in roles.items():
+ project_id, service_id, role = k
+ session.add(
+ IncidentRole(
+ project_id=project_id,
+ incident_types=v,
+ incident_priorities=incident_priorities,
+ role=role,
+ service_id=service_id,
+ )
+ )
+
+ session.commit()
+
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_table("incident_role_incident_type")
+ op.drop_table("incident_role_tag")
+ op.drop_table("incident_role_incident_priority")
+ op.drop_table("incident_role")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2021-08-26_3820fb661728.py b/src/dispatch/database/revisions/tenant/versions/2021-08-26_3820fb661728.py
new file mode 100644
index 000000000000..cfe5258886f2
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2021-08-26_3820fb661728.py
@@ -0,0 +1,58 @@
+"""Adding evergreen to other resources.
+
+Revision ID: 3820fb661728
+Revises: b73416df5744
+Create Date: 2021-08-26 15:43:21.019477
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "3820fb661728"
+down_revision = "b73416df5744"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("notification", sa.Column("evergreen", sa.Boolean(), nullable=True))
+ op.add_column("notification", sa.Column("evergreen_owner", sa.String(), nullable=True))
+ op.add_column(
+ "notification", sa.Column("evergreen_reminder_interval", sa.Integer(), nullable=True)
+ )
+ op.add_column(
+ "notification", sa.Column("evergreen_last_reminder_at", sa.DateTime(), nullable=True)
+ )
+ op.add_column("service", sa.Column("evergreen", sa.Boolean(), nullable=True))
+ op.add_column("service", sa.Column("evergreen_owner", sa.String(), nullable=True))
+ op.add_column("service", sa.Column("evergreen_reminder_interval", sa.Integer(), nullable=True))
+ op.add_column("service", sa.Column("evergreen_last_reminder_at", sa.DateTime(), nullable=True))
+ op.add_column("team_contact", sa.Column("evergreen", sa.Boolean(), nullable=True))
+ op.add_column("team_contact", sa.Column("evergreen_owner", sa.String(), nullable=True))
+ op.add_column(
+ "team_contact", sa.Column("evergreen_reminder_interval", sa.Integer(), nullable=True)
+ )
+ op.add_column(
+ "team_contact", sa.Column("evergreen_last_reminder_at", sa.DateTime(), nullable=True)
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("team_contact", "evergreen_last_reminder_at")
+ op.drop_column("team_contact", "evergreen_reminder_interval")
+ op.drop_column("team_contact", "evergreen_owner")
+ op.drop_column("team_contact", "evergreen")
+ op.drop_column("service", "evergreen_last_reminder_at")
+ op.drop_column("service", "evergreen_reminder_interval")
+ op.drop_column("service", "evergreen_owner")
+ op.drop_column("service", "evergreen")
+ op.drop_column("notification", "evergreen_last_reminder_at")
+ op.drop_column("notification", "evergreen_reminder_interval")
+ op.drop_column("notification", "evergreen_owner")
+ op.drop_column("notification", "evergreen")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2021-09-03_ebe0cb6528ba.py b/src/dispatch/database/revisions/tenant/versions/2021-09-03_ebe0cb6528ba.py
new file mode 100644
index 000000000000..f08aa48d9233
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2021-09-03_ebe0cb6528ba.py
@@ -0,0 +1,51 @@
+"""Adds the ability for monitors to be associated with incidents.
+
+Revision ID: ebe0cb6528ba
+Revises: 3820fb661728
+Create Date: 2021-09-03 11:59:34.426418
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "ebe0cb6528ba"
+down_revision = "3820fb661728"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.create_table(
+ "monitor",
+ sa.Column("resource_type", sa.String(), nullable=True),
+ sa.Column("resource_id", sa.String(), nullable=True),
+ sa.Column("weblink", sa.String(), nullable=True),
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("plugin_instance_id", sa.Integer(), nullable=True),
+ sa.Column("creator_id", sa.Integer(), nullable=True),
+ sa.Column("incident_id", sa.Integer(), nullable=True),
+ sa.Column("enabled", sa.Boolean(), nullable=True),
+ sa.Column("status", sa.JSON(), nullable=True),
+ sa.Column("created_at", sa.DateTime(), nullable=True),
+ sa.Column("updated_at", sa.DateTime(), nullable=True),
+ sa.ForeignKeyConstraint(
+ ["creator_id"],
+ ["participant.id"],
+ ),
+ sa.ForeignKeyConstraint(["incident_id"], ["incident.id"], ondelete="CASCADE"),
+ sa.ForeignKeyConstraint(
+ ["plugin_instance_id"],
+ ["plugin_instance.id"],
+ ),
+ sa.PrimaryKeyConstraint("id"),
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_table("monitor")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2021-09-16_3820a792d88a.py b/src/dispatch/database/revisions/tenant/versions/2021-09-16_3820a792d88a.py
new file mode 100644
index 000000000000..39ce17894056
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2021-09-16_3820a792d88a.py
@@ -0,0 +1,317 @@
+"""Migrates plugin instance configuration column to encrypted column.
+
+Revision ID: 3820a792d88a
+Revises: ebe0cb6528ba
+Create Date: 2021-09-16 16:33:40.605881
+
+"""
+from alembic import op
+from pydantic import SecretStr, ValidationError
+from pydantic.json import pydantic_encoder
+from starlette.datastructures import URL
+from sqlalchemy.dialects import postgresql
+
+from sqlalchemy import Column, Integer, ForeignKey, String
+from sqlalchemy.ext.declarative import declarative_base
+from sqlalchemy.orm import relationship, Session
+from sqlalchemy.ext.hybrid import hybrid_property
+from sqlalchemy_utils import StringEncryptedType, types
+from sqlalchemy_utils.types.encrypted.encrypted_type import AesEngine
+from dispatch.config import config, Secret, DISPATCH_ENCRYPTION_KEY
+
+
+# revision identifiers, used by Alembic.
+revision = "3820a792d88a"
+down_revision = "ebe0cb6528ba"
+branch_labels = None
+depends_on = None
+
+Base = declarative_base()
+
+
+def show_secrets_encoder(obj):
+ if isinstance(obj, SecretStr):
+ return obj.get_secret_value()
+ else:
+ return pydantic_encoder(obj)
+
+
+def migrate_config(instances, slug, config):
+ for instance in instances:
+ if slug == instance.plugin.slug:
+ instance.configuration = config
+
+
+class Plugin(Base):
+ __tablename__ = "plugin"
+ __table_args__ = {"schema": "dispatch_core"}
+ id = Column(Integer, primary_key=True)
+ slug = Column(String, unique=True)
+
+
+class PluginInstance(Base):
+ __tablename__ = "plugin_instance"
+ id = Column(Integer, primary_key=True)
+ _configuration = Column(
+ StringEncryptedType(key=str(DISPATCH_ENCRYPTION_KEY), engine=AesEngine, padding="pkcs5")
+ )
+ plugin_id = Column(Integer, ForeignKey(Plugin.id))
+ plugin = relationship(Plugin, backref="instances")
+
+ @hybrid_property
+ def configuration(self):
+ """Property that correctly returns a plugin's configuration object."""
+ pass
+
+ @configuration.setter
+ def configuration(self, configuration):
+ """Property that correctly sets a plugin's configuration object."""
+ if configuration:
+ self._configuration = configuration.json(encoder=show_secrets_encoder)
+
+
+def upgrade():
+ op.add_column(
+ "plugin_instance",
+ Column(
+ "_configuration",
+ types.encrypted.encrypted_type.StringEncryptedType(),
+ nullable=True,
+ ),
+ )
+
+ bind = op.get_bind()
+ session = Session(bind=bind)
+
+ instances = session.query(PluginInstance).all()
+
+ from dispatch.plugins.dispatch_google.config import GoogleConfiguration
+
+ GOOGLE_DEVELOPER_KEY = config("GOOGLE_DEVELOPER_KEY", cast=Secret, default=None)
+ GOOGLE_SERVICE_ACCOUNT_CLIENT_EMAIL = config(
+ "GOOGLE_SERVICE_ACCOUNT_CLIENT_EMAIL", default=None
+ )
+ GOOGLE_SERVICE_ACCOUNT_CLIENT_ID = config("GOOGLE_SERVICE_ACCOUNT_CLIENT_ID", default=None)
+ GOOGLE_SERVICE_ACCOUNT_DELEGATED_ACCOUNT = config(
+ "GOOGLE_SERVICE_ACCOUNT_DELEGATED_ACCOUNT", default=None
+ )
+ GOOGLE_SERVICE_ACCOUNT_PRIVATE_KEY = config(
+ "GOOGLE_SERVICE_ACCOUNT_PRIVATE_KEY", cast=Secret, default=None
+ )
+ GOOGLE_SERVICE_ACCOUNT_PRIVATE_KEY_ID = config(
+ "GOOGLE_SERVICE_ACCOUNT_PRIVATE_KEY_ID", default=None
+ )
+ GOOGLE_SERVICE_ACCOUNT_PROJECT_ID = config("GOOGLE_SERVICE_ACCOUNT_PROJECT_ID", default=None)
+ GOOGLE_DOMAIN = config("GOOGLE_DOMAIN", default=None)
+
+ try:
+ google_config = GoogleConfiguration(
+ developer_key=str(GOOGLE_DEVELOPER_KEY),
+ service_account_client_email=GOOGLE_SERVICE_ACCOUNT_CLIENT_EMAIL,
+ service_account_client_id=GOOGLE_SERVICE_ACCOUNT_CLIENT_ID,
+ service_account_private_key=str(GOOGLE_SERVICE_ACCOUNT_PRIVATE_KEY),
+ service_account_private_key_id=GOOGLE_SERVICE_ACCOUNT_PRIVATE_KEY_ID,
+ service_account_delegated_account=GOOGLE_SERVICE_ACCOUNT_DELEGATED_ACCOUNT,
+ service_account_project_id=GOOGLE_SERVICE_ACCOUNT_PROJECT_ID,
+ google_domain=GOOGLE_DOMAIN,
+ )
+ migrate_config(instances, "google-calendar-conference", google_config)
+ migrate_config(instances, "google-docs-document", google_config)
+ migrate_config(instances, "google-drive-storage", google_config)
+ migrate_config(instances, "google-drive-task", google_config)
+ migrate_config(instances, "google-gmail-email", google_config)
+ migrate_config(instances, "google-group-participant-group", google_config)
+ except ValidationError:
+ print(
+ "Skipping automatic migration of google plugin credentials; if you are using the google suite of plugins, migrate credentials manually."
+ )
+
+ from dispatch.plugins.dispatch_pagerduty.plugin import PagerdutyConfiguration
+
+ PAGERDUTY_API_KEY = config("PAGERDUTY_API_KEY", cast=Secret, default=None)
+ PAGERDUTY_API_FROM_EMAIL = config("PAGERDUTY_API_FROM_EMAIL", default=None)
+
+ try:
+ pagerduty_config = PagerdutyConfiguration(
+ api_key=str(PAGERDUTY_API_KEY), from_email=PAGERDUTY_API_FROM_EMAIL
+ )
+ migrate_config(instances, "pagerduty-oncall", pagerduty_config)
+ except ValidationError:
+ print(
+ "Skipping automatic migration of pagerduty plugin credentials; if you are using the pagerduty plugin, migrate credentials manually."
+ )
+
+ from dispatch.plugins.dispatch_zoom.plugin import ZoomConfiguration
+
+ ZOOM_API_USER_ID = config("ZOOM_API_USER_ID", default=None)
+ ZOOM_API_KEY = config("ZOOM_API_KEY", default=None)
+ ZOOM_API_SECRET = config("ZOOM_API_SECRET", cast=Secret, default=None)
+
+ try:
+ zoom_config = ZoomConfiguration(
+ api_user_id=ZOOM_API_USER_ID, api_key=ZOOM_API_KEY, api_secret=str(ZOOM_API_SECRET)
+ )
+ migrate_config(instances, "zoom-conference", zoom_config)
+ except ValidationError:
+ print(
+ "Skipping automatic migration of zoom plugin credentials; if you are using the zoom plugin, migrate credentials manually."
+ )
+
+ from dispatch.plugins.dispatch_jira.plugin import JiraConfiguration
+
+ JIRA_API_URL = config("JIRA_API_URL", cast=URL, default=None)
+ JIRA_BROWSER_URL = config("JIRA_BROWSER_URL", cast=URL, default=None)
+ JIRA_HOSTING_TYPE = config("JIRA_HOSTING_TYPE", default="cloud")
+ JIRA_PASSWORD = config("JIRA_PASSWORD", cast=Secret, default=None)
+ JIRA_USERNAME = config("JIRA_USERNAME", default=None)
+ JIRA_PROJECT_ID = config("JIRA_PROJECT_ID", default=None)
+ JIRA_ISSUE_TYPE_NAME = config("JIRA_ISSUE_TYPE_NAME", default=None)
+
+ try:
+ jira_config = JiraConfiguration(
+ api_url=str(JIRA_API_URL),
+ browser_url=str(JIRA_BROWSER_URL),
+ hosting_type=JIRA_HOSTING_TYPE,
+ username=JIRA_USERNAME,
+ default_project_id=JIRA_PROJECT_ID,
+ default_issue_type_name=JIRA_ISSUE_TYPE_NAME,
+ password=str(JIRA_PASSWORD),
+ )
+ migrate_config(instances, "jira-ticket", jira_config)
+ except ValidationError:
+ print(
+ "Skipping automatic migration of jira plugin credentials; if you are using the jira plugin, migrate credentials manually."
+ )
+
+ from dispatch.plugins.dispatch_opsgenie.plugin import OpsgenieConfiguration
+
+ OPSGENIE_API_KEY = config("OPSGENIE_API_KEY", default=None, cast=Secret)
+
+ try:
+ opsgenie_config = OpsgenieConfiguration(api_key=str(OPSGENIE_API_KEY))
+ migrate_config(instances, "opsgenie-oncall", opsgenie_config)
+ except ValidationError:
+ print(
+ "Skipping automatic migration of opsgenie plugin credentials; if you are using the opsgenie plugin, migrate credentials manually."
+ )
+
+ from dispatch.plugins.dispatch_slack.config import (
+ SlackContactConfiguration,
+ SlackConversationConfiguration,
+ )
+
+ SLACK_API_BOT_TOKEN = config("SLACK_API_BOT_TOKEN", cast=Secret, default=None)
+ SLACK_SOCKET_MODE_APP_TOKEN = config("SLACK_SOCKET_MODE_APP_TOKEN", cast=Secret, default=None)
+ SLACK_APP_USER_SLUG = config("SLACK_APP_USER_SLUG", default=None)
+ SLACK_BAN_THREADS = config("SLACK_BAN_THREADS", default=True)
+ SLACK_PROFILE_DEPARTMENT_FIELD_ID = config("SLACK_PROFILE_DEPARTMENT_FIELD_ID", default="")
+ SLACK_PROFILE_TEAM_FIELD_ID = config("SLACK_PROFILE_TEAM_FIELD_ID", default="")
+ SLACK_PROFILE_WEBLINK_FIELD_ID = config("SLACK_PROFILE_WEBLINK_FIELD_ID", default="")
+ SLACK_SIGNING_SECRET = config("SLACK_SIGNING_SECRET", cast=Secret, default=None)
+ SLACK_TIMELINE_EVENT_REACTION = config("SLACK_TIMELINE_EVENT_REACTION", default="stopwatch")
+
+ # Slash commands
+ SLACK_COMMAND_LIST_TASKS_SLUG = config(
+ "SLACK_COMMAND_LIST_TASKS_SLUG", default="/dispatch-list-tasks"
+ )
+ SLACK_COMMAND_LIST_MY_TASKS_SLUG = config(
+ "SLACK_COMMAND_LIST_MY_TASKS_SLUG", default="/dispatch-list-my-tasks"
+ )
+ SLACK_COMMAND_LIST_PARTICIPANTS_SLUG = config(
+ "SLACK_COMMAND_LIST_PARTICIPANTS_SLUG", default="/dispatch-list-participants"
+ )
+ SLACK_COMMAND_ASSIGN_ROLE_SLUG = config(
+ "SLACK_COMMAND_ASSIGN_ROLE_SLUG", default="/dispatch-assign-role"
+ )
+ SLACK_COMMAND_UPDATE_INCIDENT_SLUG = config(
+ "SLACK_COMMAND_UPDATE_INCIDENT_SLUG", default="/dispatch-update-incident"
+ )
+ SLACK_COMMAND_UPDATE_PARTICIPANT_SLUG = config(
+ "SLACK_COMMAND_UPDATE_PARTICIPANT_SLUG", default="/dispatch-update-participant"
+ )
+ SLACK_COMMAND_ENGAGE_ONCALL_SLUG = config(
+ "SLACK_COMMAND_ENGAGE_ONCALL_SLUG", default="/dispatch-engage-oncall"
+ )
+ SLACK_COMMAND_LIST_RESOURCES_SLUG = config(
+ "SLACK_COMMAND_LIST_RESOURCES_SLUG", default="/dispatch-list-resources"
+ )
+ SLACK_COMMAND_REPORT_INCIDENT_SLUG = config(
+ "SLACK_COMMAND_REPORT_INCIDENT_SLUG", default="/dispatch-report-incident"
+ )
+ SLACK_COMMAND_REPORT_TACTICAL_SLUG = config(
+ "SLACK_COMMAND_REPORT_TACTICAL_SLUG", default="/dispatch-report-tactical"
+ )
+ SLACK_COMMAND_REPORT_EXECUTIVE_SLUG = config(
+ "SLACK_COMMAND_REPORT_EXECUTIVE_SLUG", default="/dispatch-report-executive"
+ )
+ SLACK_COMMAND_UPDATE_NOTIFICATIONS_GROUP_SLUG = config(
+ "SLACK_COMMAND_UPDATE_NOTIFICATIONS_GROUP_SLUG", default="/dispatch-notifications-group"
+ )
+ SLACK_COMMAND_ADD_TIMELINE_EVENT_SLUG = config(
+ "SLACK_COMMAND_ADD_TIMELINE_EVENT_SLUG", default="/dispatch-add-timeline-event"
+ )
+ SLACK_COMMAND_LIST_INCIDENTS_SLUG = config(
+ "SLACK_COMMAND_LIST_INCIDENTS_SLUG", default="/dispatch-list-incidents"
+ )
+ SLACK_COMMAND_RUN_WORKFLOW_SLUG = config(
+ "SLACK_COMMAND_RUN_WORKFLOW_SLUG", default="/dispatch-run-workflow"
+ )
+ SLACK_COMMAND_LIST_WORKFLOWS_SLUG = config(
+ "SLACK_COMMAND_LIST_WORKFLOWS_SLUG", default="/dispatch-list-workflows"
+ )
+
+ try:
+ slack_conversation_config = SlackConversationConfiguration(
+ api_bot_token=str(SLACK_API_BOT_TOKEN),
+ socket_mode_app_token=str(SLACK_SOCKET_MODE_APP_TOKEN),
+ signing_secret=str(SLACK_SIGNING_SECRET),
+ app_user_slug=SLACK_APP_USER_SLUG,
+ ban_threads=SLACK_BAN_THREADS,
+ timeline_event_reaction=SLACK_TIMELINE_EVENT_REACTION,
+ slack_command_tasks=SLACK_COMMAND_LIST_TASKS_SLUG,
+ slack_command_list_my_tasks=SLACK_COMMAND_LIST_MY_TASKS_SLUG,
+ slack_command_list_participants=SLACK_COMMAND_LIST_PARTICIPANTS_SLUG,
+ slack_command_assign_role=SLACK_COMMAND_ASSIGN_ROLE_SLUG,
+ slack_command_update_incident=SLACK_COMMAND_UPDATE_INCIDENT_SLUG,
+ slack_command_update_participant=SLACK_COMMAND_UPDATE_PARTICIPANT_SLUG,
+ slack_command_engage_oncall=SLACK_COMMAND_ENGAGE_ONCALL_SLUG,
+ slack_command_list_resource=SLACK_COMMAND_LIST_RESOURCES_SLUG,
+ slack_command_report_incident=SLACK_COMMAND_REPORT_INCIDENT_SLUG,
+ slack_command_report_tactical=SLACK_COMMAND_REPORT_TACTICAL_SLUG,
+ slack_command_report_executive=SLACK_COMMAND_REPORT_EXECUTIVE_SLUG,
+ slack_command_update_notifications_group=SLACK_COMMAND_UPDATE_NOTIFICATIONS_GROUP_SLUG,
+ slack_command_add_timeline_event=SLACK_COMMAND_ADD_TIMELINE_EVENT_SLUG,
+ slack_command_list_incidents=SLACK_COMMAND_LIST_INCIDENTS_SLUG,
+ slack_command_run_workflow=SLACK_COMMAND_RUN_WORKFLOW_SLUG,
+ slack_command_list_workflow=SLACK_COMMAND_LIST_WORKFLOWS_SLUG,
+ )
+
+ slack_contact_config = SlackContactConfiguration(
+ api_bot_token=str(SLACK_API_BOT_TOKEN),
+ socket_mode_app_token=str(SLACK_SOCKET_MODE_APP_TOKEN),
+ signing_secret=str(SLACK_SIGNING_SECRET),
+ profile_department_field_id=SLACK_PROFILE_DEPARTMENT_FIELD_ID,
+ profile_team_field_id=SLACK_PROFILE_TEAM_FIELD_ID,
+ profile_weblink_field_id=SLACK_PROFILE_WEBLINK_FIELD_ID,
+ )
+
+ migrate_config(instances, "slack-conversation", slack_conversation_config)
+ migrate_config(instances, "slack-contact", slack_contact_config)
+
+ except ValidationError:
+ print(
+ "Skipping automatic migration of slack plugin credentials; if you are using the slack plugin, migrate credentials manually."
+ )
+
+ session.commit()
+
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column(
+ "plugin_instance",
+ Column("configuration", postgresql.BYTEA(), autoincrement=False, nullable=True),
+ )
+ # ### end Alembic commands ###
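The migration above reconstructs each plugin's configuration from legacy environment variables and applies it with `migrate_config`, which matches plugin instances by slug. A minimal sketch of that matching step, with plain stand-in classes in place of the SQLAlchemy models and pydantic configuration objects:

```python
# Illustrative stand-ins; the real migration uses ORM models and
# pydantic configuration classes instead of these bare classes.
class Plugin:
    def __init__(self, slug):
        self.slug = slug


class PluginInstance:
    def __init__(self, slug):
        self.plugin = Plugin(slug)
        self.configuration = None


def migrate_config(instances, slug, config):
    # assign the config to every instance backed by the matching plugin
    for instance in instances:
        if slug == instance.plugin.slug:
            instance.configuration = config


instances = [PluginInstance("slack-conversation"), PluginInstance("jira-ticket")]
migrate_config(instances, "jira-ticket", {"api_url": "https://jira.example.com"})
```

Only the `jira-ticket` instance receives the configuration; instances for other plugins are left untouched, which is why each plugin family gets its own `try`/`except ValidationError` block above.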
diff --git a/src/dispatch/database/revisions/tenant/versions/2021-10-05_ceaf01079f4f.py b/src/dispatch/database/revisions/tenant/versions/2021-10-05_ceaf01079f4f.py
new file mode 100644
index 000000000000..99989c958e47
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2021-10-05_ceaf01079f4f.py
@@ -0,0 +1,24 @@
+"""Ensure search expression is not null.
+
+Revision ID: ceaf01079f4f
+Revises: 3820a792d88a
+Create Date: 2021-10-05 14:01:53.423667
+
+"""
+from alembic import op
+
+
+# revision identifiers, used by Alembic.
+revision = "ceaf01079f4f"
+down_revision = "3820a792d88a"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ op.execute("UPDATE search_filter SET expression = '[]' WHERE expression is null")
+ op.alter_column("search_filter", "expression", nullable=False)
+
+
+def downgrade():
+ op.alter_column("search_filter", "expression", nullable=True)
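This migration follows the standard backfill-then-constrain pattern: first rewrite any `NULL` expressions to a safe default (`'[]'`), then tighten the column to `NOT NULL` so no existing row violates the new constraint. A sketch of the backfill step, using the stdlib `sqlite3` module purely as an illustrative stand-in for Postgres/Alembic:

```python
import sqlite3

# in-memory database standing in for the real search_filter table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE search_filter (id INTEGER PRIMARY KEY, expression TEXT)")
conn.execute("INSERT INTO search_filter (expression) VALUES (NULL)")
conn.execute("""INSERT INTO search_filter (expression) VALUES ('[{"field": "status"}]')""")

# step 1: backfill NULLs so no row would violate the upcoming constraint
conn.execute("UPDATE search_filter SET expression = '[]' WHERE expression IS NULL")

# step 2 in the real migration: op.alter_column(..., nullable=False)
rows = conn.execute("SELECT expression FROM search_filter ORDER BY id").fetchall()
```

Running the `UPDATE` before `alter_column` is what makes the migration safe on databases that already contain rows with missing expressions.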
diff --git a/src/dispatch/database/revisions/tenant/versions/2021-10-27_d1dc160533c7.py b/src/dispatch/database/revisions/tenant/versions/2021-10-27_d1dc160533c7.py
new file mode 100644
index 000000000000..c7c9bd36aa8d
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2021-10-27_d1dc160533c7.py
@@ -0,0 +1,76 @@
+"""Migrates the storage setting to the google drive plugin
+
+Revision ID: d1dc160533c7
+Revises: ceaf01079f4f
+Create Date: 2021-10-27 14:03:01.385859
+
+"""
+import json
+from alembic import op
+from pydantic import SecretStr
+from pydantic.json import pydantic_encoder
+from sqlalchemy import Column, Integer, ForeignKey, String
+from sqlalchemy.ext.declarative import declarative_base
+from sqlalchemy.orm import relationship, Session
+from sqlalchemy_utils import StringEncryptedType
+from sqlalchemy_utils.types.encrypted.encrypted_type import AesEngine
+from dispatch.config import DISPATCH_ENCRYPTION_KEY, Config
+
+config = Config(".env")
+
+
+# revision identifiers, used by Alembic.
+revision = "d1dc160533c7"
+down_revision = "ceaf01079f4f"
+branch_labels = None
+depends_on = None
+
+Base = declarative_base()
+
+
+def show_secrets_encoder(obj):
+ if isinstance(obj, SecretStr):
+ return obj.get_secret_value()
+ else:
+ return pydantic_encoder(obj)
+
+
+class Plugin(Base):
+ __tablename__ = "plugin"
+ __table_args__ = {"schema": "dispatch_core"}
+ id = Column(Integer, primary_key=True)
+ slug = Column(String, unique=True)
+
+
+class PluginInstance(Base):
+ __tablename__ = "plugin_instance"
+ id = Column(Integer, primary_key=True)
+ _configuration = Column(
+ StringEncryptedType(key=str(DISPATCH_ENCRYPTION_KEY), engine=AesEngine, padding="pkcs5")
+ )
+ plugin_id = Column(Integer, ForeignKey(Plugin.id))
+ plugin = relationship(Plugin, backref="instances")
+
+
+def upgrade():
+ bind = op.get_bind()
+ session = Session(bind=bind)
+
+ instances = (
+ session.query(PluginInstance)
+ .join(Plugin)
+ .filter(Plugin.slug == "google-drive-storage")
+ .all()
+ )
+
+ for instance in instances:
+ if instance._configuration:
+ configuration_json = json.loads(instance._configuration)
+ configuration_json["root_id"] = config("INCIDENT_STORAGE_FOLDER_ID")
+ configuration_json["open_on_close"] = config(
+ "INCIDENT_STORAGE_OPEN_ON_CLOSE", default=False
+ )
+ instance._configuration = json.dumps(configuration_json)
+
+ session.commit()
+
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ pass
diff --git a/src/dispatch/database/revisions/tenant/versions/2021-10-28_f95247c63eda.py b/src/dispatch/database/revisions/tenant/versions/2021-10-28_f95247c63eda.py
new file mode 100644
index 000000000000..8aa1a7f1a022
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2021-10-28_f95247c63eda.py
@@ -0,0 +1,61 @@
+"""Adds additional project specific configuration items.
+
+Revision ID: f95247c63eda
+Revises: d1dc160533c7
+Create Date: 2021-10-28 13:44:55.105570
+
+"""
+from alembic import op
+import sqlalchemy as sa
+from sqlalchemy import Column, Integer, String
+from sqlalchemy.orm import Session
+from sqlalchemy.ext.declarative import declarative_base
+from dispatch.config import Config
+
+# revision identifiers, used by Alembic.
+revision = "f95247c63eda"
+down_revision = "d1dc160533c7"
+branch_labels = None
+depends_on = None
+
+config = Config(".env")
+Base = declarative_base()
+
+
+class Project(Base):
+ __tablename__ = "project"
+ id = Column(Integer, primary_key=True)
+ annual_employee_cost = Column(Integer, default=650000)
+ business_year_hours = Column(Integer, default=2000)
+
+ owner_email = Column(String)
+ owner_conversation = Column(String)
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("project", sa.Column("annual_employee_cost", sa.Integer(), nullable=True))
+ op.add_column("project", sa.Column("business_year_hours", sa.Integer(), nullable=True))
+ op.add_column("project", sa.Column("owner_email", sa.String(), nullable=True))
+ op.add_column("project", sa.Column("owner_conversation", sa.String(), nullable=True))
+
+ bind = op.get_bind()
+ session = Session(bind=bind)
+
+ projects = session.query(Project).all()
+ for project in projects:
+ project.annual_employee_cost = config("ANNUAL_COST_EMPLOYEE", cast=int, default="650000")
+ project.business_year_hours = config("BUSINESS_HOURS_YEAR", cast=int, default="2080")
+ project.owner_email = config("INCIDENT_RESPONSE_TEAM_EMAIL", default="")
+
+ session.commit()
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("project", "owner_conversation")
+ op.drop_column("project", "owner_email")
+ op.drop_column("project", "business_year_hours")
+ op.drop_column("project", "annual_employee_cost")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2021-11-01_5f96c909a3c0.py b/src/dispatch/database/revisions/tenant/versions/2021-11-01_5f96c909a3c0.py
new file mode 100644
index 000000000000..bf5c086419a8
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2021-11-01_5f96c909a3c0.py
@@ -0,0 +1,29 @@
+"""Adds an activity counter.
+
+Revision ID: 5f96c909a3c0
+Revises: ecabe49272c8
+Create Date: 2021-11-01 15:41:14.853706
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "5f96c909a3c0"
+down_revision = "ecabe49272c8"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("participant", sa.Column("activity", sa.Integer(), nullable=True))
+ # add some activity for backwards compatibility
+ op.execute("update participant set activity = 1 where activity is null")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("participant", "activity")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2021-11-01_ecabe49272c8.py b/src/dispatch/database/revisions/tenant/versions/2021-11-01_ecabe49272c8.py
new file mode 100644
index 000000000000..1897b33f0ce4
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2021-11-01_ecabe49272c8.py
@@ -0,0 +1,27 @@
+"""Adds exclusive tag types.
+
+Revision ID: ecabe49272c8
+Revises: f95247c63eda
+Create Date: 2021-11-01 15:11:53.675447
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "ecabe49272c8"
+down_revision = "f95247c63eda"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("tag_type", sa.Column("exclusive", sa.Boolean(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("tag_type", "exclusive")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2021-11-18_ce5c4ac967d8.py b/src/dispatch/database/revisions/tenant/versions/2021-11-18_ce5c4ac967d8.py
new file mode 100644
index 000000000000..00c107e2e293
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2021-11-18_ce5c4ac967d8.py
@@ -0,0 +1,240 @@
+"""Promotes values for better query performance.
+
+Revision ID: ce5c4ac967d8
+Revises: 3097592c0739
+Create Date: 2021-11-18 10:09:22.330772
+
+"""
+from alembic import op
+from collections import Counter
+import enum
+
+from sqlalchemy import Column, DateTime, ForeignKey, Integer, String, Numeric
+from sqlalchemy.ext.declarative import declarative_base
+from sqlalchemy.orm import relationship, Session
+
+
+Base = declarative_base()
+
+
+# revision identifiers, used by Alembic.
+revision = "ce5c4ac967d8"
+down_revision = "3097592c0739"
+branch_labels = None
+depends_on = None
+
+
+class ParticipantRoleType(enum.Enum):
+ incident_commander = "Incident Commander"
+ scribe = "Scribe"
+ liaison = "Liaison"
+ participant = "Participant"
+ reporter = "Reporter"
+
+
+class DocumentResourceTypes(enum.Enum):
+ executive = "dispatch-executive-report-document"
+ review = "dispatch-incident-review-document"
+ tracking = "dispatch-incident-sheet"
+ incident = "dispatch-incident-document"
+
+
+class IncidentCost(Base):
+ __tablename__ = "incident_cost"
+ id = Column(Integer, primary_key=True)
+ amount = Column(Numeric(precision=10, scale=2), nullable=True)
+ incident_id = Column(Integer, ForeignKey("incident.id"))
+
+
+class ParticipantRole(Base):
+ __tablename__ = "participant_role"
+ id = Column(Integer, primary_key=True)
+ assumed_at = Column(DateTime)
+ renounced_at = Column(DateTime)
+ role = Column(String, default=ParticipantRoleType.participant)
+ participant_id = Column(Integer, ForeignKey("participant.id"))
+
+
+class Participant(Base):
+ __tablename__ = "participant"
+ id = Column(Integer, primary_key=True)
+ team = Column(String)
+ location = Column(String)
+ participant_roles = relationship("ParticipantRole", backref="participant")
+ incident_id = Column(Integer, ForeignKey("incident.id"))
+
+
+class Document(Base):
+ __tablename__ = "document"
+ id = Column(Integer, primary_key=True)
+ incident_id = Column(Integer, ForeignKey("incident.id"))
+ resource_type = Column(String)
+
+
+class Group(Base):
+ __tablename__ = "group"
+ id = Column(Integer, primary_key=True)
+ incident_id = Column(Integer, ForeignKey("incident.id"))
+ resource_type = Column(String)
+
+
+class Incident(Base):
+ __tablename__ = "incident"
+ id = Column(Integer, primary_key=True)
+ total_cost = Column(Numeric)
+ participants_team = Column(String)
+ participants_location = Column(String)
+ commanders_location = Column(String)
+ reporters_location = Column(String)
+
+ incident_costs = relationship("IncidentCost")
+ documents = relationship("Document", foreign_keys=[Document.incident_id])
+ participants = relationship("Participant", foreign_keys=[Participant.incident_id])
+ groups = relationship("Group", foreign_keys=[Group.incident_id])
+
+ commander_id = Column(Integer, ForeignKey("participant.id"))
+ reporter_id = Column(Integer, ForeignKey("participant.id"))
+ liaison_id = Column(Integer, ForeignKey("participant.id"))
+ scribe_id = Column(Integer, ForeignKey("participant.id"))
+ incident_document_id = Column(Integer, ForeignKey("document.id"))
+ incident_review_document_id = Column(Integer, ForeignKey("document.id"))
+ tactical_group_id = Column(Integer, ForeignKey("group.id"))
+ notifications_group_id = Column(Integer, ForeignKey("group.id"))
+
+
+def get_current_participant(participants, role):
+ participant_roles = []
+ for p in participants:
+ for pr in p.participant_roles:
+ if pr.role == role.value:
+ participant_roles.append(pr)
+
+ if participant_roles:
+ return sorted(participant_roles, key=lambda pr: pr.assumed_at)[-1].participant
+
+
+def get_current_document(documents, resource_type):
+ for d in documents:
+ if d.resource_type == resource_type:
+ return d
+
+
+def get_current_group(groups, resource_type):
+ for g in groups:
+ if g.resource_type:
+ if g.resource_type.endswith(resource_type):
+ return g
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("incident", Column("total_cost", Numeric(), nullable=True))
+ op.add_column("incident", Column("participants_team", String(), nullable=True))
+ op.add_column("incident", Column("participants_location", String(), nullable=True))
+ op.add_column("incident", Column("commanders_location", String(), nullable=True))
+ op.add_column("incident", Column("reporters_location", String(), nullable=True))
+ op.add_column("incident", Column("commander_id", Integer(), nullable=True))
+ op.add_column("incident", Column("reporter_id", Integer(), nullable=True))
+ op.add_column("incident", Column("liaison_id", Integer(), nullable=True))
+ op.add_column("incident", Column("scribe_id", Integer(), nullable=True))
+ op.add_column("incident", Column("incident_document_id", Integer(), nullable=True))
+ op.add_column("incident", Column("incident_review_document_id", Integer(), nullable=True))
+ op.add_column("incident", Column("tactical_group_id", Integer(), nullable=True))
+ op.add_column("incident", Column("notifications_group_id", Integer(), nullable=True))
+ op.create_foreign_key(None, "incident", "document", ["incident_document_id"], ["id"])
+ op.create_foreign_key(None, "incident", "participant", ["reporter_id"], ["id"])
+ op.create_foreign_key(None, "incident", "participant", ["scribe_id"], ["id"])
+ op.create_foreign_key(None, "incident", "participant", ["commander_id"], ["id"])
+ op.create_foreign_key(None, "incident", "participant", ["liaison_id"], ["id"])
+ op.create_foreign_key(None, "incident", "document", ["incident_review_document_id"], ["id"])
+ op.create_foreign_key(None, "incident", "group", ["tactical_group_id"], ["id"])
+ op.create_foreign_key(None, "incident", "group", ["notifications_group_id"], ["id"])
+
+ print("Starting data migration...")
+
+ bind = op.get_bind()
+ session = Session(bind=bind)
+
+ incidents = session.query(Incident).all()
+
+ for incident in incidents:
+        # we set the total cost
+        incident.total_cost = sum(c.amount for c in incident.incident_costs)
+
+        # we set the participants team, and the participants, commanders, and reporters locations
+        if incident.participants:
+            incident.participants_team = Counter(
+                p.team for p in incident.participants
+            ).most_common(1)[0][0]
+            incident.participants_location = Counter(
+                p.location for p in incident.participants
+            ).most_common(1)[0][0]
+
+ commander = get_current_participant(
+ incident.participants, ParticipantRoleType.incident_commander
+ )
+ if commander:
+ incident.commander_id = commander.id
+ incident.commanders_location = commander.location
+
+ reporter = get_current_participant(incident.participants, ParticipantRoleType.reporter)
+ if reporter:
+ incident.reporter_id = reporter.id
+ incident.reporters_location = reporter.location
+
+ liaison = get_current_participant(incident.participants, ParticipantRoleType.liaison)
+ if liaison:
+ incident.liaison_id = liaison.id
+
+ scribe = get_current_participant(incident.participants, ParticipantRoleType.scribe)
+ if scribe:
+ incident.scribe_id = scribe.id
+
+ # we set the incident document and post-incident review document foreign keys
+ incident_document = get_current_document(incident.documents, DocumentResourceTypes.incident)
+ if incident_document:
+ incident.incident_document_id = incident_document.id
+
+ incident_review_document = get_current_document(
+ incident.documents, DocumentResourceTypes.review
+ )
+ if incident_review_document:
+ incident.incident_review_document_id = incident_review_document.id
+
+ # we set the tactical and notifications foreign keys
+ tactical_group = get_current_group(incident.groups, "tactical-group")
+ if tactical_group:
+ incident.tactical_group_id = tactical_group.id
+
+ notifications_group = get_current_group(incident.groups, "notifications-group")
+ if notifications_group:
+ incident.notifications_group_id = notifications_group.id
+
+ session.commit()
+
+ print("Data migration completed.")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+    # constraint names assume PostgreSQL's default <table>_<column>_fkey naming
+    op.drop_constraint("incident_notifications_group_id_fkey", "incident", type_="foreignkey")
+    op.drop_constraint("incident_tactical_group_id_fkey", "incident", type_="foreignkey")
+    op.drop_constraint(
+        "incident_incident_review_document_id_fkey", "incident", type_="foreignkey"
+    )
+    op.drop_constraint("incident_incident_document_id_fkey", "incident", type_="foreignkey")
+    op.drop_constraint("incident_liaison_id_fkey", "incident", type_="foreignkey")
+    op.drop_constraint("incident_commander_id_fkey", "incident", type_="foreignkey")
+    op.drop_constraint("incident_scribe_id_fkey", "incident", type_="foreignkey")
+    op.drop_constraint("incident_reporter_id_fkey", "incident", type_="foreignkey")
+ op.drop_column("incident", "tactical_group_id")
+ op.drop_column("incident", "notifications_group_id")
+ op.drop_column("incident", "incident_review_document_id")
+ op.drop_column("incident", "incident_document_id")
+ op.drop_column("incident", "scribe_id")
+ op.drop_column("incident", "liaison_id")
+ op.drop_column("incident", "reporter_id")
+ op.drop_column("incident", "commander_id")
+ op.drop_column("incident", "primary_location")
+ op.drop_column("incident", "primary_team")
+ op.drop_column("incident", "total_cost")
+ # ### end Alembic commands ###
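The data migration above relies on two small selection rules: the current holder of a role is the participant with the most recent `assumed_at` timestamp, and the most frequent team or location wins via `Counter.most_common`. A minimal, self-contained sketch of both (the `Role` dataclass and the sample names are hypothetical, not the Dispatch models):

```python
from collections import Counter
from dataclasses import dataclass
from datetime import datetime


@dataclass
class Role:
    role: str
    assumed_at: datetime
    participant: str


def current_holder(roles, role_name):
    """Return the participant who most recently assumed the given role."""
    matching = [r for r in roles if r.role == role_name]
    if matching:
        return max(matching, key=lambda r: r.assumed_at).participant


roles = [
    Role("Incident Commander", datetime(2022, 1, 1), "alice"),
    Role("Incident Commander", datetime(2022, 1, 5), "bob"),  # assumed later, wins
    Role("Scribe", datetime(2022, 1, 3), "carol"),
]
print(current_holder(roles, "Incident Commander"))  # bob

# the most common value wins, mirroring the participants_team assignment
teams = ["sre", "security", "security"]
print(Counter(teams).most_common(1)[0][0])  # security
```

Returning `None` when no matching role exists mirrors the migration's behavior of leaving the foreign key unset for incidents that never had that role filled.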
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-02-15_3097592c0739.py b/src/dispatch/database/revisions/tenant/versions/2022-02-15_3097592c0739.py
new file mode 100644
index 000000000000..e1739e56278d
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-02-15_3097592c0739.py
@@ -0,0 +1,27 @@
+"""Adds resolution column to incident data model
+
+Revision ID: 3097592c0739
+Revises: 5f96c909a3c0
+Create Date: 2022-02-15 15:48:45.724812
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "3097592c0739"
+down_revision = "5f96c909a3c0"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("incident", sa.Column("resolution", sa.String(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("incident", "resolution")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-02-22_b5d3706a1d54.py b/src/dispatch/database/revisions/tenant/versions/2022-02-22_b5d3706a1d54.py
new file mode 100644
index 000000000000..bbc8bf8a393f
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-02-22_b5d3706a1d54.py
@@ -0,0 +1,285 @@
+"""Adds models to support first class support for data sources and queries.
+
+Revision ID: b5d3706a1d54
+Revises: ce5c4ac967d8
+Create Date: 2022-02-22 10:03:21.866998
+
+"""
+from alembic import op
+import sqlalchemy as sa
+import sqlalchemy_utils
+
+# revision identifiers, used by Alembic.
+revision = "b5d3706a1d54"
+down_revision = "ce5c4ac967d8"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.create_table(
+ "source_data_format",
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("name", sa.String(), nullable=True),
+ sa.Column("description", sa.String(), nullable=True),
+ sa.Column("search_vector", sqlalchemy_utils.types.ts_vector.TSVectorType(), nullable=True),
+ sa.Column("project_id", sa.Integer(), nullable=True),
+ sa.ForeignKeyConstraint(["project_id"], ["project.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("id"),
+ sa.UniqueConstraint("name", "project_id"),
+ )
+ op.create_index(
+ "source_data_format_search_vector_idx",
+ "source_data_format",
+ ["search_vector"],
+ unique=False,
+ postgresql_using="gin",
+ )
+ op.create_table(
+ "source_environment",
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("name", sa.String(), nullable=True),
+ sa.Column("description", sa.String(), nullable=True),
+ sa.Column("search_vector", sqlalchemy_utils.types.ts_vector.TSVectorType(), nullable=True),
+ sa.Column("project_id", sa.Integer(), nullable=True),
+ sa.ForeignKeyConstraint(["project_id"], ["project.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("id"),
+ sa.UniqueConstraint("name", "project_id"),
+ )
+ op.create_index(
+ "source_environment_search_vector_idx",
+ "source_environment",
+ ["search_vector"],
+ unique=False,
+ postgresql_using="gin",
+ )
+ op.create_table(
+ "source_status",
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("name", sa.String(), nullable=True),
+ sa.Column("description", sa.String(), nullable=True),
+ sa.Column("search_vector", sqlalchemy_utils.types.ts_vector.TSVectorType(), nullable=True),
+ sa.Column("project_id", sa.Integer(), nullable=True),
+ sa.ForeignKeyConstraint(["project_id"], ["project.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("id"),
+ sa.UniqueConstraint("name", "project_id"),
+ )
+ op.create_index(
+ "source_status_search_vector_idx",
+ "source_status",
+ ["search_vector"],
+ unique=False,
+ postgresql_using="gin",
+ )
+ op.create_table(
+ "source_transport",
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("name", sa.String(), nullable=True),
+ sa.Column("description", sa.String(), nullable=True),
+ sa.Column("search_vector", sqlalchemy_utils.types.ts_vector.TSVectorType(), nullable=True),
+ sa.Column("project_id", sa.Integer(), nullable=True),
+ sa.ForeignKeyConstraint(["project_id"], ["project.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("id"),
+ sa.UniqueConstraint("name", "project_id"),
+ )
+ op.create_index(
+ "source_transport_search_vector_idx",
+ "source_transport",
+ ["search_vector"],
+ unique=False,
+ postgresql_using="gin",
+ )
+ op.create_table(
+ "source_type",
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("name", sa.String(), nullable=True),
+ sa.Column("description", sa.String(), nullable=True),
+ sa.Column("search_vector", sqlalchemy_utils.types.ts_vector.TSVectorType(), nullable=True),
+ sa.Column("project_id", sa.Integer(), nullable=True),
+ sa.ForeignKeyConstraint(["project_id"], ["project.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("id"),
+ sa.UniqueConstraint("name", "project_id"),
+ )
+ op.create_index(
+ "source_type_search_vector_idx",
+ "source_type",
+ ["search_vector"],
+ unique=False,
+ postgresql_using="gin",
+ )
+ op.create_table(
+ "source",
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("name", sa.String(), nullable=True),
+ sa.Column("description", sa.String(), nullable=True),
+ sa.Column("cost", sa.Integer(), nullable=True),
+ sa.Column("data_last_loaded_at", sa.DateTime(), nullable=True),
+ sa.Column("daily_volume", sa.Integer(), nullable=True),
+ sa.Column("aggregated", sa.Boolean(), nullable=True),
+ sa.Column("retention", sa.Integer(), nullable=True),
+ sa.Column("size", sa.BigInteger(), nullable=True),
+ sa.Column("delay", sa.Integer(), nullable=True),
+ sa.Column("environment", sa.String(), nullable=True),
+ sa.Column("external_id", sa.String(), nullable=True),
+ sa.Column("documentation", sa.Text(), nullable=True),
+ sa.Column("sampling_rate", sa.Integer(), nullable=True),
+ sa.Column("source_schema", sa.Text(), nullable=True),
+ sa.Column("links", sa.JSON(), nullable=True),
+ sa.Column("source_type_id", sa.Integer(), nullable=True),
+ sa.Column("source_status_id", sa.Integer(), nullable=True),
+ sa.Column("source_environment_id", sa.Integer(), nullable=True),
+ sa.Column("source_data_format_id", sa.Integer(), nullable=True),
+ sa.Column("source_transport_id", sa.Integer(), nullable=True),
+ sa.Column("owner_id", sa.Integer(), nullable=True),
+ sa.Column("search_vector", sqlalchemy_utils.types.ts_vector.TSVectorType(), nullable=True),
+ sa.Column("project_id", sa.Integer(), nullable=True),
+ sa.Column("created_at", sa.DateTime(), nullable=True),
+ sa.Column("updated_at", sa.DateTime(), nullable=True),
+ sa.ForeignKeyConstraint(
+ ["owner_id"],
+ ["service.id"],
+ ),
+ sa.ForeignKeyConstraint(["project_id"], ["project.id"], ondelete="CASCADE"),
+ sa.ForeignKeyConstraint(
+ ["source_data_format_id"],
+ ["source_data_format.id"],
+ ),
+ sa.ForeignKeyConstraint(
+ ["source_environment_id"],
+ ["source_environment.id"],
+ ),
+ sa.ForeignKeyConstraint(
+ ["source_status_id"],
+ ["source_status.id"],
+ ),
+ sa.ForeignKeyConstraint(
+ ["source_transport_id"],
+ ["source_transport.id"],
+ ),
+ sa.ForeignKeyConstraint(
+ ["source_type_id"],
+ ["source_type.id"],
+ ),
+ sa.PrimaryKeyConstraint("id"),
+ sa.UniqueConstraint("name", "project_id"),
+ )
+ op.create_index(
+ "source_search_vector_idx",
+ "source",
+ ["search_vector"],
+ unique=False,
+ postgresql_using="gin",
+ )
+ op.create_table(
+ "alert",
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("name", sa.String(), nullable=True),
+ sa.Column("description", sa.String(), nullable=True),
+ sa.Column("orginator", sa.String(), nullable=True),
+ sa.Column("external_link", sa.String(), nullable=True),
+ sa.Column("source_id", sa.Integer(), nullable=True),
+ sa.Column("search_vector", sqlalchemy_utils.types.ts_vector.TSVectorType(), nullable=True),
+ sa.Column("updated_at", sa.DateTime(), nullable=True),
+ sa.Column("created_at", sa.DateTime(), nullable=True),
+ sa.ForeignKeyConstraint(
+ ["source_id"],
+ ["source.id"],
+ ),
+ sa.PrimaryKeyConstraint("id"),
+ )
+ op.create_index(
+ "alert_search_vector_idx", "alert", ["search_vector"], unique=False, postgresql_using="gin"
+ )
+ op.create_table(
+ "assoc_source_tags",
+ sa.Column("source_id", sa.Integer(), nullable=False),
+ sa.Column("tag_id", sa.Integer(), nullable=False),
+ sa.ForeignKeyConstraint(["source_id"], ["source.id"], ondelete="CASCADE"),
+ sa.ForeignKeyConstraint(["tag_id"], ["tag.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("source_id", "tag_id"),
+ )
+ op.create_table(
+ "query",
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("name", sa.String(), nullable=True),
+ sa.Column("description", sa.String(), nullable=True),
+ sa.Column("text", sa.String(), nullable=True),
+ sa.Column("language", sa.String(), nullable=True),
+ sa.Column("source_id", sa.Integer(), nullable=True),
+ sa.Column("search_vector", sqlalchemy_utils.types.ts_vector.TSVectorType(), nullable=True),
+ sa.Column("project_id", sa.Integer(), nullable=True),
+ sa.Column("created_at", sa.DateTime(), nullable=True),
+ sa.Column("updated_at", sa.DateTime(), nullable=True),
+ sa.ForeignKeyConstraint(["project_id"], ["project.id"], ondelete="CASCADE"),
+ sa.ForeignKeyConstraint(
+ ["source_id"],
+ ["source.id"],
+ ),
+ sa.PrimaryKeyConstraint("id"),
+ )
+ op.create_index(
+ "query_search_vector_idx", "query", ["search_vector"], unique=False, postgresql_using="gin"
+ )
+ op.create_table(
+ "assoc_query_tags",
+ sa.Column("query_id", sa.Integer(), nullable=False),
+ sa.Column("tag_id", sa.Integer(), nullable=False),
+ sa.ForeignKeyConstraint(["query_id"], ["query.id"], ondelete="CASCADE"),
+ sa.ForeignKeyConstraint(["tag_id"], ["tag.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("query_id", "tag_id"),
+ )
+ op.create_table(
+ "assoc_query_incidents",
+ sa.Column("query_id", sa.Integer(), nullable=False),
+ sa.Column("incident_id", sa.Integer(), nullable=False),
+ sa.ForeignKeyConstraint(["incident_id"], ["incident.id"], ondelete="CASCADE"),
+ sa.ForeignKeyConstraint(["query_id"], ["query.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("query_id", "incident_id"),
+ )
+ op.create_table(
+ "assoc_source_incidents",
+ sa.Column("source_id", sa.Integer(), nullable=False),
+ sa.Column("incident_id", sa.Integer(), nullable=False),
+ sa.ForeignKeyConstraint(["incident_id"], ["incident.id"], ondelete="CASCADE"),
+ sa.ForeignKeyConstraint(["source_id"], ["source.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("source_id", "incident_id"),
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_table("assoc_source_incidents")
+ op.drop_table("assoc_query_incidents")
+ op.drop_table("assoc_query_tags")
+ op.drop_index("query_search_vector_idx", table_name="query", postgresql_using="gin")
+ op.drop_table("query")
+ op.drop_table("assoc_source_tags")
+ op.drop_index("alert_search_vector_idx", table_name="alert", postgresql_using="gin")
+ op.drop_table("alert")
+ op.drop_index("source_search_vector_idx", table_name="source", postgresql_using="gin")
+ op.drop_table("source")
+ op.drop_index("source_type_search_vector_idx", table_name="source_type", postgresql_using="gin")
+ op.drop_table("source_type")
+ op.drop_index(
+ "source_transport_search_vector_idx", table_name="source_transport", postgresql_using="gin"
+ )
+ op.drop_table("source_transport")
+ op.drop_index(
+ "source_status_search_vector_idx", table_name="source_status", postgresql_using="gin"
+ )
+ op.drop_table("source_status")
+ op.drop_index(
+ "source_environment_search_vector_idx",
+ table_name="source_environment",
+ postgresql_using="gin",
+ )
+ op.drop_table("source_environment")
+ op.drop_index(
+ "source_data_format_search_vector_idx",
+ table_name="source_data_format",
+ postgresql_using="gin",
+ )
+ op.drop_table("source_data_format")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-03-16_03b7e0d36519.py b/src/dispatch/database/revisions/tenant/versions/2022-03-16_03b7e0d36519.py
new file mode 100644
index 000000000000..bf45bef66dca
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-03-16_03b7e0d36519.py
@@ -0,0 +1,27 @@
+"""Adds default column to dispatch user project table
+
+Revision ID: 03b7e0d36519
+Revises: b5d3706a1d54
+Create Date: 2022-03-16 09:35:00.406534
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "03b7e0d36519"
+down_revision = "b5d3706a1d54"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("dispatch_user_project", sa.Column("default", sa.Boolean(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("dispatch_user_project", "default")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-04-13_e40aefe7fc4d.py b/src/dispatch/database/revisions/tenant/versions/2022-04-13_e40aefe7fc4d.py
new file mode 100644
index 000000000000..f02ddea0ead6
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-04-13_e40aefe7fc4d.py
@@ -0,0 +1,27 @@
+"""Adds column to incident priority model to allow for setting a color
+
+Revision ID: e40aefe7fc4d
+Revises: 03b7e0d36519
+Create Date: 2022-04-13 09:44:48.400912
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "e40aefe7fc4d"
+down_revision = "03b7e0d36519"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("incident_priority", sa.Column("color", sa.String(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("incident_priority", "color")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-05-23_748744207122.py b/src/dispatch/database/revisions/tenant/versions/2022-05-23_748744207122.py
new file mode 100644
index 000000000000..9b9c6c9c059b
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-05-23_748744207122.py
@@ -0,0 +1,116 @@
+"""Fixes missing resource foreign keys in the incident table
+
+Revision ID: 748744207122
+Revises: e40aefe7fc4d
+Create Date: 2022-05-23 10:18:10.080916
+
+"""
+from enum import Enum
+
+from alembic import op
+
+from sqlalchemy import Column, ForeignKey, Integer, String
+from sqlalchemy.ext.declarative import declarative_base
+from sqlalchemy.orm import relationship, Session
+
+
+# revision identifiers, used by Alembic.
+revision = "748744207122"
+down_revision = "e40aefe7fc4d"
+branch_labels = None
+depends_on = None
+
+Base = declarative_base()
+
+
+class DispatchEnum(str, Enum):
+ def __str__(self) -> str:
+ return str.__str__(self)
+
+
+class DocumentResourceTypes(DispatchEnum):
+ incident = "dispatch-incident-document"
+ review = "dispatch-incident-review-document"
+
+
+class Document(Base):
+ __tablename__ = "document"
+ id = Column(Integer, primary_key=True)
+ incident_id = Column(Integer, ForeignKey("incident.id"))
+ resource_type = Column(String)
+
+
+class Group(Base):
+ __tablename__ = "group"
+ id = Column(Integer, primary_key=True)
+ incident_id = Column(Integer, ForeignKey("incident.id"))
+ resource_type = Column(String)
+
+
+class Incident(Base):
+ __tablename__ = "incident"
+ id = Column(Integer, primary_key=True)
+
+ documents = relationship("Document", foreign_keys=[Document.incident_id])
+ groups = relationship("Group", foreign_keys=[Group.incident_id])
+
+ incident_document_id = Column(Integer, ForeignKey("document.id"))
+ incident_review_document_id = Column(Integer, ForeignKey("document.id"))
+ tactical_group_id = Column(Integer, ForeignKey("group.id"))
+ notifications_group_id = Column(Integer, ForeignKey("group.id"))
+
+
+def get_current_document(documents, resource_type):
+ for d in documents:
+ if d.resource_type == resource_type:
+ return d
+
+
+def get_current_group(groups, resource_type):
+    for g in groups:
+        if g.resource_type and g.resource_type.endswith(resource_type):
+            return g
+
+
+def upgrade():
+ print("Fixing resource foreign keys in incident table...")
+
+ bind = op.get_bind()
+ session = Session(bind=bind)
+
+ incidents = session.query(Incident).all()
+
+ for incident in incidents:
+ # we set the incident document and post-incident review document foreign keys
+ incident_document = get_current_document(incident.documents, DocumentResourceTypes.incident)
+ if incident_document:
+ incident.incident_document_id = incident_document.id
+
+ incident_review_document = get_current_document(
+ incident.documents, DocumentResourceTypes.review
+ )
+ if incident_review_document:
+ incident.incident_review_document_id = incident_review_document.id
+
+ # we set the tactical and notifications foreign keys
+ tactical_group = get_current_group(incident.groups, "tactical-group")
+ if tactical_group:
+ incident.tactical_group_id = tactical_group.id
+
+ notifications_group = get_current_group(incident.groups, "notification-group")
+ if not notifications_group:
+ # we check for the current resource type name
+ notifications_group = get_current_group(incident.groups, "notifications-group")
+
+ if notifications_group:
+ incident.notifications_group_id = notifications_group.id
+
+ session.commit()
+
+ print("Incident resource foreign keys fixed.")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ pass
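The group lookup in the migration above matches on a resource-type suffix and tries the legacy `notification-group` suffix before falling back to the current `notifications-group` one. A small standalone sketch of that fallback, using plain strings instead of ORM rows (the group names are hypothetical):

```python
def find_by_suffix(resource_types, suffix):
    """Return the first resource type ending with the given suffix."""
    for rt in resource_types:
        if rt and rt.endswith(suffix):
            return rt


groups = [
    "dispatch-incident-42-tactical-group",
    "dispatch-incident-42-notifications-group",
]

# the legacy suffix does not match the pluralized name, so the fallback runs
match = find_by_suffix(groups, "notification-group") or find_by_suffix(
    groups, "notifications-group"
)
print(match)  # dispatch-incident-42-notifications-group
```

Suffix matching keeps the lookup tolerant of the per-incident prefix on the stored resource type, which is why `endswith` is used rather than equality.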
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-05-26_5e2bc6083503.py b/src/dispatch/database/revisions/tenant/versions/2022-05-26_5e2bc6083503.py
new file mode 100644
index 000000000000..ae69b69e91e5
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-05-26_5e2bc6083503.py
@@ -0,0 +1,29 @@
+"""Removes total_cost column in favor of hybrid property
+
+Revision ID: 5e2bc6083503
+Revises: 748744207122
+Create Date: 2022-05-26 17:18:29.231995
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "5e2bc6083503"
+down_revision = "748744207122"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("incident", "total_cost")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column(
+ "incident", sa.Column("total_cost", sa.NUMERIC(), autoincrement=False, nullable=True)
+ )
+ # ### end Alembic commands ###
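Per the revision docstring, the dropped column gives way to a value computed from the incident's costs at read time. A plain-Python sketch of the idea (the real model would use SQLAlchemy's `hybrid_property`; the `IncidentCost` class here is a stand-in):

```python
from dataclasses import dataclass


@dataclass
class IncidentCost:
    amount: float


class Incident:
    def __init__(self, incident_costs):
        self.incident_costs = incident_costs

    @property
    def total_cost(self):
        # derived on access, so it can never drift out of sync with the costs
        return sum(cost.amount for cost in self.incident_costs)


incident = Incident([IncidentCost(100.0), IncidentCost(250.5)])
print(incident.total_cost)  # 350.5
```

Trading a stored column for a computed property removes the need to keep the total updated whenever a cost row changes, at the price of recomputing it on each read.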
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-07-28_2ef7baab2916.py b/src/dispatch/database/revisions/tenant/versions/2022-07-28_2ef7baab2916.py
new file mode 100644
index 000000000000..eec19d3ed559
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-07-28_2ef7baab2916.py
@@ -0,0 +1,208 @@
+"""Adds data models for case, case type, priority, and severity and their assoc objects
+
+Revision ID: 2ef7baab2916
+Revises: 5e2bc6083503
+Create Date: 2022-07-28 10:56:05.452768
+
+"""
+from alembic import op
+import sqlalchemy as sa
+import sqlalchemy_utils
+
+# revision identifiers, used by Alembic.
+revision = "2ef7baab2916"
+down_revision = "5e2bc6083503"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.create_table(
+ "case_priority",
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("name", sa.String(), nullable=True),
+ sa.Column("description", sa.String(), nullable=True),
+ sa.Column("color", sa.String(), nullable=True),
+ sa.Column("enabled", sa.Boolean(), nullable=True),
+ sa.Column("default", sa.Boolean(), nullable=True),
+ sa.Column("view_order", sa.Integer(), nullable=True),
+ sa.Column("search_vector", sqlalchemy_utils.types.ts_vector.TSVectorType(), nullable=True),
+ sa.Column("project_id", sa.Integer(), nullable=True),
+ sa.ForeignKeyConstraint(["project_id"], ["project.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("id"),
+ sa.UniqueConstraint("name", "project_id"),
+ )
+ op.create_index(
+ "case_priority_search_vector_idx",
+ "case_priority",
+ ["search_vector"],
+ unique=False,
+ postgresql_using="gin",
+ )
+ op.create_table(
+ "case_severity",
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("name", sa.String(), nullable=True),
+ sa.Column("description", sa.String(), nullable=True),
+ sa.Column("color", sa.String(), nullable=True),
+ sa.Column("enabled", sa.Boolean(), nullable=True),
+ sa.Column("default", sa.Boolean(), nullable=True),
+ sa.Column("view_order", sa.Integer(), nullable=True),
+ sa.Column("search_vector", sqlalchemy_utils.types.ts_vector.TSVectorType(), nullable=True),
+ sa.Column("project_id", sa.Integer(), nullable=True),
+ sa.ForeignKeyConstraint(["project_id"], ["project.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("id"),
+ sa.UniqueConstraint("name", "project_id"),
+ )
+ op.create_index(
+ "case_severity_search_vector_idx",
+ "case_severity",
+ ["search_vector"],
+ unique=False,
+ postgresql_using="gin",
+ )
+ op.create_table(
+ "case_type",
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("slug", sa.String(), nullable=True),
+ sa.Column("name", sa.String(), nullable=True),
+ sa.Column("description", sa.String(), nullable=True),
+ sa.Column("visibility", sa.String(), nullable=True),
+ sa.Column("default", sa.Boolean(), nullable=True),
+ sa.Column("enabled", sa.Boolean(), nullable=True),
+ sa.Column("exclude_from_metrics", sa.Boolean(), nullable=True),
+ sa.Column("search_vector", sqlalchemy_utils.types.ts_vector.TSVectorType(), nullable=True),
+ sa.Column("project_id", sa.Integer(), nullable=True),
+ sa.ForeignKeyConstraint(["project_id"], ["project.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("id"),
+ sa.UniqueConstraint("name", "project_id"),
+ )
+ op.create_index(
+ "case_type_search_vector_idx",
+ "case_type",
+ ["search_vector"],
+ unique=False,
+ postgresql_using="gin",
+ )
+ op.create_table(
+ "case",
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("name", sa.String(), nullable=True),
+ sa.Column("title", sa.String(), nullable=False),
+ sa.Column("description", sa.String(), nullable=False),
+ sa.Column("resolution", sa.String(), nullable=True),
+ sa.Column("status", sa.String(), nullable=True),
+ sa.Column("visibility", sa.String(), nullable=False),
+ sa.Column("reported_at", sa.DateTime(), nullable=True),
+ sa.Column("stable_at", sa.DateTime(), nullable=True),
+ sa.Column("closed_at", sa.DateTime(), nullable=True),
+ sa.Column("search_vector", sqlalchemy_utils.types.ts_vector.TSVectorType(), nullable=True),
+ sa.Column("assignee_id", sa.Integer(), nullable=True),
+ sa.Column("duplicate_id", sa.Integer(), nullable=True),
+ sa.Column("source_id", sa.Integer(), nullable=True),
+ sa.Column("project_id", sa.Integer(), nullable=True),
+ sa.Column("updated_at", sa.DateTime(), nullable=True),
+ sa.Column("created_at", sa.DateTime(), nullable=True),
+ sa.ForeignKeyConstraint(
+ ["assignee_id"],
+ ["dispatch_core.dispatch_user.id"],
+ ),
+ sa.ForeignKeyConstraint(
+ ["duplicate_id"],
+ ["case.id"],
+ ),
+ sa.ForeignKeyConstraint(["project_id"], ["project.id"], ondelete="CASCADE"),
+ sa.ForeignKeyConstraint(
+ ["source_id"],
+ ["source.id"],
+ ),
+ sa.PrimaryKeyConstraint("id"),
+ sa.UniqueConstraint("name", "project_id"),
+ )
+ op.create_index(
+ "case_search_vector_idx", "case", ["search_vector"], unique=False, postgresql_using="gin"
+ )
+ op.create_table(
+ "assoc_case_case_priority",
+ sa.Column("case_id", sa.Integer(), nullable=False),
+ sa.Column("case_priority_id", sa.Integer(), nullable=False),
+ sa.Column("created_at", sa.DateTime(), nullable=True),
+ sa.ForeignKeyConstraint(
+ ["case_id"],
+ ["case.id"],
+ ),
+ sa.ForeignKeyConstraint(
+ ["case_priority_id"],
+ ["case_priority.id"],
+ ),
+ sa.PrimaryKeyConstraint("case_id", "case_priority_id"),
+ )
+ op.create_table(
+ "assoc_case_case_severity",
+ sa.Column("case_id", sa.Integer(), nullable=False),
+ sa.Column("case_severity_id", sa.Integer(), nullable=False),
+ sa.Column("created_at", sa.DateTime(), nullable=True),
+ sa.ForeignKeyConstraint(
+ ["case_id"],
+ ["case.id"],
+ ),
+ sa.ForeignKeyConstraint(
+ ["case_severity_id"],
+ ["case_severity.id"],
+ ),
+ sa.PrimaryKeyConstraint("case_id", "case_severity_id"),
+ )
+ op.create_table(
+ "assoc_case_case_type",
+ sa.Column("case_id", sa.Integer(), nullable=False),
+ sa.Column("case_type_id", sa.Integer(), nullable=False),
+ sa.Column("created_at", sa.DateTime(), nullable=True),
+ sa.ForeignKeyConstraint(
+ ["case_id"],
+ ["case.id"],
+ ),
+ sa.ForeignKeyConstraint(
+ ["case_type_id"],
+ ["case_type.id"],
+ ),
+ sa.PrimaryKeyConstraint("case_id", "case_type_id"),
+ )
+ op.create_table(
+ "assoc_case_tags",
+ sa.Column("case_id", sa.Integer(), nullable=False),
+ sa.Column("tag_id", sa.Integer(), nullable=False),
+ sa.ForeignKeyConstraint(["case_id"], ["case.id"], ondelete="CASCADE"),
+ sa.ForeignKeyConstraint(["tag_id"], ["tag.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("case_id", "tag_id"),
+ )
+ op.add_column("incident", sa.Column("case_id", sa.Integer(), nullable=True))
+ op.create_foreign_key(None, "incident", "case", ["case_id"], ["id"], ondelete="CASCADE")
+ op.add_column("ticket", sa.Column("case_id", sa.Integer(), nullable=True))
+ op.create_foreign_key(None, "ticket", "case", ["case_id"], ["id"], ondelete="CASCADE")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint(None, "ticket", type_="foreignkey")
+ op.drop_column("ticket", "case_id")
+ op.drop_constraint(None, "incident", type_="foreignkey")
+ op.drop_column("incident", "case_id")
+ op.drop_table("assoc_case_tags")
+ op.drop_table("assoc_case_case_type")
+ op.drop_table("assoc_case_case_severity")
+ op.drop_table("assoc_case_case_priority")
+ op.drop_index("case_search_vector_idx", table_name="case", postgresql_using="gin")
+ op.drop_table("case")
+ op.drop_index("case_type_search_vector_idx", table_name="case_type", postgresql_using="gin")
+ op.drop_table("case_type")
+ op.drop_index(
+ "case_severity_search_vector_idx", table_name="case_severity", postgresql_using="gin"
+ )
+ op.drop_table("case_severity")
+ op.drop_index(
+ "case_priority_search_vector_idx", table_name="case_priority", postgresql_using="gin"
+ )
+ op.drop_table("case_priority")
+ # ### end Alembic commands ###
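Several of the tables above declare `ondelete="CASCADE"` on their foreign keys, so deleting a case also removes its association rows. The effect can be sketched with SQLite's equivalent `ON DELETE CASCADE` (a toy two-table schema, not the Dispatch one):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled
conn.execute('CREATE TABLE "case" (id INTEGER PRIMARY KEY)')
conn.execute("CREATE TABLE tag (id INTEGER PRIMARY KEY)")
conn.execute(
    """
    CREATE TABLE assoc_case_tags (
        case_id INTEGER REFERENCES "case" (id) ON DELETE CASCADE,
        tag_id INTEGER REFERENCES tag (id) ON DELETE CASCADE,
        PRIMARY KEY (case_id, tag_id)
    )
    """
)
conn.execute('INSERT INTO "case" (id) VALUES (1)')
conn.execute("INSERT INTO tag (id) VALUES (10)")
conn.execute("INSERT INTO assoc_case_tags VALUES (1, 10)")

# deleting the parent case cascades to the association table
conn.execute('DELETE FROM "case" WHERE id = 1')
remaining = conn.execute("SELECT COUNT(*) FROM assoc_case_tags").fetchone()[0]
print(remaining)  # 0
```

Without the cascade, the `DELETE` would fail with a foreign-key violation while association rows still reference the case; note `case` must be quoted since it is an SQL keyword.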
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-07-28_847ff5e1f81c.py b/src/dispatch/database/revisions/tenant/versions/2022-07-28_847ff5e1f81c.py
new file mode 100644
index 000000000000..baee630447b8
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-07-28_847ff5e1f81c.py
@@ -0,0 +1,27 @@
+"""Removes slug column from case type data model
+
+Revision ID: 847ff5e1f81c
+Revises: 2ef7baab2916
+Create Date: 2022-07-28 12:00:08.607995
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "847ff5e1f81c"
+down_revision = "2ef7baab2916"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("case_type", "slug")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("case_type", sa.Column("slug", sa.VARCHAR(), autoincrement=False, nullable=True))
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-07-29_0785dceb3373.py b/src/dispatch/database/revisions/tenant/versions/2022-07-29_0785dceb3373.py
new file mode 100644
index 000000000000..0bf4bc0881df
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-07-29_0785dceb3373.py
@@ -0,0 +1,118 @@
+"""Adds on delete cascade to case association objects
+
+Revision ID: 0785dceb3373
+Revises: 847ff5e1f81c
+Create Date: 2022-07-29 11:03:34.558376
+
+"""
+from alembic import op
+
+# revision identifiers, used by Alembic.
+revision = "0785dceb3373"
+down_revision = "847ff5e1f81c"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint(
+ "assoc_case_case_priority_case_id_fkey", "assoc_case_case_priority", type_="foreignkey"
+ )
+ op.drop_constraint(
+ "assoc_case_case_priority_case_priority_id_fkey",
+ "assoc_case_case_priority",
+ type_="foreignkey",
+ )
+ op.create_foreign_key(
+ None, "assoc_case_case_priority", "case", ["case_id"], ["id"], ondelete="CASCADE"
+ )
+ op.create_foreign_key(
+ None,
+ "assoc_case_case_priority",
+ "case_priority",
+ ["case_priority_id"],
+ ["id"],
+ ondelete="CASCADE",
+ )
+ op.drop_constraint(
+ "assoc_case_case_severity_case_id_fkey", "assoc_case_case_severity", type_="foreignkey"
+ )
+ op.drop_constraint(
+ "assoc_case_case_severity_case_severity_id_fkey",
+ "assoc_case_case_severity",
+ type_="foreignkey",
+ )
+ op.create_foreign_key(
+ None,
+ "assoc_case_case_severity",
+ "case_severity",
+ ["case_severity_id"],
+ ["id"],
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(
+ None, "assoc_case_case_severity", "case", ["case_id"], ["id"], ondelete="CASCADE"
+ )
+ op.drop_constraint(
+ "assoc_case_case_type_case_id_fkey", "assoc_case_case_type", type_="foreignkey"
+ )
+ op.drop_constraint(
+ "assoc_case_case_type_case_type_id_fkey", "assoc_case_case_type", type_="foreignkey"
+ )
+ op.create_foreign_key(
+ None, "assoc_case_case_type", "case", ["case_id"], ["id"], ondelete="CASCADE"
+ )
+ op.create_foreign_key(
+ None, "assoc_case_case_type", "case_type", ["case_type_id"], ["id"], ondelete="CASCADE"
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint(None, "assoc_case_case_type", type_="foreignkey")
+ op.drop_constraint(None, "assoc_case_case_type", type_="foreignkey")
+ op.create_foreign_key(
+ "assoc_case_case_type_case_type_id_fkey",
+ "assoc_case_case_type",
+ "case_type",
+ ["case_type_id"],
+ ["id"],
+ )
+ op.create_foreign_key(
+ "assoc_case_case_type_case_id_fkey", "assoc_case_case_type", "case", ["case_id"], ["id"]
+ )
+ op.drop_constraint(None, "assoc_case_case_severity", type_="foreignkey")
+ op.drop_constraint(None, "assoc_case_case_severity", type_="foreignkey")
+ op.create_foreign_key(
+ "assoc_case_case_severity_case_severity_id_fkey",
+ "assoc_case_case_severity",
+ "case_severity",
+ ["case_severity_id"],
+ ["id"],
+ )
+ op.create_foreign_key(
+ "assoc_case_case_severity_case_id_fkey",
+ "assoc_case_case_severity",
+ "case",
+ ["case_id"],
+ ["id"],
+ )
+ op.drop_constraint(None, "assoc_case_case_priority", type_="foreignkey")
+ op.drop_constraint(None, "assoc_case_case_priority", type_="foreignkey")
+ op.create_foreign_key(
+ "assoc_case_case_priority_case_priority_id_fkey",
+ "assoc_case_case_priority",
+ "case_priority",
+ ["case_priority_id"],
+ ["id"],
+ )
+ op.create_foreign_key(
+ "assoc_case_case_priority_case_id_fkey",
+ "assoc_case_case_priority",
+ "case",
+ ["case_id"],
+ ["id"],
+ )
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-08-01_32a0fca4da9e.py b/src/dispatch/database/revisions/tenant/versions/2022-08-01_32a0fca4da9e.py
new file mode 100644
index 000000000000..03798223842c
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-08-01_32a0fca4da9e.py
@@ -0,0 +1,31 @@
+"""Removes ondelete cascade for case_id in the incident data model
+
+Revision ID: 32a0fca4da9e
+Revises: 391f7c3e6992
+Create Date: 2022-08-01 14:49:26.492323
+
+"""
+from alembic import op
+
+
+# revision identifiers, used by Alembic.
+revision = "32a0fca4da9e"
+down_revision = "391f7c3e6992"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint("incident_case_id_fkey", "incident", type_="foreignkey")
+ op.create_foreign_key(None, "incident", "case", ["case_id"], ["id"])
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint(None, "incident", type_="foreignkey")
+ op.create_foreign_key(
+ "incident_case_id_fkey", "incident", "case", ["case_id"], ["id"], ondelete="CASCADE"
+ )
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-08-01_391f7c3e6992.py b/src/dispatch/database/revisions/tenant/versions/2022-08-01_391f7c3e6992.py
new file mode 100644
index 000000000000..830d3ac535ff
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-08-01_391f7c3e6992.py
@@ -0,0 +1,34 @@
+"""Removes stable_at column and adds columns for triage_at and escalated_at timestamps"
+
+Revision ID: 391f7c3e6992
+Revises: 0785dceb3373
+Create Date: 2022-08-01 12:43:22.577363
+
+"""
+from alembic import op
+import sqlalchemy as sa
+from sqlalchemy.dialects import postgresql
+
+# revision identifiers, used by Alembic.
+revision = "391f7c3e6992"
+down_revision = "0785dceb3373"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("case", sa.Column("triage_at", sa.DateTime(), nullable=True))
+ op.add_column("case", sa.Column("escalated_at", sa.DateTime(), nullable=True))
+ op.drop_column("case", "stable_at")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column(
+ "case", sa.Column("stable_at", postgresql.TIMESTAMP(), autoincrement=False, nullable=True)
+ )
+ op.drop_column("case", "escalated_at")
+ op.drop_column("case", "triage_at")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-08-02_cec9ab5c5c8e.py b/src/dispatch/database/revisions/tenant/versions/2022-08-02_cec9ab5c5c8e.py
new file mode 100644
index 000000000000..85d3030477ee
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-08-02_cec9ab5c5c8e.py
@@ -0,0 +1,42 @@
+"""Adds support for case events
+
+Revision ID: cec9ab5c5c8e
+Revises: 32a0fca4da9e
+Create Date: 2022-08-02 16:27:57.425836
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "cec9ab5c5c8e"
+down_revision = "32a0fca4da9e"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("event", sa.Column("dispatch_user_id", sa.Integer(), nullable=True))
+ op.add_column("event", sa.Column("case_id", sa.Integer(), nullable=True))
+ op.create_foreign_key(
+ None,
+ "event",
+ "dispatch_user",
+ ["dispatch_user_id"],
+ ["id"],
+ referent_schema="dispatch_core",
+ ondelete="CASCADE",
+ )
+ op.create_foreign_key(None, "event", "case", ["case_id"], ["id"], ondelete="CASCADE")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint(None, "event", type_="foreignkey")
+ op.drop_constraint(None, "event", type_="foreignkey")
+ op.drop_column("event", "case_id")
+ op.drop_column("event", "dispatch_user_id")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-08-04_8d7232b7cbd4.py b/src/dispatch/database/revisions/tenant/versions/2022-08-04_8d7232b7cbd4.py
new file mode 100644
index 000000000000..0d9b4f7e583b
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-08-04_8d7232b7cbd4.py
@@ -0,0 +1,28 @@
+"""Adds column to case type table to support plugin metadata
+
+Revision ID: 8d7232b7cbd4
+Revises: cec9ab5c5c8e
+Create Date: 2022-08-04 10:02:21.310067
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "8d7232b7cbd4"
+down_revision = "cec9ab5c5c8e"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("case_type", sa.Column("plugin_metadata", sa.JSON(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("case_type", "plugin_metadata")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-08-05_e003a33e2b81.py b/src/dispatch/database/revisions/tenant/versions/2022-08-05_e003a33e2b81.py
new file mode 100644
index 000000000000..75c56bae33e5
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-08-05_e003a33e2b81.py
@@ -0,0 +1,34 @@
+"""Adds a group relationship to the case data model
+
+Revision ID: e003a33e2b81
+Revises: 8d7232b7cbd4
+Create Date: 2022-08-05 14:49:00.383167
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "e003a33e2b81"
+down_revision = "8d7232b7cbd4"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("case", sa.Column("tactical_group_id", sa.Integer(), nullable=True))
+ op.create_foreign_key(None, "case", "group", ["tactical_group_id"], ["id"])
+ op.add_column("group", sa.Column("case_id", sa.Integer(), nullable=True))
+ op.create_foreign_key(None, "group", "case", ["case_id"], ["id"], ondelete="CASCADE")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint(None, "group", type_="foreignkey")
+ op.drop_column("group", "case_id")
+ op.drop_constraint(None, "case", type_="foreignkey")
+ op.drop_column("case", "tactical_group_id")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-08-08_a3d211ae5404.py b/src/dispatch/database/revisions/tenant/versions/2022-08-08_a3d211ae5404.py
new file mode 100644
index 000000000000..7366c4b4f035
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-08-08_a3d211ae5404.py
@@ -0,0 +1,30 @@
+"""Adds support for storage in the case data model
+
+Revision ID: a3d211ae5404
+Revises: e003a33e2b81
+Create Date: 2022-08-08 11:38:35.722076
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "a3d211ae5404"
+down_revision = "e003a33e2b81"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("storage", sa.Column("case_id", sa.Integer(), nullable=True))
+ op.create_foreign_key(None, "storage", "case", ["case_id"], ["id"], ondelete="CASCADE")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint(None, "storage", type_="foreignkey")
+ op.drop_column("storage", "case_id")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-08-09_0344535782ee.py b/src/dispatch/database/revisions/tenant/versions/2022-08-09_0344535782ee.py
new file mode 100644
index 000000000000..470025229f0a
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-08-09_0344535782ee.py
@@ -0,0 +1,36 @@
+"""Adds support for documents in the case data model
+
+Revision ID: 0344535782ee
+Revises: a3d211ae5404
+Create Date: 2022-08-09 11:15:24.353023
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "0344535782ee"
+down_revision = "a3d211ae5404"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("case", sa.Column("case_document_id", sa.Integer(), nullable=True))
+ op.create_foreign_key(None, "case", "document", ["case_document_id"], ["id"])
+ op.add_column("document", sa.Column("case_id", sa.Integer(), nullable=True))
+ op.create_foreign_key(
+ None, "document", "case", ["case_id"], ["id"], ondelete="CASCADE", use_alter=True
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint(None, "document", type_="foreignkey")
+ op.drop_column("document", "case_id")
+ op.drop_constraint(None, "case", type_="foreignkey")
+ op.drop_column("case", "case_document_id")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-08-09_615ed0f95775.py b/src/dispatch/database/revisions/tenant/versions/2022-08-09_615ed0f95775.py
new file mode 100644
index 000000000000..693f8de9a0a1
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-08-09_615ed0f95775.py
@@ -0,0 +1,30 @@
+"""Adds a relationship between the case type and document data models
+
+Revision ID: 615ed0f95775
+Revises: 0344535782ee
+Create Date: 2022-08-09 16:51:02.857467
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "615ed0f95775"
+down_revision = "0344535782ee"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("case_type", sa.Column("case_template_document_id", sa.Integer(), nullable=True))
+ op.create_foreign_key(None, "case_type", "document", ["case_template_document_id"], ["id"])
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint(None, "case_type", type_="foreignkey")
+ op.drop_column("case_type", "case_template_document_id")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-08-10_33f35da7013b.py b/src/dispatch/database/revisions/tenant/versions/2022-08-10_33f35da7013b.py
new file mode 100644
index 000000000000..080a3fa10683
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-08-10_33f35da7013b.py
@@ -0,0 +1,30 @@
+"""Adds support for related cases in the case data model
+
+Revision ID: 33f35da7013b
+Revises: 615ed0f95775
+Create Date: 2022-08-10 13:27:05.121936
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "33f35da7013b"
+down_revision = "615ed0f95775"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("case", sa.Column("related_id", sa.Integer(), nullable=True))
+ op.create_foreign_key(None, "case", "case", ["related_id"], ["id"])
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint(None, "case", type_="foreignkey")
+ op.drop_column("case", "related_id")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-08-16_a27964951d4f.py b/src/dispatch/database/revisions/tenant/versions/2022-08-16_a27964951d4f.py
new file mode 100644
index 000000000000..630324dd3d9c
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-08-16_a27964951d4f.py
@@ -0,0 +1,30 @@
+"""Adds a relationship between case types and oncall services
+
+Revision ID: a27964951d4f
+Revises: 33f35da7013b
+Create Date: 2022-08-16 10:56:09.852403
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "a27964951d4f"
+down_revision = "33f35da7013b"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("case_type", sa.Column("oncall_service_id", sa.Integer(), nullable=True))
+ op.create_foreign_key(None, "case_type", "service", ["oncall_service_id"], ["id"])
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint(None, "case_type", type_="foreignkey")
+ op.drop_column("case_type", "oncall_service_id")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-08-18_86ec96cd14f8.py b/src/dispatch/database/revisions/tenant/versions/2022-08-18_86ec96cd14f8.py
new file mode 100644
index 000000000000..3aa882a702c3
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-08-18_86ec96cd14f8.py
@@ -0,0 +1,78 @@
+"""Replaces composite primary key with a dedicated one
+
+Revision ID: 86ec96cd14f8
+Revises: a27964951d4f
+Create Date: 2022-08-18 10:49:42.395466
+
+"""
+from alembic import op
+
+# import sqlalchemy as sa
+from sqlalchemy.dialects import postgresql
+
+# revision identifiers, used by Alembic.
+revision = "86ec96cd14f8"
+down_revision = "a27964951d4f"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ # op.add_column("assoc_case_case_priority", sa.Column("id", sa.Integer(), nullable=False))
+ op.drop_constraint("assoc_case_case_priority_pkey", "assoc_case_case_priority", type_="primary")
+ op.execute("ALTER TABLE assoc_case_case_priority ADD COLUMN id SERIAL PRIMARY KEY")
+ op.alter_column(
+ "assoc_case_case_priority",
+ "created_at",
+ existing_type=postgresql.TIMESTAMP(),
+ nullable=False,
+ )
+ # op.add_column("assoc_case_case_severity", sa.Column("id", sa.Integer(), nullable=False))
+ op.drop_constraint("assoc_case_case_severity_pkey", "assoc_case_case_severity", type_="primary")
+ op.execute("ALTER TABLE assoc_case_case_severity ADD COLUMN id SERIAL PRIMARY KEY")
+ op.alter_column(
+ "assoc_case_case_severity",
+ "created_at",
+ existing_type=postgresql.TIMESTAMP(),
+ nullable=False,
+ )
+ # op.add_column("assoc_case_case_type", sa.Column("id", sa.Integer(), nullable=False))
+ op.drop_constraint("assoc_case_case_type_pkey", "assoc_case_case_type", type_="primary")
+ op.execute("ALTER TABLE assoc_case_case_type ADD COLUMN id SERIAL PRIMARY KEY")
+ op.alter_column(
+ "assoc_case_case_type", "created_at", existing_type=postgresql.TIMESTAMP(), nullable=False
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.alter_column(
+ "assoc_case_case_type", "created_at", existing_type=postgresql.TIMESTAMP(), nullable=True
+ )
+ op.drop_column("assoc_case_case_type", "id")
+ op.create_primary_key(
+ "assoc_case_case_type_pkey", "assoc_case_case_type", ["case_id", "case_type_id"]
+ )
+ op.alter_column(
+ "assoc_case_case_severity",
+ "created_at",
+ existing_type=postgresql.TIMESTAMP(),
+ nullable=True,
+ )
+ op.drop_column("assoc_case_case_severity", "id")
+ op.create_primary_key(
+ "assoc_case_case_severity_pkey", "assoc_case_case_severity", ["case_id", "case_severity_id"]
+ )
+ op.alter_column(
+ "assoc_case_case_priority",
+ "created_at",
+ existing_type=postgresql.TIMESTAMP(),
+ nullable=True,
+ )
+ op.drop_column("assoc_case_case_priority", "id")
+ op.create_primary_key(
+ "assoc_case_case_priority_pkey", "assoc_case_case_priority", ["case_id", "case_priority_id"]
+ )
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-08-29_479024506e05.py b/src/dispatch/database/revisions/tenant/versions/2022-08-29_479024506e05.py
new file mode 100644
index 000000000000..b036d27f9ef6
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-08-29_479024506e05.py
@@ -0,0 +1,65 @@
+"""Sets resolution and status columns to not nullable in case and incident data models
+
+Revision ID: 479024506e05
+Revises: 86ec96cd14f8
+Create Date: 2022-08-29 16:08:16.679095
+
+"""
+from alembic import op
+import sqlalchemy as sa
+from sqlalchemy import Column, Integer, String
+from sqlalchemy.ext.declarative import declarative_base
+from sqlalchemy.orm import Session
+
+from dispatch.messaging.strings import INCIDENT_RESOLUTION_DEFAULT, CASE_RESOLUTION_DEFAULT
+
+
+Base = declarative_base()
+
+
+# revision identifiers, used by Alembic.
+revision = "479024506e05"
+down_revision = "86ec96cd14f8"
+branch_labels = None
+depends_on = None
+
+
+class Incident(Base):
+ __tablename__ = "incident"
+ id = Column(Integer, primary_key=True)
+ resolution = Column(String)
+
+
+class Case(Base):
+ __tablename__ = "case"
+ id = Column(Integer, primary_key=True)
+ resolution = Column(String)
+
+
+def upgrade():
+ bind = op.get_bind()
+ db_session = Session(bind=bind)
+
+ for incident in db_session.query(Incident).filter(Incident.resolution == None): # noqa: E711
+ incident.resolution = INCIDENT_RESOLUTION_DEFAULT
+
+ for case in db_session.query(Case).filter(Case.resolution == None): # noqa: E711
+ case.resolution = CASE_RESOLUTION_DEFAULT
+
+ db_session.commit()
+
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.alter_column("case", "resolution", existing_type=sa.VARCHAR(), nullable=False)
+ op.alter_column("case", "status", existing_type=sa.VARCHAR(), nullable=False)
+ op.alter_column("incident", "resolution", existing_type=sa.VARCHAR(), nullable=False)
+ op.alter_column("incident", "status", existing_type=sa.VARCHAR(), nullable=False)
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.alter_column("incident", "status", existing_type=sa.VARCHAR(), nullable=True)
+ op.alter_column("incident", "resolution", existing_type=sa.VARCHAR(), nullable=True)
+ op.alter_column("case", "status", existing_type=sa.VARCHAR(), nullable=True)
+ op.alter_column("case", "resolution", existing_type=sa.VARCHAR(), nullable=True)
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-09-02_f9384b873721.py b/src/dispatch/database/revisions/tenant/versions/2022-09-02_f9384b873721.py
new file mode 100644
index 000000000000..85e2627a669e
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-09-02_f9384b873721.py
@@ -0,0 +1,30 @@
+"""Adds relationship to incident type in the case type data model
+
+Revision ID: f9384b873721
+Revises: 479024506e05
+Create Date: 2022-09-02 13:54:20.742225
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "f9384b873721"
+down_revision = "479024506e05"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("case_type", sa.Column("incident_type_id", sa.Integer(), nullable=True))
+ op.create_foreign_key(None, "case_type", "incident_type", ["incident_type_id"], ["id"])
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint(None, "case_type", type_="foreignkey")
+ op.drop_column("case_type", "incident_type_id")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-09-09_4452426c6905.py b/src/dispatch/database/revisions/tenant/versions/2022-09-09_4452426c6905.py
new file mode 100644
index 000000000000..24b94ff4133e
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-09-09_4452426c6905.py
@@ -0,0 +1,98 @@
+"""Replaces Assoc Object tables with 1:1 relationships for case and types, severities, and priorities
+
+Revision ID: 4452426c6905
+Revises: f9384b873721
+Create Date: 2022-09-09 11:41:01.036642
+
+"""
+from alembic import op
+import sqlalchemy as sa
+from sqlalchemy.dialects import postgresql
+
+# revision identifiers, used by Alembic.
+revision = "4452426c6905"
+down_revision = "f9384b873721"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_table("assoc_case_case_type")
+ op.drop_table("assoc_case_case_priority")
+ op.drop_table("assoc_case_case_severity")
+ op.add_column("case", sa.Column("case_type_id", sa.Integer(), nullable=True))
+ op.add_column("case", sa.Column("case_severity_id", sa.Integer(), nullable=True))
+ op.add_column("case", sa.Column("case_priority_id", sa.Integer(), nullable=True))
+ op.create_foreign_key(None, "case", "case_severity", ["case_severity_id"], ["id"])
+ op.create_foreign_key(None, "case", "case_type", ["case_type_id"], ["id"])
+ op.create_foreign_key(None, "case", "case_priority", ["case_priority_id"], ["id"])
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint(None, "case", type_="foreignkey")
+ op.drop_constraint(None, "case", type_="foreignkey")
+ op.drop_constraint(None, "case", type_="foreignkey")
+ op.drop_column("case", "case_priority_id")
+ op.drop_column("case", "case_severity_id")
+ op.drop_column("case", "case_type_id")
+ op.create_table(
+ "assoc_case_case_severity",
+ sa.Column("case_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.Column("case_severity_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.Column("created_at", postgresql.TIMESTAMP(), autoincrement=False, nullable=False),
+ sa.Column("id", sa.INTEGER(), autoincrement=True, nullable=False),
+ sa.ForeignKeyConstraint(
+ ["case_id"],
+ ["case.id"],
+ name="assoc_case_case_severity_case_id_fkey",
+ ondelete="CASCADE",
+ ),
+ sa.ForeignKeyConstraint(
+ ["case_severity_id"],
+ ["case_severity.id"],
+ name="assoc_case_case_severity_case_severity_id_fkey",
+ ondelete="CASCADE",
+ ),
+ sa.PrimaryKeyConstraint("id", name="assoc_case_case_severity_pkey"),
+ )
+ op.create_table(
+ "assoc_case_case_priority",
+ sa.Column("case_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.Column("case_priority_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.Column("created_at", postgresql.TIMESTAMP(), autoincrement=False, nullable=False),
+ sa.Column("id", sa.INTEGER(), autoincrement=True, nullable=False),
+ sa.ForeignKeyConstraint(
+ ["case_id"],
+ ["case.id"],
+ name="assoc_case_case_priority_case_id_fkey",
+ ondelete="CASCADE",
+ ),
+ sa.ForeignKeyConstraint(
+ ["case_priority_id"],
+ ["case_priority.id"],
+ name="assoc_case_case_priority_case_priority_id_fkey",
+ ondelete="CASCADE",
+ ),
+ sa.PrimaryKeyConstraint("id", name="assoc_case_case_priority_pkey"),
+ )
+ op.create_table(
+ "assoc_case_case_type",
+ sa.Column("case_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.Column("case_type_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.Column("created_at", postgresql.TIMESTAMP(), autoincrement=False, nullable=False),
+ sa.Column("id", sa.INTEGER(), autoincrement=True, nullable=False),
+ sa.ForeignKeyConstraint(
+ ["case_id"], ["case.id"], name="assoc_case_case_type_case_id_fkey", ondelete="CASCADE"
+ ),
+ sa.ForeignKeyConstraint(
+ ["case_type_id"],
+ ["case_type.id"],
+ name="assoc_case_case_type_case_type_id_fkey",
+ ondelete="CASCADE",
+ ),
+ sa.PrimaryKeyConstraint("id", name="assoc_case_case_type_pkey"),
+ )
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-09-09_e49209df586d.py b/src/dispatch/database/revisions/tenant/versions/2022-09-09_e49209df586d.py
new file mode 100644
index 000000000000..edc0d24e74fa
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-09-09_e49209df586d.py
@@ -0,0 +1,30 @@
+"""Removes relationship to source in the case data model
+
+Revision ID: e49209df586d
+Revises: 4452426c6905
+Create Date: 2022-09-09 12:15:44.946539
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "e49209df586d"
+down_revision = "4452426c6905"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint("case_source_id_fkey", "case", type_="foreignkey")
+ op.drop_column("case", "source_id")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("case", sa.Column("source_id", sa.INTEGER(), autoincrement=False, nullable=True))
+ op.create_foreign_key("case_source_id_fkey", "case", "source", ["source_id"], ["id"])
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-09-23_cfde1bca6a6e.py b/src/dispatch/database/revisions/tenant/versions/2022-09-23_cfde1bca6a6e.py
new file mode 100644
index 000000000000..898284af9027
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-09-23_cfde1bca6a6e.py
@@ -0,0 +1,31 @@
+"""Adds the ability to associate cases to workflow instances.
+
+Revision ID: cfde1bca6a6e
+Revises: e49209df586d
+Create Date: 2022-09-23 11:36:09.154783
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "cfde1bca6a6e"
+down_revision = "e49209df586d"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("workflow_instance", sa.Column("case_id", sa.Integer(), nullable=True))
+ op.create_foreign_key(
+ None, "workflow_instance", "case", ["case_id"], ["id"], ondelete="CASCADE"
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint(None, "workflow_instance", type_="foreignkey")
+ op.drop_column("workflow_instance", "case_id")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-09-30_c1abb0cc40e5.py b/src/dispatch/database/revisions/tenant/versions/2022-09-30_c1abb0cc40e5.py
new file mode 100644
index 000000000000..8d0f372105f6
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-09-30_c1abb0cc40e5.py
@@ -0,0 +1,39 @@
+"""Removes relationship from task to ticket
+
+Revision ID: c1abb0cc40e5
+Revises: df200ca113f7
+Create Date: 2022-09-30 10:26:15.849836
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "c1abb0cc40e5"
+down_revision = "df200ca113f7"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_table("task_tickets")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.create_table(
+ "task_tickets",
+ sa.Column("ticket_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.Column("task_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.ForeignKeyConstraint(
+ ["task_id"], ["task.id"], name="task_tickets_task_id_fkey", ondelete="CASCADE"
+ ),
+ sa.ForeignKeyConstraint(
+ ["ticket_id"], ["ticket.id"], name="task_tickets_ticket_id_fkey", ondelete="CASCADE"
+ ),
+ sa.PrimaryKeyConstraint("ticket_id", "task_id", name="task_tickets_pkey"),
+ )
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-09-30_df200ca113f7.py b/src/dispatch/database/revisions/tenant/versions/2022-09-30_df200ca113f7.py
new file mode 100644
index 000000000000..65971d2903a7
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-09-30_df200ca113f7.py
@@ -0,0 +1,41 @@
+"""Allows for many-to-many relationships for cases and incidents
+
+Revision ID: df200ca113f7
+Revises: e49209df586d
+Create Date: 2022-09-30 10:02:46.584358
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "df200ca113f7"
+down_revision = "e49209df586d"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.create_table(
+ "assoc_case_incidents",
+ sa.Column("case_id", sa.Integer(), nullable=False),
+ sa.Column("incident_id", sa.Integer(), nullable=False),
+ sa.ForeignKeyConstraint(["case_id"], ["case.id"], ondelete="CASCADE"),
+ sa.ForeignKeyConstraint(["incident_id"], ["incident.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("case_id", "incident_id"),
+ )
+ op.drop_constraint("incident_case_id_fkey", "incident", type_="foreignkey")
+ op.drop_column("incident", "case_id")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column(
+ "incident", sa.Column("case_id", sa.INTEGER(), autoincrement=False, nullable=True)
+ )
+ op.create_foreign_key("incident_case_id_fkey", "incident", "case", ["case_id"], ["id"])
+ op.drop_table("assoc_case_incidents")
+ # ### end Alembic commands ###
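The `assoc_case_incidents` table above replaces the one-to-one `incident.case_id` column with a proper many-to-many link. A minimal sketch of the resulting semantics, using an in-memory SQLite stand-in for the Postgres schema (table and column names follow the migration; everything else is illustrative): deleting a case cascades to the association rows without touching the incidents themselves.

```python
import sqlite3

# Simplified stand-in for the migration's schema; composite primary key
# plus ON DELETE CASCADE on both foreign keys, as in assoc_case_incidents.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.executescript("""
CREATE TABLE "case" (id INTEGER PRIMARY KEY);
CREATE TABLE incident (id INTEGER PRIMARY KEY);
CREATE TABLE assoc_case_incidents (
    case_id INTEGER NOT NULL REFERENCES "case"(id) ON DELETE CASCADE,
    incident_id INTEGER NOT NULL REFERENCES incident(id) ON DELETE CASCADE,
    PRIMARY KEY (case_id, incident_id)
);
""")
conn.execute('INSERT INTO "case"(id) VALUES (1)')
conn.executemany("INSERT INTO incident(id) VALUES (?)", [(10,), (11,)])
# One case can now reference many incidents (and vice versa).
conn.executemany(
    "INSERT INTO assoc_case_incidents(case_id, incident_id) VALUES (?, ?)",
    [(1, 10), (1, 11)],
)
links = conn.execute("SELECT COUNT(*) FROM assoc_case_incidents").fetchone()[0]
# Deleting the case cascades to the association rows, not to the incidents.
conn.execute('DELETE FROM "case" WHERE id = 1')
remaining_links = conn.execute("SELECT COUNT(*) FROM assoc_case_incidents").fetchone()[0]
incidents_left = conn.execute("SELECT COUNT(*) FROM incident").fetchone()[0]
```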
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-10-04_3a5e776ddce4.py b/src/dispatch/database/revisions/tenant/versions/2022-10-04_3a5e776ddce4.py
new file mode 100644
index 000000000000..71e9aaef77d8
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-10-04_3a5e776ddce4.py
@@ -0,0 +1,21 @@
+"""Merge revision
+
+Revision ID: 3a5e776ddce4
+Revises: cfde1bca6a6e, c1abb0cc40e5
+Create Date: 2022-10-04 08:54:16.613236
+
+"""
+
+# revision identifiers, used by Alembic.
+revision = "3a5e776ddce4"
+down_revision = ("c1abb0cc40e5", "cfde1bca6a6e")
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ pass
+
+
+def downgrade():
+ pass
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-10-05_534c316ba65b.py b/src/dispatch/database/revisions/tenant/versions/2022-10-05_534c316ba65b.py
new file mode 100644
index 000000000000..37b86e25e537
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-10-05_534c316ba65b.py
@@ -0,0 +1,140 @@
+"""Adds signal models
+
+Revision ID: 534c316ba65b
+Revises: c1abb0cc40e5
+Create Date: 2022-10-05 12:28:00.818607
+
+"""
+from alembic import op
+import sqlalchemy as sa
+from sqlalchemy.dialects import postgresql
+import sqlalchemy_utils
+
+# revision identifiers, used by Alembic.
+revision = "534c316ba65b"
+down_revision = "c1abb0cc40e5"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.create_table(
+ "duplication_rule",
+ sa.Column("evergreen", sa.Boolean(), nullable=True),
+ sa.Column("evergreen_owner", sa.String(), nullable=True),
+ sa.Column("evergreen_reminder_interval", sa.Integer(), nullable=True),
+ sa.Column("evergreen_last_reminder_at", sa.DateTime(), nullable=True),
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("name", sa.String(), nullable=True),
+ sa.Column("description", sa.String(), nullable=True),
+ sa.Column("expression", sa.JSON(), nullable=False),
+ sa.Column("creator_id", sa.Integer(), nullable=True),
+ sa.Column("mode", sa.String(), nullable=False),
+ sa.Column("search_vector", sqlalchemy_utils.types.ts_vector.TSVectorType(), nullable=True),
+ sa.Column("project_id", sa.Integer(), nullable=True),
+ sa.ForeignKeyConstraint(
+ ["creator_id"],
+ ["dispatch_core.dispatch_user.id"],
+ ),
+ sa.ForeignKeyConstraint(["project_id"], ["project.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("id"),
+ sa.UniqueConstraint("name", "project_id"),
+ )
+ op.create_index(
+ "duplication_rule_search_vector_idx",
+ "duplication_rule",
+ ["search_vector"],
+ unique=False,
+ postgresql_using="gin",
+ )
+ op.create_table(
+ "suppression_rule",
+ sa.Column("evergreen", sa.Boolean(), nullable=True),
+ sa.Column("evergreen_owner", sa.String(), nullable=True),
+ sa.Column("evergreen_reminder_interval", sa.Integer(), nullable=True),
+ sa.Column("evergreen_last_reminder_at", sa.DateTime(), nullable=True),
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("name", sa.String(), nullable=True),
+ sa.Column("description", sa.String(), nullable=True),
+ sa.Column("expression", sa.JSON(), nullable=False),
+ sa.Column("creator_id", sa.Integer(), nullable=True),
+ sa.Column("mode", sa.String(), nullable=False),
+ sa.Column("expiration", sa.DateTime(), nullable=True),
+ sa.Column("search_vector", sqlalchemy_utils.types.ts_vector.TSVectorType(), nullable=True),
+ sa.Column("project_id", sa.Integer(), nullable=True),
+ sa.ForeignKeyConstraint(
+ ["creator_id"],
+ ["dispatch_core.dispatch_user.id"],
+ ),
+ sa.ForeignKeyConstraint(["project_id"], ["project.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("id"),
+ sa.UniqueConstraint("name", "project_id"),
+ )
+ op.create_index(
+ "suppression_rule_search_vector_idx",
+ "suppression_rule",
+ ["search_vector"],
+ unique=False,
+ postgresql_using="gin",
+ )
+ op.create_table(
+ "signal",
+ sa.Column("id", postgresql.UUID(as_uuid=True), nullable=False),
+ sa.Column("name", sa.String(), nullable=True),
+ sa.Column("external_url", sa.String(), nullable=True),
+ sa.Column("external_id", sa.String(), nullable=True),
+ sa.Column("severity", sa.String(), nullable=True),
+ sa.Column("detection", sa.String(), nullable=True),
+ sa.Column("detection_variant", sa.String(), nullable=True),
+ sa.Column("raw", postgresql.JSONB(astext_type=sa.Text()), nullable=True),
+ sa.Column("source_id", sa.Integer(), nullable=True),
+ sa.Column("case_id", sa.Integer(), nullable=True),
+ sa.Column("duplication_rule_id", sa.Integer(), nullable=True),
+ sa.Column("suppression_rule_id", sa.Integer(), nullable=True),
+ sa.Column("search_vector", sqlalchemy_utils.types.ts_vector.TSVectorType(), nullable=True),
+ sa.Column("project_id", sa.Integer(), nullable=True),
+ sa.Column("updated_at", sa.DateTime(), nullable=True),
+ sa.Column("created_at", sa.DateTime(), nullable=True),
+ sa.ForeignKeyConstraint(
+ ["case_id"],
+ ["case.id"],
+ ),
+ sa.ForeignKeyConstraint(
+ ["duplication_rule_id"],
+ ["duplication_rule.id"],
+ ),
+ sa.ForeignKeyConstraint(["project_id"], ["project.id"], ondelete="CASCADE"),
+ sa.ForeignKeyConstraint(
+ ["source_id"],
+ ["source.id"],
+ ),
+ sa.ForeignKeyConstraint(
+ ["suppression_rule_id"],
+ ["suppression_rule.id"],
+ ),
+ sa.PrimaryKeyConstraint("id"),
+ )
+ op.create_index(
+ "signal_search_vector_idx",
+ "signal",
+ ["search_vector"],
+ unique=False,
+ postgresql_using="gin",
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_index("signal_search_vector_idx", table_name="signal", postgresql_using="gin")
+ op.drop_table("signal")
+ op.drop_index(
+ "suppression_rule_search_vector_idx", table_name="suppression_rule", postgresql_using="gin"
+ )
+ op.drop_table("suppression_rule")
+ op.drop_index(
+ "duplication_rule_search_vector_idx", table_name="duplication_rule", postgresql_using="gin"
+ )
+ op.drop_table("duplication_rule")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-10-17_4f1af51dae6e.py b/src/dispatch/database/revisions/tenant/versions/2022-10-17_4f1af51dae6e.py
new file mode 100644
index 000000000000..343ad9256eec
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-10-17_4f1af51dae6e.py
@@ -0,0 +1,55 @@
+"""Adds data model for incident severity
+
+Revision ID: 4f1af51dae6e
+Revises: 534c316ba65b
+Create Date: 2022-10-17 13:07:32.137648
+
+"""
+from alembic import op
+import sqlalchemy as sa
+import sqlalchemy_utils
+
+
+# revision identifiers, used by Alembic.
+revision = "4f1af51dae6e"
+down_revision = "534c316ba65b"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.create_table(
+ "incident_severity",
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("name", sa.String(), nullable=True),
+ sa.Column("description", sa.String(), nullable=True),
+ sa.Column("color", sa.String(), nullable=True),
+ sa.Column("enabled", sa.Boolean(), nullable=True),
+ sa.Column("default", sa.Boolean(), nullable=True),
+ sa.Column("view_order", sa.Integer(), nullable=True),
+ sa.Column("search_vector", sqlalchemy_utils.types.ts_vector.TSVectorType(), nullable=True),
+ sa.Column("project_id", sa.Integer(), nullable=True),
+ sa.ForeignKeyConstraint(["project_id"], ["project.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("id"),
+ sa.UniqueConstraint("name", "project_id"),
+ )
+ op.create_index(
+ "incident_severity_search_vector_idx",
+ "incident_severity",
+ ["search_vector"],
+ unique=False,
+ postgresql_using="gin",
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_index(
+ "incident_severity_search_vector_idx",
+ table_name="incident_severity",
+ postgresql_using="gin",
+ )
+ op.drop_table("incident_severity")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-10-17_d6438d754467.py b/src/dispatch/database/revisions/tenant/versions/2022-10-17_d6438d754467.py
new file mode 100644
index 000000000000..22aaf68ea52e
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-10-17_d6438d754467.py
@@ -0,0 +1,30 @@
+"""Adds relationship between incident and incident severity
+
+Revision ID: d6438d754467
+Revises: 4f1af51dae6e
+Create Date: 2022-10-17 13:16:05.234617
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "d6438d754467"
+down_revision = "4f1af51dae6e"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("incident", sa.Column("incident_severity_id", sa.Integer(), nullable=True))
+ # name the constraint explicitly so downgrade() can drop it by name
+ op.create_foreign_key(
+ "incident_incident_severity_id_fkey",
+ "incident",
+ "incident_severity",
+ ["incident_severity_id"],
+ ["id"],
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint("incident_incident_severity_id_fkey", "incident", type_="foreignkey")
+ op.drop_column("incident", "incident_severity_id")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-10-19_3b0f5b81376f.py b/src/dispatch/database/revisions/tenant/versions/2022-10-19_3b0f5b81376f.py
new file mode 100644
index 000000000000..e255bb53c133
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-10-19_3b0f5b81376f.py
@@ -0,0 +1,124 @@
+"""Creates a default incident severity in each project and assigns it to all incidents
+
+Revision ID: 3b0f5b81376f
+Revises: d6438d754467
+Create Date: 2022-10-19 13:13:17.581202
+
+"""
+from alembic import op
+from typing import Annotated
+
+from pydantic import Field, StringConstraints, ConfigDict, BaseModel
+
+from sqlalchemy import Column, ForeignKey, Integer, String, Boolean
+from sqlalchemy.ext.declarative import declarative_base
+from sqlalchemy.orm import relationship, Session
+from sqlalchemy.sql.expression import true
+from sqlalchemy.sql.schema import UniqueConstraint
+
+from dispatch.incident.severity import service as incident_severity_service
+
+PrimaryKey = Annotated[int, Field(gt=0, lt=2147483647)]
+NameStr = Annotated[str, StringConstraints(pattern=r"^.*\S.*$", strip_whitespace=True, min_length=3)]
+
+Base = declarative_base()
+
+# revision identifiers, used by Alembic.
+revision = "3b0f5b81376f"
+down_revision = "d6438d754467"
+branch_labels = None
+depends_on = None
+
+
+class Project(Base):
+ __tablename__ = "project"
+ id = Column(Integer, primary_key=True)
+ name = Column(String)
+
+
+class IncidentSeverity(Base):
+ __table_args__ = (UniqueConstraint("name", "project_id"),)
+ __tablename__ = "incident_severity"
+ id = Column(Integer, primary_key=True)
+ name = Column(String)
+ description = Column(String)
+ color = Column(String)
+ enabled = Column(Boolean, default=True)
+ default = Column(Boolean, default=False)
+
+ # This column is used to control how severities should be displayed
+ # Lower numbers will be shown first.
+ view_order = Column(Integer, default=9999)
+
+ project_id = Column(Integer, ForeignKey("project.id", ondelete="CASCADE"))
+ project = relationship("Project")
+
+
+class Incident(Base):
+ __tablename__ = "incident"
+ id = Column(Integer, primary_key=True)
+
+ incident_severity = relationship("IncidentSeverity", backref="incident")
+ incident_severity_id = Column(Integer, ForeignKey("incident_severity.id"))
+
+ project = relationship("Project")
+ project_id = Column(Integer, ForeignKey("project.id", ondelete="CASCADE"))
+
+
+class DispatchBase(BaseModel):
+ model_config = ConfigDict(from_attributes=True, validate_assignment=True, arbitrary_types_allowed=True, str_strip_whitespace=True)
+
+
+class ProjectRead(DispatchBase):
+ id: PrimaryKey
+ name: NameStr
+
+
+class IncidentSeverityCreate(DispatchBase):
+ color: str
+ default: bool
+ description: str
+ enabled: bool
+ name: NameStr
+ project: ProjectRead
+ view_order: int
+
+
+def upgrade():
+ bind = op.get_bind()
+ session = Session(bind=bind)
+
+ projects = session.query(Project).all()
+ for project in projects:
+ print(f"Creating default incident severity in project {project.name}...")
+ incident_severity_in = IncidentSeverityCreate(
+ name="Undetermined",
+ description="The severity of the incident has not yet been determined.",
+ color="#9e9e9e",
+ enabled=True,
+ default=True,
+ project=project,
+ view_order=1,
+ )
+ incident_severity = incident_severity_service.create(
+ db_session=session, incident_severity_in=incident_severity_in
+ )
+ print(f"Default incident severity created in project {project.name}.")
+
+ print(f"Setting default incident severity to all incidents in project {project.name}...")
+ incident_severity = (
+ session.query(IncidentSeverity)
+ .filter(IncidentSeverity.project_id == project.id)
+ .filter(IncidentSeverity.default == true())
+ .first()
+ )
+ incidents = session.query(Incident).filter(Incident.project_id == project.id).all()
+ for incident in incidents:
+ incident.incident_severity = incident_severity
+ session.add(incident)
+ session.commit()
+ print(f"Default incident severity set to all incidents in project {project.name}.")
+
+
+def downgrade():
+ pass
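The migration above follows a common Alembic data-migration shape: create the default "Undetermined" severity per project, then point every existing incident in that project at it. A self-contained sketch of that backfill logic against a simplified in-memory schema (stdlib only and hypothetical; the real migration goes through the incident severity service and a SQLAlchemy session):

```python
import sqlite3

# Simplified stand-in schema for project / incident_severity / incident.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE project (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE incident_severity (
    id INTEGER PRIMARY KEY, name TEXT, "default" BOOLEAN, project_id INTEGER
);
CREATE TABLE incident (
    id INTEGER PRIMARY KEY, project_id INTEGER, incident_severity_id INTEGER
);
INSERT INTO project VALUES (1, 'alpha'), (2, 'beta');
INSERT INTO incident (id, project_id) VALUES (1, 1), (2, 1), (3, 2);
""")
for (project_id,) in conn.execute("SELECT id FROM project").fetchall():
    # Step 1: one default severity row per project.
    cur = conn.execute(
        'INSERT INTO incident_severity (name, "default", project_id) VALUES (?, 1, ?)',
        ("Undetermined", project_id),
    )
    # Step 2: backfill every incident in the project with that severity.
    conn.execute(
        "UPDATE incident SET incident_severity_id = ? WHERE project_id = ?",
        (cur.lastrowid, project_id),
    )
unset = conn.execute(
    "SELECT COUNT(*) FROM incident WHERE incident_severity_id IS NULL"
).fetchone()[0]
severities = conn.execute("SELECT COUNT(*) FROM incident_severity").fetchone()[0]
```

After the backfill no incident is left without a severity, which is what lets the application treat `incident_severity_id` as effectively required going forward.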
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-10-26_4b65941d065a.py b/src/dispatch/database/revisions/tenant/versions/2022-10-26_4b65941d065a.py
new file mode 100644
index 000000000000..9890cc478259
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-10-26_4b65941d065a.py
@@ -0,0 +1,75 @@
+"""Moves activity column and its value from the participant model to the participant role model
+
+Revision ID: 4b65941d065a
+Revises: 3b0f5b81376f
+Create Date: 2022-10-26 16:02:26.996119
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+
+from sqlalchemy import Column, ForeignKey, Integer
+from sqlalchemy.ext.declarative import declarative_base
+from sqlalchemy.orm import relationship, Session
+
+Base = declarative_base()
+
+# revision identifiers, used by Alembic.
+revision = "4b65941d065a"
+down_revision = "3b0f5b81376f"
+branch_labels = None
+depends_on = None
+
+
+class Participant(Base):
+ __tablename__ = "participant"
+ id = Column(Integer, primary_key=True)
+ activity = Column(Integer)
+ participant_roles = relationship(
+ "ParticipantRole", backref="participant", lazy="subquery", cascade="all, delete-orphan"
+ )
+
+
+class ParticipantRole(Base):
+ __tablename__ = "participant_role"
+ id = Column(Integer, primary_key=True)
+ activity = Column(Integer)
+ participant_id = Column(Integer, ForeignKey("participant.id", ondelete="CASCADE"))
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("participant_role", sa.Column("activity", sa.Integer(), nullable=True))
+
+ bind = op.get_bind()
+ session = Session(bind=bind)
+
+ print("Migrating participant activity to participant role activity...")
+
+ participants = session.query(Participant).all()
+ for participant in participants:
+ participant_activity = participant.activity
+ for participant_role in participant.participant_roles:
+ participant_role.activity = participant_activity
+ session.add(participant_role)
+ session.commit()
+
+ print("Participant activity to participant role activity migrated.")
+
+ op.drop_column("participant", "activity")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("participant_role", "activity")
+ op.add_column(
+ "participant", sa.Column("activity", sa.INTEGER(), autoincrement=False, nullable=True)
+ )
+ # ### end Alembic commands ###
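The migration above is the classic three-step column move: add the nullable destination column, copy values across, drop the source column. A stdlib sketch of the first two steps on a simplified schema (the drop is left as a comment, since SQLite only supports `DROP COLUMN` from 3.35; the real migration uses `op.drop_column`):

```python
import sqlite3

# Simplified stand-in for participant / participant_role.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE participant (id INTEGER PRIMARY KEY, activity INTEGER);
CREATE TABLE participant_role (id INTEGER PRIMARY KEY, participant_id INTEGER);
INSERT INTO participant VALUES (1, 5), (2, 9);
INSERT INTO participant_role (id, participant_id) VALUES (1, 1), (2, 1), (3, 2);
""")
# Step 1: add the destination column (nullable, so existing rows stay legal).
conn.execute("ALTER TABLE participant_role ADD COLUMN activity INTEGER")
# Step 2: copy the parent participant's activity onto each of its roles.
conn.execute("""
    UPDATE participant_role
    SET activity = (SELECT activity FROM participant
                    WHERE participant.id = participant_role.participant_id)
""")
# Step 3 (in the real migration): op.drop_column("participant", "activity")
rows = conn.execute("SELECT id, activity FROM participant_role ORDER BY id").fetchall()
```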
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-10-27_01aa49ca0470.py b/src/dispatch/database/revisions/tenant/versions/2022-10-27_01aa49ca0470.py
new file mode 100644
index 000000000000..bff9a08c69a8
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-10-27_01aa49ca0470.py
@@ -0,0 +1,245 @@
+"""Refactors signal processing models; drops and recreates the existing signal tables.
+
+Revision ID: 01aa49ca0470
+Revises: 3b0f5b81376f
+Create Date: 2022-10-27 10:46:56.571751
+
+"""
+from alembic import op
+import sqlalchemy as sa
+from sqlalchemy.dialects import postgresql
+
+import sqlalchemy_utils
+
+# revision identifiers, used by Alembic.
+revision = "01aa49ca0470"
+down_revision = "3b0f5b81376f"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_table("signal")
+ op.create_table(
+ "assoc_duplication_rule_tag_types",
+ sa.Column("duplication_rule_id", sa.Integer(), nullable=False),
+ sa.Column("tag_type_id", sa.Integer(), nullable=False),
+ sa.ForeignKeyConstraint(
+ ["duplication_rule_id"], ["duplication_rule.id"], ondelete="CASCADE"
+ ),
+ sa.ForeignKeyConstraint(["tag_type_id"], ["tag_type.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("duplication_rule_id", "tag_type_id"),
+ )
+ op.create_table(
+ "assoc_suppression_rule_tags",
+ sa.Column("suppression_rule_id", sa.Integer(), nullable=False),
+ sa.Column("tag_id", sa.Integer(), nullable=False),
+ sa.ForeignKeyConstraint(
+ ["suppression_rule_id"], ["suppression_rule.id"], ondelete="CASCADE"
+ ),
+ sa.ForeignKeyConstraint(["tag_id"], ["tag.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("suppression_rule_id", "tag_id"),
+ )
+ op.create_table(
+ "signal",
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("name", sa.String(), nullable=True),
+ sa.Column("owner", sa.String(), nullable=True),
+ sa.Column("description", sa.String(), nullable=True),
+ sa.Column("external_url", sa.String(), nullable=True),
+ sa.Column("external_id", sa.String(), nullable=True),
+ sa.Column("source_id", sa.Integer(), nullable=True),
+ sa.Column("variant", sa.String(), nullable=True),
+ sa.Column("case_type_id", sa.Integer(), nullable=True),
+ sa.Column("case_priority_id", sa.Integer(), nullable=True),
+ sa.Column("duplication_rule_id", sa.Integer(), nullable=True),
+ sa.Column("suppression_rule_id", sa.Integer(), nullable=True),
+ sa.Column("search_vector", sqlalchemy_utils.types.ts_vector.TSVectorType(), nullable=True),
+ sa.Column("project_id", sa.Integer(), nullable=True),
+ sa.Column("created_at", sa.DateTime(), nullable=True),
+ sa.Column("updated_at", sa.DateTime(), nullable=True),
+ sa.ForeignKeyConstraint(
+ ["case_priority_id"],
+ ["case_priority.id"],
+ ),
+ sa.ForeignKeyConstraint(
+ ["case_type_id"],
+ ["case_type.id"],
+ ),
+ sa.ForeignKeyConstraint(
+ ["duplication_rule_id"],
+ ["duplication_rule.id"],
+ ),
+ sa.ForeignKeyConstraint(["project_id"], ["project.id"], ondelete="CASCADE"),
+ sa.ForeignKeyConstraint(
+ ["source_id"],
+ ["source.id"],
+ ),
+ sa.ForeignKeyConstraint(
+ ["suppression_rule_id"],
+ ["suppression_rule.id"],
+ ),
+ sa.PrimaryKeyConstraint("id"),
+ )
+ op.create_index(
+ "signal_search_vector_idx",
+ "signal",
+ ["search_vector"],
+ unique=False,
+ postgresql_using="gin",
+ )
+ op.create_table(
+ "signal_instance",
+ sa.Column("id", postgresql.UUID(as_uuid=True), nullable=False),
+ sa.Column("case_id", sa.Integer(), nullable=True),
+ sa.Column("signal_id", sa.Integer(), nullable=True),
+ sa.Column("fingerprint", sa.String(), nullable=True),
+ sa.Column("duplication_rule_id", sa.Integer(), nullable=True),
+ sa.Column("suppression_rule_id", sa.Integer(), nullable=True),
+ sa.Column("raw", postgresql.JSONB(astext_type=sa.Text()), nullable=True),
+ sa.Column("project_id", sa.Integer(), nullable=True),
+ sa.Column("updated_at", sa.DateTime(), nullable=True),
+ sa.Column("created_at", sa.DateTime(), nullable=True),
+ sa.ForeignKeyConstraint(
+ ["case_id"],
+ ["case.id"],
+ ),
+ sa.ForeignKeyConstraint(
+ ["duplication_rule_id"],
+ ["duplication_rule.id"],
+ ),
+ sa.ForeignKeyConstraint(["project_id"], ["project.id"], ondelete="CASCADE"),
+ sa.ForeignKeyConstraint(
+ ["signal_id"],
+ ["signal.id"],
+ ),
+ sa.ForeignKeyConstraint(
+ ["suppression_rule_id"],
+ ["suppression_rule.id"],
+ ),
+ sa.PrimaryKeyConstraint("id"),
+ )
+ op.create_table(
+ "assoc_signal_instance_tags",
+ sa.Column("signal_instance_id", postgresql.UUID(), nullable=False),
+ sa.Column("tag_id", sa.Integer(), nullable=False),
+ sa.ForeignKeyConstraint(["signal_instance_id"], ["signal_instance.id"], ondelete="CASCADE"),
+ sa.ForeignKeyConstraint(["tag_id"], ["tag.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("signal_instance_id", "tag_id"),
+ )
+ op.add_column("duplication_rule", sa.Column("window", sa.Integer(), nullable=True))
+ op.drop_constraint("duplication_rule_name_project_id_key", "duplication_rule", type_="unique")
+ op.drop_index("duplication_rule_search_vector_idx", table_name="duplication_rule")
+ op.drop_constraint("duplication_rule_creator_id_fkey", "duplication_rule", type_="foreignkey")
+ op.drop_column("duplication_rule", "expression")
+ op.drop_column("duplication_rule", "creator_id")
+ op.drop_column("duplication_rule", "description")
+ op.drop_column("duplication_rule", "name")
+ op.drop_column("duplication_rule", "search_vector")
+ op.drop_constraint("suppression_rule_name_project_id_key", "suppression_rule", type_="unique")
+ op.drop_index("suppression_rule_search_vector_idx", table_name="suppression_rule")
+ op.drop_constraint("suppression_rule_creator_id_fkey", "suppression_rule", type_="foreignkey")
+ op.drop_column("suppression_rule", "expression")
+ op.drop_column("suppression_rule", "creator_id")
+ op.drop_column("suppression_rule", "description")
+ op.drop_column("suppression_rule", "name")
+ op.drop_column("suppression_rule", "search_vector")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column(
+ "suppression_rule",
+ sa.Column("search_vector", postgresql.TSVECTOR(), autoincrement=False, nullable=True),
+ )
+ op.add_column(
+ "suppression_rule", sa.Column("name", sa.VARCHAR(), autoincrement=False, nullable=True)
+ )
+ op.add_column(
+ "suppression_rule",
+ sa.Column("description", sa.VARCHAR(), autoincrement=False, nullable=True),
+ )
+ op.add_column(
+ "suppression_rule",
+ sa.Column("creator_id", sa.INTEGER(), autoincrement=False, nullable=True),
+ )
+ op.add_column(
+ "suppression_rule",
+ sa.Column(
+ "expression",
+ postgresql.JSON(astext_type=sa.Text()),
+ autoincrement=False,
+ nullable=False,
+ ),
+ )
+ op.create_foreign_key(
+ "suppression_rule_creator_id_fkey",
+ "suppression_rule",
+ "dispatch_user",
+ ["creator_id"],
+ ["id"],
+ referent_schema="dispatch_core",
+ )
+ op.create_index(
+ "suppression_rule_search_vector_idx", "suppression_rule", ["search_vector"], unique=False
+ )
+ op.create_unique_constraint(
+ "suppression_rule_name_project_id_key", "suppression_rule", ["name", "project_id"]
+ )
+ op.add_column(
+ "duplication_rule",
+ sa.Column("search_vector", postgresql.TSVECTOR(), autoincrement=False, nullable=True),
+ )
+ op.add_column(
+ "duplication_rule", sa.Column("name", sa.VARCHAR(), autoincrement=False, nullable=True)
+ )
+ op.add_column(
+ "duplication_rule",
+ sa.Column("description", sa.VARCHAR(), autoincrement=False, nullable=True),
+ )
+ op.add_column(
+ "duplication_rule",
+ sa.Column("creator_id", sa.INTEGER(), autoincrement=False, nullable=True),
+ )
+ op.add_column(
+ "duplication_rule",
+ sa.Column(
+ "expression",
+ postgresql.JSON(astext_type=sa.Text()),
+ autoincrement=False,
+ nullable=False,
+ ),
+ )
+ op.create_foreign_key(
+ "duplication_rule_creator_id_fkey",
+ "duplication_rule",
+ "dispatch_user",
+ ["creator_id"],
+ ["id"],
+ referent_schema="dispatch_core",
+ )
+ op.create_index(
+ "duplication_rule_search_vector_idx", "duplication_rule", ["search_vector"], unique=False
+ )
+ op.create_unique_constraint(
+ "duplication_rule_name_project_id_key", "duplication_rule", ["name", "project_id"]
+ )
+ op.drop_column("duplication_rule", "window")
+ op.drop_table("assoc_signal_instance_tags")
+ op.drop_table("signal_instance")
+ op.drop_index("signal_search_vector_idx", table_name="signal", postgresql_using="gin")
+ op.drop_table("signal")
+ op.drop_table("assoc_suppression_rule_tags")
+ op.drop_table("assoc_duplication_rule_tag_types")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-10-31_34aeedc9d09a.py b/src/dispatch/database/revisions/tenant/versions/2022-10-31_34aeedc9d09a.py
new file mode 100644
index 000000000000..e84d1102bec6
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-10-31_34aeedc9d09a.py
@@ -0,0 +1,21 @@
+"""Merge revision
+
+Revision ID: 34aeedc9d09a
+Revises: 4b65941d065a, 01aa49ca0470, 3a5e776ddce4
+Create Date: 2022-10-31 09:41:11.971654
+
+"""
+
+# revision identifiers, used by Alembic.
+revision = "34aeedc9d09a"
+down_revision = ("4b65941d065a", "01aa49ca0470", "3a5e776ddce4")
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ pass
+
+
+def downgrade():
+ pass
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-11-09_eb348d46afd0.py b/src/dispatch/database/revisions/tenant/versions/2022-11-09_eb348d46afd0.py
new file mode 100644
index 000000000000..31c2c4d08929
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-11-09_eb348d46afd0.py
@@ -0,0 +1,30 @@
+"""Adds a cascade for deleting cases.
+
+Revision ID: eb348d46afd0
+Revises: 34aeedc9d09a
+Create Date: 2022-11-09 10:01:08.742441
+
+"""
+from alembic import op
+
+# revision identifiers, used by Alembic.
+revision = "eb348d46afd0"
+down_revision = "34aeedc9d09a"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint("signal_instance_case_id_fkey", "signal_instance", type_="foreignkey")
+ op.create_foreign_key(
+ "signal_instance_case_id_fkey",
+ "signal_instance",
+ "case",
+ ["case_id"],
+ ["id"],
+ ondelete="CASCADE",
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint("signal_instance_case_id_fkey", "signal_instance", type_="foreignkey")
+ op.create_foreign_key(
+ "signal_instance_case_id_fkey", "signal_instance", "case", ["case_id"], ["id"]
+ )
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-12-09_f809d9e76ba3.py b/src/dispatch/database/revisions/tenant/versions/2022-12-09_f809d9e76ba3.py
new file mode 100644
index 000000000000..ae0630317237
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-12-09_f809d9e76ba3.py
@@ -0,0 +1,39 @@
+"""Removes relationship to incidents
+
+Revision ID: f809d9e76ba3
+Revises: eb348d46afd0
+Create Date: 2022-12-09 12:59:54.383772
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "f809d9e76ba3"
+down_revision = "eb348d46afd0"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_table("service_incident")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.create_table(
+ "service_incident",
+ sa.Column("incident_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.Column("service_id", sa.INTEGER(), autoincrement=False, nullable=False),
+ sa.ForeignKeyConstraint(
+ ["incident_id"], ["incident.id"], name="service_incident_incident_id_fkey"
+ ),
+ sa.ForeignKeyConstraint(
+ ["service_id"], ["service.id"], name="service_incident_service_id_fkey"
+ ),
+ sa.PrimaryKeyConstraint("incident_id", "service_id", name="service_incident_pkey"),
+ )
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2022-12-15_bd61c1e1e7cd.py b/src/dispatch/database/revisions/tenant/versions/2022-12-15_bd61c1e1e7cd.py
new file mode 100644
index 000000000000..3e3578887d80
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2022-12-15_bd61c1e1e7cd.py
@@ -0,0 +1,28 @@
+"""Adds page_assignee column
+
+Revision ID: bd61c1e1e7cd
+Revises: f809d9e76ba3
+Create Date: 2022-12-15 14:59:25.077450
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "bd61c1e1e7cd"
+down_revision = "f809d9e76ba3"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("case_priority", sa.Column("page_assignee", sa.Boolean(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("case_priority", "page_assignee")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-01-13_6e53722fa1f2.py b/src/dispatch/database/revisions/tenant/versions/2023-01-13_6e53722fa1f2.py
new file mode 100644
index 000000000000..708950fb3131
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-01-13_6e53722fa1f2.py
@@ -0,0 +1,58 @@
+"""Adds the ability for case to send and interact with external conversations.
+
+Revision ID: 6e53722fa1f2
+Revises: bd61c1e1e7cd
+Create Date: 2023-01-13 14:37:42.899102
+
+"""
+from alembic import op
+import sqlalchemy as sa
+from sqlalchemy.dialects import postgresql
+
+# revision identifiers, used by Alembic.
+revision = "6e53722fa1f2"
+down_revision = "bd61c1e1e7cd"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.create_table(
+ "assoc_signal_tags",
+ sa.Column("signal_id", sa.Integer(), nullable=False),
+ sa.Column("tag_id", sa.Integer(), nullable=False),
+ sa.ForeignKeyConstraint(["signal_id"], ["signal.id"], ondelete="CASCADE"),
+ sa.ForeignKeyConstraint(["tag_id"], ["tag.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("signal_id", "tag_id"),
+ )
+ op.add_column("case_type", sa.Column("conversation_target", sa.String(), nullable=True))
+ op.add_column("conversation", sa.Column("thread_id", sa.String(), nullable=True))
+ op.add_column("conversation", sa.Column("case_id", sa.Integer(), nullable=True))
+ op.create_foreign_key(
+ "conversation_case_id_fkey", "conversation", "case", ["case_id"], ["id"], ondelete="CASCADE"
+ )
+ op.add_column("signal", sa.Column("loopin_signal_identity", sa.Boolean(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("signal", "loopin_signal_identity")
+ op.drop_constraint("conversation_case_id_fkey", "conversation", type_="foreignkey")
+ op.drop_column("conversation", "case_id")
+ op.drop_column("conversation", "thread_id")
+ op.drop_column("case_type", "conversation_target")
+ op.drop_table("assoc_signal_tags")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-01-25_e23c468440c9.py b/src/dispatch/database/revisions/tenant/versions/2023-01-25_e23c468440c9.py
new file mode 100644
index 000000000000..8b7210c0e99f
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-01-25_e23c468440c9.py
@@ -0,0 +1,25 @@
+"""Sets a default search filter creator
+
+Revision ID: e23c468440c9
+Revises: 6e53722fa1f2
+Create Date: 2023-01-25 11:05:55.742100
+
+"""
+from alembic import op
+
+# revision identifiers, used by Alembic.
+revision = "e23c468440c9"
+down_revision = "6e53722fa1f2"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ # default to a known ID if there is none to help with creator migration
+ op.execute("update search_filter set creator_id = 1 where creator_id is null;")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ pass
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-01-27_956eb8f8987e.py b/src/dispatch/database/revisions/tenant/versions/2023-01-27_956eb8f8987e.py
new file mode 100644
index 000000000000..d25a331dbf15
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-01-27_956eb8f8987e.py
@@ -0,0 +1,27 @@
+"""Adds user_conversation_id to the participant table
+
+Revision ID: 956eb8f8987e
+Revises: e23c468440c9
+Create Date: 2023-01-27 11:11:47.149608
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "956eb8f8987e"
+down_revision = "e23c468440c9"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("participant", sa.Column("user_conversation_id", sa.String(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("participant", "user_conversation_id")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-01-30_941efd922446.py b/src/dispatch/database/revisions/tenant/versions/2023-01-30_941efd922446.py
new file mode 100644
index 000000000000..151dc4377ac6
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-01-30_941efd922446.py
@@ -0,0 +1,33 @@
+"""Migrates to subject instead of type for search filters
+
+Revision ID: 941efd922446
+Revises: e4b4991dddcd
+Create Date: 2023-01-30 08:49:41.713653
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "941efd922446"
+down_revision = "e4b4991dddcd"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+
+ op.add_column("search_filter", sa.Column("subject", sa.String(), nullable=True))
+ op.execute("UPDATE search_filter SET subject = 'incident' WHERE subject is null")
+ op.drop_column("search_filter", "type")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column(
+ "search_filter", sa.Column("type", sa.VARCHAR(), autoincrement=False, nullable=True)
+ )
+ op.drop_column("search_filter", "subject")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-01-30_e4b4991dddcd.py b/src/dispatch/database/revisions/tenant/versions/2023-01-30_e4b4991dddcd.py
new file mode 100644
index 000000000000..c271c4cdaa98
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-01-30_e4b4991dddcd.py
@@ -0,0 +1,205 @@
+"""Migrates case assignees from DispatchUser to Participant
+
+Revision ID: e4b4991dddcd
+Revises: 956eb8f8987e
+Create Date: 2023-01-30 10:52:31.676368
+
+"""
+from alembic import op
+
+from enum import Enum
+from pydantic import ConfigDict, BaseModel
+import sqlalchemy as sa
+from sqlalchemy.orm import Session, relationship
+from sqlalchemy.sql.expression import true
+from sqlalchemy.ext.declarative import declared_attr
+from sqlalchemy.ext.declarative import declarative_base
+
+# revision identifiers, used by Alembic.
+revision = "e4b4991dddcd"
+down_revision = "956eb8f8987e"
+branch_labels = None
+depends_on = None
+
+
+Base = declarative_base()
+
+
+class DispatchUser(Base):
+ __tablename__ = "dispatch_user"
+ __table_args__ = {"schema": "dispatch_core"}
+ id = sa.Column(sa.Integer, primary_key=True)
+ email = sa.Column(sa.String)
+
+
+class Project(Base):
+ __tablename__ = "project"
+ id = sa.Column(sa.Integer, primary_key=True)
+ name = sa.Column(sa.String)
+ default = sa.Column(sa.Boolean, default=False)
+
+
+class Case(Base):
+ __tablename__ = "case"
+ id = sa.Column(sa.Integer, primary_key=True)
+
+ assignee_id = sa.Column(sa.Integer, sa.ForeignKey(DispatchUser.id))
+ _assignee_id = sa.Column(sa.Integer, sa.ForeignKey("participant.id"))
+
+
+# Pydantic models...
+class DispatchBase(BaseModel):
+    model_config = ConfigDict(
+        from_attributes=True,
+        validate_assignment=True,
+        arbitrary_types_allowed=True,
+        str_strip_whitespace=True,
+    )
+
+
+class DispatchEnum(str, Enum):
+ def __str__(self) -> str:
+ return str.__str__(self)
+
+
+class ParticipantRole(Base):
+ __tablename__ = "participant_role"
+ id = sa.Column(sa.Integer, primary_key=True)
+ role = sa.Column(sa.String)
+
+
+class ParticipantRoleBase(DispatchBase):
+ role: str
+
+
+class ParticipantRoleType(DispatchEnum):
+ assignee = "Assignee"
+ reporter = "Reporter"
+
+
+class ParticipantRoleCreate(ParticipantRoleBase):
+ role: ParticipantRoleType | None = None
+
+
+class ProjectMixin(object):
+ """Project mixin"""
+
+ @declared_attr
+ def project_id(cls): # noqa
+ return sa.Column(sa.Integer, sa.ForeignKey("project.id", ondelete="CASCADE"))
+
+ @declared_attr
+ def project(cls): # noqa
+ return relationship("Project")
+
+
+class IndividualContact(Base, ProjectMixin):
+ __tablename__ = "individual_contact"
+
+ id = sa.Column(sa.Integer, primary_key=True)
+ email = sa.Column(sa.String)
+ name = sa.Column(sa.String)
+ weblink = sa.Column(sa.String)
+
+
+class Participant(Base):
+ __tablename__ = "participant"
+ id = sa.Column(sa.Integer, primary_key=True)
+ case_id = sa.Column(sa.Integer)
+ individual_contact_id = sa.Column(sa.Integer)
+
+
+def is_participant(db_session: Session, participant_id: int) -> bool:
+    return (
+        db_session.query(Participant).filter(Participant.id == participant_id).first() is not None
+    )
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column(
+ "case",
+ sa.Column("participants_team", sa.String()),
+ )
+ op.add_column(
+ "case",
+ sa.Column("participants_location", sa.String()),
+ )
+ op.add_column(
+ "case",
+ sa.Column("_assignee_id", sa.Integer()),
+ )
+ op.add_column(
+ "participant",
+ sa.Column("case_id", sa.Integer()),
+ )
+
+    print("Migrating case assignees from DispatchUser to Participant...")
+
+ bind = op.get_bind()
+ db_session = Session(bind=bind)
+
+ for case in db_session.query(Case):
+ print(f"Processing Case {case.id}...")
+ if not is_participant(db_session, participant_id=case.assignee_id):
+ current_user = (
+ db_session.query(DispatchUser).filter(DispatchUser.id == case.assignee_id).first()
+ )
+ individual = (
+ db_session.query(IndividualContact)
+ .filter(IndividualContact.email == current_user.email)
+ .first()
+ )
+ if individual is None:
+ i = {}
+ i["email"] = current_user.email
+ i["name"] = current_user.email
+ i["weblink"] = ""
+                default_project = (
+                    db_session.query(Project).filter(Project.default == true()).one_or_none()
+                )
+                # a plain string is not a valid relationship target, so fall back to None
+                individual = IndividualContact(
+                    **i,
+                    project=default_project,
+                )
+                db_session.add(individual)
+                # flush so the new individual is assigned a primary key before it is referenced
+                db_session.flush()
+
+        participant = Participant(
+            individual_contact_id=individual.id,
+        )
+        db_session.add(participant)
+        db_session.commit()
+        role = ParticipantRole(role=ParticipantRoleType.assignee)
+        # the Participant model above maps no participant_roles relationship,
+        # so persist the role explicitly
+        db_session.add(role)
+
+ case._assignee_id = participant.id
+ participant.case_id = case.id
+ participant.individual_contact_id = individual.id
+
+ db_session.commit()
+
+ op.create_foreign_key(
+ None, "participant", "case", ["case_id"], ["id"], ondelete="CASCADE", use_alter=True
+ )
+ op.drop_column("case", "assignee_id")
+ op.alter_column("case", "_assignee_id", new_column_name="assignee_id")
+ op.create_foreign_key(None, "case", "participant", ["assignee_id"], ["id"])
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+    # constraint names follow PostgreSQL's default naming convention
+    op.drop_constraint("participant_case_id_fkey", "participant", type_="foreignkey")
+    op.drop_column("participant", "case_id")
+    op.drop_constraint("case_assignee_id_fkey", "case", type_="foreignkey")
+ op.create_foreign_key(
+ "case_assignee_id_fkey",
+ "case",
+ "dispatch_user",
+ ["assignee_id"],
+ ["id"],
+ referent_schema="dispatch_core",
+ )
+ op.drop_column("case", "participants_location")
+ op.drop_column("case", "participants_team")
+
+
+# ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-02-09_8746b4e292d2.py b/src/dispatch/database/revisions/tenant/versions/2023-02-09_8746b4e292d2.py
new file mode 100644
index 000000000000..4943f60b3ae1
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-02-09_8746b4e292d2.py
@@ -0,0 +1,102 @@
+"""Adds entity and entity type tables and associations
+
+Revision ID: 8746b4e292d2
+Revises: 941efd922446
+Create Date: 2023-02-09 23:18:11.326027
+
+"""
+from alembic import op
+import sqlalchemy as sa
+from sqlalchemy.dialects.postgresql import TSVECTOR, UUID
+
+# revision identifiers, used by Alembic.
+revision = "8746b4e292d2"
+down_revision = "941efd922446"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust!
+
+ # EntityType
+ op.create_table(
+ "entity_type",
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("name", sa.String(), nullable=True),
+ sa.Column("description", sa.String(), nullable=True),
+ sa.Column("field", sa.String(), nullable=True),
+ sa.Column("regular_expression", sa.String(), nullable=True),
+ sa.Column("global_find", sa.Boolean(), nullable=True),
+ sa.Column("enabled", sa.Boolean(), nullable=True),
+ sa.Column("search_vector", TSVECTOR, nullable=True),
+ sa.Column("project_id", sa.Integer(), nullable=True),
+ sa.Column("updated_at", sa.DateTime(), nullable=True),
+ sa.Column("created_at", sa.DateTime(), nullable=True),
+ sa.ForeignKeyConstraint(["project_id"], ["project.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("id"),
+ sa.UniqueConstraint("name", "project_id"),
+ )
+ op.create_table(
+ "assoc_signal_entity_types",
+ sa.Column("signal_id", sa.Integer(), nullable=False),
+ sa.Column("entity_type_id", sa.Integer(), nullable=False),
+ sa.ForeignKeyConstraint(["signal_id"], ["signal.id"], ondelete="CASCADE"),
+ sa.ForeignKeyConstraint(["entity_type_id"], ["entity_type.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("signal_id", "entity_type_id"),
+ )
+ op.create_index(
+ "entity_type_search_vector_idx",
+ "entity_type",
+ ["search_vector"],
+ unique=False,
+ postgresql_using="gin",
+ )
+
+ # Entity
+ op.create_table(
+ "entity",
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("name", sa.String(), nullable=True),
+ sa.Column("description", sa.String(), nullable=True),
+ sa.Column("value", sa.String(), nullable=True),
+ sa.Column("source", sa.Boolean(), nullable=True),
+ sa.Column("entity_type_id", sa.Integer(), nullable=False),
+ sa.Column("search_vector", TSVECTOR, nullable=True),
+ sa.Column("project_id", sa.Integer(), nullable=True),
+ sa.Column("updated_at", sa.DateTime(), nullable=True),
+ sa.Column("created_at", sa.DateTime(), nullable=True),
+ sa.ForeignKeyConstraint(
+ ["entity_type_id"],
+ ["entity_type.id"],
+ ),
+ sa.ForeignKeyConstraint(["project_id"], ["project.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("id"),
+ sa.UniqueConstraint("name", "project_id"),
+ )
+ op.create_index(
+ "ix_entity_search_vector",
+ "entity",
+ ["search_vector"],
+ unique=False,
+ postgresql_using="gin",
+ )
+ op.create_table(
+ "assoc_signal_instance_entities",
+ sa.Column("signal_instance_id", UUID(), nullable=False),
+ sa.Column("entity_id", sa.Integer(), nullable=False),
+ sa.ForeignKeyConstraint(["signal_instance_id"], ["signal_instance.id"], ondelete="CASCADE"),
+ sa.ForeignKeyConstraint(["entity_id"], ["entity.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("signal_instance_id", "entity_id"),
+ )
+
+
+def downgrade():
+    # ### commands auto generated by Alembic - please adjust! ###
+    # drop the association tables first so no foreign keys reference entity/entity_type
+    op.drop_table("assoc_signal_instance_entities")
+    op.drop_table("assoc_signal_entity_types")
+    op.drop_index("ix_entity_search_vector", table_name="entity", postgresql_using="gin")
+    op.drop_table("entity")
+    op.drop_index(
+        "entity_type_search_vector_idx", table_name="entity_type", postgresql_using="gin"
+    )
+    op.drop_table("entity_type")
+    # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-02-13_93b517de08e2.py b/src/dispatch/database/revisions/tenant/versions/2023-02-13_93b517de08e2.py
new file mode 100644
index 000000000000..5affe963cd87
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-02-13_93b517de08e2.py
@@ -0,0 +1,42 @@
+"""Allows many to many signal filters
+
+Revision ID: 93b517de08e2
+Revises: b168b50764c7
+Create Date: 2023-02-13 15:19:36.921571
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "93b517de08e2"
+down_revision = "b168b50764c7"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.create_table(
+ "assoc_signal_filters",
+ sa.Column("signal_id", sa.Integer(), nullable=False),
+ sa.Column("signal_filter_id", sa.Integer(), nullable=False),
+ sa.ForeignKeyConstraint(["signal_filter_id"], ["signal_filter.id"], ondelete="CASCADE"),
+ sa.ForeignKeyConstraint(["signal_id"], ["signal.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("signal_id", "signal_filter_id"),
+ )
+ op.drop_constraint("signal_filter_signal_id_fkey", "signal_filter", type_="foreignkey")
+ op.drop_column("signal_filter", "signal_id")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column(
+ "signal_filter", sa.Column("signal_id", sa.INTEGER(), autoincrement=False, nullable=True)
+ )
+ op.create_foreign_key(
+ "signal_filter_signal_id_fkey", "signal_filter", "signal", ["signal_id"], ["id"]
+ )
+ op.drop_table("assoc_signal_filters")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-02-13_b168b50764c7.py b/src/dispatch/database/revisions/tenant/versions/2023-02-13_b168b50764c7.py
new file mode 100644
index 000000000000..3a8821872b22
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-02-13_b168b50764c7.py
@@ -0,0 +1,254 @@
+"""Moves signal processing to filter approach.
+
+Revision ID: b168b50764c7
+Revises: 8746b4e292d2
+Create Date: 2023-02-13 13:56:48.032074
+
+"""
+from alembic import op
+import sqlalchemy as sa
+import sqlalchemy_utils
+from sqlalchemy.dialects import postgresql
+
+# revision identifiers, used by Alembic.
+revision = "b168b50764c7"
+down_revision = "8746b4e292d2"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.create_table(
+ "signal_filter",
+ sa.Column("evergreen", sa.Boolean(), nullable=True),
+ sa.Column("evergreen_owner", sa.String(), nullable=True),
+ sa.Column("evergreen_reminder_interval", sa.Integer(), nullable=True),
+ sa.Column("evergreen_last_reminder_at", sa.DateTime(), nullable=True),
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("name", sa.String(), nullable=True),
+ sa.Column("description", sa.String(), nullable=True),
+ sa.Column("expression", sa.JSON(), nullable=False),
+ sa.Column("mode", sa.String(), nullable=False),
+ sa.Column("action", sa.String(), nullable=False),
+ sa.Column("expiration", sa.DateTime(), nullable=True),
+ sa.Column("window", sa.Integer(), nullable=True),
+ sa.Column("signal_id", sa.Integer(), nullable=True),
+ sa.Column("creator_id", sa.Integer(), nullable=True),
+ sa.Column("search_vector", sqlalchemy_utils.types.ts_vector.TSVectorType(), nullable=True),
+ sa.Column("project_id", sa.Integer(), nullable=True),
+ sa.Column("created_at", sa.DateTime(), nullable=True),
+ sa.Column("updated_at", sa.DateTime(), nullable=True),
+ sa.ForeignKeyConstraint(
+ ["creator_id"],
+ ["dispatch_core.dispatch_user.id"],
+ ),
+ sa.ForeignKeyConstraint(["project_id"], ["project.id"], ondelete="CASCADE"),
+ sa.ForeignKeyConstraint(
+ ["signal_id"],
+ ["signal.id"],
+ ),
+ sa.PrimaryKeyConstraint("id"),
+ sa.UniqueConstraint("name", "project_id"),
+ )
+ op.create_index(
+ "signal_filter_search_vector_idx",
+ "signal_filter",
+ ["search_vector"],
+ unique=False,
+ postgresql_using="gin",
+ )
+ op.drop_constraint("signal_suppression_rule_id_fkey", "signal", type_="foreignkey")
+ op.drop_constraint("signal_duplication_rule_id_fkey", "signal", type_="foreignkey")
+ op.drop_constraint(
+ "signal_instance_duplication_rule_id_fkey", "signal_instance", type_="foreignkey"
+ )
+ op.drop_constraint(
+ "signal_instance_suppression_rule_id_fkey", "signal_instance", type_="foreignkey"
+ )
+ op.drop_table("assoc_duplication_rule_tag_types")
+ op.drop_table("assoc_suppression_rule_tags")
+ op.drop_table("assoc_signal_instance_tags")
+ op.drop_table("duplication_rule")
+ op.drop_table("suppression_rule")
+ op.drop_column("signal", "suppression_rule_id")
+ op.drop_column("signal", "duplication_rule_id")
+ op.add_column("signal_instance", sa.Column("filter_action", sa.String(), nullable=True))
+ op.drop_column("signal_instance", "suppression_rule_id")
+ op.drop_column("signal_instance", "duplication_rule_id")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+    # ### commands auto generated by Alembic - please adjust! ###
+    # recreate the dropped tables first so the restored foreign keys have targets
+    op.create_table(
+        "suppression_rule",
+        sa.Column("evergreen", sa.BOOLEAN(), autoincrement=False, nullable=True),
+        sa.Column("evergreen_owner", sa.VARCHAR(), autoincrement=False, nullable=True),
+        sa.Column("evergreen_reminder_interval", sa.INTEGER(), autoincrement=False, nullable=True),
+        sa.Column(
+            "evergreen_last_reminder_at", postgresql.TIMESTAMP(), autoincrement=False, nullable=True
+        ),
+        sa.Column("id", sa.INTEGER(), autoincrement=True, nullable=False),
+        sa.Column("mode", sa.VARCHAR(), autoincrement=False, nullable=False),
+        sa.Column("expiration", postgresql.TIMESTAMP(), autoincrement=False, nullable=True),
+        sa.Column("project_id", sa.INTEGER(), autoincrement=False, nullable=True),
+        sa.ForeignKeyConstraint(
+            ["project_id"],
+            ["project.id"],
+            name="suppression_rule_project_id_fkey",
+            ondelete="CASCADE",
+        ),
+        sa.PrimaryKeyConstraint("id", name="suppression_rule_pkey"),
+    )
+    op.create_table(
+        "duplication_rule",
+        sa.Column("evergreen", sa.BOOLEAN(), autoincrement=False, nullable=True),
+        sa.Column("evergreen_owner", sa.VARCHAR(), autoincrement=False, nullable=True),
+        sa.Column("evergreen_reminder_interval", sa.INTEGER(), autoincrement=False, nullable=True),
+        sa.Column(
+            "evergreen_last_reminder_at", postgresql.TIMESTAMP(), autoincrement=False, nullable=True
+        ),
+        sa.Column("id", sa.INTEGER(), autoincrement=True, nullable=False),
+        sa.Column("mode", sa.VARCHAR(), autoincrement=False, nullable=False),
+        sa.Column("project_id", sa.INTEGER(), autoincrement=False, nullable=True),
+        sa.Column("window", sa.INTEGER(), autoincrement=False, nullable=True),
+        sa.ForeignKeyConstraint(
+            ["project_id"],
+            ["project.id"],
+            name="duplication_rule_project_id_fkey",
+            ondelete="CASCADE",
+        ),
+        sa.PrimaryKeyConstraint("id", name="duplication_rule_pkey"),
+    )
+    op.create_table(
+        "assoc_suppression_rule_tags",
+        sa.Column("suppression_rule_id", sa.INTEGER(), autoincrement=False, nullable=False),
+        sa.Column("tag_id", sa.INTEGER(), autoincrement=False, nullable=False),
+        sa.ForeignKeyConstraint(
+            ["suppression_rule_id"],
+            ["suppression_rule.id"],
+            name="assoc_suppression_rule_tags_suppression_rule_id_fkey",
+            ondelete="CASCADE",
+        ),
+        sa.ForeignKeyConstraint(
+            ["tag_id"],
+            ["tag.id"],
+            name="assoc_suppression_rule_tags_tag_id_fkey",
+            ondelete="CASCADE",
+        ),
+        sa.PrimaryKeyConstraint(
+            "suppression_rule_id", "tag_id", name="assoc_suppression_rule_tags_pkey"
+        ),
+    )
+    op.create_table(
+        "assoc_duplication_rule_tag_types",
+        sa.Column("duplication_rule_id", sa.INTEGER(), autoincrement=False, nullable=False),
+        sa.Column("tag_type_id", sa.INTEGER(), autoincrement=False, nullable=False),
+        sa.ForeignKeyConstraint(
+            ["duplication_rule_id"],
+            ["duplication_rule.id"],
+            name="assoc_duplication_rule_tag_types_duplication_rule_id_fkey",
+            ondelete="CASCADE",
+        ),
+        sa.ForeignKeyConstraint(
+            ["tag_type_id"],
+            ["tag_type.id"],
+            name="assoc_duplication_rule_tag_types_tag_type_id_fkey",
+            ondelete="CASCADE",
+        ),
+        sa.PrimaryKeyConstraint(
+            "duplication_rule_id", "tag_type_id", name="assoc_duplication_rule_tag_types_pkey"
+        ),
+    )
+    op.create_table(
+        "assoc_signal_instance_tags",
+        sa.Column("signal_instance_id", postgresql.UUID(), autoincrement=False, nullable=False),
+        sa.Column("tag_id", sa.INTEGER(), autoincrement=False, nullable=False),
+        sa.ForeignKeyConstraint(
+            ["signal_instance_id"],
+            ["signal_instance.id"],
+            name="assoc_signal_instance_tags_signal_instance_id_fkey",
+            ondelete="CASCADE",
+        ),
+        sa.ForeignKeyConstraint(
+            ["tag_id"],
+            ["tag.id"],
+            name="assoc_signal_instance_tags_tag_id_fkey",
+            ondelete="CASCADE",
+        ),
+        sa.PrimaryKeyConstraint(
+            "signal_instance_id", "tag_id", name="assoc_signal_instance_tags_pkey"
+        ),
+    )
+    op.add_column(
+        "signal", sa.Column("duplication_rule_id", sa.INTEGER(), autoincrement=False, nullable=True)
+    )
+    op.add_column(
+        "signal", sa.Column("suppression_rule_id", sa.INTEGER(), autoincrement=False, nullable=True)
+    )
+    op.create_foreign_key(
+        "signal_duplication_rule_id_fkey",
+        "signal",
+        "duplication_rule",
+        ["duplication_rule_id"],
+        ["id"],
+    )
+    op.create_foreign_key(
+        "signal_suppression_rule_id_fkey",
+        "signal",
+        "suppression_rule",
+        ["suppression_rule_id"],
+        ["id"],
+    )
+    op.add_column(
+        "signal_instance",
+        sa.Column("duplication_rule_id", sa.INTEGER(), autoincrement=False, nullable=True),
+    )
+    op.add_column(
+        "signal_instance",
+        sa.Column("suppression_rule_id", sa.INTEGER(), autoincrement=False, nullable=True),
+    )
+    op.create_foreign_key(
+        "signal_instance_suppression_rule_id_fkey",
+        "signal_instance",
+        "suppression_rule",
+        ["suppression_rule_id"],
+        ["id"],
+    )
+    op.create_foreign_key(
+        "signal_instance_duplication_rule_id_fkey",
+        "signal_instance",
+        "duplication_rule",
+        ["duplication_rule_id"],
+        ["id"],
+    )
+    op.drop_column("signal_instance", "filter_action")
+    op.drop_index(
+        "signal_filter_search_vector_idx", table_name="signal_filter", postgresql_using="gin"
+    )
+    op.drop_table("signal_filter")
+    # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-02-27_20d3801ad5b7.py b/src/dispatch/database/revisions/tenant/versions/2023-02-27_20d3801ad5b7.py
new file mode 100644
index 000000000000..3d58de331f0d
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-02-27_20d3801ad5b7.py
@@ -0,0 +1,26 @@
+"""Adds external id for tags
+
+Revision ID: 20d3801ad5b7
+Revises: 93b517de08e2
+Create Date: 2023-02-27 15:38:07.029395
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "20d3801ad5b7"
+down_revision = "93b517de08e2"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("tag", sa.Column("external_id", sa.String(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("tag", "external_id")
+    # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-03-02_5955ed5f76b4.py b/src/dispatch/database/revisions/tenant/versions/2023-03-02_5955ed5f76b4.py
new file mode 100644
index 000000000000..81f496ae68dd
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-03-02_5955ed5f76b4.py
@@ -0,0 +1,27 @@
+"""Adds the ability to toggle signal definitions.
+
+Revision ID: 5955ed5f76b4
+Revises: 20d3801ad5b7
+Create Date: 2023-03-02 11:12:18.168787
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "5955ed5f76b4"
+down_revision = "20d3801ad5b7"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("signal", sa.Column("enabled", sa.Boolean(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("signal", "enabled")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-03-02_65db2acae3ea.py b/src/dispatch/database/revisions/tenant/versions/2023-03-02_65db2acae3ea.py
new file mode 100644
index 000000000000..fee5fc356ede
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-03-02_65db2acae3ea.py
@@ -0,0 +1,26 @@
+"""Adds unique constraint in task model
+
+Revision ID: 65db2acae3ea
+Revises: 5955ed5f76b4
+Create Date: 2023-03-02 11:16:27.810635
+
+"""
+from alembic import op
+
+# revision identifiers, used by Alembic.
+revision = "65db2acae3ea"
+down_revision = "5955ed5f76b4"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+    op.create_unique_constraint(
+        "task_resource_id_incident_id_key", "task", ["resource_id", "incident_id"]
+    )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+    # drop_constraint requires an explicit name; this matches PostgreSQL's default
+    op.drop_constraint("task_resource_id_incident_id_key", "task", type_="unique")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-03-03_7ddae3ba7822.py b/src/dispatch/database/revisions/tenant/versions/2023-03-03_7ddae3ba7822.py
new file mode 100644
index 000000000000..3b14aeff3334
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-03-03_7ddae3ba7822.py
@@ -0,0 +1,30 @@
+"""Adds signal triggers
+
+Revision ID: 7ddae3ba7822
+Revises: 65db2acae3ea
+Create Date: 2023-03-03 10:16:57.890859
+
+"""
+from alembic import op
+import sqlalchemy as sa
+from sqlalchemy.schema import MetaData
+
+from dispatch.search.fulltext import sync_trigger
+
+# revision identifiers, used by Alembic.
+revision = "7ddae3ba7822"
+down_revision = "65db2acae3ea"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ conn = op.get_context().connection
+ metadata = MetaData(schema=conn.dialect.default_schema_name)
+ metadata.bind = conn
+ table = sa.Table("signal", metadata, autoload_with=conn)
+ sync_trigger(conn, table, "search_vector", ["name", "description", "variant"])
+
+
+def downgrade():
+ pass
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-03-08_1d21b28bc553.py b/src/dispatch/database/revisions/tenant/versions/2023-03-08_1d21b28bc553.py
new file mode 100644
index 000000000000..8fd558ad2476
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-03-08_1d21b28bc553.py
@@ -0,0 +1,28 @@
+"""Adds ondelete cascade to case table for participant FK
+
+Revision ID: 1d21b28bc553
+Revises: 1ded4c2a7801
+Create Date: 2023-03-08 10:26:05.472740
+
+"""
+from alembic import op
+
+# revision identifiers, used by Alembic.
+revision = "1d21b28bc553"
+down_revision = "1ded4c2a7801"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint("case_assignee_id_fkey", "case", type_="foreignkey")
+ op.create_foreign_key(None, "case", "participant", ["assignee_id"], ["id"], ondelete="CASCADE")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+    # the unnamed FK created in upgrade() receives PostgreSQL's default name
+    op.drop_constraint("case_assignee_id_fkey", "case", type_="foreignkey")
+ op.create_foreign_key("case_assignee_id_fkey", "case", "participant", ["assignee_id"], ["id"])
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-03-08_1ded4c2a7801.py b/src/dispatch/database/revisions/tenant/versions/2023-03-08_1ded4c2a7801.py
new file mode 100644
index 000000000000..eb85fab74ce8
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-03-08_1ded4c2a7801.py
@@ -0,0 +1,41 @@
+"""Adds ondelete cascade to participant table for individual_contact FK
+
+Revision ID: 1ded4c2a7801
+Revises: 7ddae3ba7822
+Create Date: 2023-03-08 10:18:03.250210
+
+"""
+from alembic import op
+
+# revision identifiers, used by Alembic.
+revision = "1ded4c2a7801"
+down_revision = "7ddae3ba7822"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint("participant_individual_contact_id_fkey", "participant", type_="foreignkey")
+ op.create_foreign_key(
+ None,
+ "participant",
+ "individual_contact",
+ ["individual_contact_id"],
+ ["id"],
+ ondelete="CASCADE",
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+    # the unnamed FK created in upgrade() receives PostgreSQL's default name
+    op.drop_constraint(
+        "participant_individual_contact_id_fkey", "participant", type_="foreignkey"
+    )
+ op.create_foreign_key(
+ "participant_individual_contact_id_fkey",
+ "participant",
+ "individual_contact",
+ ["individual_contact_id"],
+ ["id"],
+ )
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-03-09_7db13bf5c5d7.py b/src/dispatch/database/revisions/tenant/versions/2023-03-09_7db13bf5c5d7.py
new file mode 100644
index 000000000000..49be82954051
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-03-09_7db13bf5c5d7.py
@@ -0,0 +1,31 @@
+"""Adds missing tag search columns
+
+Revision ID: 7db13bf5c5d7
+Revises: ec78c132ab93
+Create Date: 2023-03-09 16:31:22.963497
+
+"""
+from alembic import op
+import sqlalchemy as sa
+from sqlalchemy.schema import MetaData
+
+from dispatch.search.fulltext import sync_trigger
+
+
+# revision identifiers, used by Alembic.
+revision = "7db13bf5c5d7"
+down_revision = "ec78c132ab93"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ conn = op.get_context().connection
+ metadata = MetaData(schema=conn.dialect.default_schema_name)
+ metadata.bind = conn
+ table = sa.Table("tag", metadata, autoload_with=conn)
+ sync_trigger(conn, table, "search_vector", ["name", "description", "external_id"])
+
+
+def downgrade():
+ pass
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-03-09_ec78c132ab93.py b/src/dispatch/database/revisions/tenant/versions/2023-03-09_ec78c132ab93.py
new file mode 100644
index 000000000000..39721e873587
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-03-09_ec78c132ab93.py
@@ -0,0 +1,27 @@
+"""Adds case resolution reason
+
+Revision ID: ec78c132ab93
+Revises: 1d21b28bc553
+Create Date: 2023-03-09 10:16:54.885960
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "ec78c132ab93"
+down_revision = "1d21b28bc553"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("case", sa.Column("resolution_reason", sa.String(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("case", "resolution_reason")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-03-14_b48b245f9c4c.py b/src/dispatch/database/revisions/tenant/versions/2023-03-14_b48b245f9c4c.py
new file mode 100644
index 000000000000..6034965f4d78
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-03-14_b48b245f9c4c.py
@@ -0,0 +1,34 @@
+"""Adds assoc_signal_workflows table associating signals with workflows
+
+Revision ID: b48b245f9c4c
+Revises: 7db13bf5c5d7
+Create Date: 2023-03-14 16:56:11.844768
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "b48b245f9c4c"
+down_revision = "7db13bf5c5d7"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.create_table(
+ "assoc_signal_workflows",
+ sa.Column("signal_id", sa.Integer(), nullable=False),
+ sa.Column("workflow_id", sa.Integer(), nullable=False),
+ sa.ForeignKeyConstraint(["signal_id"], ["signal.id"], ondelete="CASCADE"),
+ sa.ForeignKeyConstraint(["workflow_id"], ["workflow.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("signal_id", "workflow_id"),
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_table("assoc_signal_workflows")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-03-24_1dafcb9ad889.py b/src/dispatch/database/revisions/tenant/versions/2023-03-24_1dafcb9ad889.py
new file mode 100644
index 000000000000..cafbae49c89b
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-03-24_1dafcb9ad889.py
@@ -0,0 +1,31 @@
+"""Associate WorkflowInstances with Signals
+
+Revision ID: 1dafcb9ad889
+Revises: b48b245f9c4c
+Create Date: 2023-03-24 10:51:26.443920
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "1dafcb9ad889"
+down_revision = "b48b245f9c4c"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("workflow_instance", sa.Column("signal_id", sa.Integer(), nullable=True))
+ op.create_foreign_key(
+ None, "workflow_instance", "signal", ["signal_id"], ["id"], ondelete="CASCADE"
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint("workflow_instance_signal_id_fkey", "workflow_instance", type_="foreignkey")  # assumes PostgreSQL's default FK name
+ op.drop_column("workflow_instance", "signal_id")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-03-27_d089d1d110f0.py b/src/dispatch/database/revisions/tenant/versions/2023-03-27_d089d1d110f0.py
new file mode 100644
index 000000000000..74423697fadb
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-03-27_d089d1d110f0.py
@@ -0,0 +1,33 @@
+"""Adds signal case override
+
+Revision ID: d089d1d110f0
+Revises: d1b5ed66d83d
+Create Date: 2023-03-27 16:16:04.098781
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "d089d1d110f0"
+down_revision = "d1b5ed66d83d"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("signal", sa.Column("create_case", sa.Boolean(), nullable=True))
+ op.add_column("signal", sa.Column("conversation_target", sa.String(), nullable=True))
+ op.add_column("signal", sa.Column("oncall_service_id", sa.Integer(), nullable=True))
+ op.create_foreign_key(None, "signal", "service", ["oncall_service_id"], ["id"])
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint("signal_oncall_service_id_fkey", "signal", type_="foreignkey")  # assumes PostgreSQL's default FK name
+ op.drop_column("signal", "oncall_service_id")
+ op.drop_column("signal", "conversation_target")
+ op.drop_column("signal", "create_case")
+ # ### end Alembic commands ###
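Autogenerated migrations can emit `op.drop_constraint(None, ...)`, which fails at runtime because Alembic needs an explicit constraint name. One way to avoid that class of problem is to give the application's `MetaData` a naming convention so every autogenerated constraint has a deterministic name. A minimal sketch — the convention string mirrors PostgreSQL's default `<table>_<column>_fkey` pattern and is not Dispatch's actual configuration:

```python
# Sketch: a naming convention on MetaData gives constraints deterministic
# names, so downgrades can reference them explicitly. The "fk" template below
# mirrors PostgreSQL's default naming; it is an assumption for illustration.
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql

convention = {"fk": "%(table_name)s_%(column_0_name)s_fkey"}
metadata = sa.MetaData(naming_convention=convention)

sa.Table("service", metadata, sa.Column("id", sa.Integer, primary_key=True))
signal = sa.Table(
    "signal",
    metadata,
    sa.Column("id", sa.Integer, primary_key=True),
    sa.Column("oncall_service_id", sa.Integer, sa.ForeignKey("service.id")),
)

# The FK constraint name is now predictable in the emitted DDL.
ddl = str(sa.schema.CreateTable(signal).compile(dialect=postgresql.dialect()))
print(ddl)
```

With a convention in place, `alembic revision --autogenerate` renders the names into both `upgrade()` and `downgrade()`, so neither direction depends on database defaults.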
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-03-27_d1b5ed66d83d.py b/src/dispatch/database/revisions/tenant/versions/2023-03-27_d1b5ed66d83d.py
new file mode 100644
index 000000000000..86f0ab1ba735
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-03-27_d1b5ed66d83d.py
@@ -0,0 +1,31 @@
+"""Adds email search.
+
+Revision ID: d1b5ed66d83d
+Revises: 1dafcb9ad889
+Create Date: 2023-03-27 08:57:44.499535
+
+"""
+from alembic import op
+import sqlalchemy as sa
+from sqlalchemy.schema import MetaData
+
+from dispatch.search.fulltext import sync_trigger
+
+
+# revision identifiers, used by Alembic.
+revision = "d1b5ed66d83d"
+down_revision = "1dafcb9ad889"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ conn = op.get_context().connection
+ metadata = MetaData(schema=conn.dialect.default_schema_name)
+ metadata.bind = conn
+ table = sa.Table("individual_contact", metadata, autoload_with=conn)
+ sync_trigger(conn, table, "search_vector", ["name", "title", "email", "company", "notes"])
+
+
+def downgrade():
+ pass
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-04-10_61a861559de9.py b/src/dispatch/database/revisions/tenant/versions/2023-04-10_61a861559de9.py
new file mode 100644
index 000000000000..dc27637484a4
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-04-10_61a861559de9.py
@@ -0,0 +1,35 @@
+"""Adds search filter and entity triggers
+
+Revision ID: 61a861559de9
+Revises: d089d1d110f0
+Create Date: 2023-04-10 11:57:54.896708
+
+"""
+from alembic import op
+import sqlalchemy as sa
+from sqlalchemy.schema import MetaData
+
+from dispatch.search.fulltext import sync_trigger
+
+# revision identifiers, used by Alembic.
+revision = "61a861559de9"
+down_revision = "d089d1d110f0"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ conn = op.get_context().connection
+ metadata = MetaData(bind=conn, schema=conn.dialect.default_schema_name)
+ table = sa.Table("entity_type", metadata, autoload=True)
+ sync_trigger(conn, table, "search_vector", ["name", "description"])
+ table = sa.Table("signal_filter", metadata, autoload=True)
+ sync_trigger(conn, table, "search_vector", ["name", "description"])
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ pass
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-04-14_38a7fb709167.py b/src/dispatch/database/revisions/tenant/versions/2023-04-14_38a7fb709167.py
new file mode 100644
index 000000000000..0f2f33b879ca
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-04-14_38a7fb709167.py
@@ -0,0 +1,77 @@
+"""Adds SignalEngagement table and associates with Signal table
+
+Revision ID: 38a7fb709167
+Revises: 61a861559de9
+Create Date: 2023-04-14 15:17:04.638238
+
+"""
+from alembic import op
+import sqlalchemy as sa
+import sqlalchemy_utils
+
+# revision identifiers, used by Alembic.
+revision = "38a7fb709167"
+down_revision = "61a861559de9"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.create_table(
+ "signal_engagement",
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("name", sa.String(), nullable=True),
+ sa.Column("description", sa.String(), nullable=True),
+ sa.Column("message", sa.String(), nullable=True),
+ sa.Column("require_mfa", sa.Boolean(), nullable=True),
+ sa.Column("entity_type_id", sa.Integer(), nullable=True),
+ sa.Column("creator_id", sa.Integer(), nullable=True),
+ sa.Column("search_vector", sqlalchemy_utils.types.ts_vector.TSVectorType(), nullable=True),
+ sa.Column("project_id", sa.Integer(), nullable=True),
+ sa.Column("created_at", sa.DateTime(), nullable=True),
+ sa.Column("updated_at", sa.DateTime(), nullable=True),
+ sa.ForeignKeyConstraint(
+ ["creator_id"],
+ ["dispatch_core.dispatch_user.id"],
+ ),
+ sa.ForeignKeyConstraint(
+ ["entity_type_id"],
+ ["entity_type.id"],
+ ),
+ sa.ForeignKeyConstraint(["project_id"], ["project.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("id"),
+ sa.UniqueConstraint("name", "project_id"),
+ )
+ op.create_index(
+ "signal_engagement_search_vector_idx",
+ "signal_engagement",
+ ["search_vector"],
+ unique=False,
+ postgresql_using="gin",
+ )
+ op.create_table(
+ "assoc_signal_engagements",
+ sa.Column("signal_id", sa.Integer(), nullable=False),
+ sa.Column("signal_engagement_id", sa.Integer(), nullable=False),
+ sa.ForeignKeyConstraint(
+ ["signal_engagement_id"], ["signal_engagement.id"], ondelete="CASCADE"
+ ),
+ sa.ForeignKeyConstraint(["signal_id"], ["signal.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("signal_id", "signal_engagement_id"),
+ )
+ op.add_column("signal_instance", sa.Column("engagement_thread_ts", sa.String(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("signal_instance", "engagement_thread_ts")
+ op.drop_table("assoc_signal_engagements")
+ op.drop_index(
+ "signal_engagement_search_vector_idx",
+ table_name="signal_engagement",
+ postgresql_using="gin",
+ )
+ op.drop_table("signal_engagement")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-04-20_56eb1c0a3a92.py b/src/dispatch/database/revisions/tenant/versions/2023-04-20_56eb1c0a3a92.py
new file mode 100644
index 000000000000..7a8a54e6c82d
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-04-20_56eb1c0a3a92.py
@@ -0,0 +1,33 @@
+"""Adds case override information to signal instances
+
+Revision ID: 56eb1c0a3a92
+Revises: 38a7fb709167
+Create Date: 2023-04-20 12:42:17.734488
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "56eb1c0a3a92"
+down_revision = "38a7fb709167"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("signal_instance", sa.Column("case_type_id", sa.Integer(), nullable=True))
+ op.add_column("signal_instance", sa.Column("case_priority_id", sa.Integer(), nullable=True))
+ op.create_foreign_key(None, "signal_instance", "case_priority", ["case_priority_id"], ["id"])
+ op.create_foreign_key(None, "signal_instance", "case_type", ["case_type_id"], ["id"])
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint("signal_instance_case_priority_id_fkey", "signal_instance", type_="foreignkey")  # assumes PostgreSQL's default FK names
+ op.drop_constraint("signal_instance_case_type_id_fkey", "signal_instance", type_="foreignkey")
+ op.drop_column("signal_instance", "case_priority_id")
+ op.drop_column("signal_instance", "case_type_id")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-04-20_930eb80028d2.py b/src/dispatch/database/revisions/tenant/versions/2023-04-20_930eb80028d2.py
new file mode 100644
index 000000000000..b79d8d03f372
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-04-20_930eb80028d2.py
@@ -0,0 +1,27 @@
+"""Adds a thread_ts column for the signal message in a case so it can be updated by the conversation plugin
+
+Revision ID: 930eb80028d2
+Revises: 56eb1c0a3a92
+Create Date: 2023-04-20 16:55:33.901449
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "930eb80028d2"
+down_revision = "56eb1c0a3a92"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("case", sa.Column("signal_thread_ts", sa.String(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("case", "signal_thread_ts")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-05-17_9ad021045e45.py b/src/dispatch/database/revisions/tenant/versions/2023-05-17_9ad021045e45.py
new file mode 100644
index 000000000000..ac0114d29f1c
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-05-17_9ad021045e45.py
@@ -0,0 +1,31 @@
+"""Adds timestamps to the SearchFilter model
+
+Revision ID: 9ad021045e45
+Revises: 930eb80028d2
+Create Date: 2023-05-17 11:30:33.542458
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "9ad021045e45"
+down_revision = "930eb80028d2"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("search_filter", sa.Column("enabled", sa.Boolean(), nullable=True))
+ op.add_column("search_filter", sa.Column("updated_at", sa.DateTime(), nullable=True))
+ op.add_column("search_filter", sa.Column("created_at", sa.DateTime(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("search_filter", "created_at")
+ op.drop_column("search_filter", "updated_at")
+ op.drop_column("search_filter", "enabled")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-06-05_35a2ad8b53a2.py b/src/dispatch/database/revisions/tenant/versions/2023-06-05_35a2ad8b53a2.py
new file mode 100644
index 000000000000..8932d4f8d9ea
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-06-05_35a2ad8b53a2.py
@@ -0,0 +1,28 @@
+"""Remove constraint
+
+Revision ID: 35a2ad8b53a2
+Revises: bad95cd82724
+Create Date: 2023-06-05 16:02:46.337434
+
+"""
+from alembic import op
+
+# revision identifiers, used by Alembic.
+revision = "35a2ad8b53a2"
+down_revision = "bad95cd82724"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint("entity_type_name_project_id_key", "entity_type", type_="unique")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.create_unique_constraint(
+ "entity_type_name_project_id_key", "entity_type", ["name", "project_id"]
+ )
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-06-05_bad95cd82724.py b/src/dispatch/database/revisions/tenant/versions/2023-06-05_bad95cd82724.py
new file mode 100644
index 000000000000..ba8e52eff0df
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-06-05_bad95cd82724.py
@@ -0,0 +1,33 @@
+"""Adds entity type scoping.
+
+Revision ID: bad95cd82724
+Revises: 9ad021045e45
+Create Date: 2023-06-05 08:48:53.486613
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "bad95cd82724"
+down_revision = "9ad021045e45"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("entity_type", sa.Column("scope", sa.String(), nullable=True))
+ op.execute("UPDATE entity_type SET scope = 'multiple'")
+ op.alter_column("entity_type", "scope", nullable=False)
+ op.drop_column("entity_type", "global_find")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column(
+ "entity_type", sa.Column("global_find", sa.BOOLEAN(), autoincrement=False, nullable=True)
+ )
+ op.drop_column("entity_type", "scope")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-06-06_0a19c5a4c2ba.py b/src/dispatch/database/revisions/tenant/versions/2023-06-06_0a19c5a4c2ba.py
new file mode 100644
index 000000000000..6fdc539ee82a
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-06-06_0a19c5a4c2ba.py
@@ -0,0 +1,26 @@
+"""Renames field column to jpath
+
+Revision ID: 0a19c5a4c2ba
+Revises: 35a2ad8b53a2
+Create Date: 2023-06-06 08:54:24.986736
+
+"""
+from alembic import op
+
+# revision identifiers, used by Alembic.
+revision = "0a19c5a4c2ba"
+down_revision = "35a2ad8b53a2"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.alter_column("entity_type", "field", nullable=False, new_column_name="jpath")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.alter_column("entity_type", "jpath", nullable=False, new_column_name="field")
+ # ### end Alembic commands ###
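For context on the rename above: an entity type's `jpath` points at a location inside a signal instance's raw JSON. A minimal dotted-path resolver illustrates the idea — this is not Dispatch's implementation, which uses a JSONPath library with richer syntax:

```python
# Illustrative only: resolve a dotted path (a simplified stand-in for a
# JSONPath expression) against a signal instance's raw payload.
def resolve_jpath(raw: dict, jpath: str):
    node = raw
    for part in jpath.split("."):
        if not isinstance(node, dict) or part not in node:
            return None  # path does not exist in this payload
        node = node[part]
    return node

raw = {"columns": {"path": "/usr/bin/security", "pid": 888}}
print(resolve_jpath(raw, "columns.path"))  # /usr/bin/security
```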
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-07-03_fa23324d5679.py b/src/dispatch/database/revisions/tenant/versions/2023-07-03_fa23324d5679.py
new file mode 100644
index 000000000000..dac0308e5dc9
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-07-03_fa23324d5679.py
@@ -0,0 +1,29 @@
+"""Adds reporter to Case model
+
+Revision ID: fa23324d5679
+Revises: 0a19c5a4c2ba
+Create Date: 2023-07-03 11:12:32.679321
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "fa23324d5679"
+down_revision = "0a19c5a4c2ba"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("case", sa.Column("reporter_id", sa.Integer(), nullable=True))
+ op.create_foreign_key(None, "case", "participant", ["reporter_id"], ["id"], ondelete="CASCADE")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint("case_reporter_id_fkey", "case", type_="foreignkey")  # assumes PostgreSQL's default FK name
+ op.drop_column("case", "reporter_id")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-07-05_a3fb1380cf76.py b/src/dispatch/database/revisions/tenant/versions/2023-07-05_a3fb1380cf76.py
new file mode 100644
index 000000000000..12cc5783b59f
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-07-05_a3fb1380cf76.py
@@ -0,0 +1,33 @@
+"""Adds reminder delay time to Incident model
+
+Revision ID: a3fb1380cf76
+Revises: fa23324d5679
+Create Date: 2023-07-05 14:27:32.239616
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "a3fb1380cf76"
+down_revision = "fa23324d5679"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column(
+ "incident", sa.Column("delay_executive_report_reminder", sa.DateTime(), nullable=True)
+ )
+ op.add_column(
+ "incident", sa.Column("delay_tactical_report_reminder", sa.DateTime(), nullable=True)
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("incident", "delay_tactical_report_reminder")
+ op.drop_column("incident", "delay_executive_report_reminder")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-08-04_6b98c28edd86.py b/src/dispatch/database/revisions/tenant/versions/2023-08-04_6b98c28edd86.py
new file mode 100644
index 000000000000..469856237476
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-08-04_6b98c28edd86.py
@@ -0,0 +1,27 @@
+"""Adds daily report setting to project
+
+Revision ID: 6b98c28edd86
+Revises: a3fb1380cf76
+Create Date: 2023-08-04 17:36:18.646729
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "6b98c28edd86"
+down_revision = "a3fb1380cf76"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("project", sa.Column("send_daily_reports", sa.Boolean(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("project", "send_daily_reports")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-08-10_74d279f6e4f6.py b/src/dispatch/database/revisions/tenant/versions/2023-08-10_74d279f6e4f6.py
new file mode 100644
index 000000000000..e505b453683d
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-08-10_74d279f6e4f6.py
@@ -0,0 +1,27 @@
+"""Adds column to indicate if health metrics need to be collected for the service
+
+Revision ID: 74d279f6e4f6
+Revises: 6b98c28edd86
+Create Date: 2023-03-29 13:51:11.269860
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "74d279f6e4f6"
+down_revision = "6b98c28edd86"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("service", sa.Column("health_metrics", sa.Boolean(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("service", "health_metrics")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-08-11_4e57f5b1f3f3.py b/src/dispatch/database/revisions/tenant/versions/2023-08-11_4e57f5b1f3f3.py
new file mode 100644
index 000000000000..b9acea256dc4
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-08-11_4e57f5b1f3f3.py
@@ -0,0 +1,56 @@
+"""Adds service feedback data model
+
+Revision ID: 4e57f5b1f3f3
+Revises: 74d279f6e4f6
+Create Date: 2023-04-04 09:43:04.586304
+
+"""
+from alembic import op
+import sqlalchemy as sa
+import sqlalchemy_utils
+
+# revision identifiers, used by Alembic.
+revision = "4e57f5b1f3f3"
+down_revision = "74d279f6e4f6"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.create_table(
+ "service_feedback",
+ sa.Column("rating", sa.String(), nullable=True),
+ sa.Column("feedback", sa.String(), nullable=True),
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("hours", sa.Integer(), nullable=True),
+ sa.Column("shift_start_at", sa.DateTime(), nullable=True),
+ sa.Column("shift_end_at", sa.DateTime(), nullable=True),
+ sa.Column("schedule", sa.String(), nullable=True),
+ sa.Column("individual_contact_id", sa.Integer(), nullable=True),
+ sa.Column("search_vector", sqlalchemy_utils.types.ts_vector.TSVectorType(), nullable=True),
+ sa.Column("updated_at", sa.DateTime(), nullable=True),
+ sa.Column("created_at", sa.DateTime(), nullable=True),
+ sa.ForeignKeyConstraint(
+ ["individual_contact_id"],
+ ["individual_contact.id"],
+ ),
+ sa.PrimaryKeyConstraint("id"),
+ )
+ op.create_index(
+ "service_feedback_search_vector_idx",
+ "service_feedback",
+ ["search_vector"],
+ unique=False,
+ postgresql_using="gin",
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_index(
+ "service_feedback_search_vector_idx", table_name="service_feedback", postgresql_using="gin"
+ )
+ op.drop_table("service_feedback")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-09-07_0356472ea980.py b/src/dispatch/database/revisions/tenant/versions/2023-09-07_0356472ea980.py
new file mode 100644
index 000000000000..01a90e835da2
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-09-07_0356472ea980.py
@@ -0,0 +1,29 @@
+"""Adds a stable incident priority reference to the project model
+
+Revision ID: 0356472ea980
+Revises: 4e57f5b1f3f3
+Create Date: 2023-09-01 15:30:52.512886
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "0356472ea980"
+down_revision = "4e57f5b1f3f3"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("project", sa.Column("stable_priority_id", sa.Integer(), nullable=True))
+ op.create_foreign_key(None, "project", "incident_priority", ["stable_priority_id"], ["id"])
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint("project_stable_priority_id_fkey", "project", type_="foreignkey")  # assumes PostgreSQL's default FK name
+ op.drop_column("project", "stable_priority_id")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-09-08_0560fab4537f.py b/src/dispatch/database/revisions/tenant/versions/2023-09-08_0560fab4537f.py
new file mode 100644
index 000000000000..767fc54954fd
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-09-08_0560fab4537f.py
@@ -0,0 +1,47 @@
+"""Adds reminders to oncall shift feedback
+
+Revision ID: 0560fab4537f
+Revises: 0356472ea980
+Create Date: 2023-09-08 17:13:57.903367
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "0560fab4537f"
+down_revision = "0356472ea980"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.create_table(
+ "service_feedback_reminder",
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("reminder_at", sa.DateTime(), nullable=True),
+ sa.Column("schedule_id", sa.String(), nullable=True),
+ sa.Column("schedule_name", sa.String(), nullable=True),
+ sa.Column("shift_end_at", sa.DateTime(), nullable=True),
+ sa.Column("individual_contact_id", sa.Integer(), nullable=True),
+ sa.Column("project_id", sa.Integer(), nullable=True),
+ sa.Column("created_at", sa.DateTime(), nullable=True),
+ sa.Column("updated_at", sa.DateTime(), nullable=True),
+ sa.ForeignKeyConstraint(
+ ["individual_contact_id"],
+ ["individual_contact.id"],
+ ),
+ sa.ForeignKeyConstraint(
+ ["project_id"],
+ ["project.id"],
+ ),
+ sa.PrimaryKeyConstraint("id"),
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_table("service_feedback_reminder")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-09-13_e875e9544048.py b/src/dispatch/database/revisions/tenant/versions/2023-09-13_e875e9544048.py
new file mode 100644
index 000000000000..2522f08015a9
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-09-13_e875e9544048.py
@@ -0,0 +1,26 @@
+"""Adds canary column to signal_instance
+
+Revision ID: e875e9544048
+Revises: 0560fab4537f
+Create Date: 2023-09-13 09:31:24.223488
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "e875e9544048"
+down_revision = "0560fab4537f"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("signal_instance", sa.Column("canary", sa.Boolean(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("signal_instance", "canary")
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-09-15_3538650dc471.py b/src/dispatch/database/revisions/tenant/versions/2023-09-15_3538650dc471.py
new file mode 100644
index 000000000000..a2358b662ce3
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-09-15_3538650dc471.py
@@ -0,0 +1,31 @@
+"""Adds type to events
+
+Revision ID: 3538650dc471
+Revises: e875e9544048
+Create Date: 2023-09-12 13:43:42.539336
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "3538650dc471"
+down_revision = "e875e9544048"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("event", sa.Column("type", sa.String(), nullable=True))
+ op.add_column("event", sa.Column("owner", sa.String(), nullable=True))
+ op.add_column("event", sa.Column("pinned", sa.Boolean(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("event", "type")
+ op.drop_column("event", "pinned")
+ op.drop_column("event", "owner")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-10-06_064c71206256.py b/src/dispatch/database/revisions/tenant/versions/2023-10-06_064c71206256.py
new file mode 100644
index 000000000000..145a3b2eae55
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-10-06_064c71206256.py
@@ -0,0 +1,35 @@
+"""Adds project to the service feedback table and allows hours to be fractional
+
+Revision ID: 064c71206256
+Revises: 3538650dc471
+Create Date: 2023-10-06 09:53:50.119947
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "064c71206256"
+down_revision = "3538650dc471"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("service_feedback", sa.Column("project_id", sa.Integer(), nullable=True))
+ op.create_foreign_key(None, "service_feedback", "project", ["project_id"], ["id"])
+ op.alter_column(
+ "service_feedback", "hours", existing_type=sa.Integer(), type_=sa.Numeric(), nullable=True
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint("service_feedback_project_id_fkey", "service_feedback", type_="foreignkey")  # assumes PostgreSQL's default FK name
+ op.drop_column("service_feedback", "project_id")
+ op.alter_column(
+ "service_feedback", "hours", existing_type=sa.Numeric(), type_=sa.Integer(), nullable=True
+ )
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-10-23_f2605bfc1f59.py b/src/dispatch/database/revisions/tenant/versions/2023-10-23_f2605bfc1f59.py
new file mode 100644
index 000000000000..43a49943f03b
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-10-23_f2605bfc1f59.py
@@ -0,0 +1,41 @@
+"""Adds search vector column to the signal instance model
+
+Revision ID: f2605bfc1f59
+Revises: 064c71206256
+Create Date: 2023-10-23 11:15:53.297563
+
+"""
+from alembic import op
+import sqlalchemy as sa
+import sqlalchemy_utils
+
+# revision identifiers, used by Alembic.
+revision = "f2605bfc1f59"
+down_revision = "064c71206256"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column(
+ "signal_instance",
+ sa.Column("search_vector", sqlalchemy_utils.types.ts_vector.TSVectorType(), nullable=True),
+ )
+ op.create_index(
+ "signal_instance_search_vector_idx",
+ "signal_instance",
+ ["search_vector"],
+ unique=False,
+ postgresql_using="gin",
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_index(
+ "signal_instance_search_vector_idx", table_name="signal_instance", postgresql_using="gin"
+ )
+ op.drop_column("signal_instance", "search_vector")
+ # ### end Alembic commands ###
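The GIN index added here serves PostgreSQL full-text queries against `search_vector`. A sketch of the query shape such an index accelerates, built with SQLAlchemy Core — table and column names come from the migration, but the query itself is illustrative, not Dispatch's code:

```python
# Sketch: the `tsvector @@ tsquery` predicate the GIN index on
# signal_instance.search_vector is built to accelerate.
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql

signal_instance = sa.table("signal_instance", sa.column("search_vector"))
stmt = (
    sa.select(sa.literal_column("*"))
    .select_from(signal_instance)
    .where(signal_instance.c.search_vector.op("@@")(sa.func.to_tsquery("keychain")))
)

# Render the statement as PostgreSQL SQL to inspect the predicate.
sql = str(stmt.compile(dialect=postgresql.dialect()))
print(sql)
```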
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-11-29a_bdaeabba3e53.py b/src/dispatch/database/revisions/tenant/versions/2023-11-29a_bdaeabba3e53.py
new file mode 100644
index 000000000000..9a89179e9ccd
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-11-29a_bdaeabba3e53.py
@@ -0,0 +1,28 @@
+"""Removes search vector from signal instance model
+
+Revision ID: bdaeabba3e53
+Revises: f2605bfc1f59
+Create Date: 2023-11-29 12:59:45.408085
+
+"""
+from alembic import op
+
+# revision identifiers, used by Alembic.
+revision = "bdaeabba3e53"
+down_revision = "f2605bfc1f59"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_index(
+ "signal_instance_search_vector_idx", table_name="signal_instance", postgresql_using="gin"
+ )
+ op.drop_column("signal_instance", "search_vector")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ pass
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-11-29b_580a18ec4c39.py b/src/dispatch/database/revisions/tenant/versions/2023-11-29b_580a18ec4c39.py
new file mode 100644
index 000000000000..77270583353a
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-11-29b_580a18ec4c39.py
@@ -0,0 +1,30 @@
+"""Removes unused fingerprint column from signal instance model
+
+Revision ID: 580a18ec4c39
+Revises: bdaeabba3e53
+Create Date: 2023-11-29 15:40:12.524085
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "580a18ec4c39"
+down_revision = "bdaeabba3e53"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("signal_instance", "fingerprint")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column(
+ "signal_instance",
+ sa.Column("fingerprint", sa.VARCHAR(), autoincrement=False, nullable=True),
+ )
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-12-08_2f06fd73eae6.py b/src/dispatch/database/revisions/tenant/versions/2023-12-08_2f06fd73eae6.py
new file mode 100644
index 000000000000..6b80160e443e
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-12-08_2f06fd73eae6.py
@@ -0,0 +1,27 @@
+"""Adds the engage_next_oncall column to the incident_role table.
+
+Revision ID: 2f06fd73eae6
+Revises: 580a18ec4c39
+Create Date: 2023-12-08 11:22:15.565073
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "2f06fd73eae6"
+down_revision = "580a18ec4c39"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("incident_role", sa.Column("engage_next_oncall", sa.Boolean(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("incident_role", "engage_next_oncall")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-12-09_99d317cbc848.py b/src/dispatch/database/revisions/tenant/versions/2023-12-09_99d317cbc848.py
new file mode 100644
index 000000000000..12c93f07c0b4
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-12-09_99d317cbc848.py
@@ -0,0 +1,63 @@
+"""Adds forms types
+
+Revision ID: 99d317cbc848
+Revises: 2f06fd73eae6
+Create Date: 2023-11-09 16:07:51.636218
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "99d317cbc848"
+down_revision = "2f06fd73eae6"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+    # ### commands auto generated by Alembic - please adjust! ###
+    op.create_table(
+        "forms_type",
+        sa.Column("id", sa.Integer(), nullable=False),
+        sa.Column("name", sa.String(), nullable=True),
+        sa.Column("description", sa.String(), nullable=True),
+        sa.Column("enabled", sa.Boolean(), nullable=True),
+        sa.Column("created_at", sa.DateTime(), nullable=True),
+        sa.Column("updated_at", sa.DateTime(), nullable=True),
+        sa.Column("form_schema", sa.String(), nullable=True),
+        sa.Column("creator_id", sa.Integer(), nullable=True),
+        sa.Column("project_id", sa.Integer(), nullable=True),
+        sa.ForeignKeyConstraint(["creator_id"], ["individual_contact.id"]),
+        sa.ForeignKeyConstraint(["project_id"], ["project.id"], ondelete="CASCADE"),
+        sa.PrimaryKeyConstraint("id"),
+        sa.UniqueConstraint("name", "project_id"),
+    )
+    op.create_table(
+        "forms",
+        sa.Column("id", sa.Integer(), nullable=False),
+        sa.Column("form_data", sa.String(), nullable=True),
+        sa.Column("created_at", sa.DateTime(), nullable=True),
+        sa.Column("updated_at", sa.DateTime(), nullable=True),
+        sa.Column("status", sa.String(), nullable=True),
+        sa.Column("attorney_status", sa.String(), nullable=True),
+        sa.Column("attorney_questions", sa.String(), nullable=True),
+        sa.Column("attorney_analysis", sa.String(), nullable=True),
+        sa.Column("creator_id", sa.Integer(), nullable=True),
+        sa.Column("incident_id", sa.Integer(), nullable=True),
+        sa.Column("form_type_id", sa.Integer(), nullable=True),
+        sa.Column("project_id", sa.Integer(), nullable=True),
+        sa.ForeignKeyConstraint(["creator_id"], ["individual_contact.id"]),
+        sa.ForeignKeyConstraint(["incident_id"], ["incident.id"]),
+        sa.ForeignKeyConstraint(["form_type_id"], ["forms_type.id"]),
+        sa.ForeignKeyConstraint(["project_id"], ["project.id"], ondelete="CASCADE"),
+        sa.PrimaryKeyConstraint("id"),
+    )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+    op.drop_table("forms")
+    op.drop_table("forms_type")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-12-18_6c1a250b1e4b.py b/src/dispatch/database/revisions/tenant/versions/2023-12-18_6c1a250b1e4b.py
new file mode 100644
index 000000000000..024a13894e37
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-12-18_6c1a250b1e4b.py
@@ -0,0 +1,39 @@
+"""Adds new columns to the tag_type table to support new UI
+
+Revision ID: 6c1a250b1e4b
+Revises: 99d317cbc848
+Create Date: 2023-12-18 10:22:13.668372
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "6c1a250b1e4b"
+down_revision = "99d317cbc848"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("tag_type", sa.Column("discoverable_case", sa.Boolean(), nullable=True, server_default="t", default=True))
+ op.add_column("tag_type", sa.Column("discoverable_incident", sa.Boolean(), nullable=True, server_default="t", default=True))
+ op.add_column("tag_type", sa.Column("discoverable_query", sa.Boolean(), nullable=True, server_default="t", default=True))
+ op.add_column("tag_type", sa.Column("discoverable_signal", sa.Boolean(), nullable=True, server_default="t", default=True))
+ op.add_column("tag_type", sa.Column("discoverable_source", sa.Boolean(), nullable=True, server_default="t", default=True))
+ op.add_column("tag_type", sa.Column("color", sa.String(), nullable=True))
+ op.add_column("tag_type", sa.Column("icon", sa.String(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("tag_type", "icon")
+ op.drop_column("tag_type", "color")
+ op.drop_column("tag_type", "discoverable_case")
+ op.drop_column("tag_type", "discoverable_incident")
+ op.drop_column("tag_type", "discoverable_query")
+ op.drop_column("tag_type", "discoverable_signal")
+ op.drop_column("tag_type", "discoverable_source")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2023-12-27_065c59f15267.py b/src/dispatch/database/revisions/tenant/versions/2023-12-27_065c59f15267.py
new file mode 100644
index 000000000000..e17c726a2384
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2023-12-27_065c59f15267.py
@@ -0,0 +1,105 @@
+"""Adds cost model tables: cost_model, cost_model_activity, participant_activity
+
+Revision ID: 065c59f15267
+Revises: 6c1a250b1e4b
+Create Date: 2023-12-27 13:44:18.845443
+
+"""
+from alembic import op
+import sqlalchemy as sa
+import sqlalchemy_utils
+
+# revision identifiers, used by Alembic.
+revision = "065c59f15267"
+down_revision = "6c1a250b1e4b"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.create_table(
+ "cost_model",
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("name", sa.String(), nullable=True),
+ sa.Column("description", sa.String(), nullable=True),
+ sa.Column("enabled", sa.Boolean(), nullable=True),
+ sa.Column("search_vector", sqlalchemy_utils.types.ts_vector.TSVectorType(), nullable=True),
+ sa.Column("project_id", sa.Integer(), nullable=True),
+ sa.Column("created_at", sa.DateTime(), nullable=True),
+ sa.Column("updated_at", sa.DateTime(), nullable=True),
+ sa.ForeignKeyConstraint(["project_id"], ["project.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("id"),
+ sa.UniqueConstraint("name", "project_id"),
+ )
+ op.create_index(
+ "cost_model_search_vector_idx",
+ "cost_model",
+ ["search_vector"],
+ unique=False,
+ postgresql_using="gin",
+ )
+ op.create_table(
+ "cost_model_activity",
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("plugin_event_id", sa.Integer(), nullable=True),
+ sa.Column("response_time_seconds", sa.Integer(), nullable=True),
+ sa.Column("enabled", sa.Boolean(), nullable=True),
+ sa.ForeignKeyConstraint(
+ ["plugin_event_id"], ["dispatch_core.plugin_event.id"], ondelete="CASCADE"
+ ),
+ sa.PrimaryKeyConstraint("id"),
+ )
+ op.create_table(
+ "assoc_cost_model_activities",
+ sa.Column("cost_model_id", sa.Integer(), nullable=False),
+ sa.Column("cost_model_activity_id", sa.Integer(), nullable=False),
+ sa.ForeignKeyConstraint(
+ ["cost_model_activity_id"],
+ ["cost_model_activity.id"],
+ ondelete="CASCADE",
+ ),
+ sa.ForeignKeyConstraint(["cost_model_id"], ["cost_model.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("cost_model_id", "cost_model_activity_id"),
+ )
+ op.create_table(
+ "participant_activity",
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("plugin_event_id", sa.Integer(), nullable=True),
+ sa.Column("started_at", sa.DateTime(), nullable=True),
+ sa.Column("ended_at", sa.DateTime(), nullable=True),
+ sa.Column("participant_id", sa.Integer(), nullable=True),
+ sa.Column("incident_id", sa.Integer(), nullable=True),
+ sa.ForeignKeyConstraint(
+ ["incident_id"],
+ ["incident.id"],
+ ),
+ sa.ForeignKeyConstraint(
+ ["participant_id"],
+ ["participant.id"],
+ ),
+ sa.ForeignKeyConstraint(
+ ["plugin_event_id"],
+ ["dispatch_core.plugin_event.id"],
+ ),
+ sa.PrimaryKeyConstraint("id"),
+ )
+ op.add_column("incident", sa.Column("cost_model_id", sa.Integer(), nullable=True))
+    op.create_foreign_key("incident_cost_model_id_fkey", "incident", "cost_model", ["cost_model_id"], ["id"])
+    # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+    op.drop_constraint("incident_cost_model_id_fkey", "incident", type_="foreignkey")
+ op.drop_column("incident", "cost_model_id")
+ op.drop_table("participant_activity")
+ op.drop_table("assoc_cost_model_activities")
+ op.drop_table("cost_model_activity")
+ op.drop_index(
+ "cost_model_search_vector_idx",
+ table_name="cost_model",
+ postgresql_using="gin",
+ )
+ op.drop_table("cost_model")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-01-17_d4bbb234d0bc.py b/src/dispatch/database/revisions/tenant/versions/2024-01-17_d4bbb234d0bc.py
new file mode 100644
index 000000000000..9c0ea273336f
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-01-17_d4bbb234d0bc.py
@@ -0,0 +1,29 @@
+"""Adds service column to forms
+
+Revision ID: d4bbb234d0bc
+Revises: 065c59f15267
+Create Date: 2024-01-17 09:45:33.493774
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "d4bbb234d0bc"
+down_revision = "065c59f15267"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("forms_type", sa.Column("service_id", sa.Integer(), nullable=True))
+    op.create_foreign_key("forms_type_service_id_fkey", "forms_type", "service", ["service_id"], ["id"])
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+    op.drop_constraint("forms_type_service_id_fkey", "forms_type", type_="foreignkey")
+ op.drop_column("forms_type", "service_id")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-02-13_0283f2bbe9dd.py b/src/dispatch/database/revisions/tenant/versions/2024-02-13_0283f2bbe9dd.py
new file mode 100644
index 000000000000..3f95582d8b4c
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-02-13_0283f2bbe9dd.py
@@ -0,0 +1,28 @@
+"""Adds boolean column default to signal model
+
+Revision ID: 0283f2bbe9dd
+Revises: d4bbb234d0bc
+Create Date: 2024-02-13 15:31:17.975089
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "0283f2bbe9dd"
+down_revision = "d4bbb234d0bc"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("signal", sa.Column("default", sa.Boolean(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("signal", "default")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-02-21_27e6558e26a8.py b/src/dispatch/database/revisions/tenant/versions/2024-02-21_27e6558e26a8.py
new file mode 100644
index 000000000000..d0e053a76e14
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-02-21_27e6558e26a8.py
@@ -0,0 +1,27 @@
+"""Adds required column to tag_type
+
+Revision ID: 27e6558e26a8
+Revises: 0283f2bbe9dd
+Create Date: 2024-02-21 14:08:48.787183
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "27e6558e26a8"
+down_revision = "0283f2bbe9dd"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("tag_type", sa.Column("required", sa.Boolean(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("tag_type", "required")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-03-11_91bd05855ad1.py b/src/dispatch/database/revisions/tenant/versions/2024-03-11_91bd05855ad1.py
new file mode 100644
index 000000000000..0f70337b1937
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-03-11_91bd05855ad1.py
@@ -0,0 +1,41 @@
+"""Adds email template table
+
+Revision ID: 91bd05855ad1
+Revises: 27e6558e26a8
+Create Date: 2024-03-11 10:40:08.313520
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "91bd05855ad1"
+down_revision = "27e6558e26a8"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.create_table(
+ "email_templates",
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("email_template_type", sa.String(), nullable=True),
+ sa.Column("welcome_text", sa.String(), nullable=True),
+ sa.Column("welcome_body", sa.String(), nullable=True),
+ sa.Column("components", sa.String(), nullable=True),
+ sa.Column("enabled", sa.Boolean(), nullable=True),
+ sa.Column("project_id", sa.Integer(), nullable=True),
+ sa.Column("created_at", sa.DateTime(), nullable=True),
+ sa.Column("updated_at", sa.DateTime(), nullable=True),
+ sa.ForeignKeyConstraint(["project_id"], ["project.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("id"),
+ sa.UniqueConstraint("email_template_type", "project_id"),
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_table("email_templates")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-04-02_71cb25c06fa0.py b/src/dispatch/database/revisions/tenant/versions/2024-04-02_71cb25c06fa0.py
new file mode 100644
index 000000000000..25ba64ec4877
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-04-02_71cb25c06fa0.py
@@ -0,0 +1,27 @@
+"""Adds enabled to project
+
+Revision ID: 71cb25c06fa0
+Revises: 91bd05855ad1
+Create Date: 2024-04-02 12:31:20.012288
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "71cb25c06fa0"
+down_revision = "91bd05855ad1"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("project", sa.Column("enabled", sa.Boolean(), server_default="t", nullable=True, default=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("project", "enabled")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-04-09_3a33bc153e7e.py b/src/dispatch/database/revisions/tenant/versions/2024-04-09_3a33bc153e7e.py
new file mode 100644
index 000000000000..cf6f25a604be
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-04-09_3a33bc153e7e.py
@@ -0,0 +1,29 @@
+"""Creates a column for `dedicated_channel` on the Case model to support Cases
+in dedicated conversation channels (as opposed to threads only).
+
+Revision ID: 3a33bc153e7e
+Revises: 71cb25c06fa0
+Create Date: 2024-04-09 16:28:06.148971
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "3a33bc153e7e"
+down_revision = "71cb25c06fa0"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("case", sa.Column("dedicated_channel", sa.Boolean(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("case", "dedicated_channel")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-04-22_50e99c66e72f.py b/src/dispatch/database/revisions/tenant/versions/2024-04-22_50e99c66e72f.py
new file mode 100644
index 000000000000..16c33635b8e2
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-04-22_50e99c66e72f.py
@@ -0,0 +1,38 @@
+"""Removes the cost_model field from Incident. Adds the cost_model field to IncidentType.
+
+Revision ID: 50e99c66e72f
+Revises: 3a33bc153e7e
+Create Date: 2024-04-22 14:19:18.388675
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "50e99c66e72f"
+down_revision = "3a33bc153e7e"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint("incident_cost_model_id_fkey", "incident", type_="foreignkey")
+ op.drop_column("incident", "cost_model_id")
+ op.add_column("incident_type", sa.Column("cost_model_id", sa.Integer(), nullable=True))
+    op.create_foreign_key("incident_type_cost_model_id_fkey", "incident_type", "cost_model", ["cost_model_id"], ["id"])
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+    op.drop_constraint("incident_type_cost_model_id_fkey", "incident_type", type_="foreignkey")
+ op.drop_column("incident_type", "cost_model_id")
+ op.add_column(
+ "incident", sa.Column("cost_model_id", sa.INTEGER(), autoincrement=False, nullable=True)
+ )
+ op.create_foreign_key(
+ "incident_cost_model_id_fkey", "incident", "cost_model", ["cost_model_id"], ["id"]
+ )
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-05-17_b07bb852fd67.py b/src/dispatch/database/revisions/tenant/versions/2024-05-17_b07bb852fd67.py
new file mode 100644
index 000000000000..d879a57f0fab
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-05-17_b07bb852fd67.py
@@ -0,0 +1,31 @@
+"""Adds attorney form schema and scoring schema to forms type
+
+Revision ID: b07bb852fd67
+Revises: 50e99c66e72f
+Create Date: 2024-04-10 15:51:10.914748
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "b07bb852fd67"
+down_revision = "50e99c66e72f"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("forms_type", sa.Column("attorney_form_schema", sa.String(), nullable=True))
+ op.add_column("forms_type", sa.Column("scoring_schema", sa.String(), nullable=True))
+ op.add_column("forms", sa.Column("attorney_form_data", sa.String(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("forms_type", "scoring_schema")
+ op.drop_column("forms_type", "attorney_form_schema")
+ op.drop_column("forms", "attorney_form_data")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-05-29_a836d4850a75.py b/src/dispatch/database/revisions/tenant/versions/2024-05-29_a836d4850a75.py
new file mode 100644
index 000000000000..8661fb076e09
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-05-29_a836d4850a75.py
@@ -0,0 +1,27 @@
+"""Adds the "allow_self_join" column to enable self-join in the Dispatch UI
+
+Revision ID: a836d4850a75
+Revises: b07bb852fd67
+Create Date: 2024-04-29 10:28:37.777618
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "a836d4850a75"
+down_revision = "b07bb852fd67"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+    # ### commands auto generated by Alembic - please adjust! ###
+    op.add_column("project", sa.Column("allow_self_join", sa.Boolean(), server_default="t", nullable=True))
+    # ### end Alembic commands ###
+
+
+def downgrade():
+    # ### commands auto generated by Alembic - please adjust! ###
+    op.drop_column("project", "allow_self_join")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-06-12_4286dcce0a2d.py b/src/dispatch/database/revisions/tenant/versions/2024-06-12_4286dcce0a2d.py
new file mode 100644
index 000000000000..41ed3ce919b3
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-06-12_4286dcce0a2d.py
@@ -0,0 +1,31 @@
+"""Create new channel description and description service id columns for incident type
+
+Revision ID: 4286dcce0a2d
+Revises: a836d4850a75
+Create Date: 2024-06-12 17:45:25.556120
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "4286dcce0a2d"
+down_revision = "a836d4850a75"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("incident_type", sa.Column("channel_description", sa.String(), nullable=True))
+ op.add_column("incident_type", sa.Column("description_service_id", sa.Integer(), nullable=True))
+ op.create_foreign_key("description_service_id_fkey", "incident_type", "service", ["description_service_id"], ["id"])
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint("description_service_id_fkey", "incident_type", type_="foreignkey")
+ op.drop_column("incident_type", "description_service_id")
+ op.drop_column("incident_type", "channel_description")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-06-28_15a8d3228123.py b/src/dispatch/database/revisions/tenant/versions/2024-06-28_15a8d3228123.py
new file mode 100644
index 000000000000..eda4af05ba1f
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-06-28_15a8d3228123.py
@@ -0,0 +1,80 @@
+"""Adds case_cost and case_cost_type tables; links cost models to case types and participant activity to cases
+
+Revision ID: 15a8d3228123
+Revises: 4286dcce0a2d
+Create Date: 2024-06-28 11:19:27.227089
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+import sqlalchemy_utils
+
+# revision identifiers, used by Alembic.
+revision = "15a8d3228123"
+down_revision = "4286dcce0a2d"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.create_table(
+ "case_cost_type",
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("name", sa.String(), nullable=True),
+ sa.Column("description", sa.String(), nullable=True),
+ sa.Column("category", sa.String(), nullable=True),
+ sa.Column("details", sqlalchemy_utils.types.json.JSONType(), nullable=True),
+ sa.Column("default", sa.Boolean(), nullable=True),
+ sa.Column("editable", sa.Boolean(), nullable=True),
+ sa.Column("search_vector", sqlalchemy_utils.types.ts_vector.TSVectorType(), nullable=True),
+ sa.Column("project_id", sa.Integer(), nullable=True),
+ sa.Column("updated_at", sa.DateTime(), nullable=True),
+ sa.Column("created_at", sa.DateTime(), nullable=True),
+ sa.ForeignKeyConstraint(["project_id"], ["project.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("id"),
+ )
+ op.create_index(
+ "case_cost_type_search_vector_idx",
+ "case_cost_type",
+ ["search_vector"],
+ unique=False,
+ postgresql_using="gin",
+ )
+ op.create_table(
+ "case_cost",
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("amount", sa.Numeric(precision=10, scale=2), nullable=True),
+ sa.Column("case_cost_type_id", sa.Integer(), nullable=True),
+ sa.Column("case_id", sa.Integer(), nullable=True),
+ sa.Column("project_id", sa.Integer(), nullable=True),
+ sa.Column("updated_at", sa.DateTime(), nullable=True),
+ sa.Column("created_at", sa.DateTime(), nullable=True),
+ sa.ForeignKeyConstraint(
+ ["case_cost_type_id"],
+ ["case_cost_type.id"],
+ ),
+ sa.ForeignKeyConstraint(["case_id"], ["case.id"], ondelete="CASCADE"),
+ sa.ForeignKeyConstraint(["project_id"], ["project.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("id"),
+ )
+ op.add_column("case_type", sa.Column("cost_model_id", sa.Integer(), nullable=True))
+    op.create_foreign_key("case_type_cost_model_id_fkey", "case_type", "cost_model", ["cost_model_id"], ["id"])
+ op.add_column("participant_activity", sa.Column("case_id", sa.Integer(), nullable=True))
+    op.create_foreign_key("participant_activity_case_id_fkey", "participant_activity", "case", ["case_id"], ["id"])
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+    op.drop_constraint("participant_activity_case_id_fkey", "participant_activity", type_="foreignkey")
+ op.drop_column("participant_activity", "case_id")
+    op.drop_constraint("case_type_cost_model_id_fkey", "case_type", type_="foreignkey")
+ op.drop_column("case_type", "cost_model_id")
+ op.drop_table("case_cost")
+ op.drop_index(
+ "case_cost_type_search_vector_idx", table_name="case_cost_type", postgresql_using="gin"
+ )
+ op.drop_table("case_cost_type")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-07-12_25b8b5829158.py b/src/dispatch/database/revisions/tenant/versions/2024-07-12_25b8b5829158.py
new file mode 100644
index 000000000000..c22c9b9adc1c
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-07-12_25b8b5829158.py
@@ -0,0 +1,27 @@
+"""Adds score to form model
+
+Revision ID: 25b8b5829158
+Revises: 15a8d3228123
+Create Date: 2024-07-12 17:04:12.645228
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "25b8b5829158"
+down_revision = "15a8d3228123"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("forms", sa.Column("score", sa.Integer(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("forms", "score")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-07-19_a812a4cf61e1.py b/src/dispatch/database/revisions/tenant/versions/2024-07-19_a812a4cf61e1.py
new file mode 100644
index 000000000000..d22f3ed24bfd
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-07-19_a812a4cf61e1.py
@@ -0,0 +1,51 @@
+"""Creates alternative storage attributes in project
+
+Revision ID: a812a4cf61e1
+Revises: 25b8b5829158
+Create Date: 2024-04-12 14:48:05.481102
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "a812a4cf61e1"
+down_revision = "25b8b5829158"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("project", sa.Column("storage_folder_one", sa.String(), nullable=True))
+ op.add_column("project", sa.Column("storage_folder_two", sa.String(), nullable=True))
+ op.add_column(
+ "project", sa.Column("storage_use_folder_one_as_primary", sa.Boolean(), nullable=True)
+ )
+ op.add_column(
+ "project",
+ sa.Column(
+ "storage_use_title", sa.Boolean(), server_default=sa.text("false"), nullable=True
+ ),
+ )
+ op.add_column(
+ "tag_type",
+ sa.Column(
+ "use_for_project_folder",
+ sa.Boolean(),
+ server_default=sa.text("false"),
+ nullable=True,
+ ),
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("project", "storage_use_folder_one_as_primary")
+ op.drop_column("project", "storage_folder_two")
+ op.drop_column("project", "storage_folder_one")
+ op.drop_column("project", "storage_use_title")
+ op.drop_column("tag_type", "use_for_project_folder")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-07-22_ef17416626ff.py b/src/dispatch/database/revisions/tenant/versions/2024-07-22_ef17416626ff.py
new file mode 100644
index 000000000000..c4c3938cf402
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-07-22_ef17416626ff.py
@@ -0,0 +1,37 @@
+"""Allows association between documents and tags
+
+Revision ID: ef17416626ff
+Revises: a812a4cf61e1
+Create Date: 2024-07-22 15:08:59.529688
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "ef17416626ff"
+down_revision = "a812a4cf61e1"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.create_table(
+ "assoc_document_tags",
+ sa.Column("document_id", sa.Integer(), nullable=False),
+ sa.Column("tag_id", sa.Integer(), nullable=False),
+ sa.ForeignKeyConstraint(["document_id"], ["document.id"], ondelete="CASCADE"),
+ sa.ForeignKeyConstraint(["tag_id"], ["tag.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("document_id", "tag_id"),
+ )
+ op.add_column("tag_type", sa.Column("discoverable_document", sa.Boolean(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("tag_type", "discoverable_document")
+ op.drop_table("assoc_document_tags")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-08-05_71cd7ed999c4.py b/src/dispatch/database/revisions/tenant/versions/2024-08-05_71cd7ed999c4.py
new file mode 100644
index 000000000000..954c18f46d9a
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-08-05_71cd7ed999c4.py
@@ -0,0 +1,28 @@
+"""Adds lifecycle column to signal model
+
+Revision ID: 71cd7ed999c4
+Revises: ef17416626ff
+Create Date: 2024-08-05 15:22:27.578399
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "71cd7ed999c4"
+down_revision = "ef17416626ff"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("signal", sa.Column("lifecycle", sa.String(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("signal", "lifecycle")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-08-26_d6b3853be8e4.py b/src/dispatch/database/revisions/tenant/versions/2024-08-26_d6b3853be8e4.py
new file mode 100644
index 000000000000..f6e05a0701ef
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-08-26_d6b3853be8e4.py
@@ -0,0 +1,22 @@
+"""Adding column for select_commander_visibility
+
+Revision ID: d6b3853be8e4
+Revises: 71cd7ed999c4
+Create Date: 2024-08-26 14:42:48.423369
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = 'd6b3853be8e4'
+down_revision = '71cd7ed999c4'
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ op.add_column('project', sa.Column('select_commander_visibility', sa.Boolean(), server_default='t', nullable=True))
+
+def downgrade():
+ op.drop_column('project', 'select_commander_visibility')
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-09-06_51eacaf1f62c.py b/src/dispatch/database/revisions/tenant/versions/2024-09-06_51eacaf1f62c.py
new file mode 100644
index 000000000000..2457593fc369
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-09-06_51eacaf1f62c.py
@@ -0,0 +1,45 @@
+"""Adds mfa_challenge to track challenges against core Dispatch MFA plugin.
+
+Revision ID: 51eacaf1f62c
+Revises: 71cd7ed999c4
+Create Date: 2024-08-09 12:59:54.631968
+
+"""
+from alembic import op
+import sqlalchemy as sa
+from sqlalchemy.dialects import postgresql
+
+# revision identifiers, used by Alembic.
+revision = "51eacaf1f62c"
+down_revision = "d6b3853be8e4"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.create_table(
+ "mfa_challenge",
+ sa.Column("id", sa.Integer(), nullable=False, autoincrement=True),
+ sa.Column("valid", sa.Boolean(), nullable=True, server_default=sa.text("true")),
+ sa.Column("reason", sa.String(), nullable=True),
+ sa.Column("action", sa.String(), nullable=True),
+ sa.Column("challenge_id", postgresql.UUID(as_uuid=True), nullable=True),
+ sa.Column("dispatch_user_id", sa.Integer(), nullable=False),
+ sa.Column("status", sa.String(), nullable=True),
+ sa.Column("updated_at", sa.DateTime(), nullable=True),
+ sa.Column("created_at", sa.DateTime(), nullable=True),
+ sa.ForeignKeyConstraint(
+ ["dispatch_user_id"],
+ ["dispatch_core.dispatch_user.id"],
+ ),
+ sa.PrimaryKeyConstraint("id"),
+ sa.UniqueConstraint("challenge_id"),
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_table("mfa_challenge")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-09-10_0a6702319f6a.py b/src/dispatch/database/revisions/tenant/versions/2024-09-10_0a6702319f6a.py
new file mode 100644
index 000000000000..bd0a32a7d410
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-09-10_0a6702319f6a.py
@@ -0,0 +1,31 @@
+"""Adds restriction ability to incident severity
+
+Revision ID: 0a6702319f6a
+Revises: 51eacaf1f62c
+Create Date: 2024-09-10 10:13:04.192475
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "0a6702319f6a"
+down_revision = "51eacaf1f62c"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column(
+ "incident_severity",
+ sa.Column("allowed_for_stable_incidents", sa.Boolean(), server_default="t", nullable=True),
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("incident_severity", "allowed_for_stable_incidents")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-09-14_f729d61738b0.py b/src/dispatch/database/revisions/tenant/versions/2024-09-14_f729d61738b0.py
new file mode 100644
index 000000000000..d577128f95cc
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-09-14_f729d61738b0.py
@@ -0,0 +1,37 @@
+"""Adds project parameter to send weekly reports
+
+Revision ID: f729d61738b0
+Revises: 0a6702319f6a
+Create Date: 2024-08-12 15:22:41.977924
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "f729d61738b0"
+down_revision = "0a6702319f6a"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column(
+ "project",
+ sa.Column(
+ "send_weekly_reports", sa.Boolean(), nullable=True, server_default=sa.text("false")
+ ),
+ )
+ op.add_column(
+ "project", sa.Column("weekly_report_notification_id", sa.Integer(), nullable=True)
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("project", "send_weekly_reports")
+ op.drop_column("project", "weekly_report_notification_id")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-09-16_47d616802f56.py b/src/dispatch/database/revisions/tenant/versions/2024-09-16_47d616802f56.py
new file mode 100644
index 000000000000..e8cfc6e8893c
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-09-16_47d616802f56.py
@@ -0,0 +1,30 @@
+"""Adding details column to support additional feedback information
+
+Revision ID: 47d616802f56
+Revises: f729d61738b0
+Create Date: 2024-09-13 10:28:16.823990
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "47d616802f56"
+down_revision = "f729d61738b0"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("service_feedback", sa.Column("details", sa.JSON(), nullable=True))
+ op.add_column("service_feedback_reminder", sa.Column("details", sa.JSON(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("service_feedback_reminder", "details")
+ op.drop_column("service_feedback", "details")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-09-19_19c10c121a22.py b/src/dispatch/database/revisions/tenant/versions/2024-09-19_19c10c121a22.py
new file mode 100644
index 000000000000..72ac2e4de02b
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-09-19_19c10c121a22.py
@@ -0,0 +1,33 @@
+"""Adds GenAI fields to Signal model
+
+Revision ID: 19c10c121a22
+Revises: 47d616802f56
+Create Date: 2024-09-19 14:45:53.064341
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = '19c10c121a22'
+down_revision = '47d616802f56'
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column('signal', sa.Column('runbook', sa.String(), nullable=True))
+ op.add_column('signal', sa.Column('genai_enabled', sa.Boolean(), nullable=True))
+ op.add_column('signal', sa.Column('genai_model', sa.String(), nullable=True))
+ op.add_column('signal', sa.Column('genai_system_message', sa.String(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column('signal', 'genai_system_message')
+ op.drop_column('signal', 'genai_model')
+ op.drop_column('signal', 'genai_enabled')
+ op.drop_column('signal', 'runbook')
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-09-20_1f4dc687945d.py b/src/dispatch/database/revisions/tenant/versions/2024-09-20_1f4dc687945d.py
new file mode 100644
index 000000000000..edd02b4eb38f
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-09-20_1f4dc687945d.py
@@ -0,0 +1,27 @@
+"""Adds new genai prompt column to signal definition model
+
+Revision ID: 1f4dc687945d
+Revises: 19c10c121a22
+Create Date: 2024-09-20 10:49:11.303112
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = '1f4dc687945d'
+down_revision = '19c10c121a22'
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column('signal', sa.Column('genai_prompt', sa.String(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column('signal', 'genai_prompt')
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-09-23_32652e0360dd.py b/src/dispatch/database/revisions/tenant/versions/2024-09-23_32652e0360dd.py
new file mode 100644
index 000000000000..5efbba277340
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-09-23_32652e0360dd.py
@@ -0,0 +1,105 @@
+"""Adds dispatch-create-task slug to slack conversation plugin config.
+
+Revision ID: 32652e0360dd
+Revises: 1f4dc687945d
+Create Date: 2024-09-23 16:02:50.742796
+
+"""
+
+from alembic import op
+from pydantic import SecretStr, ValidationError
+from pydantic.json import pydantic_encoder
+
+from sqlalchemy import Column, Integer, ForeignKey, String
+from sqlalchemy.ext.declarative import declarative_base
+from sqlalchemy.orm import relationship, Session
+from sqlalchemy.ext.hybrid import hybrid_property
+from sqlalchemy_utils import StringEncryptedType
+from sqlalchemy_utils.types.encrypted.encrypted_type import AesEngine
+from dispatch.config import config, DISPATCH_ENCRYPTION_KEY
+
+
+# revision identifiers, used by Alembic.
+revision = "32652e0360dd"
+down_revision = "1f4dc687945d"
+branch_labels = None
+depends_on = None
+
+Base = declarative_base()
+
+
+def show_secrets_encoder(obj):
+ if isinstance(obj, SecretStr):
+ return obj.get_secret_value()
+ else:
+ return pydantic_encoder(obj)
+
+
+def migrate_config(instances, slug, config):
+ for instance in instances:
+ if slug == instance.plugin.slug:
+ instance.configuration = config
+
+
+class Plugin(Base):
+ __tablename__ = "plugin"
+ __table_args__ = {"schema": "dispatch_core"}
+ id = Column(Integer, primary_key=True)
+ slug = Column(String, unique=True)
+
+
+class PluginInstance(Base):
+ __tablename__ = "plugin_instance"
+ id = Column(Integer, primary_key=True)
+ _configuration = Column(
+ StringEncryptedType(key=str(DISPATCH_ENCRYPTION_KEY), engine=AesEngine, padding="pkcs5")
+ )
+ plugin_id = Column(Integer, ForeignKey(Plugin.id))
+ plugin = relationship(Plugin, backref="instances")
+
+ @hybrid_property
+ def configuration(self):
+ """Property that correctly returns a plugins configuration object."""
+ pass
+
+ @configuration.setter
+ def configuration(self, configuration):
+ """Property that correctly sets a plugins configuration object."""
+ if configuration:
+ self._configuration = configuration.json(encoder=show_secrets_encoder)
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ from dispatch.plugins.dispatch_slack.config import SlackConversationConfiguration
+
+ bind = op.get_bind()
+ session = Session(bind=bind)
+
+ instances = session.query(PluginInstance).all()
+
+ # Slash commands
+ SLACK_COMMAND_CREATE_TASK_SLUG = config(
+ "SLACK_COMMAND_CREATE_TASK_SLUG", default="/dispatch-create-task"
+ )
+
+ try:
+ slack_conversation_config = SlackConversationConfiguration(
+ slack_command_create_task=SLACK_COMMAND_CREATE_TASK_SLUG,
+ )
+
+ migrate_config(instances, "slack-conversation", slack_conversation_config)
+
+ except ValidationError:
+ print(
+ "Skipping automatic migration of slack plugin credentials, if you are using the slack plugin manually migrate credentials."
+ )
+
+ session.commit()
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ pass
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-09-27_f5107ce190fc.py b/src/dispatch/database/revisions/tenant/versions/2024-09-27_f5107ce190fc.py
new file mode 100644
index 000000000000..158042b54159
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-09-27_f5107ce190fc.py
@@ -0,0 +1,34 @@
+"""Adds custom incident report card fields
+
+Revision ID: f5107ce190fc
+Revises: 32652e0360dd
+Create Date: 2024-09-27 12:33:17.418418
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "f5107ce190fc"
+down_revision = "32652e0360dd"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("project", sa.Column("report_incident_instructions", sa.String(), nullable=True))
+ op.add_column("project", sa.Column("report_incident_title_hint", sa.String(), nullable=True))
+ op.add_column(
+ "project", sa.Column("report_incident_description_hint", sa.String(), nullable=True)
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("project", "report_incident_description_hint")
+ op.drop_column("project", "report_incident_title_hint")
+ op.drop_column("project", "report_incident_instructions")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-10-08_b057c079c2d5.py b/src/dispatch/database/revisions/tenant/versions/2024-10-08_b057c079c2d5.py
new file mode 100644
index 000000000000..1b5129437373
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-10-08_b057c079c2d5.py
@@ -0,0 +1,37 @@
+"""Adds genai_analysis to case model
+
+Revision ID: b057c079c2d5
+Revises: f5107ce190fc
+Create Date: 2024-10-08 10:38:39.668625
+
+"""
+
+import sqlalchemy as sa
+from alembic import op
+from sqlalchemy.dialects import postgresql
+
+# revision identifiers, used by Alembic.
+revision = "b057c079c2d5"
+down_revision = "f5107ce190fc"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column(
+ "case",
+ sa.Column(
+ "genai_analysis",
+ postgresql.JSONB(astext_type=sa.Text()),
+ server_default="{}",
+ nullable=False,
+ ),
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("case", "genai_analysis")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-10-09_b8c1a8a4d957.py b/src/dispatch/database/revisions/tenant/versions/2024-10-09_b8c1a8a4d957.py
new file mode 100644
index 000000000000..103bffd67bfd
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-10-09_b8c1a8a4d957.py
@@ -0,0 +1,35 @@
+"""Adds tickets to tasks and ticket metadata to incident types
+
+Revision ID: b8c1a8a4d957
+Revises: b057c079c2d5
+Create Date: 2024-10-05 09:06:34.177407
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "b8c1a8a4d957"
+down_revision = "b057c079c2d5"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column(
+ "incident_type",
+ sa.Column("task_plugin_metadata", sa.JSON(), nullable=True, server_default="[]"),
+ )
+ op.add_column("ticket", sa.Column("task_id", sa.Integer(), nullable=True))
+    op.create_foreign_key("ticket_task_id_fkey", "ticket", "task", ["task_id"], ["id"], ondelete="CASCADE")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+    op.drop_constraint("ticket_task_id_fkey", "ticket", type_="foreignkey")
+ op.drop_column("ticket", "task_id")
+ op.drop_column("incident_type", "task_plugin_metadata")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-10-16_3c49f62d7914.py b/src/dispatch/database/revisions/tenant/versions/2024-10-16_3c49f62d7914.py
new file mode 100644
index 000000000000..caad8de0efa5
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-10-16_3c49f62d7914.py
@@ -0,0 +1,63 @@
+"""Adds case_id and project_id to feedback
+
+Revision ID: 3c49f62d7914
+Revises: b8c1a8a4d957
+Create Date: 2024-10-16 15:21:17.120891
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+from sqlalchemy.orm import Session
+from dispatch.feedback.incident.models import Feedback
+
+# revision identifiers, used by Alembic.
+revision = "3c49f62d7914"
+down_revision = "b8c1a8a4d957"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("feedback", sa.Column("case_id", sa.Integer(), nullable=True))
+ op.create_foreign_key(
+ "assoc_feedback_case_id_fkey",
+ "feedback",
+ "case",
+ ["case_id"],
+ ["id"],
+ ondelete="CASCADE",
+ )
+ op.add_column("feedback", sa.Column("project_id", sa.Integer(), nullable=True))
+ op.create_foreign_key(
+ "assoc_feedback_project_id_fkey",
+ "feedback",
+ "project",
+ ["project_id"],
+ ["id"],
+ ondelete="CASCADE",
+ )
+
+ bind = op.get_bind()
+ session = Session(bind=bind)
+
+ instances = session.query(Feedback).all()
+
+ for instance in instances:
+ if instance.incident:
+ instance.project_id = instance.incident.project_id
+ elif instance.case:
+ instance.project_id = instance.case.project_id
+
+ session.commit()
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint("assoc_feedback_case_id_fkey", "feedback", type_="foreignkey")
+ op.drop_column("feedback", "case_id")
+ op.drop_constraint("assoc_feedback_project_id_fkey", "feedback", type_="foreignkey")
+ op.drop_column("feedback", "project_id")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-10-25_24322617ce9a.py b/src/dispatch/database/revisions/tenant/versions/2024-10-25_24322617ce9a.py
new file mode 100644
index 000000000000..efd161033e6e
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-10-25_24322617ce9a.py
@@ -0,0 +1,99 @@
+"""Adds configuration to the Dispatch Ticket PluginInstance
+
+Revision ID: 24322617ce9a
+Revises: 3c49f62d7914
+Create Date: 2024-10-25 15:15:38.078421
+
+"""
+
+from alembic import op
+from pydantic import SecretStr, ValidationError
+from pydantic.json import pydantic_encoder
+
+from sqlalchemy import Column, Integer, ForeignKey, String
+from sqlalchemy.ext.declarative import declarative_base
+from sqlalchemy.orm import relationship, Session
+from sqlalchemy.ext.hybrid import hybrid_property
+from sqlalchemy_utils import StringEncryptedType
+from sqlalchemy_utils.types.encrypted.encrypted_type import AesEngine
+from dispatch.config import DISPATCH_ENCRYPTION_KEY
+
+# revision identifiers, used by Alembic.
+revision = "24322617ce9a"
+down_revision = "3c49f62d7914"
+branch_labels = None
+depends_on = None
+
+Base = declarative_base()
+
+
+def show_secrets_encoder(obj):
+ if isinstance(obj, SecretStr):
+ return obj.get_secret_value()
+ else:
+ return pydantic_encoder(obj)
+
+
+def migrate_config(instances, slug, config):
+ for instance in instances:
+ if slug == instance.plugin.slug:
+ instance.configuration = config
+
+
+class Plugin(Base):
+ __tablename__ = "plugin"
+ __table_args__ = {"schema": "dispatch_core"}
+ id = Column(Integer, primary_key=True)
+ slug = Column(String, unique=True)
+
+
+class PluginInstance(Base):
+ __tablename__ = "plugin_instance"
+ id = Column(Integer, primary_key=True)
+ _configuration = Column(
+ StringEncryptedType(key=str(DISPATCH_ENCRYPTION_KEY), engine=AesEngine, padding="pkcs5")
+ )
+ plugin_id = Column(Integer, ForeignKey(Plugin.id))
+ plugin = relationship(Plugin, backref="instances")
+
+ @hybrid_property
+ def configuration(self):
+ """Property that correctly returns a plugins configuration object."""
+ pass
+
+ @configuration.setter
+ def configuration(self, configuration):
+ """Property that correctly sets a plugins configuration object."""
+ if configuration:
+ self._configuration = configuration.json(encoder=show_secrets_encoder)
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ from dispatch.plugins.dispatch_core.config import DispatchTicketConfiguration
+
+ bind = op.get_bind()
+ session = Session(bind=bind)
+
+ instances = session.query(PluginInstance).all()
+
+ try:
+ dispatch_ticket_config = DispatchTicketConfiguration(
+ use_incident_name=False,
+ )
+
+ migrate_config(instances, "dispatch-ticket", dispatch_ticket_config)
+
+ except ValidationError:
+ print(
+ "Skipping automatic migration of Dispatch ticket plugin, if you are using the Dispatch ticket plugin, please manually migrate."
+ )
+
+ session.commit()
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ pass
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-10-29_3edb0476365a.py b/src/dispatch/database/revisions/tenant/versions/2024-10-29_3edb0476365a.py
new file mode 100644
index 000000000000..53f964fba5fa
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-10-29_3edb0476365a.py
@@ -0,0 +1,31 @@
+"""Adds auto_close column to case_type model
+
+Revision ID: 3edb0476365a
+Revises: 24322617ce9a
+Create Date: 2024-10-29 13:26:29.001448
+
+"""
+
+import sqlalchemy as sa
+from alembic import op
+
+# revision identifiers, used by Alembic.
+revision = "3edb0476365a"
+down_revision = "24322617ce9a"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column(
+ "case_type",
+ sa.Column("auto_close", sa.Boolean(), server_default=sa.text("false"), nullable=True),
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("case_type", "auto_close")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-11-04_928b725d64f6.py b/src/dispatch/database/revisions/tenant/versions/2024-11-04_928b725d64f6.py
new file mode 100644
index 000000000000..2489456fed8e
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-11-04_928b725d64f6.py
@@ -0,0 +1,67 @@
+"""Fixes automatic generation issues
+
+Revision ID: 928b725d64f6
+Revises: 3edb0476365a
+Create Date: 2024-11-04 15:55:57.864691
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+from sqlalchemy.engine.reflection import Inspector
+
+# revision identifiers, used by Alembic.
+revision = "928b725d64f6"
+down_revision = "3edb0476365a"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ conn = op.get_bind()
+ inspector = Inspector.from_engine(conn)
+
+ # Check if the table exists
+ if "service_incident" in inspector.get_table_names():
+ op.drop_table("service_incident")
+
+ op.alter_column(
+ "entity", "source", existing_type=sa.BOOLEAN(), type_=sa.String(), existing_nullable=True
+ )
+
+ indexes = inspector.get_indexes("entity")
+ index_exists = any(index["name"] == "ix_entity_search_vector" for index in indexes)
+
+ if index_exists:
+ op.drop_index("ix_entity_search_vector", table_name="entity", postgresql_using="gin")
+
+ index_exists = any(index["name"] == "entity_search_vector_idx" for index in indexes)
+ if not index_exists:
+ op.create_index(
+ "entity_search_vector_idx",
+ "entity",
+ ["search_vector"],
+ unique=False,
+ postgresql_using="gin",
+ )
+ op.alter_column("entity_type", "jpath", existing_type=sa.VARCHAR(), nullable=True)
+
+ columns = inspector.get_columns("plugin_instance")
+ column_exists = any(column["name"] == "configuration" for column in columns)
+ if column_exists:
+ op.drop_column("plugin_instance", "configuration")
+
+ foreign_keys = inspector.get_foreign_keys("project")
+
+ constraint_name = "project_stable_priority_id_fkey"
+ constraint_exists = any(fk["name"] == constraint_name for fk in foreign_keys)
+ if constraint_exists:
+ op.drop_constraint(constraint_name, "project", type_="foreignkey")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ pass
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-12-05_575ca7d954a8.py b/src/dispatch/database/revisions/tenant/versions/2024-12-05_575ca7d954a8.py
new file mode 100644
index 000000000000..8365c4d919cc
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-12-05_575ca7d954a8.py
@@ -0,0 +1,29 @@
+"""Adds incident summary to the incident table.
+
+Revision ID: 575ca7d954a8
+Revises: 928b725d64f6
+Create Date: 2024-12-05 15:05:46.932404
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "575ca7d954a8"
+down_revision = "928b725d64f6"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("incident", sa.Column("summary", sa.String(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("incident", "summary")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-12-12_2d9e4d392ea4.py b/src/dispatch/database/revisions/tenant/versions/2024-12-12_2d9e4d392ea4.py
new file mode 100644
index 000000000000..01c33a2ed21e
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-12-12_2d9e4d392ea4.py
@@ -0,0 +1,35 @@
+"""Adds display_name column to the project model
+
+Revision ID: 2d9e4d392ea4
+Revises: 575ca7d954a8
+Create Date: 2024-12-12 16:34:58.098426
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "2d9e4d392ea4"
+down_revision = "575ca7d954a8"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column(
+ "project", sa.Column("display_name", sa.String(), server_default="", nullable=False)
+ )
+
+ # Copy data from 'name' column to 'display_name' column
+ op.execute("UPDATE project SET display_name = name")
+
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("project", "display_name")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2024-12-17_dfc8e213a2c4.py b/src/dispatch/database/revisions/tenant/versions/2024-12-17_dfc8e213a2c4.py
new file mode 100644
index 000000000000..75c55e903ebe
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2024-12-17_dfc8e213a2c4.py
@@ -0,0 +1,34 @@
+"""Adds conversation target and oncall service override options to signal instances
+
+Revision ID: dfc8e213a2c4
+Revises: 2d9e4d392ea4
+Create Date: 2024-12-17 10:02:26.920568
+
+"""
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = 'dfc8e213a2c4'
+down_revision = '2d9e4d392ea4'
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ op.add_column("signal_instance", sa.Column("conversation_target", sa.String(), nullable=True))
+ op.add_column("signal_instance", sa.Column("oncall_service_id", sa.Integer(), nullable=True))
+ op.create_foreign_key(
+ "oncall_service_id_fkey",
+ "signal_instance",
+ "service",
+ ["oncall_service_id"],
+ ["id"]
+ )
+
+
+def downgrade():
+ op.drop_constraint("oncall_service_id_fkey", "signal_instance", type_="foreignkey")
+ op.drop_column("signal_instance", "oncall_service_id")
+ op.drop_column("signal_instance", "conversation_target")
diff --git a/src/dispatch/database/revisions/tenant/versions/2025-01-24_753ea20c2680.py b/src/dispatch/database/revisions/tenant/versions/2025-01-24_753ea20c2680.py
new file mode 100644
index 000000000000..f5eda029732c
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2025-01-24_753ea20c2680.py
@@ -0,0 +1,32 @@
+"""Adds a column that lets an incident priority level configure whether an "IC will be slow to respond" Slack message is sent
+
+Revision ID: 753ea20c2680
+Revises: dfc8e213a2c4
+Create Date: 2025-01-24 13:48:26.599465
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "753ea20c2680"
+down_revision = "dfc8e213a2c4"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column(
+ "incident_priority",
+ sa.Column("disable_delayed_message_warning", sa.Boolean(), nullable=True),
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("incident_priority", "disable_delayed_message_warning")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2025-02-02_8790ff1c8d6d.py b/src/dispatch/database/revisions/tenant/versions/2025-02-02_8790ff1c8d6d.py
new file mode 100644
index 000000000000..cb60811e676c
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2025-02-02_8790ff1c8d6d.py
@@ -0,0 +1,31 @@
+"""Add signal relationship to event
+
+Revision ID: 8790ff1c8d6d
+Revises: 753ea20c2680
+Create Date: 2025-02-02 17:51:32.831061
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "8790ff1c8d6d"
+down_revision = "753ea20c2680"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("event", sa.Column("signal_id", sa.Integer(), nullable=True))
+    op.create_foreign_key("event_signal_id_fkey", "event", "signal", ["signal_id"], ["id"], ondelete="CASCADE")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+    op.drop_constraint("event_signal_id_fkey", "event", type_="foreignkey")
+ op.drop_column("event", "signal_id")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2025-02-17_2f18776252bb.py b/src/dispatch/database/revisions/tenant/versions/2025-02-17_2f18776252bb.py
new file mode 100644
index 000000000000..20ec796656e0
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2025-02-17_2f18776252bb.py
@@ -0,0 +1,29 @@
+"""Adds column to exclude incident types from reminders
+
+Revision ID: 2f18776252bb
+Revises: 8790ff1c8d6d
+Create Date: 2025-02-17 14:52:36.771074
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "2f18776252bb"
+down_revision = "8790ff1c8d6d"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("incident_type", sa.Column("exclude_from_reminders", sa.Boolean(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("incident_type", "exclude_from_reminders")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2025-03-04_da444de005a6.py b/src/dispatch/database/revisions/tenant/versions/2025-03-04_da444de005a6.py
new file mode 100644
index 000000000000..8d12c153c0cd
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2025-03-04_da444de005a6.py
@@ -0,0 +1,49 @@
+"""Adds model_type column to case_cost_type and sets it to 'New' when default
+
+Revision ID: da444de005a6
+Revises: 2f18776252bb
+Create Date: 2025-03-04 10:06:46.787201
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "da444de005a6"
+down_revision = "2f18776252bb"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ op.add_column("case_cost_type", sa.Column("model_type", sa.String(), nullable=True))
+
+ # Update 'model_type' to 'New' where 'default' is true
+ op.execute(
+ """
+ UPDATE case_cost_type
+ SET model_type = 'New'
+ WHERE "default" = true
+ """
+ )
+
+ op.drop_column("case_cost_type", "default")
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ op.add_column(
+ "case_cost_type",
+ sa.Column("default", sa.Boolean(), server_default=sa.false()),
+ )
+ op.execute(
+ """
+ UPDATE case_cost_type
+ SET "default" = true
+ WHERE model_type = 'New'
+ """
+ )
+ op.drop_column("case_cost_type", "model_type")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2025-03-11_37406cca756c.py b/src/dispatch/database/revisions/tenant/versions/2025-03-11_37406cca756c.py
new file mode 100644
index 000000000000..5774f9eaa6c4
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2025-03-11_37406cca756c.py
@@ -0,0 +1,49 @@
+"""Replaces foreign key for case_cost and case_cost_type tables
+
+Revision ID: 37406cca756c
+Revises: da444de005a6
+Create Date: 2025-03-11 16:26:52.406726
+
+"""
+
+from alembic import op
+from sqlalchemy.engine.reflection import Inspector
+
+# revision identifiers, used by Alembic.
+revision = "37406cca756c"
+down_revision = "da444de005a6"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # Get the connection and inspector
+ conn = op.get_bind()
+ inspector = Inspector.from_engine(conn)
+
+ # Check if the constraint exists
+ constraints = inspector.get_foreign_keys("case_cost")
+ constraint_names = [constraint["name"] for constraint in constraints]
+
+ # Drop the constraint if it exists
+ if "case_cost_case_cost_type_id_fkey" in constraint_names:
+ op.drop_constraint("case_cost_case_cost_type_id_fkey", "case_cost", type_="foreignkey")
+
+ # Create the new foreign key constraint
+ op.create_foreign_key(
+ None, "case_cost", "case_cost_type", ["case_cost_type_id"], ["id"], ondelete="CASCADE"
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ # The FK was created unnamed in upgrade(), so drop it by PostgreSQL's default generated name
+ op.drop_constraint("case_cost_case_cost_type_id_fkey", "case_cost", type_="foreignkey")
+ op.create_foreign_key(
+ "case_cost_case_cost_type_id_fkey",
+ "case_cost",
+ "case_cost_type",
+ ["case_cost_type_id"],
+ ["id"],
+ )
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2025-03-14_92a359040b8e.py b/src/dispatch/database/revisions/tenant/versions/2025-03-14_92a359040b8e.py
new file mode 100644
index 000000000000..81c9032d66c0
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2025-03-14_92a359040b8e.py
@@ -0,0 +1,28 @@
+"""Adds column to exclude incident types from review
+
+Revision ID: 92a359040b8e
+Revises: 37406cca756c
+Create Date: 2025-03-14 14:05:30.522240
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "92a359040b8e"
+down_revision = "37406cca756c"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("incident_type", sa.Column("exclude_from_review", sa.Boolean(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("incident_type", "exclude_from_review")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2025-03-28_bccbf255d6d1.py b/src/dispatch/database/revisions/tenant/versions/2025-03-28_bccbf255d6d1.py
new file mode 100644
index 000000000000..4c6a119b3e17
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2025-03-28_bccbf255d6d1.py
@@ -0,0 +1,35 @@
+"""adds snooze extension oncall service to project
+
+Revision ID: bccbf255d6d1
+Revises: 92a359040b8e
+Create Date: 2025-03-28 13:12:07.514337
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "bccbf255d6d1"
+down_revision = "92a359040b8e"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column(
+ "project", sa.Column("snooze_extension_oncall_service_id", sa.Integer(), nullable=True)
+ )
+ op.create_foreign_key(
+ None, "project", "service", ["snooze_extension_oncall_service_id"], ["id"]
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ # The FK was created unnamed in upgrade(), so drop it by PostgreSQL's default generated name
+ op.drop_constraint(
+ "project_snooze_extension_oncall_service_id_fkey", "project", type_="foreignkey"
+ )
+ op.drop_column("project", "snooze_extension_oncall_service_id")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2025-04-09_4aa819afb065.py b/src/dispatch/database/revisions/tenant/versions/2025-04-09_4aa819afb065.py
new file mode 100644
index 000000000000..4799a018d3b2
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2025-04-09_4aa819afb065.py
@@ -0,0 +1,31 @@
+"""Adds shift_hours_type column to service table
+
+Revision ID: 4aa819afb065
+Revises: bccbf255d6d1
+Create Date: 2025-04-09 15:58:57.943940
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "4aa819afb065"
+down_revision = "bccbf255d6d1"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column(
+ "service", sa.Column("shift_hours_type", sa.Integer(), nullable=True, server_default="24")
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("service", "shift_hours_type")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2025-04-17_8f324b0f365a.py b/src/dispatch/database/revisions/tenant/versions/2025-04-17_8f324b0f365a.py
new file mode 100644
index 000000000000..6c52da9052ad
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2025-04-17_8f324b0f365a.py
@@ -0,0 +1,25 @@
+"""Adds `event` column to differentiate if the case was reported via the event reporting form
+
+Revision ID: 8f324b0f365a
+Revises: 4aa819afb065
+Create Date: 2025-04-17 11:52:11.148817
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "8f324b0f365a"
+down_revision = "4aa819afb065"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ op.add_column("case", sa.Column("event", sa.Boolean(), nullable=True))
+
+
+def downgrade():
+ op.drop_column("case", "event")
diff --git a/src/dispatch/database/revisions/tenant/versions/2025-06-04_7fc3888c7b9a.py b/src/dispatch/database/revisions/tenant/versions/2025-06-04_7fc3888c7b9a.py
new file mode 100644
index 000000000000..d26c95380694
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2025-06-04_7fc3888c7b9a.py
@@ -0,0 +1,29 @@
+"""Add GenAI suggestions column to tag_type table
+
+Revision ID: 7fc3888c7b9a
+Revises: 8f324b0f365a
+Create Date: 2025-06-04 14:49:20.592746
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "7fc3888c7b9a"
+down_revision = "8f324b0f365a"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("tag_type", sa.Column("genai_suggestions", sa.Boolean(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("tag_type", "genai_suggestions")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2025-06-20_5ed5defd1a55.py b/src/dispatch/database/revisions/tenant/versions/2025-06-20_5ed5defd1a55.py
new file mode 100644
index 000000000000..635c55e28005
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2025-06-20_5ed5defd1a55.py
@@ -0,0 +1,28 @@
+"""Add stable_at column to case table
+
+Revision ID: 5ed5defd1a55
+Revises: 7fc3888c7b9a
+Create Date: 2025-06-20 11:59:13.546032
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "5ed5defd1a55"
+down_revision = "7fc3888c7b9a"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("case", sa.Column("stable_at", sa.DateTime(), nullable=True))
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("case", "stable_at")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2025-07-08_f63ad392dbbf.py b/src/dispatch/database/revisions/tenant/versions/2025-07-08_f63ad392dbbf.py
new file mode 100644
index 000000000000..951b51a7d742
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2025-07-08_f63ad392dbbf.py
@@ -0,0 +1,41 @@
+"""Adds settings to case and incident types for generating read-in summaries.
+
+Revision ID: f63ad392dbbf
+Revises: 5ed5defd1a55
+Create Date: 2025-07-08 13:56:35.033622
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "f63ad392dbbf"
+down_revision = "5ed5defd1a55"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column(
+ "case_type",
+ sa.Column(
+ "generate_read_in_summary", sa.Boolean(), server_default=sa.text("false"), nullable=True
+ ),
+ )
+ op.add_column(
+ "incident_type",
+ sa.Column(
+ "generate_read_in_summary", sa.Boolean(), server_default=sa.text("false"), nullable=True
+ ),
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("incident_type", "generate_read_in_summary")
+ op.drop_column("case_type", "generate_read_in_summary")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2025-07-11_aa87efd3d6c1.py b/src/dispatch/database/revisions/tenant/versions/2025-07-11_aa87efd3d6c1.py
new file mode 100644
index 000000000000..8c0d1d435f73
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2025-07-11_aa87efd3d6c1.py
@@ -0,0 +1,103 @@
+"""Adds new Slack command for read-in summary generation
+
+Revision ID: aa87efd3d6c1
+Revises: f63ad392dbbf
+Create Date: 2025-07-11 10:02:39.819258
+
+"""
+
+from alembic import op
+from pydantic import SecretStr, ValidationError
+from pydantic.json import pydantic_encoder
+
+from sqlalchemy import Column, Integer, ForeignKey, String
+from sqlalchemy.ext.declarative import declarative_base
+from sqlalchemy.orm import relationship, Session
+from sqlalchemy.ext.hybrid import hybrid_property
+from sqlalchemy_utils import StringEncryptedType
+from sqlalchemy_utils.types.encrypted.encrypted_type import AesEngine
+from dispatch.config import config, DISPATCH_ENCRYPTION_KEY
+
+
+# revision identifiers, used by Alembic.
+revision = "aa87efd3d6c1"
+down_revision = "f63ad392dbbf"
+branch_labels = None
+depends_on = None
+
+Base = declarative_base()
+
+
+def show_secrets_encoder(obj):
+ if isinstance(obj, SecretStr):
+ return obj.get_secret_value()
+ else:
+ return pydantic_encoder(obj)
+
+
+def migrate_config(instances, slug, config):
+ for instance in instances:
+ if slug == instance.plugin.slug:
+ instance.configuration = config
+
+
+class Plugin(Base):
+ __tablename__ = "plugin"
+ __table_args__ = {"schema": "dispatch_core"}
+ id = Column(Integer, primary_key=True)
+ slug = Column(String, unique=True)
+
+
+class PluginInstance(Base):
+ __tablename__ = "plugin_instance"
+ id = Column(Integer, primary_key=True)
+ _configuration = Column(
+ StringEncryptedType(key=str(DISPATCH_ENCRYPTION_KEY), engine=AesEngine, padding="pkcs5")
+ )
+ plugin_id = Column(Integer, ForeignKey(Plugin.id))
+ plugin = relationship(Plugin, backref="instances")
+
+ @hybrid_property
+ def configuration(self):
+ """Getter is intentionally unused in this migration; only the setter is needed."""
+
+ @configuration.setter
+ def configuration(self, configuration):
+ """Property that correctly sets a plugin's configuration object."""
+ if configuration:
+ self._configuration = configuration.json(encoder=show_secrets_encoder)
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ from dispatch.plugins.dispatch_slack.config import SlackConversationConfiguration
+
+ bind = op.get_bind()
+ session = Session(bind=bind)
+
+ instances = session.query(PluginInstance).all()
+
+ # Slash commands
+ SLACK_COMMAND_SUMMARY_SLUG = config("SLACK_COMMAND_SUMMARY_SLUG", default="/dispatch-summary")
+
+ try:
+ slack_conversation_config = SlackConversationConfiguration(
+ slack_command_summary=SLACK_COMMAND_SUMMARY_SLUG,
+ )
+
+ migrate_config(instances, "slack-conversation", slack_conversation_config)
+
+ except ValidationError:
+ print(
+ "Skipping automatic migration of the Slack plugin configuration; "
+ "if you are using the Slack plugin, migrate the configuration manually."
+ )
+
+ session.commit()
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ pass
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2025-07-12_6e66b6578810.py b/src/dispatch/database/revisions/tenant/versions/2025-07-12_6e66b6578810.py
new file mode 100644
index 000000000000..282344c0e482
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2025-07-12_6e66b6578810.py
@@ -0,0 +1,37 @@
+"""Adds disable delayed message warning column to case_priority table
+
+Revision ID: 6e66b6578810
+Revises: aa87efd3d6c1
+Create Date: 2025-07-11 12:26:16.155438
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "6e66b6578810"
+down_revision = "aa87efd3d6c1"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column(
+ "case_priority",
+ sa.Column(
+ "disable_delayed_message_warning",
+ sa.Boolean(),
+ nullable=True,
+ server_default=sa.text("false"),
+ ),
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_column("case_priority", "disable_delayed_message_warning")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2025-07-14_df10accae9a9.py b/src/dispatch/database/revisions/tenant/versions/2025-07-14_df10accae9a9.py
new file mode 100644
index 000000000000..61b4e49f2eb5
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2025-07-14_df10accae9a9.py
@@ -0,0 +1,42 @@
+"""Add case notes model
+
+Revision ID: df10accae9a9
+Revises: 6e66b6578810
+Create Date: 2025-07-11 12:59:07.633861
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "df10accae9a9"
+down_revision = "6e66b6578810"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ # Create case_notes table
+ op.create_table(
+ "case_notes",
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("content", sa.String(), nullable=True),
+ sa.Column("case_id", sa.Integer(), nullable=True),
+ sa.Column("last_updated_by_id", sa.Integer(), nullable=True),
+ sa.Column("created_at", sa.DateTime(), nullable=True),
+ sa.Column("updated_at", sa.DateTime(), nullable=True),
+ sa.ForeignKeyConstraint(["case_id"], ["case.id"], ondelete="CASCADE"),
+ sa.ForeignKeyConstraint(["last_updated_by_id"], ["individual_contact.id"]),
+ sa.PrimaryKeyConstraint("id"),
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ # Drop case_notes table
+ op.drop_table("case_notes")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2025-07-16_408118048599.py b/src/dispatch/database/revisions/tenant/versions/2025-07-16_408118048599.py
new file mode 100644
index 000000000000..31fb19943b10
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2025-07-16_408118048599.py
@@ -0,0 +1,51 @@
+"""Adds GenAI prompt table.
+
+Revision ID: 408118048599
+Revises: df10accae9a9
+Create Date: 2025-07-16 16:39:33.957649
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "408118048599"
+down_revision = "df10accae9a9"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.create_table(
+ "prompt",
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("genai_type", sa.Integer(), nullable=False),
+ sa.Column("genai_prompt", sa.String(), nullable=False),
+ sa.Column("genai_system_message", sa.String(), nullable=True),
+ sa.Column("enabled", sa.Boolean(), nullable=False),
+ sa.Column("created_at", sa.DateTime(), nullable=True),
+ sa.Column("updated_at", sa.DateTime(), nullable=True),
+ sa.Column("project_id", sa.Integer(), nullable=True),
+ sa.ForeignKeyConstraint(["project_id"], ["project.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("id"),
+ )
+
+ # Add unique constraint to ensure only one enabled prompt per type per project
+ op.create_unique_constraint(
+ "uq_prompt_type_project_enabled",
+ "prompt",
+ ["genai_type", "project_id", "enabled"],
+ deferrable=True,
+ initially="DEFERRED",
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint("uq_prompt_type_project_enabled", "prompt", type_="unique")
+ op.drop_table("prompt")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2025-08-01_4649b11b683f.py b/src/dispatch/database/revisions/tenant/versions/2025-08-01_4649b11b683f.py
new file mode 100644
index 000000000000..a44a9fbd1d41
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2025-08-01_4649b11b683f.py
@@ -0,0 +1,33 @@
+"""Add resolved_by to case table
+
+Revision ID: 4649b11b683f
+Revises: 408118048599
+Create Date: 2025-08-01 14:11:04.276577
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "4649b11b683f"
+down_revision = "408118048599"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.add_column("case", sa.Column("resolved_by_id", sa.Integer(), nullable=True))
+ op.create_foreign_key(
+ "fk_case_resolved_by_id", "case", "individual_contact", ["resolved_by_id"], ["id"]
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_constraint("fk_case_resolved_by_id", "case", type_="foreignkey")
+ op.drop_column("case", "resolved_by_id")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/revisions/tenant/versions/2025-08-06_f2bce475e71b.py b/src/dispatch/database/revisions/tenant/versions/2025-08-06_f2bce475e71b.py
new file mode 100644
index 000000000000..de8e663235ee
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2025-08-06_f2bce475e71b.py
@@ -0,0 +1,23 @@
+"""Adds support for configuring security event suggestions in the project model.
+
+Revision ID: f2bce475e71b
+Revises: 4649b11b683f
+Create Date: 2025-08-06 09:48:16.045362
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision = "f2bce475e71b"
+down_revision = "4649b11b683f"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ op.add_column("project", sa.Column("suggest_security_event_over_incident", sa.Boolean(), server_default="f", nullable=True))
+
+
+def downgrade():
+ op.drop_column("project", "suggest_security_event_over_incident")
diff --git a/src/dispatch/database/revisions/tenant/versions/2025-08-28_ff08d822ef2c.py b/src/dispatch/database/revisions/tenant/versions/2025-08-28_ff08d822ef2c.py
new file mode 100644
index 000000000000..9e1aa5e06a64
--- /dev/null
+++ b/src/dispatch/database/revisions/tenant/versions/2025-08-28_ff08d822ef2c.py
@@ -0,0 +1,42 @@
+"""Create canvas table.
+
+Revision ID: ff08d822ef2c
+Revises: f2bce475e71b
+Create Date: 2025-08-28 15:33:37.139043
+
+"""
+
+from alembic import op
+import sqlalchemy as sa
+
+# revision identifiers, used by Alembic.
+revision = "ff08d822ef2c"
+down_revision = "f2bce475e71b"
+branch_labels = None
+depends_on = None
+
+
+def upgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.create_table(
+ "canvas",
+ sa.Column("id", sa.Integer(), nullable=False),
+ sa.Column("canvas_id", sa.String(), nullable=False),
+ sa.Column("incident_id", sa.Integer(), nullable=True),
+ sa.Column("case_id", sa.Integer(), nullable=True),
+ sa.Column("type", sa.String(), nullable=False),
+ sa.Column("created_at", sa.DateTime(), nullable=True),
+ sa.Column("updated_at", sa.DateTime(), nullable=True),
+ sa.Column("project_id", sa.Integer(), nullable=True),
+ sa.ForeignKeyConstraint(["case_id"], ["case.id"], ondelete="CASCADE"),
+ sa.ForeignKeyConstraint(["incident_id"], ["incident.id"], ondelete="CASCADE"),
+ sa.ForeignKeyConstraint(["project_id"], ["project.id"], ondelete="CASCADE"),
+ sa.PrimaryKeyConstraint("id"),
+ )
+ # ### end Alembic commands ###
+
+
+def downgrade():
+ # ### commands auto generated by Alembic - please adjust! ###
+ op.drop_table("canvas")
+ # ### end Alembic commands ###
diff --git a/src/dispatch/database/service.py b/src/dispatch/database/service.py
new file mode 100644
index 000000000000..32e34f85b739
--- /dev/null
+++ b/src/dispatch/database/service.py
@@ -0,0 +1,850 @@
+import logging
+import json
+from collections import namedtuple
+from collections.abc import Iterable
+from inspect import signature
+from itertools import chain
+
+from fastapi import Depends, Query
+from pydantic import StringConstraints
+from pydantic import Json
+from six import string_types
+from sortedcontainers import SortedSet
+from sqlalchemy import Table, and_, desc, func, not_, or_, orm, exists
+from sqlalchemy.exc import InvalidRequestError, ProgrammingError
+from sqlalchemy.orm import mapperlib, Query as SQLAlchemyQuery
+from sqlalchemy_filters import apply_pagination, apply_sort
+from sqlalchemy_filters.exceptions import BadFilterFormat, FieldNotFound
+from sqlalchemy_filters.models import Field, BadQuery, BadSpec
+
+from .core import Base, get_class_by_tablename, get_model_name_by_tablename
+from dispatch.auth.models import DispatchUser
+from dispatch.auth.service import CurrentUser, get_current_role
+from dispatch.case.models import Case
+from dispatch.data.query.models import Query as QueryModel
+from dispatch.data.source.models import Source
+from dispatch.database.core import DbSession
+from dispatch.enums import UserRoles, Visibility
+from dispatch.feedback.incident.models import Feedback
+from dispatch.incident.models import Incident
+from dispatch.incident.type.models import IncidentType
+from dispatch.individual.models import IndividualContact
+from dispatch.participant.models import Participant
+from dispatch.plugin.models import Plugin, PluginInstance
+from dispatch.project.models import Project
+from dispatch.search.fulltext.composite_search import CompositeSearch
+from dispatch.signal.models import Signal, SignalInstance
+from dispatch.tag.models import Tag
+
+from dispatch.task.models import Task
+from typing import Annotated
+
+log = logging.getLogger(__name__)
+
+# allows only printable characters
+QueryStr = Annotated[str, StringConstraints(pattern=r"^[ -~]+$", min_length=1)]
+
+BooleanFunction = namedtuple("BooleanFunction", ("key", "sqlalchemy_fn", "only_one_arg"))
+BOOLEAN_FUNCTIONS = [
+ BooleanFunction("or", or_, False),
+ BooleanFunction("and", and_, False),
+ BooleanFunction("not", not_, True),
+]
+
+
+class Operator(object):
+ OPERATORS = {
+ "is_null": lambda f: f.is_(None),
+ "is_not_null": lambda f: f.isnot(None),
+ "==": lambda f, a: f == a,
+ "eq": lambda f, a: f == a,
+ "!=": lambda f, a: f != a,
+ "ne": lambda f, a: f != a,
+ ">": lambda f, a: f > a,
+ "gt": lambda f, a: f > a,
+ "<": lambda f, a: f < a,
+ "lt": lambda f, a: f < a,
+ ">=": lambda f, a: f >= a,
+ "ge": lambda f, a: f >= a,
+ "<=": lambda f, a: f <= a,
+ "le": lambda f, a: f <= a,
+ "like": lambda f, a: f.like(a),
+ "ilike": lambda f, a: f.ilike(a),
+ "not_ilike": lambda f, a: ~f.ilike(a),
+ "in": lambda f, a: f.in_(a),
+ "not_in": lambda f, a: ~f.in_(a),
+ "any": lambda f, a: f.any(a),
+ "not_any": lambda f, a: func.not_(f.any(a)),
+ }
+
+ def __init__(self, operator=None):
+ if not operator:
+ operator = "=="
+
+ if operator not in self.OPERATORS:
+ raise BadFilterFormat("Operator `{}` not valid.".format(operator))
+
+ self.operator = operator
+ self.function = self.OPERATORS[operator]
+ self.arity = len(signature(self.function).parameters)
+
+
+class Filter(object):
+ def __init__(self, filter_spec):
+ self.filter_spec = filter_spec
+
+ try:
+ filter_spec["field"]
+ except KeyError:
+ raise BadFilterFormat("`field` is a mandatory filter attribute.") from None
+ except TypeError:
+ raise BadFilterFormat(
+ "Filter spec `{}` should be a dictionary.".format(filter_spec)
+ ) from None
+
+ self.operator = Operator(filter_spec.get("op"))
+ self.value = filter_spec.get("value")
+ value_present = "value" in filter_spec
+ if not value_present and self.operator.arity == 2:
+ raise BadFilterFormat("`value` must be provided.")
+
+ def get_named_models(self):
+ if "model" in self.filter_spec:
+ model = self.filter_spec["model"]
+ if model in ["Participant", "Commander", "Assignee"]:
+ return {"IndividualContact"}
+ if model == "TagAll":
+ return {"Tag"}
+ if model == "NotCaseType":
+ return {"CaseType"}
+ else:
+ return {self.filter_spec["model"]}
+ return set()
+
+ def format_for_sqlalchemy(self, query: SQLAlchemyQuery, default_model):
+ filter_spec = self.filter_spec
+ if filter_spec.get("model") in ["Participant", "Commander", "Assignee"]:
+ filter_spec["model"] = "IndividualContact"
+ elif filter_spec.get("model") == "NotCaseType":
+ filter_spec["model"] = "CaseType"
+ elif filter_spec.get("model") == "TagAll":
+ filter_spec["model"] = "Tag"
+
+ operator = self.operator
+ value = self.value
+
+ # Special handling for TagType.id filtering on Tag model
+ # Needed since TagType.id is not a column on the Tag model
+ # Convert TagType.id filter to tag_type_id filter on Tag model
+ if (
+ filter_spec.get("model") == "TagType"
+ and filter_spec.get("field") == "id"
+ and default_model
+ and getattr(default_model, "__tablename__", None) == "tag"
+ ):
+ filter_spec = {"model": "Tag", "field": "tag_type_id", "op": filter_spec.get("op")}
+
+ model = get_model_from_spec(filter_spec, query, default_model)
+
+ function = operator.function
+ arity = operator.arity
+
+ field_name = filter_spec["field"]
+ field = Field(model, field_name)
+ sqlalchemy_field = field.get_sqlalchemy_field()
+
+ if arity == 1:
+ return function(sqlalchemy_field)
+
+ if arity == 2:
+ return function(sqlalchemy_field, value)
+
+
+def get_model_from_spec(spec, query, default_model=None):
+ """Determine the model to which a spec applies on a given query.
+ A spec that does not specify a model may be applied to a query that
+ contains a single model. Otherwise the spec must specify the model to
+ which it applies, and that model must be present in the query.
+ :param query:
+ A :class:`sqlalchemy.orm.Query` instance.
+ :param spec:
+ A dictionary that may or may not contain a model name to resolve
+ against the query.
+ :returns:
+ A model instance.
+ :raise BadSpec:
+ If the spec is ambiguous or refers to a model not in the query.
+ :raise BadQuery:
+ If the query contains no models.
+ """
+ models = get_query_models(query)
+ if not models:
+ raise BadQuery("The query does not contain any models.")
+
+ model_name = spec.get("model")
+ if model_name is not None:
+ models = [v for (k, v) in models.items() if k == model_name]
+ if not models:
+ raise BadSpec(f"The query does not contain model `{model_name}`.")
+ model = models[0]
+ else:
+ if len(models) == 1:
+ model = list(models.values())[0]
+ elif default_model is not None:
+ return default_model
+ else:
+ raise BadSpec("Ambiguous spec. Please specify a model.")
+
+ return model
+
+
+class BooleanFilter(object):
+ def __init__(self, function, *filters):
+ self.function = function
+ self.filters = filters
+
+ def get_named_models(self):
+ models = SortedSet()
+ for filter in self.filters:
+ named_models = filter.get_named_models()
+ if named_models:
+ models.update(named_models)
+ return models
+
+ def format_for_sqlalchemy(self, query: SQLAlchemyQuery, default_model):
+ return self.function(
+ *[filter.format_for_sqlalchemy(query, default_model) for filter in self.filters]
+ )
+
+
+def _is_iterable_filter(filter_spec):
+ """`filter_spec` may be a list of nested filter specs, or a dict."""
+ return isinstance(filter_spec, Iterable) and not isinstance(filter_spec, (string_types, dict))
+
+
+def build_filters(filter_spec):
+ """Recursively process `filter_spec`"""
+ if _is_iterable_filter(filter_spec):
+ return list(chain.from_iterable(build_filters(item) for item in filter_spec))
+
+ if isinstance(filter_spec, dict):
+ # Check if filter spec defines a boolean function.
+ for boolean_function in BOOLEAN_FUNCTIONS:
+ if boolean_function.key in filter_spec:
+ # The filter spec is for a boolean-function
+ # Get the function argument definitions and validate
+ fn_args = filter_spec[boolean_function.key]
+
+ if not _is_iterable_filter(fn_args):
+ raise BadFilterFormat(
+ "`{}` value must be an iterable across the function arguments".format(
+ boolean_function.key
+ )
+ )
+ if boolean_function.only_one_arg and len(fn_args) != 1:
+ raise BadFilterFormat(
+ "`{}` must have one argument".format(boolean_function.key)
+ )
+ if not boolean_function.only_one_arg and len(fn_args) < 1:
+ raise BadFilterFormat(
+ "`{}` must have one or more arguments".format(boolean_function.key)
+ )
+ return [BooleanFilter(boolean_function.sqlalchemy_fn, *build_filters(fn_args))]
+
+ return [Filter(filter_spec)]
+
+
+def get_model_from_table(table: Table): # pragma: nocover
+ """Resolve model class from table object"""
+
+ for registry in mapperlib._all_registries():
+ for mapper in registry.mappers:
+ if table in mapper.tables:
+ return mapper.class_
+ return None
+
+
+def get_query_models(query):
+ """Get models from query.
+
+ :param query:
+ A :class:`sqlalchemy.orm.Query` instance.
+
+ :returns:
+ A dictionary with all the models included in the query.
+ """
+ models = [col_desc["entity"] for col_desc in query.column_descriptions]
+
+ # In SQLAlchemy 2.x, we need to use a different approach to get joined entities
+ try:
+ # Try to get the statement from the query
+ stmt = query.statement
+
+ # Extract entities from the statement's froms
+ for from_obj in stmt.froms:
+ if hasattr(from_obj, "entity"):
+ # For select statements with an entity
+ if from_obj.entity not in models:
+ models.append(from_obj.entity)
+ elif hasattr(from_obj, "left") and hasattr(from_obj, "right"):
+ # For join objects
+ for side in [from_obj.left, from_obj.right]:
+ if hasattr(side, "entity"):
+ if side.entity not in models:
+ models.append(side.entity)
+ elif hasattr(side, "table"):
+ model_class = get_model_from_table(side.table)
+ if model_class and model_class not in models:
+ models.append(model_class)
+
+ # Try to extract joined entities from the query's _join_entities
+ # This is for SQLAlchemy 2.x's internal structure
+ if hasattr(query, "_compile_state"):
+ try:
+ compile_state = query._compile_state()
+ if hasattr(compile_state, "_join_entities"):
+ for mapper in compile_state._join_entities:
+ if hasattr(mapper, "class_"):
+ if mapper.class_ not in models:
+ models.append(mapper.class_)
+ except Exception:
+ pass
+ except (AttributeError, InvalidRequestError):
+ # If we can't get the statement or process it, fall back to simpler approach
+ pass
+
+ return {model.__name__: model for model in models}
+
+
+def get_model_class_by_name(registry, name):
+ """Return the model class matching `name` in the given `registry`."""
+ for cls in registry.values():
+ if getattr(cls, "__name__", None) == name:
+ return cls
+
+
+def get_named_models(filters):
+    """Collects the model names referenced by the given filters."""
+    models = []
+ for filter in filters:
+ models.extend(filter.get_named_models())
+ return models
+
+
+def get_default_model(query):
+ """Return the singular model from `query`, or `None` if `query` contains
+ multiple models.
+ """
+ query_models = get_query_models(query).values()
+ if len(query_models) == 1:
+ (default_model,) = iter(query_models)
+ else:
+ default_model = None
+ return default_model
+
+
+def auto_join(query, *model_names):
+ """Automatically join models to `query` if they're not already present
+ and the join can be done implicitly.
+ """
+ # every model has access to the registry, so we can use any from the query
+ query_models = get_query_models(query).values()
+ last_model = list(query_models)[-1]
+ model_registry = last_model.registry._class_registry
+
+ for name in model_names:
+ model = get_model_class_by_name(model_registry, name)
+ if model and (model not in get_query_models(query).values()):
+ try: # pragma: nocover
+ # https://docs.sqlalchemy.org/en/14/changelog/migration_14.html
+ # Many Core and ORM statement objects now perform much of
+ # their construction and validation in the compile phase
+ tmp = query.join(model)
+ tmp._compile_state()
+ query = tmp
+ except InvalidRequestError:
+ pass # can't be autojoined
+ return query
+
+
+def apply_model_specific_filters(
+ model: Base, query: orm.Query, current_user: DispatchUser, role: UserRoles
+):
+ """Applies any model specific filter as it pertains to the given user."""
+ model_map = {
+ Incident: [restricted_incident_filter],
+ Case: [restricted_case_filter],
+ # IncidentType: [restricted_incident_type_filter],
+ }
+
+ filters = model_map.get(model, [])
+
+ for f in filters:
+ query = f(query, current_user, role)
+
+ return query
+
+
+def apply_filters(query, filter_spec, model_cls=None, do_auto_join=True):
+ """Apply filters to a SQLAlchemy query.
+
+ :param query:
+ A :class:`sqlalchemy.orm.Query` instance.
+
+ :param filter_spec:
+ A dict or an iterable of dicts, where each one includes
+ the necessary information to create a filter to be applied to the
+ query.
+
+ Example::
+
+ filter_spec = [
+ {'model': 'Foo', 'field': 'name', 'op': '==', 'value': 'foo'},
+ ]
+
+ If the query being modified refers to a single model, the `model` key
+ may be omitted from the filter spec.
+
+ Filters may be combined using boolean functions.
+
+ Example:
+
+ filter_spec = {
+ 'or': [
+ {
+ 'model': 'Foo',
+ 'field': 'id',
+ 'op': '==',
+ 'value': '1'
+ },
+ {
+ 'model': 'Bar',
+ 'field': 'id',
+ 'op': '==',
+ 'value': '2'
+ },
+ ]
+ }
+
+ :returns:
+ The :class:`sqlalchemy.orm.Query` instance after all the filters
+ have been applied.
+ """
+ default_model = get_default_model(query)
+ if not default_model:
+ default_model = model_cls
+
+ filters = build_filters(filter_spec)
+ filter_models = get_named_models(filters)
+
+ if do_auto_join:
+ query = auto_join(query, *filter_models)
+
+ sqlalchemy_filters = [filter.format_for_sqlalchemy(query, default_model) for filter in filters]
+
+ if sqlalchemy_filters:
+ query = query.filter(*sqlalchemy_filters)
+
+ return query
+
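The nested boolean `filter_spec` grammar that `build_filters`/`apply_filters` accept can be illustrated with a small standalone walker. This is a pure-Python sketch for illustration only (`walk_filter_spec` is a hypothetical helper, not part of Dispatch, and it omits the validation that `build_filters` performs):

```python
# Minimal sketch of how a nested filter spec is flattened into leaf
# filter dicts. Boolean keys ("and", "or", "not") group child specs;
# anything else is treated as a leaf filter.
BOOLEAN_FUNCTIONS = {"and", "or", "not"}

def walk_filter_spec(filter_spec):
    """Flatten a nested filter spec into a list of leaf filter dicts."""
    if isinstance(filter_spec, (list, tuple)):
        leaves = []
        for item in filter_spec:
            leaves.extend(walk_filter_spec(item))
        return leaves
    for key in BOOLEAN_FUNCTIONS & set(filter_spec):
        return walk_filter_spec(filter_spec[key])
    return [filter_spec]

spec = {
    "or": [
        {"model": "Foo", "field": "id", "op": "==", "value": "1"},
        {"model": "Bar", "field": "id", "op": "==", "value": "2"},
    ]
}
```

The real implementation additionally wraps each boolean group in a `BooleanFilter` carrying the matching SQLAlchemy conjunction.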
+
+def apply_filter_specific_joins(model: Base, filter_spec: dict, query: orm.query):
+ """Applies any model specific implicitly joins."""
+ # this is required because by default sqlalchemy-filter's auto-join
+ # knows nothing about how to join many-many relationships.
+ model_map = {
+ (Feedback, "Incident"): (Incident, False),
+ (Feedback, "Case"): (Case, False),
+ (Task, "Project"): (Incident, False),
+ (Task, "Incident"): (Incident, False),
+ (Task, "IncidentPriority"): (Incident, False),
+ (Task, "IncidentType"): (Incident, False),
+ (PluginInstance, "Plugin"): (Plugin, False),
+ (Source, "Tag"): (Source.tags, True),
+ (Source, "TagType"): (Source.tags, True),
+ (QueryModel, "Tag"): (QueryModel.tags, True),
+ (QueryModel, "TagType"): (QueryModel.tags, True),
+ (DispatchUser, "Organization"): (DispatchUser.organizations, True),
+ (Case, "Tag"): (Case.tags, True),
+ (Case, "TagType"): (Case.tags, True),
+ (Case, "IndividualContact"): (Case.participants, True),
+ (Incident, "Tag"): (Incident.tags, True),
+ (Incident, "TagType"): (Incident.tags, True),
+ (Incident, "IndividualContact"): (Incident.participants, True),
+ (Incident, "Term"): (Incident.terms, True),
+ (Signal, "Tag"): (Signal.tags, True),
+ (Signal, "TagType"): (Signal.tags, True),
+ (SignalInstance, "Entity"): (SignalInstance.entities, True),
+ (SignalInstance, "EntityType"): (SignalInstance.entities, True),
+ # (Tag, "TagType"): (TagType, False), # Disabled: filtering by tag_type_id directly
+ (Tag, "Project"): (Project, False),
+ (IndividualContact, "Project"): (Project, False),
+ }
+ filters = build_filters(filter_spec)
+
+ # Replace mapping if looking for commander
+ if "Commander" in str(filter_spec):
+ model_map.update({(Incident, "IndividualContact"): (Incident.commander, True)})
+ if "Assignee" in str(filter_spec):
+ model_map.update({(Case, "IndividualContact"): (Case.assignee, True)})
+
+ filter_models = get_named_models(filters)
+ joined_models = []
+ for filter_model in filter_models:
+ if model_map.get((model, filter_model)):
+ joined_model, is_outer = model_map[(model, filter_model)]
+ try:
+ # Use the model or table itself for tracking joins
+ model_or_table = getattr(joined_model, "parent", joined_model)
+ if model_or_table not in joined_models:
+ query = query.join(joined_model, isouter=is_outer)
+ joined_models.append(model_or_table)
+ except Exception as e:
+ log.exception(e)
+
+ return query
+
+
+def composite_search(*, db_session, query_str: str, models: list[Base], current_user: DispatchUser):
+ """Perform a multi-table search based on the supplied query."""
+ s = CompositeSearch(db_session, models)
+ query = s.build_query(query_str, sort=True)
+
+ # TODO can we do this with composite filtering?
+ # for model in models:
+ # query = apply_model_specific_filters(model, query, current_user)
+
+ return s.search(query=query)
+
+
+def search(*, query_str: str, query: Query, model: str, sort=False):
+ """Perform a search based on the query."""
+ search_model = get_class_by_tablename(model)
+
+ if not query_str.strip():
+ return query
+
+ search = []
+ if hasattr(search_model, "search_vector"):
+ vector = search_model.search_vector
+ search.append(vector.op("@@")(func.tsq_parse(query_str)))
+
+ if hasattr(search_model, "name"):
+ search.append(
+ search_model.name.ilike(f"%{query_str}%"),
+ )
+ search.append(search_model.name == query_str)
+
+ if not search:
+ raise Exception(f"Search not supported for model: {model}")
+
+ query = query.filter(or_(*search))
+
+    # only rank results when the model actually has a search vector;
+    # otherwise `vector` is never assigned and sorting would raise a NameError
+    if sort and hasattr(search_model, "search_vector"):
+        query = query.order_by(desc(func.ts_rank_cd(vector, func.tsq_parse(query_str))))
+
+ return query.params(term=query_str)
+
+
+def create_sort_spec(model, sort_by, descending):
+ """Creates sort_spec."""
+ sort_spec = []
+ if sort_by and descending:
+ for field, direction in zip(sort_by, descending, strict=False):
+ direction = "desc" if direction else "asc"
+
+ # check to see if field is json with a key parameter
+ try:
+ new_field = json.loads(field)
+ field = new_field.get("key", "")
+ except json.JSONDecodeError:
+ pass
+
+ # we have a complex field, we may need to join
+ if "." in field:
+ complex_model, complex_field = field.split(".")[-2:]
+
+ sort_spec.append(
+ {
+ "model": get_model_name_by_tablename(complex_model),
+ "field": complex_field,
+ "direction": direction,
+ }
+ )
+ else:
+ sort_spec.append({"model": model, "field": field, "direction": direction})
+ log.debug(f"Sort Spec: {json.dumps(sort_spec, indent=2)}")
+ return sort_spec
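The field handling inside `create_sort_spec` can be shown in isolation. This is an illustrative standalone sketch (`parse_sort_field` is hypothetical); it skips the `get_model_name_by_tablename` lookup the real code performs for dotted fields:

```python
import json

def parse_sort_field(field, default_model):
    """Sketch of create_sort_spec's field handling: JSON-encoded fields
    contribute their 'key', and dotted fields sort on a joined model's
    column; everything else sorts on the default model."""
    try:
        # non-JSON input raises JSONDecodeError; JSON scalars raise
        # AttributeError on .get() -- either way, keep the field as-is
        field = json.loads(field).get("key", "")
    except (json.JSONDecodeError, AttributeError):
        pass
    if "." in field:
        model, column = field.split(".")[-2:]
        return {"model": model, "field": column}
    return {"model": default_model, "field": field}
```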
+
+
+def get_all(*, db_session, model):
+ """Fetches a query object based on the model class name."""
+ return db_session.query(get_class_by_tablename(model))
+
+
+def common_parameters(
+ current_user: CurrentUser,
+ db_session: DbSession,
+ page: int = Query(1, gt=0, lt=2147483647),
+ items_per_page: int = Query(5, alias="itemsPerPage", gt=-2, lt=2147483647),
+ query_str: QueryStr = Query(None, alias="q"),
+ filter_spec: QueryStr = Query(None, alias="filter"),
+ sort_by: list[str] = Query([], alias="sortBy[]"),
+ descending: list[bool] = Query([], alias="descending[]"),
+ role: UserRoles = Depends(get_current_role),
+ security_event_only: bool = Query(None, alias="security_event_only"),
+):
+ return {
+ "db_session": db_session,
+ "page": page,
+ "items_per_page": items_per_page,
+ "query_str": query_str,
+ "filter_spec": filter_spec,
+ "sort_by": sort_by,
+ "descending": descending,
+ "current_user": current_user,
+ "role": role,
+ "security_event_only": security_event_only,
+ }
+
+
+CommonParameters = Annotated[
+ dict[
+ str,
+ int | CurrentUser | DbSession | QueryStr | Json | list[str] | list[bool] | UserRoles | bool,
+ ],
+ Depends(common_parameters),
+]
+
+
+def has_filter_model(model: str, filter_spec: dict | list[dict]):
+    """Checks whether the filter spec contains a filter for the given model."""
+
+ if isinstance(filter_spec, list):
+ return False
+
+ for key, value in filter_spec.items():
+ if key == "and":
+ for condition in value:
+ or_condition = condition.get("or", [])
+ if or_condition and or_condition[0].get("model") == model:
+ return True
+ return False
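The shape this check matches — a top-level `"and"` whose `"or"` groups open with the given model name — can be demonstrated with a standalone copy of the same logic (pure Python, names illustrative):

```python
def spec_filters_on_model(filter_spec, model):
    """Standalone copy of the has_filter_model check: returns True when a
    top-level "and" contains an "or" group whose first condition targets
    the given model."""
    if isinstance(filter_spec, list):
        return False
    for key, value in filter_spec.items():
        if key == "and":
            for condition in value:
                or_condition = condition.get("or", [])
                if or_condition and or_condition[0].get("model") == model:
                    return True
    return False
```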
+
+
+def has_tag_all(filter_spec: list[dict]):
+ return has_filter_model("TagAll", filter_spec)
+
+
+def has_not_case_type(filter_spec: list[dict]):
+ return has_filter_model("NotCaseType", filter_spec)
+
+
+def rebuild_filter_spec_for_not_case_type(filter_spec: dict):
+    """Rebuilds the filter spec, rewriting NotCaseType conditions as `!=` filters."""
+    new_filter_spec = []
+ for key, value in filter_spec.items():
+ if key == "and":
+ for condition in value:
+ or_condition = condition.get("or", [])
+ if or_condition and or_condition[0].get("model") == "NotCaseType":
+ for cond in or_condition:
+ cond["op"] = "!="
+ new_filter_spec.append({"and": [{"and": [cond]}]})
+ else:
+ new_filter_spec.append(condition)
+ return {"and": new_filter_spec}
+
+
+def rebuild_filter_spec_without_tag_all(filter_spec: dict):
+ """Rebuilds the filter spec without the TagAll filter."""
+ new_filter_spec = []
+ tag_all_spec = []
+ for key, value in filter_spec.items():
+ if key == "and":
+ for condition in value:
+ or_condition = condition.get("or", [])
+ if or_condition and or_condition[0].get("model") == "TagAll":
+ for cond in or_condition:
+ tag_all_spec.append({"and": [{"or": [cond]}]})
+ else:
+ new_filter_spec.append(condition)
+ return ({"and": new_filter_spec} if len(new_filter_spec) else None, tag_all_spec)
+
+
+def search_filter_sort_paginate(
+ db_session,
+ model,
+ query_str: str = None,
+ filter_spec: str | dict | None = None,
+ page: int = 1,
+ items_per_page: int = 5,
+ sort_by: list[str] = None,
+ descending: list[bool] = None,
+ current_user: DispatchUser = None,
+ role: UserRoles = UserRoles.member,
+ security_event_only: bool = None,
+):
+ """Common functionality for searching, filtering, sorting, and pagination."""
+ model_cls = get_class_by_tablename(model)
+
+ try:
+ query = db_session.query(model_cls)
+
+ if query_str:
+ sort = False if sort_by else True
+ query = search(query_str=query_str, query=query, model=model, sort=sort)
+
+ # Apply model-specific filters directly to the query to avoid intersect ordering issues
+ query = apply_model_specific_filters(model_cls, query, current_user, role)
+
+ tag_all_filters = []
+ if filter_spec:
+            # some functions pass filter_spec as a dictionary, such as auth/views.py get_users,
+            # but most come from the API as serialized JSON
+ if isinstance(filter_spec, str):
+ filter_spec = json.loads(filter_spec)
+ query = apply_filter_specific_joins(model_cls, filter_spec, query)
+ # if the filter_spec has the TagAll filter, we need to split the query up
+ # and intersect all of the results
+ if has_not_case_type(filter_spec):
+ new_filter_spec = rebuild_filter_spec_for_not_case_type(filter_spec)
+ if new_filter_spec:
+ query = apply_filters(query, new_filter_spec, model_cls)
+ if has_tag_all(filter_spec):
+ new_filter_spec, tag_all_spec = rebuild_filter_spec_without_tag_all(filter_spec)
+ if new_filter_spec:
+ query = apply_filters(query, new_filter_spec, model_cls)
+ for tag_filter in tag_all_spec:
+ tag_all_filters.append(apply_filters(query, tag_filter, model_cls))
+ else:
+ query = apply_filters(query, filter_spec, model_cls)
+
+ # Handle security_event_only filter for Case model
+ if model == "Case" and security_event_only:
+ # Use NOT EXISTS to find cases that do NOT have signal instances
+ query = query.filter(~exists().where(SignalInstance.case_id == Case.id))
+
+ # Apply tag_all filters using intersect only when necessary
+ for filter in tag_all_filters:
+ query = query.intersect(filter)
+
+ if sort_by:
+ sort_spec = create_sort_spec(model, sort_by, descending)
+ query = apply_sort(query, sort_spec)
+
+ except (FieldNotFound, BadFilterFormat) as e:
+ log.error(f"Error building or applying filters: {str(e)}")
+ raise e
+
+ if items_per_page == -1:
+ items_per_page = None
+
+    # sometimes we get bad input for the search function
+    # TODO: investigate moving to a different way of parsing queries that won't throw errors,
+    # e.g. websearch_to_tsquery
+    # https://www.postgresql.org/docs/current/textsearch-controls.html
+ try:
+ # Check if this model is likely to have duplicate results from many-to-many joins
+ # Models with many secondary relationships (like Tag) can cause count inflation
+ models_needing_distinct = ["Tag"] # Add other models here as needed
+
+ if model in models_needing_distinct and items_per_page is not None:
+ # Use custom pagination that handles DISTINCT properly
+ from collections import namedtuple
+
+ Pagination = namedtuple(
+ "Pagination", ["page_number", "page_size", "num_pages", "total_results"]
+ )
+
+ # Get total count using distinct ID to avoid duplicates
+ # Remove ORDER BY clause for counting since it's not needed and causes issues with DISTINCT
+ count_query = query.with_entities(model_cls.id).distinct().order_by(None)
+ total_count = count_query.count()
+
+ # Apply DISTINCT to the main query as well to avoid duplicate results
+ # Remove ORDER BY clause since it can conflict with DISTINCT when ordering by joined table columns
+ query = query.distinct().order_by(None)
+
+ # Apply pagination to the distinct query
+ offset = (page - 1) * items_per_page if page > 1 else 0
+ query = query.offset(offset).limit(items_per_page)
+
+ # Calculate number of pages
+ num_pages = (
+ (total_count + items_per_page - 1) // items_per_page if items_per_page > 0 else 1
+ )
+
+ pagination = Pagination(page, items_per_page, num_pages, total_count)
+ else:
+ # Use standard pagination for other models
+ query, pagination = apply_pagination(query, page_number=page, page_size=items_per_page)
+
+ except ProgrammingError as e:
+ log.debug(e)
+ return {
+ "items": [],
+ "itemsPerPage": items_per_page,
+ "page": page,
+ "total": 0,
+ }
+
+ return {
+ "items": query.all(),
+ "itemsPerPage": pagination.page_size,
+ "page": pagination.page_number,
+ "total": pagination.total_results,
+ }
+
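The arithmetic in the DISTINCT pagination branch above (manual offset/limit plus a ceiling-division page count) is easy to verify in isolation; `paginate_window` below is an illustrative standalone version, not part of Dispatch:

```python
def paginate_window(page, items_per_page, total_count):
    """Offset/limit window and page count for 1-indexed pages.
    num_pages uses ceiling division; items_per_page <= 0 collapses
    to a single page, mirroring the branch above."""
    offset = (page - 1) * items_per_page if page > 1 else 0
    num_pages = (
        (total_count + items_per_page - 1) // items_per_page if items_per_page > 0 else 1
    )
    return offset, num_pages
```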
+
+def restricted_incident_filter(query: orm.Query, current_user: DispatchUser, role: UserRoles):
+ """Adds additional incident filters to query (usually for permissions)."""
+ # Allow unrestricted access for admin roles
+ if role in [UserRoles.admin, UserRoles.owner, UserRoles.manager]:
+ return query.distinct()
+
+ # For all other roles (including member, none, and any unhandled roles),
+ # apply restrictive filtering - default deny approach
+ query = (
+ query.outerjoin(Participant, Incident.id == Participant.incident_id)
+ .outerjoin(IndividualContact, IndividualContact.id == Participant.individual_contact_id)
+ .filter(
+ or_(
+ Incident.visibility == Visibility.open,
+ and_(
+ Incident.visibility == Visibility.restricted,
+ IndividualContact.email == current_user.email,
+ ),
+ )
+ )
+ )
+ return query.distinct()
+
+
+def restricted_case_filter(query: orm.Query, current_user: DispatchUser, role: UserRoles):
+ """Adds additional case filters to query (usually for permissions)."""
+ # Allow unrestricted access for admin roles
+ if role in [UserRoles.admin, UserRoles.owner, UserRoles.manager]:
+ return query.distinct()
+
+ # For all other roles (including member, none, and any unhandled roles),
+ # apply restrictive filtering - default deny approach
+ query = (
+ query.outerjoin(Participant, Case.id == Participant.case_id)
+ .outerjoin(IndividualContact, IndividualContact.id == Participant.individual_contact_id)
+ .filter(
+ or_(
+ Case.visibility == Visibility.open,
+ and_(
+ Case.visibility == Visibility.restricted,
+ IndividualContact.email == current_user.email,
+ ),
+ )
+ )
+ )
+ return query.distinct()
+
+
+def restricted_incident_type_filter(query: orm.Query, current_user: DispatchUser):
+ """Adds additional incident type filters to query (usually for permissions)."""
+ if current_user:
+ query = query.filter(IncidentType.visibility == Visibility.open)
+ return query
diff --git a/src/dispatch/decorators.py b/src/dispatch/decorators.py
index b65106152aec..8b88e6b45bec 100644
--- a/src/dispatch/decorators.py
+++ b/src/dispatch/decorators.py
@@ -2,12 +2,14 @@
import logging
import time
from functools import wraps
-from typing import Any, List
+
+from sqlalchemy.orm import scoped_session
from dispatch.metrics import provider as metrics_provider
-from .extensions import sentry_sdk, configure_extensions
-from .database import SessionLocal
+from dispatch.organization import service as organization_service
+from dispatch.project import service as project_service
+from .database.core import engine, sessionmaker
log = logging.getLogger(__name__)
@@ -17,6 +19,72 @@ def fullname(o):
return f"{module.__name__}.{o.__qualname__}"
+def _execute_task_in_project_context(
+ func,
+ *args,
+ **kwargs,
+) -> None:
+ CoreSession = scoped_session(sessionmaker(bind=engine))
+ db_session = CoreSession()
+
+ metrics_provider.counter("function.call.counter", tags={"function": fullname(func)})
+ start = time.perf_counter()
+
+ try:
+ # iterate for all schema
+ for organization in organization_service.get_all(db_session=db_session):
+ schema_engine = engine.execution_options(
+ schema_translate_map={None: f"dispatch_organization_{organization.slug}"}
+ )
+ OrgSession = scoped_session(sessionmaker(bind=schema_engine))
+ schema_session = OrgSession()
+
+ try:
+ kwargs["db_session"] = schema_session
+ for project in project_service.get_all(db_session=schema_session):
+ kwargs["project"] = project
+ func(*args, **kwargs)
+ except Exception as e:
+ log.error(
+ f"Error trying to execute task: {fullname(func)} with parameters {args} and {kwargs}"
+ )
+ log.exception(e)
+ schema_session.rollback()
+ finally:
+ schema_session.close()
+ OrgSession.remove()
+
+ elapsed_time = time.perf_counter() - start
+ metrics_provider.timer(
+ "function.elapsed.time", value=elapsed_time, tags={"function": fullname(func)}
+ )
+ except Exception as e:
+ log.error(f"Error trying to execute task: {fullname(func)}")
+ log.exception(e)
+ db_session.rollback()
+ finally:
+ db_session.close()
+ CoreSession.remove()
+
+
+def scheduled_project_task(func):
+ """Decorator that sets up a background task function with
+ a database session and exception tracking.
+
+ Each task is executed in a specific project context.
+ """
+
+ @wraps(func)
+ def wrapper(*args, **kwargs):
+ _execute_task_in_project_context(
+ func,
+ *args,
+ **kwargs,
+ )
+
+ return wrapper
+
+
def background_task(func):
"""Decorator that sets up the a background task function
with a database session and exception tracking.
@@ -27,30 +95,45 @@ def background_task(func):
@wraps(func)
def wrapper(*args, **kwargs):
- db_session = SessionLocal()
- kwargs["db_session"] = db_session
+ session_owned = False
+ db_session = kwargs.get("db_session")
+
+ if not db_session:
+ if not kwargs.get("organization_slug"):
+ raise Exception("If no db_session is supplied organization slug must be provided.")
+
+ schema_engine = engine.execution_options(
+ schema_translate_map={
+ None: f"dispatch_organization_{kwargs['organization_slug']}",
+ }
+ )
+ session_factory = sessionmaker(bind=schema_engine)
+ db_session = session_factory()
+ session_owned = True
+ kwargs["db_session"] = db_session
+
try:
- metrics_provider.counter(f"function.call.counter", tags={"function": fullname(func)})
+ metrics_provider.counter("function.call.counter", tags={"function": fullname(func)})
start = time.perf_counter()
result = func(*args, **kwargs)
elapsed_time = time.perf_counter() - start
metrics_provider.timer(
- f"function.elapsed.time", value=elapsed_time, tags={"function": fullname(func)}
+ "function.elapsed.time", value=elapsed_time, tags={"function": fullname(func)}
)
return result
except Exception as e:
- import traceback
-
- configure_extensions()
- print(traceback.format_exc())
- sentry_sdk.capture_exception(e)
+ log.exception(e)
+ if session_owned:
+ db_session.rollback()
+ raise
finally:
- db_session.close()
+ if session_owned:
+ db_session.close()
return wrapper
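The ownership rule `background_task` now follows — only roll back and close a session the wrapper itself created — generalizes to any decorator that may receive or create a resource. A minimal pure-Python sketch of the pattern (the dict stands in for a DB session; all names are illustrative):

```python
import functools

def with_owned_resource(func):
    """Sketch of the session-ownership pattern: if the caller supplies a
    resource, leave its lifecycle to the caller; otherwise create one and
    guarantee cleanup in a finally block."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        owned = False
        resource = kwargs.get("resource")
        if resource is None:
            resource = {"closed": False}  # stand-in for a new DB session
            kwargs["resource"] = resource
            owned = True
        try:
            return func(*args, **kwargs)
        finally:
            if owned:
                resource["closed"] = True  # stand-in for db_session.close()
    return wrapper
```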
-def timer(func: Any):
+def timer(func):
"""Timing decorator that sends a timing metric."""
@wraps(func)
@@ -59,25 +142,26 @@ def wrapper(*args, **kwargs):
result = func(*args, **kwargs)
elapsed_time = time.perf_counter() - start
metrics_provider.timer(
- f"function.elapsed.time", value=elapsed_time, tags={"function": fullname(func)}
+ "function.elapsed.time", value=elapsed_time, tags={"function": fullname(func)}
)
+ log.debug(f"function.elapsed.time.{fullname(func)}: {elapsed_time}")
return result
return wrapper
-def counter(func: Any):
+def counter(func):
"""Counting decorator that sends a counting metric."""
@wraps(func)
def wrapper(*args, **kwargs):
- metrics_provider.counter(f"function.call.counter", tags={"function": fullname(func)})
+ metrics_provider.counter("function.call.counter", tags={"function": fullname(func)})
return func(*args, **kwargs)
return wrapper
-def apply(decorator: Any, exclude: List[str] = None):
+def apply(decorator, exclude: list[str] = None):
"""Class decorator that applies specified decorator to all class methods."""
if not exclude:
exclude = []
diff --git a/src/dispatch/definition/models.py b/src/dispatch/definition/models.py
index 62f5e4c893a5..17913f7999f2 100644
--- a/src/dispatch/definition/models.py
+++ b/src/dispatch/definition/models.py
@@ -1,47 +1,70 @@
-from typing import List, Optional
-
-from sqlalchemy import Column, Integer, String
+from sqlalchemy import Table, Column, Integer, String, ForeignKey, PrimaryKeyConstraint
from sqlalchemy.orm import relationship
+from sqlalchemy.sql.schema import UniqueConstraint
+
from sqlalchemy_utils import TSVectorType
-from dispatch.database import Base
-from dispatch.models import (
- DispatchBase,
- TermNested,
- TermReadNested,
- definition_teams,
- definition_terms,
+from dispatch.database.core import Base
+from dispatch.models import PrimaryKey, DispatchBase, ProjectMixin, Pagination
+from dispatch.project.models import ProjectRead
+
+# Association tables
+definition_teams = Table(
+ "definition_teams",
+ Base.metadata,
+ Column("definition_id", Integer, ForeignKey("definition.id")),
+ Column("team_contact_id", Integer, ForeignKey("team_contact.id")),
+ PrimaryKeyConstraint("definition_id", "team_contact_id"),
)
+definition_terms = Table(
+ "definition_terms",
+ Base.metadata,
+ Column("definition_id", Integer, ForeignKey("definition.id")),
+ Column("term_id", Integer, ForeignKey("term.id")),
+ PrimaryKeyConstraint("definition_id", "term_id"),
+)
-class Definition(Base):
+
+class Definition(Base, ProjectMixin):
+ __table_args__ = (UniqueConstraint("text", "project_id"),)
id = Column(Integer, primary_key=True)
- text = Column(String, unique=True)
+ text = Column(String)
source = Column(String, default="dispatch")
terms = relationship("Term", secondary=definition_terms, backref="definitions")
teams = relationship("TeamContact", secondary=definition_teams)
- search_vector = Column(TSVectorType("text"))
+ search_vector = Column(
+ TSVectorType(
+ "text",
+ regconfig="pg_catalog.simple",
+ )
+ )
+
+
+class DefinitionTerm(DispatchBase):
+ id: PrimaryKey | None = None
+ text: str | None = None
# Pydantic models...
class DefinitionBase(DispatchBase):
text: str
- source: Optional[str] = None
+ source: str | None = None
class DefinitionCreate(DefinitionBase):
- terms: Optional[List[TermNested]] = []
+ terms: list[DefinitionTerm | None] = []
+ project: ProjectRead
class DefinitionUpdate(DefinitionBase):
- terms: Optional[List[TermReadNested]] = []
+ terms: list[DefinitionTerm | None] = []
class DefinitionRead(DefinitionBase):
- id: int
- terms: Optional[List[TermReadNested]]
+ id: PrimaryKey
+ terms: list[DefinitionTerm | None]
-class DefinitionPagination(DispatchBase):
- total: int
- items: List[DefinitionRead] = []
+class DefinitionPagination(Pagination):
+ items: list[DefinitionRead] = []
diff --git a/src/dispatch/definition/service.py b/src/dispatch/definition/service.py
index 8fb25a8a9759..958d10ac3dbb 100644
--- a/src/dispatch/definition/service.py
+++ b/src/dispatch/definition/service.py
@@ -1,33 +1,44 @@
-from typing import List, Optional
-from fastapi.encoders import jsonable_encoder
-from .models import Definition, DefinitionCreate, DefinitionUpdate
+from dispatch.project import service as project_service
from dispatch.term import service as term_service
+from .models import Definition, DefinitionCreate, DefinitionUpdate
+
-def get(*, db_session, definition_id: int) -> Optional[Definition]:
+def get(*, db_session, definition_id: int) -> Definition | None:
+ """Gets a definition by its id."""
return db_session.query(Definition).filter(Definition.id == definition_id).first()
-def get_by_text(*, db_session, text: str) -> Optional[Definition]:
+def get_by_text(*, db_session, text: str) -> Definition | None:
+ """Gets a definition by its text."""
return db_session.query(Definition).filter(Definition.text == text).first()
-def get_all(*, db_session) -> List[Optional[Definition]]:
+def get_all(*, db_session) -> list[Definition | None]:
+ """Gets all definitions."""
return db_session.query(Definition)
def create(*, db_session, definition_in: DefinitionCreate) -> Definition:
+ """Creates a new definition."""
terms = [
term_service.get_or_create(db_session=db_session, term_in=t) for t in definition_in.terms
]
- definition = Definition(**definition_in.dict(exclude={"terms"}), terms=terms)
+
+ project = project_service.get_by_name_or_raise(
+ db_session=db_session, project_in=definition_in.project
+ )
+ definition = Definition(
+ **definition_in.dict(exclude={"terms", "project"}), project=project, terms=terms
+ )
db_session.add(definition)
db_session.commit()
return definition
-def create_all(*, db_session, definitions_in: List[DefinitionCreate]) -> List[Definition]:
+def create_all(*, db_session, definitions_in: list[DefinitionCreate]) -> list[Definition]:
+ """Creates a definitions in bulk."""
definitions = [Definition(text=d.text) for d in definitions_in]
     db_session.bulk_save_objects(definitions)
db_session.commit()
@@ -36,24 +47,26 @@ def create_all(*, db_session, definitions_in: List[DefinitionCreate]) -> List[De
def update(*, db_session, definition: Definition, definition_in: DefinitionUpdate) -> Definition:
- definition_data = jsonable_encoder(definition)
+ """Updates a definition."""
+ definition_data = definition.dict()
terms = [
term_service.get_or_create(db_session=db_session, term_in=t) for t in definition_in.terms
]
- update_data = definition_in.dict(skip_defaults=True, exclude={"terms"})
+ update_data = definition_in.dict(exclude_unset=True, exclude={"terms"})
for field in definition_data:
if field in update_data:
setattr(definition, field, update_data[field])
definition.terms = terms
- db_session.add(definition)
+
db_session.commit()
return definition
def delete(*, db_session, definition_id: int):
+ """Deletes a definition."""
definition = db_session.query(Definition).filter(Definition.id == definition_id).first()
definition.terms = []
db_session.delete(definition)
@@ -61,6 +74,7 @@ def delete(*, db_session, definition_id: int):
def upsert(*, db_session, definition_in: DefinitionCreate) -> Definition:
+ """Gets or creates a new definition."""
# we only care about unique columns
q = db_session.query(Definition).filter(Definition.text == definition_in.text)
instance = q.first()
diff --git a/src/dispatch/definition/views.py b/src/dispatch/definition/views.py
index 8431cadcc974..9831d5fffe41 100644
--- a/src/dispatch/definition/views.py
+++ b/src/dispatch/definition/views.py
@@ -1,83 +1,77 @@
-from fastapi import APIRouter, Depends, HTTPException
-from sqlalchemy.orm import Session
+from fastapi import APIRouter, HTTPException, status
+from pydantic import ValidationError
-from dispatch.database import get_db, paginate
-from dispatch.search.service import search
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.models import PrimaryKey
from .models import (
- Definition,
DefinitionCreate,
DefinitionPagination,
DefinitionRead,
DefinitionUpdate,
)
-from .service import create, delete, get, get_all, get_by_text, update
+from .service import create, delete, get, get_by_text, update
router = APIRouter()
-@router.get("/", response_model=DefinitionPagination)
-def get_definitions(
- db_session: Session = Depends(get_db), page: int = 1, itemsPerPage: int = 5, q: str = None
-):
- """
- Get all definitions.
- """
- if q:
- query = search(db_session=db_session, query_str=q, model=Definition)
- else:
- query = get_all(db_session=db_session)
-
- items, total = paginate(query=query, page=page, items_per_page=itemsPerPage)
- return {"items": items, "total": total}
+@router.get("", response_model=DefinitionPagination)
+def get_definitions(common: CommonParameters):
+ """Get all definitions."""
+ return search_filter_sort_paginate(model="Definition", **common)
@router.get("/{definition_id}", response_model=DefinitionRead)
-def get_definition(*, db_session: Session = Depends(get_db), definition_id: int):
- """
- Update a definition.
- """
+def get_definition(db_session: DbSession, definition_id: PrimaryKey):
+ """Get a definition."""
definition = get(db_session=db_session, definition_id=definition_id)
if not definition:
- raise HTTPException(status_code=404, detail="The definition with this id does not exist.")
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A definition with this id does not exist."}],
+ )
return definition
-@router.post("/", response_model=DefinitionRead)
-def create_definition(*, db_session: Session = Depends(get_db), definition_in: DefinitionCreate):
- """
- Create a new definition.
- """
+@router.post("", response_model=DefinitionRead)
+def create_definition(db_session: DbSession, definition_in: DefinitionCreate):
+ """Create a new definition."""
definition = get_by_text(db_session=db_session, text=definition_in.text)
if definition:
- raise HTTPException(
- status_code=400,
- detail=f"The description with this text ({definition_in.text}) already exists.",
- )
- definition = create(db_session=db_session, definition_in=definition_in)
- return definition
+ raise ValidationError([
+ {
+ "msg": "A description with this text already exists.",
+ "loc": "text",
+ }
+ ])
+
+ return create(db_session=db_session, definition_in=definition_in)
@router.put("/{definition_id}", response_model=DefinitionRead)
def update_definition(
- *, db_session: Session = Depends(get_db), definition_id: int, definition_in: DefinitionUpdate
+ db_session: DbSession,
+ definition_id: PrimaryKey,
+ definition_in: DefinitionUpdate,
):
- """
- Update a definition.
- """
+ """Update a definition."""
definition = get(db_session=db_session, definition_id=definition_id)
if not definition:
- raise HTTPException(status_code=404, detail="The definition with this id does not exist.")
- definition = update(db_session=db_session, definition=definition, definition_in=definition_in)
- return definition
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A definition with this id does not exist."}],
+ )
+ return update(db_session=db_session, definition=definition, definition_in=definition_in)
-@router.delete("/{definition_id}")
-def delete_definition(*, db_session: Session = Depends(get_db), definition_id: int):
- """
- Delete a definition.
- """
+@router.delete("/{definition_id}", response_model=None)
+def delete_definition(db_session: DbSession, definition_id: PrimaryKey):
+ """Delete a definition."""
definition = get(db_session=db_session, definition_id=definition_id)
if not definition:
- raise HTTPException(status_code=404, detail="The definition with this id does not exist.")
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A definition with this id does not exist."}],
+ )
delete(db_session=db_session, definition_id=definition_id)
diff --git a/src/dispatch/document/flows.py b/src/dispatch/document/flows.py
new file mode 100644
index 000000000000..41fd5cdc094e
--- /dev/null
+++ b/src/dispatch/document/flows.py
@@ -0,0 +1,323 @@
+import logging
+from typing import Any
+from sqlalchemy.orm import Session
+
+from dispatch.database.core import resolve_attr
+from dispatch.database.core import get_table_name_by_class_instance
+from dispatch.enums import DocumentResourceTypes
+from dispatch.event import service as event_service
+from dispatch.plugin import service as plugin_service
+from dispatch.tag_type import service as tag_type_service
+
+from .models import Document, DocumentCreate
+from .service import create, delete
+from .utils import deslug
+
+
+log = logging.getLogger(__name__)
+
+
+def create_document(
+ subject: Any,
+ document_type: str,
+ document_template: Document,
+ db_session: Session,
+):
+ """Creates a document."""
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=subject.project.id, plugin_type="storage"
+ )
+ if not plugin:
+ log.warning("Document not created. No storage plugin enabled.")
+ return
+
+ # we create the external document
+ document_name = subject.title if subject.project.storage_use_title else subject.name
+ external_document_name = f"{document_name} - {deslug(document_type)}"
+ external_document_description = ""
+ try:
+ if document_template:
+ external_document_description = document_template.description
+
+ # we make a copy of the template in the storage folder
+ external_document = plugin.instance.copy_file(
+ folder_id=subject.storage.resource_id,
+ file_id=document_template.resource_id,
+ name=external_document_name,
+ )
+ # we move the document to the storage folder
+ plugin.instance.move_file(subject.storage.resource_id, file_id=external_document["id"])
+ else:
+ # we create a blank document in the storage folder
+ external_document = plugin.instance.create_file(
+ parent_id=subject.storage.resource_id,
+ name=external_document_name,
+ file_type="document",
+ )
+ except Exception as e:
+ log.exception(e)
+ return
+
+ if not external_document:
+ log.error(
+ f"{external_document_name} not created. Plugin {plugin.plugin.slug} encountered an error." # noqa: E501
+ )
+ return
+
+ external_document.update(
+ {
+ "name": external_document_name,
+ "description": external_document_description,
+ "resource_type": document_type,
+ "resource_id": external_document["id"],
+ }
+ )
+
+ # we create the internal document
+ document_in = DocumentCreate(
+ name=external_document["name"],
+ description=external_document["description"],
+ project={"name": subject.project.name},
+ resource_id=external_document["resource_id"],
+ resource_type=external_document["resource_type"],
+ weblink=external_document["weblink"],
+ )
+
+ document = create(db_session=db_session, document_in=document_in)
+ subject.documents.append(document)
+
+ if document_type == DocumentResourceTypes.case:
+ subject.case_document_id = document.id
+
+ if document_type == DocumentResourceTypes.incident:
+ subject.incident_document_id = document.id
+
+ if document_type == DocumentResourceTypes.review:
+ subject.incident_review_document_id = document.id
+
+ db_session.add(subject)
+ db_session.commit()
+
+ subject_type = get_table_name_by_class_instance(subject)
+ if subject_type == "case":
+ event_service.log_case_event(
+ db_session=db_session,
+ source=plugin.plugin.title,
+ description=f"{deslug(document_type).lower().capitalize()} created",
+ case_id=subject.id,
+ )
+ else:
+ event_service.log_incident_event(
+ db_session=db_session,
+ source=plugin.plugin.title,
+ description=f"{deslug(document_type).lower().capitalize()} created",
+ incident_id=subject.id,
+ )
+
+ return document
+
+
+def update_document(document: Document, project_id: int, db_session: Session):
+ """Updates an existing document."""
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=project_id, plugin_type="document"
+ )
+ if not plugin:
+ log.warning("Document not updated. No document plugin enabled.")
+ return
+
+ document_kwargs = {}
+ if document.resource_type == DocumentResourceTypes.case:
+ document_kwargs = {
+ "case_description": document.case.description,
+ "case_name": document.case.name,
+ "case_owner": document.case.assignee.individual.email,
+ "case_priority": document.case.case_priority.name,
+ "case_resolution": document.case.resolution,
+ "case_severity": document.case.case_severity.name,
+ "case_status": document.case.status,
+ "case_storage_weblink": resolve_attr(document.case, "storage.weblink"),
+ "case_title": document.case.title,
+ "case_type": document.case.case_type.name,
+ }
+
+ if (
+ document.resource_type == DocumentResourceTypes.incident
+ or document.resource_type == DocumentResourceTypes.review
+ ):
+ document_kwargs = {
+ "commander_fullname": document.incident.commander.individual.name,
+ "conference_challenge": resolve_attr(document.incident, "conference.challenge"),
+ "conference_weblink": resolve_attr(document.incident, "conference.weblink"),
+ "conversation_weblink": resolve_attr(document.incident, "conversation.weblink"),
+ "description": document.incident.description,
+ "document_weblink": resolve_attr(document.incident, "incident_document.weblink"),
+ "name": document.incident.name,
+ "priority": document.incident.incident_priority.name,
+ "reported_at": document.incident.reported_at.strftime("%m/%d/%Y %H:%M:%S"),
+ "resolution": document.incident.resolution,
+ "severity": document.incident.incident_severity.name,
+ "status": document.incident.status,
+ "storage_weblink": resolve_attr(document.incident, "storage.weblink"),
+ "ticket_weblink": resolve_attr(document.incident, "ticket.weblink"),
+ "title": document.incident.title,
+ "type": document.incident.incident_type.name,
+ }
+ """
+ Iterate through tags and create new replacement text. Prefix with "tag_", i.e., for tag actor,
+ the template should have {{tag_actor}}. Also, create replacements for the source of each tag
+ type: {{tag_actor.source}}. Thus, if the source for actor was hacking,
+ this would be the replaced text. Only create the source replacements if not null.
+ For any tag types with multiple selected tags, replace with a comma-separated list.
+ """
+ # first ensure all tags types have a placeholder in the document template
+ tag_types = tag_type_service.get_all_by_project(
+ db_session=db_session, project_id=project_id
+ )
+ for tag_type in tag_types:
+ document_kwargs[f"tag_{tag_type.name}"] = "N/A"
+ document_kwargs[f"tag_{tag_type.name}.source"] = "N/A"
+
+ # create document template placeholders for tags
+ for tag in document.incident.tags:
+ if document_kwargs[f"tag_{tag.tag_type.name}"] == "N/A":
+ document_kwargs[f"tag_{tag.tag_type.name}"] = tag.name
+ else:
+ document_kwargs[f"tag_{tag.tag_type.name}"] += f", {tag.name}"
+ if tag.source:
+ if document_kwargs[f"tag_{tag.tag_type.name}.source"] == "N/A":
+ document_kwargs[f"tag_{tag.tag_type.name}.source"] = tag.source
+ else:
+ document_kwargs[f"tag_{tag.tag_type.name}.source"] += f", {tag.source}"
+
+ if document.resource_type == DocumentResourceTypes.review:
+ document_kwargs["stable_at"] = document.incident.stable_at.strftime("%m/%d/%Y %H:%M:%S")
+
+ plugin.instance.update(document.resource_id, **document_kwargs)
+
+ if document.resource_type == DocumentResourceTypes.case:
+ event_service.log_case_event(
+ db_session=db_session,
+ source=plugin.plugin.title,
+ description=f"{deslug(DocumentResourceTypes.case).lower().capitalize()} updated",
+ case_id=document.case.id,
+ )
+
+ if document.resource_type == DocumentResourceTypes.incident:
+ event_service.log_incident_event(
+ db_session=db_session,
+ source=plugin.plugin.title,
+ description=f"{deslug(DocumentResourceTypes.incident).lower().capitalize()} updated",
+ incident_id=document.incident.id,
+ )
+
+
+def delete_document(document: Document, project_id: int, db_session: Session):
+ """Deletes an existing document."""
+ # we delete the external document
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=project_id, plugin_type="storage"
+ )
+ if plugin:
+ # TODO(mvilanova): implement deleting the external document
+ # plugin.instance.delete()
+ pass
+ else:
+ log.warning("Document not deleted. No storage plugin enabled.")
+
+ # we delete the internal document
+ delete(db_session=db_session, document_id=document.id)
+
+
+def open_document_access(document: Document | None, db_session: Session):
+ """Opens access to document by adding domain wide permission, handling both incidents and cases."""
+ subject_type = None
+ project_id = None
+ subject = None
+
+ if not document:
+ log.warning("Document not opened. No document provided.")
+ return
+
+ if document.incident:
+ subject_type = "incident"
+ subject = document.incident
+ project_id = document.incident.project.id
+ elif document.case:
+ subject_type = "case"
+ subject = document.case
+ project_id = document.case.project.id
+
+ if not subject_type:
+ log.warning(f"Document {document.id} is linked to neither an incident nor a case.")
+ return
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=project_id, plugin_type="storage"
+ )
+ if not plugin:
+ log.warning("Access to document not opened. No storage plugin enabled.")
+ return
+
+ try:
+ plugin.instance.open(document.resource_id)
+ except Exception as e:
+ event_service.log_subject_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"Opening {deslug(document.resource_type).lower()} to anyone in the domain failed. Reason: {e}",
+ subject=subject,
+ )
+ log.exception(e)
+ else:
+ event_service.log_subject_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"{deslug(document.resource_type).lower().capitalize()} opened to anyone in the domain",
+ subject=subject,
+ )
+
+
+def mark_document_as_readonly(document: Document, db_session: Session):
+ """Marks document as readonly, handling both incidents and cases."""
+ subject_type = None
+ project_id = None
+ subject = None
+
+ if document.incident:
+ subject_type = "incident"
+ subject = document.incident
+ project_id = document.incident.project.id
+ elif document.case:
+ subject_type = "case"
+ subject = document.case
+ project_id = document.case.project.id
+
+ if not subject_type:
+ log.warning(f"Document {document.id} is linked to neither an incident nor a case.")
+ return
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=project_id, plugin_type="storage"
+ )
+ if not plugin:
+ log.warning("Document not marked as readonly. No storage plugin enabled.")
+ return
+
+ try:
+ plugin.instance.mark_readonly(document.resource_id)
+ except Exception as e:
+ event_service.log_subject_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"Marking {deslug(document.resource_type).lower()} as readonly failed. Reason: {e}",
+ subject=subject,
+ )
+ log.exception(e)
+ else:
+ event_service.log_subject_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"{deslug(document.resource_type).lower().capitalize()} marked as readonly",
+ subject=subject,
+ )
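The tag-replacement logic added in `update_document` above can be exercised in isolation. The sketch below uses hypothetical stand-in tag objects (the real `Tag`/`TagType` are SQLAlchemy models) and shows the "N/A" defaults and comma-joining for multiple tags of one type:

```python
from collections import namedtuple

# Hypothetical stand-ins for the SQLAlchemy Tag / TagType models.
TagType = namedtuple("TagType", "name")
Tag = namedtuple("Tag", "name tag_type source")


def build_tag_kwargs(tag_types, tags):
    """Builds {{tag_<type>}} and {{tag_<type>.source}} replacements,
    defaulting each to "N/A" and comma-joining multiple tags of one type."""
    kwargs = {}
    # ensure every tag type has a placeholder, even if no tag is applied
    for tag_type in tag_types:
        kwargs[f"tag_{tag_type.name}"] = "N/A"
        kwargs[f"tag_{tag_type.name}.source"] = "N/A"
    for tag in tags:
        key = f"tag_{tag.tag_type.name}"
        kwargs[key] = tag.name if kwargs[key] == "N/A" else f"{kwargs[key]}, {tag.name}"
        if tag.source:
            source_key = f"{key}.source"
            kwargs[source_key] = (
                tag.source if kwargs[source_key] == "N/A" else f"{kwargs[source_key]}, {tag.source}"
            )
    return kwargs


actor = TagType("actor")
print(build_tag_kwargs([actor], [Tag("apt1", actor, "osint"), Tag("apt2", actor, None)]))
# {'tag_actor': 'apt1, apt2', 'tag_actor.source': 'osint'}
```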
diff --git a/src/dispatch/document/models.py b/src/dispatch/document/models.py
index a05a8c7a0b61..94b650418648 100644
--- a/src/dispatch/document/models.py
+++ b/src/dispatch/document/models.py
@@ -1,108 +1,128 @@
-from datetime import datetime
-from typing import List, Optional
+"""Models for document resources in the Dispatch application."""
-from pydantic import validator
-from sqlalchemy import Column, ForeignKey, Integer, PrimaryKeyConstraint, String, Table
-from sqlalchemy.orm import backref, relationship
+from datetime import datetime
+from collections import defaultdict
+
+from pydantic import field_validator, ValidationInfo
+from sqlalchemy import (
+ Column,
+ ForeignKey,
+ Integer,
+ PrimaryKeyConstraint,
+ String,
+ Table,
+)
+from sqlalchemy.orm import relationship
from sqlalchemy_utils import TSVectorType
-from dispatch.database import Base
-from dispatch.incident_priority.models import IncidentPriorityCreate, IncidentPriorityRead
-from dispatch.incident_type.models import IncidentTypeCreate, IncidentTypeRead
-from dispatch.messaging import INCIDENT_DOCUMENT_DESCRIPTIONS
-from dispatch.models import DispatchBase, ResourceMixin, TermNested, TermReadNested, TimeStampMixin
+from dispatch.database.core import Base
+from dispatch.messaging.strings import DOCUMENT_DESCRIPTIONS
+from dispatch.models import EvergreenBase, NameStr, PrimaryKey
+from dispatch.models import ResourceBase, ProjectMixin, ResourceMixin, EvergreenMixin, Pagination
+from dispatch.project.models import ProjectRead
+from dispatch.search_filter.models import SearchFilterRead
+from dispatch.tag.models import TagRead
# Association tables for many to many relationships
-assoc_document_incident_priorities = Table(
- "document_incident_priority",
- Base.metadata,
- Column("incident_priority_id", Integer, ForeignKey("incident_priority.id")),
- Column("document_id", Integer, ForeignKey("document.id")),
- PrimaryKeyConstraint("incident_priority_id", "document_id"),
-)
-
-assoc_document_incident_types = Table(
- "document_incident_type",
+assoc_document_filters = Table(
+ "assoc_document_filters",
Base.metadata,
- Column("incident_type_id", Integer, ForeignKey("incident_type.id")),
- Column("document_id", Integer, ForeignKey("document.id")),
- PrimaryKeyConstraint("incident_type_id", "document_id"),
+ Column("document_id", Integer, ForeignKey("document.id", ondelete="CASCADE")),
+ Column("search_filter_id", Integer, ForeignKey("search_filter.id", ondelete="CASCADE")),
+ PrimaryKeyConstraint("document_id", "search_filter_id"),
)
-assoc_document_incidents = Table(
- "document_incident",
+assoc_document_tags = Table(
+ "assoc_document_tags",
Base.metadata,
- Column("incident_id", Integer, ForeignKey("incident.id")),
- Column("document_id", Integer, ForeignKey("document.id")),
- PrimaryKeyConstraint("incident_id", "document_id"),
+ Column("document_id", Integer, ForeignKey("document.id", ondelete="CASCADE")),
+ Column("tag_id", Integer, ForeignKey("tag.id", ondelete="CASCADE")),
+ PrimaryKeyConstraint("document_id", "tag_id"),
)
-assoc_document_terms = Table(
- "document_terms",
- Base.metadata,
- Column("term_id", Integer, ForeignKey("term.id")),
- Column("document_id", Integer, ForeignKey("document.id")),
- PrimaryKeyConstraint("term_id", "document_id"),
-)
+class Document(ProjectMixin, ResourceMixin, EvergreenMixin, Base):
+ """SQLAlchemy model for document resources."""
-class Document(Base, ResourceMixin, TimeStampMixin):
id = Column(Integer, primary_key=True)
name = Column(String)
description = Column(String)
- incident_priorities = relationship(
- "IncidentPriority", secondary=assoc_document_incident_priorities, backref="documents"
+ report_id = Column(Integer, ForeignKey("report.id", ondelete="CASCADE"))
+ incident_id = Column(Integer, ForeignKey("incident.id", ondelete="CASCADE", use_alter=True))
+ case_id = Column(Integer, ForeignKey("case.id", ondelete="CASCADE", use_alter=True))
+
+ filters = relationship("SearchFilter", secondary=assoc_document_filters, backref="documents")
+
+ search_vector = Column(TSVectorType("name", regconfig="pg_catalog.simple"))
+ tags = relationship(
+ "Tag",
+ secondary=assoc_document_tags,
+ lazy="subquery",
+ backref="documents",
)
- incident_types = relationship(
- "IncidentType", secondary=assoc_document_incident_types, backref="documents"
- )
- terms = relationship(
- "Term", secondary=assoc_document_terms, backref=backref("documents", cascade="all")
- )
- search_vector = Column(TSVectorType("name"))
# Pydantic models...
-class DocumentBase(DispatchBase):
- resource_type: Optional[str]
- resource_id: Optional[str]
- description: Optional[str]
- weblink: str
- name: str
- created_at: Optional[datetime] = None
- updated_at: Optional[datetime] = None
+class DocumentBase(ResourceBase, EvergreenBase):
+ """Base Pydantic model for document resources."""
+
+ description: str | None = None
+ name: NameStr
+ created_at: datetime | None = None
+ updated_at: datetime | None = None
class DocumentCreate(DocumentBase):
- terms: Optional[List[TermNested]] = []
- incident_priorities: Optional[List[IncidentPriorityCreate]] = []
- incident_types: Optional[List[IncidentTypeCreate]] = []
+ """Pydantic model for creating a document resource."""
+
+ filters: list[SearchFilterRead] | None = []
+ project: ProjectRead
+ tags: list[TagRead] | None = []
class DocumentUpdate(DocumentBase):
- terms: Optional[List[TermNested]] = []
- incident_priorities: Optional[List[IncidentPriorityCreate]] = []
- incident_types: Optional[List[IncidentTypeCreate]] = []
+ """Pydantic model for updating a document resource."""
+
+ filters: list[SearchFilterRead] | None = None
+ tags: list[TagRead] | None = []
+
+ @field_validator("tags")
+ @classmethod
+ def find_exclusive(cls, v):
+ """Ensures only one exclusive tag per tag type is applied."""
+ if v:
+ exclusive_tags = defaultdict(list)
+ for tag in v:
+ if tag.tag_type.exclusive:
+ exclusive_tags[tag.tag_type.id].append(tag)
+
+ for tag_types in exclusive_tags.values():
+ if len(tag_types) > 1:
+ raise ValueError(
+ f"Found multiple exclusive tags. Please ensure that only one tag of a given type is applied. Tags: {','.join([t.name for t in v])}" # noqa: E501
+ )
+ return v
class DocumentRead(DocumentBase):
- id: int
- incident_priorities: Optional[List[IncidentPriorityRead]] = []
- incident_types: Optional[List[IncidentTypeRead]] = []
- terms: Optional[List[TermReadNested]] = []
-
- @validator("description", pre=True, always=True)
- def set_description(cls, v, values):
- """Sets the description"""
+ """Pydantic model for reading a document resource."""
+
+ id: PrimaryKey
+ filters: list[SearchFilterRead] | None = []
+ project: ProjectRead | None
+ tags: list[TagRead] | None = []
+
+ @field_validator("description", mode="before")
+ @classmethod
+ def set_description(cls, v, info: ValidationInfo):
+ """Sets the description for the document resource."""
if not v:
- return INCIDENT_DOCUMENT_DESCRIPTIONS.get(values["resource_type"], "No Description")
+ resource_type = info.data.get("resource_type")
+ return DOCUMENT_DESCRIPTIONS.get(resource_type, "No Description")
return v
-class DocumentNested(DocumentBase):
- id: int
-
+class DocumentPagination(Pagination):
+ """Pydantic model for paginated document results."""
-class DocumentPagination(DispatchBase):
- total: int
- items: List[DocumentRead] = []
+ items: list[DocumentRead] = []
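The `find_exclusive` validator added to `DocumentUpdate` reduces to a small grouping check. A standalone sketch, assuming minimal hypothetical tag objects in place of the Pydantic models:

```python
from collections import defaultdict
from dataclasses import dataclass


@dataclass(frozen=True)
class TagType:
    id: int
    exclusive: bool


@dataclass(frozen=True)
class Tag:
    name: str
    tag_type: TagType


def check_exclusive(tags):
    """Raises ValueError if more than one tag of an exclusive tag type is applied."""
    grouped = defaultdict(list)
    for tag in tags:
        if tag.tag_type.exclusive:
            grouped[tag.tag_type.id].append(tag)
    for group in grouped.values():
        if len(group) > 1:
            raise ValueError("Found multiple exclusive tags of the same type.")
    return tags


severity = TagType(id=1, exclusive=True)
check_exclusive([Tag("low", severity)])  # fine: one exclusive tag of this type
# check_exclusive([Tag("low", severity), Tag("high", severity)])  # raises ValueError
```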
diff --git a/src/dispatch/document/scheduled.py b/src/dispatch/document/scheduled.py
index 891367a1971b..6fd87fbe143a 100644
--- a/src/dispatch/document/scheduled.py
+++ b/src/dispatch/document/scheduled.py
@@ -2,13 +2,13 @@
from schedule import every
from sqlalchemy import func
-
-from dispatch.config import INCIDENT_PLUGIN_STORAGE_SLUG
-from dispatch.decorators import background_task
-from dispatch.plugins.base import plugins
-from dispatch.route import service as route_service
+from sqlalchemy.orm import Session
+from dispatch.decorators import scheduled_project_task, timer
+from dispatch.nlp import build_phrase_matcher, build_term_vocab, extract_terms_from_text
+from dispatch.plugin import service as plugin_service
+from dispatch.project.models import Project
from dispatch.scheduler import scheduler
-from dispatch.extensions import sentry_sdk
+from dispatch.term import service as term_service
from dispatch.term.models import Term
from .service import get_all
@@ -16,40 +16,46 @@
log = logging.getLogger(__name__)
-@scheduler.add(every(1).day, name="document-term-sync")
-@background_task
-def sync_document_terms(db_session=None):
+@scheduler.add(every(1).day, name="sync-document-terms")
+@timer
+@scheduled_project_task
+def sync_document_terms(db_session: Session, project: Project):
"""Performs term extraction from known documents."""
- documents = get_all(db_session=db_session)
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, plugin_type="storage", project_id=project.id
+ )
- for doc in documents:
- log.debug(f"Processing document. Name: {doc.name}")
- p = plugins.get(
- INCIDENT_PLUGIN_STORAGE_SLUG
- ) # this may need to be refactored if we support multiple document types
+ if not plugin:
+ log.warning(
+ f"Document terms not synced. No storage plugin enabled. Project: {project.name}. Organization: {project.organization.name}"
+ )
+ return
- try:
- if "sheet" in doc.resource_type:
- mime_type = "text/csv"
- else:
- mime_type = "text/plain"
+ terms = term_service.get_all(db_session=db_session, project_id=project.id).all()
+ term_strings = [t.text.lower() for t in terms if t.discoverable]
+ phrases = build_term_vocab(term_strings)
+ matcher = build_phrase_matcher("dispatch-term", phrases)
- doc_text = p.get(doc.resource_id, mime_type)
- extracted_terms = route_service.get_terms(db_session=db_session, text=doc_text)
+ documents = get_all(db_session=db_session)
+ for document in documents:
+ mime_type = "text/plain"
+ if "sheet" in document.resource_type:
+ mime_type = "text/csv"
- matched_terms = (
- db_session.query(Term)
- .filter(func.upper(Term.text).in_([func.upper(t) for t in extracted_terms]))
- .all()
- )
+ try:
+ document_text = plugin.instance.get(document.resource_id, mime_type)
+ except Exception as e:
+ log.warning(e)
+ continue
- log.debug(f"Extracted the following terms from {doc.weblink}. Terms: {extracted_terms}")
+ extracted_terms = list(set(extract_terms_from_text(document_text, matcher)))
- if matched_terms:
- doc.terms = matched_terms
- db_session.commit()
+ matched_terms = (
+ db_session.query(Term)
+ .filter(func.upper(Term.text).in_([func.upper(t) for t in extracted_terms]))
+ .all()
+ )
- except Exception as e:
- # even if one document fails we don't want them to all fail
- sentry_sdk.capture_exception(e)
- log.exception(e)
+ if matched_terms:
+ document.terms = matched_terms
+ db_session.commit()
diff --git a/src/dispatch/document/service.py b/src/dispatch/document/service.py
index b96ad6f4ead9..df95137d65d0 100644
--- a/src/dispatch/document/service.py
+++ b/src/dispatch/document/service.py
@@ -1,70 +1,200 @@
-from typing import List, Optional
+from datetime import datetime
+from pydantic import ValidationError
-from fastapi.encoders import jsonable_encoder
-
-from dispatch.incident_priority import service as incident_priority_service
-from dispatch.incident_type import service as incident_type_service
-from dispatch.term import service as term_service
+from dispatch.enums import DocumentResourceReferenceTypes, DocumentResourceTemplateTypes
+from dispatch.project import service as project_service
+from dispatch.search_filter import service as search_filter_service
+from dispatch.tag import service as tag_service
from .models import Document, DocumentCreate, DocumentUpdate
-def get(*, db_session, document_id: int) -> Optional[Document]:
+def get(*, db_session, document_id: int) -> Document | None:
"""Returns a document based on the given document id."""
return db_session.query(Document).filter(Document.id == document_id).one_or_none()
def get_by_incident_id_and_resource_type(
- *, db_session, incident_id: int, resource_type: str
-) -> Optional[Document]:
+ *, db_session, incident_id: int, project_id: int, resource_type: str
+) -> Document | None:
"""Returns a document based on the given incident and id and document resource type."""
return (
db_session.query(Document)
.filter(Document.incident_id == incident_id)
+ .filter(Document.project_id == project_id)
+ .filter(Document.resource_type == resource_type)
+ .one_or_none()
+ )
+
+
+def get_by_case_id_and_resource_type(
+ *, db_session, case_id: int, project_id: int, resource_type: str
+) -> Document | None:
+ """Returns a document based on the given case and id and document resource type."""
+ return (
+ db_session.query(Document)
+ .filter(Document.case_id == case_id)
+ .filter(Document.project_id == project_id)
+ .filter(Document.resource_type == resource_type)
+ .one_or_none()
+ )
+
+
+def get_project_forms_export_template(*, db_session, project_id: int) -> Document | None:
+ """Fetches the project forms export template."""
+ resource_type = DocumentResourceTemplateTypes.forms
+ return (
+ db_session.query(Document)
+ .filter(Document.project_id == project_id)
.filter(Document.resource_type == resource_type)
.one_or_none()
)
-def get_all(*, db_session) -> List[Optional[Document]]:
+def get_incident_faq_document(*, db_session, project_id: int):
+ """Fetches incident faq document."""
+ return (
+ db_session.query(Document).filter(
+ Document.resource_type == DocumentResourceReferenceTypes.faq,
+ Document.project_id == project_id,
+ )
+ ).one_or_none()
+
+
+def get_conversation_reference_document(*, db_session, project_id: int):
+ """Fetches conversation reference document."""
+ return (
+ db_session.query(Document).filter(
+ Document.resource_type == DocumentResourceReferenceTypes.conversation,
+ Document.project_id == project_id,
+ )
+ ).one_or_none()
+
+
+def get_overdue_evergreen_documents(*, db_session, project_id: int) -> list[Document | None]:
+ """Returns all documents that have not had a recent evergreen notification."""
+ query = (
+ db_session.query(Document)
+ .filter(Document.project_id == project_id)
+ .filter(Document.evergreen == True) # noqa
+ .filter(Document.overdue == True) # noqa
+ )
+ return query.all()
+
+
+def get_all(*, db_session) -> list[Document | None]:
"""Returns all documents."""
return db_session.query(Document)
def create(*, db_session, document_in: DocumentCreate) -> Document:
"""Creates a new document."""
- terms = [
- term_service.get_or_create(db_session=db_session, term_in=t) for t in document_in.terms
- ]
- incident_priorities = [
- incident_priority_service.get_by_name(db_session=db_session, name=n.name)
- for n in document_in.incident_priorities
- ]
- incident_types = [
- incident_type_service.get_by_name(db_session=db_session, name=n.name)
- for n in document_in.incident_types
+ # handle the special case of only allowing 1 FAQ / Forms Export document per-project
+ project = project_service.get_by_name_or_raise(
+ db_session=db_session, project_in=document_in.project
+ )
+
+ if document_in.resource_type == DocumentResourceReferenceTypes.faq:
+ faq_doc = (
+ db_session.query(Document)
+ .filter(Document.resource_type == DocumentResourceReferenceTypes.faq)
+ .filter(Document.project_id == project.id)
+ .one_or_none()
+ )
+ if faq_doc:
+ raise ValidationError(
+ [
+ {
+ "msg": "FAQ document already defined for this project.",
+ "loc": "document",
+ }
+ ]
+ )
+
+ if document_in.resource_type == DocumentResourceTemplateTypes.forms:
+ forms_doc = (
+ db_session.query(Document)
+ .filter(Document.resource_type == DocumentResourceTemplateTypes.forms)
+ .filter(Document.project_id == project.id)
+ .one_or_none()
+ )
+ if forms_doc:
+ raise ValidationError(
+ [
+ {
+ "msg": "Forms export template document already defined for this project.",
+ "loc": "document",
+ }
+ ]
+ )
+
+ filters = [
+ search_filter_service.get(db_session=db_session, search_filter_id=f.id)
+ for f in document_in.filters
]
+
+ tags = []
+ for t in document_in.tags:
+ tags.append(tag_service.get_or_create(db_session=db_session, tag_in=t))
+
+ # set the last reminder to now
+ if document_in.evergreen:
+ document_in.evergreen_last_reminder_at = datetime.utcnow()
+
document = Document(
- **document_in.dict(exclude={"terms", "incident_priorities", "incident_types"}),
- incident_priorities=incident_priorities,
- incident_types=incident_types,
- terms=terms,
+ **document_in.dict(exclude={"project", "filters", "tags"}),
+ filters=filters,
+ project=project,
+ tags=tags,
)
+
db_session.add(document)
db_session.commit()
return document
+def get_or_create(*, db_session, document_in) -> Document:
+ """Gets a document by it's resource_id or creates a new document."""
+ if hasattr(document_in, "resource_id"):
+ q = db_session.query(Document).filter(Document.resource_id == document_in.resource_id)
+ else:
+ q = db_session.query(Document).filter_by(**document_in.dict())
+
+ instance = q.first()
+ if instance:
+ return instance
+
+ return create(db_session=db_session, document_in=document_in)
+
+
def update(*, db_session, document: Document, document_in: DocumentUpdate) -> Document:
"""Updates a document."""
- document_data = jsonable_encoder(document)
- update_data = document_in.dict(skip_defaults=True)
+ document_data = document.dict()
+
+ # we reset the last evergreen reminder to now
+ if document_in.evergreen:
+ if not document.evergreen:
+ document_in.evergreen_last_reminder_at = datetime.utcnow()
+
+ update_data = document_in.dict(exclude_unset=True, exclude={"filters", "tags"})
+
+ tags = []
+ for t in document_in.tags:
+ tags.append(tag_service.get_or_create(db_session=db_session, tag_in=t))
for field in document_data:
if field in update_data:
setattr(document, field, update_data[field])
- db_session.add(document)
+ if document_in.filters is not None:
+ filters = [
+ search_filter_service.get(db_session=db_session, search_filter_id=f.id)
+ for f in document_in.filters
+ ]
+ document.filters = filters
+
+ document.tags = tags
+
db_session.commit()
return document
diff --git a/src/dispatch/document/utils.py b/src/dispatch/document/utils.py
new file mode 100644
index 000000000000..088421504238
--- /dev/null
+++ b/src/dispatch/document/utils.py
@@ -0,0 +1,6 @@
+from dispatch.enums import DocumentResourceTypes
+
+
+def deslug(document_type: DocumentResourceTypes) -> str:
+ """Deslugs a document resource type."""
+ return " ".join([w.capitalize() for w in document_type.split("-")[1:]])
diff --git a/src/dispatch/document/views.py b/src/dispatch/document/views.py
index 99d43d6d9ecb..6c45b9159242 100644
--- a/src/dispatch/document/views.py
+++ b/src/dispatch/document/views.py
@@ -1,86 +1,59 @@
-from typing import List
+from fastapi import APIRouter, HTTPException, status
-from fastapi import APIRouter, Depends, HTTPException, Query
-from sqlalchemy.orm import Session
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.models import PrimaryKey
-from dispatch.database import get_db, search_filter_sort_paginate
-from dispatch.search.service import search
-
-from .models import Document, DocumentCreate, DocumentPagination, DocumentRead, DocumentUpdate
-from .service import create, delete, get, get_all, update
+from .models import DocumentCreate, DocumentPagination, DocumentRead, DocumentUpdate
+from .service import create, delete, get, update
router = APIRouter()
-@router.get("/", response_model=DocumentPagination)
-def get_documents(
- db_session: Session = Depends(get_db),
- page: int = 1,
- items_per_page: int = Query(5, alias="itemsPerPage"),
- query_str: str = Query(None, alias="q"),
- sort_by: List[str] = Query(None, alias="sortBy[]"),
- descending: List[bool] = Query(None, alias="descending[]"),
- fields: List[str] = Query(None, alias="field[]"),
- ops: List[str] = Query(None, alias="op[]"),
- values: List[str] = Query(None, alias="value[]"),
-):
- """
- Get all documents.
- """
- return search_filter_sort_paginate(
- db_session=db_session,
- model="Document",
- query_str=query_str,
- page=page,
- items_per_page=items_per_page,
- sort_by=sort_by,
- descending=descending,
- fields=fields,
- values=values,
- ops=ops,
- )
+@router.get("", response_model=DocumentPagination)
+def get_documents(common: CommonParameters):
+ """Get all documents."""
+ return search_filter_sort_paginate(model="Document", **common)
@router.get("/{document_id}", response_model=DocumentRead)
-def get_document(*, db_session: Session = Depends(get_db), document_id: int):
- """
- Update a document.
- """
+def get_document(db_session: DbSession, document_id: PrimaryKey):
+ """Get a document."""
document = get(db_session=db_session, document_id=document_id)
if not document:
- raise HTTPException(status_code=404, detail="The document with this id does not exist.")
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A document with this id does not exist."}],
+ )
return document
-@router.post("/", response_model=DocumentCreate)
-def create_document(*, db_session: Session = Depends(get_db), document_in: DocumentCreate):
- """
- Create a new document.
- """
- document = create(db_session=db_session, document_in=document_in)
- return document
+@router.post("", response_model=DocumentRead)
+def create_document(db_session: DbSession, document_in: DocumentCreate):
+ """Create a new document."""
+ return create(db_session=db_session, document_in=document_in)
-@router.put("/{document_id}", response_model=DocumentCreate)
-def update_document(
- *, db_session: Session = Depends(get_db), document_id: int, document_in: DocumentUpdate
-):
- """
- Update a document.
- """
+@router.put("/{document_id}", response_model=DocumentRead)
+def update_document(db_session: DbSession, document_id: PrimaryKey, document_in: DocumentUpdate):
+ """Update a document."""
document = get(db_session=db_session, document_id=document_id)
if not document:
- raise HTTPException(status_code=404, detail="The document with this id does not exist.")
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A document with this id does not exist."}],
+ )
document = update(db_session=db_session, document=document, document_in=document_in)
return document
-@router.delete("/{document_id}")
-def delete_document(*, db_session: Session = Depends(get_db), document_id: int):
- """
- Delete a document.
- """
+@router.delete("/{document_id}", response_model=None)
+def delete_document(db_session: DbSession, document_id: PrimaryKey):
+ """Delete a document."""
document = get(db_session=db_session, document_id=document_id)
if not document:
- raise HTTPException(status_code=404, detail="The document with this id does not exist.")
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A document with this id does not exist."}],
+ )
delete(db_session=db_session, document_id=document_id)
diff --git a/src/dispatch/email_templates/enums.py b/src/dispatch/email_templates/enums.py
new file mode 100644
index 000000000000..888cd08fe38f
--- /dev/null
+++ b/src/dispatch/email_templates/enums.py
@@ -0,0 +1,6 @@
+from dispatch.enums import DispatchEnum
+
+
+class EmailTemplateTypes(DispatchEnum):
+ case_welcome = "Case Welcome Email"
+ incident_welcome = "Incident Welcome Email"
diff --git a/src/dispatch/email_templates/models.py b/src/dispatch/email_templates/models.py
new file mode 100644
index 000000000000..acb894091d99
--- /dev/null
+++ b/src/dispatch/email_templates/models.py
@@ -0,0 +1,46 @@
+from datetime import datetime
+from sqlalchemy import Column, Integer, String, Boolean, UniqueConstraint
+
+from dispatch.database.core import Base
+from dispatch.models import DispatchBase, TimeStampMixin, PrimaryKey, Pagination, ProjectMixin
+from dispatch.project.models import ProjectRead
+
+
+class EmailTemplates(TimeStampMixin, ProjectMixin, Base):
+ __table_args__ = (UniqueConstraint("email_template_type", "project_id"),)
+ # Columns
+ id = Column(Integer, primary_key=True)
+ email_template_type = Column(String, nullable=True)
+ welcome_text = Column(String, nullable=True)
+ welcome_body = Column(String, nullable=True)
+ components = Column(String, nullable=True)
+ enabled = Column(Boolean, default=True)
+
+
+# Pydantic models
+class EmailTemplatesBase(DispatchBase):
+ email_template_type: str | None = None
+ welcome_text: str | None = None
+ welcome_body: str | None = None
+ components: str | None = None
+ enabled: bool | None = None
+
+
+class EmailTemplatesCreate(EmailTemplatesBase):
+ project: ProjectRead | None = None
+
+
+class EmailTemplatesUpdate(EmailTemplatesBase):
+ id: PrimaryKey = None
+
+
+class EmailTemplatesRead(EmailTemplatesBase):
+ id: PrimaryKey
+ project: ProjectRead | None = None
+ created_at: datetime | None = None
+ updated_at: datetime | None = None
+
+
+class EmailTemplatesPagination(Pagination):
+ items: list[EmailTemplatesRead]
diff --git a/src/dispatch/email_templates/service.py b/src/dispatch/email_templates/service.py
new file mode 100644
index 000000000000..be97cd2d499b
--- /dev/null
+++ b/src/dispatch/email_templates/service.py
@@ -0,0 +1,77 @@
+import logging
+
+from sqlalchemy.orm import Session
+
+from .models import EmailTemplates, EmailTemplatesUpdate, EmailTemplatesCreate
+from dispatch.project import service as project_service
+
+log = logging.getLogger(__name__)
+
+
+def get(*, email_template_id: int, db_session: Session) -> EmailTemplates | None:
+ """Gets an email template by its id."""
+ return (
+ db_session.query(EmailTemplates)
+ .filter(EmailTemplates.id == email_template_id)
+ .one_or_none()
+ )
+
+
+def get_by_type(
+ *, email_template_type: str, project_id: int, db_session: Session
+) -> EmailTemplates | None:
+ """Gets an email template by its type."""
+ return (
+ db_session.query(EmailTemplates)
+ .filter(EmailTemplates.project_id == project_id)
+ .filter(EmailTemplates.email_template_type == email_template_type)
+ .filter(EmailTemplates.enabled == True) # noqa
+ .first()
+ )
+
+
+def get_all(*, db_session: Session):
+ """Gets all email templates as a query object."""
+ return db_session.query(EmailTemplates)
+
+
+def create(*, email_template_in: EmailTemplatesCreate, db_session: Session) -> EmailTemplates:
+ """Creates email template data."""
+ project = project_service.get_by_name_or_raise(
+ db_session=db_session, project_in=email_template_in.project
+ )
+
+ email_template = EmailTemplates(**email_template_in.dict(exclude={"project"}), project=project)
+
+ db_session.add(email_template)
+ db_session.commit()
+ return email_template
+
+
+def update(
+ *,
+ email_template: EmailTemplates,
+ email_template_in: EmailTemplatesUpdate,
+ db_session: Session,
+) -> EmailTemplates:
+ """Updates an email template."""
+ new_template = email_template.dict()
+ update_data = email_template_in.dict(exclude_unset=True)
+
+ for field in new_template:
+ if field in update_data:
+ setattr(email_template, field, update_data[field])
+
+ db_session.commit()
+ return email_template
+
+
+def delete(*, db_session: Session, email_template_id: int):
+ """Deletes an email template."""
+ email_template = (
+ db_session.query(EmailTemplates)
+ .filter(EmailTemplates.id == email_template_id)
+ .one_or_none()
+ )
+ db_session.delete(email_template)
+ db_session.commit()
diff --git a/src/dispatch/email_templates/views.py b/src/dispatch/email_templates/views.py
new file mode 100644
index 000000000000..cc4e637c4e1d
--- /dev/null
+++ b/src/dispatch/email_templates/views.py
@@ -0,0 +1,118 @@
+import logging
+from fastapi import APIRouter, HTTPException, status, Depends
+from pydantic import ValidationError
+
+from sqlalchemy.exc import IntegrityError
+
+from dispatch.auth.permissions import (
+ SensitiveProjectActionPermission,
+ PermissionsDependency,
+)
+from dispatch.database.core import DbSession
+from dispatch.auth.service import CurrentUser
+from dispatch.database.service import search_filter_sort_paginate, CommonParameters
+from dispatch.models import PrimaryKey
+
+from .models import (
+ EmailTemplatesRead,
+ EmailTemplatesUpdate,
+ EmailTemplatesPagination,
+ EmailTemplatesCreate,
+)
+from .service import get, create, update, delete
+
+log = logging.getLogger(__name__)
+router = APIRouter()
+
+
+@router.get("", response_model=EmailTemplatesPagination)
+def get_email_templates(commons: CommonParameters):
+ """Get all email templates, or only those matching a given search term."""
+ return search_filter_sort_paginate(model="EmailTemplates", **commons)
+
+
+@router.get("/{email_template_id}", response_model=EmailTemplatesRead)
+def get_email_template(db_session: DbSession, email_template_id: PrimaryKey):
+ """Get an email template by its id."""
+ email_template = get(db_session=db_session, email_template_id=email_template_id)
+ if not email_template:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "An email template with this id does not exist."}],
+ )
+ return email_template
+
+
+@router.post(
+ "",
+ response_model=EmailTemplatesRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def create_email_template(
+ db_session: DbSession,
+ email_template_in: EmailTemplatesCreate,
+ current_user: CurrentUser,
+):
+ """Create a new email template."""
+ try:
+ return create(db_session=db_session, email_template_in=email_template_in)
+ except IntegrityError:
+ raise ValidationError(
+ [
+ {
+ "msg": "An email template with this name already exists.",
+ "loc": "name",
+ }
+ ],
+ ) from None
+
+
+@router.put(
+ "/{email_template_id}",
+ response_model=EmailTemplatesRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def update_email_template(
+ db_session: DbSession,
+ email_template_id: PrimaryKey,
+ email_template_in: EmailTemplatesUpdate,
+):
+ """Update a search filter."""
+ email_template = get(db_session=db_session, email_template_id=email_template_id)
+ if not email_template:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "An email template with this id does not exist."}],
+ )
+ try:
+ email_template = update(
+ db_session=db_session,
+ email_template=email_template,
+ email_template_in=email_template_in,
+ )
+ except IntegrityError:
+ raise ValidationError(
+ [
+ {
+ "msg": "An email template with this name already exists.",
+ "loc": "name",
+ }
+ ],
+ ) from None
+ return email_template
+
+
+@router.delete(
+ "/{email_template_id}",
+ response_model=None,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def delete_email_template(db_session: DbSession, email_template_id: PrimaryKey):
+ """Delete an email template, returning only an HTTP 200 OK if successful."""
+ email_template = get(db_session=db_session, email_template_id=email_template_id)
+ if not email_template:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "An email template with this id does not exist."}],
+ )
+ delete(db_session=db_session, email_template_id=email_template_id)
diff --git a/src/dispatch/entity/__init__.py b/src/dispatch/entity/__init__.py
new file mode 100644
index 000000000000..e69de29bb2d1
diff --git a/src/dispatch/entity/models.py b/src/dispatch/entity/models.py
new file mode 100644
index 000000000000..8e8b7a6739a4
--- /dev/null
+++ b/src/dispatch/entity/models.py
@@ -0,0 +1,86 @@
+from sqlalchemy import Column, Integer, String, ForeignKey
+from sqlalchemy.orm import relationship
+from sqlalchemy.sql.schema import UniqueConstraint
+from sqlalchemy_utils import TSVectorType
+
+from dispatch.database.core import Base
+from dispatch.models import (
+ DispatchBase,
+ TimeStampMixin,
+ ProjectMixin,
+ PrimaryKey,
+ Pagination,
+)
+from dispatch.project.models import ProjectRead
+from dispatch.entity_type.models import (
+ EntityTypeCreate,
+ EntityTypeRead,
+ EntityTypeReadMinimal,
+ EntityTypeUpdate,
+)
+
+
+class Entity(Base, TimeStampMixin, ProjectMixin):
+ __table_args__ = (UniqueConstraint("name", "project_id"),)
+
+ # Columns
+ id = Column(Integer, primary_key=True, autoincrement=True)
+ name = Column(String)
+ description = Column(String)
+ value = Column(String)
+ source = Column(String)
+
+ # Relationships
+ entity_type_id = Column(Integer, ForeignKey("entity_type.id"), nullable=False)
+ entity_type = relationship("EntityType", backref="entity")
+
+ # the "simple" text search config is used here to aid matching of "named entities"
+ search_vector = Column(
+ TSVectorType(
+ "name",
+ "description",
+ weights={"name": "A", "description": "B"},
+ regconfig="pg_catalog.simple",
+ )
+ )
+
+
+# Pydantic models
+class EntityBase(DispatchBase):
+ name: str | None = None
+ source: str | None = None
+ value: str | None = None
+ description: str | None = None
+
+
+class EntityCreate(EntityBase):
+ def __hash__(self):
+ return hash((self.id, self.value))
+
+ id: PrimaryKey | None = None
+ entity_type: EntityTypeCreate
+ project: ProjectRead
+
+
+class EntityUpdate(EntityBase):
+ id: PrimaryKey | None = None
+ entity_type: EntityTypeUpdate | None = None
+
+
+class EntityRead(EntityBase):
+ id: PrimaryKey
+ entity_type: EntityTypeRead | None = None
+ project: ProjectRead
+
+
+class EntityReadMinimal(DispatchBase):
+ id: PrimaryKey
+ name: str | None = None
+ source: str | None = None
+ value: str | None = None
+ description: str | None = None
+ entity_type: EntityTypeReadMinimal | None = None
+
+
+class EntityPagination(Pagination):
+ items: list[EntityRead]
diff --git a/src/dispatch/entity/service.py b/src/dispatch/entity/service.py
new file mode 100644
index 000000000000..885a25a0a11b
--- /dev/null
+++ b/src/dispatch/entity/service.py
@@ -0,0 +1,334 @@
+from datetime import datetime, timedelta
+import logging
+import re
+from collections.abc import Generator, Sequence
+from typing import NamedTuple
+import jsonpath_ng
+from pydantic import ValidationError
+from sqlalchemy import desc
+from sqlalchemy.orm import Session, joinedload
+
+from dispatch.project import service as project_service
+from dispatch.case.models import Case
+from dispatch.entity.models import Entity, EntityCreate, EntityUpdate, EntityRead
+from dispatch.entity_type import service as entity_type_service
+from dispatch.entity_type.models import EntityType
+from dispatch.signal.models import Signal, SignalInstance
+
+
+log = logging.getLogger(__name__)
+
+
+def get(*, db_session: Session, entity_id: int) -> Entity | None:
+ """Gets a entity by its id."""
+ return db_session.query(Entity).filter(Entity.id == entity_id).one_or_none()
+
+
+def get_by_name(*, db_session, project_id: int, name: str) -> Entity | None:
+ """Gets a entity by its project and name."""
+ return (
+ db_session.query(Entity)
+ .filter(Entity.name == name)
+ .filter(Entity.project_id == project_id)
+ .one_or_none()
+ )
+
+
+def get_by_name_or_raise(
+ *, db_session: Session, project_id: int, entity_in: EntityRead
+) -> EntityRead:
+ """Returns the entity specified or raises ValidationError."""
+ entity = get_by_name(db_session=db_session, project_id=project_id, name=entity_in.name)
+
+ if not entity:
+ raise ValidationError.from_exception_data(
+ "EntityRead",
+ [
+ {
+ "type": "value_error",
+ "loc": ("entity",),
+ "input": entity_in.name,
+ "ctx": {"error_message": "Entity not found."},
+ }
+ ],
+ )
+
+ return entity
+
+
+def get_by_value(*, db_session: Session, project_id: int, value: str) -> Entity | None:
+ """Gets a entity by its value."""
+ return (
+ db_session.query(Entity)
+ .filter(Entity.value == value)
+ .filter(Entity.project_id == project_id)
+ .one_or_none()
+ )
+
+
+def get_all(*, db_session: Session, project_id: int):
+ """Gets all entities by their project."""
+ return db_session.query(Entity).filter(Entity.project_id == project_id)
+
+
+def get_all_by_signal(*, db_session: Session, signal_id: int) -> list[Entity]:
+ """Gets all entities for a specific signal."""
+ return (
+ db_session.query(Entity)
+ .join(Entity.signal_instances)
+ .join(SignalInstance.signal)
+ .filter(Signal.id == signal_id)
+ .all()
+ )
+
+
+def get_all_desc_by_signal(
+ *, db_session: Session, signal_id: int, case_id: int | None = None
+) -> list[Entity]:
+ """Gets all entities for a specific signal in descending order.
+
+ Args:
+ db_session: The database session.
+ signal_id: The ID of the signal to filter by.
+ case_id: Optional case ID to further filter the entities.
+
+ Returns:
+ A list of entities ordered by creation date in descending order.
+ """
+ query = (
+ db_session.query(Entity)
+ .join(Entity.signal_instances)
+ .join(SignalInstance.signal)
+ .filter(Signal.id == signal_id)
+ )
+
+ if case_id is not None:
+ # Add case filter if case_id is provided
+ query = query.filter(SignalInstance.case_id == case_id)
+
+ return query.order_by(desc(Entity.created_at)).all()
+
+
+def create(*, db_session: Session, entity_in: EntityCreate) -> Entity:
+ """Creates a new entity."""
+ project = project_service.get_by_name_or_raise(
+ db_session=db_session, project_in=entity_in.project
+ )
+ entity_type = entity_type_service.get_or_create(
+ db_session=db_session, entity_type_in=entity_in.entity_type
+ )
+ entity = Entity(
+ **entity_in.dict(exclude={"entity_type", "project"}),
+ project=project,
+ entity_type=entity_type,
+ )
+ db_session.add(entity)
+ db_session.commit()
+ return entity
+
+
+def get_by_value_or_create(*, db_session: Session, entity_in: EntityCreate) -> Entity:
+ """Gets or creates a new entity."""
+ # prefer the entity id if available
+ if entity_in.id:
+ q = db_session.query(Entity).filter(Entity.id == entity_in.id)
+ else:
+ q = db_session.query(Entity).filter_by(
+ value=entity_in.value, entity_type_id=entity_in.entity_type.id
+ )
+
+ instance = q.first()
+ if instance:
+ return instance
+
+ return create(db_session=db_session, entity_in=entity_in)
+
+
+def update(*, db_session: Session, entity: Entity, entity_in: EntityUpdate) -> Entity:
+ """Updates an existing entity."""
+ entity_data = entity.dict()
+ update_data = entity_in.dict(exclude_unset=True, exclude={"entity_type"})
+
+ for field in entity_data:
+ if field in update_data:
+ setattr(entity, field, update_data[field])
+
+ if entity_in.entity_type is not None:
+ entity_type = entity_type_service.get_by_name_or_raise(
+ db_session=db_session,
+ project_id=entity.project.id,
+ entity_type_in=entity_in.entity_type,
+ )
+ entity.entity_type = entity_type
+
+ db_session.commit()
+ return entity
+
+
+def delete(*, db_session: Session, entity_id: int):
+ """Deletes an existing entity."""
+ entity = db_session.query(Entity).filter(Entity.id == entity_id).one()
+ db_session.delete(entity)
+ db_session.commit()
+
+
+def get_cases_with_entity(db_session: Session, entity_id: int, days_back: int) -> list[Case]:
+ """Searches for cases with the same entity within a given timeframe."""
+ # Calculate the datetime for the start of the search window
+ start_date = datetime.utcnow() - timedelta(days=days_back)
+
+ cases = (
+ db_session.query(Case)
+ .join(Case.signal_instances)
+ .join(SignalInstance.entities)
+ .filter(Entity.id == entity_id, SignalInstance.created_at >= start_date)
+ .all()
+ )
+ return cases
+
+
+def get_case_count_with_entity(db_session: Session, entity_id: int, days_back: int) -> int:
+ """Calculate the count of cases with a given Entity by it's ID."""
+ # Calculate the datetime for the start of the search window
+ start_date = datetime.utcnow() - timedelta(days=days_back)
+
+ count = (
+ db_session.query(Case)
+ .join(Case.signal_instances)
+ .join(SignalInstance.entities)
+ .filter(Entity.id == entity_id, SignalInstance.created_at >= start_date)
+ .count()
+ )
+ return count
+
+
+def get_signal_instances_with_entity(
+ db_session: Session, entity_id: int, days_back: int
+) -> list[SignalInstance]:
+ """Searches for signal instances with the same entity within a given timeframe."""
+ # Calculate the datetime for the start of the search window
+ start_date = datetime.utcnow() - timedelta(days=days_back)
+
+ # Query for signal instances containing the entity within the search window
+ signal_instances = (
+ db_session.query(SignalInstance)
+ .options(joinedload(SignalInstance.signal))
+ .join(SignalInstance.entities)
+ .filter(SignalInstance.created_at >= start_date, Entity.id == entity_id)
+ .all()
+ )
+
+ return signal_instances
+
+
+def get_signal_instances_with_entities(
+ db_session: Session, signal_id: int, entity_ids: list[int], days_back: int
+) -> list[SignalInstance]:
+ """Searches a signal instance with the same entities within a given timeframe."""
+ # Calculate the datetime for the start of the search window
+ start_date = datetime.utcnow() - timedelta(days=days_back)
+
+ # Query for signal instances containing the entity within the search window
+ signal_instances = (
+ db_session.query(SignalInstance)
+ .options(joinedload(SignalInstance.signal))
+ .join(SignalInstance.entities)
+ .filter(SignalInstance.created_at >= start_date)
+ .filter(SignalInstance.signal_id == signal_id)
+ .filter(Entity.id.in_(entity_ids))
+ .all()
+ )
+
+ return signal_instances
+
+
+EntityTypePair = NamedTuple(
+ "EntityTypePairTuple",
+ [
+ ("entity_type", EntityType),
+ ("regex", re.Pattern[str] | None),
+ ("json_path", jsonpath_ng.JSONPath | None),
+ ],
+)
+
+
+def find_entities(
+ db_session: Session, signal_instance: SignalInstance, entity_types: Sequence[EntityType]
+) -> list[Entity]:
+ """
+ Find entities in a SignalInstance based on a list of EntityTypes.
+
+ Args:
+ db_session (Session): The database session to use for entity creation.
+ signal_instance (SignalInstance): The SignalInstance to extract entities from.
+ entity_types (Sequence[EntityType]): A list of EntityTypes to search for in the SignalInstance.
+
+ Returns:
+ list[Entity]: A list of entities found in the SignalInstance.
+ """
+
+ def _find_entities_by_jsonpath_expression(
+ signal_instance: SignalInstance,
+ entity_type_pairs: list[EntityTypePair],
+ ) -> Generator[EntityCreate, None, None]:
+ """
+ Yield entities found in a SignalInstance by searching its fields using regular expressions and JSONPath expressions.
+
+ Args:
+ signal_instance: The SignalInstance to extract entities from.
+ entity_type_pairs: A list of (entity_type, entity_regex, jpath) tuples to search for.
+
+ Yields:
+ EntityCreate: An entity found in the SignalInstance.
+ """
+ for entity_type, _, jpath in entity_type_pairs:
+ if jpath:
+ try:
+ matches = jpath.find(signal_instance.raw)
+ for match in matches:
+ if isinstance(match.value, str):
+ yield EntityCreate(
+ id=None,
+ value=match.value,
+ entity_type=entity_type,
+ project=signal_instance.project,
+ )
+
+ except KeyError:
+ log.warning(
+ f"Unable to extract entity {str(jpath)} is not a valid JSONPath for Instance {signal_instance.id}."
+ f"A KeyError usually occurs when the JSONPath includes a list index lookup against a dictionary value."
+ f" Example: dictionary[0].value"
+ )
+ continue
+ except Exception as e: # Add this to handle general exceptions
+ log.exception(
+ f"An error occurred while extracting entity {str(jpath)} from Instance {signal_instance.id}: {str(e)}"
+ )
+ continue
+
+ # Create a list of (entity type, regular expression, jpath) tuples
+ entity_type_pairs = [
+ (
+ type,
+ re.compile(type.regular_expression) if type.regular_expression else None,
+ jsonpath_ng.parse(type.jpath) if type.jpath else None,
+ )
+ for type in entity_types
+ if isinstance(type.regular_expression, str) or type.jpath is not None
+ ]
+
+ entities = []
+
+ entities.extend(_find_entities_by_jsonpath_expression(signal_instance, entity_type_pairs))
+
+ # Filter out duplicate entities
+ entities = list(set(entities))
+
+ entities_out = [
+ get_by_value_or_create(db_session=db_session, entity_in=entity_in) for entity_in in entities
+ ]
+
+ return entities_out
diff --git a/src/dispatch/entity/views.py b/src/dispatch/entity/views.py
new file mode 100644
index 000000000000..13b6147d20eb
--- /dev/null
+++ b/src/dispatch/entity/views.py
@@ -0,0 +1,86 @@
+from fastapi import APIRouter, HTTPException, status
+
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.entity.service import get_cases_with_entity, get_signal_instances_with_entity
+from dispatch.models import PrimaryKey
+
+from .models import (
+ EntityCreate,
+ EntityPagination,
+ EntityRead,
+ EntityUpdate,
+)
+from .service import create, delete, get, update
+
+router = APIRouter()
+
+
+@router.get("", response_model=EntityPagination)
+def get_entities(common: CommonParameters):
+ """Get all entities, or only those matching a given search term."""
+ return search_filter_sort_paginate(model="Entity", **common)
+
+
+@router.get("/{entity_id}", response_model=EntityRead)
+def get_entity(db_session: DbSession, entity_id: PrimaryKey):
+ """Given its unique id, retrieve details about a single entity."""
+ entity = get(db_session=db_session, entity_id=entity_id)
+ if not entity:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "The requested entity does not exist."}],
+ )
+ return entity
+
+
+@router.post("", response_model=EntityRead)
+def create_entity(db_session: DbSession, entity_in: EntityCreate):
+ """Creates a new entity."""
+ return create(db_session=db_session, entity_in=entity_in)
+
+
+@router.put("/{entity_id}", response_model=EntityRead)
+def update_entity(db_session: DbSession, entity_id: PrimaryKey, entity_in: EntityUpdate):
+ """Updates an existing entity."""
+ entity = get(db_session=db_session, entity_id=entity_id)
+ if not entity:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A entity with this id does not exist."}],
+ )
+ return update(db_session=db_session, entity=entity, entity_in=entity_in)
+
+
+@router.delete("/{entity_id}", response_model=None)
+def delete_entity(db_session: DbSession, entity_id: PrimaryKey):
+ """Deletes a entity, returning only an HTTP 200 OK if successful."""
+ entity = get(db_session=db_session, entity_id=entity_id)
+ if not entity:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A entity with this id does not exist."}],
+ )
+ delete(db_session=db_session, entity_id=entity_id)
+
+
+@router.get("/{entity_id}/cases/{days_back}", response_model=None)
+def get_cases_by_entity(
+ db_session: DbSession,
+ entity_id: PrimaryKey,
+ days_back: int = 7,
+):
+ """Retrieves all cases associated with a given entity within the last days_back days."""
+ cases = get_cases_with_entity(db_session=db_session, entity_id=entity_id, days_back=days_back)
+ return {"cases": cases}
+
+
+@router.get("/{entity_id}/signal_instances/{days_back}", response_model=None)
+def get_signal_instances_by_entity(
+ db_session: DbSession,
+ entity_id: PrimaryKey,
+ days_back: int = 7,
+):
+ """Retrieves all signal instances associated with a given entity within the last days_back days."""
+ instances = get_signal_instances_with_entity(
+ db_session=db_session, entity_id=entity_id, days_back=days_back
+ )
+ return {"instances": instances}
diff --git a/src/dispatch/entity_type/__init__.py b/src/dispatch/entity_type/__init__.py
new file mode 100644
index 000000000000..e69de29bb2d1
diff --git a/src/dispatch/entity_type/flows.py b/src/dispatch/entity_type/flows.py
new file mode 100644
index 000000000000..c008d72f2785
--- /dev/null
+++ b/src/dispatch/entity_type/flows.py
@@ -0,0 +1,37 @@
+from sqlalchemy.orm import Session
+from dispatch.entity_type.models import EntityScopeEnum
+from dispatch.signal.models import SignalInstance
+from dispatch.entity_type import service as entity_type_service
+
+from dispatch.entity_type.models import EntityType
+from dispatch.entity import service as entity_service
+
+
+def recalculate_entity_flow(
+ db_session: Session,
+ entity_type: EntityType,
+ signal_instance: SignalInstance,
+):
+ """
+ Recalculate entity findings for historical signals based on a newly associated EntityType.
+
+ Args:
+ db_session (Session): The database session to use for entity creation.
+ entity_type (EntityType): The newly associated EntityType to evaluate against the signal instance.
+ signal_instance (SignalInstance): The signal instance to recalculate entities for.
+ """
+ # fetch `all` entities that should be associated with all signal definitions
+ entity_types = entity_type_service.get_all(
+ db_session=db_session, scope=EntityScopeEnum.all
+ ).all()
+ entity_types.extend(signal_instance.signal.entity_types)
+ entity_types.append(entity_type)
+
+ if entity_types:
+ entities = entity_service.find_entities(
+ db_session=db_session,
+ signal_instance=signal_instance,
+ entity_types=entity_types,
+ )
+ signal_instance.entities = entities
+ db_session.commit()
+
+ return signal_instance
diff --git a/src/dispatch/entity_type/models.py b/src/dispatch/entity_type/models.py
new file mode 100644
index 000000000000..eb7525426ab9
--- /dev/null
+++ b/src/dispatch/entity_type/models.py
@@ -0,0 +1,90 @@
+from pydantic import Field
+
+from sqlalchemy import Column, Integer, String
+from sqlalchemy.sql.sqltypes import Boolean
+from sqlalchemy_utils import TSVectorType
+
+from dispatch.enums import DispatchEnum
+from dispatch.database.core import Base
+from dispatch.models import (
+ DispatchBase,
+ NameStr,
+ TimeStampMixin,
+ ProjectMixin,
+ PrimaryKey,
+ Pagination,
+)
+from dispatch.project.models import ProjectRead
+
+
+class EntityScopeEnum(DispatchEnum):
+ single = "single" # is only associated with a single definition
+ multiple = "multiple" # can be associated with multiple definitions
+ all = "all" # is associated with all definitions implicitly
+
+
+class EntityType(Base, TimeStampMixin, ProjectMixin):
+ id = Column(Integer, primary_key=True)
+ name = Column(String)
+ description = Column(String)
+ jpath = Column(String)
+ regular_expression = Column(String)
+ scope = Column(String, default=EntityScopeEnum.single, nullable=False)
+ enabled = Column(Boolean, default=False)
+ search_vector = Column(
+ TSVectorType(
+ "name",
+ "description",
+ weights={"name": "A", "description": "B"},
+ regconfig="pg_catalog.simple",
+ )
+ )
+
+
+# Pydantic models
+
+
+class SignalRead(DispatchBase):
+ id: PrimaryKey
+ name: str
+ owner: str
+ conversation_target: str | None = None
+ description: str | None = None
+ variant: str | None = None
+
+
+class EntityTypeBase(DispatchBase):
+ name: NameStr | None = None
+ description: str | None = None
+ jpath: str | None = None
+ scope: EntityScopeEnum | None = Field(EntityScopeEnum.single, nullable=False)
+ enabled: bool | None = None
+ signals: list[SignalRead | None] = Field([], nullable=True)
+ regular_expression: str | None = None
+
+
+class EntityTypeCreate(EntityTypeBase):
+ id: PrimaryKey | None = None
+ project: ProjectRead
+
+
+class EntityTypeUpdate(EntityTypeBase):
+ id: PrimaryKey = None
+
+
+class EntityTypeRead(EntityTypeBase):
+ id: PrimaryKey
+ project: ProjectRead
+
+
+class EntityTypeReadMinimal(DispatchBase):
+ id: PrimaryKey
+ name: NameStr
+ description: str | None = None
+ scope: EntityScopeEnum
+ enabled: bool | None = None
+ regular_expression: str | None = None
+
+
+class EntityTypePagination(Pagination):
+ items: list[EntityTypeRead]
diff --git a/src/dispatch/entity_type/service.py b/src/dispatch/entity_type/service.py
new file mode 100644
index 000000000000..219c48861337
--- /dev/null
+++ b/src/dispatch/entity_type/service.py
@@ -0,0 +1,162 @@
+import logging
+
+from pydantic import ValidationError
+from sqlalchemy.orm import Query, Session
+from jsonpath_ng import parse
+from dispatch.project import service as project_service
+from dispatch.signal import service as signal_service
+from .models import EntityType, EntityTypeCreate, EntityTypeRead, EntityTypeUpdate
+
+logger = logging.getLogger(__name__)
+
+
+def get(*, db_session, entity_type_id: int) -> EntityType | None:
+ """Gets an entity type by its id."""
+ return db_session.query(EntityType).filter(EntityType.id == entity_type_id).one_or_none()
+
+
+def get_by_name(*, db_session: Session, project_id: int, name: str) -> EntityType | None:
+ """Gets an entity type by its name."""
+ return (
+ db_session.query(EntityType)
+ .filter(EntityType.name == name)
+ .filter(EntityType.project_id == project_id)
+ .one_or_none()
+ )
+
+
+def get_by_name_or_raise(
+ *, db_session: Session, project_id: int, entity_type_in: EntityTypeRead
+) -> EntityTypeRead:
+ """Returns the entity type specified or raises ValidationError."""
+ entity_type = get_by_name(
+ db_session=db_session, project_id=project_id, name=entity_type_in.name
+ )
+
+ if not entity_type:
+ raise ValidationError.from_exception_data(
+ "EntityTypeRead",
+ [
+ {
+ "type": "value_error",
+ "loc": ("entity_type",),
+ "input": entity_type_in.name,
+ "ctx": {"error": ValueError("Entity type not found.")},
+ }
+ ],
+ )
+
+ return entity_type
+
+
+def get_all(*, db_session: Session, scope: str | None = None) -> Query:
+ """Gets all entity types."""
+ if scope:
+ return db_session.query(EntityType).filter(EntityType.scope == scope)
+ return db_session.query(EntityType)
+
+
+def create(
+ *, db_session: Session, entity_type_in: EntityTypeCreate, case_id: int | None = None
+) -> EntityType:
+ """Creates a new entity type and extracts entities from existing signal instances."""
+ project = project_service.get_by_name_or_raise(
+ db_session=db_session, project_in=entity_type_in.project
+ )
+ entity_type = EntityType(
+ **entity_type_in.dict(exclude={"project", "signals", "jpath"}), project=project
+ )
+
+ signals = []
+ for signal in entity_type_in.signals:
+ signal = signal_service.get(db_session=db_session, signal_id=signal.id)
+ signals.append(signal)
+
+ entity_type.signals = signals
+ set_jpath(entity_type, entity_type_in)
+
+ db_session.add(entity_type)
+ db_session.commit()
+
+ # Extract entities for all relevant signal instances
+ from dispatch.signal.models import SignalInstance
+ from dispatch.entity.service import find_entities
+
+ if case_id:
+ # Get all signal instances for the case
+ signal_instances = (
+ db_session.query(SignalInstance)
+ .filter(SignalInstance.case_id == case_id)
+ .limit(100)
+ .all()
+ )
+ # Extract and create entities for these instances using only the new entity_type
+ for signal_instance in signal_instances:
+ new_entities = find_entities(db_session, signal_instance, [entity_type])
+ # Associate new entities with the signal_instance
+ for entity in new_entities:
+ if entity not in signal_instance.entities:
+ signal_instance.entities.append(entity)
+
+ db_session.commit()
+
+ return entity_type
+
+
+def get_or_create(
+ *, db_session: Session, entity_type_in: EntityTypeCreate, case_id: int | None = None
+) -> EntityType:
+ """Gets or creates a new entity type."""
+ q = (
+ db_session.query(EntityType)
+ .filter(EntityType.name == entity_type_in.name)
+ .filter(EntityType.project_id == entity_type_in.project.id)
+ )
+
+ instance = q.first()
+ if instance:
+ return instance
+
+ return create(db_session=db_session, entity_type_in=entity_type_in, case_id=case_id)
+
+
+def update(
+ *, db_session: Session, entity_type: EntityType, entity_type_in: EntityTypeUpdate
+) -> EntityType:
+ """Updates an entity type."""
+ entity_type_data = entity_type.dict()
+ update_data = entity_type_in.dict(exclude={"jpath"}, exclude_unset=True)
+
+ for field in entity_type_data:
+ if field in update_data:
+ setattr(entity_type, field, update_data[field])
+
+ signals = []
+ for signal in entity_type_in.signals:
+ signal = signal_service.get(db_session=db_session, signal_id=signal.id)
+ signals.append(signal)
+
+ entity_type.signals = signals
+
+ set_jpath(entity_type, entity_type_in)
+
+ db_session.commit()
+ return entity_type
+
+
+def delete(*, db_session: Session, entity_type_id: int) -> None:
+ """Deletes an entity type."""
+ entity_type = db_session.query(EntityType).filter(EntityType.id == entity_type_id).one()
+ db_session.delete(entity_type)
+ db_session.commit()
+
+
+def set_jpath(entity_type: EntityType, entity_type_in: EntityTypeCreate):
+ """Validates the JSONPath expression and sets it on the entity type; leaves it empty if invalid."""
+ entity_type.jpath = ""
+ try:
+ parse(entity_type_in.jpath)
+ entity_type.jpath = entity_type_in.jpath
+ except Exception:
+ logger.error(
+ f"Failed to parse jPath: {entity_type_in.jpath}. The jPath field will be skipped."
+ )
diff --git a/src/dispatch/entity_type/views.py b/src/dispatch/entity_type/views.py
new file mode 100644
index 000000000000..35343a9439e8
--- /dev/null
+++ b/src/dispatch/entity_type/views.py
@@ -0,0 +1,200 @@
+import logging
+
+from fastapi import APIRouter, HTTPException, status
+from pydantic import ValidationError
+from sqlalchemy.exc import IntegrityError
+
+from dispatch.case.messaging import send_entity_update_notification
+from dispatch.case.service import get as get_case
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.models import PrimaryKey
+from dispatch.signal.models import SignalInstanceRead
+
+from .models import (
+ EntityTypeCreate,
+ EntityTypePagination,
+ EntityTypeRead,
+ EntityTypeUpdate,
+)
+from .flows import recalculate_entity_flow
+from .service import create, delete, get, update
+
+log = logging.getLogger(__name__)
+router = APIRouter()
+
+
+@router.get("", response_model=EntityTypePagination)
+def get_entity_types(common: CommonParameters):
+ """Get all entity types, or only those matching a given search term."""
+ return search_filter_sort_paginate(model="EntityType", **common)
+
+
+@router.get("/{entity_type_id}", response_model=EntityTypeRead)
+def get_entity_type(db_session: DbSession, entity_type_id: PrimaryKey):
+ """Get an entity type by its id."""
+ entity_type = get(db_session=db_session, entity_type_id=entity_type_id)
+ if not entity_type:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "An entity type with this id does not exist."}],
+ )
+ return entity_type
+
+
+@router.post("", response_model=EntityTypeRead)
+def create_entity_type(db_session: DbSession, entity_type_in: EntityTypeCreate):
+ """Create a new entity type."""
+ try:
+ entity_type = create(db_session=db_session, entity_type_in=entity_type_in)
+ except IntegrityError:
+ raise ValidationError(
+ [
+ {
+ "msg": "An entity type with this name already exists.",
+ "loc": "name",
+ }
+ ]
+ ) from None
+ return entity_type
+
+
+@router.post("/{case_id}", response_model=EntityTypeRead)
+def create_entity_type_with_case(
+ db_session: DbSession, case_id: PrimaryKey, entity_type_in: EntityTypeCreate
+):
+ """Create a new entity type and extract entities for an existing case."""
+ try:
+ entity_type = create(db_session=db_session, entity_type_in=entity_type_in, case_id=case_id)
+ except IntegrityError:
+ raise ValidationError(
+ [
+ {
+ "msg": "An entity type with this name already exists.",
+ "loc": "name",
+ }
+ ]
+ ) from None
+ return entity_type
+
+
+@router.put("/recalculate/{entity_type_id}/{case_id}", response_model=list[SignalInstanceRead])
+def recalculate(db_session: DbSession, entity_type_id: PrimaryKey, case_id: PrimaryKey):
+ """Recalculates the associated entities for all signal instances in a case."""
+ entity_type = get(
+ db_session=db_session,
+ entity_type_id=entity_type_id,
+ )
+ if not entity_type:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "An entity type with this id does not exist."}],
+ )
+
+ case = get_case(db_session=db_session, case_id=case_id)
+ if not case:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A case with this id does not exist."}],
+ )
+
+ # Get all signal instances associated with the case
+ signal_instances = case.signal_instances
+ if not signal_instances:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "No signal instances found for this case."}],
+ )
+
+ # Recalculate entities for each signal instance
+ updated_signal_instances = []
+ for signal_instance in signal_instances:
+ updated_signal_instance = recalculate_entity_flow(
+ db_session=db_session,
+ entity_type=entity_type,
+ signal_instance=signal_instance,
+ )
+ updated_signal_instances.append(updated_signal_instance)
+
+ try:
+ send_entity_update_notification(
+ db_session=db_session,
+ entity_type=entity_type,
+ case=signal_instances[0].case,
+ )
+ except Exception as e:
+ log.warning(f"Failed to send entity update notification: {e}")
+
+ return updated_signal_instances
+
+
+@router.put("/{entity_type_id}", response_model=EntityTypeRead)
+def update_entity_type(
+ db_session: DbSession,
+ entity_type_id: PrimaryKey,
+ entity_type_in: EntityTypeUpdate,
+):
+ """Update an entity type."""
+ entity_type = get(db_session=db_session, entity_type_id=entity_type_id)
+ if not entity_type:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "An entity type with this id does not exist."}],
+ )
+
+ try:
+ entity_type = update(
+ db_session=db_session, entity_type=entity_type, entity_type_in=entity_type_in
+ )
+ except IntegrityError:
+ raise ValidationError(
+ [
+ {
+ "msg": "An entity type with this name already exists.",
+ "loc": "name",
+ }
+ ]
+ ) from None
+ return entity_type
+
+
+@router.put("/{entity_type_id}/process", response_model=EntityTypeRead)
+def process_entity_type(
+ db_session: DbSession,
+ entity_type_id: PrimaryKey,
+ entity_type_in: EntityTypeUpdate,
+):
+ """Process an entity type."""
+ entity_type = get(db_session=db_session, entity_type_id=entity_type_id)
+ if not entity_type:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "An entity type with this id does not exist."}],
+ )
+
+ try:
+ entity_type = update(
+ db_session=db_session, entity_type=entity_type, entity_type_in=entity_type_in
+ )
+ except IntegrityError:
+ raise ValidationError(
+ [
+ {
+ "msg": "An entity type with this name already exists.",
+ "loc": "name",
+ }
+ ]
+ ) from None
+ return entity_type
+
+
+@router.delete("/{entity_type_id}", response_model=None)
+def delete_entity_type(db_session: DbSession, entity_type_id: PrimaryKey):
+ """Delete an entity type."""
+ entity_type = get(db_session=db_session, entity_type_id=entity_type_id)
+ if not entity_type:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "An entity type with this id does not exist."}],
+ )
+ delete(db_session=db_session, entity_type_id=entity_type_id)
diff --git a/src/dispatch/enums.py b/src/dispatch/enums.py
index d1e705f01f34..2fd2ca8a9615 100644
--- a/src/dispatch/enums.py
+++ b/src/dispatch/enums.py
@@ -1,15 +1,93 @@
-from enum import Enum
+from enum import StrEnum
-class Visibility(str, Enum):
+class DispatchEnum(StrEnum):
+ """
+ A custom Enum class that extends StrEnum.
+
+ This class inherits all functionality from StrEnum, including
+ string representation and automatic value conversion to strings.
+
+ Example:
+ class Visibility(DispatchEnum):
+ OPEN = "Open"
+ RESTRICTED = "Restricted"
+
+ assert str(Visibility.OPEN) == "Open"
+
+ Note:
+ In `3.12` we will get `__contains__` functionality:
+
+ DeprecationWarning: in 3.12 __contains__ will no longer raise TypeError, but will return True or
+ False depending on whether the value is a member or the value of a member
+ """
+
+ pass # No additional implementation needed
+
+
+class Visibility(DispatchEnum):
open = "Open"
restricted = "Restricted"
-class SearchTypes(str, Enum):
- term = "Term"
+class SearchTypes(DispatchEnum):
definition = "Definition"
- individual_contact = "Individual"
- team_contact = "Team"
+ document = "Document"
+ incident = "Incident"
+ incident_priority = "IncidentPriority"
+ incident_type = "IncidentType"
+ individual_contact = "IndividualContact"
+ plugin = "Plugin"
+ query = "Query"
+ search_filter = "SearchFilter"
+ case = "Case"
service = "Service"
- policy = "Policy"
+ source = "Source"
+ tag = "Tag"
+ task = "Task"
+ team_contact = "TeamContact"
+ term = "Term"
+
+
+class UserRoles(DispatchEnum):
+ owner = "Owner"
+ manager = "Manager"
+ admin = "Admin"
+ member = "Member"
+
+
+class DocumentResourceTypes(DispatchEnum):
+ case = "dispatch-case-document"
+ executive = "dispatch-executive-report-document"
+ incident = "dispatch-incident-document"
+ review = "dispatch-incident-review-document"
+ tracking = "dispatch-incident-sheet"
+
+
+class DocumentResourceReferenceTypes(DispatchEnum):
+ conversation = "dispatch-conversation-reference-document"
+ faq = "dispatch-incident-reference-faq-document"
+
+
+class DocumentResourceTemplateTypes(DispatchEnum):
+ case = "dispatch-case-document-template"
+ executive = "dispatch-executive-report-document-template"
+ incident = "dispatch-incident-document-template"
+ review = "dispatch-incident-review-document-template"
+ tracking = "dispatch-incident-tracking-template"
+ forms = "dispatch-forms-export-template"
+
+
+class EventType(DispatchEnum):
+ other = "Other" # default and catch-all (x resource created/updated, etc.)
+ field_updated = "Field updated" # for fields like title, description, tags, type, etc.
+ assessment_updated = "Assessment updated" # for priority, status, or severity changes
+ participant_updated = "Participant updated" # for added/removed users and role changes
+ imported_message = "Imported message" # for stopwatch-reacted messages from Slack
+ custom_event = "Custom event" # for user-added events (new feature)
+
+
+class SubjectNames(DispatchEnum):
+ CASE = "Case"
+ INCIDENT = "Incident"
+ SIGNAL = "Signal"
diff --git a/src/dispatch/event/__init__.py b/src/dispatch/event/__init__.py
new file mode 100644
index 000000000000..e69de29bb2d1
diff --git a/src/dispatch/event/flows.py b/src/dispatch/event/flows.py
new file mode 100644
index 000000000000..2dd167949d47
--- /dev/null
+++ b/src/dispatch/event/flows.py
@@ -0,0 +1,143 @@
+import logging
+
+from dispatch.decorators import background_task
+from dispatch.event import service as event_service
+from dispatch.incident import service as incident_service
+from dispatch.case import service as case_service
+from dispatch.individual import service as individual_service
+from dispatch.event.models import EventUpdate, EventCreateMinimal
+from dispatch.auth import service as auth_service
+
+log = logging.getLogger(__name__)
+
+
+@background_task
+def log_incident_event(
+ user_email: str,
+ incident_id: int,
+ event_in: EventCreateMinimal,
+ db_session=None,
+ organization_slug: str = None,
+):
+ incident = incident_service.get(db_session=db_session, incident_id=incident_id)
+ individual = individual_service.get_by_email_and_project(
+ db_session=db_session, email=user_email, project_id=incident.project.id
+ )
+ event_in.source = f"Custom event created by {individual.name}"
+ event_in.owner = individual.name
+
+ event_service.log_incident_event(
+ db_session=db_session,
+ incident_id=incident_id,
+ individual_id=individual.id,
+ **event_in.__dict__,
+ )
+
+
+@background_task
+def update_incident_event(
+ event_in: EventUpdate,
+ db_session=None,
+ organization_slug: str = None,
+):
+ event_service.update_incident_event(
+ db_session=db_session,
+ event_in=event_in,
+ )
+
+
+@background_task
+def delete_incident_event(
+ event_uuid: str,
+ db_session=None,
+ organization_slug: str = None,
+):
+ event_service.delete_incident_event(
+ db_session=db_session,
+ uuid=event_uuid,
+ )
+
+
+def export_timeline(
+ timeline_filters: dict,
+ incident_id: int,
+ db_session=None,
+ organization_slug: str = None,
+):
+ event_service.export_timeline(
+ db_session=db_session,
+ timeline_filters=timeline_filters,
+ incident_id=incident_id,
+ )
+
+
+@background_task
+def log_case_event(
+ user_email: str,
+ case_id: int,
+ event_in: EventCreateMinimal,
+ db_session=None,
+ organization_slug: str = None,
+):
+ case = case_service.get(db_session=db_session, case_id=case_id)
+ individual = individual_service.get_by_email_and_project(
+ db_session=db_session, email=user_email, project_id=case.project.id
+ )
+ event_in.source = f"Custom event created by {individual.name}"
+ event_in.owner = individual.name
+
+ # Get dispatch user by email
+ dispatch_user = auth_service.get_by_email(db_session=db_session, email=user_email)
+ dispatch_user_id = dispatch_user.id if dispatch_user else None
+
+ event_service.log_case_event(
+ db_session=db_session,
+ case_id=case_id,
+ dispatch_user_id=dispatch_user_id,
+ **event_in.__dict__,
+ )
+
+
+@background_task
+def update_case_event(
+ event_in: EventUpdate,
+ db_session=None,
+ organization_slug: str = None,
+):
+ event_service.update_case_event(
+ db_session=db_session,
+ event_in=event_in,
+ )
+
+
+@background_task
+def delete_case_event(
+ event_uuid: str,
+ db_session=None,
+ organization_slug: str = None,
+):
+ event_service.delete_case_event(
+ db_session=db_session,
+ uuid=event_uuid,
+ )
+
+
+def export_case_timeline(
+ timeline_filters: dict,
+ case_id: int,
+ db_session=None,
+ organization_slug: str = None,
+):
+ event_service.export_case_timeline(
+ db_session=db_session,
+ timeline_filters=timeline_filters,
+ case_id=case_id,
+ )
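The `@background_task` decorator imported at the top of this module is a Dispatch internal that runs the wrapped flow off the request path, supplying its own `db_session`. A rough stdlib sketch of that pattern using a daemon thread, purely to illustrate the shape (the real decorator also handles session scoping and the `organization_slug` keyword):

```python
import threading
from functools import wraps


def background_task(func):
    """Run the wrapped function in a daemon thread and return the Thread handle."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        thread = threading.Thread(target=func, args=args, kwargs=kwargs, daemon=True)
        thread.start()
        return thread
    return wrapper


results = []


@background_task
def log_event(description: str):
    results.append(description)


thread = log_event("Custom event created")
thread.join()  # in real flows the caller would not block like this
print(results)  # ['Custom event created']
```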
diff --git a/src/dispatch/event/models.py b/src/dispatch/event/models.py
new file mode 100644
index 000000000000..5f7293435197
--- /dev/null
+++ b/src/dispatch/event/models.py
@@ -0,0 +1,76 @@
+from datetime import datetime
+from uuid import UUID
+
+
+from sqlalchemy import Column, DateTime, ForeignKey, Integer, String, Boolean
+from sqlalchemy.dialects.postgresql import UUID as SQLAlchemyUUID
+from sqlalchemy_utils import TSVectorType, JSONType
+
+from dispatch.database.core import Base
+from dispatch.models import DispatchBase, TimeStampMixin
+from dispatch.enums import EventType
+
+
+# SQLAlchemy Model
+class Event(Base, TimeStampMixin):
+ # columns
+ id = Column(Integer, primary_key=True)
+ uuid = Column(SQLAlchemyUUID(as_uuid=True), unique=True, nullable=False)
+ started_at = Column(DateTime, nullable=False)
+ ended_at = Column(DateTime, nullable=False)
+ source = Column(String, nullable=False)
+ description = Column(String, nullable=False)
+ details = Column(JSONType, nullable=True)
+ type = Column(String, default=EventType.other, nullable=True)
+ owner = Column(String, nullable=True)
+ pinned = Column(Boolean, default=False)
+
+ # relationships
+ individual_id = Column(Integer, ForeignKey("individual_contact.id", ondelete="CASCADE"))
+ incident_id = Column(Integer, ForeignKey("incident.id", ondelete="CASCADE"))
+
+ dispatch_user_id = Column(
+ Integer, ForeignKey("dispatch_core.dispatch_user.id", ondelete="CASCADE")
+ )
+ case_id = Column(Integer, ForeignKey("case.id", ondelete="CASCADE"))
+ signal_id = Column(Integer, ForeignKey("signal.id", ondelete="CASCADE"))
+
+ # full text search capabilities
+ search_vector = Column(
+ TSVectorType("source", "description", weights={"source": "A", "description": "B"})
+ )
+
+
+# Pydantic Models
+class EventBase(DispatchBase):
+ uuid: UUID
+ started_at: datetime
+ ended_at: datetime
+ source: str
+ description: str
+ details: dict | None = None
+ type: str | None = None
+ owner: str | None = None
+ pinned: bool | None = False
+
+
+class EventCreate(EventBase):
+ pass
+
+
+class EventUpdate(EventBase):
+ pass
+
+
+class EventRead(EventBase):
+ pass
+
+
+class EventCreateMinimal(DispatchBase):
+ started_at: datetime
+ source: str
+ description: str
+ details: dict | None = None
+ type: str | None = None
+ owner: str | None = None
+ pinned: bool | None = False
diff --git a/src/dispatch/event/service.py b/src/dispatch/event/service.py
new file mode 100644
index 000000000000..f0c5c9906391
--- /dev/null
+++ b/src/dispatch/event/service.py
@@ -0,0 +1,882 @@
+from uuid import uuid4
+from datetime import datetime
+import logging
+import json
+import pytz
+
+from dispatch.auth import service as auth_service
+from dispatch.case import service as case_service
+from dispatch.incident import service as incident_service
+from dispatch.individual import service as individual_service
+from dispatch.enums import EventType
+from dispatch.incident.models import Incident
+from dispatch.signal import service as signal_service
+from dispatch.types import Subject
+
+from .models import Event, EventCreate, EventUpdate
+from dispatch.document import service as document_service
+from dispatch.plugin import service as plugin_service
+
+log = logging.getLogger(__name__)
+
+
+def get(*, db_session, event_id: int) -> Event | None:
+ """Get an event by id."""
+ return db_session.query(Event).filter(Event.id == event_id).one_or_none()
+
+
+def get_by_case_id(*, db_session, case_id: int) -> list[Event | None]:
+ """Get events by case id."""
+ return db_session.query(Event).filter(Event.case_id == case_id).order_by(Event.started_at)
+
+
+def get_by_incident_id(*, db_session, incident_id: int) -> list[Event | None]:
+ """Get events by incident id."""
+
+ return (
+ db_session.query(Event).filter(Event.incident_id == incident_id).order_by(Event.started_at)
+ )
+
+
+def get_by_uuid(*, db_session, uuid: str) -> Event | None:
+ """Get events by uuid."""
+ return db_session.query(Event).filter(Event.uuid == uuid).one_or_none()
+
+
+def get_all(*, db_session) -> list[Event | None]:
+ """Get all events."""
+ return db_session.query(Event)
+
+
+def create(*, db_session, event_in: EventCreate) -> Event:
+ """Create a new event."""
+ event = Event(**event_in.dict())
+ db_session.add(event)
+ db_session.commit()
+ return event
+
+
+def update(*, db_session, event: Event, event_in: EventUpdate) -> Event:
+ """Updates an event."""
+ event_data = event.dict()
+ update_data = event_in.dict(exclude_unset=True)
+
+ for field in event_data:
+ if field in update_data:
+ setattr(event, field, update_data[field])
+
+ db_session.commit()
+ return event
+
+
+def delete(*, db_session, event_id: int):
+ """Deletes an event."""
+ event = db_session.query(Event).filter(Event.id == event_id).first()
+ db_session.delete(event)
+ db_session.commit()
+
+
+def log_subject_event(subject: Subject, **kwargs) -> Event:
+ if isinstance(subject, Incident):
+ return log_incident_event(incident_id=subject.id, **kwargs)
+ else:
+ return log_case_event(case_id=subject.id, **kwargs)
+
+
+def log_incident_event(
+ db_session,
+ source: str,
+ description: str,
+ incident_id: int,
+ individual_id: int | None = None,
+ started_at: datetime | None = None,
+ ended_at: datetime | None = None,
+ details: dict | None = None,
+ type: str = EventType.other,
+ owner: str = "",
+ pinned: bool = False,
+) -> Event:
+ """Logs an event in the incident timeline."""
+ uuid = uuid4()
+
+ if not started_at:
+ started_at = datetime.utcnow()
+
+ if not ended_at:
+ ended_at = started_at
+
+ event_in = EventCreate(
+ uuid=uuid,
+ started_at=started_at,
+ ended_at=ended_at,
+ source=source,
+ description=description,
+ details=details,
+ type=type,
+ owner=owner,
+ pinned=pinned,
+ )
+ event = create(db_session=db_session, event_in=event_in)
+
+ incident = incident_service.get(db_session=db_session, incident_id=incident_id)
+ incident.events.append(event)
+ db_session.add(incident)
+
+ if individual_id:
+ individual = individual_service.get(
+ db_session=db_session, individual_contact_id=individual_id
+ )
+ individual.events.append(event)
+ db_session.add(individual)
+
+ db_session.commit()
+
+ return event
+
+
+def log_case_event(
+ db_session,
+ source: str,
+ description: str,
+ case_id: int,
+ dispatch_user_id: int | None = None,
+ started_at: datetime | None = None,
+ ended_at: datetime | None = None,
+ details: dict | None = None,
+ type: str = EventType.other,
+ owner: str = "",
+ pinned: bool = False,
+) -> Event:
+ """Logs an event in the case timeline."""
+ uuid = uuid4()
+
+ if not started_at:
+ started_at = datetime.utcnow()
+
+ if not ended_at:
+ ended_at = started_at
+
+ event_in = EventCreate(
+ uuid=uuid,
+ started_at=started_at,
+ ended_at=ended_at,
+ source=source,
+ description=description,
+ details=details,
+ type=type,
+ owner=owner,
+ pinned=pinned,
+ )
+ event = create(db_session=db_session, event_in=event_in)
+
+ case = case_service.get(db_session=db_session, case_id=case_id)
+ case.events.append(event)
+ db_session.add(case)
+
+ if dispatch_user_id:
+ dispatch_user = auth_service.get(db_session=db_session, user_id=dispatch_user_id)
+ dispatch_user.events.append(event)
+ db_session.add(dispatch_user)
+
+ db_session.commit()
+
+ return event
+
+
+def log_signal_event(
+ db_session,
+ source: str,
+ description: str,
+ signal_id: int,
+ individual_id: int | None = None,
+ started_at: datetime | None = None,
+ ended_at: datetime | None = None,
+ details: dict | None = None,
+ dispatch_user_id: int | None = None,
+ type: str = EventType.other,
+ owner: str = "",
+ pinned: bool = False,
+) -> Event:
+ """Logs an event in the signal definition timeline."""
+ uuid = uuid4()
+
+ if not started_at:
+ started_at = datetime.utcnow()
+
+ if not ended_at:
+ ended_at = started_at
+
+ event_in = EventCreate(
+ uuid=uuid,
+ started_at=started_at,
+ ended_at=ended_at,
+ source=source,
+ description=description,
+ details=details,
+ type=type,
+ owner=owner,
+ pinned=pinned,
+ dispatch_user_id=dispatch_user_id,
+ )
+ event = create(db_session=db_session, event_in=event_in)
+
+ signal = signal_service.get(db_session=db_session, signal_id=signal_id)
+ signal.events.append(event)
+ db_session.add(signal)
+
+ if individual_id:
+ individual = individual_service.get(
+ db_session=db_session, individual_contact_id=individual_id
+ )
+ individual.events.append(event)
+ db_session.add(individual)
+
+ db_session.commit()
+
+ return event
+
+
+def update_incident_event(
+ db_session,
+ event_in: EventUpdate,
+) -> Event:
+ """Updates an event in the incident timeline."""
+ event = get_by_uuid(db_session=db_session, uuid=event_in.uuid)
+ event = update(db_session=db_session, event=event, event_in=event_in)
+
+ return event
+
+
+def delete_incident_event(
+ db_session,
+ uuid: str,
+):
+ """Deletes an event."""
+ event = get_by_uuid(db_session=db_session, uuid=uuid)
+
+ delete(db_session=db_session, event_id=event.id)
+
+
+def update_case_event(
+ db_session,
+ event_in: EventUpdate,
+) -> Event:
+ """Updates an event in the case timeline."""
+ event = get_by_uuid(db_session=db_session, uuid=event_in.uuid)
+ event = update(db_session=db_session, event=event, event_in=event_in)
+
+ return event
+
+
+def delete_case_event(
+ db_session,
+ uuid: str,
+):
+ """Deletes a case event."""
+ event = get_by_uuid(db_session=db_session, uuid=uuid)
+
+ delete(db_session=db_session, event_id=event.id)
+
+
+def export_timeline(
+ db_session,
+ timeline_filters: dict,
+ incident_id: int,
+):
+ incident = incident_service.get(db_session=db_session, incident_id=incident_id)
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project_id, plugin_type="document"
+ )
+ if not plugin:
+ log.error("Document not created. No storage plugin enabled.")
+ return False
+
+ # Get timeline events for the incident
+ event = get_by_incident_id(db_session=db_session, incident_id=incident_id)
+ table_data = []
+ dates = set()
+ data_inserted = False
+
+ # Filter events based on the user-provided filters
+ for e in event.all():
+ time_header = "Time (UTC)"
+ event_timestamp = e.started_at.strftime("%Y-%m-%d %H:%M:%S")
+ if not e.owner:
+ e.owner = "Dispatch"
+ if timeline_filters.get("timezone").strip() == "America/Los_Angeles":
+ time_header = "Time (PST/PDT)"
+ event_timestamp = (
+ pytz.utc.localize(e.started_at)
+ .astimezone(pytz.timezone(timeline_filters.get("timezone").strip()))
+ .replace(tzinfo=None)
+ .strftime("%Y-%m-%d %H:%M:%S")
+ )
+ date, time = str(event_timestamp).split(" ")
+ if e.pinned or timeline_filters.get(e.type):
+ if date in dates:
+ if timeline_filters.get("exportOwner"):
+ table_data.append(
+ {time_header: time, "Description": e.description, "Owner": e.owner}
+ )
+ else:
+ table_data.append({time_header: time, "Description": e.description})
+
+ else:
+ dates.add(date)
+ if timeline_filters.get("exportOwner"):
+ table_data.append({time_header: date, "Description": "\t", "Owner": "\t"})
+ table_data.append(
+ {time_header: time, "Description": e.description, "Owner": e.owner}
+ )
+ else:
+ table_data.append({time_header: date, "Description": "\t"})
+ table_data.append({time_header: time, "Description": e.description})
+
+ if table_data:
+ table_data = json.loads(json.dumps(table_data))
+ num_columns = len(table_data[0].keys() if table_data else [])
+ column_headers = table_data[0].keys()
+
+ documents_list = []
+ if timeline_filters.get("incidentDocument"):
+ documents = document_service.get_by_incident_id_and_resource_type(
+ db_session=db_session,
+ incident_id=incident_id,
+ project_id=incident.project.id,
+ resource_type="dispatch-incident-document",
+ )
+ if documents:
+ documents_list.append((documents.resource_id, "Incident"))
+
+ if timeline_filters.get("reviewDocument"):
+ documents = document_service.get_by_incident_id_and_resource_type(
+ db_session=db_session,
+ incident_id=incident_id,
+ project_id=incident.project.id,
+ resource_type="dispatch-incident-review-document",
+ )
+ if documents:
+ documents_list.append((documents.resource_id, "Incident Review"))
+
+ for doc_id, doc_name in documents_list:
+ # Checks for existing table in the document
+ table_exists, curr_table_start, curr_table_end, _ = plugin.instance.get_table_details(
+ document_id=doc_id, header="Timeline", doc_name=doc_name
+ )
+
+ # Deletes existing table
+ if table_exists:
+ delete_table_request = [
+ {
+ "deleteContentRange": {
+ "range": {
+ "segmentId": "",
+ "startIndex": curr_table_start,
+ "endIndex": curr_table_end,
+ }
+ }
+ }
+ ]
+ if plugin.instance.delete_table(document_id=doc_id, request=delete_table_request):
+ log.debug("Existing table in the doc has been deleted")
+
+ else:
+ log.debug("Table doesn't exist under header, creating new table")
+ curr_table_start += 1
+
+ # Insert new table with required rows & columns
+ insert_table_request = [
+ {
+ "insertTable": {
+ "rows": len(table_data) + 1,
+ "columns": num_columns,
+ "location": {"index": curr_table_start - 1},
+ }
+ }
+ ]
+ if plugin.instance.insert(document_id=doc_id, request=insert_table_request):
+ log.debug("Table skeleton inserted successfully")
+
+ else:
+ log.error(
+ f"Unable to insert table skeleton in the {doc_name} document with id {doc_id}"
+ )
+ raise Exception(
+ f"Unable to insert table skeleton for timeline export in the {doc_name} document"
+ )
+
+ # Formatting & inserting empty table
+ insert_data_request = [
+ {
+ "updateTableCellStyle": {
+ "tableCellStyle": {
+ "backgroundColor": {
+ "color": {"rgbColor": {"green": 0.4, "red": 0.4, "blue": 0.4}}
+ }
+ },
+ "fields": "backgroundColor",
+ "tableRange": {
+ "columnSpan": num_columns,
+ "rowSpan": 1,
+ "tableCellLocation": {
+ "columnIndex": 0,
+ "rowIndex": 0,
+ "tableStartLocation": {"index": curr_table_start},
+ },
+ },
+ }
+ },
+ {
+ "updateTableColumnProperties": {
+ "tableStartLocation": {
+ "index": curr_table_start,
+ },
+ "columnIndices": [0],
+ "tableColumnProperties": {
+ "width": {"magnitude": 90, "unit": "PT"},
+ "widthType": "FIXED_WIDTH",
+ },
+ "fields": "width,widthType",
+ }
+ },
+ ]
+
+ if timeline_filters.get("exportOwner"):
+ insert_data_request.append(
+ {
+ "updateTableColumnProperties": {
+ "tableStartLocation": {
+ "index": curr_table_start,
+ },
+ "columnIndices": [2],
+ "tableColumnProperties": {
+ "width": {"magnitude": 105, "unit": "PT"},
+ "widthType": "FIXED_WIDTH",
+ },
+ "fields": "width,widthType",
+ }
+ }
+ )
+
+ if plugin.instance.insert(document_id=doc_id, request=insert_data_request):
+ log.debug("Table Formatted successfully")
+
+ else:
+ log.error(
+ f"Unable to format table for timeline export in {doc_name} document with id {doc_id}"
+ )
+ raise Exception(
+ f"Unable to format table for timeline export in the {doc_name} document"
+ )
+
+ # Calculating table cell indices
+ _, _, _, cell_indices = plugin.instance.get_table_details(
+ document_id=doc_id, header="Timeline", doc_name=doc_name
+ )
+ data_to_insert = list(column_headers) + [
+ item for row in table_data for item in row.values()
+ ]
+ str_len = 0
+ row_idx = 0
+ insert_data_request = []
+
+ for index, text in zip(cell_indices, data_to_insert, strict=True):
+ # Adjusting index based on string length
+ new_idx = index + str_len
+
+ insert_data_request.append(
+ {"insertText": {"location": {"index": new_idx}, "text": text}}
+ )
+
+ # Header field formatting
+ if text in column_headers:
+ insert_data_request.append(
+ {
+ "updateTextStyle": {
+ "range": {"startIndex": new_idx, "endIndex": new_idx + len(text)},
+ "textStyle": {
+ "bold": True,
+ "foregroundColor": {
+ "color": {"rgbColor": {"red": 1, "green": 1, "blue": 1}}
+ },
+ "fontSize": {"magnitude": 10, "unit": "PT"},
+ },
+ "fields": "bold,foregroundColor",
+ }
+ }
+ )
+
+                # Formatting for date rows
+ if text == "\t":
+ insert_data_request.append(
+ {
+ "updateTableCellStyle": {
+ "tableCellStyle": {
+ "backgroundColor": {
+ "color": {
+ "rgbColor": {"green": 0.8, "red": 0.8, "blue": 0.8}
+ }
+ }
+ },
+ "fields": "backgroundColor",
+ "tableRange": {
+ "columnSpan": num_columns,
+ "rowSpan": 1,
+ "tableCellLocation": {
+ "tableStartLocation": {"index": curr_table_start},
+ "columnIndex": 0,
+ "rowIndex": row_idx // len(column_headers),
+ },
+ },
+ }
+ }
+ )
+
+                # Formatting for time column
+ if row_idx % num_columns == 0:
+ insert_data_request.append(
+ {
+ "updateTextStyle": {
+ "range": {"startIndex": new_idx, "endIndex": new_idx + len(text)},
+ "textStyle": {
+ "bold": True,
+ },
+ "fields": "bold",
+ }
+ }
+ )
+
+ row_idx += 1
+ str_len += len(text) if text else 0
+
+ data_inserted = plugin.instance.insert(document_id=doc_id, request=insert_data_request)
+ if not data_inserted:
+ raise Exception(f"Encountered error while inserting data into the {doc_name} document")
+
+ else:
+ log.error("No data to export")
+ raise Exception("No data to export, please check filter selection")
+
+ return True
+
+
+def export_case_timeline(
+ db_session,
+    timeline_filters: dict,
+ case_id: int,
+):
+ case = case_service.get(db_session=db_session, case_id=case_id)
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=case.project_id, plugin_type="document"
+ )
+ if not plugin:
+        log.error("Timeline not exported. No document plugin enabled.")
+ return False
+
+    # Get timeline events for the case
+    events = get_by_case_id(db_session=db_session, case_id=case_id)
+    table_data = []
+    dates = set()
+    data_inserted = False
+
+    # Filter events based on the user-selected timeline filters
+    for e in events:
+ time_header = "Time (UTC)"
+ event_timestamp = e.started_at.strftime("%Y-%m-%d %H:%M:%S")
+ if not e.owner:
+ e.owner = "Dispatch"
+        if timeline_filters.get("timezone", "").strip() == "America/Los_Angeles":
+ time_header = "Time (PST/PDT)"
+ event_timestamp = (
+ pytz.utc.localize(e.started_at)
+ .astimezone(pytz.timezone(timeline_filters.get("timezone").strip()))
+ .replace(tzinfo=None)
+ .strftime("%Y-%m-%d %H:%M:%S")
+ )
+ date, time = str(event_timestamp).split(" ")
+ if e.pinned or timeline_filters.get(e.type):
+ if date in dates:
+ if timeline_filters.get("exportOwner"):
+ table_data.append(
+ {time_header: time, "Description": e.description, "Owner": e.owner}
+ )
+ else:
+ table_data.append({time_header: time, "Description": e.description})
+
+ else:
+ dates.add(date)
+ if timeline_filters.get("exportOwner"):
+ table_data.append({time_header: date, "Description": "\t", "Owner": "\t"})
+ table_data.append(
+ {time_header: time, "Description": e.description, "Owner": e.owner}
+ )
+ else:
+ table_data.append({time_header: date, "Description": "\t"})
+ table_data.append({time_header: time, "Description": e.description})
+
+ if table_data:
+        # Round-trip through JSON to normalize values into plain serializable types
+        table_data = json.loads(json.dumps(table_data))
+        num_columns = len(table_data[0])
+        column_headers = table_data[0].keys()
+
+ documents_list = []
+ if timeline_filters.get("caseDocument"):
+ documents = document_service.get_by_case_id_and_resource_type(
+ db_session=db_session,
+ case_id=case_id,
+ project_id=case.project.id,
+ resource_type="dispatch-case-document",
+ )
+ if documents:
+ documents_list.append((documents.resource_id, "Case"))
+
+ for doc_id, doc_name in documents_list:
+ # Checks for existing table in the document
+ table_exists, curr_table_start, curr_table_end, _ = plugin.instance.get_table_details(
+ document_id=doc_id, header="Timeline", doc_name=doc_name
+ )
+
+ # Deletes existing table
+ if table_exists:
+ delete_table_request = [
+ {
+ "deleteContentRange": {
+ "range": {
+ "segmentId": "",
+ "startIndex": curr_table_start,
+ "endIndex": curr_table_end,
+ }
+ }
+ }
+ ]
+ if plugin.instance.delete_table(document_id=doc_id, request=delete_table_request):
+ log.debug("Existing table in the doc has been deleted")
+
+ else:
+ log.debug("Table doesn't exist under header, creating new table")
+ curr_table_start += 1
+
+ # Insert new table with required rows & columns
+ insert_table_request = [
+ {
+ "insertTable": {
+ "rows": len(table_data) + 1,
+ "columns": num_columns,
+ "location": {"index": curr_table_start - 1},
+ }
+ }
+ ]
+ if plugin.instance.insert(document_id=doc_id, request=insert_table_request):
+ log.debug("Table skeleton inserted successfully")
+
+ else:
+ log.error(
+ f"Unable to insert table skeleton in the {doc_name} document with id {doc_id}"
+ )
+ raise Exception(
+ f"Unable to insert table skeleton for timeline export in the {doc_name} document"
+ )
+
+ # Formatting & inserting empty table
+ insert_data_request = [
+ {
+ "updateTableCellStyle": {
+ "tableCellStyle": {
+ "backgroundColor": {
+ "color": {"rgbColor": {"green": 0.4, "red": 0.4, "blue": 0.4}}
+ }
+ },
+ "fields": "backgroundColor",
+ "tableRange": {
+ "columnSpan": num_columns,
+ "rowSpan": 1,
+ "tableCellLocation": {
+ "columnIndex": 0,
+ "rowIndex": 0,
+ "tableStartLocation": {"index": curr_table_start},
+ },
+ },
+ }
+ },
+ {
+ "updateTableColumnProperties": {
+ "tableStartLocation": {
+ "index": curr_table_start,
+ },
+ "columnIndices": [0],
+ "tableColumnProperties": {
+ "width": {"magnitude": 90, "unit": "PT"},
+ "widthType": "FIXED_WIDTH",
+ },
+ "fields": "width,widthType",
+ }
+ },
+ ]
+
+ if timeline_filters.get("exportOwner"):
+ insert_data_request.append(
+ {
+ "updateTableColumnProperties": {
+ "tableStartLocation": {
+ "index": curr_table_start,
+ },
+ "columnIndices": [2],
+ "tableColumnProperties": {
+ "width": {"magnitude": 105, "unit": "PT"},
+ "widthType": "FIXED_WIDTH",
+ },
+ "fields": "width,widthType",
+ }
+ }
+ )
+
+ if plugin.instance.insert(document_id=doc_id, request=insert_data_request):
+                log.debug("Table formatted successfully")
+
+ else:
+ log.error(
+ f"Unable to format table for timeline export in {doc_name} document with id {doc_id}"
+ )
+ raise Exception(
+ f"Unable to format table for timeline export in the {doc_name} document"
+ )
+
+ # Calculating table cell indices
+ _, _, _, cell_indices = plugin.instance.get_table_details(
+ document_id=doc_id, header="Timeline", doc_name=doc_name
+ )
+ data_to_insert = list(column_headers) + [
+ item for row in table_data for item in row.values()
+ ]
+ str_len = 0
+ row_idx = 0
+ insert_data_request = []
+
+ for index, text in zip(cell_indices, data_to_insert, strict=True):
+ # Adjusting index based on string length
+ new_idx = index + str_len
+
+ insert_data_request.append(
+ {"insertText": {"location": {"index": new_idx}, "text": text}}
+ )
+
+ # Header field formatting
+ if text in column_headers:
+ insert_data_request.append(
+ {
+ "updateTextStyle": {
+ "range": {"startIndex": new_idx, "endIndex": new_idx + len(text)},
+ "textStyle": {
+ "bold": True,
+ "foregroundColor": {
+ "color": {"rgbColor": {"red": 1, "green": 1, "blue": 1}}
+ },
+ "fontSize": {"magnitude": 10, "unit": "PT"},
+ },
+ "fields": "bold,foregroundColor",
+ }
+ }
+ )
+
+                # Formatting for date rows
+ if text == "\t":
+ insert_data_request.append(
+ {
+ "updateTableCellStyle": {
+ "tableCellStyle": {
+ "backgroundColor": {
+ "color": {
+ "rgbColor": {"green": 0.8, "red": 0.8, "blue": 0.8}
+ }
+ }
+ },
+ "fields": "backgroundColor",
+ "tableRange": {
+ "columnSpan": num_columns,
+ "rowSpan": 1,
+ "tableCellLocation": {
+ "tableStartLocation": {"index": curr_table_start},
+ "columnIndex": 0,
+ "rowIndex": row_idx // len(column_headers),
+ },
+ },
+ }
+ }
+ )
+
+                # Formatting for time column
+ if row_idx % num_columns == 0:
+ insert_data_request.append(
+ {
+ "updateTextStyle": {
+ "range": {"startIndex": new_idx, "endIndex": new_idx + len(text)},
+ "textStyle": {
+ "bold": True,
+ },
+ "fields": "bold",
+ }
+ }
+ )
+
+ row_idx += 1
+ str_len += len(text) if text else 0
+
+ data_inserted = plugin.instance.insert(document_id=doc_id, request=insert_data_request)
+ if not data_inserted:
+ raise Exception(f"Encountered error while inserting data into the {doc_name} document")
+
+ else:
+ log.error("No data to export")
+ raise Exception("No data to export, please check filter selection")
+
+ return True
+
+
+def get_recent_summary_event(
+    db_session,
+    case_id: int | None = None,
+    incident_id: int | None = None,
+    max_age_seconds: int = 300,  # 5 minutes by default
+):
+    """Gets the most recent AI read-in summary event for this subject."""
+    from datetime import datetime, timedelta
+    from dispatch.ai.enums import AIEventSource, AIEventDescription
+
+    cutoff_time = datetime.utcnow() - timedelta(seconds=max_age_seconds)
+
+    # The same query applies to incidents and cases; only the subject filter differs
+    subject_filter = (
+        Event.incident_id == incident_id if incident_id else Event.case_id == case_id
+    )
+    return (
+        db_session.query(Event)
+        .filter(subject_filter)
+        .filter(Event.source == AIEventSource.dispatch_genai)
+        .filter(
+            Event.description.like(
+                f"{AIEventDescription.read_in_summary_created.format(participant_email='')}%"
+            )
+        )
+        .filter(Event.started_at >= cutoff_time)
+        .order_by(Event.started_at.desc())
+        .first()
+    )
diff --git a/src/dispatch/evergreen/scheduled.py b/src/dispatch/evergreen/scheduled.py
new file mode 100644
index 000000000000..e5774a7fc3e0
--- /dev/null
+++ b/src/dispatch/evergreen/scheduled.py
@@ -0,0 +1,130 @@
+"""
+.. module: dispatch.evergreen.scheduled
+ :platform: Unix
+ :copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
+ :license: Apache, see LICENSE for more details.
+"""
+
+import logging
+from typing import Any
+from collections import defaultdict
+from datetime import datetime
+from schedule import every
+
+from sqlalchemy.orm import Session
+
+from dispatch.decorators import scheduled_project_task, timer
+from dispatch.document import service as document_service
+from dispatch.messaging.strings import EVERGREEN_REMINDER
+from dispatch.notification import service as notification_service
+from dispatch.plugin import service as plugin_service
+from dispatch.project.models import Project
+from dispatch.scheduler import scheduler
+from dispatch.service import service as service_service
+from dispatch.team import service as team_service
+
+
+log = logging.getLogger(__name__)
+
+
+def create_evergreen_reminder(
+ db_session: Session, project: Project, owner_email: str, resource_groups: Any
+):
+ """Contains the logic for evergreen reminders."""
+ if not owner_email:
+        log.warning(
+            f"Evergreen reminder not sent. No owner email. Project: {project.name}. Organization: {project.organization.name}"
+        )
+ return
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, plugin_type="email", project_id=project.id
+ )
+ if not plugin:
+        log.warning(
+            f"Evergreen reminder not sent. No email plugin enabled. Project: {project.name}. Organization: {project.organization.name}"
+        )
+ return
+
+ items = []
+ for resource_type, resources in resource_groups.items():
+ for resource in resources:
+ weblink = getattr(resource, "weblink", "N/A")
+ items.append(
+ {
+ "description": getattr(resource, "description", None),
+ "name": resource.name,
+ "project": resource.project.name,
+ "resource_type": resource_type.replace("_", " ").title(),
+ "weblink": weblink,
+ }
+ )
+
+ notification_template = EVERGREEN_REMINDER
+ notification_type = "evergreen-reminder"
+ name = subject = notification_text = "Evergreen Reminder"
+
+ # Can raise exception "tenacity.RetryError: RetryError". (Email may still go through).
+ success = False
+ try:
+ success = plugin.instance.send(
+ owner_email,
+ notification_text,
+ notification_template,
+ notification_type,
+ name=name,
+ subject=subject,
+            items=items,  # the plugin expects a list of dicts
+ )
+ except Exception as e:
+ log.error(f"Error in sending {notification_text} email to {owner_email}: {e}")
+
+ if not success:
+ log.error(f"Unable to send evergreen message. Email: {owner_email}")
+ return
+
+ # we set the evergreen last reminder at time to now
+ for _, resources in resource_groups.items():
+ for resource in resources:
+ resource.evergreen_last_reminder_at = datetime.utcnow()
+
+ db_session.commit()
+
+
+def group_items_by_owner_and_type(items):
+ """Groups items by owner."""
+    grouped = defaultdict(lambda: defaultdict(list))
+ for item in items:
+ grouped[item.evergreen_owner][item.__tablename__].append(item)
+ return grouped
+
+
+@scheduler.add(every().monday.at("18:00"), name="create-evergreen-reminders")
+@timer
+@scheduled_project_task
+def create_evergreen_reminders(db_session: Session, project: Project):
+ """Sends reminders for items that have evergreen enabled."""
+ items = []
+
+ # Overdue evergreen documents
+ items += document_service.get_overdue_evergreen_documents(
+ db_session=db_session, project_id=project.id
+ )
+
+ # Overdue evergreen oncall services
+ items += service_service.get_overdue_evergreen_services(
+ db_session=db_session, project_id=project.id
+ )
+
+ # Overdue evergreen teams
+ items += team_service.get_overdue_evergreen_teams(db_session=db_session, project_id=project.id)
+
+ # Overdue evergreen notifications
+ items += notification_service.get_overdue_evergreen_notifications(
+ db_session=db_session, project_id=project.id
+ )
+
+ if items:
+ grouped_items = group_items_by_owner_and_type(items)
+        for owner, owner_items in grouped_items.items():
+            create_evergreen_reminder(db_session, project, owner, owner_items)
diff --git a/src/dispatch/exceptions.py b/src/dispatch/exceptions.py
index 952ea322a292..178759d1cdca 100644
--- a/src/dispatch/exceptions.py
+++ b/src/dispatch/exceptions.py
@@ -1,14 +1,52 @@
+try:
+ from pydantic.v1 import PydanticValueError
+except ImportError:
+ from pydantic import PydanticValueError
+
+
class DispatchException(Exception):
pass
-class InvalidConfiguration(DispatchException):
+class DispatchPluginException(DispatchException):
pass
-class InvalidFilterPolicy(DispatchException):
- pass
+class NotFoundError(PydanticValueError):
+ code = "not_found"
+ msg_template = "{msg}"
-class DispatchPluginException(DispatchException):
- pass
+class FieldNotFoundError(PydanticValueError):
+ code = "not_found.field"
+ msg_template = "{msg}"
+
+
+class ModelNotFoundError(PydanticValueError):
+ code = "not_found.model"
+ msg_template = "{msg}"
+
+
+class ExistsError(PydanticValueError):
+ code = "exists"
+ msg_template = "{msg}"
+
+
+class InvalidConfigurationError(PydanticValueError):
+ code = "invalid.configuration"
+ msg_template = "{msg}"
+
+
+class InvalidFilterError(PydanticValueError):
+ code = "invalid.filter"
+ msg_template = "{msg}"
+
+
+class InvalidUsernameError(PydanticValueError):
+ code = "invalid.username"
+ msg_template = "{msg}"
+
+
+class InvalidPasswordError(PydanticValueError):
+ code = "invalid.password"
+ msg_template = "{msg}"
diff --git a/src/dispatch/extensions.py b/src/dispatch/extensions.py
index 0b424587706c..210eea39b8cf 100644
--- a/src/dispatch/extensions.py
+++ b/src/dispatch/extensions.py
@@ -1,9 +1,17 @@
import logging
import sentry_sdk
+from sentry_sdk.integrations.sqlalchemy import SqlalchemyIntegration
+from sentry_sdk.integrations.aiohttp import AioHttpIntegration
from sentry_sdk.integrations.logging import LoggingIntegration
+from sentry_sdk.integrations.stdlib import StdlibIntegration
+from sentry_sdk.integrations.excepthook import ExcepthookIntegration
+from sentry_sdk.integrations.dedupe import DedupeIntegration
+from sentry_sdk.integrations.atexit import AtexitIntegration
+from sentry_sdk.integrations.modules import ModulesIntegration
+
-from .config import SENTRY_DSN, ENV
+from .config import ENV_TAGS, SENTRY_DSN, ENV
log = logging.getLogger(__file__)
@@ -16,4 +24,22 @@
def configure_extensions():
log.debug("Configuring extensions...")
if SENTRY_DSN:
- sentry_sdk.init(dsn=str(SENTRY_DSN), integrations=[sentry_logging], environment=ENV)
+ sentry_sdk.init(
+ dsn=str(SENTRY_DSN),
+ integrations=[
+ AioHttpIntegration(),
+ AtexitIntegration(),
+ DedupeIntegration(),
+ ExcepthookIntegration(),
+ ModulesIntegration(),
+ SqlalchemyIntegration(),
+ StdlibIntegration(),
+ sentry_logging,
+ ],
+ environment=ENV,
+ auto_enabling_integrations=False,
+ )
+ with sentry_sdk.configure_scope() as scope:
+ log.debug(f"Using the following tags... ENV_TAGS: {ENV_TAGS}")
+ for k, v in ENV_TAGS.items():
+ scope.set_tag(k, v)
diff --git a/src/dispatch/feedback/incident/__init__.py b/src/dispatch/feedback/incident/__init__.py
new file mode 100644
index 000000000000..e69de29bb2d1
diff --git a/src/dispatch/feedback/incident/enums.py b/src/dispatch/feedback/incident/enums.py
new file mode 100644
index 000000000000..d29150d630a7
--- /dev/null
+++ b/src/dispatch/feedback/incident/enums.py
@@ -0,0 +1,9 @@
+from dispatch.enums import DispatchEnum
+
+
+class FeedbackRating(DispatchEnum):
+ very_satisfied = "Very satisfied"
+ somewhat_satisfied = "Somewhat satisfied"
+ neither_satisfied_nor_dissatisfied = "Neither satisfied nor dissatisfied"
+ somewhat_dissatisfied = "Somewhat dissatisfied"
+ very_dissatisfied = "Very dissatisfied"
diff --git a/src/dispatch/feedback/incident/messaging.py b/src/dispatch/feedback/incident/messaging.py
new file mode 100644
index 000000000000..d663236044ee
--- /dev/null
+++ b/src/dispatch/feedback/incident/messaging.py
@@ -0,0 +1,115 @@
+import logging
+
+from sqlalchemy.orm import Session
+
+from dispatch.messaging.strings import (
+ INCIDENT_FEEDBACK_DAILY_REPORT,
+ CASE_FEEDBACK_DAILY_REPORT,
+ MessageType,
+)
+from dispatch.plugin import service as plugin_service
+
+from .models import Feedback
+
+
+log = logging.getLogger(__name__)
+
+
+def send_incident_feedback_daily_report(
+ commander_email: str, feedback: list[Feedback], project_id: int, db_session: Session
+):
+ """Sends an incident feedback daily report to all incident commanders who received feedback."""
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=project_id, plugin_type="email"
+ )
+
+ if not plugin:
+ log.warning("Incident feedback daily report not sent. Email plugin is not enabled.")
+ return
+
+ items = []
+ for piece in feedback:
+ participant = piece.participant.individual.name if piece.participant else "Anonymous"
+ items.append(
+ {
+ "name": piece.incident.name,
+ "title": piece.incident.title,
+ "rating": piece.rating,
+ "feedback": piece.feedback,
+ "participant": participant,
+ "created_at": piece.created_at,
+ }
+ )
+
+ name = subject = notification_text = "Incident Feedback Daily Report"
+ commander_fullname = feedback[0].incident.commander.individual.name
+ commander_weblink = feedback[0].incident.commander.individual.weblink
+
+ # Can raise exception "tenacity.RetryError: RetryError". (Email may still go through).
+ try:
+ plugin.instance.send(
+ commander_email,
+ notification_text,
+ INCIDENT_FEEDBACK_DAILY_REPORT,
+ MessageType.incident_feedback_daily_report,
+ name=name,
+ subject=subject,
+ cc=plugin.project.owner_email,
+ items=items,
+ contact_fullname=commander_fullname,
+ contact_weblink=commander_weblink,
+ )
+ except Exception as e:
+ log.error(f"Error in sending {notification_text} email to {commander_email}: {e}")
+
+ log.debug(f"Incident feedback daily report sent to {commander_email}.")
+
+
+def send_case_feedback_daily_report(
+ assignee_email: str, feedback: list[Feedback], project_id: int, db_session: Session
+):
+    """Sends a case feedback daily report to all case assignees who received feedback."""
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=project_id, plugin_type="email"
+ )
+
+ if not plugin:
+ log.warning("Case feedback daily report not sent. Email plugin is not enabled.")
+ return
+
+ items = []
+ for piece in feedback:
+ participant = piece.participant.individual.name if piece.participant else "Anonymous"
+ items.append(
+ {
+ "name": piece.case.name,
+ "title": piece.case.title,
+ "rating": piece.rating,
+ "feedback": piece.feedback,
+ "participant": participant,
+ "created_at": piece.created_at,
+ }
+ )
+
+ name = subject = notification_text = "Case Feedback Daily Report"
+ assignee_fullname = feedback[0].case.assignee.individual.name
+ assignee_weblink = feedback[0].case.assignee.individual.weblink
+
+ # Can raise exception "tenacity.RetryError: RetryError". (Email may still go through).
+ try:
+ plugin.instance.send(
+ assignee_email,
+ notification_text,
+ CASE_FEEDBACK_DAILY_REPORT,
+ MessageType.case_feedback_daily_report,
+ name=name,
+ subject=subject,
+ cc=plugin.project.owner_email,
+ items=items,
+ contact_fullname=assignee_fullname,
+ contact_weblink=assignee_weblink,
+ )
+ except Exception as e:
+ log.error(f"Error in sending {notification_text} email to {assignee_email}: {e}")
+
+ log.debug(f"Case feedback daily report sent to {assignee_email}.")
diff --git a/src/dispatch/feedback/incident/models.py b/src/dispatch/feedback/incident/models.py
new file mode 100644
index 000000000000..653907c6df59
--- /dev/null
+++ b/src/dispatch/feedback/incident/models.py
@@ -0,0 +1,64 @@
+from datetime import datetime
+from sqlalchemy import Column, Integer, ForeignKey
+from sqlalchemy_utils import TSVectorType
+
+from dispatch.database.core import Base
+from dispatch.incident.models import IncidentReadBasic
+from dispatch.models import (
+ DispatchBase,
+ TimeStampMixin,
+ FeedbackMixin,
+ PrimaryKey,
+ Pagination,
+ ProjectMixin,
+)
+from dispatch.participant.models import ParticipantRead
+from dispatch.project.models import ProjectRead
+from dispatch.case.models import CaseReadMinimal
+
+from .enums import FeedbackRating
+
+
+class Feedback(TimeStampMixin, FeedbackMixin, ProjectMixin, Base):
+ # Columns
+ id = Column(Integer, primary_key=True)
+
+ # Relationships
+ incident_id = Column(Integer, ForeignKey("incident.id", ondelete="CASCADE"))
+ case_id = Column(Integer, ForeignKey("case.id", ondelete="CASCADE"))
+ participant_id = Column(Integer, ForeignKey("participant.id"))
+
+ search_vector = Column(
+ TSVectorType(
+ "feedback",
+ "rating",
+ regconfig="pg_catalog.simple",
+ )
+ )
+
+
+# Pydantic models
+class FeedbackBase(DispatchBase):
+ created_at: datetime | None = None
+ rating: FeedbackRating = FeedbackRating.very_satisfied
+ feedback: str | None = None
+ incident: IncidentReadBasic | None = None
+ case: CaseReadMinimal | None = None
+ participant: ParticipantRead | None = None
+
+
+class FeedbackCreate(FeedbackBase):
+ pass
+
+
+class FeedbackUpdate(FeedbackBase):
+ id: PrimaryKey = None
+
+
+class FeedbackRead(FeedbackBase):
+ id: PrimaryKey
+ project: ProjectRead | None = None
+
+
+class FeedbackPagination(Pagination):
+ items: list[FeedbackRead]
diff --git a/src/dispatch/feedback/incident/scheduled.py b/src/dispatch/feedback/incident/scheduled.py
new file mode 100644
index 000000000000..0b0696acd736
--- /dev/null
+++ b/src/dispatch/feedback/incident/scheduled.py
@@ -0,0 +1,62 @@
+from collections import defaultdict
+from schedule import every
+import logging
+
+from sqlalchemy.orm import Session
+
+from dispatch.decorators import scheduled_project_task, timer
+from dispatch.project.models import Project
+from dispatch.scheduler import scheduler
+
+from .messaging import send_incident_feedback_daily_report, send_case_feedback_daily_report
+from .service import (
+ get_all_incident_last_x_hours_by_project_id,
+ get_all_case_last_x_hours_by_project_id,
+)
+
+log = logging.getLogger(__name__)
+
+
+def group_feedback_by_commander(feedback):
+ """Groups feedback by commander."""
+    grouped = defaultdict(list)
+ for piece in feedback:
+ if piece.incident and piece.incident.commander:
+ grouped[piece.incident.commander.individual.email].append(piece)
+ return grouped
+
+
+def group_feedback_by_assignee(feedback):
+ """Groups feedback by assignee."""
+    grouped = defaultdict(list)
+ for piece in feedback:
+ if piece.case and piece.case.assignee:
+ grouped[piece.case.assignee.individual.email].append(piece)
+ return grouped
+
+
+@scheduler.add(every(1).day.at("18:00"), name="feedback-report-daily")
+@timer
+@scheduled_project_task
+def feedback_report_daily(db_session: Session, project: Project):
+ """
+ Fetches all incident and case feedback provided in the last 24 hours
+ and sends a daily report to the commanders and assignees who handled the incidents/cases.
+ """
+ incident_feedback = get_all_incident_last_x_hours_by_project_id(
+ db_session=db_session, project_id=project.id
+ )
+
+ if incident_feedback:
+ grouped_incident_feedback = group_feedback_by_commander(incident_feedback)
+ for commander_email, feedback in grouped_incident_feedback.items():
+ send_incident_feedback_daily_report(commander_email, feedback, project.id, db_session)
+
+ case_feedback = get_all_case_last_x_hours_by_project_id(
+ db_session=db_session, project_id=project.id
+ )
+
+ if case_feedback:
+ grouped_case_feedback = group_feedback_by_assignee(case_feedback)
+ for assignee_email, feedback in grouped_case_feedback.items():
+ send_case_feedback_daily_report(assignee_email, feedback, project.id, db_session)
diff --git a/src/dispatch/feedback/incident/service.py b/src/dispatch/feedback/incident/service.py
new file mode 100644
index 000000000000..8d8d45d8c8c9
--- /dev/null
+++ b/src/dispatch/feedback/incident/service.py
@@ -0,0 +1,107 @@
+from datetime import datetime, timedelta
+
+from dispatch.incident import service as incident_service
+from dispatch.case import service as case_service
+from dispatch.incident.models import Incident
+from dispatch.case.models import Case
+from dispatch.project.models import Project
+
+from .models import Feedback, FeedbackCreate, FeedbackUpdate
+
+
+def get(*, db_session, feedback_id: int) -> Feedback | None:
+ """Gets a piece of feedback by its id."""
+ return db_session.query(Feedback).filter(Feedback.id == feedback_id).one_or_none()
+
+
+def get_all(*, db_session):
+ """Gets all pieces of feedback."""
+ return db_session.query(Feedback)
+
+
+def get_all_incident_last_x_hours_by_project_id(
+ *, db_session, hours: int = 24, project_id: int
+) -> list[Feedback | None]:
+ """Returns all feedback provided in the last x hours by project id. Defaults to 24 hours."""
+ return (
+ db_session.query(Feedback)
+ .join(Incident)
+ .join(Project)
+ .filter(Project.id == project_id)
+ .filter(Feedback.created_at >= datetime.utcnow() - timedelta(hours=hours))
+ .all()
+ )
+
+
+def get_all_case_last_x_hours_by_project_id(
+ *, db_session, hours: int = 24, project_id: int
+) -> list[Feedback | None]:
+ """Returns all feedback provided in the last x hours by project id. Defaults to 24 hours."""
+ return (
+ db_session.query(Feedback)
+ .join(Case)
+ .join(Project)
+ .filter(Project.id == project_id)
+ .filter(Feedback.created_at >= datetime.utcnow() - timedelta(hours=hours))
+ .all()
+ )
+
+
+def create(*, db_session, feedback_in: FeedbackCreate) -> Feedback:
+ """Creates a new piece of feedback."""
+ if feedback_in.incident:
+ incident = incident_service.get(
+ db_session=db_session,
+ incident_id=feedback_in.incident.id,
+ )
+ project = incident.project
+ case = None
+ participant = feedback_in.participant
+ else:
+ case = case_service.get(
+ db_session=db_session,
+ case_id=feedback_in.case.id,
+ )
+ project = case.project
+ incident = None
+ # Get the participant from the database if it's provided as a dict/model
+ participant = None
+ if feedback_in.participant:
+ from dispatch.participant.service import get as get_participant
+ participant = get_participant(
+ db_session=db_session,
+ participant_id=feedback_in.participant.id
+ )
+
+ # Create feedback with the actual ORM objects, not the Pydantic models
+ feedback = Feedback(
+ rating=feedback_in.rating,
+ feedback=feedback_in.feedback,
+ incident=incident,
+ case=case,
+ project=project,
+ participant=participant
+ )
+ db_session.add(feedback)
+ db_session.commit()
+ return feedback
+
+
+def update(*, db_session, feedback: Feedback, feedback_in: FeedbackUpdate) -> Feedback:
+ """Updates a piece of feedback."""
+ feedback_data = feedback.dict()
+ update_data = feedback_in.dict(exclude_unset=True)
+
+ for field in feedback_data:
+ if field in update_data:
+ setattr(feedback, field, update_data[field])
+
+ db_session.commit()
+ return feedback
+
+
+def delete(*, db_session, feedback_id: int):
+ """Deletes a piece of feedback."""
+ feedback = db_session.query(Feedback).filter(Feedback.id == feedback_id).one_or_none()
+ db_session.delete(feedback)
+ db_session.commit()
diff --git a/src/dispatch/feedback/incident/views.py b/src/dispatch/feedback/incident/views.py
new file mode 100644
index 000000000000..8338db5403e0
--- /dev/null
+++ b/src/dispatch/feedback/incident/views.py
@@ -0,0 +1,66 @@
+from fastapi import APIRouter, HTTPException, status
+
+from dispatch.database.core import DbSession
+from dispatch.database.service import search_filter_sort_paginate, CommonParameters
+from dispatch.models import PrimaryKey
+
+
+from .models import (
+ FeedbackCreate,
+ FeedbackPagination,
+ FeedbackRead,
+ FeedbackUpdate,
+)
+from .service import create, delete, get, update
+
+
+router = APIRouter()
+
+
+@router.get("", response_model=FeedbackPagination)
+def get_feedback_entries(commons: CommonParameters):
+ """Get all feedback entries, or only those matching a given search term."""
+ return search_filter_sort_paginate(model="Feedback", **commons)
+
+
+@router.get("/{feedback_id}", response_model=FeedbackRead)
+def get_feedback(db_session: DbSession, feedback_id: PrimaryKey):
+ """Get a feedback entry by its id."""
+ feedback = get(db_session=db_session, feedback_id=feedback_id)
+ if not feedback:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A feedback entry with this id does not exist."}],
+ )
+ return feedback
+
+
+@router.post("", response_model=FeedbackRead)
+def create_feedback(db_session: DbSession, feedback_in: FeedbackCreate):
+ """Create a new feedback entry."""
+ return create(db_session=db_session, feedback_in=feedback_in)
+
+
+@router.put("/{feedback_id}", response_model=FeedbackRead)
+def update_feedback(db_session: DbSession, feedback_id: PrimaryKey, feedback_in: FeedbackUpdate):
+ """Updates a feedback entry by its id."""
+ feedback = get(db_session=db_session, feedback_id=feedback_id)
+ if not feedback:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A feedback entry with this id does not exist."}],
+ )
+ feedback = update(db_session=db_session, feedback=feedback, feedback_in=feedback_in)
+ return feedback
+
+
+@router.delete("/{feedback_id}", response_model=None)
+def delete_feedback(db_session: DbSession, feedback_id: PrimaryKey):
+ """Delete a feedback entry, returning only an HTTP 200 OK if successful."""
+ feedback = get(db_session=db_session, feedback_id=feedback_id)
+ if not feedback:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A feedback entry with this id does not exist."}],
+ )
+ delete(db_session=db_session, feedback_id=feedback_id)
diff --git a/src/dispatch/feedback/service/__init__.py b/src/dispatch/feedback/service/__init__.py
new file mode 100644
index 000000000000..e69de29bb2d1
diff --git a/src/dispatch/feedback/service/enums.py b/src/dispatch/feedback/service/enums.py
new file mode 100644
index 000000000000..2da7b4e45157
--- /dev/null
+++ b/src/dispatch/feedback/service/enums.py
@@ -0,0 +1,10 @@
+from dispatch.enums import DispatchEnum
+
+
+class ServiceFeedbackRating(DispatchEnum):
+ no_effort = "No effort"
+ little_effort = "Little effort"
+ moderate_effort = "Moderate effort"
+ lots_of_effort = "Lots of effort"
+ very_high_effort = "Very high effort"
+ extreme_effort = "Extreme effort, everything I could give"
diff --git a/src/dispatch/feedback/service/messaging.py b/src/dispatch/feedback/service/messaging.py
new file mode 100644
index 000000000000..20e1d1481e8e
--- /dev/null
+++ b/src/dispatch/feedback/service/messaging.py
@@ -0,0 +1,96 @@
+import logging
+from datetime import datetime, timedelta
+
+from sqlalchemy.orm import Session
+
+from dispatch.individual.models import IndividualContact
+from dispatch.messaging.strings import (
+ ONCALL_SHIFT_FEEDBACK_NOTIFICATION,
+ ONCALL_SHIFT_FEEDBACK_NOTIFICATION_REMINDER,
+ MessageType,
+)
+from dispatch.plugin import service as plugin_service
+from dispatch.project.models import Project
+from .reminder.models import ServiceFeedbackReminder, ServiceFeedbackReminderUpdate
+from .reminder import service as reminder_service
+
+log = logging.getLogger(__name__)
+
+
+def send_oncall_shift_feedback_message(
+ *,
+ project: Project,
+ individual: IndividualContact,
+ service_id: str,
+ shift_end_at: str,
+ schedule_name: str,
+ reminder: ServiceFeedbackReminder | None = None,
+ details: list[dict] | None = None,
+ db_session: Session,
+):
+ """
+ Experimental: sends a direct message to the oncall about to end their shift
+ asking to provide feedback about their experience.
+ """
+ notification_text = "Oncall Shift Feedback Request"
+ notification_template = ONCALL_SHIFT_FEEDBACK_NOTIFICATION
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=project.id, plugin_type="conversation"
+ )
+ if not plugin:
+ log.warning(
+ "Oncall shift feedback request notification not sent. No conversation plugin enabled."
+ )
+ return
+
+ if reminder:
+ # update reminder with 23 hours from now
+ reminder = reminder_service.update(
+ db_session=db_session,
+ reminder=reminder,
+ reminder_in=ServiceFeedbackReminderUpdate(
+ id=reminder.id,
+ reminder_at=datetime.utcnow() + timedelta(hours=23),
+ ),
+ )
+ notification_template = ONCALL_SHIFT_FEEDBACK_NOTIFICATION_REMINDER
+ else:
+ # create reminder and pass to plugin
+ reminder = reminder_service.create(
+ db_session=db_session,
+ reminder_in=ServiceFeedbackReminder(
+ reminder_at=datetime.utcnow() + timedelta(hours=23),
+ individual_contact_id=individual.id,
+ project_id=project.id,
+ schedule_id=service_id,
+ schedule_name=schedule_name,
+ shift_end_at=shift_end_at,
+ details=[] if details is None else details,
+ ),
+ )
+
+ shift_end_clean = shift_end_at.replace("T", " ").replace("Z", "")
+ items = [
+ {
+ "individual_name": individual.name,
+ "oncall_schedule_id": service_id,
+ "oncall_service_name": schedule_name,
+ "organization_slug": project.organization.slug,
+ "project_id": project.id,
+ "shift_end_at": shift_end_clean,
+ "reminder_id": reminder.id,
+ "details": [] if details is None else details,
+ }
+ ]
+
+ try:
+ plugin.instance.send_direct(
+ individual.email,
+ notification_text,
+ notification_template,
+ MessageType.service_feedback,
+ items=items,
+ )
+ except Exception as e:
+ log.exception(e)
diff --git a/src/dispatch/feedback/service/models.py b/src/dispatch/feedback/service/models.py
new file mode 100644
index 000000000000..e4ef5d837309
--- /dev/null
+++ b/src/dispatch/feedback/service/models.py
@@ -0,0 +1,69 @@
+from datetime import datetime
+from pydantic import Field
+
+from sqlalchemy import Column, Integer, ForeignKey, DateTime, String, Numeric, JSON
+from sqlalchemy_utils import TSVectorType
+from sqlalchemy.orm import relationship
+
+from dispatch.database.core import Base
+from dispatch.individual.models import IndividualContactReadMinimal
+from dispatch.models import DispatchBase, TimeStampMixin, FeedbackMixin, PrimaryKey, Pagination
+from dispatch.project.models import ProjectRead
+
+from .enums import ServiceFeedbackRating
+
+
+class ServiceFeedback(TimeStampMixin, FeedbackMixin, Base):
+ # Columns
+ id = Column(Integer, primary_key=True)
+ schedule = Column(String)
+ hours = Column(Numeric(precision=10, scale=2))
+ shift_start_at = Column(DateTime)
+ shift_end_at = Column(DateTime)
+ details = Column(JSON)
+
+ # Relationships
+ individual_contact_id = Column(Integer, ForeignKey("individual_contact.id"))
+
+ project_id = Column(Integer, ForeignKey("project.id"))
+ project = relationship("Project")
+
+ search_vector = Column(
+ TSVectorType(
+ "feedback",
+ "rating",
+ regconfig="pg_catalog.simple",
+ )
+ )
+
+
+# Pydantic models
+class ServiceFeedbackBase(DispatchBase):
+ feedback: str | None = None
+ hours: float | None = None
+ individual: IndividualContactReadMinimal | None = None
+ rating: ServiceFeedbackRating = ServiceFeedbackRating.little_effort
+ schedule: str | None = None
+ shift_end_at: datetime | None = None
+ shift_start_at: datetime | None = None
+ project: ProjectRead | None = None
+ created_at: datetime | None = None
+ details: list[dict] | None = Field([], nullable=True)
+
+
+class ServiceFeedbackCreate(ServiceFeedbackBase):
+ pass
+
+
+class ServiceFeedbackUpdate(ServiceFeedbackBase):
+ id: PrimaryKey | None = None
+
+
+class ServiceFeedbackRead(ServiceFeedbackBase):
+ id: PrimaryKey
+ project: ProjectRead | None = None
+
+
+class ServiceFeedbackPagination(Pagination):
+ items: list[ServiceFeedbackRead]
+ total: int
diff --git a/src/dispatch/feedback/service/reminder/__init__.py b/src/dispatch/feedback/service/reminder/__init__.py
new file mode 100644
index 000000000000..e69de29bb2d1
diff --git a/src/dispatch/feedback/service/reminder/models.py b/src/dispatch/feedback/service/reminder/models.py
new file mode 100644
index 000000000000..46f3c2c2637d
--- /dev/null
+++ b/src/dispatch/feedback/service/reminder/models.py
@@ -0,0 +1,47 @@
+from datetime import datetime
+from pydantic import Field
+
+from sqlalchemy import Column, Integer, ForeignKey, DateTime, String, JSON
+
+from dispatch.database.core import Base
+from dispatch.individual.models import IndividualContactRead
+from dispatch.models import DispatchBase, TimeStampMixin, PrimaryKey
+from dispatch.project.models import ProjectRead
+
+
+class ServiceFeedbackReminder(TimeStampMixin, Base):
+ # Columns
+ id = Column(Integer, primary_key=True)
+ reminder_at = Column(DateTime)
+ schedule_id = Column(String)
+ schedule_name = Column(String)
+ shift_end_at = Column(DateTime)
+ details = Column(JSON)
+
+ # Relationships
+ individual_contact_id = Column(Integer, ForeignKey("individual_contact.id"))
+ project_id = Column(Integer, ForeignKey("project.id"))
+
+
+# Pydantic models
+class ServiceFeedbackReminderBase(DispatchBase):
+ reminder_at: datetime | None = None
+ individual: IndividualContactRead | None = None
+ project: ProjectRead | None = None
+ schedule_id: str | None = None
+ schedule_name: str | None = None
+ shift_end_at: datetime | None = None
+ details: list[dict] | None = Field([], nullable=True)
+
+
+class ServiceFeedbackReminderCreate(ServiceFeedbackReminderBase):
+ pass
+
+
+class ServiceFeedbackReminderUpdate(ServiceFeedbackReminderBase):
+ id: PrimaryKey | None = None
+ reminder_at: datetime | None = None
+
+
+class ServiceFeedbackReminderRead(ServiceFeedbackReminderBase):
+ id: PrimaryKey
diff --git a/src/dispatch/feedback/service/reminder/service.py b/src/dispatch/feedback/service/reminder/service.py
new file mode 100644
index 000000000000..058278b3ac7f
--- /dev/null
+++ b/src/dispatch/feedback/service/reminder/service.py
@@ -0,0 +1,58 @@
+from datetime import datetime, timedelta
+
+from .models import (
+ ServiceFeedbackReminder,
+ ServiceFeedbackReminderUpdate,
+)
+from dispatch.individual.models import IndividualContact
+from dispatch.project.models import Project
+
+
+def get_all_expired_reminders_by_project_id(
+ *, db_session, project_id: int
+) -> list[ServiceFeedbackReminder]:
+ """Returns all expired reminders by project id."""
+ return (
+ db_session.query(ServiceFeedbackReminder)
+ .join(IndividualContact)
+ .join(Project)
+ .filter(Project.id == project_id)
+ .filter(datetime.utcnow() >= ServiceFeedbackReminder.reminder_at - timedelta(minutes=1))
+ .all()
+ )
+
+
+def create(*, db_session, reminder_in: ServiceFeedbackReminder) -> ServiceFeedbackReminder:
+ """Creates a new service feedback reminder."""
+ reminder = ServiceFeedbackReminder(**reminder_in.dict())
+
+ db_session.add(reminder)
+ db_session.commit()
+ return reminder
+
+
+def update(
+ *, db_session, reminder: ServiceFeedbackReminder, reminder_in: ServiceFeedbackReminderUpdate
+) -> ServiceFeedbackReminder:
+ """Updates a service feedback reminder."""
+ reminder_data = reminder.dict()
+ update_data = reminder_in.dict(exclude_unset=True)
+
+ for field in reminder_data:
+ if field in update_data:
+ setattr(reminder, field, update_data[field])
+
+ db_session.commit()
+ return reminder
+
+
+def delete(*, db_session, reminder_id: int):
+ """Deletes a service feedback reminder."""
+ reminder = (
+ db_session.query(ServiceFeedbackReminder)
+ .filter(ServiceFeedbackReminder.id == reminder_id)
+ .one_or_none()
+ )
+ if reminder:
+ db_session.delete(reminder)
+ db_session.commit()
diff --git a/src/dispatch/feedback/service/scheduled.py b/src/dispatch/feedback/service/scheduled.py
new file mode 100644
index 000000000000..f4df776bffec
--- /dev/null
+++ b/src/dispatch/feedback/service/scheduled.py
@@ -0,0 +1,203 @@
+from schedule import every
+import logging
+from datetime import datetime
+
+from sqlalchemy.orm import Session
+
+from dispatch.decorators import scheduled_project_task, timer
+from dispatch.individual import service as individual_service
+from dispatch.plugin import service as plugin_service
+from dispatch.plugin.models import PluginInstance
+from dispatch.project.models import Project
+from dispatch.scheduler import scheduler
+from dispatch.service import service as service_service
+from dispatch.incident import service as incident_service
+from dispatch.case import service as case_service
+
+from .messaging import send_oncall_shift_feedback_message
+from .reminder import service as reminder_service
+
+log = logging.getLogger(__name__)
+
+"""
+ Experimental: wakes up and compares the oncall schedule for the previous day
+ vs. the current day; if a different person is oncall, the previous day's
+ oncall receives a shift feedback form.
+
+ Timing is determined by the shift_hours_type attribute of the service:
+ - For 12-hour shifts: Teams trade off between UCAN and EMEA
+ - EMEA: wake at 6am UTC == 8am UTC+2 Standard / 9am UTC+2 Daylight Saving
+ - UCAN: wake at 4pm UTC == 8am PST / 9am PDT
+ - For 24-hour shifts: All UCAN, no EMEA
+ - UCAN only: wake at 4pm UTC == 8am PST / 9am PDT
+"""
+
+
+@scheduler.add(every(1).day.at("16:00"), name="oncall-shift-feedback-ucan")
+@timer
+@scheduled_project_task
+def oncall_shift_feedback_ucan(db_session: Session, project: Project):
+ # Process both 12-hour shifts (UCAN handoff) and 24-hour shifts (UCAN only) at 4pm UTC
+ # First process 12-hour shifts
+ oncall_shift_feedback(db_session=db_session, project=project, hour=6, shift_hours=12)
+ # Then process 24-hour shifts
+ oncall_shift_feedback(db_session=db_session, project=project, hour=16, shift_hours=24)
+ find_expired_reminders_and_send(db_session=db_session, project=project)
+
+
+@scheduler.add(every(1).day.at("06:00"), name="oncall-shift-feedback-emea")
+@timer
+@scheduled_project_task
+def oncall_shift_feedback_emea(db_session: Session, project: Project):
+ # Process 12-hour shifts at 6am UTC (EMEA handoff)
+ oncall_shift_feedback(db_session=db_session, project=project, hour=16, shift_hours=12)
+ find_expired_reminders_and_send(db_session=db_session, project=project)
+
+
+def find_expired_reminders_and_send(*, db_session: Session, project: Project):
+ reminders = reminder_service.get_all_expired_reminders_by_project_id(
+ db_session=db_session, project_id=project.id
+ )
+ for reminder in reminders:
+ individual = individual_service.get(
+ db_session=db_session, individual_contact_id=reminder.individual_contact_id
+ )
+ send_oncall_shift_feedback_message(
+ project=project,
+ individual=individual,
+ service_id=reminder.schedule_id,
+ shift_end_at=str(reminder.shift_end_at),
+ schedule_name=reminder.schedule_name,
+ reminder=reminder,
+ db_session=db_session,
+ details=reminder.details,
+ )
+
+
+def find_schedule_and_send(
+ *,
+ db_session: Session,
+ project: Project,
+ oncall_plugin: PluginInstance,
+ schedule_id: str,
+ service_id: str,
+ hour: int,
+):
+ """
+ Given a PagerDuty schedule_id, determines whether the shift ended for the previous oncall
+ person and sends the health metrics feedback request. Note that if the current and previous
+ oncall are the same person, did_oncall_just_go_off_shift returns None.
+ """
+
+ # Counts the number of participants with one or more active roles
+ def count_active_participants(participants):
+ return sum(1 for participant in participants if len(participant.active_roles) >= 1)
+
+ current_oncall = oncall_plugin.instance.did_oncall_just_go_off_shift(schedule_id, hour)
+
+ if current_oncall is None:
+ return
+
+ individual = individual_service.get_by_email_and_project(
+ db_session=db_session, email=current_oncall["email"], project_id=project.id
+ )
+
+ # Calculate the number of hours in the shift
+ if current_oncall["shift_start"]:
+ shift_start_raw = current_oncall["shift_start"]
+ shift_start_at = (
+ datetime.strptime(shift_start_raw, "%Y-%m-%dT%H:%M:%SZ")
+ if "T" in shift_start_raw
+ else datetime.strptime(shift_start_raw, "%Y-%m-%d %H:%M:%S")
+ )
+ shift_end_raw = current_oncall["shift_end"]
+ shift_end_at = (
+ datetime.strptime(shift_end_raw, "%Y-%m-%dT%H:%M:%SZ")
+ if "T" in shift_end_raw
+ else datetime.strptime(shift_end_raw, "%Y-%m-%d %H:%M:%S")
+ )
+ hours_in_shift = (shift_end_at - shift_start_at).total_seconds() / 3600
+ else:
+ hours_in_shift = 7 * 24 # default to 7 days
+
+ num_incidents = 0
+ num_cases = 0
+ num_participants = 0
+ incidents = incident_service.get_all_last_x_hours(db_session=db_session, hours=hours_in_shift)
+ for incident in incidents:
+ if incident.commander.individual.email == current_oncall["email"]:
+ num_participants += count_active_participants(incident.participants)
+ num_incidents += 1
+
+ cases = case_service.get_all_last_x_hours(db_session=db_session, hours=hours_in_shift)
+ for case in cases:
+ if case.has_channel and case.assignee.individual.email == current_oncall["email"]:
+ num_participants += count_active_participants(case.participants)
+ num_cases += 1
+ details = [
+ {
+ "num_incidents": num_incidents,
+ "num_cases": num_cases,
+ "num_participants": num_participants,
+ }
+ ]
+
+ send_oncall_shift_feedback_message(
+ project=project,
+ individual=individual,
+ service_id=service_id,
+ shift_end_at=current_oncall["shift_end"],
+ schedule_name=current_oncall["schedule_name"],
+ details=details,
+ db_session=db_session,
+ )
+
+
+def oncall_shift_feedback(
+ db_session: Session, project: Project, hour: int, shift_hours: int | None = None
+):
+ """
+ Experimental: collects feedback from individuals participating in an oncall service that has health metrics enabled
+ when their oncall shift ends. For now, only for one project and schedule.
+
+ Args:
+ db_session: Database session
+ project: Project to process
+ hour: Hour of the day to check for shift changes
+ shift_hours: If provided, only process services with this shift_hours_type (12 or 24)
+ """
+ oncall_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=project.id, plugin_type="oncall"
+ )
+
+ if not oncall_plugin:
+ log.warning(
+ f"Skipping collection of oncall shift feedback for project {project.name}. No oncall plugin enabled."
+ )
+ return
+
+ # Get all oncall services marked for health metrics for this project
+ oncall_services = service_service.get_all_by_health_metrics(
+ db_session=db_session,
+ service_type=oncall_plugin.instance.slug,
+ health_metrics=True,
+ project_id=project.id,
+ )
+
+ for oncall_service in oncall_services:
+ # Skip services that don't match the requested shift_hours_type
+ if shift_hours and oncall_service.shift_hours_type != shift_hours:
+ continue
+
+ # for each service, get the schedule_id
+ external_id = oncall_service.external_id
+ schedule_id = oncall_plugin.instance.get_schedule_id_from_service_id(service_id=external_id)
+ if schedule_id:
+ find_schedule_and_send(
+ db_session=db_session,
+ project=project,
+ oncall_plugin=oncall_plugin,
+ schedule_id=schedule_id,
+ service_id=external_id,
+ hour=hour,
+ )
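`find_schedule_and_send` above accepts two timestamp shapes from the oncall plugin when computing shift length. That parsing can be sketched standalone (format strings copied from the code above; the helper name `parse_shift_timestamp` is illustrative):

```python
from datetime import datetime

def parse_shift_timestamp(raw: str) -> datetime:
    # Accepts both ISO-style timestamps ("2024-01-01T08:00:00Z") and
    # space-separated ones ("2024-01-01 08:00:00"), matching the
    # branch on "T" in the function above.
    fmt = "%Y-%m-%dT%H:%M:%SZ" if "T" in raw else "%Y-%m-%d %H:%M:%S"
    return datetime.strptime(raw, fmt)

start = parse_shift_timestamp("2024-01-01T08:00:00Z")
end = parse_shift_timestamp("2024-01-01 20:00:00")
hours_in_shift = (end - start).total_seconds() / 3600
print(hours_in_shift)  # 12.0
```

This is why the scheduled task can consume shift boundaries from either the plugin's API response or a stringified reminder column without reformatting them first.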
diff --git a/src/dispatch/feedback/service/service.py b/src/dispatch/feedback/service/service.py
new file mode 100644
index 000000000000..75b0c7f2794d
--- /dev/null
+++ b/src/dispatch/feedback/service/service.py
@@ -0,0 +1,66 @@
+
+from sqlalchemy.orm import Session
+
+from .models import ServiceFeedback, ServiceFeedbackCreate, ServiceFeedbackUpdate
+
+
+def get(*, service_feedback_id: int, db_session: Session) -> ServiceFeedback | None:
+ """Gets a piece of service feedback by its id."""
+ return (
+ db_session.query(ServiceFeedback)
+ .filter(ServiceFeedback.id == service_feedback_id)
+ .one_or_none()
+ )
+
+
+def get_all(*, db_session: Session):
+ """Gets all pieces of service feedback."""
+ return db_session.query(ServiceFeedback)
+
+
+def create(*, service_feedback_in: ServiceFeedbackCreate, db_session: Session) -> ServiceFeedback:
+ """Creates a new piece of service feedback."""
+
+ individual_contact_id = (
+ None if not service_feedback_in.individual else service_feedback_in.individual.id
+ )
+
+ project_id = None if not service_feedback_in.project else service_feedback_in.project.id
+
+ service_feedback = ServiceFeedback(
+ **service_feedback_in.dict(exclude={"individual", "project"}),
+ individual_contact_id=individual_contact_id,
+ project_id=project_id,
+ )
+ db_session.add(service_feedback)
+ db_session.commit()
+ return service_feedback
+
+
+def update(
+ *,
+ service_feedback: ServiceFeedback,
+ service_feedback_in: ServiceFeedbackUpdate,
+ db_session: Session,
+) -> ServiceFeedback:
+ """Updates a piece of service feedback."""
+ service_feedback_data = service_feedback.dict()
+ update_data = service_feedback_in.dict(exclude_unset=True)
+
+ for field in service_feedback_data:
+ if field in update_data:
+ setattr(service_feedback, field, update_data[field])
+
+ db_session.commit()
+ return service_feedback
+
+
+def delete(*, db_session: Session, service_feedback_id: int):
+ """Deletes a piece of service feedback."""
+ service_feedback = (
+ db_session.query(ServiceFeedback)
+ .filter(ServiceFeedback.id == service_feedback_id)
+ .one_or_none()
+ )
+ db_session.delete(service_feedback)
+ db_session.commit()
diff --git a/src/dispatch/feedback/service/views.py b/src/dispatch/feedback/service/views.py
new file mode 100644
index 000000000000..2359f0c0228c
--- /dev/null
+++ b/src/dispatch/feedback/service/views.py
@@ -0,0 +1,58 @@
+from fastapi import APIRouter, HTTPException, status, Depends
+
+from dispatch.auth.permissions import (
+ FeedbackDeletePermission,
+ PermissionsDependency,
+ SensitiveProjectActionPermission,
+)
+from dispatch.database.core import DbSession
+from dispatch.database.service import search_filter_sort_paginate, CommonParameters
+from dispatch.models import PrimaryKey
+
+from .models import ServiceFeedbackRead, ServiceFeedbackPagination
+from .service import get, delete
+
+
+router = APIRouter()
+
+
+@router.get(
+ "",
+ response_model=ServiceFeedbackPagination,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def get_feedback_entries(commons: CommonParameters):
+ """Get all feedback entries, or only those matching a given search term."""
+ return search_filter_sort_paginate(model="ServiceFeedback", **commons)
+
+
+@router.get(
+ "/{service_feedback_id}",
+ response_model=ServiceFeedbackRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def get_feedback(db_session: DbSession, service_feedback_id: PrimaryKey):
+ """Get a feedback entry by its id."""
+ feedback = get(db_session=db_session, service_feedback_id=service_feedback_id)
+ if not feedback:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A feedback entry with this id does not exist."}],
+ )
+ return feedback
+
+
+@router.delete(
+ "/{service_feedback_id}/{individual_contact_id}",
+ response_model=None,
+ dependencies=[Depends(PermissionsDependency([FeedbackDeletePermission]))],
+)
+def delete_feedback(db_session: DbSession, service_feedback_id: PrimaryKey):
+ """Delete a feedback entry, returning only an HTTP 200 OK if successful."""
+ feedback = get(db_session=db_session, service_feedback_id=service_feedback_id)
+ if not feedback:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A feedback entry with this id does not exist."}],
+ )
+ delete(db_session=db_session, service_feedback_id=service_feedback_id)
diff --git a/src/dispatch/forms/enums.py b/src/dispatch/forms/enums.py
new file mode 100644
index 000000000000..0a8e654a307f
--- /dev/null
+++ b/src/dispatch/forms/enums.py
@@ -0,0 +1,13 @@
+from dispatch.enums import DispatchEnum
+
+
+class FormStatus(DispatchEnum):
+ new = "New"
+ draft = "Draft"
+ complete = "Complete"
+
+
+class FormAttorneyStatus(DispatchEnum):
+ not_reviewed = "Not reviewed"
+ reviewed_no_action = "Reviewed: no action required"
+ reviewed_action_required = "Reviewed: follow up required"
diff --git a/src/dispatch/forms/models.py b/src/dispatch/forms/models.py
new file mode 100644
index 000000000000..7454b28a3205
--- /dev/null
+++ b/src/dispatch/forms/models.py
@@ -0,0 +1,69 @@
+from datetime import datetime
+from sqlalchemy import Column, Integer, ForeignKey, String
+from sqlalchemy.orm import relationship
+
+from dispatch.database.core import Base
+from dispatch.individual.models import IndividualContactReadMinimal
+from dispatch.models import DispatchBase, TimeStampMixin, PrimaryKey, Pagination, ProjectMixin
+from dispatch.project.models import ProjectRead
+from dispatch.incident.models import IncidentReadBasic
+from dispatch.forms.type.models import FormsTypeRead
+from .enums import FormStatus, FormAttorneyStatus
+
+
+class Forms(TimeStampMixin, ProjectMixin, Base):
+ # Columns
+ id = Column(Integer, primary_key=True)
+ form_data = Column(String, nullable=True)
+ status = Column(String, default=FormStatus.new, nullable=True)
+ attorney_status = Column(String, default=FormAttorneyStatus.not_reviewed, nullable=True)
+ attorney_questions = Column(String, nullable=True)
+ attorney_analysis = Column(String, nullable=True)
+ attorney_form_data = Column(String, nullable=True)
+
+ # Relationships
+ creator_id = Column(Integer, ForeignKey("individual_contact.id"))
+ creator = relationship("IndividualContact")
+
+ incident_id = Column(Integer, ForeignKey("incident.id"))
+ incident = relationship("Incident")
+
+ form_type_id = Column(Integer, ForeignKey("forms_type.id"))
+ form_type = relationship("FormsType")
+
+ score = Column(Integer, nullable=True)
+
+
+# Pydantic models
+class FormsBase(DispatchBase):
+ form_type: FormsTypeRead | None = None
+ creator: IndividualContactReadMinimal | None = None
+ form_data: str | None = None
+ attorney_form_data: str | None = None
+ status: str | None = None
+ attorney_status: str | None = None
+ project: ProjectRead | None = None
+ incident: IncidentReadBasic | None = None
+ attorney_questions: str | None = None
+ attorney_analysis: str | None = None
+ score: int | None = None
+
+
+class FormsCreate(FormsBase):
+ pass
+
+
+class FormsUpdate(FormsBase):
+ id: PrimaryKey | None = None
+
+
+class FormsRead(FormsBase):
+ id: PrimaryKey
+ project: ProjectRead | None = None
+ created_at: datetime | None = None
+ updated_at: datetime | None = None
+
+
+class FormsPagination(Pagination):
+ items: list[FormsRead]
+ total: int
diff --git a/src/dispatch/forms/scoring.py b/src/dispatch/forms/scoring.py
new file mode 100644
index 000000000000..4305024898bf
--- /dev/null
+++ b/src/dispatch/forms/scoring.py
@@ -0,0 +1,33 @@
+import json
+
+
+def calculate_score(form_data: str, scoring_schema: str) -> int:
+ """Calculates the score of a form.
+
+ Args:
+ form_data: A string containing the JSON of the form data with key-value pairs.
+ scoring_schema: A string containing the JSON of the scoring schema. The schema should
+ be formatted as a list of dictionaries, where each dictionary contains the following keys:
+ var: The key of the form data to score.
+ includes: A list of values that the form data should include to be scored.
+ score: The score to add if the form data meets the criteria.
+
+ Returns:
+ int: The total score of the form.
+ """
+ score = 0
+
+ if not form_data or not scoring_schema:
+ return score
+
+ form_vals = json.loads(form_data)
+ scoring = json.loads(scoring_schema)
+
+ for s in scoring:
+ # get the value of the form data based on the key
+ if (var := form_vals.get(s.get("var"))) and (includes := s.get("includes")):
+ # if any element in the form data is in the includes list, add the score
+ if any(v in includes for v in var):
+ score += s.get("score", 0)
+
+ return score
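The scoring routine above can be exercised in isolation. A minimal sketch, reproducing the function with illustrative schema and form data (the rule keys `var`, `includes`, and `score` are as documented in the docstring; the field names are made up):

```python
import json

def calculate_score(form_data: str, scoring_schema: str) -> int:
    # Mirror of the routine above: each rule adds its score when any
    # selected value for `var` appears in the rule's `includes` list.
    score = 0
    if not form_data or not scoring_schema:
        return score
    form_vals = json.loads(form_data)
    scoring = json.loads(scoring_schema)
    for s in scoring:
        if (var := form_vals.get(s.get("var"))) and (includes := s.get("includes")):
            if any(v in includes for v in var):
                score += s.get("score", 0)
    return score

form_data = json.dumps({"impact": ["data-loss"], "teams": ["security"]})
schema = json.dumps([
    {"var": "impact", "includes": ["data-loss", "outage"], "score": 5},
    {"var": "teams", "includes": ["legal"], "score": 3},
])
print(calculate_score(form_data, schema))  # 5: only the first rule matches
```

Note that the rule values are expected to be lists of selections; a rule contributes its score at most once no matter how many of its values match.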
diff --git a/src/dispatch/forms/service.py b/src/dispatch/forms/service.py
new file mode 100644
index 000000000000..a71f68cd6266
--- /dev/null
+++ b/src/dispatch/forms/service.py
@@ -0,0 +1,204 @@
+import logging
+import json
+from datetime import datetime
+
+from sqlalchemy.orm import Session
+from dispatch.database.core import resolve_attr
+
+from .models import Forms, FormsUpdate
+from .scoring import calculate_score
+from dispatch.document import service as document_service
+from dispatch.individual import service as individual_service
+from dispatch.forms.type import service as form_type_service
+from dispatch.plugin import service as plugin_service
+from dispatch.project import service as project_service
+
+log = logging.getLogger(__name__)
+
+
+def get(*, forms_id: int, db_session: Session) -> Forms | None:
+ """Gets a from by its id."""
+ return db_session.query(Forms).filter(Forms.id == forms_id).one_or_none()
+
+
+def get_all(*, db_session: Session):
+ """Gets all forms."""
+ return db_session.query(Forms)
+
+
+def create(*, forms_in: dict, db_session: Session, creator) -> Forms:
+ """Creates form data."""
+
+ individual = individual_service.get_by_email_and_project(
+ db_session=db_session, email=creator.email, project_id=forms_in["project_id"]
+ )
+
+ form = Forms(
+ **forms_in,
+ creator_id=individual.id,
+ )
+
+ if forms_in.get("form_type_id"):
+ form_type = form_type_service.get(
+ db_session=db_session, forms_type_id=forms_in["form_type_id"]
+ )
+ form.score = calculate_score(forms_in.get("form_data"), form_type.scoring_schema)
+
+ db_session.add(form)
+ db_session.commit()
+ return form
+
+
+def update(
+ *,
+ forms: Forms,
+ forms_in: FormsUpdate,
+ db_session: Session,
+) -> Forms:
+ """Updates a form."""
+ form_data = forms.dict()
+ update_data = forms_in.dict(exclude_unset=True)
+
+ for field in form_data:
+ if field in update_data:
+ setattr(forms, field, update_data[field])
+
+ forms.score = calculate_score(forms_in.form_data, forms.form_type.scoring_schema)
+
+ db_session.commit()
+ return forms
+
+
+def delete(*, db_session, forms_id: int):
+ """Deletes a form."""
+ form = db_session.query(Forms).filter(Forms.id == forms_id).one_or_none()
+ db_session.delete(form)
+ db_session.commit()
+
+
+def build_form_doc(form_schema: str, form_data: str) -> str:
+ """Builds the read-only answers given the questions in form_schema and the answers in form_data."""
+ schema = json.loads(form_schema)
+ data = json.loads(form_data)
+ output_qa = []
+
+ for item in schema:
+ name = item["name"]
+ question = item["title"]
+ # find the key in form_data corresponding to this name
+ answer = data.get(name)
+ if answer and isinstance(answer, list) and len(answer) == 0:
+ answer = ""
+ # add the question and answer to the output_qa list
+ if answer:
+ output_qa.append(f"{question}: {answer}")
+
+ return "\n".join(output_qa)
+
+
+def export(*, db_session: Session, ids: list[int]) -> list[str]:
+ """Exports forms."""
+ folders = []
+ # get all the forms given the ids
+ forms = db_session.query(Forms).filter(Forms.id.in_(ids)).all()
+ # from the forms, get all unique project ids
+ project_ids = list({form.project_id for form in forms})
+
+ for project_id in project_ids:
+ # ensure there is a document plugin active
+ document_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=project_id, plugin_type="document"
+ )
+ if not document_plugin:
+ log.warning(
+ f"Forms for project id ${project_id} not exported. No document plugin enabled."
+ )
+ continue
+
+ storage_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=project_id, plugin_type="storage"
+ )
+ if not storage_plugin:
+ log.warning(
+ f"Forms for project id ${project_id} not exported. No storage plugin enabled."
+ )
+ continue
+
+ # create a storage folder for the forms in the root project folder
+ external_storage_root_id = storage_plugin.configuration.root_id
+
+ if not external_storage_root_id:
+ log.warning(
+ f"Forms for project id ${project_id} not exported. No external storage root id configured."
+ )
+ continue
+
+ project = project_service.get(db_session=db_session, project_id=project_id)
+ if not project:
+ log.warning(f"Forms for project id ${project_id} not exported. Project not found.")
+ continue
+
+ form_export_template = document_service.get_project_forms_export_template(
+ db_session=db_session, project_id=project_id
+ )
+ if not form_export_template:
+ log.warning(
+ f"Forms for project id ${project_id} not exported. No form export template document configured."
+ )
+ continue
+
+ # create a folder name that includes the date and time
+ folder_name = f"Exported forms {datetime.now().strftime('%Y-%m-%d %H:%M:%S')}"
+ folder = storage_plugin.instance.create_file(
+ parent_id=external_storage_root_id, name=folder_name
+ )
+ folders.append(folder["weblink"])
+
+ # get the subset of forms that have this project id
+ project_forms = [form for form in forms if form.project_id == project_id]
+
+ # for each form, create a document from the template and update it with the form data
+ for form in project_forms:
+ export_document_name = f"{form.incident.name}-{form.form_type.name}-{form.id}"
+ export_document = storage_plugin.instance.copy_file(
+ folder_id=folder["id"],
+ file_id=form_export_template.resource_id,
+ name=export_document_name,
+ )
+ storage_plugin.instance.move_file(
+ new_folder_id=folder["id"], file_id=export_document["id"]
+ )
+ document_kwargs = {
+ "commander_fullname": form.incident.commander.individual.name,
+ "conference_challenge": resolve_attr(form.incident, "conference.challenge"),
+ "conference_weblink": resolve_attr(form.incident, "conference.weblink"),
+ "conversation_weblink": resolve_attr(form.incident, "conversation.weblink"),
+ "description": form.incident.description,
+ "document_weblink": resolve_attr(form.incident, "incident_document.weblink"),
+ "name": form.incident.name,
+ "priority": form.incident.incident_priority.name,
+ "reported_at": form.incident.reported_at.strftime("%m/%d/%Y %H:%M:%S"),
+ "closed_at": (
+ form.incident.closed_at.strftime("%m/%d/%Y %H:%M:%S")
+ if form.incident.closed_at
+ else ""
+ ),
+ "resolution": form.incident.resolution,
+ "severity": form.incident.incident_severity.name,
+ "status": form.incident.status,
+ "storage_weblink": resolve_attr(form.incident, "storage.weblink"),
+ "ticket_weblink": resolve_attr(form.incident, "ticket.weblink"),
+ "title": form.incident.title,
+ "type": form.incident.incident_type.name,
+ "summary": form.incident.summary,
+ "form_status": form.status,
+ "form_type": form.form_type.name,
+ "form_data": build_form_doc(form.form_type.form_schema, form.form_data),
+ "attorney_form_data": form.attorney_form_data,
+ "attorney_status": form.attorney_status,
+ "attorney_questions": form.attorney_questions,
+ "attorney_analysis": form.attorney_analysis,
+ }
+ document_plugin.instance.update(export_document["id"], **document_kwargs)
+
+ return folders
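The export code above leans on `resolve_attr` to pull optional dotted attributes (e.g. `conference.weblink`) off the incident without raising when a link in the chain is missing. A minimal stand-in for illustration, with hypothetical `Incident`/`Conference` classes (the real helper lives in `dispatch.database.core`):

```python
from functools import reduce


def resolve_attr(obj, attr, default=None):
    """Resolve a dotted attribute path, returning `default` if any
    attribute in the chain is missing."""
    try:
        return reduce(getattr, attr.split("."), obj)
    except AttributeError:
        return default


# Hypothetical objects standing in for the SQLAlchemy models:
class Conference:
    weblink = "https://meet.example.com/abc"


class Incident:
    conference = Conference()


incident = Incident()
print(resolve_attr(incident, "conference.weblink"))   # https://meet.example.com/abc
print(resolve_attr(incident, "conference.challenge"))  # None
```

This is why fields such as `conference_challenge` render as empty rather than crashing the export when an incident never had that resource provisioned.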
diff --git a/src/dispatch/forms/type/models.py b/src/dispatch/forms/type/models.py
new file mode 100644
index 000000000000..8dd9dba98088
--- /dev/null
+++ b/src/dispatch/forms/type/models.py
@@ -0,0 +1,66 @@
+from datetime import datetime
+from sqlalchemy import Boolean, Column, Integer, ForeignKey, String
+from sqlalchemy.sql.schema import UniqueConstraint
+from sqlalchemy.orm import relationship
+
+from dispatch.database.core import Base
+from dispatch.individual.models import IndividualContactReadMinimal
+from dispatch.models import (
+ DispatchBase,
+ NameStr,
+ Pagination,
+ PrimaryKey,
+ ProjectMixin,
+ TimeStampMixin,
+)
+from dispatch.project.models import ProjectRead
+from dispatch.service.models import ServiceRead
+
+
+class FormsType(ProjectMixin, TimeStampMixin, Base):
+ __table_args__ = (UniqueConstraint("name", "project_id"),)
+ id = Column(Integer, primary_key=True)
+ name = Column(String)
+ description = Column(String, nullable=True)
+ enabled = Column(Boolean, default=True)
+ form_schema = Column(String, nullable=True)
+ attorney_form_schema = Column(String, nullable=True)
+ scoring_schema = Column(String, nullable=True)
+
+ # Relationships
+ creator_id = Column(Integer, ForeignKey("individual_contact.id"))
+ creator = relationship("IndividualContact")
+
+ service_id = Column(Integer, ForeignKey("service.id"))
+ service = relationship("Service")
+
+
+# Pydantic models
+class FormsTypeBase(DispatchBase):
+ name: NameStr
+ description: str | None = None
+ enabled: bool | None = None
+ form_schema: str | None = None
+ attorney_form_schema: str | None = None
+ scoring_schema: str | None = None
+ creator: IndividualContactReadMinimal | None = None
+ project: ProjectRead | None = None
+ service: ServiceRead | None = None
+
+
+class FormsTypeCreate(FormsTypeBase):
+ pass
+
+
+class FormsTypeUpdate(FormsTypeBase):
+ id: PrimaryKey | None = None
+
+
+class FormsTypeRead(FormsTypeBase):
+ id: PrimaryKey
+ created_at: datetime | None = None
+ updated_at: datetime | None = None
+
+
+class FormsTypePagination(Pagination):
+ items: list[FormsTypeRead] = []
diff --git a/src/dispatch/forms/type/service.py b/src/dispatch/forms/type/service.py
new file mode 100644
index 000000000000..c84bb15ebd3e
--- /dev/null
+++ b/src/dispatch/forms/type/service.py
@@ -0,0 +1,101 @@
+import logging
+
+from sqlalchemy.orm import Session
+
+from .models import FormsType, FormsTypeCreate, FormsTypeUpdate
+from dispatch.individual import service as individual_service
+from dispatch.project import service as project_service
+from dispatch.service import service as service_service
+from dispatch.plugin import service as plugin_service
+from dispatch.forms.models import Forms
+from dispatch.service.models import Service
+from dispatch.incident.messaging import send_completed_form_email
+
+log = logging.getLogger(__name__)
+
+
+def get(*, forms_type_id: int, db_session: Session) -> FormsType | None:
+ """Gets a form type by its id."""
+ return db_session.query(FormsType).filter(FormsType.id == forms_type_id).one_or_none()
+
+
+def get_all(*, db_session: Session):
+ """Gets all form types."""
+ return db_session.query(FormsType)
+
+
+def create(*, db_session: Session, forms_type_in: FormsTypeCreate, creator) -> FormsType:
+ """Creates form type."""
+
+ project = project_service.get_by_name_or_raise(
+ db_session=db_session, project_in=forms_type_in.project
+ )
+
+ individual = individual_service.get_by_email_and_project(
+ db_session=db_session, email=creator.email, project_id=project.id
+ )
+
+ service_id = None
+ if forms_type_in.service:
+ service = service_service.get(db_session=db_session, service_id=forms_type_in.service.id)
+ if service:
+ service_id = service.id
+
+ form_type = FormsType(
+ **forms_type_in.dict(exclude={"creator", "project", "service"}),
+ creator_id=individual.id,
+ project_id=project.id,
+ service_id=service_id,
+ )
+ db_session.add(form_type)
+ db_session.commit()
+ return form_type
+
+
+def update(
+ *,
+ forms_type: FormsType,
+ forms_type_in: FormsTypeUpdate,
+ db_session: Session,
+) -> FormsType:
+ """Updates a form type."""
+ form_data = forms_type.dict()
+ update_data = forms_type_in.dict(exclude_unset=True)
+
+ for field in form_data:
+ if field in update_data:
+ setattr(forms_type, field, update_data[field])
+
+ service = forms_type_in.service
+ if service:
+ forms_type.service_id = service.id
+ else:
+ forms_type.service_id = None
+
+ db_session.commit()
+ return forms_type
+
+
+def delete(*, db_session, forms_type_id: int):
+ """Deletes a form type."""
+ form = db_session.query(FormsType).filter(FormsType.id == forms_type_id).one_or_none()
+ db_session.delete(form)
+ db_session.commit()
+
+
+def send_email_to_service(
+ *,
+ form: Forms,
+ service: Service,
+ db_session: Session,
+):
+ """Notifies the oncall person about a completed form."""
+ oncall_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=form.project.id, plugin_type="oncall"
+ )
+ if not oncall_plugin:
+ log.debug("Unable to send email since oncall plugin is not active.")
+ else:
+ current_oncall = oncall_plugin.instance.get(service.external_id)
+ if current_oncall:
+ send_completed_form_email(current_oncall, form, db_session)
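`create()` above builds the SQLAlchemy row from the Pydantic payload while excluding the relationship fields (`creator`, `project`, `service`), which are resolved to foreign-key ids separately. The exclude semantics can be sketched with a plain dict, no Pydantic required (payload values are hypothetical):

```python
def model_dict(data, exclude=()):
    """Minimal stand-in for Pydantic's .dict(exclude=...): keep only the
    column fields, dropping relationships resolved to ids elsewhere."""
    return {k: v for k, v in data.items() if k not in exclude}


payload = {
    "name": "Post Incident Review",
    "enabled": True,
    "creator": {"email": "reporter@example.com"},
    "project": {"name": "default"},
    "service": {"id": 7},
}
columns = model_dict(payload, exclude={"creator", "project", "service"})
print(columns)  # {'name': 'Post Incident Review', 'enabled': True}
```

Passing the nested `creator`/`project`/`service` objects straight into the model constructor would fail, since the SQLAlchemy columns expect ids, not dicts.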
diff --git a/src/dispatch/forms/type/views.py b/src/dispatch/forms/type/views.py
new file mode 100644
index 000000000000..d2e87ad63136
--- /dev/null
+++ b/src/dispatch/forms/type/views.py
@@ -0,0 +1,107 @@
+import logging
+from fastapi import APIRouter, HTTPException, status, Depends
+from pydantic import ValidationError
+from sqlalchemy.exc import IntegrityError
+
+from dispatch.auth.permissions import (
+ FeedbackDeletePermission,
+ PermissionsDependency,
+)
+from dispatch.auth.service import CurrentUser
+from dispatch.database.core import DbSession
+from dispatch.database.service import search_filter_sort_paginate, CommonParameters
+from dispatch.models import PrimaryKey
+
+from .models import FormsTypeRead, FormsTypeCreate, FormsTypeUpdate, FormsTypePagination
+from .service import get, delete, create, update
+
+log = logging.getLogger(__name__)
+
+router = APIRouter()
+
+
+@router.get("", response_model=FormsTypePagination)
+def get_forms(commons: CommonParameters):
+ """Get all form types, or only those matching a given search term."""
+ return search_filter_sort_paginate(model="FormsType", **commons)
+
+
+@router.get("/{forms_type_id}", response_model=FormsTypeRead)
+def get_form(db_session: DbSession, forms_type_id: PrimaryKey):
+ """Get a form type by its id."""
+ forms_type = get(db_session=db_session, forms_type_id=forms_type_id)
+ if not forms_type:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A form type with this id does not exist."}],
+ )
+ return forms_type
+
+
+@router.post("", response_model=FormsTypeRead)
+def create_forms_type(
+ db_session: DbSession,
+ forms_type_in: FormsTypeCreate,
+ current_user: CurrentUser,
+):
+ """Create a new form type."""
+ try:
+ return create(db_session=db_session, creator=current_user, forms_type_in=forms_type_in)
+ except IntegrityError:
+ raise ValidationError(
+ [
+ {
+ "msg": "A form type with this name already exists.",
+ "loc": "name",
+ }
+ ],
+ ) from None
+
+
+@router.put(
+ "/{forms_type_id}/{individual_contact_id}",
+ response_model=FormsTypeRead,
+ dependencies=[Depends(PermissionsDependency([FeedbackDeletePermission]))],
+)
+def update_forms_type(
+ db_session: DbSession,
+ forms_type_id: PrimaryKey,
+ forms_type_in: FormsTypeUpdate,
+):
+ """Update a form type."""
+ forms_type = get(db_session=db_session, forms_type_id=forms_type_id)
+ if not forms_type:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A form type with this id does not exist."}],
+ )
+ try:
+ forms_type = update(
+ db_session=db_session, forms_type=forms_type, forms_type_in=forms_type_in
+ )
+ except IntegrityError:
+ raise ValidationError(
+ [
+ {
+ "msg": "A form type with this name already exists.",
+ "loc": "name",
+ }
+ ],
+ ) from None
+ return forms_type
+
+
+@router.delete(
+ "/{forms_type_id}/{individual_contact_id}",
+ response_model=None,
+ dependencies=[Depends(PermissionsDependency([FeedbackDeletePermission]))],
+)
+def delete_form(db_session: DbSession, forms_type_id: PrimaryKey):
+ """Delete a form type, returning only an HTTP 200 OK if successful."""
+ forms_type = get(db_session=db_session, forms_type_id=forms_type_id)
+ if not forms_type:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A form type with this id does not exist."}],
+ )
+ delete(db_session=db_session, forms_type_id=forms_type_id)
diff --git a/src/dispatch/forms/views.py b/src/dispatch/forms/views.py
new file mode 100644
index 000000000000..f1dbd528b5a7
--- /dev/null
+++ b/src/dispatch/forms/views.py
@@ -0,0 +1,138 @@
+import logging
+from fastapi import APIRouter, HTTPException, status, Depends, Response
+from pydantic import ValidationError
+
+from sqlalchemy.exc import IntegrityError
+
+from dispatch.auth.permissions import (
+ FeedbackDeletePermission,
+ PermissionsDependency,
+ SensitiveProjectActionPermission,
+)
+from dispatch.database.core import DbSession
+from dispatch.auth.service import CurrentUser
+from dispatch.database.service import search_filter_sort_paginate, CommonParameters
+from dispatch.models import PrimaryKey
+from dispatch.forms.type.service import send_email_to_service
+
+from .models import FormsRead, FormsUpdate, FormsPagination
+from .service import get, create, update, delete, export
+
+log = logging.getLogger(__name__)
+router = APIRouter()
+
+
+@router.get("", response_model=FormsPagination)
+def get_forms(commons: CommonParameters):
+ """Get all forms, or only those matching a given search term."""
+ return search_filter_sort_paginate(model="Forms", **commons)
+
+
+@router.get("/{form_id}", response_model=FormsRead)
+def get_form(db_session: DbSession, form_id: PrimaryKey):
+ """Get a form by its id."""
+ form = get(db_session=db_session, forms_id=form_id)
+ if not form:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A form with this id does not exist."}],
+ )
+ return form
+
+
+@router.post("/completed/{form_id}", response_model=None)
+def send_form_completed_email(db_session: DbSession, form_id: PrimaryKey):
+ """Sends an email to the oncall service indicating the form is complete."""
+ form = get(db_session=db_session, forms_id=form_id)
+ if not form:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A form with this id does not exist."}],
+ )
+ if not form.form_type or not form.form_type.service:
+ log.warning(f"Missing form type or form type service for form: {form}")
+ return Response(status_code=status.HTTP_204_NO_CONTENT)
+ send_email_to_service(db_session=db_session, service=form.form_type.service, form=form)
+ return Response(status_code=status.HTTP_204_NO_CONTENT)
+
+
+@router.post("", response_model=FormsRead)
+def create_forms(
+ db_session: DbSession,
+ forms_in: dict,
+ current_user: CurrentUser,
+):
+ """Create a new form."""
+ try:
+ return create(db_session=db_session, creator=current_user, forms_in=forms_in)
+ except IntegrityError:
+ raise ValidationError(
+ [
+ {
+ "msg": "A form with this name already exists.",
+ "loc": "name",
+ }
+ ],
+ ) from None
+
+
+@router.post(
+ "/export",
+ summary="Exports forms",
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def export_forms(
+ db_session: DbSession,
+ ids: list[int],
+):
+ """Exports forms."""
+ return export(db_session=db_session, ids=ids)
+
+
+@router.put(
+ "/{forms_id}/{individual_contact_id}",
+ response_model=FormsRead,
+ dependencies=[Depends(PermissionsDependency([FeedbackDeletePermission]))],
+)
+def update_forms(
+ db_session: DbSession,
+ forms_id: PrimaryKey,
+ forms_in: FormsUpdate,
+):
+ """Update a form."""
+ forms = get(db_session=db_session, forms_id=forms_id)
+ if not forms:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A form with this id does not exist."}],
+ )
+ try:
+ forms = update(db_session=db_session, forms=forms, forms_in=forms_in)
+ except IntegrityError:
+ raise ValidationError(
+ [
+ {
+ "msg": "A form with this name already exists.",
+ "loc": "name",
+ }
+ ],
+ ) from None
+ return forms
+
+
+# here the individual_contact_id is the creator of the form
+# used to validate if they have permission to delete
+@router.delete(
+ "/{forms_id}/{individual_contact_id}",
+ response_model=None,
+ dependencies=[Depends(PermissionsDependency([FeedbackDeletePermission]))],
+)
+def delete_form(db_session: DbSession, forms_id: PrimaryKey):
+ """Delete a form, returning only an HTTP 200 OK if successful."""
+ form = get(db_session=db_session, forms_id=forms_id)
+ if not form:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A form with this id does not exist."}],
+ )
+ delete(db_session=db_session, forms_id=forms_id)
diff --git a/src/dispatch/group/enums.py b/src/dispatch/group/enums.py
new file mode 100644
index 000000000000..109c43998b56
--- /dev/null
+++ b/src/dispatch/group/enums.py
@@ -0,0 +1,11 @@
+from dispatch.enums import DispatchEnum
+
+
+class GroupType(DispatchEnum):
+ tactical = "tactical"
+ notifications = "notifications"
+
+
+class GroupAction(DispatchEnum):
+ add_member = "add_member"
+ remove_member = "remove_member"
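`DispatchEnum` is assumed here to be a `str`-mixin `Enum` (as elsewhere in Dispatch); that is what lets `flows.py` compare `group_type == GroupType.notifications` against plain strings and interpolate members directly into group names like `f"{subject.name}-{GroupType.notifications}"`. A sketch under that assumption:

```python
from enum import Enum


class DispatchEnum(str, Enum):
    """Assumed shape of dispatch.enums.DispatchEnum: a str-mixin Enum."""

    def __str__(self) -> str:
        return str.__str__(self)


class GroupType(DispatchEnum):
    tactical = "tactical"
    notifications = "notifications"


# Members compare equal to their string values and interpolate cleanly:
assert GroupType.notifications == "notifications"
print(f"dispatch-123-{GroupType.notifications}")  # dispatch-123-notifications
```

Without the `str` mixin, the equality check against `"notifications"` would be `False` and the group name would render as `GroupType.notifications`.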
diff --git a/src/dispatch/group/flows.py b/src/dispatch/group/flows.py
new file mode 100644
index 000000000000..d959ceb7e129
--- /dev/null
+++ b/src/dispatch/group/flows.py
@@ -0,0 +1,179 @@
+import logging
+from typing import TypeVar
+from sqlalchemy.orm import Session
+
+from dispatch.case.models import Case
+from dispatch.database.core import get_table_name_by_class_instance
+from dispatch.event import service as event_service
+from dispatch.incident.models import Incident
+from dispatch.plugin import service as plugin_service
+
+from .enums import GroupType, GroupAction
+from .models import Group, GroupCreate
+from .service import create
+
+log = logging.getLogger(__name__)
+
+Subject = TypeVar("Subject", Case, Incident)
+
+
+def create_group(
+ subject: Subject, group_type: str, group_participants: list[str], db_session: Session
+):
+ """Creates a group."""
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=subject.project.id, plugin_type="participant-group"
+ )
+ if not plugin:
+ log.warning("Group not created. No group plugin enabled.")
+ return
+
+ group_name = subject.name
+ if group_type == GroupType.notifications:
+ group_name = f"{subject.name}-{GroupType.notifications}"
+
+ # we create the external group
+ try:
+ external_group = plugin.instance.create(name=group_name, participants=group_participants)
+ except Exception as e:
+ log.exception(e)
+ return
+
+ if not external_group:
+ log.error(f"Group not created. Plugin {plugin.plugin.slug} encountered an error.")
+ return
+
+ external_group.update(
+ {
+ "resource_type": f"{plugin.plugin.slug}-{group_type}-group",
+ "resource_id": external_group["id"],
+ }
+ )
+
+ # we create the internal group
+ group_in = GroupCreate(
+ name=external_group["name"],
+ email=external_group["email"],
+ resource_type=external_group["resource_type"],
+ resource_id=external_group["resource_id"],
+ weblink=external_group["weblink"],
+ )
+ group = create(db_session=db_session, group_in=group_in)
+ subject.groups.append(group)
+
+ if group_type == GroupType.tactical:
+ subject.tactical_group_id = group.id
+ if group_type == GroupType.notifications:
+ subject.notifications_group_id = group.id
+
+ db_session.add(subject)
+ db_session.commit()
+
+ subject_type = get_table_name_by_class_instance(subject)
+ if subject_type == "case":
+ event_service.log_case_event(
+ db_session=db_session,
+ source=plugin.plugin.title,
+ description=f"Case {group_type} group created",
+ case_id=subject.id,
+ )
+ if subject_type == "incident":
+ event_service.log_incident_event(
+ db_session=db_session,
+ source=plugin.plugin.title,
+ description=f"Incident {group_type} group created",
+ incident_id=subject.id,
+ )
+
+ return group
+
+
+def update_group(
+ subject: Subject,
+ group: Group,
+ group_action: GroupAction,
+ group_member: str,
+ db_session: Session,
+):
+ """Updates an existing group."""
+ if group is None:
+ log.warning(
+ f"Group not updated. No group provided. Cannot {group_action} for {group_member}."
+ )
+ return
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=subject.project.id, plugin_type="participant-group"
+ )
+ if not plugin:
+ log.warning("Group not updated. No group plugin enabled.")
+ return
+
+ # we get the list of group members
+ try:
+ group_members = plugin.instance.list(email=group.email)
+ except Exception as e:
+ log.exception(e)
+ return
+
+ subject_type = get_table_name_by_class_instance(subject)
+
+ # we add the member to the group if it's not a member
+ if group_action == GroupAction.add_member and group_member not in group_members:
+ try:
+ plugin.instance.add(email=group.email, participants=[group_member])
+ except Exception as e:
+ log.exception(e)
+ return
+
+ if subject_type == "case":
+ event_service.log_case_event(
+ db_session=db_session,
+ source=plugin.plugin.title,
+ description=f"{group_member} added to case group ({group.email})",
+ case_id=subject.id,
+ )
+ if subject_type == "incident":
+ event_service.log_incident_event(
+ db_session=db_session,
+ source=plugin.plugin.title,
+ description=f"{group_member} added to incident group ({group.email})",
+ incident_id=subject.id,
+ )
+
+ # we remove the member from the group if it's a member
+ if group_action == GroupAction.remove_member and group_member in group_members:
+ try:
+ plugin.instance.remove(email=group.email, participants=[group_member])
+ except Exception as e:
+ log.exception(e)
+ return
+
+ if subject_type == "case":
+ event_service.log_case_event(
+ db_session=db_session,
+ source=plugin.plugin.title,
+ description=f"{group_member} removed from case group ({group.email})",
+ case_id=subject.id,
+ )
+ if subject_type == "incident":
+ event_service.log_incident_event(
+ db_session=db_session,
+ source=plugin.plugin.title,
+ description=f"{group_member} removed from incident group ({group.email})",
+ incident_id=subject.id,
+ )
+
+
+def delete_group(group: Group, project_id: int, db_session: Session):
+ """Deletes an existing group."""
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=project_id, plugin_type="participant-group"
+ )
+ if plugin:
+ try:
+ plugin.instance.delete(email=group.email)
+ except Exception as e:
+ log.exception(e)
+ else:
+ log.warning("Group not deleted. No group plugin enabled.")
diff --git a/src/dispatch/group/models.py b/src/dispatch/group/models.py
index 7b8e6a9cd1fc..58e962ec45cf 100644
--- a/src/dispatch/group/models.py
+++ b/src/dispatch/group/models.py
@@ -1,35 +1,47 @@
-from sqlalchemy import Column, Integer, String
+"""Models for group resources in the Dispatch application."""
+from pydantic import field_validator, EmailStr
-from dispatch.database import Base
-from dispatch.models import DispatchBase, ResourceMixin
+from sqlalchemy import Column, Integer, String, ForeignKey
+
+from dispatch.database.core import Base
+from dispatch.messaging.strings import TACTICAL_GROUP_DESCRIPTION
+from dispatch.models import NameStr, PrimaryKey
+from dispatch.models import ResourceBase, ResourceMixin
class Group(Base, ResourceMixin):
+ """SQLAlchemy model for group resources."""
id = Column(Integer, primary_key=True)
name = Column(String)
email = Column(String)
+ incident_id = Column(Integer, ForeignKey("incident.id", ondelete="CASCADE"))
+ case_id = Column(Integer, ForeignKey("case.id", ondelete="CASCADE"))
# Pydantic models...
-class GroupBase(DispatchBase):
- name: str
- email: str
- resource_id: str
- resource_type: str
- weblink: str
+class GroupBase(ResourceBase):
+ """Base Pydantic model for group resources."""
+ name: NameStr
+ email: EmailStr
class GroupCreate(GroupBase):
+ """Pydantic model for creating a group resource."""
pass
class GroupUpdate(GroupBase):
- id: int
+ """Pydantic model for updating a group resource."""
+ id: PrimaryKey | None = None
class GroupRead(GroupBase):
- id: int
-
-
-class GroupNested(GroupBase):
- id: int
+ """Pydantic model for reading a group resource."""
+ id: PrimaryKey
+ description: str | None = None
+
+ @field_validator("description", mode="before")
+ @classmethod
+ def set_description(cls, v):
+ """Sets the description for the group resource."""
+ return TACTICAL_GROUP_DESCRIPTION
diff --git a/src/dispatch/group/service.py b/src/dispatch/group/service.py
index cbef0dcde2d0..be889044bc3f 100644
--- a/src/dispatch/group/service.py
+++ b/src/dispatch/group/service.py
@@ -1,18 +1,15 @@
-from typing import Optional
-
-from fastapi.encoders import jsonable_encoder
from .models import Group, GroupCreate, GroupUpdate
-def get(*, db_session, group_id: int) -> Optional[Group]:
+def get(*, db_session, group_id: int) -> Group | None:
"""Returns a group given a group id."""
return db_session.query(Group).filter(Group.id == group_id).one_or_none()
def get_by_incident_id_and_resource_type(
*, db_session, incident_id: str, resource_type: str
-) -> Optional[Group]:
+) -> Group | None:
"""Returns a group given an incident id and group resource type."""
return (
db_session.query(Group)
@@ -37,14 +34,13 @@ def create(*, db_session, group_in: GroupCreate) -> Group:
def update(*, db_session, group: Group, group_in: GroupUpdate) -> Group:
"""Updates a group."""
- group_data = jsonable_encoder(group)
- update_data = group_in.dict(skip_defaults=True)
+ group_data = group.dict()
+ update_data = group_in.dict(exclude_unset=True)
for field in group_data:
if field in update_data:
setattr(group, field, update_data[field])
- db_session.add(group)
db_session.commit()
return group
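The rewritten `update()` applies only the fields the caller actually set (`exclude_unset=True`), leaving other columns untouched. The pattern can be sketched with plain objects standing in for the SQLAlchemy model and the Pydantic payload (names hypothetical):

```python
class Group:
    """Hypothetical stand-in for the SQLAlchemy Group model."""

    def __init__(self, name, email):
        self.name = name
        self.email = email

    def dict(self):
        return {"name": self.name, "email": self.email}


def update(group, update_data):
    """Copy only the fields present in the update payload onto the model."""
    for field in group.dict():
        if field in update_data:
            setattr(group, field, update_data[field])
    return group


group = Group(name="dispatch-1", email="dispatch-1@example.com")
update(group, {"email": "dispatch-1-new@example.com"})  # name is not touched
print(group.name, group.email)
```

This also explains the removed `db_session.add(group)` call: the instance is already attached to the session, so mutating it and committing is sufficient.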
diff --git a/src/dispatch/incident/enums.py b/src/dispatch/incident/enums.py
index 47f5a50f82a3..5c1d02ae0d94 100644
--- a/src/dispatch/incident/enums.py
+++ b/src/dispatch/incident/enums.py
@@ -1,7 +1,7 @@
-from enum import Enum
+from dispatch.enums import DispatchEnum
-class IncidentStatus(str, Enum):
+class IncidentStatus(DispatchEnum):
active = "Active"
stable = "Stable"
closed = "Closed"
diff --git a/src/dispatch/incident/flows.py b/src/dispatch/incident/flows.py
index eee51515abe7..8c00ea13fc51 100644
--- a/src/dispatch/incident/flows.py
+++ b/src/dispatch/incident/flows.py
@@ -1,872 +1,887 @@
-"""
-.. module: dispatch.incident.flows
- :platform: Unix
- :copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
- :license: Apache, see LICENSE for more details.
-
-.. moduleauthor:: Kevin Glisson
-.. moduleauthor:: Marc Vilanova
-"""
import logging
from datetime import datetime
-from typing import Any, List, Optional
-
-from dispatch.config import (
- INCIDENT_CONVERSATION_COMMANDS_REFERENCE_DOCUMENT_ID,
- INCIDENT_DOCUMENT_INVESTIGATION_SHEET_ID,
- INCIDENT_FAQ_DOCUMENT_ID,
- INCIDENT_PLUGIN_CONTACT_SLUG,
- INCIDENT_PLUGIN_CONVERSATION_SLUG,
- INCIDENT_PLUGIN_CONFERENCE_SLUG,
- INCIDENT_PLUGIN_DOCUMENT_RESOLVER_SLUG,
- INCIDENT_PLUGIN_DOCUMENT_SLUG,
- INCIDENT_PLUGIN_GROUP_SLUG,
- INCIDENT_PLUGIN_PARTICIPANT_SLUG,
- INCIDENT_PLUGIN_STORAGE_SLUG,
- INCIDENT_PLUGIN_TICKET_SLUG,
- INCIDENT_RESOURCE_CONVERSATION_COMMANDS_REFERENCE_DOCUMENT,
- INCIDENT_RESOURCE_FAQ_DOCUMENT,
- INCIDENT_RESOURCE_INCIDENT_REVIEW_DOCUMENT,
- INCIDENT_RESOURCE_INVESTIGATION_DOCUMENT,
- INCIDENT_RESOURCE_INVESTIGATION_SHEET,
- INCIDENT_RESOURCE_NOTIFICATIONS_GROUP,
- INCIDENT_RESOURCE_TACTICAL_GROUP,
- INCIDENT_STORAGE_ARCHIVAL_FOLDER_ID,
- INCIDENT_STORAGE_INCIDENT_REVIEW_FILE_ID,
- INCIDENT_STORAGE_RESTRICTED,
-)
-from dispatch.conversation import service as conversation_service
-from dispatch.conversation.models import ConversationCreate
-from dispatch.database import SessionLocal
+from sqlalchemy.orm import Session
+
+from dispatch.ai import service as ai_service
+from dispatch.case import flows as case_flows
+from dispatch.case import service as case_service
+from dispatch.case.enums import CaseResolutionReason, CaseStatus
+from dispatch.case.models import Case
+from dispatch.conference import flows as conference_flows
+from dispatch.conversation import flows as conversation_flows
+from dispatch.database.core import resolve_attr
from dispatch.decorators import background_task
-from dispatch.document import service as document_service
-from dispatch.document.models import DocumentCreate
-from dispatch.document.service import get_by_incident_id_and_resource_type as get_document
-from dispatch.enums import Visibility
-from dispatch.group import service as group_service
-from dispatch.group.models import GroupCreate
-from dispatch.conference import service as conference_service
-from dispatch.conference.models import ConferenceCreate
+from dispatch.document import flows as document_flows
+from dispatch.document.models import Document
+from dispatch.enums import DocumentResourceTypes, EventType, Visibility
+from dispatch.event import service as event_service
+from dispatch.group import flows as group_flows
+from dispatch.group.enums import GroupAction, GroupType
from dispatch.incident import service as incident_service
from dispatch.incident.models import IncidentRead
-from dispatch.incident_priority.models import IncidentPriorityRead
-from dispatch.incident_type.models import IncidentTypeRead
+from dispatch.incident.type.service import get as get_incident_type
+from dispatch.incident_cost import service as incident_cost_service
+from dispatch.individual import service as individual_service
+from dispatch.individual.models import IndividualContact
+from dispatch.auth import service as auth_service
from dispatch.participant import flows as participant_flows
from dispatch.participant import service as participant_service
+from dispatch.participant.models import Participant
from dispatch.participant_role import flows as participant_role_flows
from dispatch.participant_role.models import ParticipantRoleType
-from dispatch.plugins.base import plugins
+from dispatch.plugin import service as plugin_service
+from dispatch.report.enums import ReportTypes
+from dispatch.report.messaging import send_incident_report_reminder
from dispatch.service import service as service_service
-from dispatch.storage import service as storage_service
-from dispatch.ticket import service as ticket_service
-from dispatch.ticket.models import TicketCreate
+from dispatch.storage import flows as storage_flows
+from dispatch.tag.flows import check_for_tag_change
+from dispatch.task.enums import TaskStatus
+from dispatch.team.models import TeamContact
+from dispatch.ticket import flows as ticket_flows
+from dispatch.canvas import flows as canvas_flows
from .messaging import (
- send_incident_update_notifications,
+ bulk_participant_announcement_message,
+ send_incident_closed_information_review_reminder,
send_incident_commander_readded_notification,
+ send_incident_created_notifications,
+ send_incident_management_help_tips_message,
send_incident_new_role_assigned_notification,
- send_incident_notifications,
- send_incident_participant_announcement_message,
- send_incident_participant_has_role_ephemeral_message,
- send_incident_participant_role_not_assigned_ephemeral_message,
- send_incident_resources_ephemeral_message_to_participant,
+ send_incident_open_tasks_ephemeral_message,
+ send_incident_rating_feedback_message,
send_incident_review_document_notification,
+ send_incident_update_notifications,
send_incident_welcome_participant_messages,
- send_incident_status_report_reminder,
+ send_participant_announcement_message,
)
from .models import Incident, IncidentStatus
log = logging.getLogger(__name__)
-def get_incident_participants(
- db_session, incident_type: IncidentTypeRead, priority: IncidentPriorityRead, description: str
-):
- """Get additional incident participants based on priority, type, and description."""
- p = plugins.get(INCIDENT_PLUGIN_PARTICIPANT_SLUG)
- individual_contacts, team_contacts = p.get(
- incident_type, priority, description, db_session=db_session
- )
- return individual_contacts, team_contacts
-
-
-def get_incident_documents(
- db_session, incident_type: IncidentTypeRead, priority: IncidentPriorityRead, description: str
-):
- """Get additional incident documents based on priority, type, and description."""
- p = plugins.get(INCIDENT_PLUGIN_DOCUMENT_RESOLVER_SLUG)
- documents = p.get(incident_type, priority, description, db_session=db_session)
- return documents
-
-
-def create_incident_ticket(
- title: str, incident_type: str, priority: str, commander: str, reporter: str, visibility: str
-):
- """Create an external ticket for tracking."""
- p = plugins.get(INCIDENT_PLUGIN_TICKET_SLUG)
-
- if visibility == Visibility.restricted:
- title = incident_type
-
- ticket = p.create(title, incident_type, priority, commander, reporter)
- ticket.update({"resource_type": INCIDENT_PLUGIN_TICKET_SLUG})
- return ticket
-
-
-def update_incident_ticket(
- ticket_id: str,
- title: str = None,
- description: str = None,
- incident_type: str = None,
- priority: str = None,
- status: str = None,
- commander_email: str = None,
- reporter_email: str = None,
- conversation_weblink: str = None,
- document_weblink: str = None,
- storage_weblink: str = None,
- conference_weblink: str = None,
- labels: List[str] = None,
- cost: str = None,
- visibility: str = None,
-):
- """Update external incident ticket."""
- p = plugins.get(INCIDENT_PLUGIN_TICKET_SLUG)
-
- if visibility == Visibility.restricted:
- title = description = incident_type
-
- p.update(
- ticket_id,
- title=title,
- description=description,
- incident_type=incident_type,
- priority=priority,
- status=status,
- commander_email=commander_email,
- reporter_email=reporter_email,
- conversation_weblink=conversation_weblink,
- document_weblink=document_weblink,
- storage_weblink=storage_weblink,
- conference_weblink=conference_weblink,
- labels=labels,
- cost=cost,
- )
+def filter_participants_for_bridge(
+ participant_emails: list[str], project_id: int, db_session: Session
+) -> list[str]:
+ """Filter participant emails to only include those who have opted into bridge participation."""
+ filtered_emails = []
+ for email in participant_emails:
+ # Get the dispatch user by email
+ dispatch_user = auth_service.get_by_email(db_session=db_session, email=email)
+ if dispatch_user:
+ # Get or create user settings
+ user_settings = auth_service.get_or_create_user_settings(
+ db_session=db_session, user_id=dispatch_user.id
+ )
+ # Check if user has opted into bridge participation
+ if user_settings.auto_add_to_incident_bridges:
+ filtered_emails.append(email)
+ else:
+ # If no dispatch user found, default to adding them (they can't opt out without a user account)
+ filtered_emails.append(email)
+ return filtered_emails
- log.debug("The external ticket has been updated.")
+def get_incident_participants(
+ incident: Incident, db_session: Session
+) -> tuple[list[IndividualContact | None], list[TeamContact | None]]:
+ """
+ Get additional participants (individuals and teams) based on
+ incident description, type, and priority.
+ """
+ individual_contacts = []
+ team_contacts = []
-def create_participant_groups(
- name: str, indirect_participants: List[Any], direct_participants: List[Any]
-):
- """Create external participant groups."""
- p = plugins.get(INCIDENT_PLUGIN_GROUP_SLUG)
+ if incident.visibility == Visibility.open:
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="participant"
+ )
+ if plugin:
+ individual_contacts, team_contacts = plugin.instance.get(
+ class_instance=incident,
+ project_id=incident.project.id,
+ db_session=db_session,
+ )
+ event_service.log_incident_event(
+ db_session=db_session,
+ source=plugin.plugin.title,
+ description="Incident participants resolved",
+ incident_id=incident.id,
+ type=EventType.participant_updated,
+ )
+ else:
+ event_service.log_incident_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description="Incident participants not resolved",
+ incident_id=incident.id,
+ type=EventType.participant_updated,
+ )
+ log.warning("Incident participants not resolved. No participant plugin enabled.")
- group_name = f"{name}"
- notification_group_name = f"{group_name}-notifications"
+ return individual_contacts, team_contacts
- direct_participant_emails = [x.email for x in direct_participants]
- tactical_group = p.create(
- group_name, direct_participant_emails
- ) # add participants to core group
- indirect_participant_emails = [x.email for x in indirect_participants]
- indirect_participant_emails.append(
- tactical_group["email"]
- ) # add all those already in the tactical group
- notification_group = p.create(notification_group_name, indirect_participant_emails)
+def reactivate_incident_participants(incident: Incident, db_session: Session):
+ """Reactivates all incident participants."""
+ for participant in incident.participants:
+ try:
+ incident_add_or_reactivate_participant_flow(
+ participant.individual.email, incident.id, db_session=db_session
+ )
+ except Exception as e:
+ # don't fail to reactivate all participants if one fails
+ event_service.log_incident_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"Unable to reactivate participant with email {participant.individual.email}",
+ incident_id=incident.id,
+ type=EventType.participant_updated,
+ )
+ log.exception(e)
- tactical_group.update(
- {"resource_type": INCIDENT_RESOURCE_TACTICAL_GROUP, "resource_id": tactical_group["id"]}
- )
- notification_group.update(
- {
- "resource_type": INCIDENT_RESOURCE_NOTIFICATIONS_GROUP,
- "resource_id": notification_group["id"],
- }
+ event_service.log_incident_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description="Incident participants reactivated",
+ incident_id=incident.id,
+ type=EventType.participant_updated,
)
- return tactical_group, notification_group
+def inactivate_incident_participants(incident: Incident, db_session: Session):
+ """Inactivates all incident participants."""
+ for participant in incident.participants:
+ try:
+ participant_flows.inactivate_participant(
+ participant.individual.email, incident, db_session
+ )
+ except Exception as e:
+ # don't fail to inactivate all participants if one fails
+ event_service.log_incident_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"Unable to inactivate participant with email {participant.individual.email}",
+ incident_id=incident.id,
+ type=EventType.participant_updated,
+ )
+ log.exception(e)
-def create_conference(incident: Incident, participants: List[str]):
- """Create external conference room."""
- conference_plugin = plugins.get(INCIDENT_PLUGIN_CONFERENCE_SLUG)
- conference = conference_plugin.create(incident.name, participants=participants)
-
- conference.update(
- {"resource_type": INCIDENT_PLUGIN_CONFERENCE_SLUG, "resource_id": conference["id"]}
+ event_service.log_incident_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description="Incident participants inactivated",
+ incident_id=incident.id,
+ type=EventType.participant_updated,
)
- return conference
+def incident_create_resources(
+ *,
+ incident: Incident,
+ db_session: Session | None = None,
+ case: Case | None = None, # if it was escalated, we'll pass in the escalated case
+) -> Incident:
+ """Creates all resources required for incidents."""
+ # we create the incident ticket
+ if not incident.ticket or incident.ticket.resource_type == "jira-error-ticket":
+ ticket_flows.create_incident_ticket(incident=incident, db_session=db_session)
+
+ # we update the channel name immediately for dedicated channel cases escalated -> incident
+ if case and case.dedicated_channel and case.escalated_at is not None:
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=case.project.id, plugin_type="conversation"
+ )
+ if not plugin:
+ log.warning("Incident channel not renamed. No conversation plugin enabled.")
+ return incident
+
+ plugin.instance.rename(
+ conversation_id=incident.conversation.channel_id,
+ name=incident.name,
+ )
-def create_incident_storage(name: str, participant_group_emails: List[str]):
- """Create an external file store for incident storage."""
- p = plugins.get(INCIDENT_PLUGIN_STORAGE_SLUG)
- storage = p.create(name, participant_group_emails)
- storage.update({"resource_type": INCIDENT_PLUGIN_STORAGE_SLUG, "resource_id": storage["id"]})
+ # we resolve individual and team participants
+ individual_participants, team_participants = get_incident_participants(incident, db_session)
+ tactical_participant_emails = [i.email for i, _ in individual_participants]
- if INCIDENT_STORAGE_RESTRICTED:
- p.restrict(storage["resource_id"])
- log.debug("The incident storage has been restricted.")
+ # we create the tactical group
+ if not incident.tactical_group:
+ group_flows.create_group(
+ subject=incident,
+ group_type=GroupType.tactical,
+ group_participants=tactical_participant_emails,
+ db_session=db_session,
+ )
- return storage
+ # we create the notifications group
+ if not incident.notifications_group:
+ notification_participant_emails = [t.email for t in team_participants]
+ group_flows.create_group(
+ subject=incident,
+ group_type=GroupType.notifications,
+ group_participants=notification_participant_emails,
+ db_session=db_session,
+ )
+ # we create the storage folder
+ if not incident.storage:
+ storage_members = []
+ if incident.tactical_group and incident.notifications_group:
+ storage_members = [incident.tactical_group.email, incident.notifications_group.email]
+ else:
+ storage_members = tactical_participant_emails
-def create_collaboration_documents(
- name: str, incident_type: str, storage_id: str, template_id: int
-):
- """Create external collaboration document."""
- p = plugins.get(INCIDENT_PLUGIN_STORAGE_SLUG)
-
- document_name = f"{name} - Incident Document"
-
- # TODO can we make move and copy in one api call? (kglisson)
- document = p.copy_file(storage_id, template_id, document_name)
- p.move_file(storage_id, document["id"])
-
- # NOTE this should be optional
- if INCIDENT_DOCUMENT_INVESTIGATION_SHEET_ID:
- sheet_name = f"{name} - Incident Tracking Sheet"
- sheet = p.copy_file(storage_id, INCIDENT_DOCUMENT_INVESTIGATION_SHEET_ID, sheet_name)
- p.move_file(storage_id, sheet["id"])
-
- p.create_file(storage_id, "logs")
- p.create_file(storage_id, "screengrabs")
-
- # TODO this logic should probably be pushed down into the plugins i.e. making them return
- # the fields we expect instead of re-mapping. (kglisson)
- document.update(
- {
- "name": document_name,
- "resource_type": INCIDENT_RESOURCE_INVESTIGATION_DOCUMENT,
- "resource_id": document["id"],
- }
- )
- sheet.update(
- {
- "name": sheet_name,
- "resource_type": INCIDENT_RESOURCE_INVESTIGATION_SHEET,
- "resource_id": sheet["id"],
- }
- )
+ storage_flows.create_storage(
+ subject=incident, storage_members=storage_members, db_session=db_session
+ )
- return document, sheet
+ # we create the incident document
+ if not incident.incident_document:
+ document_flows.create_document(
+ subject=incident,
+ document_type=DocumentResourceTypes.incident,
+ document_template=incident.incident_type.incident_template_document,
+ db_session=db_session,
+ )
+ # we create the conference room
+ if not incident.conference:
+ # we only include individuals that are directly participating in the
+ # resolution of the incident and have opted into bridge participation
+ conference_participants = tactical_participant_emails
+ if incident.tactical_group:
+ conference_participants = [incident.tactical_group.email]
+ else:
+ # filter participants based on their bridge participation preferences
+ conference_participants = filter_participants_for_bridge(
+ tactical_participant_emails, incident.project.id, db_session
+ )
+ conference_flows.create_conference(
+ incident=incident, participants=conference_participants, db_session=db_session
+ )
-def create_conversation(incident: Incident, participants: List[str]):
- """Create external communication conversation."""
# we create the conversation
- convo_plugin = plugins.get(INCIDENT_PLUGIN_CONVERSATION_SLUG)
- conversation = convo_plugin.create(incident.name, participants)
-
- conversation.update(
- {"resource_type": INCIDENT_PLUGIN_CONVERSATION_SLUG, "resource_id": conversation["name"]}
- )
+ if not incident.conversation:
+ # dedicated channel cases escalated to incidents do not exercise this code
+ conversation_flows.create_incident_conversation(incident=incident, db_session=db_session)
- return conversation
+ # we update the incident ticket
+ ticket_flows.update_incident_ticket(incident_id=incident.id, db_session=db_session)
+ # we update the incident document
+ document_flows.update_document(
+ document=incident.incident_document, project_id=incident.project.id, db_session=db_session
+ )
-def set_conversation_topic(incident: Incident):
- """Sets the conversation topic."""
- convo_plugin = plugins.get(INCIDENT_PLUGIN_CONVERSATION_SLUG)
- conversation_topic = f":helmet_with_white_cross: {incident.commander.name} - Type: {incident.incident_type.name} - Priority: {incident.incident_priority.name} - Status: {incident.status}"
- convo_plugin.set_topic(incident.conversation.channel_id, conversation_topic)
+ # we set the conversation topic
+ conversation_flows.set_conversation_topic(incident, db_session)
+
+ # and set the conversation description
+ if incident.incident_type.channel_description is not None:
+ conversation_flows.set_conversation_description(incident, db_session)
+
+ # we set the conversation bookmarks
+ bookmarks = [
+ # resource, title
+ (incident.incident_document, None), # generated by resource name
+ (incident.ticket, "Incident Ticket"),
+ (incident.conference, "Incident Bridge"),
+ (incident.storage, "Incident Storage"),
+ ]
+ for resource, title in bookmarks:
+ if not resource:
+ continue
+
+ conversation_flows.add_conversation_bookmark(
+ subject=incident,
+ resource=resource,
+ db_session=db_session,
+ title=title,
+ )
- log.debug(f"Conversation topic set to {conversation_topic}.")
+ # we defer this setup for all resolved incident roles until after resources have been created
+ roles = ["reporter", "commander", "liaison", "scribe"]
+ user_emails = [
+ resolve_attr(incident, f"{role}.individual.email")
+ for role in roles
+ if resolve_attr(incident, role)
+ ]
+ user_emails = list(dict.fromkeys(user_emails))
-def update_document(
- document_id: str,
- name: str,
- priority: str,
- status: str,
- title: str,
- description: str,
- commander_fullname: str,
- conversation_weblink: str,
- document_weblink: str,
- storage_weblink: str,
- ticket_weblink: str,
- conference_weblink: str = None,
- conference_challenge: str = None,
-):
- """Update external collaboration document."""
- p = plugins.get(INCIDENT_PLUGIN_DOCUMENT_SLUG)
- p.update(
- document_id,
- name=name,
- priority=priority,
- status=status,
- title=title,
- description=description,
- commander_fullname=commander_fullname,
- conversation_weblink=conversation_weblink,
- document_weblink=document_weblink,
- storage_weblink=storage_weblink,
- ticket_weblink=ticket_weblink,
- conference_weblink=conference_weblink,
- conference_challenge=conference_challenge,
+ # we add any observer added during incident creation (e.g. a new oncall participant)
+ participant_with_observer_role = participant_service.get_by_incident_id_and_role(
+ db_session=db_session, incident_id=incident.id, role=ParticipantRoleType.observer
)
+ if participant_with_observer_role:
+ # add to list
+ user_emails.append(participant_with_observer_role.individual.email)
+
+ for user_email in user_emails:
+ # we add the participant to the tactical group
+ group_flows.update_group(
+ subject=incident,
+ group=incident.tactical_group,
+ group_action=GroupAction.add_member,
+ group_member=user_email,
+ db_session=db_session,
+ )
- log.debug("The external collaboration document has been updated.")
+ # we add the participant to the conversation
+ conversation_flows.add_incident_participants_to_conversation(
+ incident=incident,
+ participant_emails=[user_email],
+ db_session=db_session,
+ )
+ # we send the welcome messages to the participant
+ send_incident_welcome_participant_messages(user_email, incident, db_session)
-def add_participant_to_conversation(
- participant_email: str, incident_id: int, db_session: SessionLocal
-):
- """Adds a participant to the conversation."""
- # we load the incident instance
- incident = incident_service.get(db_session=db_session, incident_id=incident_id)
+ bulk_participant_announcement_message(
+ participant_emails=user_emails,
+ subject=incident,
+ db_session=db_session,
+ )
- convo_plugin = plugins.get(INCIDENT_PLUGIN_CONVERSATION_SLUG)
- convo_plugin.add(incident.conversation.channel_id, [participant_email])
+ # wait until all resources are created before adding suggested participants
+ for individual, service_id in individual_participants:
+ incident_add_or_reactivate_participant_flow(
+ individual.email,
+ incident.id,
+ participant_role=ParticipantRoleType.observer,
+ service_id=service_id,
+ db_session=db_session,
+ send_announcement_message=False,
+ )
+ # Create the participants canvas after all participants have been resolved
+ try:
+ canvas_flows.create_participants_canvas(incident=incident, db_session=db_session)
+ log.info(f"Created participants canvas for incident {incident.id}")
+ except Exception as e:
+ log.exception(f"Failed to create participants canvas for incident {incident.id}: {e}")
-@background_task
-def add_participant_to_tactical_group(user_email: str, incident_id: int, db_session=None):
- """Adds participant to the tactical group."""
- # we get the tactical group
- tactical_group = group_service.get_by_incident_id_and_resource_type(
+ event_service.log_incident_event(
db_session=db_session,
- incident_id=incident_id,
- resource_type=INCIDENT_RESOURCE_TACTICAL_GROUP,
+ source="Dispatch Core App",
+ description="Incident participants added to incident",
+ incident_id=incident.id,
+ type=EventType.participant_updated,
)
- p = plugins.get(INCIDENT_PLUGIN_GROUP_SLUG)
- p.add(tactical_group.email, [user_email])
-
- log.debug(f"{user_email} has been added to tactical group {tactical_group.email}")
+ return incident
-# TODO create some ability to checkpoint
-# We could use the model itself as the checkpoint, commiting resources as we go
-# Then checking for the existence of those resources before creating them for
-# this incident.
@background_task
-def incident_create_flow(*, incident_id: int, checkpoint: str = None, db_session=None):
- """Creates all resources required for new incidents."""
+def incident_create_resources_flow(
+ *, organization_slug: str, incident_id: int, db_session=None
+) -> Incident:
+ """Creates all resources required for an existing incident."""
+ # we get the incident
incident = incident_service.get(db_session=db_session, incident_id=incident_id)
- # get the incident participants based on incident type and priority
- individual_participants, team_participants = get_incident_participants(
- db_session, incident.incident_type, incident.incident_priority, incident.description
- )
+ # we create the incident resources
+ return incident_create_resources(incident=incident, db_session=db_session)
- # add individuals to incident
- for individual in individual_participants:
- participant_flows.add_participant(
- db_session=db_session, user_email=individual.email, incident_id=incident.id
- )
- log.debug(f"Added {len(individual_participants)} participants to incident id {incident.id}.")
+@background_task
+def incident_create_flow(
+ *,
+ organization_slug: str,
+ incident_id: int,
+ case_id: int | None = None,
+ db_session: Session | None = None,
+) -> Incident:
+ """Creates all resources required for new incidents and initiates incident response workflow."""
+ # we get the incident
+ incident = incident_service.get(db_session=db_session, incident_id=incident_id)
- # create the incident ticket
- ticket = create_incident_ticket(
- incident.title,
- incident.incident_type.name,
- incident.incident_priority.name,
- incident.commander.email,
- incident.reporter.email,
- incident.visibility,
+ case = None
+ if case_id:
+ case = case_service.get(db_session=db_session, case_id=case_id)
+
+ # we create the incident resources
+ incident_create_resources(
+ incident=incident,
+ case=case if case else None,
+ db_session=db_session,
)
- incident.ticket = ticket_service.create(db_session=db_session, ticket_in=TicketCreate(**ticket))
+ send_incident_created_notifications(incident, db_session)
- log.debug("Added ticket to incident.")
+ # we page the incident commander based on incident priority
+ if incident.incident_priority.page_commander:
+ if incident.commander.service:
+ service_id = incident.commander.service.external_id
+ oncall_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="oncall"
+ )
+ if oncall_plugin:
+ oncall_plugin.instance.page(
+ service_id=service_id,
+ incident_name=incident.name,
+ incident_title=incident.title,
+ incident_description=incident.description,
+ )
+ else:
+ log.warning("Incident commander not paged. No plugin of type oncall enabled.")
+ else:
+ log.warning(
+ "Incident commander not paged. No relationship between commander and an oncall service."
+ )
- # we set the incident name
- name = ticket["resource_id"]
- incident.name = name
+ # we send a message to the incident commander with tips on how to manage the incident
+ send_incident_management_help_tips_message(incident, db_session)
- log.debug("Added name to incident.")
+ db_session.add(incident)
+ db_session.commit()
+
+ return incident
- # we create the participant groups (tactical and notification)
- tactical_group, notification_group = create_participant_groups(
- name, team_participants, [x.individual for x in incident.participants]
+
+@background_task
+def incident_create_stable_flow(
+ *, incident_id: int, organization_slug: str = None, db_session=None
+):
+ """Creates all resources necessary when an incident is created with a stable status."""
+ incident_create_flow(
+ incident_id=incident_id, organization_slug=organization_slug, db_session=db_session
)
+ incident = incident_service.get(db_session=db_session, incident_id=incident_id)
+ incident_stable_status_flow(incident=incident, db_session=db_session)
- for g in [tactical_group, notification_group]:
- group_in = GroupCreate(
- name=g["name"],
- email=g["email"],
- resource_type=g["resource_type"],
- resource_id=g["resource_id"],
- weblink=g["weblink"],
- )
- incident.groups.append(group_service.create(db_session=db_session, group_in=group_in))
- log.debug("Added groups to incident.")
+@background_task
+def incident_create_closed_flow(
+ *, incident_id: int, organization_slug: str = None, db_session=None
+):
+ """Creates all resources necessary when an incident is created with a closed status."""
+ incident = incident_service.get(db_session=db_session, incident_id=incident_id)
- # we create storage resource
- storage = create_incident_storage(name, [tactical_group["email"], notification_group["email"]])
- incident.storage = storage_service.create(
- db_session=db_session,
- resource_id=storage["resource_id"],
- resource_type=storage["resource_type"],
- weblink=storage["weblink"],
- )
+ # we inactivate all participants
+ inactivate_incident_participants(incident, db_session)
- # we create the incident documents
- incident_document, incident_sheet = create_collaboration_documents(
- incident.name,
- incident.incident_type.name,
- incident.storage.resource_id,
- incident.incident_type.template_document.resource_id,
- )
+ # we set the stable and close times to the reported time
+ incident.stable_at = incident.closed_at = incident.reported_at
- # TODO: we need to delineate between the investigation document and suggested documents
- # # get any additional documentation based on priority or terms
- # incident_documents = get_incident_documents(
- # db_session, incident.incident_type, incident.incident_priority, incident.description
- # )
- #
- # incident.documents = incident_documents
-
- faq_document = {
- "name": "Incident FAQ",
- "resource_id": INCIDENT_FAQ_DOCUMENT_ID,
- "weblink": f"https://docs.google.com/document/d/{INCIDENT_FAQ_DOCUMENT_ID}",
- "resource_type": INCIDENT_RESOURCE_FAQ_DOCUMENT,
- }
-
- conversation_commands_reference_document = {
- "name": "Incident Conversation Commands Reference Document",
- "resource_id": INCIDENT_CONVERSATION_COMMANDS_REFERENCE_DOCUMENT_ID,
- "weblink": f"https://docs.google.com/document/d/{INCIDENT_CONVERSATION_COMMANDS_REFERENCE_DOCUMENT_ID}",
- "resource_type": INCIDENT_RESOURCE_CONVERSATION_COMMANDS_REFERENCE_DOCUMENT,
- }
-
- conference = create_conference(incident, [tactical_group["email"]])
-
- log.debug("Conference created. Tactical group added.")
-
- conference_in = ConferenceCreate(
- resource_id=conference["resource_id"],
- resource_type=conference["resource_type"],
- weblink=conference["weblink"],
- conference_id=conference["id"],
- conference_challenge=conference["challenge"]
- )
- incident.conference = conference_service.create(
- db_session=db_session, conference_in=conference_in
- )
+ # we create the incident ticket
+ ticket_flows.create_incident_ticket(incident=incident, db_session=db_session)
- log.debug("Added conference to incident.")
+ # we update the incident ticket
+ ticket_flows.update_incident_ticket(incident_id=incident.id, db_session=db_session)
- for d in [
- incident_document,
- incident_sheet,
- faq_document,
- conversation_commands_reference_document,
- ]:
- document_in = DocumentCreate(
- name=d["name"],
- resource_id=d["resource_id"],
- resource_type=d["resource_type"],
- weblink=d["weblink"],
- )
- incident.documents.append(
- document_service.create(db_session=db_session, document_in=document_in)
- )
+ db_session.add(incident)
+ db_session.commit()
- log.debug("Added documents to incident.")
- # we create the conversation for real-time communications
- conversation = create_conversation(
- incident, [x.individual.email for x in incident.participants]
- )
+def incident_active_status_flow(incident: Incident, db_session=None):
+ """Runs the incident active flow."""
+ # we un-archive the conversation
+ conversation_flows.unarchive_conversation(subject=incident, db_session=db_session)
- log.debug("Conversation created. Participants and bots added.")
- conversation_in = ConversationCreate(
- resource_id=conversation["resource_id"],
- resource_type=conversation["resource_type"],
- weblink=conversation["weblink"],
- channel_id=conversation["id"],
+def create_incident_review_document(incident: Incident, db_session=None) -> Document | None:
+ # we create the post-incident review document
+ document_flows.create_document(
+ subject=incident,
+ document_type=DocumentResourceTypes.review,
+ document_template=incident.incident_type.review_template_document,
+ db_session=db_session,
)
- incident.conversation = conversation_service.create(
- db_session=db_session, conversation_in=conversation_in
+ # we update the post-incident review document
+ document_flows.update_document(
+ document=incident.incident_review_document,
+ project_id=incident.project.id,
+ db_session=db_session,
)
+ return incident.incident_review_document
- db_session.add(incident)
- db_session.commit()
- log.debug("Added conversation to incident.")
+def handle_incident_review_updates(incident: Incident, db_session=None):
+ """Manages the steps following the creation of an incident review document.
- # we set the conversation topic
- set_conversation_topic(incident)
-
- update_incident_ticket(
- incident.ticket.resource_id,
- title=incident.title,
- description=incident.description,
- incident_type=incident.incident_type.name,
- priority=incident.incident_priority.name,
- status=incident.status,
- commander_email=incident.commander.email,
- reporter_email=incident.reporter.email,
- conversation_weblink=incident.conversation.weblink,
- document_weblink=incident_document["weblink"],
- storage_weblink=incident.storage.weblink,
- conference_weblink=incident.conference.weblink,
- visibility=incident.visibility,
- )
-
- log.debug("Updated incident ticket.")
-
- update_document(
- incident_document["id"],
- incident.name,
- incident.incident_priority.name,
- incident.status,
- incident.title,
- incident.description,
- incident.commander.name,
- incident.conversation.weblink,
- incident_document["weblink"],
- incident.storage.weblink,
- incident.ticket.weblink,
- incident.conference.weblink,
- incident.conference.conference_challenge,
+ This includes updating the incident costs with the incident review costs, notifying the participants, and bookmarking the document in the conversation.
+ """
+ # Add the incident review costs to the incident costs.
+ incident_cost_service.update_incident_response_cost(
+ incident_id=incident.id,
+ db_session=db_session,
+ incident_review=bool(incident.incident_review_document),
)
- log.debug("Updated incident document.")
-
- for participant in incident.participants:
- # we announce the participant in the conversation
- send_incident_participant_announcement_message(
- participant.individual.email, incident.id, db_session
+ if incident.incident_review_document and incident.conversation:
+ # Send a notification about the incident review document to the conversation
+ send_incident_review_document_notification(
+ incident.conversation.channel_id,
+ incident.incident_review_document.weblink,
+ incident,
+ db_session,
)
- # we send the welcome messages to the participant
- send_incident_welcome_participant_messages(
- participant.individual.email, incident.id, db_session
+ # Bookmark the incident review document in the conversation
+ conversation_flows.add_conversation_bookmark(
+ subject=incident, resource=incident.incident_review_document, db_session=db_session
)
- log.debug("Sent incident welcome and announcement notifications.")
- if incident.visibility == Visibility.open:
- send_incident_notifications(incident, db_session)
- log.debug("Sent incident notifications.")
+def incident_stable_status_flow(incident: Incident, db_session=None):
+ """Runs the incident stable flow."""
+ # Set the stable time.
+ incident.stable_at = datetime.utcnow()
+ db_session.add(incident)
+ db_session.commit()
+ if incident.incident_document:
+ # Update the incident document.
+ document_flows.update_document(
+ document=incident.incident_document,
+ project_id=incident.project.id,
+ db_session=db_session,
+ )
-@background_task
-def incident_active_flow(incident_id: int, command: Optional[dict] = None, db_session=None):
- """Runs the incident active flow."""
- # we load the incident instance
- incident = incident_service.get(db_session=db_session, incident_id=incident_id)
+ if incident.incident_review_document:
+ log.info("The post-incident review document has already been created. Skipping creation...")
+ return
- # we update the status of the external ticket
- update_incident_ticket(
- incident.ticket.resource_id,
- incident_type=incident.incident_type.name,
- status=IncidentStatus.active.lower(),
+ incident_type = get_incident_type(
+ db_session=db_session, incident_type_id=incident.incident_type_id
)
+ if incident_type.exclude_from_review:
+ log.info(
+ f"Incident of type {incident_type.name} is excluded from review. Skipping creation..."
+ )
+ return
- send_incident_status_report_reminder(incident)
+ # Create the post-incident review document.
+ create_incident_review_document(incident=incident, db_session=db_session)
- log.debug(f"We have updated the status of the external ticket to {IncidentStatus.active}.")
+ handle_incident_review_updates(incident=incident, db_session=db_session)
-@background_task
-def incident_stable_flow(incident_id: int, command: Optional[dict] = None, db_session=None):
- """Runs the incident stable flow."""
- # we load the incident instance
- incident = incident_service.get(db_session=db_session, incident_id=incident_id)
+def incident_closed_status_flow(incident: Incident, db_session=None):
+ """Runs the incident closed flow."""
+ # we inactivate all participants
+ inactivate_incident_participants(incident, db_session)
- # we set the stable time
- incident.stable_at = datetime.utcnow()
- log.debug(f"We have set the stable time.")
+ # we set the closed time
+ incident.closed_at = datetime.utcnow()
+ db_session.add(incident)
+ db_session.commit()
- # we remind the incident commander to write a status report
- send_incident_status_report_reminder(incident)
+ # we archive the conversation
+ conversation_flows.archive_conversation(subject=incident, db_session=db_session)
- # we update the incident cost
- incident_cost = incident_service.calculate_cost(incident_id, db_session)
- log.debug(f"We have updated the cost of the incident.")
+ if incident.visibility == Visibility.open:
+ storage_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="storage"
+ )
+ if storage_plugin:
+ if storage_plugin.configuration.open_on_close:
+ for document in [incident.incident_document, incident.incident_review_document]:
+ document_flows.open_document_access(document=document, db_session=db_session)
+
+ if storage_plugin.configuration.read_only:
+ for document in [incident.incident_document, incident.incident_review_document]:
+ document_flows.mark_document_as_readonly(
+ document=document, db_session=db_session
+ )
+
+ for case in incident.cases:
+ try:
+ case.resolution = (
+ f"Closed as part of incident {incident.name}. See incident for more details."
+ )
+ case.resolution_reason = CaseResolutionReason.escalated
+ case.status = CaseStatus.closed
+ case_flows.case_closed_status_flow(case=case, db_session=db_session)
+ except Exception as e:
+ log.exception(
+ f"Failed to close case {case.name} while closing incident {incident.name}. Error: {str(e)}"
+ )
- # we update the external ticket
- update_incident_ticket(
- incident.ticket.resource_id, status=IncidentStatus.stable.lower(), cost=incident_cost
- )
+ # we send a direct message to the incident commander asking to review
+ # the incident's information and to tag the incident if appropriate
+ send_incident_closed_information_review_reminder(incident, db_session)
- log.debug(f"We have updated the status of the external ticket to {IncidentStatus.stable}.")
+ # we send a direct message to all participants asking them
+ # to rate and provide feedback about the incident
+ send_incident_rating_feedback_message(incident, db_session)
- incident_review_document = get_document(
- db_session=db_session,
- incident_id=incident.id,
- resource_type=INCIDENT_RESOURCE_INCIDENT_REVIEW_DOCUMENT,
- )
+ # if an AI plugin is enabled, we send the incident review doc for summary
+ ai_service.generate_incident_summary(incident=incident, db_session=db_session)
- if not incident_review_document:
- storage_plugin = plugins.get(INCIDENT_PLUGIN_STORAGE_SLUG)
- # we create a copy of the incident review document template and we move it to the incident storage
- incident_review_document_name = f"{incident.name} - Post Incident Review Document"
- incident_review_document = storage_plugin.copy_file(
- team_drive_id=incident.storage.resource_id,
- file_id=INCIDENT_STORAGE_INCIDENT_REVIEW_FILE_ID,
- name=incident_review_document_name,
- )
+def conversation_topic_dispatcher(
+ user_email: str,
+ incident: Incident,
+ previous_incident: dict,
+ db_session: Session,
+):
+ """Determines if the conversation topic needs to be updated."""
+ # we load the individual
+ individual = individual_service.get_by_email_and_project(
+ db_session=db_session, email=user_email, project_id=incident.project.id
+ )
- incident_review_document.update(
- {
- "name": incident_review_document_name,
- "resource_type": INCIDENT_RESOURCE_INCIDENT_REVIEW_DOCUMENT,
- }
+ conversation_topic_change = False
+ if previous_incident.title != incident.title:
+ event_service.log_incident_event(
+ db_session=db_session,
+ source="Incident Participant",
+ description=f'{individual.name} changed the incident title to "{incident.title}"',
+ incident_id=incident.id,
+ individual_id=individual.id,
+ type=EventType.field_updated,
+ owner=individual.name,
)
- storage_plugin.move_file(
- new_team_drive_id=incident.storage.resource_id, file_id=incident_review_document["id"]
+ if previous_incident.description != incident.description:
+ event_service.log_incident_event(
+ db_session=db_session,
+ source="Incident Participant",
+ description=f"{individual.name} changed the incident description",
+ details={"description": incident.description},
+ incident_id=incident.id,
+ individual_id=individual.id,
+ type=EventType.field_updated,
+ owner=individual.name,
)
- log.debug("We have added the incident review document in the incident storage.")
-
- document_in = DocumentCreate(
- name=incident_review_document["name"],
- resource_id=incident_review_document["id"],
- resource_type=incident_review_document["resource_type"],
- weblink=incident_review_document["weblink"],
- )
- incident.documents.append(
- document_service.create(db_session=db_session, document_in=document_in)
+ description, details = check_for_tag_change(
+ previous_incident_tags=previous_incident.tags, current_incident_tags=incident.tags
+ )
+ if description:
+ event_service.log_incident_event(
+ db_session=db_session,
+ source="Incident Participant",
+ description=f"{individual.name} {description}",
+ details=details,
+ incident_id=incident.id,
+ individual_id=individual.id,
+ type=EventType.field_updated,
+ owner=individual.name,
)
- log.debug("We have added the incident review document to the incident.")
+ if previous_incident.incident_type.name != incident.incident_type.name:
+ conversation_topic_change = True
- # we get the incident investigation and faq documents
- incident_document = get_document(
+ event_service.log_incident_event(
db_session=db_session,
- incident_id=incident_id,
- resource_type=INCIDENT_RESOURCE_INVESTIGATION_DOCUMENT,
- )
-
- # we update the incident review document
- update_document(
- incident_review_document["id"],
- incident.name,
- incident.incident_priority.name,
- incident.status,
- incident.title,
- incident.description,
- incident.commander.name,
- incident.conversation.weblink,
- incident_document.weblink,
- incident.storage.weblink,
- incident.ticket.weblink,
+ source="Incident Participant",
+ description=f"{individual.name} changed the incident type to {incident.incident_type.name}",
+ incident_id=incident.id,
+ individual_id=individual.id,
+ type=EventType.field_updated,
+ owner=individual.name,
)
- log.debug("We have updated the incident review document.")
-
- # we send a notification about the incident review document to the conversation
- send_incident_review_document_notification(
- incident.conversation.channel_id, incident_review_document["weblink"]
- )
+ if previous_incident.incident_severity.name != incident.incident_severity.name:
+ conversation_topic_change = True
- log.debug(
- "We have sent a notification about the incident review document to the conversation."
+ event_service.log_incident_event(
+ db_session=db_session,
+ source="Incident Participant",
+ description=f"{individual.name} changed the incident severity to {incident.incident_severity.name}",
+ incident_id=incident.id,
+ individual_id=individual.id,
+ type=EventType.assessment_updated,
+ owner=individual.name,
)
- db_session.add(incident)
- db_session.commit()
-
- log.debug("We have sent the incident stable notifications.")
-
-
-@background_task
-def incident_closed_flow(incident_id: int, command: Optional[dict] = None, db_session=None):
- """Runs the incident closed flow."""
- # we load the incident instance
- incident = incident_service.get(db_session=db_session, incident_id=incident_id)
-
- # we set the closed time
- incident.closed_at = datetime.utcnow()
- log.debug(f"We have set the closed time.")
-
- # we update the incident cost
- incident_cost = incident_service.calculate_cost(incident_id, db_session)
- log.debug(f"We have updated the cost of the incident.")
-
- # we archive the conversation
- convo_plugin = plugins.get(INCIDENT_PLUGIN_CONVERSATION_SLUG)
- convo_plugin.archive(incident.conversation.channel_id)
- log.debug("We have archived the incident conversation.")
-
- # we update the external ticket
- update_incident_ticket(
- incident.ticket.resource_id, status=IncidentStatus.closed.lower(), cost=incident_cost
- )
- log.debug(f"We have updated the status of the external ticket to {IncidentStatus.closed}.")
-
- if incident.visibility == Visibility.open:
- # we archive the artifacts in the storage
- storage_plugin = plugins.get(INCIDENT_PLUGIN_STORAGE_SLUG)
- storage_plugin.archive(
- source_team_drive_id=incident.storage.resource_id,
- dest_team_drive_id=INCIDENT_STORAGE_ARCHIVAL_FOLDER_ID,
- folder_name=incident.name,
- )
- log.debug(
- "We have archived the incident artifacts in the archival folder and re-applied permissions and deleted the source."
- )
+ if previous_incident.incident_priority.name != incident.incident_priority.name:
+ conversation_topic_change = True
- # we get the tactical group
- tactical_group = group_service.get_by_incident_id_and_resource_type(
+ event_service.log_incident_event(
db_session=db_session,
- incident_id=incident_id,
- resource_type=INCIDENT_RESOURCE_TACTICAL_GROUP,
+ source="Incident Participant",
+ description=f"{individual.name} changed the incident priority to {incident.incident_priority.name}",
+ incident_id=incident.id,
+ individual_id=individual.id,
+ type=EventType.assessment_updated,
+ owner=individual.name,
)
- # we get the notifications group
- notifications_group = group_service.get_by_incident_id_and_resource_type(
+ if previous_incident.status != incident.status:
+ conversation_topic_change = True
+
+ event_service.log_incident_event(
db_session=db_session,
- incident_id=incident_id,
- resource_type=INCIDENT_RESOURCE_NOTIFICATIONS_GROUP,
+ source="Incident Participant",
+ description=f"{individual.name} marked the incident as {incident.status.lower()}",
+ incident_id=incident.id,
+ individual_id=individual.id,
+ type=EventType.assessment_updated,
+ owner=individual.name,
)
- group_plugin = plugins.get(INCIDENT_PLUGIN_GROUP_SLUG)
- group_plugin.delete(email=tactical_group.email)
- group_plugin.delete(email=notifications_group.email)
- log.debug("We have deleted the notification and tactical groups.")
+ if conversation_topic_change:
+ if incident.status != IncidentStatus.closed:
+ conversation_flows.set_conversation_topic(incident, db_session)
- # Delete the conference
- conference = conference_service.get_by_incident_id(
- db_session=db_session,
- incident_id=incident_id
- )
- conference_plugin = plugins.get(INCIDENT_PLUGIN_CONFERENCE_SLUG)
- conference_plugin.delete(conference.conference_id)
- db_session.add(incident)
- db_session.commit()
+def status_flow_dispatcher(
+ incident: Incident,
+ current_status: IncidentStatus,
+ previous_status: IncidentStatus,
+ db_session: Session,
+):
+ """Runs the correct flows depending on the incident's current and previous status."""
+ # we have a currently active incident
+ if current_status == IncidentStatus.active:
+ if previous_status == IncidentStatus.closed:
+ # re-activate incident
+ incident_active_status_flow(incident=incident, db_session=db_session)
+ reactivate_incident_participants(incident=incident, db_session=db_session)
+ send_incident_report_reminder(incident, ReportTypes.tactical_report, db_session)
+ elif previous_status == IncidentStatus.stable:
+ send_incident_report_reminder(incident, ReportTypes.tactical_report, db_session)
+
+ # we currently have a stable incident
+ elif current_status == IncidentStatus.stable:
+ if previous_status == IncidentStatus.active:
+ incident_stable_status_flow(incident=incident, db_session=db_session)
+ send_incident_report_reminder(incident, ReportTypes.tactical_report, db_session)
+ elif previous_status == IncidentStatus.closed:
+ incident_active_status_flow(incident=incident, db_session=db_session)
+ incident_stable_status_flow(incident=incident, db_session=db_session)
+ reactivate_incident_participants(incident=incident, db_session=db_session)
+ send_incident_report_reminder(incident, ReportTypes.tactical_report, db_session)
+
+ # we currently have a closed incident
+ elif current_status == IncidentStatus.closed:
+ if previous_status == IncidentStatus.active:
+ incident_stable_status_flow(incident=incident, db_session=db_session)
+ incident_closed_status_flow(incident=incident, db_session=db_session)
+ elif previous_status == IncidentStatus.stable:
+ incident_closed_status_flow(incident=incident, db_session=db_session)
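The branching in `status_flow_dispatcher` above is effectively a transition table keyed on `(previous_status, current_status)`. The sketch below condenses it into an explicit lookup; the flow names are shorthand stand-ins for the real flow functions in the diff, not Dispatch APIs.

```python
from enum import Enum

class IncidentStatus(str, Enum):
    # Mirrors the three statuses used by the dispatcher.
    active = "Active"
    stable = "Stable"
    closed = "Closed"

# (previous, current) -> ordered list of flows to run, matching the
# if/elif chain in status_flow_dispatcher above.
TRANSITIONS = {
    (IncidentStatus.closed, IncidentStatus.active): ["active", "reactivate", "remind"],
    (IncidentStatus.stable, IncidentStatus.active): ["remind"],
    (IncidentStatus.active, IncidentStatus.stable): ["stable", "remind"],
    (IncidentStatus.closed, IncidentStatus.stable): ["active", "stable", "reactivate", "remind"],
    (IncidentStatus.active, IncidentStatus.closed): ["stable", "closed"],
    (IncidentStatus.stable, IncidentStatus.closed): ["closed"],
}

def flows_for(previous: IncidentStatus, current: IncidentStatus) -> list:
    # An unchanged or unknown transition runs nothing.
    return TRANSITIONS.get((previous, current), [])
```

Note that a closed-to-stable change replays the active and stable flows in order, so a re-opened incident always passes through the same resource setup as a live one.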
@background_task
def incident_update_flow(
- user_email: str, incident_id: int, previous_incident: IncidentRead, notify=True, db_session=None
+ user_email: str,
+ commander_email: str,
+ reporter_email: str,
+ incident_id: int,
+ previous_incident: IncidentRead,
+ organization_slug: str = None,
+ db_session=None,
):
"""Runs the incident update flow."""
- conversation_topic_change = False
-
- # we load the incident instance
+ # we load the incident
incident = incident_service.get(db_session=db_session, incident_id=incident_id)
- if previous_incident.incident_type.name != incident.incident_type.name:
- conversation_topic_change = True
-
- if previous_incident.incident_priority.name != incident.incident_priority.name:
- conversation_topic_change = True
-
- if previous_incident.status.value != incident.status:
- conversation_topic_change = True
+ if incident.commander.individual.email != commander_email:
+ # we assign the commander role to another participant
+ incident_assign_role_flow(
+ incident_id=incident_id,
+ assigner_email=user_email,
+ assignee_email=commander_email,
+ assignee_role=ParticipantRoleType.incident_commander,
+ db_session=db_session,
+ )
- if conversation_topic_change:
- # we update the conversation topic
- set_conversation_topic(incident)
+ if incident.reporter.individual.email != reporter_email:
+ # we assign the reporter role to another participant
+ incident_assign_role_flow(
+ incident_id=incident_id,
+ assigner_email=user_email,
+ assignee_email=reporter_email,
+ assignee_role=ParticipantRoleType.reporter,
+ db_session=db_session,
+ )
- if notify:
- send_incident_update_notifications(incident, previous_incident)
+ # we run the active, stable or closed flows based on incident status change
+ status_flow_dispatcher(incident, incident.status, previous_incident.status, db_session)
- # we get the incident document
- incident_document = get_document(
- db_session=db_session,
- incident_id=incident_id,
- resource_type=INCIDENT_RESOURCE_INVESTIGATION_DOCUMENT,
- )
+ # we update the conversation topic
+ conversation_topic_dispatcher(user_email, incident, previous_incident, db_session)
# we update the external ticket
- update_incident_ticket(
- incident.ticket.resource_id,
- title=incident.title,
- description=incident.description,
- incident_type=incident.incident_type.name,
- priority=incident.incident_priority.name,
- commander_email=incident.commander.email,
- conversation_weblink=incident.conversation.weblink,
- conference_weblink=incident.conference.weblink,
- document_weblink=incident_document.weblink,
- storage_weblink=incident.storage.weblink,
- visibility=incident.visibility,
- )
-
- log.debug(f"Updated the external ticket {incident.ticket.resource_id}.")
-
- # get the incident participants based on incident type and priority
- individual_participants, team_participants = get_incident_participants(
- db_session, incident.incident_type, incident.incident_priority, incident.description
- )
+ ticket_flows.update_incident_ticket(incident_id=incident.id, db_session=db_session)
+
+ if incident.status == IncidentStatus.active:
+ # we re-resolve and add individuals to the incident
+ individual_participants, team_participants = get_incident_participants(incident, db_session)
+
+ for individual, service_id in individual_participants:
+ incident_add_or_reactivate_participant_flow(
+ individual.email,
+ incident.id,
+ participant_role=ParticipantRoleType.observer,
+ service_id=service_id,
+ db_session=db_session,
+ )
- # we add the individuals as incident participants
- for individual in individual_participants:
- incident_add_or_reactivate_participant_flow(
- individual.email, incident.id, db_session=db_session
+ # we add the team distributions lists to the notifications group
+ # we only have to do this for teams as new members
+ # will be added to the tactical group on incident join
+ group_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="participant-group"
)
+ if group_plugin and incident.notifications_group:
+ team_participant_emails = [x.email for x in team_participants]
+ group_plugin.instance.add(incident.notifications_group.email, team_participant_emails)
- # we get the tactical group
- notification_group = group_service.get_by_incident_id_and_resource_type(
- db_session=db_session,
- incident_id=incident.id,
- resource_type=INCIDENT_RESOURCE_NOTIFICATIONS_GROUP,
- )
- team_participant_emails = [x.email for x in team_participants]
+ # we send the incident update notifications
+ send_incident_update_notifications(incident, previous_incident, db_session)
- # we add the team distributions lists to the notifications group
- group_plugin = plugins.get(INCIDENT_PLUGIN_GROUP_SLUG)
- group_plugin.add(notification_group.email, team_participant_emails)
- log.debug(f"Resolved and added new participants to the incident.")
+def incident_delete_flow(incident: Incident, db_session: Session):
+ """Deletes all external incident resources."""
+ # we delete the external ticket
+ if incident.ticket:
+ ticket_flows.delete_ticket(
+ ticket=incident.ticket, project_id=incident.project.id, db_session=db_session
+ )
- if previous_incident.status.value != incident.status:
- if incident.status == IncidentStatus.active:
- incident_active_flow(incident_id=incident.id, db_session=db_session)
- elif incident.status == IncidentStatus.stable:
- incident_stable_flow(incident_id=incident.id, db_session=db_session)
- elif incident.status == IncidentStatus.closed:
- if previous_incident.status.value == IncidentStatus.active:
- incident_stable_flow(incident_id=incident.id, db_session=db_session)
- incident_closed_flow(incident_id=incident.id, db_session=db_session)
+ # we delete the external groups
+ if incident.groups:
+ for group in incident.groups:
+ group_flows.delete_group(
+ group=group, project_id=incident.project.id, db_session=db_session
+ )
+
+ # we delete the external storage
+ if incident.storage:
+ storage_flows.delete_storage(
+ storage=incident.storage, project_id=incident.project.id, db_session=db_session
+ )
- log.debug(f"Finished running status flow. Status: {incident.status}")
+ # we delete the conversation
+ if incident.conversation:
+ conversation_flows.delete_conversation(
+ conversation=incident.conversation,
+ project_id=incident.project.id,
+ db_session=db_session,
+ )
-@background_task
def incident_assign_role_flow(
- assigner_email: str, incident_id: int, assignee_email: str, assignee_role: str, db_session=None
+ incident_id: int,
+ assigner_email: str,
+ assignee_email: str,
+ assignee_role: str,
+ db_session: Session,
):
"""Runs the incident participant role assignment flow."""
- # we resolve the assigner and assignee's contact information
- contact_plugin = plugins.get(INCIDENT_PLUGIN_CONTACT_SLUG)
- assigner_contact_info = contact_plugin.get(assigner_email)
- assignee_contact_info = contact_plugin.get(assignee_email)
-
# we load the incident instance
incident = incident_service.get(db_session=db_session, incident_id=incident_id)
- # we get the participant object for the assignee
- assignee_participant = participant_service.get_by_incident_id_and_email(
- db_session=db_session, incident_id=incident.id, email=assignee_contact_info["email"]
- )
-
- if not assignee_participant:
- # The assignee is not a participant. We add them to the incident
- incident_add_or_reactivate_participant_flow(
- assignee_email, incident.id, db_session=db_session
- )
+ # we add the assignee to the incident if they're not a participant
+ incident_add_or_reactivate_participant_flow(assignee_email, incident.id, db_session=db_session)
# we run the participant assign role flow
result = participant_role_flows.assign_role_flow(
- incident.id, assignee_contact_info, assignee_role, db_session
+ incident, assignee_email, assignee_role, db_session
)
if result == "assignee_has_role":
# NOTE: This is disabled until we can determine the source of the caller
# we let the assigner know that the assignee already has this role
# send_incident_participant_has_role_ephemeral_message(
- # assigner_email, assignee_contact_info, assignee_role, incident
+ # assigner_email, assignee_contact_info, assignee_role, incident
# )
return
@@ -874,140 +889,362 @@ def incident_assign_role_flow(
# NOTE: This is disabled until we can determine the source of the caller
# we let the assigner know that we were not able to assign the role
# send_incident_participant_role_not_assigned_ephemeral_message(
- # assigner_email, assignee_contact_info, assignee_role, incident
+ # assigner_email, assignee_contact_info, assignee_role, incident
# )
return
- if assignee_role != ParticipantRoleType.participant:
- # we send a notification to the incident conversation
- send_incident_new_role_assigned_notification(
- assigner_contact_info, assignee_contact_info, assignee_role, incident
- )
+ if incident.status != IncidentStatus.closed:
+ if assignee_role != ParticipantRoleType.participant:
+ # we resolve the assigner and assignee contact information
+ contact_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="contact"
+ )
- if assignee_role == ParticipantRoleType.incident_commander:
- # we update the conversation topic
- set_conversation_topic(incident)
+ if contact_plugin:
+ assigner_contact_info = contact_plugin.instance.get(
+ assigner_email, db_session=db_session
+ )
+ assignee_contact_info = contact_plugin.instance.get(
+ assignee_email, db_session=db_session
+ )
+ else:
+ assigner_contact_info = {
+ "email": assigner_email,
+ "fullname": "Unknown",
+ "weblink": "",
+ }
+ assignee_contact_info = {
+ "email": assignee_email,
+ "fullname": "Unknown",
+ "weblink": "",
+ }
+
+ # we send a notification to the incident conversation
+ send_incident_new_role_assigned_notification(
+ assigner_contact_info, assignee_contact_info, assignee_role, incident, db_session
+ )
- # we get the incident document
- incident_document = get_document(
- db_session=db_session,
- incident_id=incident_id,
- resource_type=INCIDENT_RESOURCE_INVESTIGATION_DOCUMENT,
- )
+ if assignee_role == ParticipantRoleType.incident_commander:
+ # we update the conversation topic
+ conversation_flows.set_conversation_topic(incident, db_session)
- # we update the external ticket
- update_incident_ticket(
- incident.ticket.resource_id,
- description=incident.description,
- incident_type=incident.incident_type.name,
- commander_email=incident.commander.email,
- conversation_weblink=incident.conversation.weblink,
- document_weblink=incident_document.weblink,
- storage_weblink=incident.storage.weblink,
- visibility=incident.visibility,
- conference_weblink=incident.conference.weblink,
- )
+ # we send a message to the incident commander with tips on how to manage the incident
+ send_incident_management_help_tips_message(incident, db_session)
+
+ # Update the participants canvas since a role was assigned
+ try:
+ canvas_flows.update_participants_canvas(incident=incident, db_session=db_session)
+ log.info(
+ f"Updated participants canvas for incident {incident.id} after assigning {assignee_role} to {assignee_email}"
+ )
+ except Exception as e:
+ log.exception(f"Failed to update participants canvas for incident {incident.id}: {e}")
@background_task
-def incident_engage_oncall_flow(user_id: str, user_email: str, incident_id: int, action: dict, db_session=None):
+def incident_engage_oncall_flow(
+ user_email: str,
+ incident_id: int,
+ oncall_service_external_id: str,
+ page=None,
+ organization_slug: str = None,
+ db_session=None,
+):
"""Runs the incident engage oncall flow."""
- oncall_service_id = action["submission"]["oncall_service_id"]
- page = action["submission"]["page"]
-
# we load the incident instance
incident = incident_service.get(db_session=db_session, incident_id=incident_id)
# we resolve the oncall service
- oncall_service = service_service.get_by_external_id(
- db_session=db_session, external_id=oncall_service_id
+ oncall_service = service_service.get_by_external_id_and_project_id(
+ db_session=db_session,
+ external_id=oncall_service_external_id,
+ project_id=incident.project.id,
+ )
+
+ # we get the active oncall plugin
+ oncall_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="oncall"
+ )
+
+ if oncall_plugin:
+ if oncall_plugin.plugin.slug != oncall_service.type:
+ log.warning(
+ f"Unable to engage the oncall. The enabled oncall plugin ({oncall_plugin.plugin.slug}) does not match the service type ({oncall_service.type})."  # noqa
+ )
+ return None, None
+ else:
+ log.warning("Unable to engage the oncall. No oncall plugins enabled.")
+ return None, None
+
+ oncall_email = oncall_plugin.instance.get(service_id=oncall_service_external_id)
+
+ # we attempt to add the oncall to the incident
+ oncall_participant_added = incident_add_or_reactivate_participant_flow(
+ oncall_email, incident.id, service_id=oncall_service.id, db_session=db_session
+ )
+
+ if not oncall_participant_added:
+ # we already have the oncall for the service in the incident
+ return None, oncall_service
+
+ individual = individual_service.get_by_email_and_project(
+ db_session=db_session, email=user_email, project_id=incident.project.id
)
- oncall_plugin = plugins.get(oncall_service.type)
- oncall_email = oncall_plugin.get(service_id=oncall_service_id)
- # we add the oncall to the incident
- incident_add_or_reactivate_participant_flow(oncall_email, incident.id, db_session=db_session)
+ event_service.log_incident_event(
+ db_session=db_session,
+ source=oncall_plugin.plugin.title,
+ description=f"{individual.name} engaged oncall service {oncall_service.name}",
+ incident_id=incident.id,
+ )
if page == "Yes":
# we page the oncall
- oncall_plugin.page(oncall_service_id, incident.name, incident.title, incident.description)
+ oncall_plugin.instance.page(
+ service_id=oncall_service_external_id,
+ incident_name=incident.name,
+ incident_title=incident.title,
+ incident_description=incident.description,
+ )
- log.debug(f"{user_email} has engaged oncall service {oncall_service.name}")
+ event_service.log_incident_event(
+ db_session=db_session,
+ source=oncall_plugin.plugin.title,
+ description=f"{oncall_service.name} on-call paged",
+ incident_id=incident.id,
+ )
+
+ return oncall_participant_added.individual, oncall_service
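The engage-oncall flow above has three distinct return shapes. This minimal sketch models just that contract; `Service`, its fields, and the hard-coded email are illustrative stand-ins, not the real Dispatch models.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Service:
    # Hypothetical stand-in for the oncall service record.
    name: str
    type: str  # which oncall plugin slug this service expects

def engage_oncall(
    plugin_slug: Optional[str],
    service: Service,
    already_engaged: bool,
) -> Tuple[Optional[str], Optional[Service]]:
    """Models the three return shapes of incident_engage_oncall_flow."""
    if plugin_slug is None or plugin_slug != service.type:
        return None, None            # no usable oncall plugin enabled
    if already_engaged:
        return None, service         # oncall already a participant; nothing to do
    return "oncall@example.com", service  # resolved individual plus the service
```

Callers can therefore distinguish "plugin misconfigured" (`(None, None)`) from "already engaged" (`(None, service)`) without an exception.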
@background_task
-def incident_add_or_reactivate_participant_flow(
- user_email: str, incident_id: int, role: ParticipantRoleType = None, db_session=None
+def incident_subscribe_participant_flow(
+ user_email: str,
+ incident_id: int,
+ organization_slug: str,
+ db_session=None,
):
- """Runs the add or reactivate incident participant flow."""
+ """Subscribes a participant to the incident."""
+ # we get the incident
+ incident = incident_service.get(db_session=db_session, incident_id=incident_id)
+
+ # we add the participant to the tactical group
+ group_flows.update_group(
+ subject=incident,
+ group=incident.tactical_group,
+ group_action=GroupAction.add_member,
+ group_member=user_email,
+ db_session=db_session,
+ )
+
+
+@background_task
+def incident_add_or_reactivate_participant_flow(
+ user_email: str,
+ incident_id: int,
+ participant_role: ParticipantRoleType = ParticipantRoleType.observer,
+ service_id: int = 0,
+ event: dict = None,
+ organization_slug: str = None,
+ db_session=None,
+ send_announcement_message: bool = True,
+) -> Participant:
+ """Runs the incident add or reactivate participant flow."""
+ incident = incident_service.get(db_session=db_session, incident_id=incident_id)
+
+ if service_id:
+ # we need to ensure that we don't add another member of a service if one
+ # already exists (e.g. overlapping oncalls, we assume they will hand-off if necessary)
+ participant = participant_service.get_by_incident_id_and_service_id(
+ incident_id=incident_id, service_id=service_id, db_session=db_session
+ )
+
+ if participant:
+ log.info("Skipping resolved participant. Oncall service member already engaged.")
+ return
participant = participant_service.get_by_incident_id_and_email(
- db_session=db_session, incident_id=incident_id, email=user_email
+ db_session=db_session, incident_id=incident.id, email=user_email
)
if participant:
- if participant.is_active:
- log.debug(f"{user_email} is already an active participant.")
- else:
+ if participant.active_roles:
+ return participant
+
+ if incident.status != IncidentStatus.closed:
# we reactivate the participant
- reactivated = participant_flows.reactivate_participant(
- user_email, incident_id, db_session
+ participant_flows.reactivate_participant(
+ user_email, incident, db_session, service_id=service_id
)
-
- if reactivated:
- # we add the participant to the conversation
- add_participant_to_conversation(user_email, incident_id, db_session)
-
- # we announce the participant in the conversation
- send_incident_participant_announcement_message(user_email, incident_id, db_session)
-
- # we send the welcome messages to the participant
- send_incident_welcome_participant_messages(user_email, incident_id, db_session)
else:
# we add the participant to the incident
participant = participant_flows.add_participant(
- user_email, incident_id, db_session, role=role
+ user_email, incident, db_session, service_id=service_id, roles=[participant_role]
)
- if participant:
- # we add the participant to the tactical group
- add_participant_to_tactical_group(user_email, incident_id)
+ # we add the participant to the tactical group
+ group_flows.update_group(
+ subject=incident,
+ group=incident.tactical_group,
+ group_action=GroupAction.add_member,
+ group_member=user_email,
+ db_session=db_session,
+ )
- # we add the participant to the conversation
- add_participant_to_conversation(user_email, incident_id, db_session)
+ if incident.status != IncidentStatus.closed:
+ # we add the participant to the conversation
+ conversation_flows.add_incident_participants_to_conversation(
+ incident=incident, participant_emails=[user_email], db_session=db_session
+ )
- # we announce the participant in the conversation
- send_incident_participant_announcement_message(user_email, incident_id, db_session)
+ # log event for adding the participant
+ try:
+ slack_conversation_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="conversation"
+ )
- # we send the welcome messages to the participant
- send_incident_welcome_participant_messages(user_email, incident_id, db_session)
+ if not slack_conversation_plugin:
+ log.warning(f"{user_email} not added to the conversation. No conversation plugin enabled.")
+ return
+
+ event_service.log_incident_event(
+ db_session=db_session,
+ source=slack_conversation_plugin.plugin.title,
+ description=f"{user_email} added to conversation (channel ID: {incident.conversation.channel_id})",
+ incident_id=incident.id,
+ type=EventType.participant_updated,
+ )
+
+ log.info(
+ f"Added {user_email} to conversation in (channel ID: {incident.conversation.channel_id})"
+ )
+
+ except Exception as e:
+ log.exception(f"Failed to add user to Slack conversation: {e}")
+
+ # we announce the participant in the conversation
+ if send_announcement_message:
+ send_participant_announcement_message(
+ participant_email=user_email,
+ subject=incident,
+ db_session=db_session,
+ )
+
+ # we send the welcome messages to the participant
+ send_incident_welcome_participant_messages(user_email, incident, db_session)
+
+ # Update the participants canvas since a new participant was added
+ try:
+ canvas_flows.update_participants_canvas(incident=incident, db_session=db_session)
+ log.info(
+ f"Updated participants canvas for incident {incident.id} after adding {user_email}"
+ )
+ except Exception as e:
+ log.exception(f"Failed to update participants canvas for incident {incident.id}: {e}")
+
+ return participant
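The early exits in the add-or-reactivate flow above follow a fixed priority order. The sketch below condenses that order into one function, under the assumption that an inactive participant in a closed incident is left untouched; the parameter names are illustrative, not Dispatch APIs.

```python
from typing import Optional

def add_or_reactivate_decision(
    service_member_engaged: bool,
    existing_active: Optional[bool],  # None = not yet a participant
    incident_closed: bool,
) -> str:
    """Condenses the flow's early decisions into a single outcome label."""
    if service_member_engaged:
        return "skip"         # another member of the same oncall service is in
    if existing_active is True:
        return "noop"         # already active; return the participant as-is
    if existing_active is False:
        # Inactive participants are only reactivated while the incident is open.
        return "reactivate" if not incident_closed else "noop"
    return "add"              # brand-new participant
```

The service-overlap check runs first so that overlapping oncall rotations never add a second member for the same service.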
@background_task
-def incident_remove_participant_flow(user_email: str, incident_id: int, db_session=None):
+def incident_remove_participant_flow(
+ user_email: str,
+ incident_id: int,
+ event: dict = None,
+ organization_slug: str = None,
+ db_session=None,
+):
"""Runs the remove participant flow."""
- # we load the incident instance
incident = incident_service.get(db_session=db_session, incident_id=incident_id)
- if user_email == incident.commander.email:
- # we add the incident commander to the conversation again
- add_participant_to_conversation(user_email, incident_id, db_session)
+ if not incident:
+ log.warning(
+ f"Unable to remove participant from incident with id {incident_id}. An incident with this id does not exist."
+ )
+ return
- # we send a notification to the channel
- send_incident_commander_readded_notification(incident_id, db_session)
+ participant = participant_service.get_by_incident_id_and_email(
+ db_session=db_session, incident_id=incident.id, email=user_email
+ )
- log.debug(
- f"Incident Commander {incident.commander.name} has been re-added to the incident conversation."
+ for task in incident.tasks:
+ if task.status == TaskStatus.open:
+ for assignee in task.assignees:
+ if assignee == participant:
+ # we add the participant to the conversation
+ conversation_flows.add_incident_participants_to_conversation(
+ incident=incident, participant_emails=[user_email], db_session=db_session
+ )
+
+ # we ask the participant to resolve or re-assign
+ # their tasks before leaving the incident conversation
+ send_incident_open_tasks_ephemeral_message(user_email, incident, db_session)
+
+ return
+
+ if (
+ incident.status != IncidentStatus.closed
+ and user_email == incident.commander.individual.email
+ ):
+ # we add the participant to the conversation
+ conversation_flows.add_incident_participants_to_conversation(
+ incident=incident, participant_emails=[user_email], db_session=db_session
)
- else:
- # we remove the participant from the incident
- participant_flows.remove_participant(user_email, incident_id, db_session)
+ # we send a notification to the channel
+ send_incident_commander_readded_notification(incident, db_session)
-@background_task
-def incident_list_resources_flow(incident_id: int, command: Optional[dict] = None, db_session=None):
- """Runs the list incident resources flow."""
- # we send the list of resources to the participant
- send_incident_resources_ephemeral_message_to_participant(
- command["user_id"], incident_id, db_session
+ return
+
+ # we remove the participant from the incident
+ participant_flows.remove_participant(user_email, incident, db_session)
+
+ # we remove the participant from the tactical group
+ group_flows.update_group(
+ subject=incident,
+ group=incident.tactical_group,
+ group_action=GroupAction.remove_member,
+ group_member=user_email,
+ db_session=db_session,
)
+
+ # Update the participants canvas since a participant was removed
+ try:
+ canvas_flows.update_participants_canvas(incident=incident, db_session=db_session)
+ log.info(
+ f"Updated participants canvas for incident {incident.id} after removing {user_email}"
+ )
+ except Exception as e:
+ log.exception(f"Failed to update participants canvas for incident {incident.id}: {e}")
+
+ # we also try to remove the user from the Slack conversation
+ slack_conversation_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="conversation"
+ )
+
+ if not slack_conversation_plugin:
+ log.warning(f"{user_email} not updated. No conversation plugin enabled.")
+ return
+
+ if not incident.conversation:
+ log.warning("No conversation enabled for this incident.")
+ return
+
+ try:
+ slack_conversation_plugin.instance.remove_user(
+ conversation_id=incident.conversation.channel_id, user_email=user_email
+ )
+
+ event_service.log_incident_event(
+ db_session=db_session,
+ source=slack_conversation_plugin.plugin.title,
+ description=f"{user_email} removed from conversation (channel ID: {incident.conversation.channel_id})",
+ incident_id=incident.id,
+ type=EventType.participant_updated,
+ )
+
+ log.info(
+ f"Removed {user_email} from conversation in channel {incident.conversation.channel_id}"
+ )
+
+ except Exception as e:
+ log.exception(f"Failed to remove user from Slack conversation: {e}")
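The flow above repeatedly applies the same defensive pattern: resolve the active plugin, bail out early with a warning when it is missing, and wrap the external call in `try`/`except` so a provider failure cannot abort the rest of the flow. A minimal standalone sketch of that pattern follows; the `StubConversationPlugin` class and `remove_user_from_conversation` helper are illustrative stand-ins, not Dispatch's real models or services.

```python
# Sketch of the guarded plugin-call pattern used in the participant-removal flow.
# Assumption: plugin objects expose a remove_user(conversation_id, user_email) method.
import logging

log = logging.getLogger(__name__)


class StubConversationPlugin:
    """Stand-in for a conversation plugin instance (e.g. Slack)."""

    def remove_user(self, conversation_id: str, user_email: str) -> None:
        # A real plugin would call the provider API here and may raise.
        if not conversation_id:
            raise ValueError("missing conversation id")


def remove_user_from_conversation(plugin, conversation_id: str, user_email: str) -> bool:
    """Returns True only when the removal call succeeded."""
    if plugin is None:
        # No plugin enabled: warn and skip, mirroring the flow's early returns.
        log.warning("%s not removed. No conversation plugin enabled.", user_email)
        return False
    try:
        plugin.remove_user(conversation_id=conversation_id, user_email=user_email)
        return True
    except Exception:
        # Swallow provider errors so the surrounding flow can continue.
        log.exception("Failed to remove %s from conversation", user_email)
        return False
```

The boolean return makes the outcome observable to callers without re-raising, which matches the flow's choice to log and continue rather than fail the whole removal.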
diff --git a/src/dispatch/incident/messaging.py b/src/dispatch/incident/messaging.py
index ef2358c6a094..1608d2668548 100644
--- a/src/dispatch/incident/messaging.py
+++ b/src/dispatch/incident/messaging.py
@@ -4,374 +4,578 @@
:copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
:license: Apache, see LICENSE for more details.
"""
+
import logging
-from dispatch.config import (
- INCIDENT_NOTIFICATION_CONVERSATIONS,
- INCIDENT_NOTIFICATION_DISTRIBUTION_LISTS,
- INCIDENT_PLUGIN_CONTACT_SLUG,
- INCIDENT_PLUGIN_CONVERSATION_SLUG,
- INCIDENT_PLUGIN_EMAIL_SLUG,
- INCIDENT_RESOURCE_CONVERSATION_COMMANDS_REFERENCE_DOCUMENT,
- INCIDENT_RESOURCE_FAQ_DOCUMENT,
- INCIDENT_RESOURCE_INVESTIGATION_DOCUMENT,
-)
-from dispatch.database import SessionLocal
-from dispatch.enums import Visibility
-from dispatch.messaging import (
+from slack_sdk.errors import SlackApiError
+from sqlalchemy.orm import Session
+
+from dispatch.config import DISPATCH_UI_URL
+from dispatch.conversation.enums import ConversationCommands
+from dispatch.database.core import resolve_attr
+from dispatch.decorators import timer
+from dispatch.document import service as document_service
+from dispatch.email_templates import service as email_template_service
+from dispatch.email_templates.enums import EmailTemplateTypes
+from dispatch.email_templates.models import EmailTemplates
+from dispatch.enums import SubjectNames
+from dispatch.event import service as event_service
+from dispatch.forms.models import Forms
+from dispatch.incident.enums import IncidentStatus
+from dispatch.incident.models import Incident, IncidentRead
+from dispatch.messaging.strings import (
+ INCIDENT_CLOSE_REMINDER,
+ INCIDENT_CLOSED_INFORMATION_REVIEW_REMINDER_NOTIFICATION,
+ INCIDENT_CLOSED_RATING_FEEDBACK_NOTIFICATION,
+ INCIDENT_COMMANDER,
INCIDENT_COMMANDER_READDED_NOTIFICATION,
+ INCIDENT_COMPLETED_FORM_MESSAGE,
+ INCIDENT_MANAGEMENT_HELP_TIPS_MESSAGE,
+ INCIDENT_NAME,
+ INCIDENT_NAME_WITH_ENGAGEMENT,
+ INCIDENT_NAME_WITH_ENGAGEMENT_NO_SELF_JOIN,
INCIDENT_NEW_ROLE_NOTIFICATION,
INCIDENT_NOTIFICATION,
INCIDENT_NOTIFICATION_COMMON,
+ INCIDENT_OPEN_TASKS,
INCIDENT_PRIORITY_CHANGE,
+ INCIDENT_REVIEW_DOCUMENT,
+ INCIDENT_SEVERITY_CHANGE,
INCIDENT_STATUS_CHANGE,
+ INCIDENT_TASK_ADD_TO_INCIDENT,
INCIDENT_TYPE_CHANGE,
- INCIDENT_PARTICIPANT_WELCOME_MESSAGE,
- INCIDENT_RESOURCES_MESSAGE,
- INCIDENT_REVIEW_DOCUMENT_NOTIFICATION,
- INCIDENT_STATUS_REPORT_REMINDER,
- INCIDENT_COMMANDER,
MessageType,
+ generate_welcome_message,
)
-
-from dispatch.conversation.enums import ConversationCommands
-from dispatch.document.service import get_by_incident_id_and_resource_type as get_document
-from dispatch.incident import service as incident_service
-from dispatch.incident.models import Incident, IncidentRead
+from dispatch.notification import service as notification_service
from dispatch.participant import service as participant_service
from dispatch.participant_role import service as participant_role_service
-from dispatch.plugins.base import plugins
-
+from dispatch.plugin import service as plugin_service
+from dispatch.plugins.dispatch_slack.enums import SlackAPIErrorCode
+from dispatch.task.models import TaskCreate
+from dispatch.types import Subject
log = logging.getLogger(__name__)
-def send_incident_status_report_reminder(incident: Incident):
- """Sends the incident commander a direct message indicating that they should complete a status report."""
- convo_plugin = plugins.get(INCIDENT_PLUGIN_CONVERSATION_SLUG)
- status_report_command = convo_plugin.get_command_name(ConversationCommands.status_report)
+def get_suggested_documents(db_session, incident: Incident) -> list:
+ """Get additional incident documents based on priority, type, and description."""
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="document-resolver"
+ )
- items = [
- {
- "name": incident.name,
- "ticket_weblink": incident.ticket.weblink,
- "title": incident.title,
- "command": status_report_command,
- }
- ]
+ documents = []
+ if plugin:
+ matches = plugin.instance.get(incident=incident, db_session=db_session)
- convo_plugin.send_direct(
- incident.commander.email,
- "Incident Status Report Reminder",
- INCIDENT_STATUS_REPORT_REMINDER,
- MessageType.incident_status_report,
- items=items,
- )
+ for m in matches:
+ document = document_service.get(
+ db_session=db_session, document_id=m.resource_state["id"]
+ )
+ documents.append(document)
+
+ return documents
def send_welcome_ephemeral_message_to_participant(
- participant_email: str, incident_id: int, db_session: SessionLocal
+ *,
+ participant_email: str,
+ incident: Incident,
+ db_session: Session,
+ welcome_template: EmailTemplates | None = None,
):
- """Sends an ephemeral message to the participant."""
- # we load the incident instance
- incident = incident_service.get(db_session=db_session, incident_id=incident_id)
+ """Sends an ephemeral welcome message to the participant."""
+ if not incident.conversation:
+ log.warning(
+ "Incident participant welcome message not sent. No conversation available for this incident."
+ )
+ return
- # we get the incident documents
- incident_document = get_document(
- db_session=db_session,
- incident_id=incident_id,
- resource_type=INCIDENT_RESOURCE_INVESTIGATION_DOCUMENT,
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="conversation"
)
+ if not plugin:
+ log.warning(
+ "Incident participant welcome message not sent. No conversation plugin enabled."
+ )
+ return
- incident_faq = get_document(
- db_session=db_session, incident_id=incident_id, resource_type=INCIDENT_RESOURCE_FAQ_DOCUMENT
+ incident_description = (
+ incident.description
+ if len(incident.description) <= 500
+ else f"{incident.description[:500]}..."
)
- incident_conversation_commands_reference_document = get_document(
- db_session=db_session,
- incident_id=incident_id,
- resource_type=INCIDENT_RESOURCE_CONVERSATION_COMMANDS_REFERENCE_DOCUMENT,
+ # we send the ephemeral message
+ message_kwargs = {
+ "name": incident.name,
+ "title": incident.title,
+ "description": incident_description,
+ "visibility": incident.visibility,
+ "status": incident.status,
+ "type": incident.incident_type.name,
+ "type_description": incident.incident_type.description,
+ "severity": incident.incident_severity.name,
+ "severity_description": incident.incident_severity.description,
+ "priority": incident.incident_priority.name,
+ "priority_description": incident.incident_priority.description,
+ "commander_fullname": incident.commander.individual.name,
+ "commander_team": incident.commander.team,
+ "commander_weblink": incident.commander.individual.weblink,
+ "reporter_fullname": incident.reporter.individual.name,
+ "reporter_team": incident.reporter.team,
+ "reporter_weblink": incident.reporter.individual.weblink,
+ "document_weblink": resolve_attr(incident, "incident_document.weblink"),
+ "storage_weblink": resolve_attr(incident, "storage.weblink"),
+ "ticket_weblink": resolve_attr(incident, "ticket.weblink"),
+ "conference_weblink": resolve_attr(incident, "conference.weblink"),
+ "conference_challenge": resolve_attr(incident, "conference.conference_challenge"),
+ }
+
+ faq_doc = document_service.get_incident_faq_document(
+ db_session=db_session, project_id=incident.project_id
)
+ if faq_doc:
+ message_kwargs.update({"faq_weblink": faq_doc.weblink})
- # we send the ephemeral message
- convo_plugin = plugins.get(INCIDENT_PLUGIN_CONVERSATION_SLUG)
- convo_plugin.send_ephemeral(
+ conversation_reference = document_service.get_conversation_reference_document(
+ db_session=db_session, project_id=incident.project_id
+ )
+ if conversation_reference:
+ message_kwargs.update(
+ {"conversation_commands_reference_document_weblink": conversation_reference.weblink}
+ )
+
+ plugin.instance.send_ephemeral(
incident.conversation.channel_id,
participant_email,
"Incident Welcome Message",
- INCIDENT_PARTICIPANT_WELCOME_MESSAGE,
+ generate_welcome_message(welcome_template),
MessageType.incident_participant_welcome,
- name=incident.name,
- title=incident.title,
- status=incident.status,
- priority=incident.incident_priority.name,
- commander_fullname=incident.commander.name,
- commander_weblink=incident.commander.weblink,
- document_weblink=incident_document.weblink,
- storage_weblink=incident.storage.weblink,
- ticket_weblink=incident.ticket.weblink,
- faq_weblink=incident_faq.weblink,
- conference_weblink=incident.conference.weblink,
- conference_challenge=incident.conference.conference_challenge,
- conversation_commands_reference_document_weblink=incident_conversation_commands_reference_document.weblink,
+ **message_kwargs,
)
log.debug(f"Welcome ephemeral message sent to {participant_email}.")
def send_welcome_email_to_participant(
- participant_email: str, incident_id: int, db_session: SessionLocal
+ *,
+ participant_email: str,
+ incident: Incident,
+ db_session: Session,
+ welcome_template: EmailTemplates | None = None,
):
"""Sends a welcome email to the participant."""
# we load the incident instance
- incident = incident_service.get(db_session=db_session, incident_id=incident_id)
-
- # we get the incident documents
- incident_document = get_document(
- db_session=db_session,
- incident_id=incident_id,
- resource_type=INCIDENT_RESOURCE_INVESTIGATION_DOCUMENT,
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="email"
)
+ if not plugin:
+ log.warning("Participant welcome email not sent. No email plugin configured.")
+ return
- incident_faq = get_document(
- db_session=db_session, incident_id=incident_id, resource_type=INCIDENT_RESOURCE_FAQ_DOCUMENT
+ incident_description = (
+ incident.description
+ if len(incident.description) <= 500
+ else f"{incident.description[:500]}..."
)
- incident_conversation_commands_reference_document = get_document(
- db_session=db_session,
- incident_id=incident_id,
- resource_type=INCIDENT_RESOURCE_CONVERSATION_COMMANDS_REFERENCE_DOCUMENT,
+ message_kwargs = {
+ "name": incident.name,
+ "title": incident.title,
+ "description": incident_description,
+ "visibility": incident.visibility,
+ "status": incident.status,
+ "type": incident.incident_type.name,
+ "type_description": incident.incident_type.description,
+ "severity": incident.incident_severity.name,
+ "severity_description": incident.incident_severity.description,
+ "priority": incident.incident_priority.name,
+ "priority_description": incident.incident_priority.description,
+ "commander_fullname": incident.commander.individual.name,
+ "commander_team": incident.commander.team,
+ "commander_weblink": incident.commander.individual.weblink,
+ "reporter_fullname": incident.reporter.individual.name,
+ "reporter_team": incident.reporter.team,
+ "reporter_weblink": incident.reporter.individual.weblink,
+ "document_weblink": resolve_attr(incident, "incident_document.weblink"),
+ "storage_weblink": resolve_attr(incident, "storage.weblink"),
+ "ticket_weblink": resolve_attr(incident, "ticket.weblink"),
+ "conference_weblink": resolve_attr(incident, "conference.weblink"),
+ "conference_challenge": resolve_attr(incident, "conference.conference_challenge"),
+ "contact_fullname": incident.commander.individual.name,
+ "contact_weblink": incident.commander.individual.weblink,
+ }
+
+ faq_doc = document_service.get_incident_faq_document(
+ db_session=db_session, project_id=incident.project_id
)
+ if faq_doc:
+ message_kwargs.update({"faq_weblink": faq_doc.weblink})
- email_plugin = plugins.get(INCIDENT_PLUGIN_EMAIL_SLUG)
- email_plugin.send(
- participant_email,
- INCIDENT_PARTICIPANT_WELCOME_MESSAGE,
- MessageType.incident_participant_welcome,
- name=incident.name,
- title=incident.title,
- status=incident.status,
- priority=incident.incident_priority.name,
- commander_fullname=incident.commander.name,
- commander_weblink=incident.commander.weblink,
- document_weblink=incident_document.weblink,
- storage_weblink=incident.storage.weblink,
- ticket_weblink=incident.ticket.weblink,
- faq_weblink=incident_faq.weblink,
- conference_weblink=incident.conference.weblink,
- conference_challenge=incident.conference.conference_challenge,
- conversation_commands_reference_document_weblink=incident_conversation_commands_reference_document.weblink,
+ conversation_reference = document_service.get_conversation_reference_document(
+ db_session=db_session, project_id=incident.project_id
)
+ if conversation_reference:
+ message_kwargs.update(
+ {"conversation_commands_reference_document_weblink": conversation_reference.weblink}
+ )
+
+ notification_text = "Incident Notification"
+
+ # Sending may raise tenacity.RetryError; the email may still go through.
+ try:
+ plugin.instance.send(
+ participant_email,
+ notification_text,
+ generate_welcome_message(welcome_template),
+ MessageType.incident_participant_welcome,
+ **message_kwargs,
+ )
+ except Exception as e:
+ log.error(f"Error in sending welcome email to {participant_email}: {e}")
log.debug(f"Welcome email sent to {participant_email}.")
+def send_completed_form_email(participant_email: str, form: Forms, db_session: Session):
+ """Sends an email to notify about a completed incident form."""
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=form.project.id, plugin_type="email"
+ )
+ if not plugin:
+ log.warning("Completed form notification email not sent. No email plugin configured.")
+ return
+
+ incident_description = (
+ form.incident.description
+ if len(form.incident.description) <= 500
+ else f"{form.incident.description[:500]}..."
+ )
+
+ message_kwargs = {
+ "name": form.incident.name,
+ "title": form.incident.title,
+ "description": incident_description,
+ "status": form.incident.status,
+ "commander_fullname": form.incident.commander.individual.name,
+ "commander_team": form.incident.commander.team,
+ "commander_weblink": form.incident.commander.individual.weblink,
+ "contact_fullname": form.incident.commander.individual.name,
+ "contact_weblink": form.incident.commander.individual.weblink,
+ "form_type": form.form_type.name,
+ "form_type_description": form.form_type.description,
+ "form_weblink": f"{DISPATCH_UI_URL}/{form.project.organization.name}/forms/{form.id}",
+ }
+
+ notification_text = "Incident Form Completed Notification"
+
+ # Sending may raise tenacity.RetryError; the email may still go through.
+ try:
+ plugin.instance.send(
+ participant_email,
+ notification_text,
+ INCIDENT_COMPLETED_FORM_MESSAGE,
+ MessageType.incident_completed_form_notification,
+ **message_kwargs,
+ )
+ except Exception as e:
+ log.error(f"Error in sending completed form notification email to {participant_email}: {e}")
+
+ log.debug(f"Completed form notification email sent to {participant_email}.")
+
+
+@timer
def send_incident_welcome_participant_messages(
- participant_email: str, incident_id: int, db_session: SessionLocal
+ participant_email: str,
+ incident: Incident,
+ db_session: Session,
):
"""Sends welcome messages to the participant."""
+ # check to see if there is an override welcome message template
+ welcome_template = email_template_service.get_by_type(
+ db_session=db_session,
+ project_id=incident.project_id,
+ email_template_type=EmailTemplateTypes.incident_welcome,
+ )
+
# we send the welcome ephemeral message
- send_welcome_ephemeral_message_to_participant(participant_email, incident_id, db_session)
+ send_welcome_ephemeral_message_to_participant(
+ participant_email=participant_email,
+ incident=incident,
+ db_session=db_session,
+ welcome_template=welcome_template,
+ )
# we send the welcome email
- send_welcome_email_to_participant(participant_email, incident_id, db_session)
+ send_welcome_email_to_participant(
+ participant_email=participant_email,
+ incident=incident,
+ db_session=db_session,
+ welcome_template=welcome_template,
+ )
log.debug(f"Welcome participant messages sent {participant_email}.")
-def send_incident_status_notifications(incident: Incident, db_session: SessionLocal):
- """Sends incident status notifications to conversations and distribution lists."""
- notification_text = "Incident Notification"
- notification_type = MessageType.incident_notification
- message_template = INCIDENT_NOTIFICATION
+def send_incident_created_notifications(incident: Incident, db_session: Session):
+ """Sends incident created notifications."""
+ notification_template = INCIDENT_NOTIFICATION.copy()
- # we get the incident documents
- incident_document = get_document(
- db_session=db_session,
- incident_id=incident.id,
- resource_type=INCIDENT_RESOURCE_INVESTIGATION_DOCUMENT,
- )
+ if incident.status != IncidentStatus.closed:
+ if incident.project.allow_self_join:
+ notification_template.insert(0, INCIDENT_NAME_WITH_ENGAGEMENT)
+ else:
+ notification_template.insert(0, INCIDENT_NAME_WITH_ENGAGEMENT_NO_SELF_JOIN)
+ else:
+ notification_template.insert(0, INCIDENT_NAME)
- incident_faq = get_document(
- db_session=db_session, incident_id=incident.id, resource_type=INCIDENT_RESOURCE_FAQ_DOCUMENT
+ incident_description = (
+ incident.description
+ if len(incident.description) <= 500
+ else f"{incident.description[:500]}..."
)
- # we send status notifications to conversations
- convo_plugin = plugins.get(INCIDENT_PLUGIN_CONVERSATION_SLUG)
- for conversation in INCIDENT_NOTIFICATION_CONVERSATIONS:
- convo_plugin.send(
- conversation,
- notification_text,
- message_template,
- notification_type,
- name=incident.name,
- title=incident.title,
- status=incident.status,
- priority=incident.incident_priority.name,
- commander_fullname=incident.commander.name,
- commander_weblink=incident.commander.weblink,
- document_weblink=incident_document.weblink,
- storage_weblink=incident.storage.weblink,
- ticket_weblink=incident.ticket.weblink,
- faq_weblink=incident_faq.weblink,
- incident_id=incident.id,
- )
-
- # we send status notifications to distribution lists
- email_plugin = plugins.get(INCIDENT_PLUGIN_EMAIL_SLUG)
- for distro in INCIDENT_NOTIFICATION_DISTRIBUTION_LISTS:
- email_plugin.send(
- distro,
- message_template,
- notification_type,
- name=incident.name,
- title=incident.title,
- status=incident.status,
- priority=incident.incident_priority.name,
- commander_fullname=incident.commander.name,
- commander_weblink=incident.commander.weblink,
- document_weblink=incident_document.weblink,
- storage_weblink=incident.storage.weblink,
- ticket_weblink=incident.ticket.weblink,
- faq_weblink=incident_faq.weblink,
- incident_id=incident.id,
- )
+ notification_kwargs = {
+ "name": incident.name,
+ "title": incident.title,
+ "description": incident_description,
+ "visibility": incident.visibility,
+ "status": incident.status,
+ "type": incident.incident_type.name,
+ "type_description": incident.incident_type.description,
+ "severity": incident.incident_severity.name,
+ "severity_description": incident.incident_severity.description,
+ "priority": incident.incident_priority.name,
+ "priority_description": incident.incident_priority.description,
+ "reporter_fullname": incident.reporter.individual.name,
+ "reporter_team": incident.reporter.team,
+ "reporter_weblink": incident.reporter.individual.weblink,
+ "commander_fullname": incident.commander.individual.name,
+ "commander_team": incident.commander.team,
+ "commander_weblink": incident.commander.individual.weblink,
+ "document_weblink": resolve_attr(incident, "incident_document.weblink"),
+ "storage_weblink": resolve_attr(incident, "storage.weblink"),
+ "ticket_weblink": resolve_attr(incident, "ticket.weblink"),
+ "conference_weblink": resolve_attr(incident, "conference.weblink"),
+ "conference_challenge": resolve_attr(incident, "conference.conference_challenge"),
+ "contact_fullname": incident.commander.individual.name,
+ "contact_weblink": incident.commander.individual.weblink,
+ "incident_id": incident.id,
+ "organization_slug": incident.project.organization.slug,
+ }
+
+ faq_doc = document_service.get_incident_faq_document(
+ db_session=db_session, project_id=incident.project_id
+ )
+ if faq_doc:
+ notification_kwargs.update({"faq_weblink": faq_doc.weblink})
- log.debug(f"Incident status notifications sent.")
+ notification_params = {
+ "text": "Incident Notification",
+ "type": MessageType.incident_notification,
+ "template": notification_template,
+ "kwargs": notification_kwargs,
+ }
+ notification_service.filter_and_send(
+ db_session=db_session,
+ project_id=incident.project.id,
+ class_instance=incident,
+ notification_params=notification_params,
+ )
-def send_incident_notifications(incident: Incident, db_session: SessionLocal):
- """Sends all incident notifications."""
- # we send the incident status notifications
- send_incident_status_notifications(incident, db_session)
+ event_service.log_incident_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description="Incident notifications sent",
+ incident_id=incident.id,
+ )
- log.debug(f"Incident notifications sent.")
+ log.debug("Incident created notifications sent.")
-def send_incident_update_notifications(incident: Incident, previous_incident: IncidentRead):
+def send_incident_update_notifications(
+ incident: Incident, previous_incident: IncidentRead, db_session: Session
+):
"""Sends notifications about incident changes."""
notification_text = "Incident Notification"
notification_type = MessageType.incident_notification
notification_template = INCIDENT_NOTIFICATION_COMMON.copy()
change = False
- if previous_incident.incident_priority.name != incident.incident_priority.name:
- change = True
- notification_template.append(INCIDENT_PRIORITY_CHANGE)
-
if previous_incident.status != incident.status:
change = True
notification_template.append(INCIDENT_STATUS_CHANGE)
+ if previous_incident.incident_type.name != incident.incident_type.name:
+ notification_template.append(INCIDENT_TYPE_CHANGE)
- if previous_incident.incident_type.name != incident.incident_type.name:
- change = True
- notification_template.append(INCIDENT_TYPE_CHANGE)
+ if previous_incident.incident_severity.name != incident.incident_severity.name:
+ notification_template.append(INCIDENT_SEVERITY_CHANGE)
+
+ if previous_incident.incident_priority.name != incident.incident_priority.name:
+ notification_template.append(INCIDENT_PRIORITY_CHANGE)
+ else:
+ if incident.status != IncidentStatus.closed:
+ if previous_incident.incident_type.name != incident.incident_type.name:
+ change = True
+ notification_template.append(INCIDENT_TYPE_CHANGE)
+
+ if previous_incident.incident_severity.name != incident.incident_severity.name:
+ change = True
+ notification_template.append(INCIDENT_SEVERITY_CHANGE)
+
+ if previous_incident.incident_priority.name != incident.incident_priority.name:
+ change = True
+ notification_template.append(INCIDENT_PRIORITY_CHANGE)
if not change:
- # we don't need to notify
- log.debug(f"Incident change notifications not sent.")
+ # we don't need to send notifications
+ log.debug("Incident updated notifications not sent.")
return
notification_template.append(INCIDENT_COMMANDER)
- convo_plugin = plugins.get(INCIDENT_PLUGIN_CONVERSATION_SLUG)
+ # we send an update to the incident conversation if the incident is active or stable
+ if incident.status != IncidentStatus.closed:
+ incident_conversation_notification_template = notification_template.copy()
+ incident_conversation_notification_template.insert(0, INCIDENT_NAME)
- # we send an update to the incident conversation
- convo_plugin.send(
- incident.conversation.channel_id,
- notification_text,
- notification_template,
- notification_type,
- name=incident.name,
- ticket_weblink=incident.ticket.weblink,
- title=incident.title,
- incident_type_old=previous_incident.incident_type.name,
- incident_type_new=incident.incident_type.name,
- incident_priority_old=previous_incident.incident_priority.name,
- incident_priority_new=incident.incident_priority.name,
- incident_status_old=previous_incident.status.value,
- incident_status_new=incident.status,
- commander_fullname=incident.commander.name,
- commander_weblink=incident.commander.weblink,
- )
-
- if incident.visibility == Visibility.open:
- # we send an update to the incident notification conversations
- for conversation in INCIDENT_NOTIFICATION_CONVERSATIONS:
- convo_plugin.send(
- conversation,
+ convo_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="conversation"
+ )
+ if convo_plugin:
+ convo_plugin.instance.send(
+ incident.conversation.channel_id,
notification_text,
- notification_template,
+ incident_conversation_notification_template,
notification_type,
+ commander_fullname=incident.commander.individual.name,
+ commander_team=incident.commander.team,
+ commander_weblink=incident.commander.individual.weblink,
+ incident_priority_new=incident.incident_priority,
+ incident_priority_old=previous_incident.incident_priority,
+ incident_severity_new=incident.incident_severity,
+ incident_severity_old=previous_incident.incident_severity,
+ incident_status_new=incident.status,
+ incident_status_old=previous_incident.status,
+ incident_type_new=incident.incident_type.name,
+ incident_type_old=previous_incident.incident_type.name,
name=incident.name,
ticket_weblink=incident.ticket.weblink,
title=incident.title,
- incident_id=incident.id,
- incident_type_old=previous_incident.incident_type.name,
- incident_type_new=incident.incident_type.name,
- incident_priority_old=previous_incident.incident_priority.name,
- incident_priority_new=incident.incident_priority.name,
- incident_status_old=previous_incident.status.value,
- incident_status_new=incident.status,
- commander_fullname=incident.commander.name,
- commander_weblink=incident.commander.weblink,
)
-
- # we send an update to the incident notification distribution lists
- email_plugin = plugins.get(INCIDENT_PLUGIN_EMAIL_SLUG)
- for distro in INCIDENT_NOTIFICATION_DISTRIBUTION_LISTS:
- email_plugin.send(
- distro,
- notification_template,
- notification_type,
- name=incident.name,
- title=incident.title,
- status=incident.status,
- priority=incident.incident_priority.name,
- commander_fullname=incident.commander.name,
- commander_weblink=incident.commander.weblink,
- document_weblink=incident.incident_document.weblink,
- storage_weblink=incident.storage.weblink,
- ticket_weblink=incident.ticket.weblink,
- faq_weblink=incident.incident_faq.weblink,
- incident_id=incident.id,
- incident_priority_old=previous_incident.incident_priority.name,
- incident_priority_new=incident.incident_priority.name,
- incident_type_old=previous_incident.incident_type.name,
- incident_type_new=incident.incident_type.name,
- incident_status_old=previous_incident.status.value,
- incident_status_new=incident.status,
+ else:
+ log.debug(
+ "Incident updated notification not sent to incident conversation. No conversation plugin enabled." # noqa
)
- log.debug(f"Incident update notifications sent.")
+ # we send a notification to the notification conversations and emails
+ fyi_notification_template = notification_template.copy()
+ if incident.status != IncidentStatus.closed:
+ if incident.project.allow_self_join:
+ fyi_notification_template.insert(0, INCIDENT_NAME_WITH_ENGAGEMENT)
+ else:
+ fyi_notification_template.insert(0, INCIDENT_NAME_WITH_ENGAGEMENT_NO_SELF_JOIN)
+ else:
+ fyi_notification_template.insert(0, INCIDENT_NAME)
+
+ notification_kwargs = {
+ "commander_fullname": incident.commander.individual.name,
+ "commander_team": incident.commander.team,
+ "commander_weblink": incident.commander.individual.weblink,
+ "contact_fullname": incident.commander.individual.name,
+ "contact_weblink": incident.commander.individual.weblink,
+ "incident_id": incident.id,
+ "incident_priority_new": incident.incident_priority,
+ "incident_priority_old": previous_incident.incident_priority,
+ "incident_severity_new": incident.incident_severity,
+ "incident_severity_old": previous_incident.incident_severity,
+ "incident_status_new": incident.status,
+ "incident_status_old": previous_incident.status,
+ "incident_type_new": incident.incident_type.name,
+ "incident_type_old": previous_incident.incident_type.name,
+ "name": incident.name,
+ "organization_slug": incident.project.organization.slug,
+ "ticket_weblink": resolve_attr(incident, "ticket.weblink"),
+ "title": incident.title,
+ }
+
+ notification_params = {
+ "text": notification_text,
+ "type": notification_type,
+ "template": fyi_notification_template,
+ "kwargs": notification_kwargs,
+ }
+
+ notification_service.filter_and_send(
+ db_session=db_session,
+ project_id=incident.project.id,
+ class_instance=incident,
+ notification_params=notification_params,
+ )
+
+ log.debug("Incident updated notifications sent.")
-def send_incident_participant_announcement_message(
- participant_email: str, incident_id: int, db_session=SessionLocal
+@timer
+def send_participant_announcement_message(
+ participant_email: str,
+ subject: Subject,
+ db_session: Session,
):
"""Announces a participant in the conversation."""
- notification_text = "New Incident Participant"
- notification_type = MessageType.incident_notification
- notification_template = []
+ subject_type = type(subject).__name__
- # we load the incident instance
- incident = incident_service.get(db_session=db_session, incident_id=incident_id)
+ if not subject.conversation:
+ log.warning(
+ f"{subject_type} participant announcement message not sent. No conversation available for this {subject_type.lower()}."
+ )
+ return
- participant = participant_service.get_by_incident_id_and_email(
- db_session=db_session, incident_id=incident_id, email=participant_email
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=subject.project.id, plugin_type="conversation"
)
+ if not plugin:
+ log.warning(
+ "Incident participant announcement message not sent. No conversation plugin enabled."
+ )
+ return
- contact_plugin = plugins.get(INCIDENT_PLUGIN_CONTACT_SLUG)
- participant_info = contact_plugin.get(participant_email)
+ notification_text = f"New {subject_type} Participant"
+ notification_type = MessageType.incident_notification
+ notification_template = []
+
+ match subject_type:
+ case SubjectNames.CASE:
+ participant = participant_service.get_by_case_id_and_email(
+ db_session=db_session,
+ case_id=subject.id,
+ email=participant_email,
+ )
+ case SubjectNames.INCIDENT:
+ participant = participant_service.get_by_incident_id_and_email(
+ db_session=db_session,
+ incident_id=subject.id,
+ email=participant_email,
+ )
+ case _:
+ raise Exception(
+ "Unknown subject was passed to send_participant_announcement_message",
+ )
- participant_name = participant_info["fullname"]
- participant_team = participant_info["team"]
- participant_department = participant_info["department"]
- participant_location = participant_info["location"]
- participant_weblink = participant_info["weblink"]
+ participant_info = {}
+ contact_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=subject.project.id, plugin_type="contact"
+ )
+ if contact_plugin:
+ participant_info = contact_plugin.instance.get(participant_email, db_session=db_session)
- convo_plugin = plugins.get(INCIDENT_PLUGIN_CONVERSATION_SLUG)
- participant_avatar_url = convo_plugin.get_participant_avatar_url(participant_email)
+ participant_name = participant_info.get("fullname", "Unknown")
+ participant_team = participant_info.get("team", "Unknown")
+ participant_department = participant_info.get("department", "Unknown")
+ participant_location = participant_info.get("location", "Unknown")
+ participant_weblink = participant_info.get("weblink", DISPATCH_UI_URL)
participant_active_roles = participant_role_service.get_all_active_roles(
db_session=db_session, participant_id=participant.id
@@ -380,66 +584,249 @@ def send_incident_participant_announcement_message(
for role in participant_active_roles:
participant_roles.append(role.role)
+ participant_avatar_url = None
+ try:
+ participant_avatar_url = plugin.instance.get_participant_avatar_url(participant_email)
+ except SlackApiError as e:
+ if e.response["error"] == SlackAPIErrorCode.USERS_NOT_FOUND:
+ log.warning(f"Unable to fetch participant avatar for {participant_email}: {e}")
+
+ participant_name_mrkdwn = participant_name
+ if participant_weblink:
+ participant_name_mrkdwn = f"<{participant_weblink}|{participant_name}>"
+
blocks = [
- {"type": "section", "text": {"type": "mrkdwn", "text": f"*{notification_text}*"}},
{
+ "type": "section",
+ "text": {"type": "mrkdwn", "text": f"*{notification_text}*"},
+ },
+ {
+ "type": "section",
+ "text": {
+ "type": "mrkdwn",
+ "text": (
+ f"*Name:* {participant_name_mrkdwn}\n"
+ f"*Team*: {participant_team}, {participant_department}\n"
+ f"*Location*: {participant_location}\n"
+ f"*{subject_type} Role(s)*: {(', ').join(participant_roles)}\n"
+ ),
+ },
+ },
+ ]
+
+ if participant_avatar_url:
+ blocks[1]["accessory"] = {
+ "type": "image",
+ "image_url": participant_avatar_url,
+ "alt_text": participant_name,
+ }
+
+ try:
+ if subject_type == SubjectNames.CASE and subject.has_thread:
+ plugin.instance.send(
+ subject.conversation.channel_id,
+ notification_text,
+ notification_template,
+ notification_type,
+ blocks=blocks,
+ ts=subject.conversation.thread_id,
+ )
+ else:
+ plugin.instance.send(
+ subject.conversation.channel_id,
+ notification_text,
+ notification_template,
+ notification_type,
+ blocks=blocks,
+ )
+ except SlackApiError as e:
+ if e.response["error"] == SlackAPIErrorCode.USERS_NOT_FOUND:
+ log.warning(f"Failed to send announcement message to {participant_email}: {e}")
+ else:
+ log.debug(f"{subject_type} participant announcement message sent.")
+
+
+@timer
+def bulk_participant_announcement_message(
+ participant_emails: list[str],
+ subject: Subject,
+ db_session: Session,
+):
+ """Announces a list of participants in the conversation."""
+ subject_type = type(subject).__name__
+
+ if not subject.conversation:
+ log.warning(
+ f"{subject_type} participant announcement message not sent. No conversation available for this {subject_type.lower()}."
+ )
+ return
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=subject.project.id, plugin_type="conversation"
+ )
+ if not plugin:
+ log.warning(
+ f"{subject_type} participant announcement message not sent. No conversation plugin enabled."
+ )
+ return
+
+ notification_text = f"{len(participant_emails)} New {subject_type} Participants"
+ notification_type = MessageType.incident_notification
+ notification_template = []
+
+ participant_blocks = []
+ for participant_email in participant_emails:
+ match subject_type:
+ case SubjectNames.CASE:
+ participant = participant_service.get_by_case_id_and_email(
+ db_session=db_session,
+ case_id=subject.id,
+ email=participant_email,
+ )
+ case SubjectNames.INCIDENT:
+ participant = participant_service.get_by_incident_id_and_email(
+ db_session=db_session,
+ incident_id=subject.id,
+ email=participant_email,
+ )
+ case _:
+ raise Exception(
+ "Unknown subject was passed to bulk_participant_announcement_message"
+ )
+
+ participant_info = {}
+ contact_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=subject.project.id, plugin_type="contact"
+ )
+
+ if contact_plugin:
+ participant_info = contact_plugin.instance.get(participant_email, db_session=db_session)
+
+ participant_name = participant_info.get("fullname", "Unknown")
+ participant_team = participant_info.get("team", "Unknown")
+ participant_department = participant_info.get("department", "Unknown")
+ participant_location = participant_info.get("location", "Unknown")
+ participant_weblink = participant_info.get("weblink", DISPATCH_UI_URL)
+
+ participant_active_roles = participant_role_service.get_all_active_roles(
+ db_session=db_session, participant_id=participant.id
+ )
+ participant_roles = [role.role for role in participant_active_roles]
+
+ participant_avatar_url = None
+ try:
+ participant_avatar_url = plugin.instance.get_participant_avatar_url(participant_email)
+ except SlackApiError as e:
+ if e.response["error"] == SlackAPIErrorCode.USERS_NOT_FOUND:
+ log.warning(f"Unable to fetch participant avatar for {participant_email}: {e}")
+
+ participant_name_mrkdwn = participant_name
+ if participant_weblink:
+ participant_name_mrkdwn = f"<{participant_weblink}|{participant_name}>"
+
+ participant_block = {
"type": "section",
"text": {
"type": "mrkdwn",
"text": (
- f"*Name:* <{participant_weblink}|{participant_name}>\n"
+ f"*Name:* {participant_name_mrkdwn}\n"
f"*Team*: {participant_team}, {participant_department}\n"
f"*Location*: {participant_location}\n"
f"*Incident Role(s)*: {(', ').join(participant_roles)}\n"
),
},
- "accessory": {
+ }
+
+ if participant_avatar_url:
+ participant_block["accessory"] = {
"type": "image",
"image_url": participant_avatar_url,
"alt_text": participant_name,
- },
+ }
+
+ participant_blocks.append(participant_block)
+
+ blocks = [
+ {
+ "type": "section",
+ "text": {"type": "mrkdwn", "text": f"*{notification_text}*"},
},
+ *participant_blocks,
]
- convo_plugin.send(
- incident.conversation.channel_id,
- notification_text,
- notification_template,
- notification_type,
- blocks=blocks,
- )
-
- log.debug(f"Incident participant announcement message sent.")
+ try:
+ if subject_type == SubjectNames.CASE and subject.has_thread:
+ plugin.instance.send(
+ subject.conversation.channel_id,
+ notification_text,
+ notification_template,
+ notification_type,
+ blocks=blocks,
+ ts=subject.conversation.thread_id,
+ )
+ else:
+ plugin.instance.send(
+ subject.conversation.channel_id,
+ notification_text,
+ notification_template,
+ notification_type,
+ blocks=blocks,
+ )
+ except SlackApiError as e:
+ if e.response["error"] == SlackAPIErrorCode.USERS_NOT_FOUND:
+ log.warning(f"Failed to send announcement messages: {e}")
+ else:
+ log.debug(f"{subject_type} participant announcement messages sent.")
-def send_incident_commander_readded_notification(incident_id: int, db_session: SessionLocal):
+def send_incident_commander_readded_notification(incident: Incident, db_session: Session):
"""Sends a notification about re-adding the incident commander to the conversation."""
notification_text = "Incident Notification"
notification_type = MessageType.incident_notification
- # we load the incident instance
- incident = incident_service.get(db_session=db_session, incident_id=incident_id)
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="conversation"
+ )
+ if not plugin:
+ log.warning(
+ "Unable to send commander re-added notification, no conversation plugin enabled."
+ )
+ return
- convo_plugin = plugins.get(INCIDENT_PLUGIN_CONVERSATION_SLUG)
- convo_plugin.send(
+ plugin.instance.send(
incident.conversation.channel_id,
notification_text,
INCIDENT_COMMANDER_READDED_NOTIFICATION,
notification_type,
- commander_fullname=incident.commander.name,
+ commander_fullname=incident.commander.individual.name,
)
- log.debug(f"Incident commander readded notification sent.")
+ log.debug("Incident commander readded notification sent.")
def send_incident_participant_has_role_ephemeral_message(
- assigner_email: str, assignee_contact_info: dict, assignee_role: str, incident: Incident
+ assigner_email: str,
+ assignee_contact_info: dict,
+ assignee_role: str,
+ incident: Incident,
+ db_session: Session,
):
- """Sends an ephemeral message to the assigner to let them know that the assignee already has the role."""
+ """
+ Sends an ephemeral message to the assigner to let them know
+ that the assignee already has the role.
+ """
notification_text = "Incident Assign Role Notification"
- convo_plugin = plugins.get(INCIDENT_PLUGIN_CONVERSATION_SLUG)
- convo_plugin.send_ephemeral(
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="conversation"
+ )
+ if not plugin:
+ log.warning(
+ "Unable to send incident participant has role message, no conversation plugin enabled."
+ )
+ return
+
+ plugin.instance.send_ephemeral(
incident.conversation.channel_id,
assigner_email,
notification_text,
@@ -448,23 +835,39 @@ def send_incident_participant_has_role_ephemeral_message(
"type": "section",
"text": {
"type": "plain_text",
- "text": f"{assignee_contact_info['fullname']} already has the {assignee_role} role.",
+ "text": f"{assignee_contact_info['fullname']} already has the {assignee_role} role.", # noqa
},
}
],
)
- log.debug(f"Incident participant has role message sent.")
+ log.debug("Incident participant has role message sent.")
def send_incident_participant_role_not_assigned_ephemeral_message(
- assigner_email: str, assignee_contact_info: dict, assignee_role: str, incident: Incident
+ assigner_email: str,
+ assignee_contact_info: dict,
+ assignee_role: str,
+ incident: Incident,
+ db_session: Session,
):
- """Sends an ephemeral message to the assigner to let them know that we were not able to assign the role."""
+ """
+ Sends an ephemeral message to the assigner to let them know
+ that we were not able to assign the role.
+ """
notification_text = "Incident Assign Role Notification"
- convo_plugin = plugins.get(INCIDENT_PLUGIN_CONVERSATION_SLUG)
- convo_plugin.send_ephemeral(
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="conversation"
+ )
+ if not plugin:
+ log.warning(
+ "Unable to send incident participant role not assigned message, no conversation plugin enabled." # noqa
+ )
+ return
+
+ # TODO we should use raw blocks here (kglisson)
+ plugin.instance.send_ephemeral(
incident.conversation.channel_id,
assigner_email,
notification_text,
@@ -473,96 +876,337 @@ def send_incident_participant_role_not_assigned_ephemeral_message(
"type": "section",
"text": {
"type": "plain_text",
- "text": f"We were not able to assign the {assignee_role} role to {assignee_contact_info['fullname']}.",
+ "text": f"We were not able to assign the {assignee_role} role to {assignee_contact_info['fullname']}.", # noqa
},
}
],
)
- log.debug(f"Incident participant role not assigned message sent.")
+ log.debug("Incident participant role not assigned message sent.")
def send_incident_new_role_assigned_notification(
- assigner_contact_info: dict, assignee_contact_info: dict, assignee_role: str, incident: Incident
+ assigner_contact_info: dict,
+ assignee_contact_info: dict,
+ assignee_role: str,
+ incident: Incident,
+ db_session: Session,
):
"""Notified the conversation about the new participant role."""
notification_text = "Incident Notification"
notification_type = MessageType.incident_notification
- convo_plugin = plugins.get(INCIDENT_PLUGIN_CONVERSATION_SLUG)
- convo_plugin.send(
+ if not incident.conversation:
+ log.warning("Incident new role message not sent because incident has no conversation.")
+ return
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="conversation"
+ )
+ if not plugin:
+ log.warning(
+ "Incident new role assignment message not sent. No conversation plugin is enabled."
+ )
+ return
+
+ plugin.instance.send(
incident.conversation.channel_id,
notification_text,
INCIDENT_NEW_ROLE_NOTIFICATION,
notification_type,
assigner_fullname=assigner_contact_info["fullname"],
+ assigner_email=assigner_contact_info["email"],
assignee_fullname=assignee_contact_info["fullname"],
- assignee_firstname=assignee_contact_info["fullname"].split(" ")[0],
+ assignee_email=assignee_contact_info["email"],
assignee_weblink=assignee_contact_info["weblink"],
assignee_role=assignee_role,
)
-
- log.debug(f"Incident new role assigned message sent.")
+ log.debug("Incident new role assigned message sent.")
def send_incident_review_document_notification(
- conversation_id: str, incident_review_document_weblink: str
+ conversation_id: str, review_document_weblink: str, incident: Incident, db_session: Session
):
"""Sends the review document notification."""
notification_text = "Incident Notification"
notification_type = MessageType.incident_notification
- convo_plugin = plugins.get(INCIDENT_PLUGIN_CONVERSATION_SLUG)
- convo_plugin.send(
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="conversation"
+ )
+ if not plugin:
+ log.warning("Incident review document not sent, no conversation plugin enabled.")
+ return
+
+ plugin.instance.send(
conversation_id,
notification_text,
- INCIDENT_REVIEW_DOCUMENT_NOTIFICATION,
+ [INCIDENT_REVIEW_DOCUMENT],
notification_type,
- incident_review_document_weblink=incident_review_document_weblink,
+ review_document_weblink=review_document_weblink,
)
- log.debug(f"Incident review document notification sent.")
+ log.debug("Incident review document notification sent.")
-def send_incident_resources_ephemeral_message_to_participant(
- user_id: str, incident_id: int, db_session: SessionLocal
-):
- """Sends the list of incident resources to the participant via an ephemeral message."""
- # we load the incident instance
- incident = incident_service.get(db_session=db_session, incident_id=incident_id)
+def send_incident_close_reminder(incident: Incident, db_session: Session):
+ """
+ Sends a direct message to the incident commander reminding
+ them to close the incident if possible.
+ """
+ message_text = "Incident Close Reminder"
+ message_template = INCIDENT_CLOSE_REMINDER
- # we get the incident documents
- incident_document = get_document(
- db_session=db_session,
- incident_id=incident_id,
- resource_type=INCIDENT_RESOURCE_INVESTIGATION_DOCUMENT,
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="conversation"
)
+ if not plugin:
+ log.warning("Incident close reminder message not sent. No conversation plugin enabled.")
+ return
+
+ update_command = plugin.instance.get_command_name(ConversationCommands.update_incident)
- incident_faq = get_document(
- db_session=db_session, incident_id=incident_id, resource_type=INCIDENT_RESOURCE_FAQ_DOCUMENT
+ items = [
+ {
+ "command": update_command,
+ "name": incident.name,
+ "dispatch_ui_incident_url": f"{DISPATCH_UI_URL}/{incident.project.organization.name}/incidents/{incident.name}", # noqa
+ "conversation_weblink": resolve_attr(incident, "conversation.weblink"),
+ "title": incident.title,
+ "status": incident.status,
+ }
+ ]
+
+ plugin.instance.send_direct(
+ incident.commander.individual.email,
+ message_text,
+ message_template,
+ MessageType.incident_status_reminder,
+ items=items,
)
- incident_conversation_commands_reference_document = get_document(
- db_session=db_session,
- incident_id=incident_id,
- resource_type=INCIDENT_RESOURCE_CONVERSATION_COMMANDS_REFERENCE_DOCUMENT,
+ log.debug(f"Incident close reminder sent to {incident.commander.individual.email}.")
+
+
+def send_incident_closed_information_review_reminder(incident: Incident, db_session: Session):
+ """
+ Sends a direct message to the incident commander
+ asking them to review the incident's information
+ and to tag the incident if appropriate.
+ """
+ message_text = "Incident Closed Information Review Reminder"
+ message_template = INCIDENT_CLOSED_INFORMATION_REVIEW_REMINDER_NOTIFICATION
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="conversation"
+ )
+ if not plugin:
+ log.warning(
+ "Incident closed information review reminder message not sent, no conversation plugin enabled." # noqa
+ )
+ return
+
+ incident_description = (
+ incident.description
+ if len(incident.description) <= 100
+ else f"{incident.description[:100]}..."
)
- # we send the ephemeral message
- convo_plugin = plugins.get(INCIDENT_PLUGIN_CONVERSATION_SLUG)
- convo_plugin.send_ephemeral(
+ incident_resolution = (
+ incident.resolution
+ if len(incident.resolution) <= 100
+ else f"{incident.resolution[:100]}..."
+ )
+
+ items = [
+ {
+ "name": incident.name,
+ "title": incident.title,
+ "description": incident_description,
+ "resolution": incident_resolution,
+ "type": incident.incident_type.name,
+ "severity": incident.incident_severity.name,
+ "priority": incident.incident_priority.name,
+ "dispatch_ui_incident_url": f"{DISPATCH_UI_URL}/{incident.project.organization.name}/incidents/{incident.name}", # noqa
+ }
+ ]
+
+ plugin.instance.send_direct(
+ incident.commander.individual.email,
+ message_text,
+ message_template,
+ MessageType.incident_closed_information_review_reminder,
+ items=items,
+ )
+
+ log.debug(
+ f"Incident closed information review reminder sent to {incident.commander.individual.email}." # noqa
+ )
+
+
+def send_incident_rating_feedback_message(incident: Incident, db_session: Session):
+ """
+ Sends a direct message to all incident participants asking
+ them to rate and provide feedback about the incident.
+ """
+ notification_text = "Incident Rating and Feedback"
+ notification_template = INCIDENT_CLOSED_RATING_FEEDBACK_NOTIFICATION
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="conversation"
+ )
+ if not plugin:
+ log.warning(
+ "Incident rating and feedback message not sent, no conversation plugin enabled."
+ )
+ return
+
+ items = [
+ {
+ "incident_id": incident.id,
+ "organization_slug": incident.project.organization.slug,
+ "name": incident.name,
+ "title": incident.title,
+ "ticket_weblink": incident.ticket.weblink,
+ }
+ ]
+
+ for participant in incident.participants:
+ try:
+ plugin.instance.send_direct(
+ participant.individual.email,
+ notification_text,
+ notification_template,
+ MessageType.incident_rating_feedback,
+ items=items,
+ )
+ except Exception as e:
+ # if one fails we don't want all to fail
+ log.exception(e)
+
+ log.debug("Incident rating and feedback message sent to all participants.")
+
+
+def send_incident_management_help_tips_message(incident: Incident, db_session: Session):
+ """
+ Sends a direct message to the incident commander
+ with help tips on how to manage the incident.
+ """
+ notification_text = "Incident Management Help Tips"
+ message_template = INCIDENT_MANAGEMENT_HELP_TIPS_MESSAGE
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="conversation"
+ )
+ if not plugin:
+ log.warning(
+ "Incident management help tips message not sent, no conversation plugin enabled."
+ )
+ return
+
+ engage_oncall_command = plugin.instance.get_command_name(ConversationCommands.engage_oncall)
+ executive_report_command = plugin.instance.get_command_name(
+ ConversationCommands.executive_report
+ )
+ tactical_report_command = plugin.instance.get_command_name(ConversationCommands.tactical_report)
+ update_command = plugin.instance.get_command_name(ConversationCommands.update_incident)
+
+ items = [
+ {
+ "name": incident.name,
+ "title": incident.title,
+ "engage_oncall_command": engage_oncall_command,
+ "executive_report_command": executive_report_command,
+ "tactical_report_command": tactical_report_command,
+ "update_command": update_command,
+ "conversation_weblink": resolve_attr(incident, "conversation.weblink"),
+ }
+ ]
+
+ plugin.instance.send_direct(
+ incident.commander.individual.email,
+ notification_text,
+ message_template,
+ MessageType.incident_management_help_tips,
+ items=items,
+ )
+
+ log.debug(
+ f"Incident management help tips message sent to incident commander with email {incident.commander.individual.email}." # noqa
+ )
+
+
+def send_incident_open_tasks_ephemeral_message(
+ participant_email: str, incident: Incident, db_session: Session
+):
+ """
+ Sends an ephemeral message to the participant asking them to resolve or re-assign
+ their open tasks before leaving the incident conversation.
+ """
+ convo_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="conversation"
+ )
+ if not convo_plugin:
+ log.warning("Incident open tasks message not sent. No conversation plugin enabled.")
+ return
+
+ notification_text = "Open Incident Tasks"
+ message_type = MessageType.incident_open_tasks
+ message_template = INCIDENT_OPEN_TASKS
+ message_kwargs = {
+ "title": notification_text,
+ "dispatch_ui_url": f"{DISPATCH_UI_URL}/{incident.project.organization.name}/tasks",
+ }
+
+ convo_plugin.instance.send_ephemeral(
incident.conversation.channel_id,
- user_id,
- "Incident Resources Message",
- INCIDENT_RESOURCES_MESSAGE,
- MessageType.incident_resources_message,
- commander_fullname=incident.commander.name,
- commander_weblink=incident.commander.weblink,
- document_weblink=incident_document.weblink,
- storage_weblink=incident.storage.weblink,
- faq_weblink=incident_faq.weblink,
- conversation_commands_reference_document_weblink=incident_conversation_commands_reference_document.weblink,
- conference_weblink=incident.conference.weblink,
- )
-
- log.debug(f"List of incident resources sent to {user_id} via ephemeral message.")
+ participant_email,
+ notification_text,
+ message_template,
+ message_type,
+ **message_kwargs,
+ )
+
+ log.debug(f"Open incident tasks message sent to {participant_email}.")
+
+
+def send_task_add_ephemeral_message(
+ *,
+ assignee_email: str,
+ incident: Incident,
+ db_session: Session,
+ task: TaskCreate,
+):
+ """
+ Sends an ephemeral message to the assignee letting them know why they have been added to the incident.
+ """
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="conversation"
+ )
+ if not plugin:
+ log.warning("Task add message not sent, no conversation plugin enabled.")
+ return
+
+ notification_text = "Task Added Notification"
+ message_type = MessageType.task_add_to_incident
+ message_template = INCIDENT_TASK_ADD_TO_INCIDENT
+ message_kwargs = {
+ "title": notification_text,
+ "dispatch_ui_url": f"{DISPATCH_UI_URL}/{incident.project.organization.name}/tasks?incident={incident.name}",
+ "task_description": task.description,
+ "task_weblink": task.weblink,
+ }
+ try:
+ plugin.instance.send_ephemeral(
+ incident.conversation.channel_id,
+ assignee_email,
+ notification_text,
+ message_template,
+ message_type,
+ **message_kwargs,
+ )
+
+ log.debug(f"Task add message sent to {assignee_email}.")
+ except Exception as e:
+ log.error(f"Error in sending task add message to {assignee_email}: {e}")
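Across this file, the refactor replaces the old module-level `plugins.get(SLUG)` lookup with a per-project `plugin_service.get_active_instance(...)` call followed by a guard that logs a warning and returns early when no conversation plugin is enabled. A minimal sketch of that guard-then-send pattern, using a hypothetical `notify` helper and stub plugin (these names are illustrative, not part of the Dispatch API):

```python
import logging

log = logging.getLogger(__name__)


class _StubPlugin:
    """Hypothetical stand-in for an active plugin row exposing `.instance`."""

    class instance:
        @staticmethod
        def send(channel_id, text):
            return f"sent to {channel_id}: {text}"


def notify(get_active_instance, project_id: int, channel_id: str, text: str):
    # Guard-then-send: resolve the project's active conversation plugin;
    # if none is enabled, warn and bail out instead of raising.
    plugin = get_active_instance(project_id=project_id, plugin_type="conversation")
    if not plugin:
        log.warning("Message not sent. No conversation plugin enabled.")
        return None
    return plugin.instance.send(channel_id, text)


# Usage: a stub lookup that returns an enabled plugin, and one that does not.
sent = notify(lambda **kw: _StubPlugin(), 1, "C123", "hello")
skipped = notify(lambda **kw: None, 1, "C123", "hello")
```

Making the send a no-op (rather than an exception) when the plugin is disabled lets the surrounding incident flow continue even when a project has no messaging integration configured.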
diff --git a/src/dispatch/incident/metrics.py b/src/dispatch/incident/metrics.py
index ffa0018f8d46..afdec49f2380 100644
--- a/src/dispatch/incident/metrics.py
+++ b/src/dispatch/incident/metrics.py
@@ -1,26 +1,21 @@
import json
-import math
import logging
+import math
+from calendar import monthrange
+from datetime import date
from itertools import groupby
-from datetime import date
-from dateutil.relativedelta import relativedelta
+import pandas as pd
+from sqlalchemy import and_
+from statsmodels.tsa.api import ExponentialSmoothing
-from calendar import monthrange
+from dispatch.database.service import apply_filter_specific_joins, apply_filters
+from dispatch.incident.type.models import IncidentType
-from dispatch.config import INCIDENT_METRIC_FORECAST_REGRESSIONS
-from dispatch.incident_type.models import IncidentType
from .models import Incident
-
log = logging.getLogger(__name__)
-try:
- from fbprophet import Prophet
- import pandas as pd
-except ImportError:
- log.warning("Unable to import fbprophet, some metrics will not be usable.")
-
def month_grouper(item):
"""Determines the last day of a given month."""
@@ -32,72 +27,76 @@ def month_grouper(item):
return key
-def make_forecast(
- db_session, incident_type: str = None, periods: int = 24, grouping: str = "month"
+def create_incident_metric_query(
+ db_session,
+ end_date: date,
+ start_date: date = None,
+ filter_spec: list[dict] | str | None = None,
):
- """Makes an incident forecast."""
- query = db_session.query(Incident).join(IncidentType)
+ """Fetches eligible incidents."""
+ query = db_session.query(Incident)
- # exclude simulations
- query = query.filter(IncidentType.name != "Simulation")
+ if filter_spec:
+ if isinstance(filter_spec, str):
+ filter_spec = json.loads(filter_spec)
+ query = apply_filter_specific_joins(Incident, filter_spec, query)
+ query = apply_filters(query, filter_spec)
- # exclude current month
- query = query.filter(Incident.reported_at < date.today().replace(day=1))
+ if start_date:
+ query = query.filter(
+ and_(Incident.reported_at <= end_date, Incident.reported_at >= start_date)
+ )
+ else:
+ query = query.filter(Incident.reported_at <= end_date)
- if incident_type != "all":
- if incident_type:
- query = query.filter(IncidentType.name == incident_type)
+ # exclude incident types
+ query = query.filter(IncidentType.exclude_from_metrics.isnot(True))
+ return query.all()
- if grouping == "month":
- grouper = month_grouper
- query.filter(Incident.reported_at > date.today() + relativedelta(months=-periods))
- incidents = query.all()
- incidents_sorted = sorted(incidents, key=grouper)
+def make_forecast(incidents: list[Incident]):
+ """Makes an incident forecast."""
+ incidents_sorted = sorted(incidents, key=month_grouper)
- # TODO ensure there are no missing periods (e.g. periods with no incidents)
dataframe_dict = {"ds": [], "y": []}
- regression_keys = []
- if INCIDENT_METRIC_FORECAST_REGRESSIONS:
- # assumes json file with key as column and list of values
- regression_data = json.loads(INCIDENT_METRIC_FORECAST_REGRESSIONS)
- regression_keys = regression_data.keys()
- dataframe_dict.update(regression_data)
-
- for (last_day, items) in groupby(incidents_sorted, grouper):
+ for last_day, items in groupby(incidents_sorted, month_grouper):
dataframe_dict["ds"].append(str(last_day))
dataframe_dict["y"].append(len(list(items)))
dataframe = pd.DataFrame.from_dict(dataframe_dict)
- forecaster = Prophet()
-
- for r in regression_keys:
- forecaster.add_regressor(r)
-
- forecaster.fit(dataframe, algorithm="LBFGS")
-
- # https://facebook.github.io/prophet/docs/quick_start.html#python-api
- future = forecaster.make_future_dataframe(periods=periods, freq="M")
- forecast = forecaster.predict(future)
-
- forecast_data = forecast.to_dict("series")
-
- return {
- "categories": list(forecast_data["ds"]),
- "series": [
- {
- "name": "Upper",
- "data": [max(math.ceil(x), 0) for x in list(forecast_data["yhat_upper"])],
- },
- {
- "name": "Predicted",
- "data": [max(math.ceil(x), 0) for x in list(forecast_data["yhat"])],
- },
- {
- "name": "Lower",
- "data": [max(math.ceil(x), 0) for x in list(forecast_data["yhat_lower"])],
- },
- ],
- }
+ if dataframe.empty:
+ return [], []
+
+ # index the dataframe by month and drop the ds column
+ dataframe.index = dataframe.ds
+ dataframe.index.freq = "M"
+ dataframe.drop("ds", inplace=True, axis=1)
+
+ # fill periods without incidents with 0
+ idx = pd.date_range(dataframe.index[0], dataframe.index[-1], freq="ME")
+ dataframe.index = pd.DatetimeIndex(dataframe.index)
+ dataframe = dataframe.reindex(idx, fill_value=0)
+
+ row_count, _ = dataframe.shape
+
+ if row_count > 3:
+ try:
+ forecaster = ExponentialSmoothing(
+ dataframe, seasonal_periods=12, trend="add", seasonal="add"
+ ).fit()
+ except Exception as e:
+ log.warning(f"Issue forecasting incidents: {e}")
+ return [], []
+ forecast = forecaster.forecast(12)
+ forecast_df = pd.DataFrame({"ds": forecast.index.astype("str"), "yhat": forecast.values})
+
+ forecast_data = forecast_df.to_dict("series")
+
+ # drop day data
+ categories = [d[:-3] for d in forecast_data["ds"]]
+ predicted_counts = [max(math.ceil(x), 0) for x in list(forecast_data["yhat"])]
+ return categories, predicted_counts
+ else:
+ return [], []
diff --git a/src/dispatch/incident/models.py b/src/dispatch/incident/models.py
index 7b0cb4bd1dd0..6301e50bcd84 100644
--- a/src/dispatch/incident/models.py
+++ b/src/dispatch/incident/models.py
@@ -1,69 +1,101 @@
+"""Models for incident resources in the Dispatch application."""
+
+from collections import Counter, defaultdict
from datetime import datetime
-from typing import List, Optional, Any
-
-from pydantic import validator
-from sqlalchemy import (
- Column,
- DateTime,
- Float,
- ForeignKey,
- Integer,
- PrimaryKeyConstraint,
- String,
- Table,
-)
+
+from pydantic import field_validator, AnyHttpUrl
+
+from sqlalchemy import Column, DateTime, ForeignKey, Integer, PrimaryKeyConstraint, String, Table
from sqlalchemy.ext.hybrid import hybrid_property
from sqlalchemy.orm import relationship
-from sqlalchemy_utils import TSVectorType
-
-from dispatch.config import (
- INCIDENT_RESOURCE_FAQ_DOCUMENT,
- INCIDENT_RESOURCE_INVESTIGATION_DOCUMENT,
-)
+from sqlalchemy_utils import TSVectorType, observes
+from dispatch.conference.models import ConferenceRead
from dispatch.conversation.models import ConversationRead
-from dispatch.database import Base
-from dispatch.document.models import DocumentRead
+from dispatch.database.core import Base
+from dispatch.document.models import Document, DocumentRead
from dispatch.enums import Visibility
-from dispatch.incident_priority.models import (
+from dispatch.event.models import EventRead
+from dispatch.group.models import Group
+from dispatch.incident.priority.models import (
+ IncidentPriorityBase,
IncidentPriorityCreate,
IncidentPriorityRead,
- IncidentPriorityBase,
+ IncidentPriorityReadMinimal,
+)
+from dispatch.incident.severity.models import (
+ IncidentSeverityBase,
+ IncidentSeverityCreate,
+ IncidentSeverityRead,
+ IncidentSeverityReadMinimal,
+)
+from dispatch.incident.type.models import (
+ IncidentTypeBase,
+ IncidentTypeCreate,
+ IncidentTypeRead,
+ IncidentTypeReadMinimal,
)
-from dispatch.incident_type.models import IncidentTypeCreate, IncidentTypeRead, IncidentTypeBase
-from dispatch.models import DispatchBase, IndividualReadNested, TimeStampMixin
-from dispatch.participant_role.models import ParticipantRoleType
+from dispatch.incident_cost.models import (
+ IncidentCostRead,
+ IncidentCostUpdate,
+)
+from dispatch.messaging.strings import INCIDENT_RESOLUTION_DEFAULT
+from dispatch.models import (
+ DispatchBase,
+ NameStr,
+ PrimaryKey,
+ ProjectMixin,
+ TimeStampMixin,
+ Pagination,
+)
+from dispatch.participant.models import (
+ Participant,
+ ParticipantRead,
+ ParticipantReadMinimal,
+ ParticipantUpdate,
+)
+from dispatch.report.enums import ReportTypes
+from dispatch.report.models import ReportRead
from dispatch.storage.models import StorageRead
+from dispatch.tag.models import TagRead
+from dispatch.task.enums import TaskStatus
+from dispatch.term.models import TermRead
from dispatch.ticket.models import TicketRead
-from dispatch.conference.models import ConferenceRead
+from dispatch.workflow.models import WorkflowInstanceRead
from .enums import IncidentStatus
assoc_incident_terms = Table(
"assoc_incident_terms",
Base.metadata,
- Column("incident_id", Integer, ForeignKey("incident.id")),
- Column("term_id", Integer, ForeignKey("term.id")),
+ Column("incident_id", Integer, ForeignKey("incident.id", ondelete="CASCADE")),
+ Column("term_id", Integer, ForeignKey("term.id", ondelete="CASCADE")),
PrimaryKeyConstraint("incident_id", "term_id"),
)
assoc_incident_tags = Table(
"assoc_incident_tags",
Base.metadata,
- Column("incident_id", Integer, ForeignKey("incident.id")),
- Column("tag_id", Integer, ForeignKey("tag.id")),
+ Column("incident_id", Integer, ForeignKey("incident.id", ondelete="CASCADE")),
+ Column("tag_id", Integer, ForeignKey("tag.id", ondelete="CASCADE")),
PrimaryKeyConstraint("incident_id", "tag_id"),
)
-class Incident(Base, TimeStampMixin):
+class Incident(Base, TimeStampMixin, ProjectMixin):
id = Column(Integer, primary_key=True)
name = Column(String)
title = Column(String, nullable=False)
description = Column(String, nullable=False)
- status = Column(String, default=IncidentStatus.active)
- cost = Column(Float, default=0)
- visibility = Column(String, default=Visibility.open)
+ resolution = Column(String, default=INCIDENT_RESOLUTION_DEFAULT, nullable=False)
+ status = Column(String, default=IncidentStatus.active, nullable=False)
+ visibility = Column(String, default=Visibility.open, nullable=False)
+ participants_team = Column(String)
+ participants_location = Column(String)
+ commanders_location = Column(String)
+ reporters_location = Column(String)
+ delay_executive_report_reminder = Column(DateTime, nullable=True)
+ delay_tactical_report_reminder = Column(DateTime, nullable=True)
# auto generated
reported_at = Column(DateTime, default=datetime.utcnow)
@@ -76,126 +108,342 @@ class Incident(Base, TimeStampMixin):
)
)
- # NOTE these only work in python, if want them to be executed via sql we need to
- # write the coresponding expressions. See:
- # https://docs.sqlalchemy.org/en/13/orm/extensions/hybrid.html
- @hybrid_property
- def commander(self):
- if self.participants:
- for p in self.participants:
- for pr in p.participant_role:
- if (
- pr.role == ParticipantRoleType.incident_commander
- and pr.renounce_at
- is None # Column renounce_at will be null for the current incident commander
- ):
- return p.individual
-
@hybrid_property
- def reporter(self):
- if self.participants:
- for p in self.participants:
- for role in p.participant_role:
- if role.role == ParticipantRoleType.reporter:
- return p.individual
+ def tactical_reports(self):
+ if self.reports:
+ tactical_reports = [
+ report for report in self.reports if report.type == ReportTypes.tactical_report
+ ]
+ return tactical_reports
@hybrid_property
- def incident_document(self):
- if self.documents:
- for d in self.documents:
- if d.resource_type == INCIDENT_RESOURCE_INVESTIGATION_DOCUMENT:
- return d
+ def last_tactical_report(self):
+ if self.tactical_reports:
+ return sorted(self.tactical_reports, key=lambda r: r.created_at)[-1]
@hybrid_property
- def incident_faq(self):
- if self.documents:
- for d in self.documents:
- if d.resource_type == INCIDENT_RESOURCE_FAQ_DOCUMENT:
- return d
+ def executive_reports(self):
+ if self.reports:
+ executive_reports = [
+ report for report in self.reports if report.type == ReportTypes.executive_report
+ ]
+ return executive_reports
@hybrid_property
- def last_status_report(self):
- if self.status_reports:
- return sorted(self.status_reports, key=lambda r: r.created_at)[-1]
+ def last_executive_report(self):
+ if self.executive_reports:
+ return sorted(self.executive_reports, key=lambda r: r.created_at)[-1]
# resources
- groups = relationship("Group", lazy="subquery", backref="incident")
- conversation = relationship("Conversation", uselist=False, backref="incident")
- conference = relationship("Conference", uselist=False, backref="incident")
- storage = relationship("Storage", uselist=False, backref="incident")
- ticket = relationship("Ticket", uselist=False, backref="incident")
- documents = relationship("Document", lazy="subquery", backref="incident")
- participants = relationship("Participant", backref="incident")
- incident_type_id = Column(Integer, ForeignKey("incident_type.id"))
- incident_type = relationship("IncidentType", backref="incident")
- incident_priority_id = Column(Integer, ForeignKey("incident_priority.id"))
+ incident_costs = relationship(
+ "IncidentCost",
+ backref="incident",
+ cascade="all, delete-orphan",
+ lazy="subquery",
+ order_by="IncidentCost.created_at",
+ )
+
incident_priority = relationship("IncidentPriority", backref="incident")
- status_reports = relationship("StatusReport", backref="incident")
- tasks = relationship("Task", backref="incident")
- tags = relationship("Tag", secondary=assoc_incident_tags, backref="incidents")
+ incident_priority_id = Column(Integer, ForeignKey("incident_priority.id"))
+
+ incident_severity = relationship("IncidentSeverity", backref="incident")
+ incident_severity_id = Column(Integer, ForeignKey("incident_severity.id"))
+
+ incident_type = relationship("IncidentType", backref="incident")
+ incident_type_id = Column(Integer, ForeignKey("incident_type.id"))
+
+ conference = relationship(
+ "Conference", uselist=False, backref="incident", cascade="all, delete-orphan"
+ )
+ conversation = relationship(
+ "Conversation", uselist=False, backref="incident", cascade="all, delete-orphan"
+ )
+ documents = relationship(
+ "Document",
+ backref="incident",
+ cascade="all, delete-orphan",
+ foreign_keys=[Document.incident_id],
+ )
+ events = relationship("Event", backref="incident", cascade="all, delete-orphan")
+ feedback = relationship("Feedback", backref="incident", cascade="all, delete-orphan")
+ groups = relationship(
+ "Group", backref="incident", cascade="all, delete-orphan", foreign_keys=[Group.incident_id]
+ )
+ participants = relationship(
+ "Participant",
+ backref="incident",
+ cascade="all, delete-orphan",
+ foreign_keys=[Participant.incident_id],
+ )
+ reports = relationship("Report", backref="incident", cascade="all, delete-orphan")
+ storage = relationship(
+ "Storage", uselist=False, backref="incident", cascade="all, delete-orphan"
+ )
+ tags = relationship(
+ "Tag",
+ secondary=assoc_incident_tags,
+ lazy="subquery",
+ backref="incidents",
+ )
+ tasks = relationship("Task", backref="incident", cascade="all, delete-orphan")
terms = relationship("Term", secondary=assoc_incident_terms, backref="incidents")
+ ticket = relationship("Ticket", uselist=False, backref="incident", cascade="all, delete-orphan")
+ workflow_instances = relationship(
+ "WorkflowInstance", backref="incident", cascade="all, delete-orphan"
+ )
+ canvases = relationship("Canvas", back_populates="incident", cascade="all, delete-orphan")
+
+ duplicate_id = Column(Integer, ForeignKey("incident.id"))
+ duplicates = relationship(
+ "Incident", remote_side=[id], lazy="joined", join_depth=2, uselist=True
+ )
+
+ commander_id = Column(Integer, ForeignKey("participant.id"))
+ commander = relationship("Participant", foreign_keys=[commander_id], post_update=True)
+
+ reporter_id = Column(Integer, ForeignKey("participant.id"))
+ reporter = relationship("Participant", foreign_keys=[reporter_id], post_update=True)
+
+ liaison_id = Column(Integer, ForeignKey("participant.id"))
+ liaison = relationship("Participant", foreign_keys=[liaison_id], post_update=True)
+
+ scribe_id = Column(Integer, ForeignKey("participant.id"))
+ scribe = relationship("Participant", foreign_keys=[scribe_id], post_update=True)
+
+ incident_document_id = Column(Integer, ForeignKey("document.id"))
+ incident_document = relationship("Document", foreign_keys=[incident_document_id])
+
+ incident_review_document_id = Column(Integer, ForeignKey("document.id"))
+ incident_review_document = relationship("Document", foreign_keys=[incident_review_document_id])
+
+ tactical_group_id = Column(Integer, ForeignKey("group.id"))
+ tactical_group = relationship("Group", foreign_keys=[tactical_group_id])
+
+ notifications_group_id = Column(Integer, ForeignKey("group.id"))
+ notifications_group = relationship("Group", foreign_keys=[notifications_group_id])
+
+ summary = Column(String, nullable=True)
+
+ @hybrid_property
+ def total_cost(self):
+ total_cost = 0
+ if self.incident_costs:
+ for cost in self.incident_costs:
+ total_cost += cost.amount
+ return total_cost
+
+ @observes("participants")
+ def participant_observer(self, participants):
+ self.participants_team = Counter(p.team for p in participants).most_common(1)[0][0]
+ self.participants_location = Counter(p.location for p in participants).most_common(1)[0][0]
+
+
+class ProjectRead(DispatchBase):
+ """Pydantic model for reading a project resource."""
+
+ id: PrimaryKey | None = None
+ name: NameStr
+ color: str | None = None
+ stable_priority: IncidentPriorityRead | None = None
+ allow_self_join: bool | None = True
+ display_name: str | None = None
+
+
+class CaseReadBasic(DispatchBase):
+ """Pydantic model for reading a case resource."""
+
+ id: PrimaryKey
+ name: NameStr | None = None
+
+
+class TaskRead(DispatchBase):
+ """Pydantic model for reading a task resource."""
+
+ id: PrimaryKey
+ assignees: list[ParticipantRead | None] = []
+ created_at: datetime | None = None
+ description: str | None = None
+ status: TaskStatus = TaskStatus.open
+ owner: ParticipantRead | None = None
+ weblink: AnyHttpUrl | None = None
+ resolve_by: datetime | None = None
+ resolved_at: datetime | None = None
+ ticket: TicketRead | None = None
+
+
+class TaskReadMinimal(DispatchBase):
+ """Pydantic model for reading a minimal task resource."""
+
+ id: PrimaryKey
+ description: str | None = None
+ status: TaskStatus = TaskStatus.open
# Pydantic models...
class IncidentBase(DispatchBase):
+ """Base Pydantic model for incident resources."""
+
title: str
description: str
- status: Optional[IncidentStatus] = IncidentStatus.active
- visibility: Optional[Visibility]
+ resolution: str | None = None
+ status: IncidentStatus | None = None
+ visibility: Visibility | None = None
- @validator("title")
- def title_required(cls, v):
+ @field_validator("title")
+ @classmethod
+ def title_required(cls, v: str) -> str:
+ """Ensures the title is not an empty string."""
if not v:
raise ValueError("must not be empty string")
return v
- @validator("description")
- def description_required(cls, v):
+ @field_validator("description")
+ @classmethod
+ def description_required(cls, v: str) -> str:
+ """Ensures the description is not an empty string."""
if not v:
raise ValueError("must not be empty string")
return v
class IncidentCreate(IncidentBase):
- incident_priority: IncidentPriorityCreate
- incident_type: IncidentTypeCreate
+ """Pydantic model for creating an incident resource."""
+
+ commander: ParticipantUpdate | None = None
+ commander_email: str | None = None
+ incident_priority: IncidentPriorityCreate | None = None
+ incident_severity: IncidentSeverityCreate | None = None
+ incident_type: IncidentTypeCreate | None = None
+ project: ProjectRead | None = None
+ reporter: ParticipantUpdate | None = None
+ tags: list[TagRead] | None = []
+
+
+class IncidentReadBasic(DispatchBase):
+ """Pydantic model for reading a basic incident resource."""
+
+ id: PrimaryKey
+ name: NameStr | None = None
+
+
+class IncidentReadMinimal(IncidentBase):
+ """Pydantic model for reading a minimal incident resource."""
+
+ id: PrimaryKey
+ closed_at: datetime | None = None
+ commander: ParticipantReadMinimal | None = None
+ commanders_location: str | None = None
+ created_at: datetime | None = None
+ duplicates: list[IncidentReadBasic] | None = []
+ incident_costs: list[IncidentCostRead] | None = []
+ incident_document: DocumentRead | None = None
+ incident_priority: IncidentPriorityReadMinimal
+ incident_review_document: DocumentRead | None = None
+ incident_severity: IncidentSeverityReadMinimal
+ incident_type: IncidentTypeReadMinimal
+ name: NameStr | None = None
+ participants_location: str | None = None
+ participants_team: str | None = None
+ project: ProjectRead
+ reported_at: datetime | None = None
+ reporter: ParticipantReadMinimal | None = None
+ reporters_location: str | None = None
+ stable_at: datetime | None = None
+ storage: StorageRead | None = None
+ summary: str | None = None
+ tags: list[TagRead] | None = []
+ tasks: list[TaskReadMinimal] | None = []
+ total_cost: float | None = None
+ cases: list[CaseReadBasic] | None = []
class IncidentUpdate(IncidentBase):
+ """Pydantic model for updating an incident resource."""
+
+ cases: list[CaseReadBasic] | None = []
+ commander: ParticipantUpdate | None = None
+ delay_executive_report_reminder: datetime | None = None
+ delay_tactical_report_reminder: datetime | None = None
+ duplicates: list[IncidentReadBasic] | None = []
+ incident_costs: list[IncidentCostUpdate] | None = []
incident_priority: IncidentPriorityBase
+ incident_severity: IncidentSeverityBase
incident_type: IncidentTypeBase
- reported_at: Optional[datetime] = None
- stable_at: Optional[datetime] = None
- commander: Optional[IndividualReadNested]
- reporter: Optional[IndividualReadNested]
- tags: Optional[List[Any]] = [] # any until we figure out circular imports
- terms: Optional[List[Any]] = [] # any until we figure out circular imports
+ reported_at: datetime | None = None
+ reporter: ParticipantUpdate | None = None
+ stable_at: datetime | None = None
+ summary: str | None = None
+ tags: list[TagRead] | None = []
+ terms: list[TermRead] | None = []
+
+ @field_validator("tags")
+ @classmethod
+ def find_exclusive(cls, v):
+ """Ensures only one exclusive tag per tag type is applied."""
+ if v:
+ exclusive_tags = defaultdict(list)
+ for tag in v:
+ if tag.tag_type.exclusive:
+ exclusive_tags[tag.tag_type.id].append(tag)
+
+ for tag_types in exclusive_tags.values():
+ if len(tag_types) > 1:
+ raise ValueError(
+ f"Found multiple exclusive tags. Please ensure that only one tag of a given type is applied. Tags: {','.join([t.name for t in v])}" # noqa: E501
+ )
+ return v
class IncidentRead(IncidentBase):
- id: int
- cost: float = None
- name: str = None
- reporter: Optional[IndividualReadNested]
- commander: Optional[IndividualReadNested]
- last_status_report: Optional[Any]
+ """Pydantic model for reading an incident resource."""
+
+ id: PrimaryKey
+ cases: list[CaseReadBasic] | None = []
+ closed_at: datetime | None = None
+ commander: ParticipantRead | None = None
+ commanders_location: str | None = None
+ conference: ConferenceRead | None = None
+ conversation: ConversationRead | None = None
+ created_at: datetime | None = None
+ delay_executive_report_reminder: datetime | None = None
+ delay_tactical_report_reminder: datetime | None = None
+ documents: list[DocumentRead] | None = []
+ duplicates: list[IncidentReadBasic] | None = []
+ events: list[EventRead] | None = []
+ incident_costs: list[IncidentCostRead] | None = []
incident_priority: IncidentPriorityRead
+ incident_severity: IncidentSeverityRead
incident_type: IncidentTypeRead
- # participants: Any # List[IndividualReadNested]
- storage: Optional[StorageRead] = None
- ticket: Optional[TicketRead] = None
- documents: Optional[List[DocumentRead]] = []
- tags: Optional[List[Any]] = [] # any until we figure out circular imports
- terms: Optional[List[Any]] = [] # any until we figure out circular imports
- conference: Optional[ConferenceRead] = None
- conversation: Optional[ConversationRead] = None
-
- created_at: Optional[datetime] = None
- reported_at: Optional[datetime] = None
- stable_at: Optional[datetime] = None
- closed_at: Optional[datetime] = None
-
-
-class IncidentPagination(DispatchBase):
- total: int
- items: List[IncidentRead] = []
+ last_executive_report: ReportRead | None = None
+ last_tactical_report: ReportRead | None = None
+ name: NameStr | None = None
+ participants: list[ParticipantRead] | None = []
+ participants_location: str | None = None
+ participants_team: str | None = None
+ project: ProjectRead
+ reported_at: datetime | None = None
+ reporter: ParticipantRead | None = None
+ reporters_location: str | None = None
+ stable_at: datetime | None = None
+ storage: StorageRead | None = None
+ summary: str | None = None
+ tags: list[TagRead] | None = []
+ tasks: list[TaskRead] | None = []
+ terms: list[TermRead] | None = []
+ ticket: TicketRead | None = None
+ total_cost: float | None = None
+ workflow_instances: list[WorkflowInstanceRead] | None = []
+
+
+class IncidentExpandedPagination(Pagination):
+ """Pydantic model for paginated expanded incident results."""
+
+ itemsPerPage: int
+ page: int
+ items: list[IncidentRead] = []
+
+
+class IncidentPagination(Pagination):
+ """Pydantic model for paginated incident results."""
+
+ items: list[IncidentReadMinimal] = []
diff --git a/src/dispatch/incident/priority/__init__.py b/src/dispatch/incident/priority/__init__.py
new file mode 100644
index 000000000000..e69de29bb2d1
diff --git a/src/dispatch/incident/priority/config.py b/src/dispatch/incident/priority/config.py
new file mode 100644
index 000000000000..ba47e7d5365a
--- /dev/null
+++ b/src/dispatch/incident/priority/config.py
@@ -0,0 +1,35 @@
+default_incident_priorities = [
+ {
+ "name": "Low",
+ "description": "This incident may require your team's attention during working hours until the incident is stable.",
+ "view_order": 1,
+ "tactical_report_reminder": 12,
+ "executive_report_reminder": 9999,
+ "color": "#8bc34a",
+ "page_commander": False,
+ "default": True,
+ "enabled": True,
+ },
+ {
+ "name": "Medium",
+ "description": "This incident may require your team's full attention during waking hours, including weekends, until the incident is stable.",
+ "view_order": 2,
+ "tactical_report_reminder": 6,
+ "executive_report_reminder": 12,
+ "color": "#ff9800",
+ "page_commander": False,
+ "default": False,
+ "enabled": True,
+ },
+ {
+ "name": "High",
+ "description": "This incident may require your team's full attention 24x7, and should be prioritized over all other work, until the incident is stable.",
+ "view_order": 3,
+ "tactical_report_reminder": 2,
+ "executive_report_reminder": 6,
+ "color": "#e53935",
+ "page_commander": False,
+ "default": False,
+ "enabled": True,
+ },
+]
diff --git a/src/dispatch/incident/priority/models.py b/src/dispatch/incident/priority/models.py
new file mode 100644
index 000000000000..25c715910218
--- /dev/null
+++ b/src/dispatch/incident/priority/models.py
@@ -0,0 +1,103 @@
+"""Models for incident priority resources in the Dispatch application."""
+
+from pydantic import StrictBool
+
+from sqlalchemy import Column, Integer, String, Boolean
+from sqlalchemy.sql.schema import UniqueConstraint
+from sqlalchemy.event import listen
+from sqlalchemy_utils import TSVectorType
+
+from dispatch.database.core import Base, ensure_unique_default_per_project
+from dispatch.models import DispatchBase, NameStr, ProjectMixin, PrimaryKey, Pagination
+
+
+class IncidentPriority(Base, ProjectMixin):
+ """SQLAlchemy model for incident priority resources."""
+
+ __allow_unmapped__ = True
+ __table_args__ = (UniqueConstraint("name", "project_id"),)
+ id = Column(Integer, primary_key=True)
+ name = Column(String)
+ description = Column(String)
+ page_commander = Column(Boolean, default=False)
+ color = Column(String)
+ enabled = Column(Boolean, default=True)
+ default = Column(Boolean, default=False)
+ disable_delayed_message_warning = Column(Boolean, default=False)
+
+ # number of hours after which reports should be sent.
+ tactical_report_reminder = Column(Integer, default=24, server_default="24")
+ executive_report_reminder = Column(Integer, default=24, server_default="24")
+
+ # This column is used to control how priorities should be displayed.
+ # Lower orders will be shown first.
+ view_order = Column(Integer, default=9999)
+
+ search_vector = Column(TSVectorType("name", "description"))
+
+
+listen(IncidentPriority.default, "set", ensure_unique_default_per_project)
+
+
+class ProjectRead(DispatchBase):
+ """Pydantic model for reading a project resource."""
+
+ id: PrimaryKey | None = None
+ name: NameStr
+ display_name: str | None = None
+
+
+# Pydantic models...
+class IncidentPriorityBase(DispatchBase):
+ """Base Pydantic model for incident priority resources."""
+
+ name: NameStr
+ description: str | None = None
+ page_commander: StrictBool | None = None
+ tactical_report_reminder: int | None = None
+ executive_report_reminder: int | None = None
+ project: ProjectRead | None = None
+ default: bool | None = None
+ enabled: bool | None = None
+ view_order: int | None = None
+ color: str | None = None
+ disable_delayed_message_warning: bool | None = None
+
+
+class IncidentPriorityCreate(IncidentPriorityBase):
+ """Pydantic model for creating an incident priority resource."""
+
+ pass
+
+
+class IncidentPriorityUpdate(IncidentPriorityBase):
+ """Pydantic model for updating an incident priority resource."""
+
+ pass
+
+
+class IncidentPriorityRead(IncidentPriorityBase):
+ """Pydantic model for reading an incident priority resource."""
+
+ id: PrimaryKey
+
+
+class IncidentPriorityReadMinimal(DispatchBase):
+ """Pydantic model for reading a minimal incident priority resource."""
+
+ id: PrimaryKey
+ name: NameStr
+ description: str | None = None
+ page_commander: StrictBool | None = None
+ tactical_report_reminder: int | None = None
+ executive_report_reminder: int | None = None
+ default: bool | None = None
+ enabled: bool | None = None
+ view_order: int | None = None
+ color: str | None = None
+
+
+class IncidentPriorityPagination(Pagination):
+ """Pydantic model for paginated incident priority results."""
+
+ items: list[IncidentPriorityRead] = []
diff --git a/src/dispatch/incident/priority/service.py b/src/dispatch/incident/priority/service.py
new file mode 100644
index 000000000000..66c435b72497
--- /dev/null
+++ b/src/dispatch/incident/priority/service.py
@@ -0,0 +1,156 @@
+from pydantic import ValidationError
+
+from sqlalchemy.sql.expression import true
+
+from dispatch.project import service as project_service
+
+from .models import (
+ IncidentPriority,
+ IncidentPriorityCreate,
+ IncidentPriorityRead,
+ IncidentPriorityUpdate,
+)
+
+
+def get(*, db_session, incident_priority_id: int) -> IncidentPriority | None:
+ """Returns an incident priority based on the given priority id."""
+ return (
+ db_session.query(IncidentPriority)
+ .filter(IncidentPriority.id == incident_priority_id)
+ .one_or_none()
+ )
+
+
+def get_default(*, db_session, project_id: int):
+ """Returns the default incident priority."""
+ return (
+ db_session.query(IncidentPriority)
+ .filter(IncidentPriority.default == true())
+ .filter(IncidentPriority.project_id == project_id)
+ .one_or_none()
+ )
+
+
+def get_default_or_raise(*, db_session, project_id: int) -> IncidentPriority:
+    """Returns the default incident priority or raises a ValidationError if one doesn't exist."""
+ incident_priority = get_default(db_session=db_session, project_id=project_id)
+
+ if not incident_priority:
+ raise ValidationError(
+ [
+ {
+ "msg": "No default incident priority defined.",
+ "loc": "incident_priority",
+ }
+ ]
+ )
+ return incident_priority
+
+
+def get_by_name(*, db_session, project_id: int, name: str) -> IncidentPriority | None:
+ """Returns an incident priority based on the given priority name."""
+ return (
+ db_session.query(IncidentPriority)
+ .filter(IncidentPriority.name == name)
+ .filter(IncidentPriority.project_id == project_id)
+ .one_or_none()
+ )
+
+
+def get_by_name_or_raise(
+    *, db_session, project_id: int, incident_priority_in: IncidentPriorityRead
+) -> IncidentPriority:
+ """Returns the incident priority specified or raises ValidationError."""
+ incident_priority = get_by_name(
+ db_session=db_session, project_id=project_id, name=incident_priority_in.name
+ )
+
+ if not incident_priority:
+ raise ValidationError(
+ [
+ {
+ "msg": "Incident priority not found.",
+ "loc": "incident_priority",
+ "incident_priority": incident_priority_in.name,
+ }
+ ]
+ )
+
+ return incident_priority
+
+
+def get_by_name_or_default(
+    *, db_session, project_id: int, incident_priority_in: IncidentPriorityRead
+) -> IncidentPriority:
+    """Returns an incident priority based on a name or the default if not specified."""
+ if incident_priority_in and incident_priority_in.name:
+ incident_priority = get_by_name(
+ db_session=db_session, project_id=project_id, name=incident_priority_in.name
+ )
+ if incident_priority:
+ return incident_priority
+ return get_default_or_raise(db_session=db_session, project_id=project_id)
+
+
+def get_all(*, db_session, project_id: int | None = None) -> list[IncidentPriority | None]:
+ """Returns all incident priorities."""
+ if project_id:
+ return db_session.query(IncidentPriority).filter(IncidentPriority.project_id == project_id)
+ return db_session.query(IncidentPriority)
+
+
+def get_all_enabled(*, db_session, project_id: int | None = None) -> list[IncidentPriority | None]:
+ """Returns all enabled incident priorities."""
+ if project_id:
+ return (
+ db_session.query(IncidentPriority)
+ .filter(IncidentPriority.project_id == project_id)
+ .filter(IncidentPriority.enabled == true())
+ .order_by(IncidentPriority.view_order)
+ )
+ return (
+ db_session.query(IncidentPriority)
+ .filter(IncidentPriority.enabled == true())
+ .order_by(IncidentPriority.view_order)
+ )
+
+
+def create(*, db_session, incident_priority_in: IncidentPriorityCreate) -> IncidentPriority:
+ """Creates an incident priority."""
+ project = project_service.get_by_name_or_raise(
+ db_session=db_session, project_in=incident_priority_in.project
+ )
+ incident_priority = IncidentPriority(
+ **incident_priority_in.dict(exclude={"project", "color"}), project=project
+ )
+ if incident_priority_in.color:
+ incident_priority.color = incident_priority_in.color
+
+ db_session.add(incident_priority)
+ db_session.commit()
+ return incident_priority
+
+
+def update(
+ *, db_session, incident_priority: IncidentPriority, incident_priority_in: IncidentPriorityUpdate
+) -> IncidentPriority:
+ """Updates an incident priority."""
+ incident_priority_data = incident_priority.dict()
+
+ update_data = incident_priority_in.dict(exclude_unset=True, exclude={"project", "color"})
+
+ for field in incident_priority_data:
+ if field in update_data:
+ setattr(incident_priority, field, update_data[field])
+
+ if incident_priority_in.color:
+ incident_priority.color = incident_priority_in.color
+
+ db_session.commit()
+ return incident_priority
+
+
+def delete(*, db_session, incident_priority_id: int):
+ """Deletes an incident priority."""
+ db_session.query(IncidentPriority).filter(IncidentPriority.id == incident_priority_id).delete()
+ db_session.commit()
diff --git a/src/dispatch/incident/priority/views.py b/src/dispatch/incident/priority/views.py
new file mode 100644
index 000000000000..d83db6a6a9e9
--- /dev/null
+++ b/src/dispatch/incident/priority/views.py
@@ -0,0 +1,75 @@
+from fastapi import APIRouter, Depends, HTTPException, status
+
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.auth.permissions import SensitiveProjectActionPermission, PermissionsDependency
+from dispatch.models import PrimaryKey
+
+from .models import (
+ IncidentPriorityCreate,
+ IncidentPriorityPagination,
+ IncidentPriorityRead,
+ IncidentPriorityUpdate,
+)
+from .service import create, get, update
+
+
+router = APIRouter()
+
+
+@router.get("", response_model=IncidentPriorityPagination, tags=["incident_priorities"])
+def get_incident_priorities(common: CommonParameters):
+ """Returns all incident priorities."""
+ return search_filter_sort_paginate(model="IncidentPriority", **common)
+
+
+@router.post(
+ "",
+ response_model=IncidentPriorityRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def create_incident_priority(
+ db_session: DbSession,
+ incident_priority_in: IncidentPriorityCreate,
+):
+ """Create a new incident priority."""
+ incident_priority = create(db_session=db_session, incident_priority_in=incident_priority_in)
+ return incident_priority
+
+
+@router.put(
+ "/{incident_priority_id}",
+ response_model=IncidentPriorityRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def update_incident_priority(
+ db_session: DbSession,
+ incident_priority_id: PrimaryKey,
+ incident_priority_in: IncidentPriorityUpdate,
+):
+ """Update an existing incident priority."""
+ incident_priority = get(db_session=db_session, incident_priority_id=incident_priority_id)
+ if not incident_priority:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "An incident priority with this id does not exist."}],
+ )
+
+ incident_priority = update(
+ db_session=db_session,
+ incident_priority=incident_priority,
+ incident_priority_in=incident_priority_in,
+ )
+ return incident_priority
+
+
+@router.get("/{incident_priority_id}", response_model=IncidentPriorityRead)
+def get_incident_priority(db_session: DbSession, incident_priority_id: PrimaryKey):
+ """Get an incident priority."""
+ incident_priority = get(db_session=db_session, incident_priority_id=incident_priority_id)
+ if not incident_priority:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "An incident priority with this id does not exist."}],
+ )
+ return incident_priority
diff --git a/src/dispatch/incident/scheduled.py b/src/dispatch/incident/scheduled.py
index 68c20b347581..12fcc12a06ea 100644
--- a/src/dispatch/incident/scheduled.py
+++ b/src/dispatch/incident/scheduled.py
@@ -1,242 +1,382 @@
import logging
-from datetime import datetime
+from collections import defaultdict
+from datetime import date, datetime
+
from schedule import every
+from sqlalchemy import func
+from sqlalchemy.orm import Session
-from dispatch.config import (
- INCIDENT_PLUGIN_CONVERSATION_SLUG,
- INCIDENT_DAILY_SUMMARY_ONCALL_SERVICE_ID,
- INCIDENT_NOTIFICATION_CONVERSATIONS,
- INCIDENT_PLUGIN_TICKET_SLUG,
-)
-from dispatch.decorators import background_task
+from dispatch.ai import service as ai_service
+from dispatch.conversation.enums import ConversationButtonActions
+from dispatch.database.core import resolve_attr
+from dispatch.decorators import scheduled_project_task, timer
from dispatch.enums import Visibility
-from dispatch.extensions import sentry_sdk
-from dispatch.incident_priority.models import IncidentPriorityType
-from dispatch.individual import service as individual_service
-from dispatch.messaging import (
- INCIDENT_DAILY_SUMMARY_ACTIVE_INCIDENTS_DESCRIPTION,
- INCIDENT_DAILY_SUMMARY_DESCRIPTION,
- INCIDENT_DAILY_SUMMARY_NO_ACTIVE_INCIDENTS_DESCRIPTION,
- INCIDENT_DAILY_SUMMARY_NO_STABLE_CLOSED_INCIDENTS_DESCRIPTION,
- INCIDENT_DAILY_SUMMARY_STABLE_CLOSED_INCIDENTS_DESCRIPTION,
+from dispatch.incident import service as incident_service
+from dispatch.messaging.strings import (
+ INCIDENT,
+ INCIDENT_DAILY_REPORT,
+ INCIDENT_DAILY_REPORT_TITLE,
+ INCIDENT_SUMMARY_TEMPLATE,
+ INCIDENT_WEEKLY_REPORT,
+ INCIDENT_WEEKLY_REPORT_NO_INCIDENTS,
+ INCIDENT_WEEKLY_REPORT_TITLE,
+ MessageType,
)
-from dispatch.plugins.base import plugins
+from dispatch.nlp import build_phrase_matcher, build_term_vocab, extract_terms_from_text
+from dispatch.notification import service as notification_service
+from dispatch.participant import flows as participant_flows
+from dispatch.participant_role.models import ParticipantRoleType
+from dispatch.plugin import service as plugin_service
+from dispatch.project.models import Project
from dispatch.scheduler import scheduler
-from dispatch.service import service as service_service
+from dispatch.search_filter import service as search_filter_service
+from dispatch.tag import service as tag_service
+from dispatch.tag.models import Tag
from .enums import IncidentStatus
-from .service import calculate_cost, get_all, get_all_by_status, get_all_last_x_hours_by_status
-from .messaging import send_incident_status_report_reminder
-
-# TODO figure out a way to do mapping in the config file
-# reminder (in hours)
-STATUS_REPORT_REMINDER_MAPPING = {
- IncidentPriorityType.high.name: 2,
- IncidentPriorityType.medium.name: 6,
- IncidentPriorityType.low.name: 12,
- IncidentPriorityType.info.name: 24,
-}
+from .messaging import send_incident_close_reminder
+from .service import (
+ get_all,
+ get_all_by_status,
+ get_all_last_x_hours_by_status,
+)
log = logging.getLogger(__name__)
-@scheduler.add(every(1).hours, name="incident-status-report-reminder")
-@background_task
-def status_report_reminder(db_session=None):
- """Sends status report reminders to active incident commanders."""
- incidents = get_all_by_status(db_session=db_session, status=IncidentStatus.active)
+@scheduler.add(every(1).hours, name="incident-auto-tagger")
+@timer
+@scheduled_project_task
+def incident_auto_tagger(db_session: Session, project: Project):
+ """Attempts to take existing tags and associate them with incidents."""
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=project.id, plugin_type="storage"
+ )
+
+ if not plugin:
+ log.warning(
+ f"Incident tags not updated. No storage plugin enabled. Project: {project.name}. Organization: {project.organization.name}"
+ )
+ return
+
+ tags = tag_service.get_all(db_session=db_session, project_id=project.id).all()
+ tag_strings = [t.name.lower() for t in tags if t.discoverable]
+ phrases = build_term_vocab(tag_strings)
+ matcher = build_phrase_matcher("dispatch-tag", phrases)
+
+ incidents = get_all(db_session=db_session, project_id=project.id).all()
for incident in incidents:
- try:
- notification_hour = STATUS_REPORT_REMINDER_MAPPING[
- incident.incident_priority.name.lower()
- ]
+ log.debug(f"Processing incident {incident.name}...")
- if incident.last_status_report:
- remind_after = incident.last_status_report.created_at
- else:
- remind_after = incident.created_at
+ if incident.incident_document:
+ try:
+ mime_type = "text/plain"
+ text = plugin.instance.get(incident.incident_document.resource_id, mime_type)
+ except Exception as e:
+ log.warning(e)
+ continue
- now = datetime.utcnow() - remind_after
+ extracted_tags = list(set(extract_terms_from_text(text, matcher)))
- # we calculate the number of hours and seconds since last CAN was sent
- hours, seconds = divmod((now.days * 86400) + now.seconds, 3600)
+ matched_tags = (
+ db_session.query(Tag)
+ .filter(func.upper(Tag.name).in_([func.upper(t) for t in extracted_tags]))
+ .all()
+ )
- q, r = divmod(hours, notification_hour)
- if q >= 1 and r == 0: # it's time to send the reminder
- send_incident_status_report_reminder(incident)
+ incident.tags.extend(matched_tags)
+ db_session.commit()
- except Exception as e:
- # we shouldn't fail to update all incidents when one fails
- sentry_sdk.capture_exception(e)
+ log.debug(f"Associating tags with incident {incident.name}. Tags: {extracted_tags}")
-@scheduler.add(every(1).day.at("18:00"), name="incident-daily-summary")
-@background_task
-def daily_summary(db_session=None):
- """Fetches all open incidents and provides a daily summary."""
+@scheduler.add(every(1).day.at("18:00"), name="incident-report-daily")
+@timer
+@scheduled_project_task
+def incident_report_daily(db_session: Session, project: Project):
+ """Creates and sends incident daily reports based on notifications."""
- blocks = []
- blocks.append(
- {
- "type": "section",
- "text": {"type": "mrkdwn", "text": f"*{INCIDENT_DAILY_SUMMARY_DESCRIPTION}*"},
- }
+ # don't send if daily reports are disabled for the project
+ if project.send_daily_reports is False:
+ return
+
+ # we fetch all active, stable and closed incidents
+ active_incidents = get_all_by_status(
+ db_session=db_session, project_id=project.id, status=IncidentStatus.active
+ )
+ stable_incidents = get_all_last_x_hours_by_status(
+ db_session=db_session,
+ project_id=project.id,
+ status=IncidentStatus.stable,
+ hours=24,
)
+ closed_incidents = get_all_last_x_hours_by_status(
+ db_session=db_session,
+ project_id=project.id,
+ status=IncidentStatus.closed,
+ hours=24,
+ )
+ incidents = active_incidents + stable_incidents + closed_incidents
- active_incidents = get_all_by_status(db_session=db_session, status=IncidentStatus.active)
- if active_incidents:
- blocks.append(
- {
- "type": "section",
- "text": {
- "type": "mrkdwn",
- "text": f"*{INCIDENT_DAILY_SUMMARY_ACTIVE_INCIDENTS_DESCRIPTION}*",
- },
- }
- )
- for incident in active_incidents:
- if incident.visibility == Visibility.open:
+ # we map incidents to notification filters
+ incidents_notification_filters_mapping = defaultdict(lambda: defaultdict(lambda: []))
+ notifications = notification_service.get_all_enabled(
+ db_session=db_session, project_id=project.id
+ )
+ for incident in incidents:
+ for notification in notifications:
+ for search_filter in notification.filters:
+ match = search_filter_service.match(
+ db_session=db_session,
+ subject=search_filter.subject,
+ filter_spec=search_filter.expression,
+ class_instance=incident,
+ )
+ if match:
+ incidents_notification_filters_mapping[notification.id][
+ search_filter.id
+ ].append(incident)
+
+ if not notification.filters:
+ incidents_notification_filters_mapping[notification.id][0].append(incident)
+
+ # we create and send an incident daily report for each notification filter
+ for notification_id, search_filter_dict in incidents_notification_filters_mapping.items():
+ for _search_filter_id, incidents in search_filter_dict.items():
+ items_grouped = []
+ items_grouped_template = INCIDENT
+
+ for idx, incident in enumerate(incidents):
try:
- blocks.append(
- {
- "type": "section",
- "text": {
- "type": "mrkdwn",
- "text": (
- f"*<{incident.ticket.weblink}|{incident.name}>*\n"
- f"*Title*: {incident.title}\n"
- f"*Priority*: {incident.incident_priority.name}\n"
- f"*Incident Commander*: <{incident.commander.weblink}|{incident.commander.name}>"
- ),
- },
- }
- )
+ item = {
+ "buttons": [],
+ "commander_fullname": incident.commander.individual.name,
+ "commander_team": incident.commander.team,
+ "commander_weblink": incident.commander.individual.weblink,
+ "incident_id": incident.id,
+ "name": incident.name,
+ "organization_slug": incident.project.organization.slug,
+ "priority": incident.incident_priority.name,
+ "priority_description": incident.incident_priority.description,
+ "severity": incident.incident_severity.name,
+ "severity_description": incident.incident_severity.description,
+ "status": incident.status,
+ "ticket_weblink": resolve_attr(incident, "ticket.weblink"),
+ "title": incident.title,
+ "type": incident.incident_type.name,
+ "type_description": incident.incident_type.description,
+ }
+
+ if incident.status != IncidentStatus.closed:
+ item["buttons"].append(
+ {
+ "button_text": "Subscribe",
+ "button_value": f"{incident.project.organization.slug}-{incident.id}",
+ "button_action": f"{ConversationButtonActions.subscribe_user}-{incident.status}-{idx}",
+ }
+ )
+ if incident.project.allow_self_join:
+ item["buttons"].append(
+ {
+ "button_text": "Join",
+ "button_value": f"{incident.project.organization.slug}-{incident.id}",
+ "button_action": f"{ConversationButtonActions.invite_user}-{incident.status}-{idx}",
+ }
+ )
+
+ items_grouped.append(item)
except Exception as e:
- sentry_sdk.capture_exception(e)
- else:
- blocks.append(
- {
- "type": "section",
- "text": {
- "type": "mrkdwn",
- "text": INCIDENT_DAILY_SUMMARY_NO_ACTIVE_INCIDENTS_DESCRIPTION,
- },
+ log.exception(e)
+
+ notification_kwargs = {
+ "items_grouped": items_grouped,
+ "items_grouped_template": items_grouped_template,
}
- )
- blocks.append({"type": "divider"})
- blocks.append(
- {
- "type": "section",
- "text": {
- "type": "mrkdwn",
- "text": f"*{INCIDENT_DAILY_SUMMARY_STABLE_CLOSED_INCIDENTS_DESCRIPTION}*",
- },
- }
- )
+ notification_title_text = f"{project.name} {INCIDENT_DAILY_REPORT_TITLE}"
+ notification_params = {
+ "text": notification_title_text,
+ "type": MessageType.incident_daily_report,
+ "template": INCIDENT_DAILY_REPORT,
+ "kwargs": notification_kwargs,
+ }
+
+ notification = notification_service.get(
+ db_session=db_session, notification_id=notification_id
+ )
+
+ notification_service.send(
+ db_session=db_session,
+ project_id=notification.project.id,
+ notification=notification,
+ notification_params=notification_params,
+ )
- hours = 24
- stable_closed_incidents = get_all_last_x_hours_by_status(
- db_session=db_session, status=IncidentStatus.stable, hours=hours
- ) + get_all_last_x_hours_by_status(
- db_session=db_session, status=IncidentStatus.closed, hours=hours
+
+@scheduler.add(every(1).day.at("18:00"), name="incident-close-reminder")
+@timer
+@scheduled_project_task
+def incident_close_reminder(db_session: Session, project: Project):
+ """Sends a reminder to the incident commander to close out their incident."""
+ incidents = get_all_by_status(
+ db_session=db_session, project_id=project.id, status=IncidentStatus.stable
)
- if stable_closed_incidents:
- for incident in stable_closed_incidents:
- if incident.visibility == Visibility.open:
- try:
- blocks.append(
- {
- "type": "section",
- "text": {
- "type": "mrkdwn",
- "text": (
- f"*<{incident.ticket.weblink}|{incident.name}>*\n"
- f"*Title*: {incident.title}\n"
- f"*Status*: {incident.status}\n"
- f"*Priority*: {incident.incident_priority.name}\n"
- f"*Incident Commander*: <{incident.commander.weblink}|{incident.commander.name}>"
- ),
- },
- }
- )
- except Exception as e:
- sentry_sdk.capture_exception(e)
- else:
- blocks.append(
- {
- "type": "section",
- "text": {
- "type": "mrkdwn",
- "text": INCIDENT_DAILY_SUMMARY_NO_STABLE_CLOSED_INCIDENTS_DESCRIPTION,
- },
- }
- )
- # NOTE INCIDENT_DAILY_SUMMARY_ONCALL_SERVICE_ID is optional
- if INCIDENT_DAILY_SUMMARY_ONCALL_SERVICE_ID:
- oncall_service = service_service.get_by_external_id(
- db_session=db_session, external_id=INCIDENT_DAILY_SUMMARY_ONCALL_SERVICE_ID
- )
+ for incident in incidents:
+ span = datetime.utcnow() - incident.stable_at
+ if span.days >= 7 and date.today().isoweekday() == 1:
+ # we only send the reminder for incidents that have been stable
+ # longer than a week and only on Mondays
+ send_incident_close_reminder(incident, db_session)
- oncall_plugin = plugins.get(oncall_service.type)
- oncall_email = oncall_plugin.get(service_id=INCIDENT_DAILY_SUMMARY_ONCALL_SERVICE_ID)
- oncall_individual = individual_service.resolve_user_by_email(oncall_email)
+@scheduler.add(every().monday.at("18:00"), name="incident-report-weekly")
+@timer
+@scheduled_project_task
+def incident_report_weekly(db_session: Session, project: Project):
+ """Creates and sends incident weekly reports based on notifications."""
- blocks.append(
- {
- "type": "context",
- "elements": [
- {
- "type": "mrkdwn",
- "text": f"For questions about this notification, reach out to <{oncall_individual['weblink']}|{oncall_individual['fullname']}> (current on-call)",
- }
- ],
- }
- )
+ # don't send if weekly reports are disabled or no notification id is set
+ if project.send_weekly_reports is False or not project.weekly_report_notification_id:
+ return
- convo_plugin = plugins.get(INCIDENT_PLUGIN_CONVERSATION_SLUG)
- for c in INCIDENT_NOTIFICATION_CONVERSATIONS:
- convo_plugin.send(c, "Incident Daily Summary", {}, "", blocks=blocks)
+ # don't send if no enabled ai plugin
+ ai_plugin = plugin_service.get_active_instance(
+ db_session=db_session, plugin_type="artificial-intelligence", project_id=project.id
+ )
+ if not ai_plugin:
+ log.warning(f"Incident weekly reports not sent. No AI plugin enabled. Project: {project.name}.")
+ return
+ # we fetch all closed incidents in the last week
+ incidents = get_all_last_x_hours_by_status(
+ db_session=db_session,
+ project_id=project.id,
+ status=IncidentStatus.closed,
+ hours=24 * 7,
+ )
-@scheduler.add(every(5).minutes, name="calculate-incidents-cost")
-@background_task
-def calculate_incidents_cost(db_session=None):
- """Calculates the cost of all incidents."""
+ storage_plugin = plugin_service.get_active_instance(
+ db_session=db_session, plugin_type="storage", project_id=project.id
+ )
- # we want to update all incidents, all the time
- incidents = get_all(db_session=db_session)
+ if not storage_plugin:
+ log.warning(
+ f"Incident weekly reports not sent. No storage plugin enabled. Project: {project.name}."
+ )
+ return
+
+ items_grouped = []
+ items_grouped_template = INCIDENT_SUMMARY_TEMPLATE
+ # we create and send an incident weekly report
for incident in incidents:
- # we calculate the cost
- try:
- incident_cost = calculate_cost(incident.id, db_session)
+ # Skip if no incident review document
+ if (
+ not incident.incident_review_document
+ or not incident.incident_review_document.resource_id
+ ):
+ continue
- # if the cost hasn't changed don't continue
- if incident.cost == incident_cost:
- continue
+ # Skip restricted incidents
+ if incident.visibility == Visibility.restricted:
+ continue
- # we update the incident
- incident.cost = incident_cost
- db_session.add(incident)
- db_session.commit()
+ # Skip if incident is a duplicate
+ if incident.duplicates:
+ continue
- if incident.ticket.resource_id:
- # we update the external ticket
- ticket_plugin = plugins.get(INCIDENT_PLUGIN_TICKET_SLUG)
- ticket_plugin.update(
- incident.ticket.resource_id,
- cost=incident_cost,
- incident_type=incident.incident_type.name,
- )
- log.debug(f"Incident cost for {incident.name} updated in the ticket.")
+ try:
+ # if already summary generated, use that instead
+ if incident.summary:
+ summary = incident.summary
else:
- log.debug(f"Incident cost for {incident.name} not updated. Ticket not found.")
+ summary = ai_service.generate_incident_summary(
+ db_session=db_session, incident=incident
+ )
- log.debug(f"Incident cost for {incident.name} updated in the database.")
+ item = {
+ "commander_fullname": incident.commander.individual.name,
+ "commander_team": incident.commander.team,
+ "commander_weblink": incident.commander.individual.weblink,
+ "name": incident.name,
+ "ticket_weblink": resolve_attr(incident, "ticket.weblink"),
+ "title": incident.title,
+ "summary": summary,
+ }
+
+ items_grouped.append(item)
except Exception as e:
- # we shouldn't fail to update all incidents when one fails
- sentry_sdk.capture_exception(e)
+ log.exception(e)
+
+ template = INCIDENT_WEEKLY_REPORT
+ # if no closed incidents or all closed incidents are restricted, send a different template
+ if not items_grouped:
+ template = INCIDENT_WEEKLY_REPORT_NO_INCIDENTS
+
+ notification_kwargs = {
+ "items_grouped": items_grouped,
+ "items_grouped_template": items_grouped_template,
+ }
+
+ notification_title_text = f"{project.name} {INCIDENT_WEEKLY_REPORT_TITLE}"
+ notification_params = {
+ "text": notification_title_text,
+ "type": MessageType.incident_weekly_report,
+ "template": template,
+ "kwargs": notification_kwargs,
+ }
+
+ notification = notification_service.get(
+ db_session=db_session, notification_id=project.weekly_report_notification_id
+ )
+
+ notification_service.send(
+ db_session=db_session,
+ project_id=notification.project.id,
+ notification=notification,
+ notification_params=notification_params,
+ )
+
+
+@scheduler.add(every(1).hour, name="incident-sync-members")
+@timer
+@scheduled_project_task
+def incident_sync_members(db_session: Session, project: Project):
+ """Checks the members of all conversations associated with active
+ and stable incidents and ensures they are in the incident."""
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session,
+ project_id=project.id,
+ plugin_type="conversation",
+ )
+ if not plugin:
+ log.warning("No conversation plugin is active.")
+ return
+
+ active_incidents = incident_service.get_all_by_status(
+ db_session=db_session, project_id=project.id, status=IncidentStatus.active
+ )
+ stable_incidents = incident_service.get_all_by_status(
+ db_session=db_session, project_id=project.id, status=IncidentStatus.stable
+ )
+ incidents = active_incidents + stable_incidents
+
+ for incident in incidents:
+ if incident.conversation:
+ conversation_members = plugin.instance.get_all_member_emails(
+ incident.conversation.channel_id
+ )
+ incident_members = [m.individual.email for m in incident.participants]
+
+ for member in conversation_members:
+ if member not in incident_members:
+ participant_flows.add_participant(
+ member,
+ incident,
+ db_session,
+ roles=[ParticipantRoleType.observer],
+ )
+ log.debug(f"Added missing {member} to incident {incident.name}")
diff --git a/src/dispatch/incident/service.py b/src/dispatch/incident/service.py
index acbc24ea60c0..34c3c20189c3 100644
--- a/src/dispatch/incident/service.py
+++ b/src/dispatch/incident/service.py
@@ -1,89 +1,137 @@
-import math
+"""
+.. module: dispatch.incident.service
+ :platform: Unix
+ :copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
+ :license: Apache, see LICENSE for more details.
+"""
+
+import logging
from datetime import datetime, timedelta
-from typing import List, Optional
-
-from dispatch.config import ANNUAL_COST_EMPLOYEE, BUSINESS_HOURS_YEAR
-from dispatch.database import SessionLocal
-from dispatch.term import service as term_service
-from dispatch.term.models import TermUpdate
-from dispatch.tag.models import TagUpdate
-from dispatch.tag import service as tag_service
-from dispatch.incident_priority import service as incident_priority_service
-from dispatch.incident_priority.models import IncidentPriorityType
-from dispatch.incident_type import service as incident_type_service
+from pydantic import ValidationError
+from sqlalchemy.orm import Session
+
+from dispatch.case import service as case_service
+from dispatch.decorators import timer
+from dispatch.event import service as event_service
+from dispatch.incident.priority import service as incident_priority_service
+from dispatch.incident.severity import service as incident_severity_service
+from dispatch.incident.type import service as incident_type_service
+from dispatch.incident_cost import service as incident_cost_service
+from dispatch.incident_role.service import resolve_role
from dispatch.participant import flows as participant_flows
-from dispatch.participant_role import service as participant_role_service
from dispatch.participant_role.models import ParticipantRoleType
-from dispatch.plugins.base import plugins
+from dispatch.plugin import service as plugin_service
+from dispatch.project import service as project_service
+from dispatch.tag import service as tag_service
+from dispatch.term import service as term_service
+from dispatch.ticket import flows as ticket_flows
from .enums import IncidentStatus
-from .models import Incident, IncidentUpdate
-
-
-HOURS_IN_DAY = 24
-SECONDS_IN_HOUR = 3600
-
-
-def resolve_incident_commander_email(
- db_session: SessionLocal,
- reporter_email: str,
- incident_type: str,
- incident_priority: str,
- incident_name: str,
- incident_title: str,
- incident_description: str,
-):
- """Resolves the correct incident commander email based on given parameters."""
- if incident_priority == IncidentPriorityType.info:
- return reporter_email
-
- commander_service = incident_type_service.get_by_name(
- db_session=db_session, name=incident_type
- ).commander_service
-
- p = plugins.get(commander_service.type)
-
- # page for high priority incidents
- # we could do this at the end but it seems pretty important...
- if incident_priority == IncidentPriorityType.high:
- p.page(
- service_id=commander_service.external_id,
- incident_name=incident_name,
- incident_title=incident_title,
- incident_description=incident_description,
+from .models import Incident, IncidentCreate, IncidentRead, IncidentUpdate
+
+log = logging.getLogger(__name__)
+
+
+def resolve_and_associate_role(db_session: Session, incident: Incident, role: ParticipantRoleType):
+ """For a given role type resolve which individual email should be assigned that role."""
+ email_address = None
+ service_id = None
+
+ incident_role = resolve_role(db_session=db_session, role=role, incident=incident)
+ if not incident_role:
+ log.info(
+ f"We were not able to resolve the email address for {role} via incident role policies."
+ )
+ return email_address, service_id
+
+ if incident_role.service:
+ service_id = incident_role.service.id
+ service_external_id = incident_role.service.external_id
+ oncall_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="oncall"
)
+ if not oncall_plugin:
+ log.warning("Resolved incident role associated with a plugin that is not active.")
+ return email_address, service_id
+
+ email_address = oncall_plugin.instance.get(service_id=service_external_id)
- return p.get(service_id=commander_service.external_id)
+ return email_address, service_id
-def get(*, db_session, incident_id: str) -> Optional[Incident]:
+@timer
+def get(*, db_session: Session, incident_id: int) -> Incident | None:
"""Returns an incident based on the given id."""
return db_session.query(Incident).filter(Incident.id == incident_id).first()
-def get_by_name(*, db_session, incident_name: str) -> Optional[Incident]:
+def get_by_name(*, db_session: Session, project_id: int, name: str) -> Incident | None:
"""Returns an incident based on the given name."""
- return db_session.query(Incident).filter(Incident.name == incident_name).first()
+ return (
+ db_session.query(Incident)
+ .filter(Incident.name == name)
+ .filter(Incident.project_id == project_id)
+ .first()
+ )
-def get_all(*, db_session) -> List[Optional[Incident]]:
+def get_all_open_by_incident_type(
+ *, db_session: Session, incident_type_id: int
+) -> list[Incident | None]:
+ """Returns all non-closed incidents based on the given incident type."""
+ return (
+ db_session.query(Incident)
+ .filter(Incident.status != IncidentStatus.closed)
+ .filter(Incident.incident_type_id == incident_type_id)
+ .all()
+ )
+
+
+def get_by_name_or_raise(
+ *, db_session: Session, project_id: int, incident_in: IncidentRead
+) -> Incident:
+ """Returns an incident based on a given name or raises ValidationError"""
+ incident = get_by_name(db_session=db_session, project_id=project_id, name=incident_in.name)
+
+ if not incident:
+ raise ValidationError([
+ {
+ "msg": "Incident not found.",
+ "loc": "name",
+ }
+ ])
+ return incident
+
+
+def get_all(*, db_session: Session, project_id: int) -> list[Incident | None]:
"""Returns all incidents."""
- return db_session.query(Incident)
+ return db_session.query(Incident).filter(Incident.project_id == project_id)
def get_all_by_status(
- *, db_session, status: IncidentStatus, skip=0, limit=100
-) -> List[Optional[Incident]]:
+ *, db_session: Session, status: str, project_id: int
+) -> list[Incident | None]:
"""Returns all incidents based on the given status."""
return (
- db_session.query(Incident).filter(Incident.status == status).offset(skip).limit(limit).all()
+ db_session.query(Incident)
+ .filter(Incident.status == status)
+ .filter(Incident.project_id == project_id)
+ .all()
+ )
+
+
+def get_all_last_x_hours(*, db_session: Session, hours: int) -> list[Incident | None]:
+ """Returns all incidents in the last x hours."""
+ now = datetime.utcnow()
+ return (
+ db_session.query(Incident).filter(Incident.created_at >= now - timedelta(hours=hours)).all()
)
def get_all_last_x_hours_by_status(
- *, db_session, status: IncidentStatus, hours: int, skip=0, limit=100
-) -> List[Optional[Incident]]:
+ *, db_session: Session, status: str, hours: int, project_id: int
+) -> list[Incident | None]:
"""Returns all incidents of a given status in the last x hours."""
now = datetime.utcnow()
@@ -92,8 +140,7 @@ def get_all_last_x_hours_by_status(
db_session.query(Incident)
.filter(Incident.status == IncidentStatus.active)
.filter(Incident.created_at >= now - timedelta(hours=hours))
- .offset(skip)
- .limit(limit)
+ .filter(Incident.project_id == project_id)
.all()
)
@@ -102,8 +149,7 @@ def get_all_last_x_hours_by_status(
db_session.query(Incident)
.filter(Incident.status == IncidentStatus.stable)
.filter(Incident.stable_at >= now - timedelta(hours=hours))
- .offset(skip)
- .limit(limit)
+ .filter(Incident.project_id == project_id)
.all()
)
@@ -112,196 +158,272 @@ def get_all_last_x_hours_by_status(
db_session.query(Incident)
.filter(Incident.status == IncidentStatus.closed)
.filter(Incident.closed_at >= now - timedelta(hours=hours))
- .offset(skip)
- .limit(limit)
+ .filter(Incident.project_id == project_id)
.all()
)
-def get_all_by_incident_type(
- *, db_session, incident_type: str, skip=0, limit=100
-) -> List[Optional[Incident]]:
- """Returns all incidents with the given incident type."""
- return (
- db_session.query(Incident)
- .filter(Incident.incident_type.name == incident_type)
- .offset(skip)
- .limit(limit)
- .all()
+def create(*, db_session: Session, incident_in: IncidentCreate) -> Incident:
+ """Creates a new incident."""
+ project = project_service.get_by_name_or_default(
+ db_session=db_session, project_in=incident_in.project
)
+ incident_type = incident_type_service.get_by_name_or_default(
+ db_session=db_session, project_id=project.id, incident_type_in=incident_in.incident_type
+ )
-def create(
- *,
- db_session,
- incident_priority: str,
- incident_type: str,
- reporter_email: str,
- title: str,
- status: str,
- description: str,
- visibility: str = None,
-) -> Incident:
- """Creates a new incident."""
- # We get the incident type by name
- incident_type = incident_type_service.get_by_name(
- db_session=db_session, name=incident_type["name"]
+ incident_priority = incident_priority_service.get_by_name_or_default(
+ db_session=db_session,
+ project_id=project.id,
+ incident_priority_in=incident_in.incident_priority,
)
- # We get the incident priority by name
- incident_priority = incident_priority_service.get_by_name(
- db_session=db_session, name=incident_priority["name"]
+ incident_severity = incident_severity_service.get_by_name_or_default(
+ db_session=db_session,
+ project_id=project.id,
+ incident_severity_in=incident_in.incident_severity,
)
- if not visibility:
- visibility = incident_type.visibility
+ visibility = incident_type.visibility
+ if incident_in.visibility:
+ visibility = incident_in.visibility
+
+ tag_objs = []
+ for t in incident_in.tags:
+ tag_objs.append(tag_service.get_or_create(db_session=db_session, tag_in=t))
# We create the incident
incident = Incident(
- title=title,
- description=description,
- status=status,
- incident_type=incident_type,
+ description=incident_in.description,
incident_priority=incident_priority,
+ incident_severity=incident_severity,
+ incident_type=incident_type,
+ project=project,
+ status=incident_in.status,
+ tags=tag_objs,
+ title=incident_in.title,
visibility=visibility,
)
+
db_session.add(incident)
db_session.commit()
- # We add the reporter to the incident
- reporter_participant = participant_flows.add_participant(
- reporter_email, incident.id, db_session, ParticipantRoleType.reporter
+ reporter_name = incident_in.reporter.individual.name if incident_in.reporter else ""
+
+ event_service.log_incident_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description="Incident created",
+ details={
+ "title": incident.title,
+ "description": incident.description,
+ "type": incident.incident_type.name,
+ "severity": incident.incident_severity.name,
+ "priority": incident.incident_priority.name,
+ "status": incident.status,
+ "visibility": incident.visibility,
+ },
+ individual_id=incident_in.reporter.individual.id,
+ incident_id=incident.id,
+ owner=reporter_name,
+ pinned=True,
)
- # We resolve the incident commander email
- incident_commander_email = resolve_incident_commander_email(
- db_session,
+ # add reporter
+ reporter_email = incident_in.reporter.individual.email
+ participant_flows.add_participant(
reporter_email,
- incident_type.name,
- incident_priority.name,
- "",
- title,
- description,
+ incident,
+ db_session,
+ roles=[ParticipantRoleType.reporter],
)
- if reporter_email == incident_commander_email:
- # We add the role of incident commander the reporter
- participant_role_service.add_role(
- participant_id=reporter_participant.id,
- participant_role=ParticipantRoleType.incident_commander,
- db_session=db_session,
- )
+ # add commander
+ commander_email = commander_service_id = None
+ if incident_in.commander_email:
+ commander_email = incident_in.commander_email
+ elif incident_in.commander:
+ commander_email = incident_in.commander.individual.email
else:
- # We create a new participant for the incident commander and we add it to the incident
+ commander_email, commander_service_id = resolve_and_associate_role(
+ db_session=db_session, incident=incident, role=ParticipantRoleType.incident_commander
+ )
+
+ if not commander_email:
+ # we make the reporter the commander if an email for the commander
+ # was not provided or resolved via incident role policies
+ commander_email = reporter_email
+
+ participant_flows.add_participant(
+ commander_email,
+ incident,
+ db_session,
+ service_id=commander_service_id,
+ roles=[ParticipantRoleType.incident_commander],
+ )
+
+ # add liaison
+ liaison_email, liaison_service_id = resolve_and_associate_role(
+ db_session=db_session, incident=incident, role=ParticipantRoleType.liaison
+ )
+
+ if liaison_email:
+ # we only add the liaison if we are able to resolve its email
+ # via incident role policies
participant_flows.add_participant(
- incident_commander_email,
- incident.id,
+ liaison_email,
+ incident,
db_session,
- ParticipantRoleType.incident_commander,
+ service_id=liaison_service_id,
+ roles=[ParticipantRoleType.liaison],
+ )
+
+ # add scribe
+ scribe_email, scribe_service_id = resolve_and_associate_role(
+ db_session=db_session, incident=incident, role=ParticipantRoleType.scribe
+ )
+
+ if scribe_email:
+ # we only add the scribe if we are able to resolve its email
+ # via incident role policies
+ participant_flows.add_participant(
+ scribe_email,
+ incident,
+ db_session,
+ service_id=scribe_service_id,
+ roles=[ParticipantRoleType.scribe],
+ )
+
+ # add observer (if engage_next_oncall is enabled)
+ incident_role = resolve_role(
+ db_session=db_session, role=ParticipantRoleType.incident_commander, incident=incident
+ )
+ if incident_role and incident_role.engage_next_oncall:
+ oncall_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="oncall"
)
+ if not oncall_plugin:
+ log.debug("Resolved observer role not available since oncall plugin is not active.")
+ else:
+ oncall_email = oncall_plugin.instance.get_next_oncall(
+ service_id=incident_role.service.external_id
+ )
+ # no need to add as observer if already added as commander
+ if oncall_email and commander_email != oncall_email:
+ participant_flows.add_participant(
+ oncall_email,
+ incident,
+ db_session,
+ service_id=incident_role.service.id,
+ roles=[ParticipantRoleType.observer],
+ )
return incident
-def update(*, db_session, incident: Incident, incident_in: IncidentUpdate) -> Incident:
- incident_priority = incident_priority_service.get_by_name(
- db_session=db_session, name=incident_in.incident_priority.name
+def update(*, db_session: Session, incident: Incident, incident_in: IncidentUpdate) -> Incident:
+ """Updates an existing incident."""
+ incident_type = incident_type_service.get_by_name_or_default(
+ db_session=db_session,
+ project_id=incident.project.id,
+ incident_type_in=incident_in.incident_type,
)
- incident_type = incident_type_service.get_by_name(
- db_session=db_session, name=incident_in.incident_type.name
+ incident_severity = incident_severity_service.get_by_name_or_default(
+ db_session=db_session,
+ project_id=incident.project.id,
+ incident_severity_in=incident_in.incident_severity,
)
+ if incident_in.status == IncidentStatus.stable and incident.project.stable_priority:
+ incident_priority = incident.project.stable_priority
+ else:
+ incident_priority = incident_priority_service.get_by_name_or_default(
+ db_session=db_session,
+ project_id=incident.project.id,
+ incident_priority_in=incident_in.incident_priority,
+ )
+
+ cases = []
+ for c in incident_in.cases:
+ cases.append(case_service.get(db_session=db_session, case_id=c.id))
+
tags = []
for t in incident_in.tags:
- tags.append(tag_service.get_or_create(db_session=db_session, tag_in=TagUpdate(**t)))
+ tags.append(tag_service.get_or_create(db_session=db_session, tag_in=t))
terms = []
for t in incident_in.terms:
- terms.append(term_service.get_or_create(db_session=db_session, term_in=TermUpdate(**t)))
+ terms.append(term_service.get_or_create(db_session=db_session, term_in=t))
+
+ duplicates = []
+ for d in incident_in.duplicates:
+ duplicates.append(get(db_session=db_session, incident_id=d.id))
+
+ incident_costs = []
+ for incident_cost in incident_in.incident_costs:
+ incident_costs.append(
+ incident_cost_service.get_or_create(
+ db_session=db_session, incident_cost_in=incident_cost
+ )
+ )
+
+ # Update total incident response cost if incident type has changed.
+ if incident_type.id != incident.incident_type.id:
+ incident_cost_service.update_incident_response_cost(
+ incident_id=incident.id, db_session=db_session
+ )
+ # if the new incident type has plugin metadata and the ticket's project key
+ # is unchanged, also update the ticket with the new metadata
+ if incident_type.plugin_metadata:
+ ticket_flows.update_incident_ticket_metadata(
+ db_session=db_session,
+ ticket_id=incident.ticket.resource_id,
+ project_id=incident.project.id,
+ incident_id=incident.id,
+ incident_type=incident_type,
+ )
update_data = incident_in.dict(
- skip_defaults=True,
+ exclude_unset=True,
exclude={
- "incident_type",
- "incident_priority",
+ "cases",
"commander",
+ "duplicates",
+ "incident_costs",
+ "incident_priority",
+ "incident_severity",
+ "incident_type",
+ "project",
"reporter",
"status",
- "visibility",
"tags",
"terms",
+ "visibility",
},
)
for field in update_data.keys():
setattr(incident, field, update_data[field])
- incident.terms = terms
- incident.tags = tags
-
- incident.status = incident_in.status
- incident.visibility = incident_in.visibility
-
+ incident.cases = cases
+ incident.duplicates = duplicates
+ incident.incident_costs = incident_costs
incident.incident_priority = incident_priority
+ incident.incident_severity = incident_severity
incident.incident_type = incident_type
+ incident.status = incident_in.status
+ incident.tags = tags
+ incident.terms = terms
+ incident.visibility = incident_in.visibility
- db_session.add(incident)
db_session.commit()
return incident
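The `exclude_unset=True` pattern in the rewritten `update` applies only the fields the caller actually provided, while relationship-like fields are excluded and set explicitly. A stand-alone sketch of the same idea with a plain object and dict (no Pydantic; field names are hypothetical):

```python
class Incident:
    def __init__(self):
        self.title = "Old title"
        self.description = "Old description"

def apply_update(incident, update_data, excluded=frozenset({"status", "tags"})):
    """Copy provided fields onto the model, skipping fields handled separately."""
    for field, value in update_data.items():
        if field not in excluded:
            setattr(incident, field, value)
    return incident

# only "title" was provided, so "description" is left untouched;
# "status" is excluded and would be assigned explicitly afterwards
incident = apply_update(Incident(), {"title": "New title", "status": "stable"})
print(incident.title)        # New title
print(incident.description)  # Old description
```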
-def delete(*, db_session, incident_id: int):
- # TODO: When deleting, respect referential integrity here in the code. Or add cascading deletes
- # in models.py.
+def delete(*, db_session: Session, incident_id: int):
+ """Deletes an existing incident."""
db_session.query(Incident).filter(Incident.id == incident_id).delete()
db_session.commit()
-
-
-def calculate_cost(incident_id: int, db_session: SessionLocal, incident_review=False):
- """Calculates the incident cost."""
- # we ge the incident
- incident = get(db_session=db_session, incident_id=incident_id)
-
- participants_active_hours = 0
- for participant in incident.participants:
- participant_active_at = participant.active_at
- participant_inactive_at = (
- participant.inactive_at if participant.inactive_at else datetime.utcnow()
- )
-
- participant_active_time = participant_inactive_at - participant_active_at
- participant_active_hours = participant_active_time.total_seconds() / SECONDS_IN_HOUR
-
- # we assume that participants only spend ~10 hours/day working on the incident if the incident goes past 24hrs
- if participant_active_hours > HOURS_IN_DAY:
- days, hours = divmod(participant_active_hours, HOURS_IN_DAY)
- participant_active_hours = math.ceil((days * HOURS_IN_DAY * 0.4) + hours)
-
- participants_active_hours += participant_active_hours
-
- num_participants = len(incident.participants)
-
- # we calculate the number of hours spent responding per person using the 25/50/25 rule,
- # where 25% of participants get a full share, 50% get a half share, and 25% get a quarter share
- response_hours_full_share = num_participants * 0.25 * participants_active_hours
- response_hours_half_share = num_participants * 0.5 * participants_active_hours * 0.5
- response_hours_quarter_share = num_participants * 0.25 * participants_active_hours * 0.25
- response_hours = (
- response_hours_full_share + response_hours_half_share + response_hours_quarter_share
- )
-
- # we calculate the number of hours spent in incident review related activities
- incident_review_hours = 0
- if incident_review:
- incident_review_prep = 1
- incident_review_meeting = num_participants * 0.5 * 1
- incident_review_hours = incident_review_prep + incident_review_meeting
-
- # we calculate and round up the hourly rate
- hourly_rate = math.ceil(ANNUAL_COST_EMPLOYEE / BUSINESS_HOURS_YEAR)
-
- # we calculate, round up, and format the incident cost
- incident_cost = f"{math.ceil((response_hours + incident_review_hours) * hourly_rate):.2f}"
- return incident_cost
diff --git a/src/dispatch/incident/severity/__init__.py b/src/dispatch/incident/severity/__init__.py
new file mode 100644
index 000000000000..e69de29bb2d1
diff --git a/src/dispatch/incident/severity/config.py b/src/dispatch/incident/severity/config.py
new file mode 100644
index 000000000000..7d0dd7c0d5f1
--- /dev/null
+++ b/src/dispatch/incident/severity/config.py
@@ -0,0 +1,42 @@
+default_incident_severities = [
+ {
+ "name": "Undetermined",
+ "description": "The severity of the incident has not yet been determined.",
+ "view_order": 1,
+ "color": "#9e9e9e",
+ "default": True,
+ "enabled": True,
+ },
+ {
+ "name": "Low",
+ "description": "Low severity.",
+ "view_order": 2,
+ "color": "#8bc34a",
+ "default": False,
+ "enabled": True,
+ },
+ {
+ "name": "Medium",
+ "description": "Medium severity.",
+ "view_order": 3,
+ "color": "#ffeb3b",
+ "default": False,
+ "enabled": True,
+ },
+ {
+ "name": "High",
+ "description": "High severity.",
+ "view_order": 4,
+ "color": "#ff9800",
+ "default": False,
+ "enabled": True,
+ },
+ {
+ "name": "Critical",
+ "description": "Critical severity.",
+ "view_order": 5,
+ "color": "#e53935",
+ "default": False,
+ "enabled": True,
+ },
+]
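The list above is plain seed data; consumers pick the entry flagged `default` and display entries ordered by `view_order`. Both lookups, sketched against an abridged copy of the list (assuming only the structure shown, not any Dispatch helper):

```python
default_incident_severities = [
    {"name": "Undetermined", "view_order": 1, "default": True},
    {"name": "Low", "view_order": 2, "default": False},
    {"name": "Critical", "view_order": 5, "default": False},
]

def get_default(severities):
    """Return the severity flagged as default, or None if none is flagged."""
    return next((s for s in severities if s["default"]), None)

def ordered(severities):
    """Return severities sorted for display; lower view_order first."""
    return sorted(severities, key=lambda s: s["view_order"])

print(get_default(default_incident_severities)["name"])           # Undetermined
print([s["name"] for s in ordered(default_incident_severities)])  # ['Undetermined', 'Low', 'Critical']
```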
diff --git a/src/dispatch/incident/severity/models.py b/src/dispatch/incident/severity/models.py
new file mode 100644
index 000000000000..dc3ba1efe9d1
--- /dev/null
+++ b/src/dispatch/incident/severity/models.py
@@ -0,0 +1,80 @@
+"""Models for incident severity resources in the Dispatch application."""
+
+from sqlalchemy import Column, Integer, String, Boolean
+from sqlalchemy.sql.schema import UniqueConstraint
+from sqlalchemy.event import listen
+from sqlalchemy_utils import TSVectorType
+
+from dispatch.database.core import Base, ensure_unique_default_per_project
+from dispatch.models import DispatchBase, NameStr, ProjectMixin, PrimaryKey, Pagination
+from dispatch.project.models import ProjectRead
+
+class IncidentSeverity(Base, ProjectMixin):
+ """SQLAlchemy model for incident severity resources."""
+ __table_args__ = (UniqueConstraint("name", "project_id"),)
+ id = Column(Integer, primary_key=True)
+ name = Column(String)
+ description = Column(String)
+ color = Column(String)
+ enabled = Column(Boolean, default=True)
+ default = Column(Boolean, default=False)
+ allowed_for_stable_incidents = Column(Boolean, default=True, server_default="t")
+
+ # This column is used to control how severities should be displayed
+ # Lower numbers will be shown first.
+ view_order = Column(Integer, default=9999)
+
+ search_vector = Column(
+ TSVectorType(
+ "name",
+ "description",
+ regconfig="pg_catalog.simple",
+ )
+ )
+
+
+listen(IncidentSeverity.default, "set", ensure_unique_default_per_project)
+
+
+# Pydantic models
+class IncidentSeverityBase(DispatchBase):
+ """Base Pydantic model for incident severity resources."""
+ color: str | None = None
+ default: bool | None = None
+ description: str | None = None
+ enabled: bool | None = None
+ name: NameStr
+ project: ProjectRead | None = None
+ view_order: int | None = None
+ allowed_for_stable_incidents: bool | None = None
+
+
+class IncidentSeverityCreate(IncidentSeverityBase):
+ """Pydantic model for creating an incident severity resource."""
+ pass
+
+
+class IncidentSeverityUpdate(IncidentSeverityBase):
+ """Pydantic model for updating an incident severity resource."""
+ pass
+
+
+class IncidentSeverityRead(IncidentSeverityBase):
+ """Pydantic model for reading an incident severity resource."""
+ id: PrimaryKey
+
+
+class IncidentSeverityReadMinimal(DispatchBase):
+ """Pydantic model for reading a minimal incident severity resource."""
+ id: PrimaryKey
+ color: str | None = None
+ default: bool | None = None
+ description: str | None = None
+ enabled: bool | None = None
+ name: NameStr
+ allowed_for_stable_incidents: bool | None = None
+
+
+class IncidentSeverityPagination(Pagination):
+ """Pydantic model for paginated incident severity results."""
+ items: list[IncidentSeverityRead] = []
diff --git a/src/dispatch/incident/severity/service.py b/src/dispatch/incident/severity/service.py
new file mode 100644
index 000000000000..7c09c2200e7d
--- /dev/null
+++ b/src/dispatch/incident/severity/service.py
@@ -0,0 +1,166 @@
+from pydantic import ValidationError
+
+from sqlalchemy.sql.expression import true
+
+from dispatch.project import service as project_service
+
+from .models import (
+ IncidentSeverity,
+ IncidentSeverityCreate,
+ IncidentSeverityRead,
+ IncidentSeverityUpdate,
+)
+
+
+def get(*, db_session, incident_severity_id: int) -> IncidentSeverity | None:
+ """Returns an incident severity based on the given severity id."""
+ return (
+ db_session.query(IncidentSeverity)
+ .filter(IncidentSeverity.id == incident_severity_id)
+ .one_or_none()
+ )
+
+
+def get_default(*, db_session, project_id: int):
+ """Returns the default incident severity."""
+ return (
+ db_session.query(IncidentSeverity)
+ .filter(IncidentSeverity.default == true())
+ .filter(IncidentSeverity.project_id == project_id)
+ .one_or_none()
+ )
+
+
+def get_default_or_raise(*, db_session, project_id: int) -> IncidentSeverity:
+ """Returns the default incident severity or raises a ValidationError if one doesn't exist."""
+ incident_severity = get_default(db_session=db_session, project_id=project_id)
+
+ if not incident_severity:
+ raise ValidationError.from_exception_data(
+ "IncidentSeverityRead",
+ [
+ {
+ "type": "value_error",
+ "loc": ("incident_severity",),
+ "input": None,
+ "msg": "No default incident severity defined.",
+ "ctx": {"error": ValueError("No default incident severity defined.")},
+ }
+ ],
+ )
+
+ return incident_severity
+
+
+def get_by_name(*, db_session, project_id: int, name: str) -> IncidentSeverity | None:
+ """Returns an incident severity based on the given severity name."""
+ return (
+ db_session.query(IncidentSeverity)
+ .filter(IncidentSeverity.name == name)
+ .filter(IncidentSeverity.project_id == project_id)
+ .one_or_none()
+ )
+
+
+def get_by_name_or_raise(
+ *, db_session, project_id: int, incident_severity_in=IncidentSeverityRead
+) -> IncidentSeverity:
+ """Returns the incident severity specified or raises ValidationError."""
+ incident_severity = get_by_name(
+ db_session=db_session, project_id=project_id, name=incident_severity_in.name
+ )
+
+ if not incident_severity:
+ raise ValidationError.from_exception_data(
+ "IncidentSeverityRead",
+ [
+ {
+ "type": "value_error",
+ "loc": ("incident_severity",),
+ "input": incident_severity_in.name,
+ "msg": "Incident severity not found.",
+ "ctx": {"error": ValueError("Incident severity not found.")},
+ }
+ ],
+ )
+
+ return incident_severity
+
+
+def get_by_name_or_default(
+ *, db_session, project_id: int, incident_severity_in=IncidentSeverityRead
+) -> IncidentSeverity:
+ """Returns an incident severity based on a name or the default if not specified."""
+ if incident_severity_in and incident_severity_in.name:
+ incident_severity = get_by_name(
+ db_session=db_session, project_id=project_id, name=incident_severity_in.name
+ )
+ if incident_severity:
+ return incident_severity
+ return get_default_or_raise(db_session=db_session, project_id=project_id)
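`get_by_name_or_default` tries an exact name match first and falls back to the project default only when nothing was requested or found. The same resolution order, sketched without a database (the in-memory rows and names here are hypothetical):

```python
# hypothetical (project_id, name) -> severity rows standing in for DB lookups
severities = {
    ("proj-1", "High"): "sev-high",
    ("proj-1", "Undetermined"): "sev-default",
}
DEFAULT_NAME = "Undetermined"

def get_by_name_or_default(project_id, requested_name):
    """Resolve a severity by name, falling back to the project default."""
    if requested_name:
        found = severities.get((project_id, requested_name))
        if found:
            return found
    # fall back to the project default; the real service raises if none exists
    return severities[(project_id, DEFAULT_NAME)]

print(get_by_name_or_default("proj-1", "High"))  # sev-high
print(get_by_name_or_default("proj-1", "Nope"))  # sev-default
print(get_by_name_or_default("proj-1", None))    # sev-default
```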
+
+
+def get_all(*, db_session, project_id: int = None) -> list[IncidentSeverity | None]:
+ """Returns all incident severities."""
+ if project_id:
+ return db_session.query(IncidentSeverity).filter(IncidentSeverity.project_id == project_id)
+
+ return db_session.query(IncidentSeverity).all()
+
+
+def get_all_enabled(*, db_session, project_id: int = None) -> list[IncidentSeverity | None]:
+ """Returns all enabled incident severities."""
+ if project_id:
+ return (
+ db_session.query(IncidentSeverity)
+ .filter(IncidentSeverity.project_id == project_id)
+ .filter(IncidentSeverity.enabled == true())
+ .order_by(IncidentSeverity.view_order)
+ )
+
+ return (
+ db_session.query(IncidentSeverity)
+ .filter(IncidentSeverity.enabled == true())
+ .order_by(IncidentSeverity.view_order)
+ )
+
+
+def create(*, db_session, incident_severity_in: IncidentSeverityCreate) -> IncidentSeverity:
+ """Creates an incident severity."""
+ project = project_service.get_by_name_or_raise(
+ db_session=db_session, project_in=incident_severity_in.project
+ )
+ incident_severity = IncidentSeverity(
+ **incident_severity_in.dict(exclude={"project", "color"}), project=project
+ )
+ if incident_severity_in.color:
+ incident_severity.color = incident_severity_in.color
+
+ db_session.add(incident_severity)
+ db_session.commit()
+
+ return incident_severity
+
+
+def update(
+ *, db_session, incident_severity: IncidentSeverity, incident_severity_in: IncidentSeverityUpdate
+) -> IncidentSeverity:
+ """Updates an incident severity."""
+ incident_severity_data = incident_severity.dict()
+
+ update_data = incident_severity_in.dict(exclude_unset=True, exclude={"project", "color"})
+
+ for field in incident_severity_data:
+ if field in update_data:
+ setattr(incident_severity, field, update_data[field])
+
+ if incident_severity_in.color:
+ incident_severity.color = incident_severity_in.color
+
+ db_session.commit()
+
+ return incident_severity
+
+
+def delete(*, db_session, incident_severity_id: int):
+ """Deletes an incident severity."""
+ db_session.query(IncidentSeverity).filter(IncidentSeverity.id == incident_severity_id).delete()
+ db_session.commit()
diff --git a/src/dispatch/incident/severity/views.py b/src/dispatch/incident/severity/views.py
new file mode 100644
index 000000000000..b65651bc61a6
--- /dev/null
+++ b/src/dispatch/incident/severity/views.py
@@ -0,0 +1,75 @@
+from fastapi import APIRouter, Depends, HTTPException, status
+
+from dispatch.auth.permissions import SensitiveProjectActionPermission, PermissionsDependency
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.models import PrimaryKey
+
+from .models import (
+ IncidentSeverityCreate,
+ IncidentSeverityPagination,
+ IncidentSeverityRead,
+ IncidentSeverityUpdate,
+)
+from .service import create, get, update
+
+
+router = APIRouter()
+
+
+@router.get("", response_model=IncidentSeverityPagination, tags=["incident_severities"])
+def get_incident_severities(common: CommonParameters):
+ """Returns all incident severities."""
+ return search_filter_sort_paginate(model="IncidentSeverity", **common)
+
+
+@router.post(
+ "",
+ response_model=IncidentSeverityRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def create_incident_severity(
+ db_session: DbSession,
+ incident_severity_in: IncidentSeverityCreate,
+):
+ """Creates a new incident severity."""
+ incident_severity = create(db_session=db_session, incident_severity_in=incident_severity_in)
+ return incident_severity
+
+
+@router.put(
+ "/{incident_severity_id}",
+ response_model=IncidentSeverityRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def update_incident_severity(
+ db_session: DbSession,
+ incident_severity_id: PrimaryKey,
+ incident_severity_in: IncidentSeverityUpdate,
+):
+ """Updates an existing incident severity."""
+ incident_severity = get(db_session=db_session, incident_severity_id=incident_severity_id)
+ if not incident_severity:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "An incident severity with this id does not exist."}],
+ )
+
+ incident_severity = update(
+ db_session=db_session,
+ incident_severity=incident_severity,
+ incident_severity_in=incident_severity_in,
+ )
+ return incident_severity
+
+
+@router.get("/{incident_severity_id}", response_model=IncidentSeverityRead)
+def get_incident_severity(db_session: DbSession, incident_severity_id: PrimaryKey):
+ """Gets an incident severity."""
+ incident_severity = get(db_session=db_session, incident_severity_id=incident_severity_id)
+ if not incident_severity:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "An incident severity with this id does not exist."}],
+ )
+ return incident_severity
diff --git a/src/dispatch/incident/type/__init__.py b/src/dispatch/incident/type/__init__.py
new file mode 100644
index 000000000000..e69de29bb2d1
diff --git a/src/dispatch/incident/type/config.py b/src/dispatch/incident/type/config.py
new file mode 100644
index 000000000000..9bf818f9fd5d
--- /dev/null
+++ b/src/dispatch/incident/type/config.py
@@ -0,0 +1,10 @@
+from dispatch.enums import Visibility
+
+default_incident_type = {
+ "name": "Default",
+ "description": "This is the default incident type.",
+ "visibility": Visibility.open,
+ "exclude_from_metrics": False,
+ "default": True,
+ "enabled": True,
+}
diff --git a/src/dispatch/incident/type/models.py b/src/dispatch/incident/type/models.py
new file mode 100644
index 000000000000..a46dc79eda91
--- /dev/null
+++ b/src/dispatch/incident/type/models.py
@@ -0,0 +1,179 @@
+"""Models for incident type resources in the Dispatch application."""
+
+from pydantic import field_validator, AnyHttpUrl
+
+from sqlalchemy import Column, Boolean, ForeignKey, Integer, String, JSON
+from sqlalchemy.event import listen
+from sqlalchemy.ext.hybrid import hybrid_method
+from sqlalchemy.orm import relationship
+from sqlalchemy.sql import false
+from sqlalchemy.sql.schema import UniqueConstraint
+from sqlalchemy_utils import TSVectorType
+
+
+from dispatch.cost_model.models import CostModelRead
+from dispatch.database.core import Base, ensure_unique_default_per_project
+from dispatch.enums import Visibility
+from dispatch.models import DispatchBase, ProjectMixin, Pagination
+from dispatch.models import NameStr, PrimaryKey
+from dispatch.plugin.models import PluginMetadata
+from dispatch.project.models import ProjectRead
+from dispatch.service.models import ServiceRead
+
+
+class IncidentType(ProjectMixin, Base):
+ """SQLAlchemy model for incident type resources."""
+
+ __table_args__ = (UniqueConstraint("name", "project_id"),)
+ id = Column(Integer, primary_key=True)
+ name = Column(String)
+ slug = Column(String)
+ description = Column(String)
+ exclude_from_metrics = Column(Boolean, default=False)
+
+ enabled = Column(Boolean, default=True)
+ default = Column(Boolean, default=False)
+ visibility = Column(String, default=Visibility.open)
+ exclude_from_reminders = Column(Boolean, default=False)
+ exclude_from_review = Column(Boolean, default=False)
+ plugin_metadata = Column(JSON, default=[])
+ task_plugin_metadata = Column(JSON, default=[])
+ generate_read_in_summary = Column(Boolean, default=False, server_default=false())
+
+ incident_template_document_id = Column(Integer, ForeignKey("document.id"))
+ incident_template_document = relationship(
+ "Document", foreign_keys=[incident_template_document_id]
+ )
+
+ executive_template_document_id = Column(Integer, ForeignKey("document.id"))
+ executive_template_document = relationship(
+ "Document", foreign_keys=[executive_template_document_id]
+ )
+
+ review_template_document_id = Column(Integer, ForeignKey("document.id"))
+ review_template_document = relationship("Document", foreign_keys=[review_template_document_id])
+
+ tracking_template_document_id = Column(Integer, ForeignKey("document.id"))
+ tracking_template_document = relationship(
+ "Document", foreign_keys=[tracking_template_document_id]
+ )
+
+ commander_service_id = Column(Integer, ForeignKey("service.id"))
+ commander_service = relationship("Service", foreign_keys=[commander_service_id])
+
+ liaison_service_id = Column(Integer, ForeignKey("service.id"))
+ liaison_service = relationship("Service", foreign_keys=[liaison_service_id])
+
+ # we use the "simple" search catalog so "named entities" are matched verbatim
+ search_vector = Column(TSVectorType("name", regconfig="pg_catalog.simple"))
+
+ cost_model_id = Column(Integer, ForeignKey("cost_model.id"), nullable=True, default=None)
+ cost_model = relationship(
+ "CostModel",
+ foreign_keys=[cost_model_id],
+ )
+
+ channel_description = Column(String, nullable=True)
+ description_service_id = Column(Integer, ForeignKey("service.id"))
+ description_service = relationship("Service", foreign_keys=[description_service_id])
+
+ @hybrid_method
+ def get_meta(self, slug):
+ """Retrieve plugin metadata by slug."""
+ if not self.plugin_metadata:
+ return
+
+ for m in self.plugin_metadata:
+ if m["slug"] == slug:
+ return m
+
+ @hybrid_method
+ def get_task_meta(self, slug):
+ """Retrieve task plugin metadata by slug."""
+ if not self.task_plugin_metadata:
+ return
+
+ for m in self.task_plugin_metadata:
+ if m["slug"] == slug:
+ return m
+
+
+listen(IncidentType.default, "set", ensure_unique_default_per_project)
+
+
+class Document(DispatchBase):
+ """Pydantic model for a document related to an incident type."""
+
+ id: PrimaryKey
+ name: NameStr
+ resource_type: str | None = None
+ resource_id: str | None = None
+ description: str | None = None
+ weblink: AnyHttpUrl | None = None
+
+
+# Pydantic models...
+class IncidentTypeBase(DispatchBase):
+ """Base Pydantic model for incident type resources."""
+
+ name: NameStr
+ visibility: str | None = None
+ description: str | None = None
+ enabled: bool | None = None
+ incident_template_document: Document | None = None
+ executive_template_document: Document | None = None
+ review_template_document: Document | None = None
+ tracking_template_document: Document | None = None
+ exclude_from_metrics: bool | None = False
+ exclude_from_reminders: bool | None = False
+ exclude_from_review: bool | None = False
+ default: bool | None = False
+ project: ProjectRead | None = None
+ plugin_metadata: list[PluginMetadata] = []
+ cost_model: CostModelRead | None = None
+ channel_description: str | None = None
+ description_service: ServiceRead | None = None
+ task_plugin_metadata: list[PluginMetadata] = []
+ generate_read_in_summary: bool | None = False
+
+ @field_validator("plugin_metadata", mode="before")
+ @classmethod
+ def replace_none_with_empty_list(cls, value):
+ """Ensure plugin_metadata is always a list, replacing None with an empty list."""
+ return [] if value is None else value
+
+
+class IncidentTypeCreate(IncidentTypeBase):
+ """Pydantic model for creating an incident type resource."""
+
+ pass
+
+
+class IncidentTypeUpdate(IncidentTypeBase):
+ """Pydantic model for updating an incident type resource."""
+
+ id: PrimaryKey | None = None
+
+
+class IncidentTypeRead(IncidentTypeBase):
+ """Pydantic model for reading an incident type resource."""
+
+ id: PrimaryKey
+
+
+class IncidentTypeReadMinimal(DispatchBase):
+ """Pydantic model for reading a minimal incident type resource."""
+
+ id: PrimaryKey
+ name: NameStr
+ visibility: str | None = None
+ description: str | None = None
+ enabled: bool | None = None
+ exclude_from_metrics: bool | None = False
+ default: bool | None = False
+
+
+class IncidentTypePagination(Pagination):
+ """Pydantic model for paginated incident type results."""
+
+ items: list[IncidentTypeRead] = []
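The `get_meta`/`get_task_meta` hybrid methods above are linear scans of the JSON metadata list for a matching slug. The lookup in isolation, with hypothetical metadata entries:

```python
plugin_metadata = [
    {"slug": "jira", "key": "project", "value": "SEC"},
    {"slug": "slack", "key": "channel", "value": "#incidents"},
]

def get_meta(metadata, slug):
    """Return the first metadata entry whose slug matches, else None."""
    if not metadata:
        return None  # column default is [] and may also be None
    for m in metadata:
        if m["slug"] == slug:
            return m
    return None

print(get_meta(plugin_metadata, "jira")["value"])  # SEC
print(get_meta(plugin_metadata, "missing"))        # None
```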
diff --git a/src/dispatch/incident/type/service.py b/src/dispatch/incident/type/service.py
new file mode 100644
index 000000000000..83f7631146c1
--- /dev/null
+++ b/src/dispatch/incident/type/service.py
@@ -0,0 +1,278 @@
+from pydantic import ValidationError
+
+from sqlalchemy.sql.expression import true
+
+from dispatch.incident_cost import service as incident_cost_service
+from dispatch.incident import service as incident_service
+from dispatch.cost_model import service as cost_model_service
+from dispatch.document import service as document_service
+from dispatch.project import service as project_service
+from dispatch.service import service as service_service
+
+from .models import IncidentType, IncidentTypeCreate, IncidentTypeRead, IncidentTypeUpdate
+
+
+def get(*, db_session, incident_type_id: int) -> IncidentType | None:
+ """Returns an incident type based on the given type id."""
+ return db_session.query(IncidentType).filter(IncidentType.id == incident_type_id).one_or_none()
+
+
+def get_default(*, db_session, project_id: int):
+ """Returns the default incident type."""
+ return (
+ db_session.query(IncidentType)
+ .filter(IncidentType.default == true())
+ .filter(IncidentType.project_id == project_id)
+ .one_or_none()
+ )
+
+
+def get_default_or_raise(*, db_session, project_id: int) -> IncidentType:
+ """Returns the default incident_type or raises a ValidationError if one doesn't exist."""
+ incident_type = get_default(db_session=db_session, project_id=project_id)
+
+ if not incident_type:
+ raise ValidationError.from_exception_data(
+ "IncidentTypeRead",
+ [
+ {
+ "type": "value_error",
+ "loc": ("incident_type",),
+ "input": None,
+ "ctx": {"error": ValueError("No default incident type defined.")},
+ }
+ ],
+ )
+ return incident_type
+
+
+def get_by_name(*, db_session, project_id: int, name: str) -> IncidentType | None:
+ """Returns an incident type based on the given type name."""
+ return (
+ db_session.query(IncidentType)
+ .filter(IncidentType.name == name)
+ .filter(IncidentType.project_id == project_id)
+ .one_or_none()
+ )
+
+
+def get_by_name_or_raise(
+ *, db_session, project_id: int, incident_type_in: IncidentTypeRead
+) -> IncidentType:
+ """Returns the incident_type specified or raises ValidationError."""
+ incident_type = get_by_name(
+ db_session=db_session, project_id=project_id, name=incident_type_in.name
+ )
+
+ if not incident_type:
+ raise ValidationError.from_exception_data(
+ "IncidentTypeRead",
+ [
+ {
+ "type": "value_error",
+ "loc": ("incident_type",),
+ "input": incident_type_in.name,
+ "ctx": {"error": ValueError("Incident type not found.")},
+ }
+ ],
+ )
+
+ return incident_type
+
+
+def get_by_name_or_default(
+ *, db_session, project_id: int, incident_type_in: IncidentTypeRead
+) -> IncidentType:
+ """Returns an incident_type based on a name or the default if not specified."""
+ if incident_type_in and incident_type_in.name:
+ incident_type = get_by_name(
+ db_session=db_session, project_id=project_id, name=incident_type_in.name
+ )
+ if incident_type:
+ return incident_type
+ return get_default_or_raise(db_session=db_session, project_id=project_id)
+
+
+def get_by_slug(*, db_session, project_id: int, slug: str) -> IncidentType | None:
+ """Returns an incident type based on the given type slug."""
+ return (
+ db_session.query(IncidentType)
+ .filter(IncidentType.slug == slug)
+ .filter(IncidentType.project_id == project_id)
+ .one_or_none()
+ )
+
+
+def get_all(*, db_session, project_id: int = None) -> list[IncidentType | None]:
+ """Returns all incident types."""
+ if project_id:
+ return db_session.query(IncidentType).filter(IncidentType.project_id == project_id)
+ return db_session.query(IncidentType)
+
+
+def get_all_enabled(*, db_session, project_id: int = None) -> list[IncidentType | None]:
+ """Returns all enabled incident types."""
+ if project_id:
+ return (
+ db_session.query(IncidentType)
+ .filter(IncidentType.project_id == project_id)
+ .filter(IncidentType.enabled == true())
+ .order_by(IncidentType.name)
+ )
+ return (
+ db_session.query(IncidentType)
+ .filter(IncidentType.enabled == true())
+ .order_by(IncidentType.name)
+ )
+
+
+def create(*, db_session, incident_type_in: IncidentTypeCreate) -> IncidentType:
+ """Creates an incident type."""
+ project = project_service.get_by_name_or_raise(
+ db_session=db_session, project_in=incident_type_in.project
+ )
+ incident_type = IncidentType(
+ **incident_type_in.dict(
+ exclude={
+ "incident_template_document",
+ "executive_template_document",
+ "tracking_template_document",
+ "review_template_document",
+ "cost_model",
+ "project",
+ "description_service",
+ }
+ ),
+ project=project,
+ )
+
+ if incident_type_in.cost_model:
+ cost_model = cost_model_service.get_cost_model_by_id(
+ db_session=db_session, cost_model_id=incident_type_in.cost_model.id
+ )
+ incident_type.cost_model = cost_model
+
+ if incident_type_in.incident_template_document:
+ incident_template_document = document_service.get(
+ db_session=db_session, document_id=incident_type_in.incident_template_document.id
+ )
+ incident_type.incident_template_document = incident_template_document
+
+ if incident_type_in.executive_template_document:
+ executive_template_document = document_service.get(
+ db_session=db_session, document_id=incident_type_in.executive_template_document.id
+ )
+ incident_type.executive_template_document = executive_template_document
+
+ if incident_type_in.review_template_document:
+ review_template_document = document_service.get(
+ db_session=db_session, document_id=incident_type_in.review_template_document.id
+ )
+ incident_type.review_template_document = review_template_document
+
+ if incident_type_in.tracking_template_document:
+ tracking_template_document = document_service.get(
+ db_session=db_session, document_id=incident_type_in.tracking_template_document.id
+ )
+ incident_type.tracking_template_document = tracking_template_document
+
+ if incident_type_in.description_service:
+ service = service_service.get(
+ db_session=db_session, service_id=incident_type_in.description_service.id
+ )
+ if service:
+ incident_type.description_service_id = service.id
+
+ db_session.add(incident_type)
+ db_session.commit()
+ return incident_type
+
+
+def update(
+ *, db_session, incident_type: IncidentType, incident_type_in: IncidentTypeUpdate
+) -> IncidentType:
+ """Updates an incident type.
+
+ If the cost model is updated, we need to update the costs of all incidents associated with this incident type.
+ """
+ cost_model = None
+ should_update_incident_cost = False
+ if incident_type_in.cost_model:
+ cost_model = cost_model_service.get_cost_model_by_id(
+ db_session=db_session, cost_model_id=incident_type_in.cost_model.id
+ )
+ should_update_incident_cost = incident_type.cost_model != cost_model
+ incident_type.cost_model = cost_model
+
+ # Calculate the cost of all non-closed incidents associated with this incident type
+ incidents = incident_service.get_all_open_by_incident_type(
+ db_session=db_session, incident_type_id=incident_type.id
+ )
+ for incident in incidents:
+ if incident is not None:
+ incident_cost_service.calculate_incident_response_cost(
+ incident_id=incident.id, db_session=db_session, incident_review=False
+ )
+
+ if incident_type_in.incident_template_document:
+ incident_template_document = document_service.get(
+ db_session=db_session, document_id=incident_type_in.incident_template_document.id
+ )
+ incident_type.incident_template_document = incident_template_document
+
+ if incident_type_in.executive_template_document:
+ executive_template_document = document_service.get(
+ db_session=db_session, document_id=incident_type_in.executive_template_document.id
+ )
+ incident_type.executive_template_document = executive_template_document
+
+ if incident_type_in.review_template_document:
+ review_template_document = document_service.get(
+ db_session=db_session, document_id=incident_type_in.review_template_document.id
+ )
+ incident_type.review_template_document = review_template_document
+
+ if incident_type_in.tracking_template_document:
+ tracking_template_document = document_service.get(
+ db_session=db_session, document_id=incident_type_in.tracking_template_document.id
+ )
+ incident_type.tracking_template_document = tracking_template_document
+
+ if incident_type_in.description_service:
+ service = service_service.get(
+ db_session=db_session, service_id=incident_type_in.description_service.id
+ )
+ if service:
+ incident_type.description_service_id = service.id
+
+ incident_type_data = incident_type.dict()
+
+ update_data = incident_type_in.dict(
+ exclude_unset=True,
+ exclude={
+ "incident_template_document",
+ "executive_template_document",
+ "tracking_template_document",
+ "review_template_document",
+ "cost_model",
+ "description_service",
+ },
+ )
+
+ for field in incident_type_data:
+ if field in update_data:
+ setattr(incident_type, field, update_data[field])
+
+ db_session.commit()
+
+ if should_update_incident_cost:
+ incident_cost_service.update_incident_response_cost_for_incident_type(
+ db_session=db_session, incident_type=incident_type
+ )
+
+ return incident_type
+
+
+def delete(*, db_session, incident_type_id: int):
+ """Deletes an incident type."""
+ db_session.query(IncidentType).filter(IncidentType.id == incident_type_id).delete()
+ db_session.commit()
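The `update` service above copies only fields the caller explicitly set (`dict(exclude_unset=True)`) onto the stored row, handling relationship fields separately beforehand. A dependency-free sketch of that partial-update pattern (the `Record`/`RecordUpdate` names are hypothetical stand-ins, not Dispatch classes):

```python
class Record:
    """Stand-in for a persisted model instance."""

    def __init__(self, name, description, enabled):
        self.name = name
        self.description = description
        self.enabled = enabled


class RecordUpdate:
    """Stand-in for a Pydantic update schema."""

    def __init__(self, **fields):
        # mimic exclude_unset: remember only what the caller passed in
        self._set_fields = dict(fields)

    def dict(self, exclude_unset=False, exclude=()):
        return {k: v for k, v in self._set_fields.items() if k not in exclude}


def apply_update(record, record_in, protected=("name",)):
    """Copy explicitly-set, non-protected fields onto the stored record."""
    update_data = record_in.dict(exclude_unset=True, exclude=protected)
    for field, value in update_data.items():
        if hasattr(record, field):
            setattr(record, field, value)
    return record


record = Record(name="sev1", description="old", enabled=True)
apply_update(record, RecordUpdate(description="new"))
print(record.name, record.description, record.enabled)  # sev1 new True
```

Fields omitted from the update (here `enabled`) keep their current values, which is exactly what `exclude_unset=True` buys in the service code.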
diff --git a/src/dispatch/incident/type/views.py b/src/dispatch/incident/type/views.py
new file mode 100644
index 000000000000..1d8eb346dd28
--- /dev/null
+++ b/src/dispatch/incident/type/views.py
@@ -0,0 +1,68 @@
+from fastapi import APIRouter, Depends, HTTPException, status
+
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.auth.permissions import SensitiveProjectActionPermission, PermissionsDependency
+from dispatch.models import PrimaryKey
+
+from .models import IncidentTypeCreate, IncidentTypePagination, IncidentTypeRead, IncidentTypeUpdate
+from .service import create, get, update
+
+
+router = APIRouter()
+
+
+@router.get("", response_model=IncidentTypePagination, tags=["incident_types"])
+def get_incident_types(common: CommonParameters):
+ """Returns all incident types."""
+ return search_filter_sort_paginate(model="IncidentType", **common)
+
+
+@router.post(
+ "",
+ response_model=IncidentTypeRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def create_incident_type(
+ db_session: DbSession,
+ incident_type_in: IncidentTypeCreate,
+):
+ """Create a new incident type."""
+ incident_type = create(db_session=db_session, incident_type_in=incident_type_in)
+ return incident_type
+
+
+@router.put(
+ "/{incident_type_id}",
+ response_model=IncidentTypeRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def update_incident_type(
+ db_session: DbSession,
+ incident_type_id: PrimaryKey,
+ incident_type_in: IncidentTypeUpdate,
+):
+ """Update an existing incident type."""
+ incident_type = get(db_session=db_session, incident_type_id=incident_type_id)
+ if not incident_type:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "The incident type with this id does not exist."}],
+ )
+
+ incident_type = update(
+ db_session=db_session, incident_type=incident_type, incident_type_in=incident_type_in
+ )
+ return incident_type
+
+
+@router.get("/{incident_type_id}", response_model=IncidentTypeRead)
+def get_incident_type(db_session: DbSession, incident_type_id: PrimaryKey):
+ """Get an incident type."""
+ incident_type = get(db_session=db_session, incident_type_id=incident_type_id)
+ if not incident_type:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "The incident type with this id does not exist."}],
+ )
+ return incident_type
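The views above guard mutating routes with `PermissionsDependency([...])`. A rough sketch of the all-must-pass composition idea behind that dependency (the classes here are illustrative stand-ins, not Dispatch's actual permission classes):

```python
class PermissionDenied(Exception):
    pass


class AllowAll:
    def has_required_permissions(self, request) -> bool:
        return True


class RequireAdmin:
    def has_required_permissions(self, request) -> bool:
        return request.get("role") == "admin"


class PermissionsChain:
    """Reject the request unless every listed permission passes."""

    def __init__(self, permissions):
        self.permissions = permissions

    def check(self, request):
        for permission_cls in self.permissions:
            if not permission_cls().has_required_permissions(request):
                raise PermissionDenied(permission_cls.__name__)


chain = PermissionsChain([AllowAll, RequireAdmin])
try:
    chain.check({"role": "member"})
    allowed = True
except PermissionDenied:
    allowed = False
print(allowed)  # False
```

Attaching the chain at the router decorator (as the views do via `dependencies=[Depends(...)]`) keeps the handler bodies free of authorization branching.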
diff --git a/src/dispatch/incident/views.py b/src/dispatch/incident/views.py
index 797e69325ba9..b901f1f35288 100644
--- a/src/dispatch/incident/views.py
+++ b/src/dispatch/incident/views.py
@@ -1,144 +1,608 @@
-from typing import List
+import calendar
+import json
+import logging
+from datetime import date, datetime
+from typing import Annotated
+from dateutil.relativedelta import relativedelta
+from dispatch.ai.models import TacticalReportResponse
+from fastapi import APIRouter, BackgroundTasks, Depends, HTTPException, Query, status
+from sqlalchemy.exc import IntegrityError
+from starlette.requests import Request
-from fastapi import APIRouter, BackgroundTasks, Depends, HTTPException, Query
-from sqlalchemy.orm import Session
+from dispatch.ai import service as ai_service
+from dispatch.auth.permissions import (
+ IncidentEditPermission,
+ IncidentEventPermission,
+ IncidentJoinOrSubscribePermission,
+ IncidentViewPermission,
+ PermissionsDependency,
+)
+from dispatch.auth.service import CurrentUser
+from dispatch.common.utils.views import create_pydantic_include
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.event import flows as event_flows
+from dispatch.event.models import EventCreateMinimal, EventUpdate
+from dispatch.incident.enums import IncidentStatus
+from dispatch.individual.models import IndividualContactRead
+from dispatch.individual.service import get_or_create
+from dispatch.models import OrganizationSlug, PrimaryKey
+from dispatch.participant.models import ParticipantUpdate
+from dispatch.project import service as project_service
+from dispatch.report import flows as report_flows
+from dispatch.report.models import ExecutiveReportCreate, TacticalReportCreate
-from dispatch.auth.service import get_current_user
-from dispatch.database import get_db, search_filter_sort_paginate
-
-from dispatch.participant_role.models import ParticipantRoleType
-
-from .flows import incident_create_flow, incident_update_flow, incident_assign_role_flow
-from .models import IncidentCreate, IncidentPagination, IncidentRead, IncidentUpdate
+from .flows import (
+ incident_add_or_reactivate_participant_flow,
+ incident_create_closed_flow,
+ incident_create_flow,
+ incident_create_resources_flow,
+ incident_create_stable_flow,
+ incident_delete_flow,
+ incident_remove_participant_flow,
+ incident_subscribe_participant_flow,
+ incident_update_flow,
+)
+from .metrics import create_incident_metric_query, make_forecast
+from .models import (
+ Incident,
+ IncidentCreate,
+ IncidentExpandedPagination,
+ IncidentPagination,
+ IncidentRead,
+ IncidentUpdate,
+)
from .service import create, delete, get, update
-from .metrics import make_forecast
+
+log = logging.getLogger(__name__)
router = APIRouter()
-# TODO add additional routes to get incident by e.g. deeplink
-@router.get("/", response_model=IncidentPagination, summary="Retrieve a list of all incidents.")
+
+def get_current_incident(db_session: DbSession, request: Request) -> Incident:
+ """Fetches incident or returns a 404."""
+ incident = get(db_session=db_session, incident_id=request.path_params["incident_id"])
+ if not incident:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "An incident with this id does not exist."}],
+ )
+ return incident
+
+
+CurrentIncident = Annotated[Incident, Depends(get_current_incident)]
+
+
+@router.get("", summary="Retrieve a list of incidents.")
def get_incidents(
- db_session: Session = Depends(get_db),
- page: int = 1,
- items_per_page: int = Query(5, alias="itemsPerPage"),
- query_str: str = Query(None, alias="q"),
- sort_by: List[str] = Query(None, alias="sortBy[]"),
- descending: List[bool] = Query(None, alias="descending[]"),
- fields: List[str] = Query(None, alias="fields[]"),
- ops: List[str] = Query(None, alias="ops[]"),
- values: List[str] = Query(None, alias="values[]"),
+ common: CommonParameters,
+ include: list[str] = Query([], alias="include[]"),
+ expand: bool = Query(default=False),
):
- """
- Retrieve a list of all incidents.
- """
- return search_filter_sort_paginate(
- db_session=db_session,
- model="Incident",
- query_str=query_str,
- page=page,
- items_per_page=items_per_page,
- sort_by=sort_by,
- descending=descending,
- fields=fields,
- values=values,
- ops=ops,
- )
+ """Retrieves a list of incidents."""
+ pagination = search_filter_sort_paginate(model="Incident", **common)
+ if expand:
+ return json.loads(IncidentExpandedPagination(**pagination).json())
-@router.get("/{incident_id}", response_model=IncidentRead, summary="Retrieve a single incident.")
-def get_incident(*, db_session: Session = Depends(get_db), incident_id: str):
- """
- Retrieve details about a specific incident.
- """
- incident = get(db_session=db_session, incident_id=incident_id)
- if not incident:
- raise HTTPException(status_code=404, detail="The requested incident does not exist.")
- return incident
+ if include:
+ # only allow two levels for now
+ include_sets = create_pydantic_include(include)
+
+ include_fields = {
+ "items": {"__all__": include_sets},
+ "itemsPerPage": ...,
+ "page": ...,
+ "total": ...,
+ }
+ return json.loads(IncidentExpandedPagination(**pagination).json(include=include_fields))
+ return json.loads(IncidentPagination(**pagination).json())
+
+
+@router.get(
+ "/{incident_id}",
+ response_model=IncidentRead,
+ summary="Retrieves a single incident.",
+ dependencies=[Depends(PermissionsDependency([IncidentViewPermission]))],
+)
+def get_incident(
+ incident_id: PrimaryKey,
+ db_session: DbSession,
+ current_incident: CurrentIncident,
+):
+ """Retrieves the details of a single incident."""
+ return current_incident
-@router.post("/", response_model=IncidentRead, summary="Create a new incident.")
+@router.post("", response_model=IncidentRead, summary="Creates a new incident.")
def create_incident(
- *,
- db_session: Session = Depends(get_db),
+ db_session: DbSession,
+ organization: OrganizationSlug,
incident_in: IncidentCreate,
- current_user_email: str = Depends(get_current_user),
+ current_user: CurrentUser,
background_tasks: BackgroundTasks,
):
- """
- Create a new incident.
- """
- incident = create(
- db_session=db_session, reporter_email=current_user_email, **incident_in.dict()
- )
+ """Creates a new incident."""
+ if not incident_in.reporter:
+ # Ensure the individual exists, create if not
+ if incident_in.project is None:
+ raise HTTPException(
+ status.HTTP_422_UNPROCESSABLE_ENTITY,
+ detail=[{"msg": "Project must be set to create reporter individual."}],
+ )
+ # Fetch the full DB project instance
+ project = project_service.get_by_name_or_default(
+ db_session=db_session, project_in=incident_in.project
+ )
+ individual = get_or_create(
+ db_session=db_session,
+ email=current_user.email,
+ project=project,
+ )
+ incident_in.reporter = ParticipantUpdate(
+ individual=IndividualContactRead(id=individual.id, email=individual.email)
+ )
+ incident = create(db_session=db_session, incident_in=incident_in)
- background_tasks.add_task(incident_create_flow, incident_id=incident.id)
+ if incident.status == IncidentStatus.stable:
+ background_tasks.add_task(
+ incident_create_stable_flow, incident_id=incident.id, organization_slug=organization
+ )
+ elif incident.status == IncidentStatus.closed:
+ background_tasks.add_task(
+ incident_create_closed_flow, incident_id=incident.id, organization_slug=organization
+ )
+ else:
+ background_tasks.add_task(
+ incident_create_flow, incident_id=incident.id, organization_slug=organization
+ )
return incident
-@router.put("/{incident_id}", response_model=IncidentRead, summary="Update an existing incident.")
+@router.post(
+ "/{incident_id}/resources",
+ response_model=IncidentRead,
+ summary="Creates resources for an existing incident.",
+ dependencies=[Depends(PermissionsDependency([IncidentViewPermission]))],
+)
+def create_incident_resources(
+ organization: OrganizationSlug,
+ incident_id: PrimaryKey,
+ current_incident: CurrentIncident,
+ background_tasks: BackgroundTasks,
+):
+ """Creates resources for an existing incident."""
+ background_tasks.add_task(
+ incident_create_resources_flow, organization_slug=organization, incident_id=incident_id
+ )
+
+ return current_incident
+
+
+@router.put(
+ "/{incident_id}",
+ response_model=IncidentRead,
+ summary="Updates an existing incident.",
+ dependencies=[Depends(PermissionsDependency([IncidentEditPermission]))],
+)
def update_incident(
- *,
- db_session: Session = Depends(get_db),
- incident_id: str,
+ db_session: DbSession,
+ current_incident: CurrentIncident,
+ organization: OrganizationSlug,
+ incident_id: PrimaryKey,
incident_in: IncidentUpdate,
- current_user_email: str = Depends(get_current_user),
+ current_user: CurrentUser,
background_tasks: BackgroundTasks,
):
- """
- Update an individual incident.
- """
- incident = get(db_session=db_session, incident_id=incident_id)
- if not incident:
- raise HTTPException(status_code=404, detail="The requested incident does not exist.")
+ """Updates an existing incident."""
+ # we store the previous state of the incident in order to be able to detect changes
+ previous_incident = IncidentRead.from_orm(current_incident)
- previous_incident = IncidentRead.from_orm(incident)
-
- # NOTE: Order matters we have to get the previous state for change detection
- incident = update(db_session=db_session, incident=incident, incident_in=incident_in)
+ # we update the incident
+ incident = update(db_session=db_session, incident=current_incident, incident_in=incident_in)
+ # we run the incident update flow
background_tasks.add_task(
incident_update_flow,
- user_email=current_user_email,
- incident_id=incident.id,
+ user_email=current_user.email,
+ commander_email=incident_in.commander.individual.email,
+ reporter_email=incident_in.reporter.individual.email,
+ incident_id=incident_id,
previous_incident=previous_incident,
+ organization_slug=organization,
)
- # assign commander
+ return incident
+
+
+@router.delete(
+ "/{incident_id}",
+ response_model=None,
+ summary="Deletes an incident and its external resources.",
+ dependencies=[Depends(PermissionsDependency([IncidentEditPermission]))],
+)
+def delete_incident(
+ incident_id: PrimaryKey,
+ db_session: DbSession,
+ current_incident: CurrentIncident,
+):
+ """Deletes an incident and its external resources."""
+ # we run the incident delete flow
+ incident_delete_flow(incident=current_incident, db_session=db_session)
+
+ # we delete the internal incident
+ try:
+ delete(incident_id=current_incident.id, db_session=db_session)
+ except IntegrityError as e:
+ log.exception(e)
+ raise HTTPException(
+ status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
+ detail=[
+ {
+ "msg": (
+ f"Incident {current_incident.name} could not be deleted. Make sure the incident has no "
+ "relationships to other incidents or cases before deleting it."
+ )
+ }
+ ],
+ ) from None
+
+
+@router.post(
+ "/{incident_id}/join",
+ summary="Adds an individual to an incident.",
+ dependencies=[Depends(PermissionsDependency([IncidentJoinOrSubscribePermission]))],
+)
+def join_incident(
+ db_session: DbSession,
+ organization: OrganizationSlug,
+ incident_id: PrimaryKey,
+ current_incident: CurrentIncident,
+ current_user: CurrentUser,
+ background_tasks: BackgroundTasks,
+):
+ """Adds an individual to an incident."""
background_tasks.add_task(
- incident_assign_role_flow,
- current_user_email,
- incident_id=incident.id,
- assignee_email=incident_in.commander.email,
- assignee_role=ParticipantRoleType.incident_commander,
+ incident_add_or_reactivate_participant_flow,
+ current_user.email,
+ incident_id=current_incident.id,
+ organization_slug=organization,
)
- # assign reporter
+
+@router.post(
+ "/{incident_id}/subscribe",
+ summary="Subscribes an individual to an incident.",
+ dependencies=[Depends(PermissionsDependency([IncidentJoinOrSubscribePermission]))],
+)
+def subscribe_to_incident(
+ db_session: DbSession,
+ organization: OrganizationSlug,
+ incident_id: PrimaryKey,
+ current_incident: CurrentIncident,
+ current_user: CurrentUser,
+ background_tasks: BackgroundTasks,
+):
+ """Subscribes an individual to an incident."""
background_tasks.add_task(
- incident_assign_role_flow,
- current_user_email,
- incident_id=incident.id,
- assignee_email=incident_in.reporter.email,
- assignee_role=ParticipantRoleType.reporter,
+ incident_subscribe_participant_flow,
+ current_user.email,
+ incident_id=current_incident.id,
+ organization_slug=organization,
)
- return incident
+@router.delete(
+ "/{incident_id}/remove/{email}",
+ summary="Removes an individual from an incident.",
+ dependencies=[Depends(PermissionsDependency([IncidentEditPermission]))],
+)
+def remove_participant_from_incident(
+ db_session: DbSession,
+ organization: OrganizationSlug,
+ incident_id: PrimaryKey,
+ email: str,
+ current_incident: CurrentIncident,
+ current_user: CurrentUser,
+ background_tasks: BackgroundTasks,
+):
+ """Removes an individual from an incident."""
+ background_tasks.add_task(
+ incident_remove_participant_flow,
+ email,
+ incident_id=current_incident.id,
+ organization_slug=organization,
+ )
-@router.delete("/{incident_id}", response_model=IncidentRead, summary="Delete an incident.")
-def delete_incident(*, db_session: Session = Depends(get_db), incident_id: str):
- """
- Delete an individual incident.
- """
- incident = get(db_session=db_session, incident_id=incident_id)
- if not incident:
- raise HTTPException(status_code=404, detail="The requested incident does not exist.")
- delete(db_session=db_session, incident_id=incident.id)
+@router.post(
+ "/{incident_id}/add/{email}",
+ summary="Adds an individual to an incident.",
+ dependencies=[Depends(PermissionsDependency([IncidentEditPermission]))],
+)
+def add_participant_to_incident(
+ db_session: DbSession,
+ organization: OrganizationSlug,
+ incident_id: PrimaryKey,
+ email: str,
+ current_incident: CurrentIncident,
+ current_user: CurrentUser,
+ background_tasks: BackgroundTasks,
+):
+ """Adds an individual to an incident."""
+ background_tasks.add_task(
+ incident_add_or_reactivate_participant_flow,
+ email,
+ incident_id=current_incident.id,
+ organization_slug=organization,
+ )
+
+
+@router.post(
+ "/{incident_id}/report/tactical",
+ summary="Creates a tactical report.",
+ dependencies=[Depends(PermissionsDependency([IncidentEditPermission]))],
+)
+def create_tactical_report(
+ db_session: DbSession,
+ organization: OrganizationSlug,
+ incident_id: PrimaryKey,
+ tactical_report_in: TacticalReportCreate,
+ current_user: CurrentUser,
+ current_incident: CurrentIncident,
+ background_tasks: BackgroundTasks,
+):
+ """Creates a tactical report."""
+ background_tasks.add_task(
+ report_flows.create_tactical_report,
+ user_email=current_user.email,
+ incident_id=current_incident.id,
+ tactical_report_in=tactical_report_in,
+ organization_slug=organization,
+ )
-@router.get("/metric/forecast/{incident_type}", summary="Get incident forecast data.")
-def get_incident_forecast(*, db_session: Session = Depends(get_db), incident_type: str):
+@router.get(
+ "/{incident_id}/report/tactical/generate",
+ summary="Auto-generate a tactical report based on Slack conversation contents.",
+ dependencies=[Depends(PermissionsDependency([IncidentEditPermission]))],
+)
+def generate_tactical_report(
+ db_session: DbSession,
+ current_incident: CurrentIncident,
+) -> TacticalReportResponse:
"""
- Get incident forecast data.
+ Auto-generate a tactical report. Requires an enabled Artificial Intelligence plugin.
"""
- return make_forecast(db_session=db_session, incident_type=incident_type)
+ if not current_incident.conversation or not current_incident.conversation.channel_id:
+ return TacticalReportResponse(error_message=f"No channel id found for incident {current_incident.id}")
+ response = ai_service.generate_tactical_report(
+ db_session=db_session,
+ incident=current_incident,
+ project=current_incident.project
+ )
+ if not response.tactical_report:
+ raise HTTPException(
+ status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
+ detail=[{"msg": response.error_message or "Unknown error generating tactical report."}],
+ )
+ return response
+
+
+@router.post(
+ "/{incident_id}/report/executive",
+ summary="Creates an executive report.",
+ dependencies=[Depends(PermissionsDependency([IncidentEditPermission]))],
+)
+def create_executive_report(
+ db_session: DbSession,
+ organization: OrganizationSlug,
+ incident_id: PrimaryKey,
+ current_incident: CurrentIncident,
+ executive_report_in: ExecutiveReportCreate,
+ current_user: CurrentUser,
+ background_tasks: BackgroundTasks,
+):
+ """Creates an executive report."""
+ background_tasks.add_task(
+ report_flows.create_executive_report,
+ user_email=current_user.email,
+ incident_id=current_incident.id,
+ executive_report_in=executive_report_in,
+ organization_slug=organization,
+ )
+
+
+@router.post(
+ "/{incident_id}/event",
+ summary="Creates a custom event.",
+ dependencies=[Depends(PermissionsDependency([IncidentEventPermission]))],
+)
+def create_custom_event(
+ db_session: DbSession,
+ organization: OrganizationSlug,
+ incident_id: PrimaryKey,
+ current_incident: CurrentIncident,
+ event_in: EventCreateMinimal,
+ current_user: CurrentUser,
+ background_tasks: BackgroundTasks,
+):
+ """Creates a custom event."""
+ event_in.details.update({"created_by": current_user.email, "added_on": str(datetime.utcnow())})
+ background_tasks.add_task(
+ event_flows.log_incident_event,
+ user_email=current_user.email,
+ incident_id=current_incident.id,
+ event_in=event_in,
+ organization_slug=organization,
+ )
+
+
+@router.patch(
+ "/{incident_id}/event",
+ summary="Updates a custom event.",
+ dependencies=[Depends(PermissionsDependency([IncidentEventPermission]))],
+)
+def update_custom_event(
+ db_session: DbSession,
+ organization: OrganizationSlug,
+ incident_id: PrimaryKey,
+ current_incident: CurrentIncident,
+ event_in: EventUpdate,
+ current_user: CurrentUser,
+ background_tasks: BackgroundTasks,
+):
+ """Updates a custom event."""
+ if event_in.details:
+ event_in.details.update(
+ {
+ "updated_by": current_user.email,
+ "updated_on": str(datetime.utcnow()),
+ }
+ )
+ else:
+ event_in.details = {"updated_by": current_user.email, "updated_on": str(datetime.utcnow())}
+ background_tasks.add_task(
+ event_flows.update_incident_event,
+ event_in=event_in,
+ organization_slug=organization,
+ )
+
+
+@router.post(
+ "/{incident_id}/exportTimeline",
+ summary="Exports timeline events.",
+ dependencies=[Depends(PermissionsDependency([IncidentEventPermission]))],
+)
+def export_timeline_event(
+ db_session: DbSession,
+ organization: OrganizationSlug,
+ incident_id: PrimaryKey,
+ current_incident: CurrentIncident,
+ timeline_filters: dict,
+ current_user: CurrentUser,
+ background_tasks: BackgroundTasks,
+):
+ try:
+ event_flows.export_timeline(
+ timeline_filters=timeline_filters,
+ incident_id=incident_id,
+ organization_slug=organization,
+ db_session=db_session,
+ )
+ except Exception as e:
+ log.exception(e)
+ raise HTTPException(
+ status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
+ detail=[{"msg": f"{str(e)}."}],
+ ) from e
+
+
+@router.delete(
+ "/{incident_id}/event/{event_uuid}",
+ summary="Deletes a custom event.",
+ dependencies=[Depends(PermissionsDependency([IncidentEventPermission]))],
+)
+def delete_custom_event(
+ db_session: DbSession,
+ organization: OrganizationSlug,
+ incident_id: PrimaryKey,
+ current_incident: CurrentIncident,
+ event_uuid: str,
+ current_user: CurrentUser,
+ background_tasks: BackgroundTasks,
+):
+ """Deletes a custom event."""
+ background_tasks.add_task(
+ event_flows.delete_incident_event,
+ event_uuid=event_uuid,
+ organization_slug=organization,
+ )
+
+
+def get_month_range(relative):
+ today = date.today()
+ relative_month = today - relativedelta(months=relative)
+ _, month_end_day = calendar.monthrange(relative_month.year, relative_month.month)
+ month_start = relative_month.replace(day=1)
+ month_end = relative_month.replace(day=month_end_day)
+ return month_start, month_end
+
+
+@router.get("/metric/forecast", summary="Gets incident forecast data.")
+def get_incident_forecast(
+ db_session: DbSession,
+ common: CommonParameters,
+):
+ """Gets incident forecast data."""
+ categories = []
+ predicted = []
+ actual = []
+
+ for i in reversed(range(1, 5)):
+ start_date, end_date = get_month_range(i)
+
+ if i == 1:
+ incidents = create_incident_metric_query(
+ db_session=db_session,
+ filter_spec=common["filter_spec"],
+ end_date=end_date,
+ )
+ predicted_months, predicted_counts = make_forecast(incidents=incidents)
+ categories = categories + predicted_months
+ predicted = predicted + predicted_counts
+
+ else:
+ incidents = create_incident_metric_query(
+ db_session=db_session, filter_spec=common["filter_spec"], end_date=end_date
+ )
+
+ # get only first predicted month for completed months
+ predicted_months, predicted_counts = make_forecast(incidents=incidents)
+ if predicted_months and predicted_counts:
+ categories.append(predicted_months[0])
+ predicted.append(predicted_counts[0])
+
+ # get actual month counts
+ incidents = create_incident_metric_query(
+ db_session=db_session,
+ filter_spec=common["filter_spec"],
+ end_date=end_date,
+ start_date=start_date,
+ )
+
+ actual.append(len(incidents))
+
+ if not predicted:
+ return {
+ "categories": categories,
+ "series": [
+ {"name": "Predicted", "data": []},
+ {"name": "Actual", "data": []},
+ ],
+ }
+
+ return {
+ "categories": categories,
+ "series": [
+ {"name": "Predicted", "data": predicted},
+ {"name": "Actual", "data": actual[1:]},
+ ],
+ }
+
+
+@router.get(
+ "/{incident_id}/regenerate",
+ summary="Regenerates incident summary",
+ dependencies=[Depends(PermissionsDependency([IncidentEventPermission]))],
+)
+def generate_summary(
+ db_session: DbSession,
+ current_incident: CurrentIncident,
+):
+ """Regenerates the incident summary."""
+ return ai_service.generate_incident_summary(
+ db_session=db_session,
+ incident=current_incident,
+ )
diff --git a/src/dispatch/incident_cost/__init__.py b/src/dispatch/incident_cost/__init__.py
new file mode 100644
index 000000000000..e69de29bb2d1
diff --git a/src/dispatch/incident_cost/models.py b/src/dispatch/incident_cost/models.py
new file mode 100644
index 000000000000..6c89a077f92a
--- /dev/null
+++ b/src/dispatch/incident_cost/models.py
@@ -0,0 +1,48 @@
+from datetime import datetime
+
+from sqlalchemy import Column, ForeignKey, Integer, Numeric
+from sqlalchemy.ext.associationproxy import association_proxy
+from sqlalchemy.orm import relationship
+
+from dispatch.database.core import Base
+from dispatch.incident_cost_type.models import IncidentCostTypeRead
+from dispatch.models import DispatchBase, Pagination, PrimaryKey, ProjectMixin, TimeStampMixin
+from dispatch.project.models import ProjectRead
+
+
+# SQLAlchemy Model
+class IncidentCost(Base, TimeStampMixin, ProjectMixin):
+ # columns
+ id = Column(Integer, primary_key=True)
+ amount = Column(Numeric(precision=10, scale=2), nullable=True)
+
+ # relationships
+ incident_cost_type = relationship("IncidentCostType", backref="incident_cost")
+ incident_cost_type_id = Column(Integer, ForeignKey("incident_cost_type.id"))
+ incident_id = Column(Integer, ForeignKey("incident.id", ondelete="CASCADE"))
+ search_vector = association_proxy("incident_cost_type", "search_vector")
+
+
+# Pydantic Models
+class IncidentCostBase(DispatchBase):
+ amount: float = 0
+
+
+class IncidentCostCreate(IncidentCostBase):
+ incident_cost_type: IncidentCostTypeRead
+ project: ProjectRead
+
+
+class IncidentCostUpdate(IncidentCostBase):
+ id: PrimaryKey | None = None
+ incident_cost_type: IncidentCostTypeRead
+
+
+class IncidentCostRead(IncidentCostBase):
+ id: PrimaryKey
+ incident_cost_type: IncidentCostTypeRead
+ updated_at: datetime | None = None
+
+
+class IncidentCostPagination(Pagination):
+ items: list[IncidentCostRead] = []
diff --git a/src/dispatch/incident_cost/scheduled.py b/src/dispatch/incident_cost/scheduled.py
new file mode 100644
index 000000000000..0ae3ec4a5f53
--- /dev/null
+++ b/src/dispatch/incident_cost/scheduled.py
@@ -0,0 +1,66 @@
+import logging
+
+from schedule import every
+
+from dispatch.database.core import SessionLocal
+from dispatch.decorators import scheduled_project_task, timer
+from dispatch.incident import service as incident_service
+from dispatch.incident.enums import IncidentStatus
+from dispatch.incident_cost_type import service as incident_cost_type_service
+from dispatch.project.models import Project
+from dispatch.scheduler import scheduler
+
+from .service import (
+ calculate_incident_response_cost,
+ get_or_create_default_incident_response_cost,
+)
+
+
+log = logging.getLogger(__name__)
+
+
+@scheduler.add(every(1).hour, name="calculate-incidents-response-cost")
+@timer
+@scheduled_project_task
+def calculate_incidents_response_cost(db_session: SessionLocal, project: Project):
+ """Calculates and saves the response cost for all incidents."""
+ response_cost_type = incident_cost_type_service.get_default(
+ db_session=db_session, project_id=project.id
+ )
+ if response_cost_type is None:
+ log.warning(
+ f"A default cost type for response cost doesn't exist in the {project.name} project and organization {project.organization.name}. Response costs for incidents won't be calculated."
+ )
+ return
+
+ incidents = incident_service.get_all(db_session=db_session, project_id=project.id)
+
+ for incident in incidents:
+ try:
+ # we get the response cost for the given incident
+ incident_response_cost = get_or_create_default_incident_response_cost(
+ incident, db_session
+ )
+
+ # we don't need to update the cost of closed incidents
+ # if they already have a response cost and this was updated
+ # after the incident was marked as stable
+ if incident.status == IncidentStatus.closed:
+ if incident_response_cost:
+ if incident_response_cost.updated_at > incident.stable_at:
+ continue
+
+ # we calculate the response cost amount
+ amount = calculate_incident_response_cost(incident.id, db_session)
+ # we don't need to update the cost amount if it hasn't changed
+ if incident_response_cost.amount == amount:
+ continue
+
+ # we save the new incident cost amount
+ incident_response_cost.amount = amount
+ incident.incident_costs.append(incident_response_cost)
+ db_session.add(incident)
+ db_session.commit()
+ except Exception as e:
+ # we shouldn't fail to update all incidents when one fails
+ log.exception(e)
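The scheduled task above skips closed incidents whose response cost was already refreshed after stabilization, and skips writes when the amount is unchanged. That guard logic can be sketched in isolation (plain values stand in for the SQLAlchemy models; `needs_cost_update` is a hypothetical helper, not Dispatch code):

```python
from datetime import datetime, timedelta


def needs_cost_update(status, stable_at, cost_updated_at, old_amount, new_amount):
    """Decide whether a response-cost row should be recalculated and written."""
    if status == "closed" and cost_updated_at and cost_updated_at > stable_at:
        return False  # already recalculated after the incident stabilized
    if old_amount == new_amount:
        return False  # amount unchanged; avoid a needless write
    return True


now = datetime(2024, 1, 1, 12, 0)
# closed incident whose cost row was refreshed after stable_at -> skip
print(needs_cost_update("closed", now, now + timedelta(hours=1), 100, 150))  # False
# active incident with a changed amount -> update
print(needs_cost_update("active", None, None, 100, 150))  # True
```

Keeping both checks cheap matters here because the task walks every incident in the project each hour.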
diff --git a/src/dispatch/incident_cost/service.py b/src/dispatch/incident_cost/service.py
new file mode 100644
index 000000000000..e02f80486e38
--- /dev/null
+++ b/src/dispatch/incident_cost/service.py
@@ -0,0 +1,523 @@
+from datetime import datetime, timedelta, timezone
+import logging
+import math
+
+from sqlalchemy.orm import Session
+
+from dispatch.cost_model.models import CostModelActivity
+from dispatch.incident import service as incident_service
+from dispatch.incident.enums import IncidentStatus
+from dispatch.incident.models import Incident
+from dispatch.incident.type.models import IncidentType
+from dispatch.incident_cost_type import service as incident_cost_type_service
+from dispatch.incident_cost_type.models import IncidentCostTypeRead
+from dispatch.participant import service as participant_service
+from dispatch.participant.models import ParticipantRead
+from dispatch.participant_activity import service as participant_activity_service
+from dispatch.participant_activity.models import ParticipantActivityCreate
+from dispatch.participant_role.models import ParticipantRoleType, ParticipantRole
+from dispatch.plugin import service as plugin_service
+
+from .models import IncidentCost, IncidentCostCreate, IncidentCostUpdate
+
+
+HOURS_IN_DAY = 24
+SECONDS_IN_HOUR = 3600
+log = logging.getLogger(__name__)
+
+
+def get(*, db_session: Session, incident_cost_id: int) -> IncidentCost | None:
+ """Gets an incident cost by its id."""
+ return db_session.query(IncidentCost).filter(IncidentCost.id == incident_cost_id).one_or_none()
+
+
+def get_by_incident_id(*, db_session: Session, incident_id: int) -> list[IncidentCost]:
+ """Gets incident costs by their incident id."""
+ return db_session.query(IncidentCost).filter(IncidentCost.incident_id == incident_id).all()
+
+
+def get_by_incident_id_and_incident_cost_type_id(
+ *, db_session: Session, incident_id: int, incident_cost_type_id: int
+) -> IncidentCost | None:
+ """Gets incident costs by their incident id and incident cost type id."""
+ return (
+ db_session.query(IncidentCost)
+ .filter(IncidentCost.incident_id == incident_id)
+ .filter(IncidentCost.incident_cost_type_id == incident_cost_type_id)
+ .one_or_none()
+ )
+
+
+def get_all(*, db_session) -> list[IncidentCost | None]:
+    """Gets all incident costs."""
+    return db_session.query(IncidentCost).all()
+
+
+def get_or_create(
+ *, db_session: Session, incident_cost_in: IncidentCostCreate | IncidentCostUpdate
+) -> IncidentCost:
+ """Gets or creates an incident cost object."""
+ if type(incident_cost_in) is IncidentCostUpdate and incident_cost_in.id:
+ incident_cost = get(db_session=db_session, incident_cost_id=incident_cost_in.id)
+ else:
+ incident_cost = create(db_session=db_session, incident_cost_in=incident_cost_in)
+
+ return incident_cost
+
+
+def create(*, db_session: Session, incident_cost_in: IncidentCostCreate) -> IncidentCost:
+ """Creates a new incident cost."""
+ incident_cost_type = incident_cost_type_service.get(
+ db_session=db_session, incident_cost_type_id=incident_cost_in.incident_cost_type.id
+ )
+ incident_cost = IncidentCost(
+ **incident_cost_in.dict(exclude={"incident_cost_type", "project"}),
+ incident_cost_type=incident_cost_type,
+ project=incident_cost_type.project,
+ )
+ db_session.add(incident_cost)
+ db_session.commit()
+
+ return incident_cost
+
+
+def update(
+ *, db_session: Session, incident_cost: IncidentCost, incident_cost_in: IncidentCostUpdate
+) -> IncidentCost:
+ """Updates an incident cost."""
+ incident_cost_data = incident_cost.dict()
+ update_data = incident_cost_in.dict(exclude_unset=True)
+
+ for field in incident_cost_data:
+ if field in update_data:
+ setattr(incident_cost, field, update_data[field])
+
+ db_session.commit()
+ return incident_cost
+
+
+def delete(*, db_session: Session, incident_cost_id: int):
+ """Deletes an existing incident cost."""
+ db_session.query(IncidentCost).filter(IncidentCost.id == incident_cost_id).delete()
+ db_session.commit()
+
+
+def get_engagement_multiplier(participant_role: ParticipantRoleType):
+ """Returns an engagement multiplier for a given incident role."""
+ engagement_mappings = {
+ ParticipantRoleType.incident_commander: 1,
+ ParticipantRoleType.scribe: 0.75,
+ ParticipantRoleType.liaison: 0.75,
+ ParticipantRoleType.participant: 0.5,
+ ParticipantRoleType.reporter: 0.5,
+ # ParticipantRoleType.observer: 0, # NOTE: set to 0. It's not used, as we don't calculate cost for participants with observer role
+ }
+
+ return engagement_mappings.get(participant_role)
+
+
+def get_incident_review_hours(incident: Incident) -> float:
+    """Calculates the time spent in incident review related activities."""
+    num_participants = len(incident.participants)
+    # we assume that it takes an hour to prepare the incident review
+    incident_review_prep = 1
+    # we assume that only half of the incident participants will attend the 1-hour incident review session
+    incident_review_meeting = num_participants * 0.5 * 1
+    return incident_review_prep + incident_review_meeting
+
+
+def get_hourly_rate(project) -> int:
+ """Calculates and rounds up the employee hourly rate within a project."""
+ return math.ceil(project.annual_employee_cost / project.business_year_hours)
+
+
+def update_incident_response_cost_for_incident_type(
+ db_session, incident_type: IncidentType
+) -> None:
+ """Calculate the response cost of all non-closed incidents associated with this incident type."""
+ incidents = incident_service.get_all_open_by_incident_type(
+ db_session=db_session, incident_type_id=incident_type.id
+ )
+ for incident in incidents:
+ update_incident_response_cost(incident_id=incident.id, db_session=db_session)
+
+
+def calculate_response_cost(
+    hourly_rate: int, total_response_time_seconds: float, incident_review_hours: float = 0
+) -> float:
+ """Calculates the incident response cost."""
+ return ((total_response_time_seconds / SECONDS_IN_HOUR) + incident_review_hours) * hourly_rate
+
+
+def get_default_incident_response_cost(
+    incident: Incident, db_session: Session
+) -> IncidentCost | None:
+    """Gets the default incident response cost for an incident, if a default cost type exists."""
+ response_cost_type = incident_cost_type_service.get_default(
+ db_session=db_session, project_id=incident.project.id
+ )
+
+ if not response_cost_type:
+ log.warning(
+ f"A default cost type for response cost doesn't exist in the {incident.project.name} project and organization {incident.project.organization.name}. Response costs for incident {incident.name} won't be calculated."
+ )
+ return None
+
+ return get_by_incident_id_and_incident_cost_type_id(
+ db_session=db_session,
+ incident_id=incident.id,
+ incident_cost_type_id=response_cost_type.id,
+ )
+
+
+def get_or_create_default_incident_response_cost(
+ incident: Incident, db_session: Session
+) -> IncidentCost | None:
+ """Gets or creates the default incident cost for an incident.
+
+ The default incident cost is the cost associated with the participant effort in an incident's response.
+ """
+ response_cost_type = incident_cost_type_service.get_default(
+ db_session=db_session, project_id=incident.project.id
+ )
+
+ if not response_cost_type:
+ log.warning(
+ f"A default cost type for response cost doesn't exist in the {incident.project.name} project and organization {incident.project.organization.name}. Response costs for incident {incident.name} won't be calculated."
+ )
+ return None
+
+ incident_response_cost = get_by_incident_id_and_incident_cost_type_id(
+ db_session=db_session,
+ incident_id=incident.id,
+ incident_cost_type_id=response_cost_type.id,
+ )
+
+ if not incident_response_cost:
+ # we create the response cost if it doesn't exist
+ incident_cost_type = IncidentCostTypeRead.from_orm(response_cost_type)
+ incident_cost_in = IncidentCostCreate(
+ incident_cost_type=incident_cost_type, project=incident.project
+ )
+ incident_response_cost = create(db_session=db_session, incident_cost_in=incident_cost_in)
+ incident.incident_costs.append(incident_response_cost)
+ db_session.add(incident)
+ db_session.commit()
+
+ return incident_response_cost
+
+
+def fetch_incident_events(
+    incident: Incident, activity: CostModelActivity, oldest: str, db_session: Session
+) -> list[tuple[float, str | None]]:
+    """Fetches incident events for a cost model activity from its plugin as (timestamp, user_id) tuples."""
+ plugin_instance = plugin_service.get_active_instance_by_slug(
+ db_session=db_session,
+ slug=activity.plugin_event.plugin.slug,
+ project_id=incident.project.id,
+ )
+ if not plugin_instance:
+ log.warning(
+ f"Cannot fetch cost model activity. Its associated plugin {activity.plugin_event.plugin.title} is not enabled."
+ )
+ return []
+
+ # Array of sorted (timestamp, user_id) tuples.
+ return plugin_instance.instance.fetch_events(
+ db_session=db_session,
+ subject=incident,
+ plugin_event_id=activity.plugin_event.id,
+ oldest=oldest,
+ )
+
+
+def calculate_incident_response_cost_with_cost_model(
+ incident: Incident, db_session: Session
+) -> float:
+ """Calculates the cost of an incident using the incident's cost model.
+
+ This function aggregates all new incident costs based on plugin activity since the last incident cost update.
+ If this is the first time performing cost calculation for this incident, it computes the total costs from the incident's creation.
+
+ Args:
+ incident: The incident to calculate the incident response cost for.
+ db_session: The database session.
+
+ Returns:
+ float: The incident response cost in dollars.
+ """
+
+ participants_total_response_time_seconds = 0
+ oldest = incident.created_at.replace(tzinfo=timezone.utc).timestamp()
+
+ # Used for determining whether we've previously calculated the incident cost.
+ current_time = datetime.now(tz=timezone.utc).replace(tzinfo=None)
+
+ incident_response_cost = get_or_create_default_incident_response_cost(
+ incident=incident, db_session=db_session
+ )
+ if not incident_response_cost:
+ log.warning(f"Cannot calculate incident response cost for incident {incident.name}.")
+ return 0
+
+ # Ignore events that happened before the last incident cost update.
+ if incident_response_cost.updated_at < current_time:
+ oldest = incident_response_cost.updated_at.replace(tzinfo=timezone.utc).timestamp()
+
+ # Get the cost model. Iterate through all the listed activities we want to record.
+ for activity in incident.incident_type.cost_model.activities:
+ # Array of sorted (timestamp, user_id) tuples.
+ incident_events = fetch_incident_events(
+ incident=incident, activity=activity, oldest=oldest, db_session=db_session
+ )
+
+ for ts, user_id in incident_events:
+ participant = participant_service.get_by_incident_id_and_conversation_id(
+ db_session=db_session,
+ incident_id=incident.id,
+ user_conversation_id=user_id,
+ )
+ if not participant:
+ log.warning("Cannot resolve participant.")
+ continue
+
+ activity_in = ParticipantActivityCreate(
+ plugin_event=activity.plugin_event,
+ started_at=ts,
+ ended_at=ts + timedelta(seconds=activity.response_time_seconds),
+ participant=ParticipantRead(id=participant.id),
+ incident=incident,
+ )
+
+ if participant_response_time := participant_activity_service.create_or_update(
+ db_session=db_session, activity_in=activity_in
+ ):
+ participants_total_response_time_seconds += (
+ participant_response_time.total_seconds()
+ )
+
+ hourly_rate = get_hourly_rate(incident.project)
+ amount = calculate_response_cost(
+ hourly_rate=hourly_rate,
+ total_response_time_seconds=participants_total_response_time_seconds,
+ )
+
+ return float(incident.total_cost) + amount
+
+
+def get_participant_role_time_seconds(
+ incident: Incident, participant_role: ParticipantRole, start_at: datetime
+) -> float:
+ """Calculates the time spent by a participant in an incident role starting from a given time.
+
+ The participant's time spent in the incident role is adjusted based on the role's engagement multiplier.
+
+ Args:
+ incident: The incident the participant is part of.
+ participant_role: The role of the participant and the time they assumed and renounced the role.
+ start_at: Only time spent after this will be considered.
+
+ Returns:
+ float: The time spent by the participant in the incident role in seconds.
+ """
+ if participant_role.renounced_at and participant_role.renounced_at < start_at:
+ # skip calculating already-recorded activity
+ return 0
+
+ if participant_role.role == ParticipantRoleType.observer:
+ # skip calculating cost for participants with the observer role
+ return 0
+
+ if participant_role.activity == 0:
+ # skip calculating cost for roles that have no activity
+ return 0
+
+ participant_role_assumed_at = participant_role.assumed_at
+
+ # we set the renounced_at default time to the current time
+ participant_role_renounced_at = datetime.now(tz=timezone.utc).replace(tzinfo=None)
+
+ if incident.status == IncidentStatus.active:
+ if participant_role.renounced_at:
+ # the participant left the conversation or got assigned another role
+ # we use the role's renounced_at time
+ participant_role_renounced_at = participant_role.renounced_at
+ else:
+ # we set the renounced_at default time to the stable_at time if the stable_at time exists
+ if incident.stable_at:
+ participant_role_renounced_at = incident.stable_at
+
+ if participant_role.renounced_at:
+ # the participant left the conversation or got assigned another role
+ if participant_role.renounced_at < participant_role_renounced_at:
+ # we use the role's renounced_at time if it happened before the
+ # incident was marked as stable or closed
+ participant_role_renounced_at = participant_role.renounced_at
+
+ # the time the participant has spent in the incident role since the last incident cost update
+ participant_role_time = participant_role_renounced_at - max(
+ participant_role_assumed_at, start_at
+ )
+ if participant_role_time.total_seconds() < 0:
+ # the participant was added after the incident was marked as stable
+ return 0
+
+ # we calculate the number of hours the participant has spent in the incident role
+ participant_role_time_hours = participant_role_time.total_seconds() / SECONDS_IN_HOUR
+
+ # we make the assumption that participants only spend 8 hours a day working on the incident,
+ # if the incident goes past 24hrs
+ # TODO(mvilanova): adjust based on incident priority
+ if participant_role_time_hours > HOURS_IN_DAY:
+ days, hours = divmod(participant_role_time_hours, HOURS_IN_DAY)
+ participant_role_time_hours = ((days * HOURS_IN_DAY) / 3) + hours
+
+ # we make the assumption that participants spend more or less time based on their role
+ # and we adjust the time spent based on that
+ return (
+ participant_role_time_hours
+ * SECONDS_IN_HOUR
+ * get_engagement_multiplier(participant_role.role)
+ )
+
+
+def get_total_participant_roles_time_seconds(incident: Incident, start_at: datetime) -> float:
+    """Calculates the time spent by all participants in this incident starting from a given time.
+
+    The participant hours are adjusted based on each role's engagement multiplier.
+
+    Args:
+        incident: The incident the participants are part of.
+        start_at: Only time spent after this will be considered.
+
+    Returns:
+        float: The total time spent by all participants in their incident roles in seconds.
+    """
+ total_participants_roles_time_seconds = 0
+ for participant in incident.participants:
+ for participant_role in participant.participant_roles:
+ total_participants_roles_time_seconds += get_participant_role_time_seconds(
+ incident=incident,
+ participant_role=participant_role,
+ start_at=start_at,
+ )
+ return total_participants_roles_time_seconds
+
+
+def calculate_incident_response_cost_with_classic_model(
+ incident: Incident, db_session: Session, incident_review: bool = False
+) -> float:
+ """Calculates the cost of an incident using the classic incident cost model.
+
+ This function aggregates all new incident costs since the last incident cost update. If this is the first time performing cost calculation for this incident, it computes the total costs from the incident's creation.
+
+ Args:
+ incident: The incident to calculate the incident response cost for.
+ db_session: The database session.
+ incident_review: Whether to add the incident review costs in this calculation.
+
+ Returns:
+ float: The incident response cost in dollars.
+ """
+ last_update = incident.created_at
+ incident_review_hours = 0
+
+ # Used for determining whether we've previously calculated the incident cost.
+ current_time = datetime.now(tz=timezone.utc).replace(tzinfo=None)
+
+ incident_response_cost = get_or_create_default_incident_response_cost(
+ incident=incident, db_session=db_session
+ )
+ if not incident_response_cost:
+ return 0
+
+ # Ignore activities that happened before the last incident cost update.
+ if incident_response_cost.updated_at < current_time:
+ last_update = incident_response_cost.updated_at
+
+ # TODO: Implement a more robust way to ensure we are calculating the incident review hours only once.
+ if incident_review:
+ incident_review_hours = get_incident_review_hours(incident)
+
+ # Aggregates the incident response costs accumulated since the last incident cost update
+ total_participants_roles_time_seconds = get_total_participant_roles_time_seconds(
+ incident, start_at=last_update
+ )
+
+ # Calculates and rounds up the incident cost
+ hourly_rate = get_hourly_rate(incident.project)
+ amount = calculate_response_cost(
+ hourly_rate=hourly_rate,
+ total_response_time_seconds=total_participants_roles_time_seconds,
+ incident_review_hours=incident_review_hours,
+ )
+
+ return float(incident_response_cost.amount) + amount
+
+
+def calculate_incident_response_cost(
+    incident_id: int, db_session: Session, incident_review: bool = False
+) -> float:
+ """Calculates the response cost of a given incident."""
+ incident = incident_service.get(db_session=db_session, incident_id=incident_id)
+ if not incident:
+ log.warning(f"Incident with id {incident_id} not found.")
+ return 0
+
+ incident_type = incident.incident_type
+ if not incident_type:
+ log.warning(f"Incident type for incident {incident.name} not found.")
+ return 0
+
+ if incident_type.cost_model and incident_type.cost_model.enabled:
+ log.debug(
+ f"Calculating {incident.name} incident cost with model {incident_type.cost_model}."
+ )
+ return calculate_incident_response_cost_with_cost_model(
+ incident=incident, db_session=db_session
+ )
+ else:
+ log.debug("No incident cost model found. Defaulting to classic incident cost model.")
+
+ return calculate_incident_response_cost_with_classic_model(
+ incident=incident, db_session=db_session, incident_review=incident_review
+ )
+
+
+def update_incident_response_cost(
+    incident_id: int, db_session: Session, incident_review: bool = False
+) -> float:
+    """Updates the response cost of a given incident.
+
+    Args:
+        incident_id: The incident id.
+        db_session: The database session.
+        incident_review: Whether to add the incident review costs in this calculation.
+
+    Returns:
+        float: The incident response cost in dollars.
+    """
+ incident = incident_service.get(db_session=db_session, incident_id=incident_id)
+
+ amount = calculate_incident_response_cost(
+ incident_id=incident.id, db_session=db_session, incident_review=incident_review
+ )
+
+ incident_response_cost = get_default_incident_response_cost(
+ incident=incident, db_session=db_session
+ )
+
+ if not incident_response_cost:
+ log.warning(f"Cannot calculate incident response cost for incident {incident.name}.")
+ return 0
+
+ # we update the cost amount only if the incident cost has changed
+ if incident_response_cost.amount != amount:
+ incident_response_cost.amount = amount
+ incident.incident_costs.append(incident_response_cost)
+ db_session.add(incident)
+ db_session.commit()
+
+ return incident_response_cost.amount
diff --git a/src/dispatch/incident_cost/views.py b/src/dispatch/incident_cost/views.py
new file mode 100644
index 000000000000..397ea7060116
--- /dev/null
+++ b/src/dispatch/incident_cost/views.py
@@ -0,0 +1,87 @@
+from fastapi import APIRouter, Depends, HTTPException, status
+
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.auth.permissions import SensitiveProjectActionPermission, PermissionsDependency
+from dispatch.models import PrimaryKey
+
+from .models import (
+ IncidentCostCreate,
+ IncidentCostPagination,
+ IncidentCostRead,
+ IncidentCostUpdate,
+)
+from .service import create, delete, get, update
+
+
+router = APIRouter()
+
+
+@router.get("", response_model=IncidentCostPagination)
+def get_incident_costs(common: CommonParameters):
+ """Get all incident costs, or only those matching a given search term."""
+ return search_filter_sort_paginate(model="IncidentCost", **common)
+
+
+@router.get("/{incident_cost_id}", response_model=IncidentCostRead)
+def get_incident_cost(db_session: DbSession, incident_cost_id: PrimaryKey):
+ """Get an incident cost by its id."""
+ incident_cost = get(db_session=db_session, incident_cost_id=incident_cost_id)
+ if not incident_cost:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "An incident cost with this id does not exist."}],
+ )
+ return incident_cost
+
+
+@router.post(
+ "",
+ response_model=IncidentCostRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def create_incident_cost(db_session: DbSession, incident_cost_in: IncidentCostCreate):
+ """Create an incident cost."""
+ incident_cost = create(db_session=db_session, incident_cost_in=incident_cost_in)
+ return incident_cost
+
+
+@router.put(
+ "/{incident_cost_id}",
+ response_model=IncidentCostRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def update_incident_cost(
+ db_session: DbSession,
+ incident_cost_id: PrimaryKey,
+ incident_cost_in: IncidentCostUpdate,
+):
+ """Update an incident cost by its id."""
+ incident_cost = get(db_session=db_session, incident_cost_id=incident_cost_id)
+ if not incident_cost:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "An incident cost with this id does not exist."}],
+ )
+ incident_cost = update(
+ db_session=db_session,
+ incident_cost=incident_cost,
+ incident_cost_in=incident_cost_in,
+ )
+ return incident_cost
+
+
+@router.delete(
+ "/{incident_cost_id}",
+ response_model=None,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def delete_incident_cost(db_session: DbSession, incident_cost_id: PrimaryKey):
+ """Delete an incident cost, returning only an HTTP 200 OK if successful."""
+ incident_cost = get(db_session=db_session, incident_cost_id=incident_cost_id)
+ if not incident_cost:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "An incident cost with this id does not exist."}],
+ )
+ delete(db_session=db_session, incident_cost_id=incident_cost_id)
diff --git a/src/dispatch/incident_cost_type/__init__.py b/src/dispatch/incident_cost_type/__init__.py
new file mode 100644
index 000000000000..e69de29bb2d1
diff --git a/src/dispatch/incident_cost_type/config.py b/src/dispatch/incident_cost_type/config.py
new file mode 100644
index 000000000000..bb5252ace222
--- /dev/null
+++ b/src/dispatch/incident_cost_type/config.py
@@ -0,0 +1,8 @@
+default_incident_cost_type = {
+ "name": "Response Cost",
+ "description": "Cost associated with handling an incident.",
+ "category": "Primary",
+ "details": {},
+ "default": True,
+ "editable": False,
+}
diff --git a/src/dispatch/incident_cost_type/models.py b/src/dispatch/incident_cost_type/models.py
new file mode 100644
index 000000000000..29228e47ecd2
--- /dev/null
+++ b/src/dispatch/incident_cost_type/models.py
@@ -0,0 +1,70 @@
+from datetime import datetime
+
+from sqlalchemy import Column, Integer, String, Boolean
+from sqlalchemy.event import listen
+
+from sqlalchemy_utils import TSVectorType, JSONType
+
+from dispatch.database.core import Base, ensure_unique_default_per_project
+from dispatch.models import (
+ DispatchBase,
+ NameStr,
+ ProjectMixin,
+ TimeStampMixin,
+ Pagination,
+ PrimaryKey,
+)
+from dispatch.project.models import ProjectRead
+
+
+# SQLAlchemy Model
+class IncidentCostType(Base, TimeStampMixin, ProjectMixin):
+ """SQLAlchemy model for incident cost type resources."""
+ # columns
+ id = Column(Integer, primary_key=True)
+ name = Column(String)
+ description = Column(String)
+ category = Column(String)
+ details = Column(JSONType, nullable=True)
+ default = Column(Boolean, default=False)
+ editable = Column(Boolean, default=True)
+
+ # full text search capabilities
+ search_vector = Column(
+ TSVectorType("name", "description", weights={"name": "A", "description": "B"})
+ )
+
+
+listen(IncidentCostType.default, "set", ensure_unique_default_per_project)
+
+
+# Pydantic Models
+class IncidentCostTypeBase(DispatchBase):
+ """Base Pydantic model for incident cost type resources."""
+ name: NameStr
+ description: str | None = None
+ category: str | None = None
+ details: dict[str, object] | None = None
+ default: bool | None = None
+ editable: bool | None = None
+
+
+class IncidentCostTypeCreate(IncidentCostTypeBase):
+ """Pydantic model for creating an incident cost type."""
+ project: ProjectRead
+
+
+class IncidentCostTypeUpdate(IncidentCostTypeBase):
+ """Pydantic model for updating an incident cost type."""
+ id: PrimaryKey | None = None
+
+
+class IncidentCostTypeRead(IncidentCostTypeBase):
+ """Pydantic model for reading an incident cost type."""
+ id: PrimaryKey
+ created_at: datetime
+
+
+class IncidentCostTypePagination(Pagination):
+ """Pydantic model for paginated incident cost type results."""
+ items: list[IncidentCostTypeRead] = []
diff --git a/src/dispatch/incident_cost_type/service.py b/src/dispatch/incident_cost_type/service.py
new file mode 100644
index 000000000000..bc6bdeb0bb12
--- /dev/null
+++ b/src/dispatch/incident_cost_type/service.py
@@ -0,0 +1,82 @@
+from sqlalchemy.sql.expression import true
+
+from dispatch.project import service as project_service
+
+from .models import (
+ IncidentCostType,
+ IncidentCostTypeCreate,
+ IncidentCostTypeUpdate,
+)
+
+
+def get(*, db_session, incident_cost_type_id: int) -> IncidentCostType | None:
+ """Gets an incident cost type by its id."""
+ return (
+ db_session.query(IncidentCostType)
+ .filter(IncidentCostType.id == incident_cost_type_id)
+ .one_or_none()
+ )
+
+
+def get_default(*, db_session, project_id: int) -> IncidentCostType | None:
+ """Returns the default incident cost type."""
+ return (
+ db_session.query(IncidentCostType)
+ .filter(IncidentCostType.default == true())
+ .filter(IncidentCostType.project_id == project_id)
+ .one_or_none()
+ )
+
+
+def get_by_name(
+ *, db_session, project_id: int, incident_cost_type_name: str
+) -> IncidentCostType | None:
+ """Gets an incident cost type by its name."""
+ return (
+ db_session.query(IncidentCostType)
+ .filter(IncidentCostType.name == incident_cost_type_name)
+ .filter(IncidentCostType.project_id == project_id)
+ .first()
+ )
+
+
+def get_all(*, db_session) -> list[IncidentCostType | None]:
+ """Gets all incident cost types."""
+ return db_session.query(IncidentCostType).all()
+
+
+def create(*, db_session, incident_cost_type_in: IncidentCostTypeCreate) -> IncidentCostType:
+ """Creates a new incident cost type."""
+ project = project_service.get_by_name_or_raise(
+ db_session=db_session, project_in=incident_cost_type_in.project
+ )
+ incident_cost_type = IncidentCostType(
+ **incident_cost_type_in.dict(exclude={"project"})
+ )
+ incident_cost_type.project = project # type: ignore[attr-defined]
+ db_session.add(incident_cost_type)
+ db_session.commit()
+ return incident_cost_type
+
+
+def update(
+ *,
+ db_session,
+ incident_cost_type: IncidentCostType,
+ incident_cost_type_in: IncidentCostTypeUpdate,
+) -> IncidentCostType:
+ """Updates an incident cost type."""
+ update_data = incident_cost_type_in.dict(exclude_unset=True)
+
+ for field, value in update_data.items():
+ if hasattr(incident_cost_type, field):
+ setattr(incident_cost_type, field, value)
+
+ db_session.commit()
+ return incident_cost_type
+
+
+def delete(*, db_session, incident_cost_type_id: int):
+ """Deletes an existing incident cost type."""
+ db_session.query(IncidentCostType).filter(IncidentCostType.id == incident_cost_type_id).delete()
+ db_session.commit()
diff --git a/src/dispatch/incident_cost_type/views.py b/src/dispatch/incident_cost_type/views.py
new file mode 100644
index 000000000000..d8a9da628223
--- /dev/null
+++ b/src/dispatch/incident_cost_type/views.py
@@ -0,0 +1,105 @@
+from fastapi import APIRouter, Depends, HTTPException, status
+
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.auth.permissions import SensitiveProjectActionPermission, PermissionsDependency
+from dispatch.models import PrimaryKey
+
+from .models import (
+ IncidentCostTypeCreate,
+ IncidentCostTypePagination,
+ IncidentCostTypeRead,
+ IncidentCostTypeUpdate,
+)
+from .service import create, delete, get, update
+
+
+router = APIRouter()
+
+
+@router.get("", response_model=IncidentCostTypePagination)
+def get_incident_cost_types(common: CommonParameters):
+ """Get all incident cost types, or only those matching a given search term."""
+ return search_filter_sort_paginate(model="IncidentCostType", **common)
+
+
+@router.get("/{incident_cost_type_id}", response_model=IncidentCostTypeRead)
+def get_incident_cost_type(db_session: DbSession, incident_cost_type_id: PrimaryKey):
+ """Get an incident cost type by its id."""
+ incident_cost_type = get(db_session=db_session, incident_cost_type_id=incident_cost_type_id)
+ if not incident_cost_type:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "An incident cost type with this id does not exist."}],
+ )
+ return incident_cost_type
+
+
+@router.post(
+ "",
+ response_model=IncidentCostTypeRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def create_incident_cost_type(db_session: DbSession, incident_cost_type_in: IncidentCostTypeCreate):
+ """Create an incident cost type."""
+ incident_cost_type = create(db_session=db_session, incident_cost_type_in=incident_cost_type_in)
+ return incident_cost_type
+
+
+@router.put(
+ "/{incident_cost_type_id}",
+ response_model=IncidentCostTypeRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def update_incident_cost_type(
+ db_session: DbSession,
+ incident_cost_type_id: PrimaryKey,
+ incident_cost_type_in: IncidentCostTypeUpdate,
+):
+ """Update an incident cost type by its id."""
+ incident_cost_type = get(db_session=db_session, incident_cost_type_id=incident_cost_type_id)
+ if not incident_cost_type:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "An incident cost type with this id does not exist."}],
+ )
+
+ if not incident_cost_type.editable:
+ raise HTTPException(
+            status_code=status.HTTP_403_FORBIDDEN,
+ detail=[{"msg": "You are not allowed to update this incident cost type."}],
+ )
+
+ incident_cost_type = update(
+ db_session=db_session,
+ incident_cost_type=incident_cost_type,
+ incident_cost_type_in=incident_cost_type_in,
+ )
+ return incident_cost_type
+
+
+@router.delete(
+ "/{incident_cost_type_id}",
+ response_model=None,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def delete_incident_cost_type(
+ db_session: DbSession,
+ incident_cost_type_id: PrimaryKey,
+):
+ """Delete an incident cost type, returning only an HTTP 200 OK if successful."""
+ incident_cost_type = get(db_session=db_session, incident_cost_type_id=incident_cost_type_id)
+
+ if not incident_cost_type:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "An incident cost type with this id does not exist."}],
+ )
+
+ if not incident_cost_type.editable:
+ raise HTTPException(
+            status_code=status.HTTP_403_FORBIDDEN,
+ detail=[{"msg": "You are not allowed to delete this incident cost type."}],
+ )
+
+ delete(db_session=db_session, incident_cost_type_id=incident_cost_type_id)
diff --git a/src/dispatch/incident_priority/models.py b/src/dispatch/incident_priority/models.py
deleted file mode 100644
index c9cdb937c868..000000000000
--- a/src/dispatch/incident_priority/models.py
+++ /dev/null
@@ -1,46 +0,0 @@
-from enum import Enum
-from typing import List, Optional
-
-from sqlalchemy import Column, Integer, String
-
-from dispatch.database import Base
-from dispatch.models import DispatchBase
-
-
-class IncidentPriorityType(str, Enum):
- # Note: The reporting form uses these types for the Priority drop-down list. Add them in the order you want
- # them to be displayed.
-
- high = "High"
- medium = "Medium"
- low = "Low"
- info = "Info"
-
-
-class IncidentPriority(Base):
- id = Column(Integer, primary_key=True)
- name = Column(String, default=IncidentPriorityType.info, unique=True)
- description = Column(String)
-
-
-# Pydantic models...
-class IncidentPriorityBase(DispatchBase):
- name: str
- description: Optional[str]
-
-
-class IncidentPriorityCreate(IncidentPriorityBase):
- pass
-
-
-class IncidentPriorityUpdate(IncidentPriorityBase):
- pass
-
-
-class IncidentPriorityRead(IncidentPriorityBase):
- id: int
-
-
-class IncidentPriorityPagination(DispatchBase):
- total: int
- items: List[IncidentPriorityRead] = []
diff --git a/src/dispatch/incident_priority/service.py b/src/dispatch/incident_priority/service.py
deleted file mode 100644
index f4b1d8366c93..000000000000
--- a/src/dispatch/incident_priority/service.py
+++ /dev/null
@@ -1,54 +0,0 @@
-from typing import List, Optional
-
-from fastapi.encoders import jsonable_encoder
-
-from .models import IncidentPriority, IncidentPriorityCreate, IncidentPriorityUpdate
-
-
-def get(*, db_session, incident_priority_id: int) -> Optional[IncidentPriority]:
- """Returns an incident priority based on the given priority id."""
- return (
- db_session.query(IncidentPriority)
- .filter(IncidentPriority.id == incident_priority_id)
- .one_or_none()
- )
-
-
-def get_by_name(*, db_session, name: str) -> Optional[IncidentPriority]:
- """Returns an incident priority based on the given priority name."""
- return db_session.query(IncidentPriority).filter(IncidentPriority.name == name).one_or_none()
-
-
-def get_all(*, db_session) -> List[Optional[IncidentPriority]]:
- """Returns all incident priorities."""
- return db_session.query(IncidentPriority)
-
-
-def create(*, db_session, incident_priority_in: IncidentPriorityCreate) -> IncidentPriority:
- """Creates an incident priority."""
- incident_priority = IncidentPriority(**incident_priority_in.dict())
- db_session.add(incident_priority)
- db_session.commit()
- return incident_priority
-
-
-def update(
- *, db_session, incident_priority: IncidentPriority, incident_priority_in: IncidentPriorityUpdate
-) -> IncidentPriority:
- """Updates an incident priority."""
- incident_priority_data = jsonable_encoder(incident_priority)
- update_data = incident_priority_in.dict(skip_defaults=True)
-
- for field in incident_priority_data:
- if field in update_data:
- setattr(incident_priority, field, update_data[field])
-
- db_session.add(incident_priority)
- db_session.commit()
- return incident_priority
-
-
-def delete(*, db_session, incident_priority_id: int):
- """Deletes an incident priority."""
- db_session.query(IncidentPriority).filter(IncidentPriority.id == incident_priority_id).delete()
- db_session.commit()
diff --git a/src/dispatch/incident_priority/views.py b/src/dispatch/incident_priority/views.py
deleted file mode 100644
index e0a3eb7b5478..000000000000
--- a/src/dispatch/incident_priority/views.py
+++ /dev/null
@@ -1,39 +0,0 @@
-from typing import List
-
-from fastapi import APIRouter, Depends, Query
-from sqlalchemy.orm import Session
-
-from dispatch.database import get_db, search_filter_sort_paginate
-
-from .models import IncidentPriorityPagination
-
-router = APIRouter()
-
-
-@router.get("/", response_model=IncidentPriorityPagination, tags=["incident_priorities"])
-def get_incident_priorities(
- db_session: Session = Depends(get_db),
- page: int = 1,
- items_per_page: int = Query(5, alias="itemsPerPage"),
- query_str: str = Query(None, alias="q"),
- sort_by: List[str] = Query(None, alias="sortBy[]"),
- descending: List[bool] = Query(None, alias="descending[]"),
- fields: List[str] = Query(None, alias="field[]"),
- ops: List[str] = Query(None, alias="op[]"),
- values: List[str] = Query(None, alias="value[]"),
-):
- """
- Returns all incident priorities.
- """
- return search_filter_sort_paginate(
- db_session=db_session,
- model="IncidentPriority",
- query_str=query_str,
- page=page,
- items_per_page=items_per_page,
- sort_by=sort_by,
- descending=descending,
- fields=fields,
- values=values,
- ops=ops,
- )
diff --git a/src/dispatch/incident_role/__init__.py b/src/dispatch/incident_role/__init__.py
new file mode 100644
index 000000000000..e69de29bb2d1
diff --git a/src/dispatch/incident_role/models.py b/src/dispatch/incident_role/models.py
new file mode 100644
index 000000000000..4e7e10c61c2a
--- /dev/null
+++ b/src/dispatch/incident_role/models.py
@@ -0,0 +1,94 @@
+from datetime import datetime
+from pydantic import PositiveInt
+
+from sqlalchemy import Boolean, Column, Integer, String, PrimaryKeyConstraint, Table, ForeignKey
+from sqlalchemy.orm import relationship
+
+from dispatch.database.core import Base
+from dispatch.incident.priority.models import IncidentPriorityRead
+from dispatch.incident.type.models import IncidentTypeRead
+from dispatch.individual.models import IndividualContactRead
+from dispatch.models import DispatchBase, TimeStampMixin, ProjectMixin
+from dispatch.models import PrimaryKey
+from dispatch.participant_role.models import ParticipantRoleType
+from dispatch.project.models import ProjectRead
+from dispatch.service.models import ServiceRead
+from dispatch.tag.models import TagRead
+
+assoc_incident_roles_tags = Table(
+ "incident_role_tag",
+ Base.metadata,
+ Column("incident_role_id", Integer, ForeignKey("incident_role.id")),
+ Column("tag_id", Integer, ForeignKey("tag.id")),
+ PrimaryKeyConstraint("incident_role_id", "tag_id"),
+)
+
+assoc_incident_roles_incident_types = Table(
+ "incident_role_incident_type",
+ Base.metadata,
+ Column("incident_role_id", Integer, ForeignKey("incident_role.id")),
+ Column("incident_type_id", Integer, ForeignKey("incident_type.id")),
+ PrimaryKeyConstraint("incident_role_id", "incident_type_id"),
+)
+
+assoc_incident_roles_incident_priorities = Table(
+ "incident_role_incident_priority",
+ Base.metadata,
+ Column("incident_role_id", Integer, ForeignKey("incident_role.id")),
+ Column("incident_priority_id", Integer, ForeignKey("incident_priority.id")),
+ PrimaryKeyConstraint("incident_role_id", "incident_priority_id"),
+)
+
+
+class IncidentRole(Base, TimeStampMixin, ProjectMixin):
+ # Columns
+ id = Column(Integer, primary_key=True)
+ role = Column(String)
+ enabled = Column(Boolean, default=True)
+ order = Column(Integer)
+
+ # Relationships
+ tags = relationship("Tag", secondary=assoc_incident_roles_tags)
+ incident_types = relationship("IncidentType", secondary=assoc_incident_roles_incident_types)
+ incident_priorities = relationship(
+ "IncidentPriority", secondary=assoc_incident_roles_incident_priorities
+ )
+
+ service_id = Column(Integer, ForeignKey("service.id"))
+ service = relationship("Service")
+ individual_id = Column(Integer, ForeignKey("individual_contact.id"))
+ individual = relationship("IndividualContact")
+
+ engage_next_oncall = Column(Boolean, default=False)
+
+
+# Pydantic models
+class IncidentRoleBase(DispatchBase):
+ enabled: bool | None = None
+ tags: list[TagRead] | None = None
+ order: PositiveInt | None = None
+ incident_types: list[IncidentTypeRead] | None = None
+ incident_priorities: list[IncidentPriorityRead] | None = None
+ service: ServiceRead | None = None
+ individual: IndividualContactRead | None = None
+ engage_next_oncall: bool | None = None
+
+
+class IncidentRoleCreateUpdate(IncidentRoleBase):
+ id: PrimaryKey | None = None
+ project: ProjectRead | None = None
+
+
+class IncidentRolesCreateUpdate(DispatchBase):
+ policies: list[IncidentRoleCreateUpdate]
+
+
+class IncidentRoleRead(IncidentRoleBase):
+ id: PrimaryKey
+ role: ParticipantRoleType
+ created_at: datetime | None = None
+ updated_at: datetime | None = None
+
+
+class IncidentRoles(DispatchBase):
+ policies: list[IncidentRoleRead] = []
diff --git a/src/dispatch/incident_role/service.py b/src/dispatch/incident_role/service.py
new file mode 100644
index 000000000000..53e21619cf5f
--- /dev/null
+++ b/src/dispatch/incident_role/service.py
@@ -0,0 +1,204 @@
+import logging
+
+from operator import attrgetter
+from pydantic import ValidationError
+
+from dispatch.incident.models import Incident, ProjectRead
+from dispatch.incident.priority import service as incident_priority_service
+from dispatch.incident.type import service as incident_type_service
+from dispatch.individual import service as individual_contact_service
+from dispatch.participant_role.models import ParticipantRoleType
+from dispatch.project import service as project_service
+from dispatch.service import service as service_service
+from dispatch.tag import service as tag_service
+
+from .models import (
+ IncidentRole,
+ IncidentRoleCreateUpdate,
+)
+
+
+log = logging.getLogger(__name__)
+
+
+def get(*, db_session, incident_role_id: int) -> IncidentRole | None:
+ """Returns an incident role based on the given id."""
+ return db_session.query(IncidentRole).filter(IncidentRole.id == incident_role_id).one_or_none()
+
+
+def get_all(*, db_session, project_id: int | None = None) -> list[IncidentRole | None]:
+ """Returns all incident roles."""
+ if project_id is not None:
+ return db_session.query(IncidentRole).filter(IncidentRole.project_id == project_id)
+ return db_session.query(IncidentRole)
+
+
+def get_all_by_role(
+ *, db_session, role: ParticipantRoleType, project_id: int
+) -> list[IncidentRole] | None:
+ """Gets all policies for a given role."""
+ return (
+ db_session.query(IncidentRole)
+ .filter(IncidentRole.role == role)
+ .filter(IncidentRole.project_id == project_id)
+ .all()
+ )
+
+
+def get_all_enabled_by_role(
+ *, db_session, role: ParticipantRoleType, project_id: int
+) -> list[IncidentRole] | None:
+ """Gets all enabled incident roles."""
+ return (
+ db_session.query(IncidentRole)
+ .filter(IncidentRole.enabled == True) # noqa Flake8 E712
+ .filter(IncidentRole.role == role)
+ .filter(IncidentRole.project_id == project_id)
+ ).all()
+
+
+def create_or_update(
+ *,
+ db_session,
+ project_in: ProjectRead,
+ role: ParticipantRoleType,
+ incident_roles_in: list[IncidentRoleCreateUpdate],
+) -> list[IncidentRole]:
+ """Updates a list of incident role policies."""
+ role_policies = []
+
+ project = project_service.get_by_name_or_raise(db_session=db_session, project_in=project_in)
+
+ # update/create everybody else
+ for role_policy_in in incident_roles_in:
+ if role_policy_in.id:
+ role_policy = get(db_session=db_session, incident_role_id=role_policy_in.id)
+
+ if not role_policy:
+ raise ValidationError.from_exception_data(
+ "IncidentRoleRead",
+ [
+ {
+ "type": "value_error",
+ "loc": ("incident_role",),
+ "msg": "Incident role not found.",
+ "input": role_policy_in.id,
+ }
+ ]
+ )
+
+ else:
+ role_policy = IncidentRole(role=role, project=project)
+ db_session.add(role_policy)
+
+ role_policy_data = role_policy.dict()
+ update_data = role_policy_in.dict(
+ exclude_unset=True,
+ exclude={
+ "role", # we don't allow role to be updated
+ "tags",
+ "incident_types",
+ "incident_priorities",
+ "service",
+ "individual",
+ "project",
+ },
+ )
+
+ for field in role_policy_data:
+ if field in update_data:
+ setattr(role_policy, field, update_data[field])
+
+ if role_policy_in.tags:
+ tags = [
+ tag_service.get_by_name_or_raise(
+ db_session=db_session, project_id=project.id, tag_in=t
+ )
+ for t in role_policy_in.tags
+ ]
+ role_policy.tags = tags
+
+ if role_policy_in.incident_types:
+ incident_types = [
+ incident_type_service.get_by_name_or_raise(
+ db_session=db_session, project_id=project.id, incident_type_in=i
+ )
+ for i in role_policy_in.incident_types
+ ]
+ role_policy.incident_types = incident_types
+
+ if role_policy_in.incident_priorities:
+ incident_priorities = [
+ incident_priority_service.get_by_name_or_raise(
+ db_session=db_session,
+ project_id=project.id,
+ incident_priority_in=i,
+ )
+ for i in role_policy_in.incident_priorities
+ ]
+ role_policy.incident_priorities = incident_priorities
+
+ if role_policy_in.service:
+ service = service_service.get_by_external_id_and_project_id_or_raise(
+ db_session=db_session,
+ project_id=project.id,
+ service_in=role_policy_in.service,
+ )
+ role_policy.service = service
+
+ if role_policy_in.individual:
+ individual = individual_contact_service.get_by_email_and_project_id_or_raise(
+ db_session=db_session,
+ project_id=project.id,
+ individual_contact_in=role_policy_in.individual,
+ )
+ role_policy.individual = individual
+
+ role_policies.append(role_policy)
+
+ # TODO Add projects
+ # get all current policies in order to detect deletions
+ existing_incident_roles = get_all_by_role(
+ db_session=db_session, role=role, project_id=project.id
+ )
+ for existing_role_policy in existing_incident_roles:
+ for current_role_policy in role_policies:
+ if existing_role_policy.id == current_role_policy.id:
+ break
+ else:
+ db_session.delete(existing_role_policy)
+
+ db_session.commit()
+ return role_policies
+
+
+def resolve_role(
+ *,
+ db_session,
+ role: ParticipantRoleType,
+ incident: Incident,
+) -> IncidentRole | None:
+ """Determines, based on the attributes currently associated with an incident, which policy should be used to assign the given incident role."""
+ incident_roles = get_all_enabled_by_role(
+ db_session=db_session, role=role, project_id=incident.project.id
+ )
+
+ # the order of evaluation of policies is as follows:
+ # 1) Match any policy that includes the current priority
+ # 2) Match any policy that includes the current incident type
+ # 3) Match any policy that includes the current tag set (individually first and then as a group)
+ # 4) If there are still multiple matches use order to determine who to resolve (lowest gets priority)
+
+ incident_roles = [
+ p for p in incident_roles if incident.incident_priority in p.incident_priorities
+ ]
+ incident_roles = [p for p in incident_roles if incident.incident_type in p.incident_types]
+
+ for t in incident.tags:
+ incident_roles += [p for p in incident_roles if t in p.tags]
+
+ if len(incident_roles) == 1:
+ return incident_roles[0]
+
+ if len(incident_roles) > 1:
+ return sorted(incident_roles, key=attrgetter("order"))[0]
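The evaluation order spelled out in the `resolve_role` comments can be sketched in plain Python. This is a simplified model, not the service code itself: dataclasses stand in for the ORM objects, and tag-matching policies are preferred outright rather than appended for weighting as in the service above:

```python
# Simplified sketch of policy resolution: filter by priority, then type,
# then prefer tag matches, then break ties by lowest order.
from dataclasses import dataclass, field

@dataclass
class RolePolicy:
    order: int
    priorities: set = field(default_factory=set)
    types: set = field(default_factory=set)
    tags: set = field(default_factory=set)

def resolve(policies, priority, incident_type, tags):
    # 1) keep policies that include the incident's priority
    policies = [p for p in policies if priority in p.priorities]
    # 2) keep policies that include the incident's type
    policies = [p for p in policies if incident_type in p.types]
    # 3) prefer policies that also match one of the incident's tags, when any do
    tagged = [p for p in policies if tags & p.tags]
    policies = tagged or policies
    # 4) break remaining ties by lowest order
    return min(policies, key=lambda p: p.order) if policies else None
```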
diff --git a/src/dispatch/incident_role/views.py b/src/dispatch/incident_role/views.py
new file mode 100644
index 000000000000..7b9d92d0e981
--- /dev/null
+++ b/src/dispatch/incident_role/views.py
@@ -0,0 +1,53 @@
+from fastapi import APIRouter, Depends, Query
+
+
+from dispatch.database.core import DbSession
+from dispatch.auth.permissions import SensitiveProjectActionPermission, PermissionsDependency
+from dispatch.participant_role.models import ParticipantRoleType
+from dispatch.project.models import ProjectRead
+from dispatch.project import service as project_service
+
+from .models import (
+ IncidentRoles,
+ IncidentRolesCreateUpdate,
+)
+from .service import create_or_update, get_all_by_role
+
+
+router = APIRouter()
+
+
+@router.get("/{role}", response_model=IncidentRoles)
+def get_incident_roles(
+ db_session: DbSession,
+ role: ParticipantRoleType,
+ project_name: str = Query(..., alias="projectName"),
+):
+ """Get all incident role mappings."""
+ project = project_service.get_by_name_or_raise(
+ db_session=db_session, project_in=ProjectRead(name=project_name)
+ )
+ policies = get_all_by_role(db_session=db_session, role=role, project_id=project.id)
+ return {"policies": policies}
+
+
+@router.put(
+ "/{role}",
+ response_model=IncidentRoles,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def update_incident_role(
+ db_session: DbSession,
+ role: ParticipantRoleType,
+ incident_roles_in: IncidentRolesCreateUpdate,
+ project_name: str = Query(..., alias="projectName"),
+):
+ """Update the incident role mappings for the given role."""
+ return {
+ "policies": create_or_update(
+ db_session=db_session,
+ project_in=ProjectRead(name=project_name),
+ role=role,
+ incident_roles_in=incident_roles_in.policies,
+ )
+ }
diff --git a/src/dispatch/incident_type/models.py b/src/dispatch/incident_type/models.py
deleted file mode 100644
index ddeb4a1ed496..000000000000
--- a/src/dispatch/incident_type/models.py
+++ /dev/null
@@ -1,76 +0,0 @@
-from enum import Enum
-from typing import List, Optional
-
-from sqlalchemy import Column, ForeignKey, Integer, String
-from sqlalchemy.orm import relationship
-from sqlalchemy_utils import TSVectorType
-
-from dispatch.enums import Visibility
-from dispatch.models import Base, DispatchBase
-
-
-class IncidentType(Base):
- id = Column(Integer, primary_key=True)
- name = Column(String, unique=True)
- slug = Column(String)
- description = Column(String)
- visibility = Column(String, default=Visibility.open)
-
- template_document_id = Column(Integer, ForeignKey("document.id"))
- template_document = relationship("Document")
-
- commander_service_id = Column(Integer, ForeignKey("service.id"))
- commander_service = relationship("Service")
-
- search_vector = Column(TSVectorType("name", "description"))
-
-
-class Document(DispatchBase):
- id: int
- resource_type: Optional[str]
- resource_id: Optional[str]
- description: Optional[str]
- weblink: str
- name: str
-
-
-class Service(DispatchBase):
- id: int
- name: Optional[str] = None
- external_id: Optional[str] = None
- is_active: Optional[bool] = None
- type: Optional[str] = None
-
-
-# Pydantic models...
-class IncidentTypeBase(DispatchBase):
- name: str
- description: Optional[str]
-
-
-class IncidentTypeCreate(IncidentTypeBase):
- template_document: Optional[Document]
- commander_service: Optional[Service]
-
-
-class IncidentTypeUpdate(IncidentTypeBase):
- id: int
- visibility: Optional[Visibility]
- template_document: Optional[Document]
- commander_service: Optional[Service]
-
-
-class IncidentTypeRead(IncidentTypeBase):
- id: int
- visibility: Optional[Visibility]
- template_document: Optional[Document]
- commander_service: Optional[Service]
-
-
-class IncidentTypeNested(IncidentTypeBase):
- id: int
-
-
-class IncidentTypePagination(DispatchBase):
- total: int
- items: List[IncidentTypeRead] = []
diff --git a/src/dispatch/incident_type/service.py b/src/dispatch/incident_type/service.py
deleted file mode 100644
index c9e8227a2bdb..000000000000
--- a/src/dispatch/incident_type/service.py
+++ /dev/null
@@ -1,91 +0,0 @@
-from typing import List, Optional
-
-from fastapi.encoders import jsonable_encoder
-
-from dispatch.document import service as document_service
-from dispatch.document.models import Document
-from dispatch.service import service as service_service
-from dispatch.service.models import Service
-
-from .models import IncidentType, IncidentTypeCreate, IncidentTypeUpdate
-
-
-def get(*, db_session, incident_type_id: int) -> Optional[IncidentType]:
- """Returns an incident type based on the given type id."""
- return db_session.query(IncidentType).filter(IncidentType.id == incident_type_id).one_or_none()
-
-
-def get_by_name(*, db_session, name: str) -> Optional[IncidentType]:
- """Returns an incident type based on the given type name."""
- return db_session.query(IncidentType).filter(IncidentType.name == name).one_or_none()
-
-
-def get_by_slug(*, db_session, slug: str) -> Optional[IncidentType]:
- """Returns an incident type based on the given type slug."""
- return db_session.query(IncidentType).filter(IncidentType.slug == slug).one_or_none()
-
-
-def get_all(*, db_session) -> List[Optional[IncidentType]]:
- """Returns all incident types."""
- return db_session.query(IncidentType)
-
-
-def create(*, db_session, incident_type_in: IncidentTypeCreate) -> IncidentType:
- """Creates an incident type."""
- template_document = Document()
- if incident_type_in.template_document:
- template_document = document_service.get(
- db_session=db_session, document_id=incident_type_in.template_document.id
- )
-
- commander_service = Service()
- if incident_type_in.commander_service:
- commander_service = service_service.get(
- db_session=db_session, service_id=incident_type_in.commander_service.id
- )
-
- incident_type = IncidentType(
- **incident_type_in.dict(exclude={"commander_service", "template_document"}),
- commander_service=commander_service,
- template_document=template_document,
- )
- db_session.add(incident_type)
- db_session.commit()
- return incident_type
-
-
-def update(
- *, db_session, incident_type: IncidentType, incident_type_in: IncidentTypeUpdate
-) -> IncidentType:
- """Updates an incident type."""
- if incident_type_in.template_document:
- template_document = document_service.get(
- db_session=db_session, document_id=incident_type_in.template_document.id
- )
- incident_type.template_document = template_document
-
- if incident_type_in.commander_service:
- commander_service = service_service.get(
- db_session=db_session, service_id=incident_type_in.commander_service.id
- )
- incident_type.commander_service = commander_service
-
- incident_type_data = jsonable_encoder(incident_type)
-
- update_data = incident_type_in.dict(
- skip_defaults=True, exclude={"commander_service", "template_document"}
- )
-
- for field in incident_type_data:
- if field in update_data:
- setattr(incident_type, field, update_data[field])
-
- db_session.add(incident_type)
- db_session.commit()
- return incident_type
-
-
-def delete(*, db_session, incident_type_id: int):
- """Deletes an incident type."""
- db_session.query(IncidentType).filter(IncidentType.id == incident_type_id).delete()
- db_session.commit()
diff --git a/src/dispatch/incident_type/views.py b/src/dispatch/incident_type/views.py
deleted file mode 100644
index 7c62a23cffac..000000000000
--- a/src/dispatch/incident_type/views.py
+++ /dev/null
@@ -1,85 +0,0 @@
-from typing import List
-
-from fastapi import APIRouter, Depends, HTTPException, Query
-from sqlalchemy.orm import Session
-
-from dispatch.database import get_db, search_filter_sort_paginate
-
-from .models import IncidentTypeCreate, IncidentTypePagination, IncidentTypeRead, IncidentTypeUpdate
-from .service import create, get, update
-
-router = APIRouter()
-
-
-@router.get("/", response_model=IncidentTypePagination, tags=["incident_types"])
-def get_incident_types(
- db_session: Session = Depends(get_db),
- page: int = 1,
- items_per_page: int = Query(5, alias="itemsPerPage"),
- query_str: str = Query(None, alias="q"),
- sort_by: List[str] = Query(None, alias="sortBy[]"),
- descending: List[bool] = Query(None, alias="descending[]"),
- fields: List[str] = Query(None, alias="field[]"),
- ops: List[str] = Query(None, alias="op[]"),
- values: List[str] = Query(None, alias="value[]"),
-):
- """
- Returns all incident types.
- """
- return search_filter_sort_paginate(
- db_session=db_session,
- model="IncidentType",
- query_str=query_str,
- page=page,
- items_per_page=items_per_page,
- sort_by=sort_by,
- descending=descending,
- fields=fields,
- values=values,
- ops=ops,
- )
-
-
-@router.post("/", response_model=IncidentTypeRead)
-def create_incident_type(
- *, db_session: Session = Depends(get_db), incident_type_in: IncidentTypeCreate
-):
- """
- Create a new incident_type.
- """
- incident_type = create(db_session=db_session, incident_type_in=incident_type_in)
- return incident_type
-
-
-@router.put("/{incident_type_id}", response_model=IncidentTypeRead)
-def update_incident_type(
- *,
- db_session: Session = Depends(get_db),
- incident_type_id: int,
- incident_type_in: IncidentTypeUpdate,
-):
- """
- Update an existing incident_type.
- """
- incident_type = get(db_session=db_session, incident_type_id=incident_type_id)
- if not incident_type:
- raise HTTPException(
- status_code=404, detail="The incident_type with this id does not exist."
- )
- incident_type = update(
- db_session=db_session, incident_type=incident_type, incident_type_in=incident_type_in
- )
- return incident_type
-
-
-@router.get("/{incident_type_id}", response_model=IncidentTypeRead)
-def get_incident_type(*, db_session: Session = Depends(get_db), incident_type_id: int):
- """
- Get a single incident_type.
- """
- incident_type = get(db_session=db_session, incident_type_id=incident_type_id)
- if not incident_type:
- raise HTTPException(
- status_code=404, detail="The incident_type with this id does not exist."
- )
- return incident_type
diff --git a/src/dispatch/individual/models.py b/src/dispatch/individual/models.py
index b2c7811e423b..0597c02c0586 100644
--- a/src/dispatch/individual/models.py
+++ b/src/dispatch/individual/models.py
@@ -1,105 +1,158 @@
-from datetime import datetime
-from typing import List, Optional
+"""Models for individual contact resources in the Dispatch application."""
-from sqlalchemy import Column, ForeignKey, Integer, PrimaryKeyConstraint, String, Table
+from datetime import datetime
+from pydantic import field_validator, Field, ConfigDict
+from urllib.parse import urlparse
+
+from sqlalchemy import (
+ Column,
+ ForeignKey,
+ Integer,
+ PrimaryKeyConstraint,
+ String,
+ Table,
+ UniqueConstraint,
+)
from sqlalchemy.orm import relationship
from sqlalchemy_utils import TSVectorType
-from dispatch.database import Base
-from dispatch.incident_priority.models import IncidentPriorityCreate, IncidentPriorityRead
-from dispatch.incident_type.models import IncidentTypeCreate, IncidentTypeRead
-from dispatch.term.models import TermCreate
-from dispatch.models import ContactBase, ContactMixin, DispatchBase, TermReadNested
+from dispatch.database.core import Base
+from dispatch.project.models import ProjectRead
+from dispatch.search_filter.models import SearchFilterRead
+from dispatch.models import (
+ ContactBase,
+ ContactMixin,
+ ProjectMixin,
+ PrimaryKey,
+ Pagination,
+ TimeStampMixin,
+ DispatchBase,
+)
# Association tables for many to many relationships
-assoc_individual_contact_incident_types = Table(
- "assoc_individual_contact_incident_type",
+assoc_individual_contact_filters = Table(
+ "assoc_individual_contact_filters",
Base.metadata,
- Column("incident_type_id", Integer, ForeignKey("incident_type.id")),
- Column("individual_contact_id", Integer, ForeignKey("individual_contact.id")),
- PrimaryKeyConstraint("incident_type_id", "individual_contact_id"),
+ Column(
+ "individual_contact_id", Integer, ForeignKey("individual_contact.id", ondelete="CASCADE")
+ ),
+ Column("search_filter_id", Integer, ForeignKey("search_filter.id", ondelete="CASCADE")),
+ PrimaryKeyConstraint("individual_contact_id", "search_filter_id"),
)
-assoc_individual_contact_incident_priorities = Table(
- "assoc_individual_contact_incident_priority",
- Base.metadata,
- Column("incident_priority_id", Integer, ForeignKey("incident_priority.id")),
- Column("individual_contact_id", Integer, ForeignKey("individual_contact.id")),
- PrimaryKeyConstraint("incident_priority_id", "individual_contact_id"),
-)
-assoc_individual_contact_terms = Table(
- "assoc_individual_contact_terms",
- Base.metadata,
- Column("term_id", Integer, ForeignKey("term.id")),
- Column("individual_contact_id", ForeignKey("individual_contact.id")),
- PrimaryKeyConstraint("term_id", "individual_contact_id"),
-)
+class IndividualContact(Base, ContactMixin, ProjectMixin, TimeStampMixin):
+ """SQLAlchemy model for individual contact resources."""
+ __table_args__ = (UniqueConstraint("email", "project_id"),)
-class IndividualContact(ContactMixin, Base):
id = Column(Integer, primary_key=True)
name = Column(String)
mobile_phone = Column(String)
office_phone = Column(String)
title = Column(String)
weblink = Column(String)
+ external_id = Column(String)
+ events = relationship("Event", backref="individual")
+ service_feedback = relationship("ServiceFeedback", backref="individual")
+
+ filters = relationship(
+ "SearchFilter", secondary=assoc_individual_contact_filters, backref="individuals"
+ )
team_contact_id = Column(Integer, ForeignKey("team_contact.id"))
team_contact = relationship("TeamContact", backref="individuals")
- # this is a self referential relationship lets punt on this for now.
- # relationship_owner_id = Column(Integer, ForeignKey("individual_contact.id"))
- # relationship_owner = relationship("IndividualContact", backref="individual_contacts")
- participant = relationship("Participant", lazy="subquery", backref="individual")
- incident_types = relationship(
- "IncidentType", secondary=assoc_individual_contact_incident_types, backref="individuals"
- )
- incident_priorities = relationship(
- "IncidentPriority",
- secondary=assoc_individual_contact_incident_priorities,
- backref="individuals",
- )
- terms = relationship("Term", secondary=assoc_individual_contact_terms, backref="individuals")
search_vector = Column(
TSVectorType(
"name",
"title",
+ "email",
"company",
"notes",
- weights={"name": "A", "title": "B", "company": "C", "notes": "D"},
+ weights={"name": "A", "email": "B", "title": "C", "company": "D"},
)
)
class IndividualContactBase(ContactBase):
- weblink: Optional[str]
- mobile_phone: Optional[str]
- office_phone: Optional[str]
- title: Optional[str]
+ """Base Pydantic model for individual contact resources."""
+
+ mobile_phone: str | None = Field(default=None)
+ office_phone: str | None = Field(default=None)
+ title: str | None = Field(default=None)
+ weblink: str | None = Field(default=None)
+ external_id: str | None = Field(default=None)
+
+ @field_validator("weblink")
+ @classmethod
+ def weblink_validator(cls, v: str | None) -> str | None:
+ """Validates the weblink field to be None, empty string, or a valid URL (internal or external)."""
+ if v is None or v == "":
+ return v
+ result = urlparse(v)
+ if all([result.scheme, result.netloc]):
+ return v
+ raise ValueError("weblink must be empty or a valid URL")
class IndividualContactCreate(IndividualContactBase):
- terms: Optional[List[TermCreate]] = []
- incident_priorities: Optional[List[IncidentPriorityCreate]] = []
- incident_types: Optional[List[IncidentTypeCreate]] = []
+ """Pydantic model for creating an individual contact resource."""
+
+ filters: list[SearchFilterRead] | None = None
+ project: ProjectRead
class IndividualContactUpdate(IndividualContactBase):
- terms: Optional[List[TermCreate]] = []
- incident_priorities: Optional[List[IncidentPriorityCreate]] = []
- incident_types: Optional[List[IncidentTypeCreate]] = []
+ """Pydantic model for updating an individual contact resource."""
+
+ filters: list[SearchFilterRead] | None = None
+ project: ProjectRead | None = None
class IndividualContactRead(IndividualContactBase):
- id: int
- terms: Optional[List[TermReadNested]] = []
- incident_priorities: Optional[List[IncidentPriorityRead]] = []
- incident_types: Optional[List[IncidentTypeRead]] = []
- created_at: Optional[datetime] = None
- updated_at: Optional[datetime] = None
+ """Pydantic model for reading an individual contact resource."""
+
+ id: PrimaryKey | None = None
+ filters: list[SearchFilterRead] = []
+ created_at: datetime | None = None
+ updated_at: datetime | None = None
+
+
+# Creating a more minimal version that doesn't inherit from ContactBase to avoid email validation issues in tests
+class IndividualContactReadMinimal(DispatchBase):
+ """Pydantic model for reading a minimal individual contact resource."""
+
+ id: PrimaryKey
+ created_at: datetime | None = None
+ updated_at: datetime | None = None
+ # Adding only required fields from ContactBase and IndividualContactBase
+ email: str | None = None # Not using EmailStr for tests
+ name: str | None = None
+ is_active: bool | None = True
+ is_external: bool | None = False
+ company: str | None = None
+ contact_type: str | None = None
+ notes: str | None = None
+ owner: str | None = None
+ mobile_phone: str | None = None
+ office_phone: str | None = None
+ title: str | None = None
+ weblink: str | None = None
+ external_id: str | None = None
+ auto_add_to_incident_bridges: bool | None = True
+
+ # Ensure validation is turned off for tests
+ model_config = ConfigDict(
+ extra="ignore",
+ validate_default=False,
+ validate_assignment=False,
+ arbitrary_types_allowed=True,
+ )
+
+class IndividualContactPagination(Pagination):
+ """Pydantic model for paginated individual contact results."""
-class IndividualContactPagination(DispatchBase):
total: int
- items: List[IndividualContactRead] = []
+ items: list[IndividualContactRead] = []
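The `weblink_validator` added above accepts `None` or an empty string and otherwise requires a parseable URL with both a scheme and a netloc. The rule can be exercised standalone (a sketch of the same check outside Pydantic, so its behavior is easy to see):

```python
# Standalone version of the weblink validation rule: None/empty pass
# through; anything else must parse with both a scheme and a netloc.
from urllib.parse import urlparse

def validate_weblink(v):
    if v is None or v == "":
        return v
    result = urlparse(v)
    if all([result.scheme, result.netloc]):
        return v
    raise ValueError("weblink must be empty or a valid URL")
```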
diff --git a/src/dispatch/individual/service.py b/src/dispatch/individual/service.py
index 70a13d880bc4..5edc691c151b 100644
--- a/src/dispatch/individual/service.py
+++ b/src/dispatch/individual/service.py
@@ -1,23 +1,29 @@
-from typing import List, Optional
+from functools import lru_cache
-from fastapi.encoders import jsonable_encoder
+from pydantic import ValidationError
+from sqlalchemy.orm import Session
-from dispatch.config import INCIDENT_PLUGIN_CONTACT_SLUG
-from dispatch.incident_priority import service as incident_priority_service
-from dispatch.incident_type import service as incident_type_service
-from dispatch.plugins.base import plugins
-from dispatch.term import service as term_service
+from dispatch.plugin.models import PluginInstance
+from dispatch.project.models import Project, ProjectRead
+from dispatch.plugin import service as plugin_service
+from dispatch.project import service as project_service
+from dispatch.search_filter import service as search_filter_service
-from .models import IndividualContact, IndividualContactCreate, IndividualContactUpdate
+from .models import (
+ IndividualContact,
+ IndividualContactCreate,
+ IndividualContactRead,
+ IndividualContactUpdate,
+)
-def resolve_user_by_email(email):
+def resolve_user_by_email(email: str, db_session: Session):
"""Resolves a user's details given their email."""
- p = plugins.get(INCIDENT_PLUGIN_CONTACT_SLUG)
- return p.get(email)
+ plugin = plugin_service.get_active_instance(db_session=db_session, plugin_type="contact")
+ return plugin.instance.get(email)
-def get(*, db_session, individual_contact_id: int) -> Optional[IndividualContact]:
+def get(*, db_session: Session, individual_contact_id: int) -> IndividualContact | None:
"""Returns an individual given an individual id."""
return (
db_session.query(IndividualContact)
@@ -26,54 +32,118 @@ def get(*, db_session, individual_contact_id: int) -> Optional[IndividualContact
)
-def get_by_email(*, db_session, email: str) -> Optional[IndividualContact]:
- """Returns an individual given an individual email address."""
+def get_by_email_and_project(
+ *, db_session: Session, email: str, project_id: int
+) -> IndividualContact | None:
+ """Returns an individual given an email address and project id."""
return (
- db_session.query(IndividualContact).filter(IndividualContact.email == email).one_or_none()
+ db_session.query(IndividualContact)
+ .filter(IndividualContact.email == email)
+ .filter(IndividualContact.project_id == project_id)
+ .one_or_none()
+ )
+
+
+def get_by_email_and_project_id_or_raise(
+ *, db_session: Session, project_id: int, individual_contact_in: IndividualContactRead
+) -> IndividualContactRead:
+ """Returns the individual specified or raises ValidationError."""
+ individual_contact = get_by_email_and_project(
+ db_session=db_session, project_id=project_id, email=individual_contact_in.email
)
+ if not individual_contact:
+ raise ValidationError.from_exception_data(
+ "IndividualContactRead",
+ [
+ {
+ "type": "value_error",
+ "loc": ("individual",),
+ "msg": "Individual not found.",
+ "input": individual_contact_in.email,
+ }
+ ],
+ )
+
+ return individual_contact
+
-def get_all(*, db_session) -> List[Optional[IndividualContact]]:
+def get_all(*, db_session) -> list[IndividualContact | None]:
"""Returns all individuals."""
return db_session.query(IndividualContact)
-def get_or_create(*, db_session, email: str, **kwargs) -> IndividualContact:
+@lru_cache(maxsize=1000)
+def fetch_individual_info(contact_plugin: PluginInstance, email: str, db_session: Session):
+ return contact_plugin.instance.get(email, db_session=db_session)
+
+
+def get_or_create(
+ *, db_session: Session, email: str, project: Project, **kwargs
+) -> IndividualContact:
"""Gets or creates an individual."""
- contact = get_by_email(db_session=db_session, email=email)
+ # we fetch the individual contact from the database
+ individual_contact = get_by_email_and_project(
+ db_session=db_session, email=email, project_id=project.id
+ )
- if not contact:
- contact_plugin = plugins.get(INCIDENT_PLUGIN_CONTACT_SLUG)
- individual_info = contact_plugin.get(email)
- kwargs["email"] = individual_info["email"]
- kwargs["name"] = individual_info["fullname"]
- kwargs["weblink"] = individual_info["weblink"]
- individual_contact_in = IndividualContactCreate(**kwargs)
- contact = create(db_session=db_session, individual_contact_in=individual_contact_in)
+ # we try to fetch the individual's contact information using the contact plugin
+ contact_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=project.id, plugin_type="contact"
+ )
- return contact
+ individual_info = {}
+ if contact_plugin:
+ individual_info = fetch_individual_info(contact_plugin, email, db_session)
+
+ kwargs["email"] = individual_info.get("email", email)
+ kwargs["name"] = individual_info.get("fullname", email.split("@")[0].capitalize())
+ kwargs["weblink"] = individual_info.get("weblink", "")
+
+ # Use Pydantic's model_validate to convert SQLAlchemy Project to ProjectRead
+ project_read = ProjectRead.model_validate(project)
+ if project_read.annual_employee_cost is None:
+ project_read.annual_employee_cost = 50000
+ if project_read.business_year_hours is None:
+ project_read.business_year_hours = 2080
+
+ if not individual_contact:
+ # we create a new contact
+ individual_contact_in = IndividualContactCreate(**kwargs, project=project_read)
+ individual_contact = create(
+ db_session=db_session, individual_contact_in=individual_contact_in
+ )
+ else:
+ # we update the existing contact
+ individual_contact_in = IndividualContactUpdate(**kwargs, project=project_read)
+ individual_contact = update(
+ db_session=db_session,
+ individual_contact=individual_contact,
+ individual_contact_in=individual_contact_in,
+ )
+
+ return individual_contact
-def create(*, db_session, individual_contact_in: IndividualContactCreate) -> IndividualContact:
+def create(
+ *, db_session: Session, individual_contact_in: IndividualContactCreate
+) -> IndividualContact:
"""Creates an individual."""
- terms = [
- term_service.get_or_create(db_session=db_session, term_in=t)
- for t in individual_contact_in.terms
- ]
- incident_priorities = [
- incident_priority_service.get_by_name(db_session=db_session, name=n.name)
- for n in individual_contact_in.incident_priorities
- ]
- incident_types = [
- incident_type_service.get_by_name(db_session=db_session, name=n.name)
- for n in individual_contact_in.incident_types
- ]
+ project = project_service.get_by_name_or_raise(
+ db_session=db_session, project_in=individual_contact_in.project
+ )
+
contact = IndividualContact(
- **individual_contact_in.dict(exclude={"terms", "incident_priorities", "incident_types"}),
- terms=terms,
- incident_types=incident_types,
- incident_priorities=incident_priorities,
+ **individual_contact_in.dict(exclude={"project", "filters"}),
+ project=project,
)
+
+ if individual_contact_in.filters is not None:
+ filters = [
+ search_filter_service.get(db_session=db_session, search_filter_id=f.id)
+ for f in individual_contact_in.filters
+ ]
+ contact.filters = filters
+
db_session.add(contact)
db_session.commit()
return contact
@@ -81,41 +151,31 @@ def create(*, db_session, individual_contact_in: IndividualContactCreate) -> Ind
def update(
*,
- db_session,
+ db_session: Session,
individual_contact: IndividualContact,
individual_contact_in: IndividualContactUpdate,
) -> IndividualContact:
- individual_contact_data = jsonable_encoder(individual_contact_in)
-
- terms = [
- term_service.get_or_create(db_session=db_session, term_in=t)
- for t in individual_contact_in.terms
- ]
- incident_priorities = [
- incident_priority_service.get_by_name(db_session=db_session, name=n.name)
- for n in individual_contact_in.incident_priorities
- ]
- incident_types = [
- incident_type_service.get_by_name(db_session=db_session, name=n.name)
- for n in individual_contact_in.incident_types
- ]
- update_data = individual_contact_in.dict(
- skip_defaults=True, exclude={"terms", "incident_priorities", "incident_types"}
- )
+ """Updates an individual."""
+ individual_contact_data = individual_contact.dict()
+ update_data = individual_contact_in.dict(exclude_unset=True, exclude={"filters"})
for field in individual_contact_data:
if field in update_data:
- setattr(individual_contact_in, field, update_data[field])
+ setattr(individual_contact, field, update_data[field])
+
+ if individual_contact_in.filters is not None:
+ filters = [
+ search_filter_service.get(db_session=db_session, search_filter_id=f.id)
+ for f in individual_contact_in.filters
+ ]
+ individual_contact.filters = filters
- individual_contact.terms = terms
- individual_contact.incident_types = incident_types
- individual_contact.incident_priorities = incident_priorities
- db_session.add(individual_contact)
db_session.commit()
return individual_contact
-def delete(*, db_session, individual_contact_id: int):
+def delete(*, db_session: Session, individual_contact_id: int):
+ """Deletes an individual."""
individual = (
db_session.query(IndividualContact)
.filter(IndividualContact.id == individual_contact_id)
diff --git a/src/dispatch/individual/views.py b/src/dispatch/individual/views.py
index 727bc17459f2..3214fd2bf692 100644
--- a/src/dispatch/individual/views.py
+++ b/src/dispatch/individual/views.py
@@ -1,9 +1,14 @@
-from typing import List
+from fastapi import APIRouter, Depends
+from pydantic import ValidationError
-from fastapi import APIRouter, Depends, HTTPException, Query
-from sqlalchemy.orm import Session
-
-from dispatch.database import get_db, search_filter_sort_paginate
+from dispatch.auth.permissions import (
+ PermissionsDependency,
+ SensitiveProjectActionPermission,
+ IndividualContactUpdatePermission,
+)
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.models import PrimaryKey
from .models import (
IndividualContactCreate,
@@ -11,95 +16,108 @@
IndividualContactRead,
IndividualContactUpdate,
)
-from .service import create, delete, get, get_by_email, update
-
-router = APIRouter()
+from .service import get, get_by_email_and_project, create, update, delete
-@router.get("/", response_model=IndividualContactPagination)
-def get_individuals(
- db_session: Session = Depends(get_db),
- page: int = 1,
- items_per_page: int = Query(5, alias="itemsPerPage"),
- query_str: str = Query(None, alias="q"),
- sort_by: List[str] = Query(None, alias="sortBy[]"),
- descending: List[bool] = Query(None, alias="descending[]"),
- fields: List[str] = Query(None, alias="field[]"),
- ops: List[str] = Query(None, alias="op[]"),
- values: List[str] = Query(None, alias="value[]"),
-):
- """
- Retrieve individual contacts.
- """
- return search_filter_sort_paginate(
- db_session=db_session,
- model="IndividualContact",
- query_str=query_str,
- page=page,
- items_per_page=items_per_page,
- sort_by=sort_by,
- descending=descending,
- fields=fields,
- values=values,
- ops=ops,
- )
-
-
-@router.post("/", response_model=IndividualContactRead)
-def create_individual(
- *, db_session: Session = Depends(get_db), individual_contact_in: IndividualContactCreate
-):
- """
- Create a new individual contact.
- """
- individual = get_by_email(db_session=db_session, email=individual_contact_in.email)
- if individual:
- raise HTTPException(
- status_code=400, detail="The individual with this email already exists."
- )
- individual = create(db_session=db_session, individual_contact_in=individual_contact_in)
- return individual
+router = APIRouter()
@router.get("/{individual_contact_id}", response_model=IndividualContactRead)
-def get_individual(*, db_session: Session = Depends(get_db), individual_contact_id: int):
- """
- Get a individual contact.
- """
+def get_individual(db_session: DbSession, individual_contact_id: PrimaryKey):
+ """Gets an individual contact."""
individual = get(db_session=db_session, individual_contact_id=individual_contact_id)
if not individual:
- raise HTTPException(status_code=404, detail="The individual with this id does not exist.")
+ raise ValidationError.from_exception_data(
+ "IndividualContactRead",
+ [
+ {
+ "type": "value_error",
+ "loc": ("individual",),
+ "msg": "Individual not found.",
+ "input": individual_contact_id,
+ }
+ ],
+ )
return individual
-@router.put("/{individual_contact_id}", response_model=IndividualContactRead)
+@router.get("", response_model=IndividualContactPagination)
+def get_individuals(common: CommonParameters):
+ """Retrieve individual contacts."""
+ return search_filter_sort_paginate(model="IndividualContact", **common)
+
+
+@router.post("", response_model=IndividualContactRead)
+def create_individual(db_session: DbSession, individual_contact_in: IndividualContactCreate):
+ """Creates a new individual contact."""
+ individual = get_by_email_and_project(
+ db_session=db_session,
+ email=individual_contact_in.email,
+ project_id=individual_contact_in.project.id,
+ )
+ if individual:
+ raise ValidationError.from_exception_data(
+ "IndividualContactRead",
+ [
+ {
+ "type": "value_error",
+ "loc": ("email",),
+ "msg": "An individual with this email already exists.",
+ "input": individual_contact_in.email,
+ }
+ ],
+ )
+ return create(db_session=db_session, individual_contact_in=individual_contact_in)
+
+
+@router.put(
+ "/{individual_contact_id}",
+ response_model=IndividualContactRead,
+ summary="Updates an individual's contact information.",
+ dependencies=[Depends(PermissionsDependency([IndividualContactUpdatePermission]))],
+)
def update_individual(
- *,
- db_session: Session = Depends(get_db),
- individual_contact_id: int,
+ db_session: DbSession,
+ individual_contact_id: PrimaryKey,
individual_contact_in: IndividualContactUpdate,
):
- """
- Update a individual contact.
- """
+ """Updates an individual contact."""
individual = get(db_session=db_session, individual_contact_id=individual_contact_id)
if not individual:
- raise HTTPException(status_code=404, detail="The individual with this id does not exist.")
- individual = update(
+ raise ValidationError.from_exception_data(
+ "IndividualContactRead",
+ [
+ {
+ "type": "value_error",
+ "loc": ("individual",),
+ "msg": "Individual not found.",
+ "input": individual_contact_id,
+ }
+ ],
+ )
+ return update(
db_session=db_session,
individual_contact=individual,
individual_contact_in=individual_contact_in,
)
- return individual
-@router.delete("/{individual_contact_id}")
-def delete_individual(*, db_session: Session = Depends(get_db), individual_contact_id: int):
- """
- Delete a individual contact.
- """
+@router.delete(
+ "/{individual_contact_id}",
+ response_model=None,
+ summary="Deletes an individual contact.",
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def delete_individual(db_session: DbSession, individual_contact_id: PrimaryKey):
+ """Deletes an individual contact."""
individual = get(db_session=db_session, individual_contact_id=individual_contact_id)
if not individual:
- raise HTTPException(status_code=404, detail="The individual with this id does not exist.")
-
+ raise ValidationError.from_exception_data(
+ "IndividualContactRead",
+ [
+ {
+ "type": "value_error",
+ "loc": ("individual",),
+ "msg": "Individual not found.",
+ "input": individual_contact_id,
+ }
+ ],
+ )
delete(db_session=db_session, individual_contact_id=individual_contact_id)
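The rewritten endpoints above repeat the same not-found error payload three times. A small helper capturing that payload shape (purely illustrative; the diff itself inlines the dict at each call site):

```python
def not_found_error(resource: str, input_value) -> dict:
    """Build the single error entry these endpoints raise when a record is missing."""
    return {
        "type": "value_error",
        "loc": (resource,),
        "msg": f"{resource.capitalize()} not found.",
        "input": input_value,
    }
```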
diff --git a/src/dispatch/logging.py b/src/dispatch/logging.py
index 350b0b383cbd..27ea76aa395a 100644
--- a/src/dispatch/logging.py
+++ b/src/dispatch/logging.py
@@ -1,11 +1,33 @@
import logging
+
from dispatch.config import LOG_LEVEL
+from dispatch.enums import DispatchEnum
+
+
+LOG_FORMAT_DEBUG = "%(levelname)s:%(message)s:%(pathname)s:%(funcName)s:%(lineno)d"
+
+
+class LogLevels(DispatchEnum):
+ info = "INFO"
+ warn = "WARN"
+ error = "ERROR"
+ debug = "DEBUG"
def configure_logging():
- if LOG_LEVEL == "DEBUG":
- # log level:logged message:full module path:function invoked:line number of logging call
- LOGFORMAT = "%(levelname)s:%(message)s:%(pathname)s:%(funcName)s:%(lineno)d"
- logging.basicConfig(level=LOG_LEVEL, format=LOGFORMAT)
- else:
- logging.basicConfig(level=LOG_LEVEL)
+ log_level = str(LOG_LEVEL).upper() # cast to string
+ log_levels = list(LogLevels)
+
+ if log_level not in log_levels:
+ # we use error as the default log level
+ logging.basicConfig(level=LogLevels.error)
+ return
+
+ if log_level == LogLevels.debug:
+ logging.basicConfig(level=log_level, format=LOG_FORMAT_DEBUG)
+ return
+
+ logging.basicConfig(level=log_level)
+
+ # sometimes the slack client can be too verbose
+ logging.getLogger("slack_sdk.web.base_client").setLevel(logging.CRITICAL)
diff --git a/src/dispatch/main.py b/src/dispatch/main.py
index 01affb3626ef..0b7428e6e40b 100644
--- a/src/dispatch/main.py
+++ b/src/dispatch/main.py
@@ -1,45 +1,66 @@
-import time
import logging
-from tabulate import tabulate
+import time
+from contextvars import ContextVar
from os import path
-
-# from starlette.middleware.gzip import GZipMiddleware
-from fastapi import FastAPI
+from uuid import uuid1
+import warnings
+from typing import Final
+from fastapi import FastAPI, status
+from fastapi.responses import JSONResponse
+from pydantic import ValidationError
from sentry_asgi import SentryMiddleware
-from starlette.applications import Starlette
+from slowapi import _rate_limit_exceeded_handler
+from slowapi.errors import RateLimitExceeded
+from sqlalchemy import inspect
+from sqlalchemy.orm import scoped_session, sessionmaker
from starlette.middleware.base import BaseHTTPMiddleware, RequestResponseEndpoint
+from starlette.middleware.gzip import GZipMiddleware
from starlette.requests import Request
from starlette.responses import FileResponse, Response, StreamingResponse
+from starlette.routing import compile_path
from starlette.staticfiles import StaticFiles
-import httpx
from .api import api_router
-from .config import STATIC_DIR
-from .database import SessionLocal
-from .metrics import provider as metric_provider
-from .logging import configure_logging
+from .common.utils.cli import install_plugin_events, install_plugins
+from .config import (
+ STATIC_DIR,
+)
+from .database.core import engine
+from .database.logging import SessionTracker
from .extensions import configure_extensions
+from .logging import configure_logging
+from .metrics import provider as metric_provider
+from .rate_limiter import limiter
+
+# Filter out Pydantic migration warnings
+warnings.filterwarnings("ignore", message=".*has been moved to.*")
log = logging.getLogger(__name__)
-app = Starlette()
-frontend = Starlette()
+# we configure the logging level and format
+configure_logging()
-api = FastAPI(docs_url=None, redoc_url=None, openapi_url=None)
+# we configure the extensions such as Sentry
+configure_extensions()
-api.include_router(api_router, prefix="/v1")
-if STATIC_DIR:
- frontend.mount("/", StaticFiles(directory=STATIC_DIR), name="app")
+async def not_found(request, exc):
+ return JSONResponse(
+ status_code=status.HTTP_404_NOT_FOUND, content={"detail": [{"msg": "Not Found."}]}
+ )
-app.mount("/api", app=api)
-app.mount("/", app=frontend)
+exception_handlers = {404: not_found}
-def get_path_template(request: Request) -> str:
- if hasattr(request, "path"):
- return ",".join(request.path.split("/")[1:4])
- return ".".join(request.url.path.split("/")[1:4])
+# we create the ASGI app
+app = FastAPI(exception_handlers=exception_handlers, openapi_url="")
+app.state.limiter = limiter
+app.add_exception_handler(RateLimitExceeded, _rate_limit_exceeded_handler)
+app.add_middleware(GZipMiddleware, minimum_size=1000)
+
+# we create the ASGI app for the frontend
+frontend = FastAPI(openapi_url="")
+frontend.add_middleware(GZipMiddleware, minimum_size=1000)
@frontend.middleware("http")
@@ -48,29 +69,124 @@ async def default_page(request, call_next):
if response.status_code == 404:
if STATIC_DIR:
return FileResponse(path.join(STATIC_DIR, "index.html"))
- else:
- async with httpx.AsyncClient() as client:
- remote_resp = await client.get(
- str(request.url.replace(port=8080)), headers=dict(request.headers)
- )
- return StreamingResponse(
- remote_resp.aiter_bytes(),
- headers=remote_resp.headers,
- status_code=remote_resp.status_code,
- media_type=remote_resp.headers.get("content-type"),
- )
return response
-@app.middleware("http")
+# we create the Web API framework
+api = FastAPI(
+ title="Dispatch",
+ description="Welcome to Dispatch's API documentation! Here you will be able to discover all of the ways you can interact with the Dispatch API.",
+ root_path="/api/v1",
+ docs_url=None,
+ openapi_url="/docs/openapi.json",
+ redoc_url="/docs",
+)
+api.add_middleware(GZipMiddleware, minimum_size=1000)
+
+
+def get_path_params_from_request(request: Request) -> dict:
+ path_params = {}
+ for r in api_router.routes:
+ path_regex, path_format, param_converters = compile_path(r.path)
+ path = request["path"].removeprefix("/api/v1") # remove the /api/v1 for matching
+ match = path_regex.match(path)
+ if match:
+ path_params = match.groupdict()
+ return path_params
+
+
+def get_path_template(request: Request) -> str:
+ if hasattr(request, "path"):
+ return ",".join(request.path.split("/")[1:])
+ return ".".join(request.url.path.split("/")[1:])
+
+
+REQUEST_ID_CTX_KEY: Final[str] = "request_id"
+_request_id_ctx_var: ContextVar[str | None] = ContextVar(REQUEST_ID_CTX_KEY, default=None)
+
+
+def get_request_id() -> str | None:
+ return _request_id_ctx_var.get()
+
+
+@api.middleware("http")
async def db_session_middleware(request: Request, call_next):
- response = Response("Internal Server Error", status_code=500)
+ request_id = str(uuid1())
+
+ # we create a per-request id such that we can ensure that our session is scoped for a particular request.
+ # see: https://github.com/tiangolo/fastapi/issues/726
+ ctx_token = _request_id_ctx_var.set(request_id)
+ session = None
+
try:
- request.state.db = SessionLocal()
+ path_params = get_path_params_from_request(request)
+
+ # if this call is organization specific set the correct search path
+ organization_slug = path_params.get("organization", "default")
+ request.state.organization = organization_slug
+ schema = f"dispatch_organization_{organization_slug}"
+
+ # validate slug exists
+ schema_names = inspect(engine).get_schema_names()
+ if schema not in schema_names:
+ return JSONResponse(
+ status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
+ content={"detail": [{"msg": f"Unknown database schema name: {schema}"}]},
+ )
+
+ # add correct schema mapping depending on the request
+ schema_engine = engine.execution_options(
+ schema_translate_map={
+ None: schema,
+ }
+ )
+
+ session = scoped_session(sessionmaker(bind=schema_engine), scopefunc=get_request_id)
+ request.state.db = session()
+
+ # we track the session
+ request.state.db._dispatch_session_id = SessionTracker.track_session(
+ request.state.db, context=f"api_request_{organization_slug}"
+ )
+
response = await call_next(request)
+
+ # If we got here without exceptions, commit any pending changes
+ if hasattr(request.state, "db") and request.state.db.is_active:
+ request.state.db.commit()
+
+ return response
+
+ except Exception as e:
+ # Explicitly rollback on exceptions
+ try:
+ if hasattr(request.state, "db") and request.state.db.is_active:
+ request.state.db.rollback()
+ except Exception as rollback_error:
+ logging.error(f"Error during rollback: {rollback_error}")
+
+ # Re-raise the original exception
+ raise e from None
finally:
- request.state.db.close()
- return response
+ # Always clean up resources
+ if hasattr(request.state, "db"):
+ # Untrack the session
+ if hasattr(request.state.db, "_dispatch_session_id"):
+ try:
+ SessionTracker.untrack_session(request.state.db._dispatch_session_id)
+ except Exception as untrack_error:
+ logging.error(f"Failed to untrack session: {untrack_error}")
+
+ # Close the session
+ try:
+ request.state.db.close()
+ if session is not None:
+ session.remove() # Remove the session from the registry
+ except Exception as close_error:
+ logging.error(f"Error closing database session: {close_error}")
+
+ # Always reset the context variable
+ _request_id_ctx_var.reset(ctx_token)
@app.middleware("http")
@@ -84,10 +200,6 @@ class MetricsMiddleware(BaseHTTPMiddleware):
async def dispatch(self, request: Request, call_next: RequestResponseEndpoint) -> Response:
path_template = get_path_template(request)
- # exclude non api requests e.g. static content
- if "api" not in path_template:
- return await call_next(request)
-
method = request.method
tags = {"method": method, "endpoint": path_template}
@@ -95,30 +207,63 @@ async def dispatch(self, request: Request, call_next: RequestResponseEndpoint) -
start = time.perf_counter()
response = await call_next(request)
elapsed_time = time.perf_counter() - start
+ tags.update({"status_code": response.status_code})
+ metric_provider.counter("server.call.counter", tags=tags)
+ metric_provider.timer("server.call.elapsed", value=elapsed_time, tags=tags)
+ log.debug(f"server.call.elapsed.{path_template}: {elapsed_time}")
except Exception as e:
metric_provider.counter("server.call.exception.counter", tags=tags)
raise e from None
- else:
- tags.update({"status_code": response.status_code})
- metric_provider.timer("server.call.elapsed", value=elapsed_time, tags=tags)
- metric_provider.counter("server.call.counter", tags=tags)
+ return response
+
+
+class ExceptionMiddleware(BaseHTTPMiddleware):
+ async def dispatch(
+ self, request: Request, call_next: RequestResponseEndpoint
+ ) -> StreamingResponse:
+ try:
+ response = await call_next(request)
+ except ValidationError as e:
+ log.exception(e)
+ response = JSONResponse(
+ status_code=status.HTTP_422_UNPROCESSABLE_ENTITY, content={"detail": e.errors()}
+ )
+ except ValueError as e:
+ log.exception(e)
+ response = JSONResponse(
+ status_code=status.HTTP_422_UNPROCESSABLE_ENTITY,
+ content={"detail": [{"msg": "Unknown", "loc": ["Unknown"], "type": "Unknown"}]},
+ )
+ except Exception as e:
+ log.exception(e)
+ response = JSONResponse(
+ status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
+ content={"detail": [{"msg": "Unknown", "loc": ["Unknown"], "type": "Unknown"}]},
+ )
return response
-app.add_middleware(SentryMiddleware)
-app.add_middleware(MetricsMiddleware)
-# app.add_middleware(GZipMiddleware)
+# we add a middleware class for logging exceptions to Sentry
+api.add_middleware(SentryMiddleware)
-configure_logging()
-configure_extensions()
+# we add a middleware class for capturing metrics using Dispatch's metrics provider
+api.add_middleware(MetricsMiddleware)
+
+api.add_middleware(ExceptionMiddleware)
+
+# we install all the plugins
+install_plugins()
-table = []
-for r in api_router.routes:
- auth = False
- for d in r.dependencies:
- if d.dependency.__name__ == "get_current_user": # TODO this is fragile
- auth = True
- table.append([r.path, auth, ",".join(r.methods)])
+# we add all the plugin event API routes to the API router
+install_plugin_events(api_router)
-log.debug("Available Endpoints \n" + tabulate(table, headers=["Path", "Authenticated", "Methods"]))
+# we add all API routes to the Web API framework
+api.include_router(api_router)
+
+# we mount the frontend and app
+if STATIC_DIR and path.isdir(STATIC_DIR):
+ frontend.mount("/", StaticFiles(directory=STATIC_DIR), name="app")
+
+app.mount("/api/v1", app=api)
+app.mount("/", app=frontend)
diff --git a/src/dispatch/messaging.py b/src/dispatch/messaging.py
deleted file mode 100644
index 86c16cd50a96..000000000000
--- a/src/dispatch/messaging.py
+++ /dev/null
@@ -1,462 +0,0 @@
-import copy
-from enum import Enum
-
-from jinja2 import Template
-
-from typing import List
-
-from dispatch.conversation.enums import ConversationButtonActions
-from dispatch.incident.enums import IncidentStatus
-from dispatch.incident_priority.models import IncidentPriorityType
-
-from .config import (
- DISPATCH_UI_URL,
- INCIDENT_RESOURCE_CONVERSATION_COMMANDS_REFERENCE_DOCUMENT,
- INCIDENT_RESOURCE_FAQ_DOCUMENT,
- INCIDENT_RESOURCE_INCIDENT_REVIEW_DOCUMENT,
- INCIDENT_RESOURCE_INVESTIGATION_DOCUMENT,
- INCIDENT_RESOURCE_INVESTIGATION_SHEET,
-)
-
-
-class MessageType(str, Enum):
- incident_daily_summary = "incident-daily-summary"
- incident_daily_summary_no_incidents = "incident-daily-summary-no-incidents"
- incident_notification = "incident-notification"
- incident_participant_welcome = "incident-participant-welcome"
- incident_resources_message = "incident-resources-message"
- incident_status_report = "incident-status-report"
- incident_task_list = "incident-task-list"
- incident_task_reminder = "incident-task-reminder"
-
-
-INCIDENT_PRIORITY_DESCRIPTIONS = {
- IncidentPriorityType.info: "This incident is in tracking only mode and is not under active investigation.",
- IncidentPriorityType.low: "This incident will require you to perform tasks during working hours until the incident is stable.",
- IncidentPriorityType.medium: "This incident requires your full attention during working hours until the incident is stable.",
- IncidentPriorityType.high: "This incident requires your full attention, and should be prioritized over all other work until the incident is stable.",
-}
-
-INCIDENT_PRIORITY_DESCRIPTIONS_FYI = {
- IncidentPriorityType.info: "This incident is in tracking only mode and is not under active investigation.",
- IncidentPriorityType.low: "This incident may require your team's attention during working hours, until the incident is stable.",
- IncidentPriorityType.medium: "This incident may require your team's full attention during working hours, until the incident is stable.",
- IncidentPriorityType.high: "This incident may require your team's full attention, and should be prioritized over all other work, until the incident is stable.",
-}
-
-INCIDENT_STATUS_DESCRIPTIONS = {
- IncidentStatus.active: "This incident is under active investigation.",
- IncidentStatus.stable: "This incident is stable, the bulk of the investigation has been completed or most of the risk has been mitigated.",
- IncidentStatus.closed: "This no longer requires additional involvement, long term incident action items have been assigned to their respective owners.",
-}
-
-INCIDENT_STATUS_REPORT_DESCRIPTION = """
-This is an incident status update.
-""".replace(
- "\n", " "
-).strip()
-
-INCIDENT_TASK_REMINDER_DESCRIPTION = """
-You are assigned to the following incident tasks.
-This is a reminder that these tasks have *passed* their due date.
-Please review and update as appropriate.""".replace(
- "\n", " "
-).strip()
-
-INCIDENT_TASK_LIST_DESCRIPTION = """The following are open incident tasks."""
-
-INCIDENT_DAILY_SUMMARY_DESCRIPTION = """
-Daily Incidents Summary""".replace(
- "\n", " "
-).strip()
-
-INCIDENT_DAILY_SUMMARY_ACTIVE_INCIDENTS_DESCRIPTION = f"""
-Active Incidents (<{DISPATCH_UI_URL}/incidents/status|Details>)""".replace(
- "\n", " "
-).strip()
-
-INCIDENT_DAILY_SUMMARY_NO_ACTIVE_INCIDENTS_DESCRIPTION = """
-There are no active incidents at this moment.""".replace(
- "\n", " "
-).strip()
-
-INCIDENT_DAILY_SUMMARY_STABLE_CLOSED_INCIDENTS_DESCRIPTION = """
-Stable or Closed Incidents (last 24 hours)""".replace(
- "\n", " "
-).strip()
-
-INCIDENT_DAILY_SUMMARY_NO_STABLE_CLOSED_INCIDENTS_DESCRIPTION = """
-There are no stable or closed incidents in the last 24 hours.""".replace(
- "\n", " "
-).strip()
-
-INCIDENT_COMMANDER_DESCRIPTION = """
-The Incident Commander (IC) is responsible for
-knowing the full context of the incident.
-Contact them about any questions or concerns.""".replace(
- "\n", " "
-).strip()
-
-INCIDENT_COMMANDER_READDED_DESCRIPTION = """
-{{ commander_fullname }} (Incident Commander) has been re-added to the conversation.
-Please, handoff the Incident Commander role before leaving the conversation.""".replace(
- "\n", " "
-).strip()
-
-INCIDENT_TICKET_DESCRIPTION = """
-Ticket for tracking purposes. It contains a description of
-the incident and links to resources.""".replace(
- "\n", " "
-).strip()
-
-INCIDENT_CONVERSATION_DESCRIPTION = """
-Private conversation for real-time discussion. All incident participants get added to it.
-""".replace(
- "\n", " "
-).strip()
-
-INCIDENT_CONVERSATION_COMMANDS_REFERENCE_DOCUMENT_DESCRIPTION = """
-Document containing the list of slash commands available to the Incident Commander (IC)
-and participants in the incident conversation.""".replace(
- "\n", " "
-).strip()
-
-INCIDENT_CONFERENCE_DESCRIPTION = """
-Video conference and phone bridge to be used throughout the incident. Password: {{conference_challenge}}
-""".replace(
- "\n", ""
-).strip()
-
-INCIDENT_STORAGE_DESCRIPTION = """
-Common storage for all incident artifacts and
-documents. Add logs, screen captures, or any other data collected during the
-investigation to this drive. It is shared with all incident participants.""".replace(
- "\n", " "
-).strip()
-
-INCIDENT_INVESTIGATION_DOCUMENT_DESCRIPTION = """
-This is a document for all incident facts and context. All
-incident participants are expected to contribute to this document.
-It is shared with all incident participants.""".replace(
- "\n", " "
-).strip()
-
-INCIDENT_INVESTIGATION_SHEET_DESCRIPTION = """
-This is a sheet for tracking impacted assets. All
-incident participants are expected to contribute to this sheet.
-It is shared with all incident participants.""".replace(
- "\n", " "
-).strip()
-
-INCIDENT_FAQ_DOCUMENT_DESCRIPTION = """
-First time responding to an information security incident? This
-document answers common questions encountered when
-helping us respond to an incident.""".replace(
- "\n", " "
-).strip()
-
-INCIDENT_REVIEW_DOCUMENT_DESCRIPTION = """
-This document will capture all lessons learned, questions, and action items raised during the incident.""".replace(
- "\n", " "
-).strip()
-
-INCIDENT_DOCUMENT_DESCRIPTIONS = {
- INCIDENT_RESOURCE_FAQ_DOCUMENT: INCIDENT_FAQ_DOCUMENT_DESCRIPTION,
- INCIDENT_RESOURCE_INCIDENT_REVIEW_DOCUMENT: INCIDENT_REVIEW_DOCUMENT_DESCRIPTION,
- INCIDENT_RESOURCE_INVESTIGATION_DOCUMENT: INCIDENT_INVESTIGATION_DOCUMENT_DESCRIPTION,
- INCIDENT_RESOURCE_INVESTIGATION_SHEET: INCIDENT_INVESTIGATION_SHEET_DESCRIPTION,
- INCIDENT_RESOURCE_CONVERSATION_COMMANDS_REFERENCE_DOCUMENT: INCIDENT_CONVERSATION_COMMANDS_REFERENCE_DOCUMENT_DESCRIPTION,
-}
-
-INCIDENT_PARTICIPANT_WELCOME_DESCRIPTION = """
-You\'re being contacted because we think you may
-be able to help us during this information security incident.
-Please review the content below and join us in the
-incident Slack channel.""".replace(
- "\n", " "
-).strip()
-
-INCIDENT_WELCOME_CONVERSATION_COPY = """
-This is the incident conversation. Please pull in any
-individuals you feel may be able to help resolve this incident.""".replace(
- "\n", " "
-).strip()
-
-INCIDENT_NOTIFICATION_PURPOSES_FYI = """
-This message is for notification purposes only.""".replace(
- "\n", " "
-).strip()
-
-INCIDENT_GET_INVOLVED_BUTTON_DESCRIPTION = """
-Click the button to be added to the incident conversation.""".replace(
- "\n", " "
-).strip()
-
-INCIDENT_CAN_REPORT_REMINDER = """
-It's time to send a new CAN report. Go to the Demisto UI and run the
-CAN Report playbook from the Playground Work Plan.""".replace(
- "\n", " "
-).strip()
-
-INCIDENT_VULNERABILITY_DESCRIPTION = """
-We are tracking the details of the vulnerability that led to this incident
-in the VUL Jira issue linked above.""".replace(
- "\n", " "
-).strip()
-
-INCIDENT_STABLE_DESCRIPTION = """
-The risk has been contained and the incident marked as stable.""".replace(
- "\n", " "
-).strip()
-
-INCIDENT_CLOSED_DESCRIPTION = """
-The incident has been resolved and marked as closed.""".replace(
- "\n", " "
-).strip()
-
-INCIDENT_STATUS_REPORT_DESCRIPTION = """
-The following conditions, actions, and needs summarize the current status of the incident.""".replace(
- "\n", " "
-).strip()
-
-INCIDENT_NEW_ROLE_DESCRIPTION = """
-{{assigner_fullname}} has assigned the role of {{assignee_role}} to {{assignee_fullname}}.
-Please, contact {{assignee_firstname}} about any questions or concerns.""".replace(
- "\n", " "
-).strip()
-
-INCIDENT_STATUS_REPORT_REMINDER_DESCRIPTION = """You have not provided a status report for this incident recently.
-Consider providing one to inform participants of the current conditions, actions, and needs.
-You can use `{{command}}` in the conversation to assist you in writing one.""".replace(
- "\n", " "
-).strip()
-
-INCIDENT_TASK_NEW_DESCRIPTION = """
-The following incident task has been created in the incident document.\n\n*Description:* {{task_description}}\n\n*Assignees:* {{task_assignees}}"""
-
-INCIDENT_TASK_RESOLVED_DESCRIPTION = """
-The following incident task has been resolved in the incident document.\n\n*Description:* {{task_description}}\n\n*Assignees:* {{task_assignees}}"""
-
-INCIDENT_TYPE_CHANGE_DESCRIPTION = """
-The incident type has been changed from *{{ incident_type_old }}* to *{{ incident_type_new }}*."""
-
-INCIDENT_STATUS_CHANGE_DESCRIPTION = """
-The incident status has been changed from *{{ incident_status_old }}* to *{{ incident_status_new }}*."""
-
-INCIDENT_PRIORITY_CHANGE_DESCRIPTION = """
-The incident priority has been changed from *{{ incident_priority_old }}* to *{{ incident_priority_new }}*."""
-
-INCIDENT_NAME = {
- "title": "{{name}} Incident Notification",
- "title_link": "{{ticket_weblink}}",
- "text": INCIDENT_NOTIFICATION_PURPOSES_FYI,
- "button_text": "Get Involved",
- "button_value": "{{incident_id}}",
- "button_action": ConversationButtonActions.invite_user,
-}
-
-INCIDENT_TITLE = {"title": "Incident Title", "text": "{{title}}"}
-
-INCIDENT_DESCRIPTION = {"title": "Incident Description", "text": "{{description}}"}
-
-INCIDENT_STATUS = {
- "title": "Incident Status - {{status}}",
- "status_mapping": INCIDENT_STATUS_DESCRIPTIONS,
-}
-
-INCIDENT_PRIORITY = {
- "title": "Incident Priority - {{priority}}",
- "priority_mapping": INCIDENT_PRIORITY_DESCRIPTIONS,
-}
-
-INCIDENT_PRIORITY_FYI = {
- "title": "Incident Priority - {{priority}}",
- "priority_mapping": INCIDENT_PRIORITY_DESCRIPTIONS_FYI,
-}
-
-INCIDENT_COMMANDER = {
- "title": "Incident Commander - {{commander_fullname}}",
- "title_link": "{{commander_weblink}}",
- "text": INCIDENT_COMMANDER_DESCRIPTION,
-}
-
-INCIDENT_CONFERENCE = {
- "title": "Incident Conference",
- "title_link": "{{conference_weblink}}",
- "text": INCIDENT_CONFERENCE_DESCRIPTION,
-}
-
-INCIDENT_STORAGE = {
- "title": "Incident Storage",
- "title_link": "{{storage_weblink}}",
- "text": INCIDENT_STORAGE_DESCRIPTION,
-}
-
-INCIDENT_CONVERSATION_COMMANDS_REFERENCE_DOCUMENT = {
- "title": "Incident Conversation Commands Reference Document",
- "title_link": "{{conversation_commands_reference_document_weblink}}",
- "text": INCIDENT_CONVERSATION_COMMANDS_REFERENCE_DOCUMENT_DESCRIPTION,
-}
-
-INCIDENT_INVESTIGATION_DOCUMENT = {
- "title": "Incident Investigation Document",
- "title_link": "{{document_weblink}}",
- "text": INCIDENT_INVESTIGATION_DOCUMENT_DESCRIPTION,
-}
-
-INCIDENT_INVESTIGATION_SHEET = {
- "title": "Incident Investigation Sheet",
- "title_link": "{{sheet_weblink}}",
- "text": INCIDENT_INVESTIGATION_SHEET_DESCRIPTION,
-}
-
-INCIDENT_FAQ_DOCUMENT = {
- "title": "Incident FAQ Document",
- "title_link": "{{faq_weblink}}",
- "text": INCIDENT_FAQ_DOCUMENT_DESCRIPTION,
-}
-
-INCIDENT_TYPE_CHANGE = {"title": "Incident Type Change", "text": INCIDENT_TYPE_CHANGE_DESCRIPTION}
-
-INCIDENT_STATUS_CHANGE = {
- "title": "Incident Status Change",
- "text": INCIDENT_STATUS_CHANGE_DESCRIPTION,
-}
-
-INCIDENT_PRIORITY_CHANGE = {
- "title": "Incident Priority Change",
- "text": INCIDENT_PRIORITY_CHANGE_DESCRIPTION,
-}
-
-INCIDENT_PARTICIPANT_WELCOME = {
- "title": "Welcome to {{name}}",
- "title_link": "{{ticket_weblink}}",
- "text": INCIDENT_PARTICIPANT_WELCOME_DESCRIPTION,
-}
-
-INCIDENT_GET_INVOLVED_BUTTON = {
- "title": "Get Involved",
- "text": INCIDENT_GET_INVOLVED_BUTTON_DESCRIPTION,
- "button_text": "Get Involved",
- "button_value": "{{incident_id}}",
- "button_action": ConversationButtonActions.invite_user,
-}
-
-INCIDENT_PARTICIPANT_WELCOME_MESSAGE = [
- INCIDENT_PARTICIPANT_WELCOME,
- INCIDENT_TITLE,
- INCIDENT_STATUS,
- INCIDENT_PRIORITY,
- INCIDENT_COMMANDER,
- INCIDENT_INVESTIGATION_DOCUMENT,
- INCIDENT_STORAGE,
- INCIDENT_CONFERENCE,
- INCIDENT_CONVERSATION_COMMANDS_REFERENCE_DOCUMENT,
- INCIDENT_FAQ_DOCUMENT,
-]
-
-INCIDENT_RESOURCES_MESSAGE = [
- INCIDENT_COMMANDER,
- INCIDENT_INVESTIGATION_DOCUMENT,
- INCIDENT_STORAGE,
- INCIDENT_CONFERENCE,
- INCIDENT_CONVERSATION_COMMANDS_REFERENCE_DOCUMENT,
- INCIDENT_FAQ_DOCUMENT,
-]
-
-INCIDENT_NOTIFICATION_COMMON = [INCIDENT_NAME, INCIDENT_TITLE]
-
-INCIDENT_NOTIFICATION = INCIDENT_NOTIFICATION_COMMON.copy()
-INCIDENT_NOTIFICATION.extend([INCIDENT_STATUS, INCIDENT_PRIORITY_FYI, INCIDENT_COMMANDER])
-
-INCIDENT_STATUS_REPORT = [
- {"title": "Incident Status Report", "text": INCIDENT_STATUS_REPORT_DESCRIPTION},
- {"title": "Conditions", "text": "{{conditions}}"},
- {"title": "Actions", "text": "{{actions}}"},
- {"title": "Needs", "text": "{{needs}}"},
-]
-
-INCIDENT_STATUS_REPORT_REMINDER = [
- {
- "title": "{{name}} Incident - Status Report Reminder",
- "title_link": "{{ticket_weblink}}",
- "text": INCIDENT_STATUS_REPORT_REMINDER_DESCRIPTION,
- },
- INCIDENT_TITLE,
-]
-
-INCIDENT_TASK_REMINDER = [
- {"title": "Incident - {{ name }}", "text": "{{ title }}"},
- {"title": "Creator", "text": "{{ creator }}"},
- {"title": "Description", "text": "{{ description }}"},
- {"title": "Priority", "text": "{{ priority }}"},
- {"title": "Created At", "text": "", "datetime": "{{ created_at}}"},
- {"title": "Resolve By", "text": "", "datetime": "{{ resolve_by }}"},
- {"title": "Link", "text": "{{ weblink }}"},
-]
-
-INCIDENT_REVIEW_DOCUMENT_NOTIFICATION = [
- {
- "title": "Incident Review Document",
- "title_link": "{{incident_review_document_weblink}}",
- "text": INCIDENT_REVIEW_DOCUMENT_DESCRIPTION,
- }
-]
-
-INCIDENT_NEW_ROLE_NOTIFICATION = [
- {
- "title": "New {{assignee_role}} - {{assignee_fullname}}",
- "title_link": "{{assignee_weblink}}",
- "text": INCIDENT_NEW_ROLE_DESCRIPTION,
- }
-]
-
-INCIDENT_TASK_NEW_NOTIFICATION = [
- {
- "title": "New Incident Task",
- "title_link": "{{task_weblink}}",
- "text": INCIDENT_TASK_NEW_DESCRIPTION,
- }
-]
-
-INCIDENT_TASK_RESOLVED_NOTIFICATION = [
- {
- "title": "Resolved Incident Task",
- "title_link": "{{task_weblink}}",
- "text": INCIDENT_TASK_RESOLVED_DESCRIPTION,
- }
-]
-
-INCIDENT_COMMANDER_READDED_NOTIFICATION = [
- {"title": "Incident Commander Re-Added", "text": INCIDENT_COMMANDER_READDED_DESCRIPTION}
-]
-
-
-def render_message_template(message_template: List[dict], **kwargs):
- """Renders the jinja data included in the template itself."""
- data = []
- new_copy = copy.deepcopy(message_template)
- for d in new_copy:
- if d.get("priority_mapping"):
- d["text"] = d["priority_mapping"][kwargs["priority"]]
-
- if d.get("status_mapping"):
- d["text"] = d["status_mapping"][kwargs["status"]]
-
- if d.get("datetime"):
- d["datetime"] = Template(d["datetime"]).render(**kwargs)
-
- d["text"] = Template(d["text"]).render(**kwargs)
- d["title"] = Template(d["title"]).render(**kwargs)
-
- if d.get("title_link"):
- d["title_link"] = Template(d["title_link"]).render(**kwargs)
-
- if d.get("button_text"):
- d["button_text"] = Template(d["button_text"]).render(**kwargs)
-
- if d.get("button_value"):
- d["button_value"] = Template(d["button_value"]).render(**kwargs)
-
- data.append(d)
- return data
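The `render_message_template` helper removed above (and re-added in the new `strings.py` below) walks a list of block dicts and runs each string field through Jinja. A trimmed, self-contained sketch of that pattern — the `WELCOME` template and its field values here are hypothetical, not from the repo:

```python
import copy

from jinja2 import Template

# Hypothetical single-block template, in the same dict shape used above.
WELCOME = [{"title": "Welcome to {{name}}", "text": "{{description}}"}]


def render_message_template(message_template, **kwargs):
    """Render the Jinja placeholders inside each block of a message template."""
    data = []
    # Deep-copy first so the module-level template constants are never mutated.
    for d in copy.deepcopy(message_template):
        d["title"] = Template(d["title"]).render(**kwargs)
        d["text"] = Template(d["text"]).render(**kwargs)
        data.append(d)
    return data


blocks = render_message_template(WELCOME, name="IR-2022-001", description="Laptop triage")
print(blocks[0]["title"])  # -> Welcome to IR-2022-001
```

The deep copy is the important detail: the templates are shared module-level constants, so rendering in place would leak one incident's values into the next notification.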
diff --git a/src/dispatch/messaging/__init__.py b/src/dispatch/messaging/__init__.py
new file mode 100644
index 000000000000..e69de29bb2d1
diff --git a/src/dispatch/messaging/email/filters.py b/src/dispatch/messaging/email/filters.py
new file mode 100644
index 000000000000..44d18c7ff0ee
--- /dev/null
+++ b/src/dispatch/messaging/email/filters.py
@@ -0,0 +1,31 @@
+import os
+from datetime import datetime
+
+import markdown
+from jinja2 import FileSystemLoader
+from jinja2.sandbox import ImmutableSandboxedEnvironment
+from markupsafe import Markup
+from dispatch import config
+
+here = os.path.dirname(os.path.realpath(__file__))
+
+
+autoescape = bool(config.DISPATCH_ESCAPE_HTML)
+env = ImmutableSandboxedEnvironment(loader=FileSystemLoader(here), autoescape=autoescape)
+
+
+def safe_format_datetime(value):
+ try:
+ return datetime.fromisoformat(value).strftime("%A, %B %d, %Y")
+ except (ValueError, TypeError):
+ return ""
+
+
+def safe_format_markdown(value):
+ if not isinstance(value, str):
+ return ""
+ return Markup(markdown.markdown(value, output_format="html5", extensions=["extra"]))
+
+
+env.filters["datetime"] = safe_format_datetime
+env.filters["markdown"] = safe_format_markdown
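The custom filters registered above can then be used with the usual `|` syntax inside the MJML templates. A minimal sketch of the `datetime` filter in action, using a throwaway sandboxed environment (the `ts` variable name is just an example):

```python
from datetime import datetime

from jinja2.sandbox import ImmutableSandboxedEnvironment


def safe_format_datetime(value):
    # Same behavior as the filter above: bad input yields "" instead of
    # failing the whole email render.
    try:
        return datetime.fromisoformat(value).strftime("%A, %B %d, %Y")
    except (ValueError, TypeError):
        return ""


env = ImmutableSandboxedEnvironment()
env.filters["datetime"] = safe_format_datetime

print(env.from_string("Due: {{ ts | datetime }}").render(ts="2022-10-19T10:35:01"))
# -> Due: Wednesday, October 19, 2022
```

Swallowing `ValueError`/`TypeError` is a deliberate trade-off: a malformed timestamp in one row should degrade to an empty cell, not abort the notification.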
diff --git a/src/dispatch/messaging/email/templates/base.mjml b/src/dispatch/messaging/email/templates/base.mjml
new file mode 100644
index 000000000000..adea9cf639c6
--- /dev/null
+++ b/src/dispatch/messaging/email/templates/base.mjml
@@ -0,0 +1,67 @@
+
+
+ Dispatch
+ Dispatch
+
+
+
+
+
+
+
+
+
+
+ .body-section {
+ -webkit-box-shadow: 1px 4px 11px 0px rgba(0, 0, 0, 0.15);
+ -moz-box-shadow: 1px 4px 11px 0px rgba(0, 0, 0, 0.15);
+ box-shadow: 1px 4px 11px 0px rgba(0, 0, 0, 0.15);
+ }
+
+
+ .text-link {
+ color: #5e6ebf
+ }
+
+
+ .footer-link {
+ color: #888888
+ }
+
+
+
+
+
+
+
+ D i s p a t c h
+
+
+
+
+ {% block description %}{% endblock %}
+ {% block items %}{% endblock %}
+
+
+
+
+
+
+
+
+ For any questions about this message, please reach out to {{contact_fullname}} .
+
+
+
+
+
+
+
+
+ You are receiving this email because you were involved with an incident managed by Dispatch.
+
+
+
+
+
+
diff --git a/src/dispatch/messaging/email/templates/executive_report.mjml b/src/dispatch/messaging/email/templates/executive_report.mjml
new file mode 100644
index 000000000000..05c368119dd1
--- /dev/null
+++ b/src/dispatch/messaging/email/templates/executive_report.mjml
@@ -0,0 +1,74 @@
+{% extends 'templates/base.mjml' %}
+{% block description %}
+
+
+
+ {{ name }} - Executive Report
+
+
+ You're receiving this executive report because you, or an e-mail distribution list you're on,
+ is a member of this incident's notifications Google group ({{ notifications_group }}).
+
+
+ Please be aware that this report is a point-in-time update and
+ the details below are likely to change as the incident evolves.
+ No direct action from you is currently required.
+
+
+
+{% endblock %}
+
+{% block items %}
+
+
+
+
+ Title
+
+
+ {{ title }}
+
+
+
+
+
+
+
+ Current Status
+
+
+ {{ current_status }}
+
+
+
+
+
+
+
+ Overview
+
+
+ {{ overview }}
+
+
+
+
+
+
+
+ Next Steps
+
+
+ {{ next_steps }}
+
+
+
+
+
+
+
+ A document version of the report can be found here. Please use the document for questions and feedback.
+
+
+
+{% endblock %}
diff --git a/src/dispatch/messaging/email/templates/notification.mjml b/src/dispatch/messaging/email/templates/notification.mjml
new file mode 100644
index 000000000000..87d5e24cfe10
--- /dev/null
+++ b/src/dispatch/messaging/email/templates/notification.mjml
@@ -0,0 +1,16 @@
+{% extends "templates/base.mjml" %}
+{% block items %}
+
+
+ {% for item in items %}
+
+ {{ item.title }}
+
+
+ {{ item.text }}
+
+
+ {% endfor %}
+
+
+{% endblock %}
diff --git a/src/dispatch/messaging/email/templates/notification_list.mjml b/src/dispatch/messaging/email/templates/notification_list.mjml
new file mode 100644
index 000000000000..bca2ce643955
--- /dev/null
+++ b/src/dispatch/messaging/email/templates/notification_list.mjml
@@ -0,0 +1,41 @@
+{% extends 'templates/base.mjml' %}
+{% block description %}
+
+
+
+ {{ description }}
+
+
+
+{% endblock %}
+
+{% block items %}
+{% for item in items %}
+
+
+
+ {% for row in item %}
+ {% if row.title_link %}
+
+ {{ row.title }}
+
+ {% elif row.datetime %}
+
+ {{ row.title }}
+
+
+ {{ row.datetime | datetime }}
+
+ {% else %}
+
+ {{ row.title }}
+
+
+ {{ row.text }}
+
+ {% endif %}
+ {% endfor %}
+
+
+{% endfor %}
+{% endblock %}
diff --git a/src/dispatch/messaging/email/templates/tactical_report.mjml b/src/dispatch/messaging/email/templates/tactical_report.mjml
new file mode 100644
index 000000000000..23a65436c517
--- /dev/null
+++ b/src/dispatch/messaging/email/templates/tactical_report.mjml
@@ -0,0 +1,65 @@
+{% extends 'templates/base.mjml' %}
+{% block description %}
+
+
+
+ {{ name }} - Tactical Report
+
+
+ This message is for notification purposes only. No direct action from you
+ is currently required.
+
+
+ Please be aware that this is a point-in-time update and the details below will
+ likely change.
+
+
+
+{% endblock %}
+
+{% block items %}
+
+
+
+
+ Title
+
+
+ {{ title }}
+
+
+
+
+
+
+
+ Conditions
+
+
+ {{ conditions }}
+
+
+
+
+
+
+
+ Actions
+
+
+ {{ actions }}
+
+
+
+
+
+
+
+ Needs
+
+
+ {{ needs }}
+
+
+
+{% endblock %}
diff --git a/src/dispatch/messaging/email/utils.py b/src/dispatch/messaging/email/utils.py
new file mode 100644
index 000000000000..81f2a96dbdfe
--- /dev/null
+++ b/src/dispatch/messaging/email/utils.py
@@ -0,0 +1,122 @@
+import logging
+import os
+import subprocess
+import tempfile
+
+import jinja2.exceptions
+from dispatch.config import MJML_PATH
+
+
+from dispatch.messaging.strings import (
+ EVERGREEN_REMINDER_DESCRIPTION,
+ INCIDENT_DAILY_REPORT_DESCRIPTION,
+ INCIDENT_FEEDBACK_DAILY_REPORT_DESCRIPTION,
+ INCIDENT_TASK_REMINDER_DESCRIPTION,
+ CASE_FEEDBACK_DAILY_REPORT_DESCRIPTION,
+ MessageType,
+ render_message_template,
+)
+
+from .filters import env
+
+log = logging.getLogger(__name__)
+
+
+def get_template(message_type: MessageType, project_id: int):
+ """Fetches the correct template based on the message type."""
+ template_map = {
+ MessageType.incident_completed_form_notification: ("notification.mjml", None),
+ MessageType.incident_executive_report: ("executive_report.mjml", None),
+ MessageType.incident_notification: ("notification.mjml", None),
+ MessageType.case_notification: ("notification.mjml", None),
+ MessageType.incident_participant_welcome: ("notification.mjml", None),
+ MessageType.incident_tactical_report: ("tactical_report.mjml", None),
+ MessageType.case_participant_welcome: ("notification.mjml", None),
+ MessageType.incident_task_reminder: (
+ "notification_list.mjml",
+ INCIDENT_TASK_REMINDER_DESCRIPTION,
+ ),
+ MessageType.evergreen_reminder: (
+ "notification_list.mjml",
+ EVERGREEN_REMINDER_DESCRIPTION,
+ ),
+ MessageType.incident_feedback_daily_report: (
+ "notification_list.mjml",
+ INCIDENT_FEEDBACK_DAILY_REPORT_DESCRIPTION,
+ ),
+ MessageType.case_feedback_daily_report: (
+ "notification_list.mjml",
+ CASE_FEEDBACK_DAILY_REPORT_DESCRIPTION,
+ ),
+ MessageType.incident_daily_report: (
+ "notification_list.mjml",
+ INCIDENT_DAILY_REPORT_DESCRIPTION,
+ ),
+ }
+
+ template_key, description = template_map.get(message_type, (None, None))
+
+ if not template_key:
+ raise Exception(f"Unable to determine template. MessageType: {message_type}")
+
+ try:
+ template_path = os.path.join("templates", "project_id", f"{project_id}", template_key)
+ template = env.get_template(template_path)
+ except jinja2.exceptions.TemplateNotFound:
+ template_path = os.path.join("templates", template_key)
+ template = env.get_template(template_path)
+ log.debug("Resolved template path: %s", template_path)
+
+ return template, description
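The `try`/`except TemplateNotFound` in `get_template` implements a per-project override: look for a template under `templates/project_id/<id>/` first, then fall back to the shared default. The same lookup can be sketched with an in-memory `DictLoader` standing in for the `FileSystemLoader` (the template names and project ids here are made up):

```python
from jinja2 import DictLoader, Environment, TemplateNotFound

# Hypothetical in-memory templates: one shared default, one project override.
env = Environment(loader=DictLoader({
    "templates/notification.mjml": "default notification",
    "templates/project_id/7/notification.mjml": "project 7 notification",
}))


def resolve(template_key: str, project_id: int):
    try:
        # Prefer a project-specific override...
        return env.get_template(f"templates/project_id/{project_id}/{template_key}")
    except TemplateNotFound:
        # ...and fall back to the shared default.
        return env.get_template(f"templates/{template_key}")


print(resolve("notification.mjml", 7).render())   # -> project 7 notification
print(resolve("notification.mjml", 42).render())  # -> default notification
```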
+
+
+def create_multi_message_body(
+ message_template: dict, message_type: MessageType, items: list, project_id: int, **kwargs
+):
+ """Creates a multi-message body based on the message type."""
+ template, description = get_template(message_type, project_id)
+
+ master_map = []
+ for item in items:
+ master_map.append(render_message_template(message_template, **item))
+
+ kwargs.update({"items": master_map, "description": description})
+ return render_html(template.render(**kwargs))
+
+
+def create_message_body(
+ message_template: dict, message_type: MessageType, project_id: int, **kwargs
+):
+ """Creates the correct message body based on message type."""
+ template, description = get_template(message_type, project_id)
+
+ items_grouped_rendered = []
+ if kwargs.get("items_grouped"):
+ items_grouped_template = kwargs["items_grouped_template"]
+ for item in kwargs["items_grouped"]:
+ item_rendered = render_message_template(items_grouped_template, **item)
+ items_grouped_rendered.append(item_rendered)
+
+ kwargs.update({"items": items_grouped_rendered, "description": description})
+ return render_html(template.render(**kwargs))
+
+ items_rendered = render_message_template(message_template, **kwargs)
+ kwargs.update({"items": items_rendered, "description": description})
+ return render_html(template.render(**kwargs))
+
+
+def render_html(template):
+ """Uses the mjml cli to create html."""
+
+ with tempfile.NamedTemporaryFile("w+") as fp:
+ fp.write(template)
+ fp.flush()
+ process = subprocess.run(
+ ["./mjml", fp.name, "-s"],
+ cwd=MJML_PATH,
+ capture_output=True,
+ )
+ if process.stderr:
+ log.error(process.stderr.decode("utf-8"))
+ raise Exception("MJML template processing failed.")
+ return process.stdout.decode("utf-8")
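`render_html` follows a common pattern: write the rendered template to a temp file, shell out to a CLI, and capture stdout. A generalized sketch of that pattern — `cat` is used as a stand-in binary here purely so the example runs without mjml installed; real use would pass `["mjml", path, "-s"]` as in the code above:

```python
import subprocess
import tempfile


def render_via_cli(template, argv=None):
    """Write the template to a temp file, run a CLI over it, return its stdout."""
    with tempfile.NamedTemporaryFile("w+", suffix=".mjml") as fp:
        fp.write(template)
        fp.flush()  # make the content visible to the child process
        # `cat` simply echoes the file back; swap in ["mjml", ..., "-s"] for real use.
        cmd = (argv or ["cat"]) + [fp.name]
        proc = subprocess.run(cmd, capture_output=True, text=True)
        if proc.returncode != 0:
            raise RuntimeError(proc.stderr)
        return proc.stdout


print(render_via_cli("<mjml><mj-body></mj-body></mjml>"))
```

Note that the actual implementation treats any stderr output as fatal, which also fails the render on mere CLI warnings; checking `returncode` instead, as sketched here, is the more conventional choice.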
diff --git a/src/dispatch/messaging/strings.py b/src/dispatch/messaging/strings.py
new file mode 100644
index 000000000000..db6fe1fc8715
--- /dev/null
+++ b/src/dispatch/messaging/strings.py
@@ -0,0 +1,1325 @@
+import copy
+
+
+from dispatch.messaging.email.filters import env
+from dispatch.conversation.enums import ConversationButtonActions
+from dispatch.incident.enums import IncidentStatus
+from dispatch.case.enums import CaseStatus
+from dispatch.enums import Visibility
+from dispatch.email_templates.models import EmailTemplates
+
+from dispatch import config
+from dispatch.enums import DispatchEnum, DocumentResourceTypes, DocumentResourceReferenceTypes
+
+"""Dict of reminder strings and values. Note: values are in hours."""
+reminder_select_values = {
+ "thirty": {"message": "30 minutes", "value": 0.5},
+ "one_hour": {"message": "1 hour", "value": 1},
+ "two_hours": {"message": "2 hours", "value": 2},
+}
+
+
+class MessageType(DispatchEnum):
+ entity_update = "entity-update"
+ evergreen_reminder = "evergreen-reminder"
+ incident_closed_information_review_reminder = "incident-closed-information-review-reminder"
+ incident_completed_form_notification = "incident-completed-form-notification"
+ incident_daily_report = "incident-daily-report"
+ incident_weekly_report = "incident-weekly-report"
+ incident_executive_report = "incident-executive-report"
+ incident_feedback_daily_report = "incident-feedback-daily-report"
+ incident_management_help_tips = "incident-management-help-tips"
+ incident_notification = "incident-notification"
+ incident_open_tasks = "incident-open-tasks"
+ incident_participant_suggested_reading = "incident-participant-suggested-reading"
+ incident_participant_welcome = "incident-participant-welcome"
+ incident_rating_feedback = "incident-rating-feedback"
+ incident_status_reminder = "incident-status-reminder"
+ incident_tactical_report = "incident-tactical-report"
+ incident_task_list = "incident-task-list"
+ incident_task_reminder = "incident-task-reminder"
+ case_notification = "case-notification"
+ case_status_reminder = "case-status-reminder"
+ service_feedback = "service-feedback"
+ task_add_to_incident = "task-add-to-incident"
+ case_rating_feedback = "case-rating-feedback"
+ case_feedback_daily_report = "case-feedback-daily-report"
+ case_participant_welcome = "case-participant-welcome"
+
+
+INCIDENT_STATUS_DESCRIPTIONS = {
+ IncidentStatus.active: "This incident is under active investigation.",
+ IncidentStatus.stable: "This incident is stable, the bulk of the investigation has been completed or most of the risk has been mitigated.",
+    IncidentStatus.closed: "This incident no longer requires additional involvement; long-term incident action items have been assigned to their respective owners.",
+}
+
+CASE_STATUS_DESCRIPTIONS = {
+ CaseStatus.new: "This case is new and needs triaging.",
+ CaseStatus.triage: "This case is being triaged.",
+ CaseStatus.escalated: "This case has been escalated.",
+ CaseStatus.stable: (
+ "This case is stable, the bulk of the investigation has been completed "
+ "or most of the risk has been mitigated."
+ ),
+ CaseStatus.closed: "This case has been closed.",
+}
+
+INCIDENT_VISIBILITY_DESCRIPTIONS = {
+ Visibility.open: "We ask that you use your best judgment while sharing details about this incident outside of the dedicated channels of communication. Please reach out to the Incident Commander if you have any questions.",
+ Visibility.restricted: "This incident is restricted to immediate participants of this incident. We ask that you exercise extra caution and discretion while talking about this incident outside of the dedicated channels of communication. Only invite new participants that are strictly necessary. Please reach out to the Incident Commander if you have any questions.",
+}
+
+CASE_VISIBILITY_DESCRIPTIONS = {
+ Visibility.open: "We ask that you use your best judgment while sharing details about this case outside of the dedicated channels of communication. Please reach out to the case assignee if you have any questions.",
+ Visibility.restricted: "This case is restricted to immediate participants of this case. We ask that you exercise extra caution and discretion while talking about this case outside of the dedicated channels of communication. Only invite new participants that are strictly necessary. Please reach out to the case assignee if you have any questions.",
+}
+
+EVERGREEN_REMINDER_DESCRIPTION = """
+You are the owner of the following resources in Dispatch.
+This is a reminder that these resources should be kept up to date in order to effectively
+respond to incidents. Please review and update them, or mark them as deprecated.""".replace(
+ "\n", " "
+).strip()
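The `""" ... """.replace("\n", " ").strip()` idiom used throughout this module keeps each description readable in source while collapsing it to a single line at import time. A tiny illustration (the string content is invented):

```python
# Triple-quoted for readability in source; one line by the time it is used.
EXAMPLE = """
First line of the notice.
Second line of the notice.""".replace("\n", " ").strip()

print(EXAMPLE)  # -> First line of the notice. Second line of the notice.
```

The leading `strip()` matters because the literal starts with a newline, which the `replace` turns into a leading space.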
+
+INCIDENT_FEEDBACK_DAILY_REPORT_DESCRIPTION = """
+This is a daily report of feedback about incidents handled by you.""".replace(
+ "\n", " "
+).strip()
+
+CASE_FEEDBACK_DAILY_REPORT_DESCRIPTION = """
+This is a daily report of feedback about cases handled by you.""".replace(
+ "\n", " "
+).strip()
+
+INCIDENT_WEEKLY_REPORT_TITLE = """
+Incidents Weekly Report""".replace(
+ "\n", " "
+).strip()
+
+INCIDENT_WEEKLY_REPORT_DESCRIPTION = """
+This is an AI-generated weekly summary of incidents that have been marked as closed in the last week.
+NOTE: These summaries may contain errors or inaccuracies.
+Please verify the information before relying on it.""".replace(
+ "\n", " "
+).strip()
+
+INCIDENT_WEEKLY_REPORT_NO_INCIDENTS_DESCRIPTION = """
+No open visibility incidents have been closed in the last week.""".replace(
+ "\n", " "
+).strip()
+
+INCIDENT_DAILY_REPORT_TITLE = """
+Incidents Daily Report""".replace(
+ "\n", " "
+).strip()
+
+INCIDENT_DAILY_REPORT_DESCRIPTION = """
+This is a daily report of incidents that are currently active and incidents that have been marked as stable or closed in the last 24 hours.""".replace(
+ "\n", " "
+).strip()
+
+INCIDENT_DAILY_REPORT_FOOTER_CONTEXT = """
+For questions about an incident, please reach out to the incident's commander.""".replace(
+ "\n", " "
+).strip()
+
+
+INCIDENT_REPORTER_DESCRIPTION = """
+The person who reported the incident. Contact them if the report details need clarification.""".replace(
+ "\n", " "
+).strip()
+
+INCIDENT_COMMANDER_DESCRIPTION = """
+The Incident Commander (IC) is responsible for
+knowing the full context of the incident.
+Contact them about any questions or concerns.""".replace(
+ "\n", " "
+).strip()
+
+INCIDENT_COMMANDER_READDED_DESCRIPTION = """
+{{ commander_fullname }} (Incident Commander) has been re-added to the conversation.
+Please hand off the Incident Commander role before leaving the conversation.""".replace(
+ "\n", " "
+).strip()
+
+TICKET_DESCRIPTION = """
+Ticket for tracking purposes. It contains information and links to resources.""".replace(
+ "\n", " "
+).strip()
+
+TACTICAL_GROUP_DESCRIPTION = """
+Group for managing member access to storage. All participants get added to it.""".replace(
+ "\n", " "
+).strip()
+
+NOTIFICATIONS_GROUP_DESCRIPTION = """
+Group for email notification purposes. All participants get added to it.""".replace(
+ "\n", " "
+).strip()
+
+INCIDENT_CONVERSATION_DESCRIPTION = """
+Private conversation for real-time discussion. All incident participants get added to it.
+""".replace(
+ "\n", " "
+).strip()
+
+CASE_CONVERSATION_REFERENCE_DOCUMENT_DESCRIPTION = """
+Document containing the list of slash commands available to the Assignee
+and participants in the case conversation.""".replace(
+ "\n", " "
+).strip()
+
+INCIDENT_CONVERSATION_REFERENCE_DOCUMENT_DESCRIPTION = """
+Document containing the list of slash commands available to the Incident Commander (IC)
+and participants in the incident conversation.""".replace(
+ "\n", " "
+).strip()
+
+CASE_CONFERENCE_DESCRIPTION = """
+Video conference and phone bridge to be used throughout the case. Password: {{conference_challenge if conference_challenge else 'N/A'}}
+""".replace(
+ "\n", ""
+).strip()
+
+INCIDENT_CONFERENCE_DESCRIPTION = """
+Video conference and phone bridge to be used throughout the incident. Password: {{conference_challenge if conference_challenge else 'N/A'}}
+""".replace(
+ "\n", ""
+).strip()
+
+STORAGE_DESCRIPTION = """
+Common storage for all artifacts and
+documents. Add logs, screen captures, or any other data collected during the
+investigation to this folder. It is shared with all participants.""".replace(
+ "\n", " "
+).strip()
+
+INCIDENT_INVESTIGATION_DOCUMENT_DESCRIPTION = """
+This is a document for all incident facts and context. All
+incident participants are expected to contribute to this document.
+It is shared with all incident participants.""".replace(
+ "\n", " "
+).strip()
+
+CASE_INVESTIGATION_DOCUMENT_DESCRIPTION = """
+This is a document for all investigation facts and context. All
+case participants are expected to contribute to this document.
+It is shared with all participants.""".replace(
+ "\n", " "
+).strip()
+
+INCIDENT_INVESTIGATION_SHEET_DESCRIPTION = """
+This is a sheet for tracking impacted assets. All
+incident participants are expected to contribute to this sheet.
+It is shared with all incident participants.""".replace(
+ "\n", " "
+).strip()
+
+CASE_FAQ_DOCUMENT_DESCRIPTION = """
+First time responding to a case? This
+document answers common questions encountered when
+helping us respond to a case.""".replace(
+ "\n", " "
+).strip()
+
+INCIDENT_FAQ_DOCUMENT_DESCRIPTION = """
+First time responding to an incident? This
+document answers common questions encountered when
+helping us respond to an incident.""".replace(
+ "\n", " "
+).strip()
+
+INCIDENT_REVIEW_DOCUMENT_DESCRIPTION = """
+This document will capture all lessons learned, questions, and action items raised during the incident.""".replace(
+ "\n", " "
+).strip()
+
+INCIDENT_EXECUTIVE_REPORT_DOCUMENT_DESCRIPTION = """
+This is a document that contains an executive report about the incident.""".replace(
+ "\n", " "
+).strip()
+
+DOCUMENT_DESCRIPTIONS = {
+ DocumentResourceReferenceTypes.conversation: INCIDENT_CONVERSATION_REFERENCE_DOCUMENT_DESCRIPTION,
+ DocumentResourceReferenceTypes.faq: INCIDENT_FAQ_DOCUMENT_DESCRIPTION,
+ DocumentResourceTypes.case: CASE_INVESTIGATION_DOCUMENT_DESCRIPTION,
+ DocumentResourceTypes.executive: INCIDENT_EXECUTIVE_REPORT_DOCUMENT_DESCRIPTION,
+ DocumentResourceTypes.incident: INCIDENT_INVESTIGATION_DOCUMENT_DESCRIPTION,
+ DocumentResourceTypes.review: INCIDENT_REVIEW_DOCUMENT_DESCRIPTION,
+ DocumentResourceTypes.tracking: INCIDENT_INVESTIGATION_SHEET_DESCRIPTION,
+}
+
+INCIDENT_RESOLUTION_DEFAULT = """
+Description of the actions taken to resolve the incident.
+""".replace(
+ "\n", " "
+).strip()
+
+CASE_RESOLUTION_DEFAULT = """
+Description of the actions taken to resolve the case.
+""".replace(
+ "\n", " "
+).strip()
+
+INCIDENT_COMPLETED_FORM_DESCRIPTION = """
+A new {{form_type}} form related to incident {{name}} has been
+submitted that requires your immediate attention. This form details
+aspects related to potential legal implications. You can review the
+detailed report by clicking on the link below. Please note, the information
+contained in this report is confidential.
+""".replace(
+ "\n", " "
+).strip()
+
+CASE_PARTICIPANT_WELCOME_DESCRIPTION = """
+You\'ve been added to this case because we think you may
+be able to help resolve it. Please review the case details below and
+reach out to the assignee if you have any questions.""".replace(
+ "\n", " "
+).strip()
+
+INCIDENT_PARTICIPANT_WELCOME_DESCRIPTION = """
+You\'ve been added to this incident because we think you may
+be able to help resolve it. Please review the incident details below and
+reach out to the incident commander if you have any questions.""".replace(
+ "\n", " "
+).strip()
+
+INCIDENT_PARTICIPANT_SUGGESTED_READING_DESCRIPTION = """
+Dispatch thinks the following documents might be
+relevant to this incident.""".replace(
+ "\n", " "
+).strip()
+
+NOTIFICATION_PURPOSES_FYI = """
+This message is for notification purposes only.""".replace(
+ "\n", " "
+).strip()
+
+INCIDENT_TACTICAL_REPORT_DESCRIPTION = """
+The following conditions, actions, and needs summarize the current status of the incident.""".replace(
+ "\n", " "
+).strip()
+
+INCIDENT_NEW_ROLE_DESCRIPTION = """
+{{assigner_fullname if assigner_fullname else assigner_email}} has assigned the role of {{assignee_role}} to {{assignee_fullname if assignee_fullname else assignee_email}}.
+Please contact {{assignee_fullname if assignee_fullname else assignee_email}} with any questions or concerns.""".replace(
+ "\n", " "
+).strip()
+
+INCIDENT_REPORT_REMINDER_DESCRIPTION = """You have not provided a {{report_type}} for this incident recently.
+You can use `{{command}}` in the conversation to assist you in writing one.""".replace(
+ "\n", " "
+).strip()
+
+INCIDENT_REPORT_REMINDER_DELAYED_DESCRIPTION = """You asked me to send you this reminder to write a {{report_type}} for this incident.
+You can use `{{command}}` in the conversation to assist you in writing one.""".replace(
+ "\n", " "
+).strip()
+
+INCIDENT_CLOSE_REMINDER_DESCRIPTION = """The status of this incident hasn't been updated recently.
+You can use `{{command}}` in the <{{conversation_weblink}}|conversation> to close the incident if it has been resolved and can be closed.""".replace(
+ "\n", " "
+).strip()
+
+CASE_TRIAGE_REMINDER_DESCRIPTION = """The status of this case hasn't been updated recently.
+Please ensure you triage the case based on its priority and update its title, priority, severity, and tags.""".replace(
+ "\n", " "
+).strip()
+
+CASE_CLOSE_REMINDER_DESCRIPTION = """The status of this case hasn't been updated recently.
+You can use the case 'Resolve' button if it has been resolved and can be closed.""".replace(
+ "\n", " "
+).strip()
+
+INCIDENT_TASK_NEW_DESCRIPTION = """
+The following incident task has been created and assigned to you by {{task_creator}}: {{task_description}}"""
+
+INCIDENT_TASK_RESOLVED_DESCRIPTION = """
+The following incident task has been resolved: {{task_description}}"""
+
+INCIDENT_TASK_REMINDER_DESCRIPTION = """
+The following incident tasks are assigned to you.
+This is a reminder that these tasks have passed their due date.
+Please review and mark them as resolved if appropriate. Resolving them will stop the reminders.""".replace(
+ "\n", " "
+).strip()
+
+INCIDENT_TASK_LIST_DESCRIPTION = """The following are open incident tasks."""
+
+INCIDENT_OPEN_TASKS_DESCRIPTION = """
+Please resolve or transfer ownership of all the open incident tasks assigned to you in the incident documents or using the <{{dispatch_ui_url}}|Dispatch Web UI>,
+then wait about 30 seconds for Dispatch to update the tasks before leaving the incident conversation.
+""".replace(
+ "\n", " "
+).strip()
+
+INCIDENT_TASK_ADD_TO_INCIDENT_DESCRIPTION = """
+You have been added to this incident because you were assigned a task related to it. View all tasks for this incident using the <{{dispatch_ui_url}}|Dispatch Web UI>
+\n\n *Task Description:* {{task_description}}
+\n\n *Link to task in document:* {{task_weblink}}
+"""
+
+INCIDENT_MONITOR_CREATED_DESCRIPTION = """
+A new monitor instance has been created.
+\n\n *Weblink:* {{weblink}}
+"""
+
+INCIDENT_MONITOR_UPDATE_DESCRIPTION = """
+This monitor detected a change in state. State has changed from *{{ monitor_state_old }}* to *{{ monitor_state_new }}*.
+"""
+
+INCIDENT_MONITOR_IGNORED_DESCRIPTION = """
+This monitor is now ignored. Dispatch won't remind this incident channel about it again.
+\n\n *Weblink:* {{weblink}}
+"""
+
+INCIDENT_WORKFLOW_CREATED_DESCRIPTION = """
+A new workflow instance has been created.
+\n\n *Creator:* {{instance_creator_name}}
+"""
+
+INCIDENT_WORKFLOW_UPDATE_DESCRIPTION = """
+This workflow's status has changed from *{{ instance_status_old }}* to *{{ instance_status_new }}*.
+\n\n*Workflow Description*: {{workflow_description}}
+\n\n *Creator:* {{instance_creator_name}}
+"""
+
+INCIDENT_WORKFLOW_COMPLETE_DESCRIPTION = """
+This workflow's status has changed from *{{ instance_status_old }}* to *{{ instance_status_new }}*.
+\n\n *Workflow Description:* {{workflow_description}}
+\n\n *Creator:* {{instance_creator_name}}
+{% if instance_artifacts %}
+\n\n *Workflow Artifacts:*
+\n\n {% for a in instance_artifacts %}- <{{a.weblink}}|{{a.name}}> \n\n{% endfor %}
+{% endif %}
+"""
+
+INCIDENT_CLOSED_INFORMATION_REVIEW_REMINDER_DESCRIPTION = """
+Thanks for closing incident {{name}}. Please take a minute to review and update the following incident information in the <{{dispatch_ui_incident_url}}|Dispatch Web UI>:
+\n • Incident Title: {{title}}
+\n • Incident Description: {{description}}
+\n • Incident Resolution: {{resolution}}
+\n • Incident Type: {{type}}
+\n • Incident Severity: {{severity}}
+\n • Incident Priority: {{priority}}
+\n Also, please consider taking the following actions:
+\n • Update or add any relevant tags to the incident using the <{{dispatch_ui_incident_url}}|Dispatch Web UI>.
+\n • Add any relevant, non-operational costs to the incident using the <{{dispatch_ui_incident_url}}|Dispatch Web UI>.
+\n • Review and close any incident tasks that are no longer relevant or required.
+"""
+
+INCIDENT_CLOSED_RATING_FEEDBACK_DESCRIPTION = """
+Thanks for participating in the {{name}} ("{{title}}") incident. We would appreciate it if you could rate your experience and provide feedback."""
+
+CASE_CLOSED_RATING_FEEDBACK_DESCRIPTION = """
+Thanks for participating in the {{name}} ("{{title}}") case. We would appreciate it if you could rate your experience and provide feedback."""
+
+INCIDENT_MANAGEMENT_HELP_TIPS_MESSAGE_DESCRIPTION = """
+Hey, I see you're the Incident Commander for <{{conversation_weblink}}|{{name}}> ("{{title}}"). Here are a few things to consider when managing the incident:
+\n • Keep the incident and its status up to date using the Slack `{{update_command}}` command.
+\n • Invite incident participants and team oncalls by mentioning them in the incident channel or using the Slack `{{engage_oncall_command}}` command.
+\n • Keep incident participants and stakeholders informed by creating tactical and executive reports using the `{{tactical_report_command}}` and `{{executive_report_command}}` commands.
+\n • Get links to incident resources from the <{{dispatch_ui_incident_url}}|Dispatch Web UI> or bookmarks in the incident conversation.
+\n
+To find a Slack command, simply type `/` in the message field or click the lightning bolt icon to the left of the message field.
+"""
+
+ONCALL_SHIFT_FEEDBACK_DESCRIPTION = """
+Hi {{ individual_name }}, it appears that your {{ oncall_service_name }} shift recently completed on {{ shift_end_at }} UTC. To help us understand the impact on our responders, we would appreciate your feedback."""
+
+ONCALL_SHIFT_FEEDBACK_RECEIVED_DESCRIPTION = """
+We received your feedback for your shift that ended {{ shift_end_at }} UTC. Thank you!"""
+
+
+INCIDENT_NAME_WITH_ENGAGEMENT = {
+ "title": "🚨 {{name}} Incident Notification",
+ "title_link": "{{ticket_weblink}}",
+ "text": NOTIFICATION_PURPOSES_FYI,
+ "buttons": [
+ {
+ "button_text": "Subscribe",
+ "button_value": "{{organization_slug}}-{{incident_id}}",
+ "button_action": ConversationButtonActions.subscribe_user,
+ },
+ {
+ "button_text": "Join",
+ "button_value": "{{organization_slug}}-{{incident_id}}",
+ "button_action": ConversationButtonActions.invite_user,
+ },
+ ],
+}
+
+INCIDENT_NAME_WITH_ENGAGEMENT_NO_DESCRIPTION = {
+ "title": "{{name}}",
+ "title_link": "{{ticket_weblink}}",
+ "text": "{{ignore}}",
+ "buttons": [
+ {
+ "button_text": "Subscribe",
+ "button_value": "{{organization_slug}}-{{incident_id}}",
+ "button_action": ConversationButtonActions.subscribe_user,
+ },
+ {
+ "button_text": "Join",
+ "button_value": "{{organization_slug}}-{{incident_id}}",
+ "button_action": ConversationButtonActions.invite_user,
+ },
+ ],
+}
+
+INCIDENT_NAME_WITH_ENGAGEMENT_NO_SELF_JOIN = {
+ "title": "🚨 {{name}} Incident Notification",
+ "title_link": "{{ticket_weblink}}",
+ "text": NOTIFICATION_PURPOSES_FYI,
+ "buttons": [
+ {
+ "button_text": "Subscribe",
+ "button_value": "{{organization_slug}}-{{incident_id}}",
+ "button_action": ConversationButtonActions.subscribe_user,
+ },
+ ],
+}
+
+CASE_NAME = {
+ "title": "💼 {{name}} Case Notification",
+ "title_link": "{{ticket_weblink}}",
+ "text": NOTIFICATION_PURPOSES_FYI,
+}
+
+CASE_NAME_WITH_ENGAGEMENT = {
+ "title": "💼 {{name}} Case Notification",
+ "title_link": "{{ticket_weblink}}",
+ "text": NOTIFICATION_PURPOSES_FYI,
+ "buttons": [
+ {
+ "button_text": "Join",
+ "button_value": "{{organization_slug}}-{{case_id}}",
+ "button_action": ConversationButtonActions.invite_user_case,
+ },
+ ],
+}
+
+CASE_NAME_WITH_ENGAGEMENT_NO_DESCRIPTION = {
+ "title": "{{name}}",
+ "title_link": "{{ticket_weblink}}",
+ "text": "{{ignore}}",
+ "buttons": [
+ {
+ "button_text": "Join",
+ "button_value": "{{organization_slug}}-{{case_id}}",
+ "button_action": ConversationButtonActions.invite_user_case,
+ },
+ ],
+}
+
+CASE_NAME_WITH_ENGAGEMENT_NO_SELF_JOIN = {
+ "title": "💼 {{name}} Case Notification",
+ "title_link": "{{ticket_weblink}}",
+ "text": NOTIFICATION_PURPOSES_FYI,
+}
+
+
+CASE_STATUS_CHANGE = {
+ "title": "{% set status_emojis = {'Closed': '✅', 'New': '🆕', 'Triage': '🔍', 'Stable': '🛡️', 'Escalated': '⬆️'} %}{{ status_emojis.get(case_status_new, '📄') }} Status Change",
+ "text": "{{ case_status_old }} → {{ case_status_new }}",
+}
+
+CASE_TYPE_CHANGE = {
+ "title": "🏷️ Case Type Change",
+ "text": "{{ case_type_old }} → {{ case_type_new }}",
+}
+
+CASE_SEVERITY_CHANGE = {
+ "title": "{% if case_severity_old.view_order < case_severity_new.view_order %}⬆️{% elif case_severity_old.view_order > case_severity_new.view_order %}⬇️{% else %}↔️{% endif %} Severity Change",
+ "text": "{{ case_severity_old.name }} → {{ case_severity_new.name }}",
+}
+
+CASE_PRIORITY_CHANGE = {
+ "title": "{% if case_priority_old.view_order < case_priority_new.view_order %}⬆️{% elif case_priority_old.view_order > case_priority_new.view_order %}⬇️{% else %}↔️{% endif %} Priority Change",
+ "text": "{{ case_priority_old.name }} → {{ case_priority_new.name }}",
+}
+
+CASE_VISIBILITY_CHANGE = {
+ "title": "{% set visibility_emojis = {'Open': '🔓', 'Restricted': '🔒'} %}{{ visibility_emojis.get(case_visibility_new, '👁️') }} Visibility Change",
+ "text": "{{ case_visibility_old }} → {{ case_visibility_new }}",
+}
+
+INCIDENT_NAME = {
+ "title": "🚨 {{name}} Incident Notification",
+ "title_link": "{{ticket_weblink}}",
+ "text": NOTIFICATION_PURPOSES_FYI,
+}
+
+INCIDENT_NAME_SUMMARY = {
+ "title": "{{name}} Incident Summary",
+ "title_link": "{{ticket_weblink}}",
+ "text": "{{ignore}}",
+}
+
+INCIDENT_SUMMARY = {"title": "Summary", "text": "{{summary}}"}
+
+INCIDENT_TITLE = {"title": "📖 Title", "text": "{{title}}"}
+
+CASE_TITLE = {"title": "📖 Title", "text": "{{title}}"}
+
+CASE_STATUS = {
+ "title": "Status - {{status}}",
+ "status_mapping": CASE_STATUS_DESCRIPTIONS,
+}
+
+FORM_TYPE_DESCRIPTION = {
+ "title": "{{form_type}} form for incident {{name}}",
+ "title_link": "{{form_weblink}}",
+ "text": "{{form_type_description}}",
+}
+
+INCIDENT_DATA_SECTION = {
+ "type": "context",
+ "text": "Key details about incident {{name}}",
+}
+
+if config.DISPATCH_MARKDOWN_IN_INCIDENT_DESC:
+ INCIDENT_DESCRIPTION = {"title": "Description", "text": "{{description | markdown}}"}
+else:
+ INCIDENT_DESCRIPTION = {"title": "Description", "text": "{{description}}"}
+
+INCIDENT_STATUS = {
+ "title": "Status - {{status}}",
+ "status_mapping": INCIDENT_STATUS_DESCRIPTIONS,
+}
+
+INCIDENT_VISIBILITY = {
+ "title": "Visibility - {{visibility}}",
+ "visibility_mapping": INCIDENT_VISIBILITY_DESCRIPTIONS,
+}
+
+INCIDENT_TYPE = {"title": "Type - {{type}}", "text": "{{type_description}}"}
+
+INCIDENT_SEVERITY = {
+ "title": "Severity - {{severity}}",
+ "text": "{{severity_description}}",
+}
+
+INCIDENT_SEVERITY_FYI = {
+ "title": "Severity - {{severity}}",
+ "text": "{{severity_description}}",
+}
+INCIDENT_PRIORITY = {
+ "title": "Priority - {{priority}}",
+ "text": "{{priority_description}}",
+}
+
+INCIDENT_PRIORITY_FYI = {
+ "title": "Priority - {{priority}}",
+ "text": "{{priority_description}}",
+}
+
+INCIDENT_REPORTER = {
+ "title": "Reporter - {{reporter_fullname}}, {{reporter_team}}",
+ "title_link": "{{reporter_weblink}}",
+ "text": INCIDENT_REPORTER_DESCRIPTION,
+}
+
+INCIDENT_COMMANDER = {
+ "title": "🧑‍🚒 Commander - {{commander_fullname}}, {{commander_team}}",
+ "title_link": "{{commander_weblink}}",
+ "text": INCIDENT_COMMANDER_DESCRIPTION,
+}
+
+INCIDENT_COMMANDER_SUMMARY = {
+ "title": "Commander - {{commander_fullname}}, {{commander_team}}",
+ "title_link": "{{commander_weblink}}",
+ "text": "{{ignore}}",
+}
+
+INCIDENT_CONFERENCE = {
+ "title": "Conference",
+ "title_link": "{{conference_weblink}}",
+ "text": INCIDENT_CONFERENCE_DESCRIPTION,
+}
+
+INCIDENT_STORAGE = {
+ "title": "Storage",
+ "title_link": "{{storage_weblink}}",
+ "text": STORAGE_DESCRIPTION,
+}
+
+INCIDENT_CONVERSATION_COMMANDS_REFERENCE_DOCUMENT = {
+ "title": "Incident Conversation Commands Reference Document",
+ "title_link": "{{conversation_commands_reference_document_weblink}}",
+ "text": INCIDENT_CONVERSATION_REFERENCE_DOCUMENT_DESCRIPTION,
+}
+
+INCIDENT_INVESTIGATION_DOCUMENT = {
+ "title": "Investigation Document",
+ "title_link": "{{document_weblink}}",
+ "text": INCIDENT_INVESTIGATION_DOCUMENT_DESCRIPTION,
+}
+
+INCIDENT_REVIEW_DOCUMENT = {
+ "title": "Review Document",
+ "title_link": "{{review_document_weblink}}",
+ "text": INCIDENT_REVIEW_DOCUMENT_DESCRIPTION,
+}
+
+INCIDENT_FAQ_DOCUMENT = {
+ "title": "FAQ Document",
+ "title_link": "{{faq_weblink}}",
+ "text": INCIDENT_FAQ_DOCUMENT_DESCRIPTION,
+}
+
+INCIDENT_STATUS_CHANGE = {
+ "title": "{% set status_emojis = {'Closed': '✅', 'Stable': '🛡️', 'Active': '🔥'} %}{{ status_emojis.get(incident_status_new, '📄') }} Status Change",
+ "text": "{{ incident_status_old }} → {{ incident_status_new }}",
+}
+
+INCIDENT_TYPE_CHANGE = {
+ "title": "🏷️ Incident Type Change",
+ "text": "{{ incident_type_old }} → {{ incident_type_new }}",
+}
+
+INCIDENT_SEVERITY_CHANGE = {
+ "title": "{% if incident_severity_old.view_order < incident_severity_new.view_order %}⬆️{% elif incident_severity_old.view_order > incident_severity_new.view_order %}⬇️{% else %}↔️{% endif %} Severity Change",
+ "text": "{{ incident_severity_old.name }} → {{ incident_severity_new.name }}",
+}
+
+INCIDENT_PRIORITY_CHANGE = {
+ "title": "{% if incident_priority_old.view_order < incident_priority_new.view_order %}⬆️{% elif incident_priority_old.view_order > incident_priority_new.view_order %}⬇️{% else %}↔️{% endif %} Priority Change",
+ "text": "{{ incident_priority_old.name }} → {{ incident_priority_new.name }}",
+}
+
+INCIDENT_PARTICIPANT_SUGGESTED_READING_ITEM = {
+ "title": "{{name}}",
+ "title_link": "{{weblink}}",
+ "text": "{{description}}",
+}
+
+INCIDENT_PARTICIPANT_WELCOME = {
+ "title": "Welcome to {{name}}",
+ "title_link": "{{ticket_weblink}}",
+ "text": INCIDENT_PARTICIPANT_WELCOME_DESCRIPTION,
+}
+
+INCIDENT_COMPLETED_FORM = {
+ "title": "Completed {{form_type}} form notification for {{name}}",
+ "text": INCIDENT_COMPLETED_FORM_DESCRIPTION,
+}
+
+INCIDENT_PARTICIPANT_WELCOME_MESSAGE = [
+ INCIDENT_PARTICIPANT_WELCOME,
+ INCIDENT_TITLE,
+ INCIDENT_DESCRIPTION,
+ INCIDENT_VISIBILITY,
+ INCIDENT_STATUS,
+ INCIDENT_TYPE,
+ INCIDENT_SEVERITY,
+ INCIDENT_PRIORITY,
+ INCIDENT_REPORTER,
+ INCIDENT_COMMANDER,
+ INCIDENT_INVESTIGATION_DOCUMENT,
+ INCIDENT_STORAGE,
+ INCIDENT_CONFERENCE,
+ INCIDENT_CONVERSATION_COMMANDS_REFERENCE_DOCUMENT,
+ INCIDENT_FAQ_DOCUMENT,
+]
+
+INCIDENT_COMPLETED_FORM_MESSAGE = [
+ INCIDENT_COMPLETED_FORM,
+ FORM_TYPE_DESCRIPTION,
+ INCIDENT_DATA_SECTION,
+ INCIDENT_TITLE,
+ INCIDENT_DESCRIPTION,
+ INCIDENT_STATUS,
+ INCIDENT_COMMANDER,
+]
+
+INCIDENT_NOTIFICATION_COMMON = [INCIDENT_TITLE]
+
+INCIDENT_NOTIFICATION = INCIDENT_NOTIFICATION_COMMON.copy()
+INCIDENT_NOTIFICATION.extend(
+ [
+ INCIDENT_DESCRIPTION,
+ INCIDENT_STATUS,
+ INCIDENT_TYPE,
+ INCIDENT_SEVERITY_FYI,
+ INCIDENT_PRIORITY_FYI,
+ INCIDENT_REPORTER,
+ INCIDENT_COMMANDER,
+ ]
+)
+
+INCIDENT_TACTICAL_REPORT = [
+ {"title": "Incident Tactical Report", "text": INCIDENT_TACTICAL_REPORT_DESCRIPTION},
+ {"title": "Conditions", "text": "{{conditions}}"},
+ {"title": "Actions", "text": "{{actions}}"},
+ {"title": "Needs", "text": "{{needs}}"},
+]
+
+INCIDENT_EXECUTIVE_REPORT = [
+ {"title": "Incident Title", "text": "{{title}}"},
+ {"title": "Current Status", "text": "{{current_status}}"},
+ {"title": "Overview", "text": "{{overview}}"},
+ {"title": "Next Steps", "text": "{{next_steps}}"},
+]
+
+REMIND_AGAIN_OPTIONS = {
+ "text": "[Optional] Remind me again in:",
+ "select": {
+ "placeholder": "Choose a time value",
+ "select_action": ConversationButtonActions.remind_again,
+ "options": [
+ {
+ "option_text": value["message"],
+ "option_value": "{{organization_slug}}-{{incident_id}}-{{report_type}}-" + key,
+ }
+ for key, value in reminder_select_values.items()
+ ],
+ },
+}
+
+INCIDENT_REPORT_REMINDER = [
+ {
+ "title": "{{name}} Incident - {{report_type}} Reminder",
+ "title_link": "{{ticket_weblink}}",
+ "text": INCIDENT_REPORT_REMINDER_DESCRIPTION,
+ },
+ INCIDENT_TITLE,
+ REMIND_AGAIN_OPTIONS,
+]
+
+INCIDENT_REPORT_REMINDER_DELAYED = [
+ {
+ "title": "{{name}} Incident - {{report_type}} Reminder",
+ "title_link": "{{ticket_weblink}}",
+ "text": INCIDENT_REPORT_REMINDER_DELAYED_DESCRIPTION,
+ },
+ INCIDENT_TITLE,
+ REMIND_AGAIN_OPTIONS,
+]
+
+
+INCIDENT_CLOSE_REMINDER = [
+ {
+ "title": "{{name}} Incident - Close Reminder",
+ "title_link": "{{dispatch_ui_incident_url}}",
+ "text": INCIDENT_CLOSE_REMINDER_DESCRIPTION,
+ },
+ INCIDENT_TITLE,
+ INCIDENT_STATUS,
+]
+
+CASE_DESCRIPTION = {"title": "Description", "text": "{{description}}"}
+
+CASE_VISIBILITY = {
+ "title": "Visibility - {{visibility}}",
+ "visibility_mapping": CASE_VISIBILITY_DESCRIPTIONS,
+}
+
+CASE_TYPE = {"title": "Type - {{type}}", "text": "{{type_description}}"}
+
+CASE_SEVERITY = {
+ "title": "Severity - {{severity}}",
+ "text": "{{severity_description}}",
+}
+
+CASE_PRIORITY = {
+ "title": "Priority - {{priority}}",
+ "text": "{{priority_description}}",
+}
+
+CASE_CLOSE_REMINDER = [
+ {
+ "title": "{{name}} Case - Close Reminder",
+ "title_link": "{{dispatch_ui_case_url}}",
+ "text": CASE_CLOSE_REMINDER_DESCRIPTION,
+ },
+ CASE_TITLE,
+ CASE_STATUS,
+]
+
+CASE_TRIAGE_REMINDER = [
+ {
+ "title": "{{name}} Case - Triage Reminder",
+ "title_link": "{{dispatch_ui_case_url}}",
+ "text": CASE_TRIAGE_REMINDER_DESCRIPTION,
+ },
+ CASE_TITLE,
+ CASE_STATUS,
+]
+
+CASE_ASSIGNEE_DESCRIPTION = """
+The Case Assignee is responsible for
+knowing the full context of the case.
+Contact them with any questions or concerns.""".replace(
+ "\n", " "
+).strip()
+
+CASE_REPORTER_DESCRIPTION = """
+The person who reported the case. Contact them if the report details need clarification.""".replace(
+ "\n", " "
+).strip()
+
+CASE_REPORTER = {
+ "title": "Reporter - {{reporter_fullname}}, {{reporter_team}}",
+ "title_link": "{{reporter_weblink}}",
+ "text": CASE_REPORTER_DESCRIPTION,
+}
+
+CASE_ASSIGNEE = {
+ "title": "🕵️‍♂️ Assignee - {{assignee_fullname}}, {{assignee_team}}",
+ "title_link": "{{assignee_weblink}}",
+ "text": CASE_ASSIGNEE_DESCRIPTION,
+}
+
+CASE_CONFERENCE = {
+ "title": "Conference",
+ "title_link": "{{conference_weblink}}",
+ "text": CASE_CONFERENCE_DESCRIPTION,
+}
+
+CASE_STORAGE = {
+ "title": "Storage",
+ "title_link": "{{storage_weblink}}",
+ "text": STORAGE_DESCRIPTION,
+}
+
+CASE_CONVERSATION_COMMANDS_REFERENCE_DOCUMENT = {
+ "title": "Case Conversation Commands Reference Document",
+ "title_link": "{{conversation_commands_reference_document_weblink}}",
+ "text": CASE_CONVERSATION_REFERENCE_DOCUMENT_DESCRIPTION,
+}
+
+CASE_INVESTIGATION_DOCUMENT = {
+ "title": "Investigation Document",
+ "title_link": "{{document_weblink}}",
+ "text": CASE_INVESTIGATION_DOCUMENT_DESCRIPTION,
+}
+
+
+CASE_FAQ_DOCUMENT = {
+ "title": "FAQ Document",
+ "title_link": "{{faq_weblink}}",
+ "text": CASE_FAQ_DOCUMENT_DESCRIPTION,
+}
+
+CASE_PARTICIPANT_WELCOME = {
+ "title": "Welcome to {{name}}",
+ "title_link": "{{ticket_weblink}}",
+ "text": CASE_PARTICIPANT_WELCOME_DESCRIPTION,
+}
+
+CASE_NOTIFICATION_COMMON = [CASE_TITLE]
+
+CASE_NOTIFICATION = CASE_NOTIFICATION_COMMON.copy()
+CASE_NOTIFICATION.extend(
+ [
+ CASE_DESCRIPTION,
+ CASE_STATUS,
+ CASE_TYPE,
+ CASE_SEVERITY,
+ CASE_PRIORITY,
+ CASE_REPORTER,
+ CASE_ASSIGNEE,
+ ]
+)
+
+CASE_PARTICIPANT_WELCOME_MESSAGE = [
+ CASE_PARTICIPANT_WELCOME,
+ CASE_TITLE,
+ CASE_DESCRIPTION,
+ CASE_VISIBILITY,
+ CASE_STATUS,
+ CASE_TYPE,
+ CASE_SEVERITY,
+ CASE_PRIORITY,
+ CASE_REPORTER,
+ CASE_ASSIGNEE,
+ CASE_INVESTIGATION_DOCUMENT,
+ CASE_STORAGE,
+ CASE_CONFERENCE,
+ CASE_CONVERSATION_COMMANDS_REFERENCE_DOCUMENT,
+ CASE_FAQ_DOCUMENT,
+]
+
+
+INCIDENT_TASK_REMINDER = [
+ {"title": "Incident - {{ name }}", "text": "{{ title }}"},
+ {"title": "Creator", "text": "{{ creator }}"},
+ {"title": "Description", "text": "{{ description }}"},
+ {"title": "Priority", "text": "{{ priority }}"},
+ {"title": "Created At", "text": "", "datetime": "{{ created_at }}"},
+ {"title": "Resolve By", "text": "", "datetime": "{{ resolve_by }}"},
+ {"title": "Link", "text": "{{ weblink }}"},
+]
+
+EVERGREEN_REMINDER = [
+ {"title": "Project", "text": "{{ project }}"},
+ {"title": "Type", "text": "{{ resource_type }}"},
+ {"title": "Name", "text": "{{ name }}"},
+ {"title": "Description", "text": "{{ description }}"},
+ {"title": "Link", "text": "{{ weblink }}"},
+]
+
+INCIDENT_NEW_ROLE_NOTIFICATION = [
+ {
+ "title": "New {{assignee_role}} - {{assignee_fullname if assignee_fullname else assignee_email}}",
+ "title_link": "{{assignee_weblink}}",
+ "text": INCIDENT_NEW_ROLE_DESCRIPTION,
+ }
+]
+
+INCIDENT_TASK_NEW_NOTIFICATION = [
+ {
+ "title": "New Incident Task",
+ "title_link": "{{task_weblink}}",
+ "text": INCIDENT_TASK_NEW_DESCRIPTION,
+ }
+]
+
+INCIDENT_TASK_RESOLVED_NOTIFICATION = [
+ {
+ "title": "Resolved Incident Task",
+ "title_link": "{{task_weblink}}",
+ "text": INCIDENT_TASK_RESOLVED_DESCRIPTION,
+ }
+]
+
+INCIDENT_MONITOR_CREATED_NOTIFICATION = [
+ {
+ "title": "Monitor Created",
+ "title_link": "{{weblink}}",
+ "text": INCIDENT_MONITOR_CREATED_DESCRIPTION,
+ }
+]
+
+INCIDENT_MONITOR_UPDATE_NOTIFICATION = [
+ {
+ "title": "Monitor Status Change",
+ "title_link": "{{weblink}}",
+ "text": INCIDENT_MONITOR_UPDATE_DESCRIPTION,
+ }
+]
+
+INCIDENT_MONITOR_IGNORE_NOTIFICATION = [
+ {
+ "title": "Monitor Ignored",
+ "title_link": "{{weblink}}",
+ "text": INCIDENT_MONITOR_IGNORED_DESCRIPTION,
+ }
+]
+
+INCIDENT_WORKFLOW_CREATED_NOTIFICATION = [
+ {
+ "title": "Workflow Created - {{workflow_name}}",
+ "text": INCIDENT_WORKFLOW_CREATED_DESCRIPTION,
+ }
+]
+
+INCIDENT_WORKFLOW_UPDATE_NOTIFICATION = [
+ {
+ "title": "Workflow Status Change - {{workflow_name}}",
+ "title_link": "{{instance_weblink}}",
+ "text": INCIDENT_WORKFLOW_UPDATE_DESCRIPTION,
+ }
+]
+
+INCIDENT_WORKFLOW_COMPLETE_NOTIFICATION = [
+ {
+ "title": "Workflow Completed - {{workflow_name}}",
+ "title_link": "{{instance_weblink}}",
+ "text": INCIDENT_WORKFLOW_COMPLETE_DESCRIPTION,
+ }
+]
+
+INCIDENT_COMMANDER_READDED_NOTIFICATION = [
+ {"title": "Incident Commander Re-Added", "text": INCIDENT_COMMANDER_READDED_DESCRIPTION}
+]
+
+INCIDENT_CLOSED_INFORMATION_REVIEW_REMINDER_NOTIFICATION = [
+ {
+ "title": "{{name}} Incident - Information Review Reminder",
+ "title_link": "{{dispatch_ui_incident_url}}",
+ "text": INCIDENT_CLOSED_INFORMATION_REVIEW_REMINDER_DESCRIPTION,
+ }
+]
+
+INCIDENT_CLOSED_RATING_FEEDBACK_NOTIFICATION = [
+ {
+ "title": "{{name}} Incident - Rating and Feedback",
+ "title_link": "{{ticket_weblink}}",
+ "text": INCIDENT_CLOSED_RATING_FEEDBACK_DESCRIPTION,
+ "buttons": [
+ {
+ "button_text": "Provide Feedback",
+ "button_value": "{{organization_slug}}-{{incident_id}}",
+ "button_action": ConversationButtonActions.feedback_notification_provide,
+ }
+ ],
+ }
+]
+
+CASE_CLOSED_RATING_FEEDBACK_NOTIFICATION = [
+ {
+ "title": "{{name}} Case - Rating and Feedback",
+ "title_link": "{{ticket_weblink}}",
+ "text": CASE_CLOSED_RATING_FEEDBACK_DESCRIPTION,
+ "buttons": [
+ {
+ "button_text": "Provide Feedback",
+ "button_value": "{{organization_slug}}-{{case_id}}",
+ "button_action": ConversationButtonActions.case_feedback_notification_provide,
+ }
+ ],
+ }
+]
+
+INCIDENT_FEEDBACK_DAILY_REPORT = [
+ {"title": "Incident", "text": "{{ name }}"},
+ {"title": "Incident Title", "text": "{{ title }}"},
+ {"title": "Rating", "text": "{{ rating }}"},
+ {"title": "Feedback", "text": "{{ feedback }}"},
+ {"title": "Participant", "text": "{{ participant }}"},
+ {"title": "Created At", "text": "", "datetime": "{{ created_at }}"},
+]
+
+CASE_FEEDBACK_DAILY_REPORT = [
+ {"title": "Case", "text": "{{ name }}"},
+ {"title": "Case Title", "text": "{{ title }}"},
+ {"title": "Rating", "text": "{{ rating }}"},
+ {"title": "Feedback", "text": "{{ feedback }}"},
+ {"title": "Participant", "text": "{{ participant }}"},
+ {"title": "Created At", "text": "", "datetime": "{{ created_at }}"},
+]
+
+INCIDENT_WEEKLY_REPORT_HEADER = {
+ "type": "header",
+ "text": INCIDENT_WEEKLY_REPORT_TITLE,
+}
+
+INCIDENT_WEEKLY_REPORT_HEADER_DESCRIPTION = {
+ "text": INCIDENT_WEEKLY_REPORT_DESCRIPTION,
+}
+
+INCIDENT_WEEKLY_REPORT_HEADER_NO_INCIDENTS_DESCRIPTION = {
+ "text": INCIDENT_WEEKLY_REPORT_NO_INCIDENTS_DESCRIPTION,
+}
+
+INCIDENT_DAILY_REPORT_HEADER = {
+ "type": "header",
+ "text": INCIDENT_DAILY_REPORT_TITLE,
+}
+
+INCIDENT_DAILY_REPORT_HEADER_DESCRIPTION = {
+ "text": INCIDENT_DAILY_REPORT_DESCRIPTION,
+}
+
+INCIDENT_DAILY_REPORT_FOOTER = {
+ "type": "context",
+ "text": INCIDENT_DAILY_REPORT_FOOTER_CONTEXT,
+}
+
+INCIDENT_DAILY_REPORT = [
+ INCIDENT_DAILY_REPORT_HEADER,
+ INCIDENT_DAILY_REPORT_HEADER_DESCRIPTION,
+ INCIDENT_DAILY_REPORT_FOOTER,
+]
+
+INCIDENT_WEEKLY_REPORT = [
+ INCIDENT_WEEKLY_REPORT_HEADER,
+ INCIDENT_WEEKLY_REPORT_HEADER_DESCRIPTION,
+ INCIDENT_DAILY_REPORT_FOOTER,
+]
+
+INCIDENT_WEEKLY_REPORT_NO_INCIDENTS = [
+ INCIDENT_WEEKLY_REPORT_HEADER,
+ INCIDENT_WEEKLY_REPORT_HEADER_NO_INCIDENTS_DESCRIPTION,
+]
+
+INCIDENT = [
+ INCIDENT_NAME_WITH_ENGAGEMENT_NO_DESCRIPTION,
+ INCIDENT_TITLE,
+ INCIDENT_STATUS,
+ INCIDENT_TYPE,
+ INCIDENT_SEVERITY,
+ INCIDENT_PRIORITY,
+ INCIDENT_COMMANDER,
+]
+
+INCIDENT_SUMMARY_TEMPLATE = [
+ INCIDENT_NAME_SUMMARY,
+ INCIDENT_TITLE,
+ INCIDENT_COMMANDER_SUMMARY,
+ INCIDENT_SUMMARY,
+]
+
+INCIDENT_MANAGEMENT_HELP_TIPS_MESSAGE = [
+ {
+ "title": "{{name}} Incident - Management Help Tips",
+ "text": INCIDENT_MANAGEMENT_HELP_TIPS_MESSAGE_DESCRIPTION,
+ }
+]
+
+INCIDENT_OPEN_TASKS = [
+ {
+ "title": "{{title}}",
+ "text": INCIDENT_OPEN_TASKS_DESCRIPTION,
+ }
+]
+
+INCIDENT_TASK_ADD_TO_INCIDENT = [
+ {
+ "title": "{{title}}",
+ "text": INCIDENT_TASK_ADD_TO_INCIDENT_DESCRIPTION,
+ }
+]
+
+ONCALL_SHIFT_FEEDBACK_BUTTONS = [
+ {
+ "button_text": "Provide Feedback",
+ "button_value": "{{organization_slug}}|{{project_id}}|{{oncall_schedule_id}}|{{shift_end_at}}|{{reminder_id}}|{{details}}",
+ "button_action": ConversationButtonActions.service_feedback,
+ }
+]
+
+ONCALL_SHIFT_FEEDBACK_NOTIFICATION = [
+ {
+ "title": "Oncall Shift Feedback",
+ "text": ONCALL_SHIFT_FEEDBACK_DESCRIPTION,
+ "buttons": ONCALL_SHIFT_FEEDBACK_BUTTONS,
+ }
+]
+
+ONCALL_SHIFT_FEEDBACK_NOTIFICATION_REMINDER = [
+ {
+ "title": "Oncall Shift Feedback - REMINDER",
+ "text": ONCALL_SHIFT_FEEDBACK_DESCRIPTION,
+ "buttons": ONCALL_SHIFT_FEEDBACK_BUTTONS,
+ }
+]
+
+ONCALL_SHIFT_FEEDBACK_RECEIVED = [
+ {
+ "title": "Oncall Shift Feedback - RECEIVED",
+ "text": ONCALL_SHIFT_FEEDBACK_RECEIVED_DESCRIPTION,
+ }
+]
+
+
+def render_message_template(message_template: list[dict], **kwargs):
+ """Renders the jinja data included in the template itself."""
+ data = []
+ new_copy = copy.deepcopy(message_template)
+ for d in new_copy:
+ if d.get("header"):
+ d["header"] = env.from_string(d["header"]).render(**kwargs)
+
+ if d.get("title"):
+ d["title"] = env.from_string(d["title"]).render(**kwargs)
+
+ if d.get("title_link"):
+ d["title_link"] = env.from_string(d["title_link"]).render(**kwargs)
+
+ if d["title_link"] == "None": # skip blocks with no content
+ continue
+
+ # skip blocks that do not have new links rendered, as no real value was provided
+ if not d["title_link"]:
+ continue
+
+ if d.get("text"):
+ d["text"] = env.from_string(d["text"]).render(**kwargs)
+
+ # NOTE: we truncate the string to 2500 characters
+ # to prevent hitting limits on SaaS integrations (e.g. Slack)
+ d["text"] = d["text"] if len(d["text"]) <= 2500 else d["text"][:2500]
+
+ # render a new button array given the template
+ if d.get("buttons"):
+ for button in d["buttons"]:
+ button["button_text"] = env.from_string(button["button_text"]).render(**kwargs)
+ button["button_value"] = env.from_string(button["button_value"]).render(**kwargs)
+
+ if button.get("button_action"):
+ button["button_action"] = env.from_string(button["button_action"]).render(
+ **kwargs
+ )
+
+ if button.get("button_url"):
+ button["button_url"] = env.from_string(button["button_url"]).render(**kwargs)
+
+ # render drop-down list
+ if select := d.get("select"):
+ if placeholder := select.get("placeholder"):
+ select["placeholder"] = env.from_string(placeholder).render(**kwargs)
+
+ select["select_action"] = env.from_string(select["select_action"]).render(**kwargs)
+
+ for option in select["options"]:
+ option["option_text"] = env.from_string(option["option_text"]).render(**kwargs)
+ option["option_value"] = env.from_string(option["option_value"]).render(**kwargs)
+
+ if d.get("visibility_mapping"):
+ d["text"] = d["visibility_mapping"][kwargs["visibility"]]
+
+ if d.get("status_mapping"):
+ d["text"] = d["status_mapping"][kwargs["status"]]
+
+ if d.get("datetime"):
+ d["datetime"] = env.from_string(d["datetime"]).render(**kwargs)
+
+ if d.get("context"):
+ d["context"] = env.from_string(d["context"]).render(**kwargs)
+
+ data.append(d)
+
+ return data
+
+
+def generate_welcome_message(
+ welcome_message: EmailTemplates | None, is_incident: bool = True
+) -> list[dict]:
+ """Generates the welcome message."""
+ if welcome_message is None:
+ if is_incident:
+ return INCIDENT_PARTICIPANT_WELCOME_MESSAGE
+ else:
+ return CASE_PARTICIPANT_WELCOME_MESSAGE
+
+ participant_welcome = {
+ "title": welcome_message.welcome_text,
+ "title_link": "{{ticket_weblink}}",
+ "text": welcome_message.welcome_body,
+ }
+
+ component_mapping = {
+ "Title": INCIDENT_TITLE if is_incident else CASE_TITLE,
+ "Description": INCIDENT_DESCRIPTION if is_incident else CASE_DESCRIPTION,
+ "Visibility": INCIDENT_VISIBILITY if is_incident else CASE_VISIBILITY,
+ "Status": INCIDENT_STATUS if is_incident else CASE_STATUS,
+ "Type": INCIDENT_TYPE if is_incident else CASE_TYPE,
+ "Severity": INCIDENT_SEVERITY if is_incident else CASE_SEVERITY,
+ "Priority": INCIDENT_PRIORITY if is_incident else CASE_PRIORITY,
+ "Reporter": INCIDENT_REPORTER if is_incident else CASE_REPORTER,
+ "Commander": INCIDENT_COMMANDER if is_incident else CASE_ASSIGNEE,
+ "Investigation Document": (
+ INCIDENT_INVESTIGATION_DOCUMENT if is_incident else CASE_INVESTIGATION_DOCUMENT
+ ),
+ "Storage": INCIDENT_STORAGE if is_incident else CASE_STORAGE,
+ "Conference": INCIDENT_CONFERENCE if is_incident else CASE_CONFERENCE,
+ "Slack Commands": (
+ INCIDENT_CONVERSATION_COMMANDS_REFERENCE_DOCUMENT
+ if is_incident
+ else CASE_CONVERSATION_COMMANDS_REFERENCE_DOCUMENT
+ ),
+ "FAQ Document": INCIDENT_FAQ_DOCUMENT if is_incident else CASE_FAQ_DOCUMENT,
+ }
+
+ message = [participant_welcome]
+
+ for component in component_mapping.keys():
+ # if the component type is in welcome_message.components, then add it
+ if component in welcome_message.components:
+ message.append(component_mapping[component])
+
+ return message
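The `render_message_template` helper above is easiest to understand outside the patch. The sketch below is a standalone, stdlib-only approximation: the real module renders each field through a shared Jinja2 `env` (including `{% if %}` tags), while the regex substitution here is only a stand-in that handles plain `{{ variable }}` placeholders. The template dicts are copied from the definitions in this file; `render_blocks` is an illustrative name, not part of Dispatch.

```python
import copy
import re

def _render(template: str, **kwargs) -> str:
    # Stand-in for the module's shared Jinja environment: substitutes
    # {{ variable }} placeholders only, not {% ... %} tags.
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(kwargs.get(m.group(1), "")),
        template,
    )

# Copied from the template definitions in this patch.
INCIDENT_TITLE = {"title": "Title", "text": "{{title}}"}
CASE_DESCRIPTION = {"title": "Description", "text": "{{description}}"}

def render_blocks(message_template: list[dict], **kwargs) -> list[dict]:
    """Simplified mirror of render_message_template: deep-copy the template
    list so the module-level dicts stay pristine, render each templated
    field, and cap long text."""
    data = []
    for d in copy.deepcopy(message_template):
        for key in ("title", "title_link", "text"):
            if d.get(key):
                d[key] = _render(d[key], **kwargs)
        if d.get("text"):
            d["text"] = d["text"][:2500]  # same 2500-char cap as the original
        data.append(d)
    return data

blocks = render_blocks(
    [INCIDENT_TITLE, CASE_DESCRIPTION],
    title="Database outage",
    description="Primary DB is down",
)
# blocks[1]["text"] -> "Primary DB is down"
```

The deep copy matters: the templates are module-level constants shared across every notification, so rendering in place would leak one incident's values into the next message.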
diff --git a/src/dispatch/metrics.py b/src/dispatch/metrics.py
index 467063d394e6..794205d539a9 100644
--- a/src/dispatch/metrics.py
+++ b/src/dispatch/metrics.py
@@ -12,7 +12,9 @@ class Metrics(object):
def __init__(self):
if not METRIC_PROVIDERS:
- log.warning("No metric providers specified metrics will not be sent.")
+ log.info(
+ "No metric providers defined via METRIC_PROVIDERS env var. Metrics will not be sent."
+ )
else:
self._providers = METRIC_PROVIDERS
diff --git a/src/dispatch/models.py b/src/dispatch/models.py
index af5ec2e94efd..19822c9e7cc9 100644
--- a/src/dispatch/models.py
+++ b/src/dispatch/models.py
@@ -1,179 +1,164 @@
-from datetime import datetime
-from typing import List, Optional
-
-from pydantic import BaseModel
-from sqlalchemy import (
- Boolean,
- Column,
- DateTime,
- ForeignKey,
- Integer,
- PrimaryKeyConstraint,
- String,
- Table,
- event,
-)
-from sqlalchemy.ext.declarative import declared_attr
+"""Shared models and mixins for the Dispatch application."""
+
+from datetime import datetime, timedelta, timezone
-from .database import Base
+from pydantic import EmailStr
+from pydantic import Field, StringConstraints, ConfigDict, BaseModel
+from pydantic import SecretStr
-# Association tables
-definition_teams = Table(
- "definition_teams",
- Base.metadata,
- Column("definition_id", Integer, ForeignKey("definition.id")),
- Column("team_contact_id", Integer, ForeignKey("team_contact.id")),
- PrimaryKeyConstraint("definition_id", "team_contact_id"),
-)
+from sqlalchemy import Boolean, Column, DateTime, Integer, String, event, ForeignKey
+from sqlalchemy import func
+from sqlalchemy.ext.declarative import declared_attr
+from sqlalchemy.ext.hybrid import hybrid_property
+from sqlalchemy.orm import relationship
+from typing import Annotated, ClassVar
-definition_terms = Table(
- "definition_terms",
- Base.metadata,
- Column("definition_id", Integer, ForeignKey("definition.id")),
- Column("term_id", Integer, ForeignKey("term.id")),
- PrimaryKeyConstraint("definition_id", "term_id"),
-)
+# pydantic type that limits the range of primary keys
+PrimaryKey = Annotated[int, Field(gt=0, lt=2147483647)]
+NameStr = Annotated[str, StringConstraints(pattern=r".*\S.*", strip_whitespace=True, min_length=3)]
+OrganizationSlug = Annotated[str, StringConstraints(pattern=r"^[\w]+(?:_[\w]+)*$", min_length=3)]
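As an illustration of the constraint patterns above, a quick check with the stdlib `re` module (an assumption here: pydantic applies these patterns with `re.match`-style semantics, alongside the stated `min_length` and `strip_whitespace` options):

```python
import re

# Stand-in validators for the Annotated string types above; pydantic enforces
# these automatically, this just shows what the patterns accept.
NAME_PATTERN = r".*\S.*"              # at least one non-whitespace character
SLUG_PATTERN = r"^[\w]+(?:_[\w]+)*$"  # word characters, optionally underscore-joined

def is_valid_name(value: str) -> bool:
    stripped = value.strip()  # mirrors strip_whitespace=True
    return len(stripped) >= 3 and re.match(NAME_PATTERN, stripped) is not None

def is_valid_slug(value: str) -> bool:
    return len(value) >= 3 and re.match(SLUG_PATTERN, value) is not None
```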
# SQLAlchemy models...
+class ProjectMixin(object):
+ """Project mixin for adding project relationships to models."""
+
+ @declared_attr
+ def project_id(cls): # noqa
+ """Returns the project_id column."""
+ return Column(Integer, ForeignKey("project.id", ondelete="CASCADE"))
+
+ @declared_attr
+ def project(cls):
+ """Returns the project relationship."""
+ return relationship("Project")
+
+
class TimeStampMixin(object):
- """ Timestamping mixin"""
+ """Timestamping mixin for created_at and updated_at fields."""
- created_at = Column(DateTime, default=datetime.utcnow)
+ created_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))
created_at._creation_order = 9998
- updated_at = Column(DateTime, default=datetime.utcnow)
+ updated_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))
updated_at._creation_order = 9998
@staticmethod
def _updated_at(mapper, connection, target):
- target.updated_at = datetime.utcnow()
+ """Updates the updated_at field to the current UTC time."""
+ target.updated_at = datetime.now(timezone.utc)
@classmethod
def __declare_last__(cls):
+ """Registers the before_update event to update the updated_at field."""
event.listen(cls, "before_update", cls._updated_at)
class ContactMixin(TimeStampMixin):
- """ Contact mixin"""
+ """Contact mixin for contact-related fields."""
is_active = Column(Boolean, default=True)
is_external = Column(Boolean, default=False)
contact_type = Column(String)
- email = Column(String, unique=True)
+ email = Column(String)
company = Column(String)
notes = Column(String)
owner = Column(String)
class ResourceMixin(TimeStampMixin):
- """Resource mixin."""
+ """Resource mixin for resource-related fields."""
resource_type = Column(String)
resource_id = Column(String)
weblink = Column(String)
- @declared_attr
- def incident_id(cls): # noqa
- return Column(Integer, ForeignKey("incident.id"))
+class EvergreenMixin(object):
+ """Evergreen mixin for evergreen-related fields and logic."""
-# Pydantic models...
-class DispatchBase(BaseModel):
- class Config:
- orm_mode = True
- validate_assignment = True
-
+ evergreen = Column(Boolean)
+ evergreen_owner = Column(String)
+ evergreen_reminder_interval = Column(Integer, default=90) # number of days
+ evergreen_last_reminder_at = Column(DateTime, default=lambda: datetime.now(timezone.utc))
-class ContactBase(DispatchBase):
- email: str
- name: Optional[str] = None
- is_active: Optional[bool] = True
- is_external: Optional[bool] = False
- company: Optional[str] = None
- contact_type: Optional[str] = None
- notes: Optional[str] = None
- owner: Optional[str] = None
+ @hybrid_property
+ def overdue(self):
+ """Returns True if the evergreen reminder is overdue."""
+ now = datetime.now(timezone.utc)
+ if self.evergreen_last_reminder_at is not None and self.evergreen_reminder_interval is not None:
+ next_reminder = self.evergreen_last_reminder_at + timedelta(
+ days=self.evergreen_reminder_interval
+ )
+ if now >= next_reminder:
+ return True
+ return False
+ @overdue.expression
+ def overdue(cls):
+ """SQL expression for checking if the evergreen reminder is overdue."""
+ return (
+ func.date_part("day", func.now() - cls.evergreen_last_reminder_at)
+ >= cls.evergreen_reminder_interval # noqa
+ )
-class PluginOptionModel(DispatchBase):
- pass
+class FeedbackMixin(object):
+ """Feedback mixin for feedback-related fields."""
-# self referential models
-class TermNested(DispatchBase):
- id: Optional[int]
- text: str
- # disabling this for now as recursive models break swagger api gen
- # definitions: Optional[List["DefinitionNested"]] = []
+ rating = Column(String)
+ feedback = Column(String)
-class DefinitionNested(DispatchBase):
- id: Optional[int]
- text: str
- terms: Optional[List["TermNested"]] = []
-
-
-class ServiceNested(DispatchBase):
- pass
-
-
-class IndividualNested(DispatchBase):
- pass
-
-
-class TeamNested(DispatchBase):
- pass
-
-
-class TermReadNested(DispatchBase):
- id: int
- text: str
-
-
-class DefinitionReadNested(DispatchBase):
- id: int
- text: str
+# Pydantic models...
+class DispatchBase(BaseModel):
+ """Base Pydantic model with shared config for Dispatch models."""
+ model_config: ClassVar[ConfigDict] = ConfigDict(
+ from_attributes=True,
+ validate_assignment=True,
+ arbitrary_types_allowed=True,
+ str_strip_whitespace=True,
+ json_encoders={
+ # custom output conversion for datetime
+ datetime: lambda v: v.strftime("%Y-%m-%dT%H:%M:%S.%fZ") if v else None,
+ SecretStr: lambda v: v.get_secret_value() if v else None,
+ },
+ )
-class ServiceReadNested(DispatchBase):
- name: Optional[str] = None
- external_id: Optional[str] = None
- is_active: Optional[bool] = None
- type: Optional[str] = None
+class Pagination(DispatchBase):
+ """Pydantic model for paginated results."""
+ itemsPerPage: int
+ page: int
+ total: int
-class IndividualReadNested(ContactBase):
- title: Optional[str] = None
- weblink: Optional[str]
- title: Optional[str]
+class PrimaryKeyModel(BaseModel):
+ """Pydantic model for a primary key field."""
+ id: PrimaryKey
-class TeamReadNested(ContactBase):
- pass
+class EvergreenBase(DispatchBase):
+ """Base Pydantic model for evergreen resources."""
+ evergreen: bool | None = False
+ evergreen_owner: EmailStr | None = None
+ evergreen_reminder_interval: int | None = 90
+ evergreen_last_reminder_at: datetime | None = None
-class PolicyReadNested(DispatchBase):
- pass
+class ResourceBase(DispatchBase):
+ """Base Pydantic model for resource-related fields."""
+ resource_type: str | None = None
+ resource_id: str | None = None
+ weblink: str | None = None
-from dispatch.incident.models import * # noqa
-from dispatch.conversation.models import * # noqa
-from dispatch.definition.models import * # noqa
-from dispatch.document.models import Document # noqa
-from dispatch.group.models import * # noqa
-from dispatch.incident_priority.models import * # noqa
-from dispatch.incident_type.models import * # noqa
-from dispatch.individual.models import * # noqa
-from dispatch.participant.models import * # noqa
-from dispatch.participant_role.models import * # noqa
-from dispatch.policy.models import * # noqa
-from dispatch.route.models import * # noqa
-from dispatch.service.models import * # noqa
-from dispatch.status_report.models import * # noqa
-from dispatch.storage.models import * # noqa
-from dispatch.task.models import * # noqa
-from dispatch.team.models import * # noqa
-from dispatch.term.models import * # noqa
-from dispatch.tag.models import * # noqa
-from dispatch.ticket.models import * # noqa
-from dispatch.conference.models import * # noqa
+class ContactBase(DispatchBase):
+ """Base Pydantic model for contact-related fields."""
+ email: EmailStr
+ name: str | None = None
+ is_active: bool | None = True
+ is_external: bool | None = False
+ company: str | None = None
+ contact_type: str | None = None
+ notes: str | None = None
+ owner: str | None = None
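The `overdue` hybrid property on `EvergreenMixin` above can be expressed as a plain function. A stdlib sketch, with illustrative argument names rather than the ORM model itself:

```python
from datetime import datetime, timedelta, timezone

# Stdlib sketch of EvergreenMixin.overdue: a resource is overdue once
# `evergreen_reminder_interval` days have passed since the last reminder.
def is_overdue(last_reminder_at, reminder_interval_days, now=None):
    if last_reminder_at is None or reminder_interval_days is None:
        return False
    now = now or datetime.now(timezone.utc)
    return now >= last_reminder_at + timedelta(days=reminder_interval_days)

base = datetime(2024, 1, 1, tzinfo=timezone.utc)
```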
diff --git a/src/dispatch/monitor/flows.py b/src/dispatch/monitor/flows.py
new file mode 100644
index 000000000000..f9f404f7961b
--- /dev/null
+++ b/src/dispatch/monitor/flows.py
@@ -0,0 +1,34 @@
+"""
+.. module: dispatch.monitor.flows
+ :platform: Unix
+ :copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
+ :license: Apache, see LICENSE for more details.
+"""
+
+import logging
+
+from sqlalchemy.orm import Session
+
+from dispatch.plugin import service as plugin_service
+
+
+log = logging.getLogger(__name__)
+
+
+def send_monitor_notification(
+ project_id: int, conversation_id: int, message_template: str, db_session: Session, **kwargs
+):
+ """Sends a monitor notification."""
+ notification_text = "Incident Notification"
+ notification_type = "incident-notification"
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=project_id, plugin_type="conversation"
+ )
+ if not plugin:
+ log.warning("Monitor notification not sent. No conversation plugin enabled.")
+ return
+
+ plugin.instance.send(
+ conversation_id, notification_text, message_template, notification_type, **kwargs
+ )
diff --git a/src/dispatch/monitor/models.py b/src/dispatch/monitor/models.py
new file mode 100644
index 000000000000..fbd90d4172a3
--- /dev/null
+++ b/src/dispatch/monitor/models.py
@@ -0,0 +1,46 @@
+from sqlalchemy.orm import relationship
+from sqlalchemy import Column, ForeignKey, Integer, JSON, Boolean
+
+from dispatch.database.core import Base
+from dispatch.incident.models import IncidentRead
+from dispatch.plugin.models import PluginInstance, PluginInstanceRead
+from dispatch.participant.models import ParticipantRead
+from dispatch.models import (
+ PrimaryKey,
+ ResourceBase,
+ ResourceMixin,
+ TimeStampMixin,
+)
+
+
+class Monitor(Base, ResourceMixin, TimeStampMixin):
+ id = Column(Integer, primary_key=True)
+ plugin_instance_id = Column(Integer, ForeignKey(PluginInstance.id))
+ plugin_instance = relationship(PluginInstance, backref="monitors")
+ incident_id = Column(Integer, ForeignKey("incident.id", ondelete="CASCADE"))
+ incident = relationship("Incident", backref="monitors", foreign_keys=[incident_id])
+ creator_id = Column(Integer, ForeignKey("participant.id"))
+ creator = relationship(
+ "Participant", backref="created_monitor_instances", foreign_keys=[creator_id]
+ )
+ enabled = Column(Boolean, default=True)
+ status = Column(JSON)
+
+
+class MonitorBase(ResourceBase):
+ enabled: bool | None = None
+ status: dict | None = None
+
+
+class MonitorCreate(MonitorBase):
+ plugin_instance: PluginInstanceRead
+ creator: ParticipantRead
+ incident: IncidentRead
+
+
+class MonitorUpdate(MonitorBase):
+ id: PrimaryKey
+
+
+class MonitorRead(MonitorBase):
+ id: PrimaryKey
diff --git a/src/dispatch/monitor/scheduled.py b/src/dispatch/monitor/scheduled.py
new file mode 100644
index 000000000000..d6e651f8571e
--- /dev/null
+++ b/src/dispatch/monitor/scheduled.py
@@ -0,0 +1,102 @@
+import logging
+
+from sqlalchemy.orm import Session
+from schedule import every
+
+from dispatch.database.core import resolve_attr
+from dispatch.decorators import scheduled_project_task, timer
+from dispatch.incident import service as incident_service
+from dispatch.incident.enums import IncidentStatus
+from dispatch.incident.models import Incident
+from dispatch.messaging.strings import (
+ INCIDENT_MONITOR_UPDATE_NOTIFICATION,
+)
+from dispatch.plugin import service as plugin_service
+from dispatch.plugin.models import Plugin
+from dispatch.project.models import Project
+from dispatch.scheduler import scheduler
+from dispatch.monitor.models import MonitorUpdate
+from dispatch.monitor import service as monitor_service
+from dispatch.monitor.flows import send_monitor_notification
+
+
+log = logging.getLogger(__name__)
+
+MONITOR_SYNC_INTERVAL = 30 # seconds
+
+
+def run_monitors(
+ db_session: Session,
+ project: Project,
+ monitor_plugin: Plugin,
+ incidents: list[Incident],
+ notify: bool = False,
+):
+ """Performs monitor run."""
+ for incident in incidents:
+ for monitor in incident.monitors:
+ # once an instance is complete we don't update it any more
+ if not monitor.enabled:
+ continue
+
+ log.debug(f"Processing monitor. Monitor: {monitor.weblink}")
+ monitor_status = monitor_plugin.instance.get_match_status(
+ weblink=monitor.weblink,
+ last_modified=monitor.updated_at,
+ )
+
+ log.debug(f"Retrieved data from plugin. Data: {monitor_status}")
+ if not monitor_status:
+ continue
+
+ monitor_status_old = monitor.status
+ if monitor_status["state"] == monitor.status["state"]:
+ continue
+
+ monitor_service.update(
+ db_session=db_session,
+ monitor=monitor,
+ monitor_in=MonitorUpdate(
+ id=monitor.id,
+ weblink=monitor.weblink,
+ enabled=monitor.enabled,
+ status=monitor_status,
+ ),
+ )
+
+ if notify:
+ send_monitor_notification(
+ project.id,
+ incident.conversation.channel_id,
+ INCIDENT_MONITOR_UPDATE_NOTIFICATION,
+ db_session,
+ monitor_state_old=monitor_status_old["state"],
+ monitor_state_new=monitor.status["state"],
+ weblink=monitor.weblink,
+ monitor_creator_name=resolve_attr(monitor, "creator.individual.name"),
+ )
+
+
+@scheduler.add(every(MONITOR_SYNC_INTERVAL).seconds, name="sync-active-stable-monitors")
+@timer
+@scheduled_project_task
+def sync_active_stable_monitors(db_session: Session, project: Project):
+ """Syncs incident monitors for active and stable incidents."""
+ monitor_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=project.id, plugin_type="monitor"
+ )
+ if not monitor_plugin:
+ log.warning(
+ f"Incident monitors not synced. No monitor plugin enabled. Project: {project.name}. Organization: {project.organization.name}"
+ )
+ return
+
+ # we get all active and stable incidents
+ active_incidents = incident_service.get_all_by_status(
+ db_session=db_session, project_id=project.id, status=IncidentStatus.active
+ )
+ stable_incidents = incident_service.get_all_by_status(
+ db_session=db_session, project_id=project.id, status=IncidentStatus.stable
+ )
+ incidents = active_incidents + stable_incidents
+ run_monitors(db_session, project, monitor_plugin, incidents, notify=True)
diff --git a/src/dispatch/monitor/service.py b/src/dispatch/monitor/service.py
new file mode 100644
index 000000000000..2c19c453b9d9
--- /dev/null
+++ b/src/dispatch/monitor/service.py
@@ -0,0 +1,80 @@
+
+from sqlalchemy.sql.expression import true
+from dispatch.incident import service as incident_service
+from dispatch.plugin import service as plugin_service
+from dispatch.participant import service as participant_service
+
+from .models import (
+ Monitor,
+ MonitorCreate,
+ MonitorUpdate,
+)
+
+
+def get(*, db_session, monitor_id: int) -> Monitor | None:
+ """Returns a monitor based on the given monitor id."""
+ return db_session.query(Monitor).filter(Monitor.id == monitor_id).one_or_none()
+
+
+def get_all(*, db_session) -> list[Monitor | None]:
+ """Returns all monitors."""
+ return db_session.query(Monitor)
+
+
+def get_enabled(*, db_session) -> list[Monitor | None]:
+ """Fetches all enabled monitors."""
+ return db_session.query(Monitor).filter(Monitor.enabled == true()).all()
+
+
+def get_by_weblink(*, db_session, weblink: str) -> Monitor | None:
+ """Fetches a monitor by its weblink."""
+ return db_session.query(Monitor).filter(Monitor.weblink == weblink).one_or_none()
+
+
+def create_or_update(*, db_session, monitor_in: MonitorCreate) -> Monitor:
+ """Creates or updates a monitor."""
+ monitor = get_by_weblink(db_session=db_session, weblink=monitor_in.weblink)
+ if monitor:
+ monitor = update(db_session=db_session, monitor=monitor, monitor_in=monitor_in)
+ else:
+ monitor = create(db_session=db_session, monitor_in=monitor_in)
+
+ return monitor
+
+
+def create(*, db_session, monitor_in: MonitorCreate) -> Monitor:
+ """Creates a new monitor."""
+ incident = incident_service.get(db_session=db_session, incident_id=monitor_in.incident.id)
+ plugin_instance = plugin_service.get_instance(
+ db_session=db_session, plugin_instance_id=monitor_in.plugin_instance.id
+ )
+ creator = participant_service.get(db_session=db_session, participant_id=monitor_in.creator.id)
+ monitor = Monitor(
+ **monitor_in.dict(exclude={"plugin_instance", "incident", "creator"}),
+ plugin_instance=plugin_instance,
+ incident=incident,
+ creator=creator,
+ )
+
+ db_session.add(monitor)
+ db_session.commit()
+ return monitor
+
+
+def update(*, db_session, monitor: Monitor, monitor_in: MonitorUpdate) -> Monitor:
+ """Updates a monitor."""
+ monitor_data = monitor.dict()
+ update_data = monitor_in.dict(exclude_unset=True)
+
+ for field in monitor_data:
+ if field in update_data:
+ setattr(monitor, field, update_data[field])
+
+ db_session.commit()
+ return monitor
+
+
+def delete(*, db_session, monitor_id: int):
+ """Deletes a monitor."""
+ db_session.query(Monitor).filter(Monitor.id == monitor_id).delete()
+ db_session.commit()
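`create_or_update` is a simple upsert keyed on `weblink`. A minimal in-memory sketch of the same flow, with a dict standing in for the database session:

```python
# Hypothetical in-memory version of the upsert above: look up an existing
# monitor by weblink, update it if found, otherwise create and store it.
def create_or_update(store: dict, monitor_in: dict) -> dict:
    existing = store.get(monitor_in["weblink"])
    if existing:
        existing.update(monitor_in)
        return existing
    store[monitor_in["weblink"]] = dict(monitor_in)
    return store[monitor_in["weblink"]]

store = {}
first = create_or_update(store, {"weblink": "https://example.com/m1", "enabled": True})
second = create_or_update(store, {"weblink": "https://example.com/m1", "enabled": False})
```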
diff --git a/src/dispatch/nlp.py b/src/dispatch/nlp.py
new file mode 100644
index 000000000000..aa759a90af49
--- /dev/null
+++ b/src/dispatch/nlp.py
@@ -0,0 +1,54 @@
+import logging
+
+import spacy
+from spacy.matcher import PhraseMatcher
+
+log = logging.getLogger(__name__)
+
+nlp = spacy.blank("en")
+nlp.vocab.lex_attr_getters = {}
+
+
+def build_term_vocab(terms: list[str]):
+ """Builds nlp vocabulary."""
+ for v in terms:
+ texts = [v, v.lower(), v.upper(), v.title()]
+ for t in texts:
+ if t: # guard against `None`
+ phrase = nlp.tokenizer(t)
+ for w in phrase:
+ _ = nlp.tokenizer.vocab[w.text]
+ yield phrase
+
+
+def build_phrase_matcher(name: str, phrases: list[str]) -> PhraseMatcher:
+ """Builds a PhraseMatcher object."""
+ matcher = PhraseMatcher(nlp.tokenizer.vocab)
+ matcher.add(name, phrases)
+ return matcher
+
+
+def extract_terms_from_text(text: str, matcher: PhraseMatcher) -> list[str]:
+ """Extracts key terms out of text."""
+ terms = []
+ doc = nlp.tokenizer(text)
+ for w in doc:
+ _ = doc.vocab[
+ w.text.lower()
+ ] # We normalize our docs so that vocab doesn't take so long to build.
+
+ matches = matcher(doc)
+ for _, start, end in matches:
+ with doc.retokenize() as retokenizer:
+ retokenizer.merge(doc[start:end])
+
+ # We try to filter out common stop words unless
+ # we have surrounding context that would suggest they are not stop words.
+ span = doc[start:end]
+ for token in span:
+ if token.is_stop:
+ continue
+
+ terms.append(token.text.lower())
+
+ return terms
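The vocabulary builder above expands each term into case variants so the matcher is case-tolerant. A stdlib approximation of just that expansion, with spaCy tokenization omitted:

```python
# Sketch of build_term_vocab's case handling: each term is expanded into its
# original, lower, upper, and title-cased variants, deduplicated in order.
def term_variants(terms):
    seen, out = set(), []
    for term in terms:
        for variant in (term, term.lower(), term.upper(), term.title()):
            if variant and variant not in seen:
                seen.add(variant)
                out.append(variant)
    return out

variants = term_variants(["phishing"])
```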
diff --git a/src/dispatch/notification/__init__.py b/src/dispatch/notification/__init__.py
new file mode 100644
index 000000000000..e69de29bb2d1
diff --git a/src/dispatch/notification/models.py b/src/dispatch/notification/models.py
new file mode 100644
index 000000000000..9f9b5cef923b
--- /dev/null
+++ b/src/dispatch/notification/models.py
@@ -0,0 +1,86 @@
+from datetime import datetime
+from sqlalchemy import Boolean, Column, Integer, String, ForeignKey, Table
+from sqlalchemy.orm import relationship
+from sqlalchemy.sql.schema import PrimaryKeyConstraint
+from sqlalchemy_utils import TSVectorType
+
+from dispatch.database.core import Base
+from dispatch.enums import DispatchEnum
+from dispatch.project.models import ProjectRead
+from dispatch.search_filter.models import SearchFilterRead, SearchFilterUpdate
+
+from dispatch.models import (
+ EvergreenBase,
+ EvergreenMixin,
+ TimeStampMixin,
+ ProjectMixin,
+ NameStr,
+ PrimaryKey,
+ Pagination,
+)
+
+
+class NotificationTypeEnum(DispatchEnum):
+ conversation = "conversation"
+ email = "email"
+
+
+assoc_notification_filters = Table(
+ "assoc_notification_filters",
+ Base.metadata,
+ Column("notification_id", Integer, ForeignKey("notification.id", ondelete="CASCADE")),
+ Column("search_filter_id", Integer, ForeignKey("search_filter.id", ondelete="CASCADE")),
+ PrimaryKeyConstraint("notification_id", "search_filter_id"),
+)
+
+
+class Notification(Base, TimeStampMixin, ProjectMixin, EvergreenMixin):
+ # Columns
+ id = Column(Integer, primary_key=True)
+ name = Column(String)
+ description = Column(String)
+ type = Column(String)
+ target = Column(String)
+ enabled = Column(Boolean, default=True)
+
+ # Relationships
+ filters = relationship(
+ "SearchFilter", secondary=assoc_notification_filters, backref="notifications"
+ )
+
+ search_vector = Column(
+ TSVectorType(
+ "name",
+ "description",
+ regconfig="pg_catalog.simple",
+ )
+ )
+
+
+# Pydantic models
+class NotificationBase(EvergreenBase):
+ name: NameStr
+ description: str | None = None
+ type: NotificationTypeEnum
+ target: str
+ enabled: bool | None = None
+
+
+class NotificationCreate(NotificationBase):
+ filters: list[SearchFilterRead | None] = []
+ project: ProjectRead
+
+
+class NotificationUpdate(NotificationBase):
+ filters: list[SearchFilterUpdate | None] = []
+
+
+class NotificationRead(NotificationBase):
+ id: PrimaryKey
+ created_at: datetime | None = None
+ updated_at: datetime | None = None
+ filters: list[SearchFilterRead | None] = []
+
+
+class NotificationPagination(Pagination):
+ items: list[NotificationRead] = []
diff --git a/src/dispatch/notification/service.py b/src/dispatch/notification/service.py
new file mode 100644
index 000000000000..855b9c150d77
--- /dev/null
+++ b/src/dispatch/notification/service.py
@@ -0,0 +1,163 @@
+import logging
+
+
+from dispatch.database.core import Base
+from dispatch.models import PrimaryKey
+from dispatch.plugin import service as plugin_service
+from dispatch.project import service as project_service
+from dispatch.search_filter import service as search_filter_service
+
+from .models import Notification, NotificationCreate, NotificationUpdate
+
+
+log = logging.getLogger(__name__)
+
+
+def get(*, db_session, notification_id: int) -> Notification | None:
+ """Gets a notification by id."""
+ return db_session.query(Notification).filter(Notification.id == notification_id).one_or_none()
+
+
+def get_all(*, db_session):
+ """Gets all notifications."""
+ return db_session.query(Notification)
+
+
+def get_all_enabled(*, db_session, project_id: int) -> list[Notification | None]:
+ """Gets all enabled notifications."""
+ return (
+ db_session.query(Notification)
+ .filter(Notification.enabled == True) # noqa Flake8 E712
+ .filter(Notification.project_id == project_id)
+ ).all()
+
+
+def get_overdue_evergreen_notifications(
+ *, db_session, project_id: int
+) -> list[Notification | None]:
+ """Returns all notifications that have not had a recent evergreen notification."""
+ query = (
+ db_session.query(Notification)
+ .filter(Notification.project_id == project_id)
+ .filter(Notification.evergreen == True) # noqa
+ .filter(Notification.overdue == True) # noqa
+ )
+ return query.all()
+
+
+def create(*, db_session, notification_in: NotificationCreate) -> Notification:
+ """Creates a new notification."""
+ filters = []
+ if notification_in.filters:
+ filters = [
+ search_filter_service.get(db_session=db_session, search_filter_id=f.id)
+ for f in notification_in.filters
+ ]
+
+ project = project_service.get_by_name_or_raise(
+ db_session=db_session, project_in=notification_in.project
+ )
+
+ notification = Notification(
+ **notification_in.dict(exclude={"filters", "project"}), filters=filters, project=project
+ )
+
+ db_session.add(notification)
+ db_session.commit()
+ return notification
+
+
+def update(
+ *, db_session, notification: Notification, notification_in: NotificationUpdate
+) -> Notification:
+ """Updates a notification."""
+ notification_data = notification.dict()
+ update_data = notification_in.dict(
+ exclude_unset=True,
+ exclude={"filters"},
+ )
+
+ for field in notification_data:
+ if field in update_data:
+ setattr(notification, field, update_data[field])
+
+ if notification_in.filters is not None:
+ filters = [
+ search_filter_service.get(db_session=db_session, search_filter_id=f.id)
+ for f in notification_in.filters
+ ]
+ notification.filters = filters
+
+ db_session.commit()
+ return notification
+
+
+def delete(*, db_session, notification_id: int):
+ """Deletes a notification."""
+ notification = (
+ db_session.query(Notification).filter(Notification.id == notification_id).one_or_none()
+ )
+ db_session.delete(notification)
+ db_session.commit()
+
+
+def send(
+ *, db_session, project_id: int, notification: Notification, notification_params: dict = None
+):
+ """Send a notification via plugin."""
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=project_id, plugin_type=notification.type
+ )
+
+ if plugin:
+ # Can raise exception "tenacity.RetryError: RetryError". (Email may still go through).
+ try:
+ plugin.instance.send(
+ notification.target,
+ notification_params["text"],
+ notification_params["template"],
+ notification_params["type"],
+ **notification_params["kwargs"],
+ )
+ except Exception as e:
+ log.exception(f"Error in sending {notification_params['type']}: {e}")
+ else:
+ log.warning(
+ f"Notification {notification.name} not sent. No {notification.type} plugin is active."
+ )
+
+
+def filter_and_send(
+ *,
+ db_session,
+ project_id: PrimaryKey,
+ class_instance: type[Base],
+ notification_params: dict = None,
+):
+ """Sends notifications."""
+ notifications = get_all_enabled(db_session=db_session, project_id=project_id)
+ for notification in notifications:
+ for search_filter in notification.filters:
+ match = search_filter_service.match(
+ db_session=db_session,
+ subject=search_filter.subject,
+ filter_spec=search_filter.expression,
+ class_instance=class_instance,
+ )
+ if match:
+ send(
+ db_session=db_session,
+ project_id=project_id,
+ notification=notification,
+ notification_params=notification_params,
+ )
+
+ if not notification.filters:
+ send(
+ db_session=db_session,
+ project_id=project_id,
+ notification=notification,
+ notification_params=notification_params,
+ )
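`filter_and_send` fans a single event out to every enabled notification: a notification fires once per matching filter, and notifications with no filters always fire. A plain-Python sketch, where the `matches` callable stands in for `search_filter_service.match`:

```python
# Sketch of the notification fan-out above. `notifications` are plain dicts
# standing in for ORM objects; `send` stands in for the plugin send call.
def filter_and_send(notifications, matches, send):
    for notification in notifications:
        for search_filter in notification["filters"]:
            if matches(search_filter):
                send(notification)
        if not notification["filters"]:
            send(notification)

sent = []
notifications = [
    {"name": "filtered", "filters": ["sev-high"]},
    {"name": "always", "filters": []},
]
filter_and_send(
    notifications,
    matches=lambda f: f == "sev-high",
    send=lambda n: sent.append(n["name"]),
)
```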
diff --git a/src/dispatch/notification/views.py b/src/dispatch/notification/views.py
new file mode 100644
index 000000000000..6962c254eefc
--- /dev/null
+++ b/src/dispatch/notification/views.py
@@ -0,0 +1,85 @@
+from fastapi import APIRouter, Depends, HTTPException, status
+
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.auth.permissions import SensitiveProjectActionPermission, PermissionsDependency
+from dispatch.models import PrimaryKey
+
+from .models import (
+ NotificationCreate,
+ NotificationPagination,
+ NotificationRead,
+ NotificationUpdate,
+)
+from .service import create, delete, get, update
+
+
+router = APIRouter()
+
+
+@router.get("", response_model=NotificationPagination)
+def get_notifications(common: CommonParameters):
+ """Get all notifications, or only those matching a given search term."""
+ return search_filter_sort_paginate(model="Notification", **common)
+
+
+@router.get("/{notification_id}", response_model=NotificationRead)
+def get_notification(db_session: DbSession, notification_id: PrimaryKey):
+ """Get a notification by its id."""
+ notification = get(db_session=db_session, notification_id=notification_id)
+ if not notification:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A notification with this id does not exist."}],
+ )
+ return notification
+
+
+@router.post(
+ "",
+ response_model=NotificationRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def create_notification(db_session: DbSession, notification_in: NotificationCreate):
+ """Create a notification."""
+ notification = create(db_session=db_session, notification_in=notification_in)
+ return notification
+
+
+@router.put(
+ "/{notification_id}",
+ response_model=NotificationRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def update_notification(
+ db_session: DbSession,
+ notification_id: PrimaryKey,
+ notification_in: NotificationUpdate,
+):
+ """Update a notification by its id."""
+ notification = get(db_session=db_session, notification_id=notification_id)
+ if not notification:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A notification with this id does not exist."}],
+ )
+ notification = update(
+ db_session=db_session, notification=notification, notification_in=notification_in
+ )
+ return notification
+
+
+@router.delete(
+ "/{notification_id}",
+ response_model=None,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def delete_notification(db_session: DbSession, notification_id: PrimaryKey):
+ """Delete a notification, returning only an HTTP 200 OK if successful."""
+ notification = get(db_session=db_session, notification_id=notification_id)
+ if not notification:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A notification with this id does not exist."}],
+ )
+ delete(db_session=db_session, notification_id=notification_id)
diff --git a/src/dispatch/organization/__init__.py b/src/dispatch/organization/__init__.py
new file mode 100644
index 000000000000..e69de29bb2d1
diff --git a/src/dispatch/organization/models.py b/src/dispatch/organization/models.py
new file mode 100644
index 000000000000..8455f9daf117
--- /dev/null
+++ b/src/dispatch/organization/models.py
@@ -0,0 +1,74 @@
+"""Models for organization resources in the Dispatch application."""
+
+from slugify import slugify
+
+from sqlalchemy.event import listen
+from sqlalchemy import Column, Integer, String, Boolean
+from sqlalchemy_utils import TSVectorType
+
+from dispatch.database.core import Base
+from dispatch.models import DispatchBase, NameStr, OrganizationSlug, PrimaryKey, Pagination
+
+
+class Organization(Base):
+ """SQLAlchemy model for organization resources."""
+ __table_args__ = {"schema": "dispatch_core"}
+
+ id = Column(Integer, primary_key=True)
+ name = Column(String, unique=True)
+ slug = Column(String)
+ default = Column(Boolean)
+ description = Column(String)
+ banner_enabled = Column(Boolean)
+ banner_color = Column(String)
+ banner_text = Column(String)
+
+ search_vector = Column(
+ TSVectorType("name", "description", weights={"name": "A", "description": "B"})
+ )
+
+
+def generate_slug(target, value, oldvalue, initiator):
+ """Creates a reasonable slug based on organization name."""
+ if value and (not target.slug or value != oldvalue):
+ target.slug = slugify(value, separator="_")
+
+
+listen(Organization.name, "set", generate_slug)
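`generate_slug` derives the slug from the organization name via the `python-slugify` package with `separator="_"`. A rough stdlib stand-in for that transformation, not the package's exact algorithm:

```python
import re

# Illustrative substitute for slugify(value, separator="_"): lower-case,
# collapse runs of non-alphanumerics into underscores, trim the ends.
def simple_slug(value: str) -> str:
    return re.sub(r"[^a-z0-9]+", "_", value.lower()).strip("_")

slug = simple_slug("Default Organization!")
```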
+
+
+class OrganizationBase(DispatchBase):
+ """Base Pydantic model for organization resources."""
+ id: PrimaryKey | None = None
+ name: NameStr
+ description: str | None = None
+ default: bool | None = False
+ banner_enabled: bool | None = False
+ banner_color: str | None = None
+ banner_text: NameStr | None = None
+
+
+class OrganizationCreate(OrganizationBase):
+ """Pydantic model for creating an organization resource."""
+ pass
+
+
+class OrganizationUpdate(DispatchBase):
+ """Pydantic model for updating an organization resource."""
+ id: PrimaryKey | None = None
+ description: str | None = None
+ default: bool | None = False
+ banner_enabled: bool | None = False
+ banner_color: str | None = None
+ banner_text: NameStr | None = None
+
+
+class OrganizationRead(OrganizationBase):
+ """Pydantic model for reading an organization resource."""
+ id: PrimaryKey | None = None
+ slug: OrganizationSlug | None = None
+
+
+class OrganizationPagination(Pagination):
+ """Pydantic model for paginated organization results."""
+ items: list[OrganizationRead] = []
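The `generate_slug` listener in the model above fires on every `name` set. A stdlib-only sketch of its behavior (the real model delegates to the `python-slugify` package, so this regex version is only an approximation for plain ASCII names):

```python
import re


def generate_slug(name: str) -> str:
    # Approximate slugify(name, separator="_") for ASCII input:
    # lowercase, split on non-alphanumerics, join with underscores.
    return "_".join(re.findall(r"[a-z0-9]+", name.lower()))


class Org:
    def __init__(self):
        self.slug = None


def on_name_set(target: Org, value: str, oldvalue) -> None:
    # Mirrors the SQLAlchemy listener: only rewrite the slug when no slug
    # exists yet or the name actually changed.
    if value and (not target.slug or value != oldvalue):
        target.slug = generate_slug(value)
```

For example, setting the name to `"Security Engineering"` yields the slug `security_engineering`, which is the value the unique-slug check in the views layer later compares against.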
diff --git a/src/dispatch/organization/service.py b/src/dispatch/organization/service.py
new file mode 100644
index 000000000000..a34771aeb783
--- /dev/null
+++ b/src/dispatch/organization/service.py
@@ -0,0 +1,168 @@
+from pydantic import ValidationError
+from sqlalchemy.sql.expression import true
+
+from dispatch.auth.models import DispatchUser, DispatchUserOrganization
+from dispatch.database.core import engine
+from dispatch.database.manage import init_schema
+from dispatch.enums import UserRoles
+
+from .models import Organization, OrganizationCreate, OrganizationRead, OrganizationUpdate
+
+
+def get(*, db_session, organization_id: int) -> Organization | None:
+ """Gets an organization."""
+ return db_session.query(Organization).filter(Organization.id == organization_id).first()
+
+
+def get_default(*, db_session) -> Organization | None:
+ """Gets the default organization."""
+ return db_session.query(Organization).filter(Organization.default == true()).one_or_none()
+
+
+def get_default_or_raise(*, db_session) -> Organization:
+ """Returns the default organization or raises a ValidationError if one doesn't exist."""
+ organization = get_default(db_session=db_session)
+
+ if not organization:
+ raise ValidationError(
+ [
+ {
+ "loc": ("organization",),
+ "msg": "No default organization defined.",
+ "type": "value_error",
+ }
+ ]
+ )
+ return organization
+
+
+def get_by_name(*, db_session, name: str) -> Organization | None:
+ """Gets an organization by its name."""
+ return db_session.query(Organization).filter(Organization.name == name).one_or_none()
+
+
+def get_by_name_or_raise(*, db_session, organization_in: OrganizationRead) -> Organization:
+ """Returns the organization specified or raises ValidationError."""
+ organization = get_by_name(db_session=db_session, name=organization_in.name)
+
+ if not organization:
+ raise ValidationError(
+ [
+ {
+ "msg": "Organization not found.",
+ "organization": organization_in.name,
+ "loc": "organization",
+ }
+ ],
+ model=OrganizationRead,
+ )
+
+ return organization
+
+
+def get_by_slug(*, db_session, slug: str) -> Organization | None:
+ """Gets an organization by its slug."""
+ return db_session.query(Organization).filter(Organization.slug == slug).one_or_none()
+
+
+def get_by_slug_or_raise(*, db_session, organization_in: OrganizationRead) -> Organization:
+ """Returns the organization specified or raises ValidationError."""
+ organization = get_by_slug(db_session=db_session, slug=organization_in.slug)
+
+ if not organization:
+ raise ValidationError(
+ [
+ {
+ "msg": "Organization not found.",
+ "organization": organization_in.name,
+ "loc": "organization",
+ }
+ ],
+ model=OrganizationRead,
+ )
+
+ return organization
+
+
+def get_by_name_or_default(*, db_session, organization_in: OrganizationRead) -> Organization:
+ """Returns an organization based on its name, or the default if not specified."""
+ if organization_in and organization_in.name:
+ organization = get_by_name(db_session=db_session, name=organization_in.name)
+ if organization:
+ return organization
+ return get_default_or_raise(db_session=db_session)
+
+
+def get_all(*, db_session) -> list[Organization | None]:
+ """Gets all organizations."""
+ return db_session.query(Organization)
+
+
+def create(*, db_session, organization_in: OrganizationCreate) -> Organization:
+ """Creates an organization."""
+ organization = Organization(
+ **organization_in.dict(exclude={"banner_color"}),
+ )
+
+ if organization_in.banner_color:
+ organization.banner_color = organization_in.banner_color
+
+ # we let the new schema session create the organization
+ organization = init_schema(engine=engine, organization=organization)
+ return organization
+
+
+def get_or_create(*, db_session, organization_in: OrganizationCreate) -> Organization:
+ """Gets an existing or creates a new organization."""
+ if organization_in.id:
+ q = db_session.query(Organization).filter(Organization.id == organization_in.id)
+ else:
+ q = db_session.query(Organization).filter_by(**organization_in.dict(exclude={"id"}))
+
+ instance = q.first()
+ if instance:
+ return instance
+
+ return create(db_session=db_session, organization_in=organization_in)
+
+
+def update(
+ *, db_session, organization: Organization, organization_in: OrganizationUpdate
+) -> Organization:
+ """Updates an organization."""
+ organization_data = organization.dict()
+
+ update_data = organization_in.dict(exclude_unset=True, exclude={"banner_color"})
+
+ for field in organization_data:
+ if field in update_data:
+ setattr(organization, field, update_data[field])
+
+ if organization_in.banner_color:
+ organization.banner_color = organization_in.banner_color
+
+ db_session.commit()
+ return organization
+
+
+def delete(*, db_session, organization_id: int):
+ """Deletes an organization."""
+ organization = db_session.query(Organization).filter(Organization.id == organization_id).first()
+ db_session.delete(organization)
+ db_session.commit()
+
+
+def add_user(
+ *,
+ db_session,
+ user: DispatchUser,
+ organization: Organization,
+ role: UserRoles = UserRoles.member,
+):
+ """Adds a user to an organization."""
+ db_session.add(
+ DispatchUserOrganization(
+ dispatch_user_id=user.id, organization_id=organization.id, role=role
+ )
+ )
+ db_session.commit()
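`get_or_create` in the service above queries by primary key when an `id` is supplied and by exact field match otherwise. A dictionary-backed sketch of that lookup order (illustrative only; the real function runs against the SQLAlchemy session and hands creation off to `init_schema`):

```python
def get_or_create(store: list, organization_in: dict) -> dict:
    # Prefer a primary-key lookup when the caller supplied an id.
    if organization_in.get("id") is not None:
        match = next((o for o in store if o["id"] == organization_in["id"]), None)
    else:
        # Otherwise match on every field except the id.
        fields = {k: v for k, v in organization_in.items() if k != "id"}
        match = next(
            (o for o in store if all(o.get(k) == v for k, v in fields.items())),
            None,
        )
    if match is not None:
        return match
    created = dict(organization_in, id=len(store) + 1)
    store.append(created)
    return created
```

Calling it twice with the same payload returns the stored record the second time instead of creating a duplicate.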
diff --git a/src/dispatch/organization/views.py b/src/dispatch/organization/views.py
new file mode 100644
index 000000000000..fd13317e4eec
--- /dev/null
+++ b/src/dispatch/organization/views.py
@@ -0,0 +1,137 @@
+from fastapi import APIRouter, Depends, HTTPException, status
+from slugify import slugify
+from pydantic import ValidationError
+
+from sqlalchemy.exc import IntegrityError
+
+from dispatch.auth.permissions import (
+ OrganizationOwnerPermission,
+ PermissionsDependency,
+)
+from dispatch.auth.service import CurrentUser
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.enums import UserRoles
+from dispatch.models import PrimaryKey
+from dispatch.project import flows as project_flows
+from dispatch.project import service as project_service
+from dispatch.project.models import ProjectCreate
+
+from .models import (
+ OrganizationCreate,
+ OrganizationRead,
+ OrganizationUpdate,
+ OrganizationPagination,
+)
+from .service import create, get, get_by_name, get_by_slug, update, add_user
+
+
+router = APIRouter()
+
+
+@router.get("", response_model=OrganizationPagination)
+def get_organizations(common: CommonParameters):
+ """Get all organizations."""
+ return search_filter_sort_paginate(model="Organization", **common)
+
+
+@router.post(
+ "",
+ response_model=OrganizationRead,
+)
+def create_organization(
+ db_session: DbSession,
+ organization_in: OrganizationCreate,
+ current_user: CurrentUser,
+):
+ """Create a new organization."""
+ if not organization_in.name:
+ raise HTTPException(
+ status_code=status.HTTP_400_BAD_REQUEST,
+ detail=[{"msg": "An organization name is required."}],
+ )
+ organization = get_by_name(db_session=db_session, name=organization_in.name)
+ if organization:
+ raise HTTPException(
+ status_code=status.HTTP_409_CONFLICT,
+ detail=[{"msg": "An organization with this name already exists."}],
+ )
+ if organization_in.id and get(db_session=db_session, organization_id=organization_in.id):
+ raise HTTPException(
+ status_code=status.HTTP_409_CONFLICT,
+ detail=[{"msg": "An organization with this id already exists."}],
+ )
+ slug = slugify(organization_in.name, separator="_")
+ if get_by_slug(db_session=db_session, slug=slug):
+ raise HTTPException(
+ status_code=status.HTTP_409_CONFLICT,
+ detail=[{"msg": "An organization with this slug already exists."}],
+ )
+ # we create the organization
+ organization = create(db_session=db_session, organization_in=organization_in)
+
+ # we add the creator as organization owner
+ add_user(
+ db_session=db_session, organization=organization, user=current_user, role=UserRoles.owner
+ )
+
+ # we create the default project
+ project_in = ProjectCreate(
+ name="default",
+ default=True,
+ description="Default Dispatch project.",
+ organization=organization,
+ )
+ project = project_service.create(db_session=db_session, project_in=project_in)
+
+ # we initialize the default project
+ project_flows.project_init_flow(
+ project_id=project.id, organization_slug=organization.slug, db_session=db_session
+ )
+
+ return organization
+
+
+@router.get("/{organization_id}", response_model=OrganizationRead)
+def get_organization(db_session: DbSession, organization_id: PrimaryKey):
+ """Get an organization."""
+ organization = get(db_session=db_session, organization_id=organization_id)
+ if not organization:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "An organization with this id does not exist."}],
+ )
+ return organization
+
+
+@router.put(
+ "/{organization_id}",
+ response_model=OrganizationRead,
+ dependencies=[Depends(PermissionsDependency([OrganizationOwnerPermission]))],
+)
+def update_organization(
+ db_session: DbSession,
+ organization_id: PrimaryKey,
+ organization_in: OrganizationUpdate,
+):
+ """Update an organization."""
+ organization = get(db_session=db_session, organization_id=organization_id)
+ if not organization:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "An organization with this id does not exist."}],
+ )
+ try:
+ organization = update(
+ db_session=db_session, organization=organization, organization_in=organization_in
+ )
+ except IntegrityError:
+ raise ValidationError(
+ [
+ {
+ "msg": "An organization with this name already exists.",
+ "loc": "name",
+ }
+ ],
+ ) from None
+ return organization
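The `create_organization` route above rejects a request on three grounds before creating anything: missing name, duplicate name or explicit id, and duplicate derived slug. A sketch of that gatekeeping as a standalone function (a hypothetical helper over plain dicts; the real route raises `HTTPException` with 400/409 statuses):

```python
def validate_new_organization(existing: list, org_in: dict):
    """Return an error message mirroring the route's checks, or None if OK."""
    name = org_in.get("name")
    if not name:
        return "An organization name is required."
    if any(o["name"] == name for o in existing):
        return "An organization with this name already exists."
    if org_in.get("id") is not None and any(o["id"] == org_in["id"] for o in existing):
        return "An organization with this id already exists."
    # Simplified stand-in for slugify(name, separator="_").
    slug = "_".join(name.lower().split())
    if any(o.get("slug") == slug for o in existing):
        return "An organization with this slug already exists."
    return None
```

Only after all checks pass does the route create the organization, add the creator as owner, and bootstrap the default project.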
diff --git a/src/dispatch/participant/flows.py b/src/dispatch/participant/flows.py
index 4b48390591da..9881ff116db2 100644
--- a/src/dispatch/participant/flows.py
+++ b/src/dispatch/participant/flows.py
@@ -1,84 +1,174 @@
import logging
-from dispatch.config import INCIDENT_PLUGIN_CONTACT_SLUG
-from dispatch.database import SessionLocal
-from dispatch.incident import service as incident_service
+from sqlalchemy.orm import Session
+
+from dispatch.case.models import Case
+from dispatch.database.core import get_table_name_by_class_instance
+from dispatch.enums import EventType
+from dispatch.event import service as event_service
+from dispatch.incident.models import Incident
from dispatch.individual import service as individual_service
+from dispatch.participant import service as participant_service
+from dispatch.participant.models import Participant
from dispatch.participant_role import service as participant_role_service
-from dispatch.participant_role.models import ParticipantRoleType, ParticipantRoleCreate
-from dispatch.plugins.base import plugins
-
-from .service import get_or_create, get_by_incident_id_and_email
-
+from dispatch.participant_role.models import (
+ ParticipantRoleCreate,
+ ParticipantRoleType,
+)
+from dispatch.service import service as service_service
+from typing import TypeVar
log = logging.getLogger(__name__)
+Subject = TypeVar("Subject", Case, Incident)
-def add_participant(
- user_email: str, incident_id: id, db_session: SessionLocal, role: ParticipantRoleType = None
-):
- """Adds a participant."""
- # We load the incident
- incident = incident_service.get(db_session=db_session, incident_id=incident_id)
-
- # We get or create a new individual
- individual = individual_service.get_or_create(db_session=db_session, email=user_email)
- # We create a role for the participant
- participant_role_in = ParticipantRoleCreate(role=role)
- participant_role = participant_role_service.create(
- db_session=db_session, participant_role_in=participant_role_in
+def add_participant(
+ user_email: str,
+ subject: Subject,
+ db_session: Session,
+ service_id: int = None,
+ roles: list[str] | None = None,
+) -> Participant:
+ """Adds a participant to an incident or a case."""
+ # we get or create a new individual
+ individual = individual_service.get_or_create(
+ db_session=db_session, project=subject.project, email=user_email
)
- # We get or create a new participant
- participant = get_or_create(
+ # we get or create a new participant
+ subject_type = get_table_name_by_class_instance(subject)
+
+ # set up roles
+ if roles is None:
+ roles = [ParticipantRoleType.observer]
+ participant_roles = [ParticipantRoleCreate(role=role) for role in roles]
+ participant = participant_service.get_or_create(
db_session=db_session,
- incident_id=incident.id,
+ subject_id=subject.id,
+ subject_type=subject_type,
individual_id=individual.id,
- participant_roles=[participant_role],
+ service_id=service_id,
+ participant_roles=participant_roles,
)
individual.participant.append(participant)
- incident.participants.append(participant)
-
- # We add and commit the changes
+ subject.participants.append(participant)
+
+ # TODO: Split this assignment depending on Obj type
+ # we update the commander, reporter, scribe, or liaison foreign key
+ for role in roles:
+ if role == ParticipantRoleType.incident_commander:
+ subject.commander_id = participant.id
+ subject.commanders_location = participant.location
+ elif role == ParticipantRoleType.reporter:
+ subject.reporter_id = participant.id
+ subject.reporters_location = participant.location
+ elif role == ParticipantRoleType.scribe:
+ subject.scribe_id = participant.id
+ elif role == ParticipantRoleType.liaison:
+ subject.liaison_id = participant.id
+ elif role == ParticipantRoleType.observer:
+ subject.observer_id = participant.id
+ elif role == ParticipantRoleType.assignee:
+ subject.assignee_id = participant.id
+
+ # we add and commit the changes
+ db_session.add(participant)
db_session.add(individual)
- db_session.add(incident)
+ db_session.add(subject)
db_session.commit()
- log.debug(
- f"{individual.name} with email address {individual.email} has been added to incident id {incident.id} with role {participant_role.role}."
- )
+ if subject_type == "case":
+ event_service.log_case_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"{individual.name} added to case with role(s): {', '.join([role.role.value for role in participant_roles])}",
+ case_id=subject.id,
+ )
+ if subject_type == "incident":
+ event_service.log_incident_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"{individual.name} added to incident with role(s): {', '.join([role.role.value for role in participant_roles])}",
+ incident_id=subject.id,
+ type=EventType.participant_updated,
+ )
return participant
-def remove_participant(user_email: str, incident_id: int, db_session: SessionLocal):
+def remove_participant(user_email: str, incident: Incident, db_session: Session):
"""Removes a participant."""
- # We load the incident
- incident = incident_service.get(db_session=db_session, incident_id=incident_id)
+ inactivated = inactivate_participant(user_email, incident, db_session)
+
+ if inactivated:
+ participant = participant_service.get_by_incident_id_and_email(
+ db_session=db_session, incident_id=incident.id, email=user_email
+ )
- # We get information about the individual
- contact_plugin = plugins.get(INCIDENT_PLUGIN_CONTACT_SLUG)
- individual_info = contact_plugin.get(user_email)
- individual_fullname = individual_info["fullname"]
+ log.debug(f"Removing {participant.individual.name} from {incident.name} incident...")
- log.debug(f"Removing {individual_fullname} from incident {incident.name}...")
+ participant.service = None
- participant = get_by_incident_id_and_email(
- db_session=db_session, incident_id=incident_id, email=user_email
- )
+ db_session.add(participant)
+ db_session.commit()
+
+ event_service.log_incident_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"{participant.individual.name} has been removed",
+ incident_id=incident.id,
+ type=EventType.participant_updated,
+ )
+
+
+def remove_case_participant(user_email: str, case: Case, db_session: Session):
+ """Removes a participant from a case."""
+ inactivated = inactivate_participant(user_email, case, db_session)
+
+ if inactivated:
+ participant = participant_service.get_by_case_id_and_email(
+ db_session=db_session, case_id=case.id, email=user_email
+ )
+
+ log.debug(f"Removing {participant.individual.name} from {case.name} case...")
+
+ participant.service = None
+
+ db_session.add(participant)
+ db_session.commit()
+
+ event_service.log_subject_event(
+ subject=case,
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"{participant.individual.name} has been removed",
+ type=EventType.participant_updated,
+ )
+
+
+def inactivate_participant(user_email: str, subject: Subject, db_session: Session):
+ """Inactivates a participant."""
+ subject_type = get_table_name_by_class_instance(subject)
+
+ if subject_type == "case":
+ participant = participant_service.get_by_case_id_and_email(
+ db_session=db_session, case_id=subject.id, email=user_email
+ )
+ else:
+ participant = participant_service.get_by_incident_id_and_email(
+ db_session=db_session, incident_id=subject.id, email=user_email
+ )
if not participant:
log.debug(
- f"Can't remove {individual_fullname}. They're not an active participant of incident {incident.name}."
+ f"Can't inactivate participant with email {user_email}. They're not a participant of {subject.name} {subject_type}."
)
return False
- # We mark the participant as inactive
- participant.is_active = False
+ log.debug(f"Inactivating {participant.individual.name} from {subject.name} {subject_type}...")
- # We make the participant renounce to their active roles
participant_active_roles = participant_role_service.get_all_active_roles(
db_session=db_session, participant_id=participant.id
)
@@ -87,51 +177,61 @@ def remove_participant(user_email: str, incident_id: int, db_session: SessionLoc
db_session=db_session, participant_role=participant_active_role
)
- # We add and commit the changes
- db_session.add(participant)
- db_session.commit()
-
- log.debug(f"Participant {participant.individual.name} has been removed from the incident.")
-
+ event_service.log_subject_event(
+ subject=subject,
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"{participant.individual.name} has been inactivated",
+ type=EventType.participant_updated,
+ )
return True
-def reactivate_participant(user_email: str, incident_id: int, db_session: SessionLocal):
+def reactivate_participant(
+ user_email: str, subject: Subject, db_session: Session, service_id: int = None
+):
"""Reactivates a participant."""
- # We load the incident
- incident = incident_service.get(db_session=db_session, incident_id=incident_id)
-
- # We get information about the individual
- contact_plugin = plugins.get(INCIDENT_PLUGIN_CONTACT_SLUG)
- individual_info = contact_plugin.get(user_email)
- individual_fullname = individual_info["fullname"]
+ subject_type = get_table_name_by_class_instance(subject)
- log.debug(f"Reactivating {individual_fullname} on incident {incident.name}...")
-
- participant = get_by_incident_id_and_email(
- db_session=db_session, incident_id=incident_id, email=user_email
- )
+ if subject_type == "case":
+ participant = participant_service.get_by_case_id_and_email(
+ db_session=db_session, case_id=subject.id, email=user_email
+ )
+ else:
+ participant = participant_service.get_by_incident_id_and_email(
+ db_session=db_session, incident_id=subject.id, email=user_email
+ )
if not participant:
- log.debug(
- f"{individual_fullname} is not an inactive participant of incident {incident.name}."
- )
+ log.debug(f"{user_email} is not an inactive participant of {subject.name} {subject_type}.")
return False
- # We mark the participant as active
- participant.is_active = True
+ log.debug(f"Reactivating {participant.individual.name} on {subject.name} {subject_type}...")
- # We create a role for the participant
- participant_role_in = ParticipantRoleCreate(role=ParticipantRoleType.participant)
+ # we get the last active role
+ participant_role = participant_role_service.get_last_active_role(
+ db_session=db_session, participant_id=participant.id
+ )
+ # we create a new role based on the last active role
+ participant_role_in = ParticipantRoleCreate(role=participant_role.role)
participant_role = participant_role_service.create(
db_session=db_session, participant_role_in=participant_role_in
)
- participant.participant_role.append(participant_role)
+ participant.participant_roles.append(participant_role)
+
+ if service_id:
+ service = service_service.get(db_session=db_session, service_id=service_id)
+ participant.service = service
- # We add and commit the changes
db_session.add(participant)
db_session.commit()
- log.debug(f"{individual_fullname} has been reactivated.")
+ event_service.log_subject_event(
+ subject=subject,
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"{participant.individual.name} has been reactivated",
+ type=EventType.participant_updated,
+ )
return True
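In `add_participant` above, certain roles also pin the participant onto a dedicated foreign key of the subject (commander, reporter, scribe, liaison, observer, assignee). A table-driven sketch of that if/elif ladder (plain role strings stand in for the `ParticipantRoleType` enum members):

```python
from dataclasses import dataclass
from typing import Optional

# Role name -> subject attribute, mirroring the branch in add_participant.
ROLE_TO_ATTR = {
    "incident_commander": "commander_id",
    "reporter": "reporter_id",
    "scribe": "scribe_id",
    "liaison": "liaison_id",
    "observer": "observer_id",
    "assignee": "assignee_id",
}


@dataclass
class Subject:
    commander_id: Optional[int] = None
    reporter_id: Optional[int] = None
    scribe_id: Optional[int] = None
    liaison_id: Optional[int] = None
    observer_id: Optional[int] = None
    assignee_id: Optional[int] = None


def assign_role_keys(subject: Subject, participant_id: int, roles: list) -> Subject:
    # Each recognized role points the matching foreign-key column at the
    # participant; unrecognized roles are simply ignored.
    for role in roles:
        attr = ROLE_TO_ATTR.get(role)
        if attr:
            setattr(subject, attr, participant_id)
    return subject
```

The dictionary form makes the mapping easy to extend, which is presumably why the real code carries a TODO to split the assignment by subject type.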
diff --git a/src/dispatch/participant/models.py b/src/dispatch/participant/models.py
index 32a02e9c016d..0555b8705814 100644
--- a/src/dispatch/participant/models.py
+++ b/src/dispatch/participant/models.py
@@ -1,65 +1,98 @@
-from datetime import datetime
-
-from typing import Optional, List
-
-from sqlalchemy.orm import relationship
-from sqlalchemy import Column, Boolean, String, Integer, ForeignKey, DateTime, event
-
-from dispatch.database import Base
-from dispatch.models import DispatchBase
-from dispatch.participant_role.models import ParticipantRoleCreate
+from sqlalchemy.orm import relationship, backref
+from sqlalchemy import Column, Boolean, String, Integer, ForeignKey, select
+from sqlalchemy.ext.hybrid import hybrid_property
+
+from dispatch.database.core import Base
+from dispatch.models import DispatchBase, PrimaryKey, Pagination
+from dispatch.participant_role.models import (
+ ParticipantRoleCreate,
+ ParticipantRoleRead,
+ ParticipantRoleReadMinimal,
+ ParticipantRole,
+)
+from dispatch.service.models import ServiceRead
+from dispatch.individual.models import IndividualContactRead, IndividualContactReadMinimal
class Participant(Base):
+ # columns
id = Column(Integer, primary_key=True)
- is_active = Column(Boolean, default=True)
- active_at = Column(DateTime, default=datetime.utcnow)
- inactive_at = Column(DateTime)
- incident_id = Column(Integer, ForeignKey("incident.id"))
- individual_contact_id = Column(Integer, ForeignKey("individual_contact.id"))
- location = Column(String)
team = Column(String)
department = Column(String)
- participant_role = relationship("ParticipantRole", lazy="subquery", backref="participant")
- status_reports = relationship("StatusReport", backref="participant")
-
- @staticmethod
- def _active_at(mapper, connection, target):
- if target.is_active:
- target.inactive_at = None
-
- @staticmethod
- def _inactive_at(mapper, connection, target):
- if not target.is_active:
- target.inactive_at = datetime.utcnow()
-
- @classmethod
- def __declare_last__(cls):
- event.listen(cls, "before_update", cls._active_at)
- event.listen(cls, "before_update", cls._inactive_at)
+ location = Column(String)
+ added_by_id = Column(Integer, ForeignKey("participant.id"))
+ added_by = relationship(
+ "Participant", backref=backref("added_participant"), remote_side=[id], post_update=True
+ )
+ added_reason = Column(String)
+ after_hours_notification = Column(Boolean, default=False)
+ user_conversation_id = Column(String)
+
+ # relationships
+ feedback = relationship("Feedback", backref="participant")
+ incident_id = Column(Integer, ForeignKey("incident.id", ondelete="CASCADE", use_alter=True))
+ case_id = Column(Integer, ForeignKey("case.id", ondelete="CASCADE", use_alter=True))
+ individual = relationship("IndividualContact", lazy="subquery", backref="participant")
+ individual_contact_id = Column(Integer, ForeignKey("individual_contact.id", ondelete="CASCADE"))
+ participant_roles = relationship(
+ "ParticipantRole", backref="participant", lazy="subquery", cascade="all, delete-orphan"
+ )
+ reports = relationship("Report", backref="participant")
+ service = relationship("Service", backref="participant")
+ service_id = Column(Integer, ForeignKey("service.id", ondelete="CASCADE"))
+ created_tasks = relationship(
+ "Task", backref="creator", primaryjoin="Participant.id==Task.creator_id"
+ )
+ owned_tasks = relationship("Task", backref="owner", primaryjoin="Participant.id==Task.owner_id")
+
+ @hybrid_property
+ def active_roles(self):
+ roles = []
+ if self.participant_roles:
+ for pr in self.participant_roles:
+ if not pr.renounced_at:
+ roles.append(pr)
+ return roles
+
+ @active_roles.expression
+ def active_roles(cls):
+ return (
+ select([Participant])
+ .where(Participant.incident_id == cls.id)
+ .where(ParticipantRole.renounced_at == None) # noqa
+ )
class ParticipantBase(DispatchBase):
- pass
+ location: str | None = None
+ team: str | None = None
+ department: str | None = None
+ added_reason: str | None = None
class ParticipantCreate(ParticipantBase):
- participant_role: Optional[List[ParticipantRoleCreate]] = []
- location: Optional[str]
- team: Optional[str]
- department: Optional[str]
+ participant_roles: list[ParticipantRoleCreate] | None = []
+ location: str | None = None
+ team: str | None = None
+ department: str | None = None
+ service: ServiceRead | None = None
class ParticipantUpdate(ParticipantBase):
- pass
+ individual: IndividualContactRead | None = None
class ParticipantRead(ParticipantBase):
- id: int
- active_at: Optional[datetime] = None
- inactive_at: Optional[datetime] = None
+ id: PrimaryKey
+ participant_roles: list[ParticipantRoleRead] | None = []
+ individual: IndividualContactRead | None = None
+
+
+class ParticipantReadMinimal(ParticipantBase):
+ id: PrimaryKey
+ participant_roles: list[ParticipantRoleReadMinimal] | None = []
+ individual: IndividualContactReadMinimal | None = None
-class ParticipantPagination(DispatchBase):
- total: int
- items: List[ParticipantRead] = []
+class ParticipantPagination(Pagination):
+ items: list[ParticipantRead] = []
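The `active_roles` hybrid property above treats a role as active while its `renounced_at` timestamp is unset. In plain Python terms, over dicts rather than ORM rows:

```python
def active_roles(participant_roles: list) -> list:
    # A role stays active until renounced_at is set, matching the
    # hybrid property's filter on ParticipantRole.renounced_at.
    return [r for r in participant_roles if r.get("renounced_at") is None]
```

Inactivating a participant therefore amounts to renouncing each of their active roles, which is exactly what `inactivate_participant` in the flows module does.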
diff --git a/src/dispatch/participant/service.py b/src/dispatch/participant/service.py
index ee925d6c9cf4..ea873a4bd253 100644
--- a/src/dispatch/participant/service.py
+++ b/src/dispatch/participant/service.py
@@ -1,134 +1,267 @@
-from typing import List, Optional
+"""
+.. module: dispatch.participant.service
+ :platform: Unix
+ :copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
+ :license: Apache, see LICENSE for more details.
+"""
-from fastapi.encoders import jsonable_encoder
+import logging
-from dispatch.config import INCIDENT_PLUGIN_CONTACT_SLUG
+from sqlalchemy.orm import Session
+
+from dispatch.case import service as case_service
+from dispatch.decorators import timer
+from dispatch.incident import service as incident_service
from dispatch.individual import service as individual_service
from dispatch.individual.models import IndividualContact
from dispatch.participant_role import service as participant_role_service
-from dispatch.participant_role.models import ParticipantRole, ParticipantRoleType
-from dispatch.plugins.base import plugins
-
+from dispatch.participant_role.models import ParticipantRole, ParticipantRoleCreate
+from dispatch.plugin import service as plugin_service
+from dispatch.service import service as service_service
from .models import Participant, ParticipantCreate, ParticipantUpdate
-def get(*, db_session, participant_id: int) -> Optional[Participant]:
- """
- Get a participant by id.
- """
+log = logging.getLogger(__name__)
+
+
+def get(*, db_session: Session, participant_id: int) -> Participant | None:
+ """Returns a participant based on the given participant id."""
return db_session.query(Participant).filter(Participant.id == participant_id).first()
def get_by_individual_contact_id(
- *, db_session, individual_contact_id: int
-) -> Optional[Participant]:
- """
- Get a participant by individual contact id.
- """
+ *, db_session: Session, individual_id: int
+) -> list[Participant | None]:
+ """Returns all participants with the given individual contact id."""
return (
db_session.query(Participant)
- .filter(Participant.individual_contact_id == individual_contact_id)
- .first()
+ .filter(Participant.individual_contact_id == individual_id)
+ .all()
)
+def get_by_incident_id(*, db_session: Session, incident_id: int) -> list[Participant | None]:
+ """Returns all participants for the given incident id."""
+ return db_session.query(Participant).filter(Participant.incident_id == incident_id).all()
+
+
def get_by_incident_id_and_role(
- *, db_session, incident_id: int, role: str
-) -> Optional[Participant]:
- """
- Get a participant by incident id and role name.
- """
+ *, db_session: Session, incident_id: int, role: str
+) -> Participant | None:
+ """Returns the participant that holds the given active role for the given incident id."""
return (
db_session.query(Participant)
+ .join(ParticipantRole)
.filter(Participant.incident_id == incident_id)
+ .filter(ParticipantRole.renounced_at.is_(None))
+ .filter(ParticipantRole.role == role)
+ .one_or_none()
+ )
+
+
+def get_by_case_id_and_role(
+ *, db_session: Session, case_id: int, role: str
+) -> Participant | None:
+ """Get a participant by case id and role name."""
+ return (
+ db_session.query(Participant)
.join(ParticipantRole)
- .filter(ParticipantRole.renounce_at.is_(None))
+ .filter(Participant.case_id == case_id)
+ .filter(ParticipantRole.renounced_at.is_(None))
.filter(ParticipantRole.role == role)
.one_or_none()
)
def get_by_incident_id_and_email(
- *, db_session, incident_id: int, email: str
-) -> Optional[Participant]:
- """
- Get a participant by incident id and email.
- """
+ *, db_session: Session, incident_id: int, email: str
+) -> Participant | None:
+ """Returns the participant with the given email for the given incident id."""
return (
db_session.query(Participant)
+ .join(IndividualContact)
.filter(Participant.incident_id == incident_id)
+ .filter(IndividualContact.email == email)
+ .one_or_none()
+ )
+
+
+def get_by_case_id_and_email(
+ *, db_session: Session, case_id: int, email: str
+) -> Participant | None:
+ """Get a participant by case id and email."""
+ return (
+ db_session.query(Participant)
.join(IndividualContact)
+ .filter(Participant.case_id == case_id)
.filter(IndividualContact.email == email)
.one_or_none()
)
-def get_all(*, db_session) -> List[Optional[Participant]]:
- """
- Get all participants.
- """
- return db_session.query(Participant)
+@timer
+def get_by_incident_id_and_service_id(
+ *, db_session: Session, incident_id: int, service_id: int
+) -> Participant | None:
+ """Get participant by incident and service id."""
+ return (
+ db_session.query(Participant)
+ .filter(Participant.incident_id == incident_id)
+ .filter(Participant.service_id == service_id)
+ .first()
+ )
+
+
+def get_by_case_id_and_service_id(
+ *, db_session: Session, case_id: int, service_id: int
+) -> Participant | None:
+ """Get participant by case and service id."""
+ return (
+ db_session.query(Participant)
+ .filter(Participant.case_id == case_id)
+ .filter(Participant.service_id == service_id)
+ .one_or_none()
+ )
-def get_all_by_incident_id(*, db_session, incident_id: int) -> List[Optional[Participant]]:
+def get_by_incident_id_and_conversation_id(
+ *, db_session: Session, incident_id: int, user_conversation_id: str
+) -> Participant | None:
+ """Get participant by incident and user_conversation id."""
+ return (
+ db_session.query(Participant)
+ .filter(Participant.incident_id == incident_id)
+ .filter(Participant.user_conversation_id == user_conversation_id)
+ .one_or_none()
+ )
+
+
+def get_by_case_id_and_conversation_id(
+ *, db_session: Session, case_id: int, user_conversation_id: str
+) -> Participant | None:
+ """Get participant by case and user_conversation id."""
+ return (
+ db_session.query(Participant)
+ .filter(Participant.case_id == case_id)
+ .filter(Participant.user_conversation_id == user_conversation_id)
+ .one_or_none()
+ )
+
+
+def get_all(*, db_session: Session) -> list[Participant | None]:
+ """Returns all participants."""
+ return db_session.query(Participant).all()
+
+
+def get_all_by_incident_id(*, db_session: Session, incident_id: int) -> list[Participant | None]:
"""Get all participants by incident id."""
- return db_session.query(Participant).filter(Participant.incident_id == incident_id)
+ return db_session.query(Participant).filter(Participant.incident_id == incident_id).all()
def get_or_create(
*,
- db_session,
- incident_id: int,
+ db_session: Session,
+ subject_id: int,
+ subject_type: str,
individual_id: int,
- participant_roles: List[ParticipantRoleType],
+ service_id: int,
+ participant_roles: list[ParticipantRoleCreate],
) -> Participant:
"""Gets an existing participant object or creates a new one."""
- participant = (
- db_session.query(Participant)
- .filter(Participant.incident_id == incident_id)
- .filter(Participant.individual_contact_id == individual_id)
- .one_or_none()
- )
+ query = db_session.query(Participant)
+
+ if subject_type == "incident":
+ query = query.filter(Participant.incident_id == subject_id)
+ else:
+ query = query.filter(Participant.case_id == subject_id)
+
+ participant: Participant = query.filter(
+ Participant.individual_contact_id == individual_id
+ ).one_or_none()
if not participant:
- # We get information about the individual
- contact_plugin = plugins.get(INCIDENT_PLUGIN_CONTACT_SLUG)
+ if subject_type == "incident":
+ subject = incident_service.get(db_session=db_session, incident_id=subject_id)
+ if subject_type == "case":
+ subject = case_service.get(db_session=db_session, case_id=subject_id)
+
individual_contact = individual_service.get(
db_session=db_session, individual_contact_id=individual_id
)
- individual_info = contact_plugin.get(individual_contact.email)
- location = individual_info["location"]
- team = individual_info["team"]
- department = individual_info["department"]
+
+ individual_info = {}
+ contact_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=subject.project.id, plugin_type="contact"
+ )
+ if contact_plugin:
+ # We get information about the individual
+ individual_info = contact_plugin.instance.get(
+ individual_contact.email, db_session=db_session
+ )
+
+ location = individual_info.get("location", "Unknown")
+ team = individual_info.get("team", individual_contact.email.split("@")[1])
+ department = individual_info.get("department", "Unknown")
+
participant_in = ParticipantCreate(
- participant_role=participant_roles, team=team, department=department, location=location
+ participant_roles=participant_roles,
+ team=team,
+ department=department,
+ location=location,
)
+
+ if service_id:
+ participant_in.service = {"id": service_id}
+
participant = create(db_session=db_session, participant_in=participant_in)
+ else:
+ # we add additional roles to the participant
+ for participant_role in participant_roles:
+ participant.participant_roles.append(
+ participant_role_service.create(
+ db_session=db_session, participant_role_in=participant_role
+ )
+ )
+
+ if not participant.service:
+ # we only associate the service with the participant once to prevent overwrites
+ service = service_service.get(db_session=db_session, service_id=service_id)
+ if service:
+ participant.service_id = service_id
+ participant.service = service
+
+ db_session.commit()
return participant
-def create(*, db_session, participant_in: ParticipantCreate) -> Participant:
- """
- Create a new participant.
- """
+def create(*, db_session: Session, participant_in: ParticipantCreate) -> Participant:
+ """Creates a new participant."""
participant_roles = [
participant_role_service.create(db_session=db_session, participant_role_in=participant_role)
- for participant_role in participant_in.participant_role
+ for participant_role in participant_in.participant_roles
]
+
+ service = None
+ if participant_in.service:
+ service = service_service.get(db_session=db_session, service_id=participant_in.service.id)
+
participant = Participant(
- **participant_in.dict(exclude={"participant_role"}), participant_role=participant_roles
+ **participant_in.dict(exclude={"participant_roles", "service"}),
+ service=service,
+ participant_roles=participant_roles,
)
+
db_session.add(participant)
db_session.commit()
return participant
-def create_all(*, db_session, participants_in: List[ParticipantCreate]) -> List[Participant]:
- """
- Create a list of participants.
- """
+def create_all(
+ *, db_session: Session, participants_in: list[ParticipantCreate]
+) -> list[Participant]:
+ """Create a list of participants."""
participants = [Participant(**t.dict()) for t in participants_in]
db_session.bulk_save_objects(participants)
db_session.commit()
@@ -136,28 +269,22 @@ def create_all(*, db_session, participants_in: List[ParticipantCreate]) -> List[
def update(
- *, db_session, participant: Participant, participant_in: ParticipantUpdate
+ *, db_session: Session, participant: Participant, participant_in: ParticipantUpdate
) -> Participant:
- """
- Updates a participant.
- """
- participant_data = jsonable_encoder(participant)
-
- update_data = participant_in.dict(skip_defaults=True)
+ """Updates an existing participant."""
+ participant_data = participant.dict()
+ update_data = participant_in.dict(exclude_unset=True)
for field in participant_data:
if field in update_data:
setattr(participant, field, update_data[field])
- db_session.add(participant)
db_session.commit()
return participant
-def delete(*, db_session, participant_id: int):
- """
- Deletes a participant.
- """
+def delete(*, db_session: Session, participant_id: int):
+ """Deletes a participant."""
participant = db_session.query(Participant).filter(Participant.id == participant_id).first()
db_session.delete(participant)
db_session.commit()
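The rewritten `update` above drops `jsonable_encoder`/`skip_defaults` in favor of `participant.dict()` plus `participant_in.dict(exclude_unset=True)`. The merge it performs can be sketched without SQLAlchemy or Pydantic (class and field names below are illustrative stand-ins, not Dispatch's real models):

```python
class ParticipantRow:
    """Hypothetical stand-in for the ORM row."""

    def __init__(self, team, department, location):
        self.team = team
        self.department = department
        self.location = location

    def dict(self):
        # mirror of the .dict() call used by the service-layer update
        return vars(self).copy()


def update(participant, update_data):
    """Only fields present in the payload (exclude_unset semantics)
    overwrite the row; all other fields are left untouched."""
    for field in participant.dict():
        if field in update_data:
            setattr(participant, field, update_data[field])
    return participant


row = ParticipantRow(team="sre", department="eng", location="us-west")
update(row, {"location": "eu-west"})
print(row.team, row.location)  # team untouched, location overwritten
```

The key point is that `exclude_unset=True` distinguishes "field not sent" from "field sent as its default", so partial update payloads cannot clobber existing values.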
diff --git a/src/dispatch/participant_activity/__init__.py b/src/dispatch/participant_activity/__init__.py
new file mode 100644
index 000000000000..e69de29bb2d1
diff --git a/src/dispatch/participant_activity/models.py b/src/dispatch/participant_activity/models.py
new file mode 100644
index 000000000000..48fa0bd26dad
--- /dev/null
+++ b/src/dispatch/participant_activity/models.py
@@ -0,0 +1,52 @@
+from datetime import datetime
+from sqlalchemy import Column, Integer, ForeignKey, DateTime
+from sqlalchemy.orm import relationship
+
+from dispatch.case.models import CaseRead
+from dispatch.database.core import Base
+from dispatch.incident.models import IncidentRead
+from dispatch.models import DispatchBase, PrimaryKey
+from dispatch.participant.models import ParticipantRead
+from dispatch.plugin.models import PluginEvent, PluginEventRead
+
+
+# SQLAlchemy Models
+class ParticipantActivity(Base):
+ id = Column(Integer, primary_key=True)
+
+ plugin_event_id = Column(Integer, ForeignKey(PluginEvent.id))
+ plugin_event = relationship(PluginEvent, foreign_keys=[plugin_event_id])
+
+ started_at = Column(DateTime, default=datetime.utcnow)
+ ended_at = Column(DateTime, default=datetime.utcnow)
+
+ participant_id = Column(Integer, ForeignKey("participant.id"))
+ participant = relationship("Participant", foreign_keys=[participant_id])
+
+ incident_id = Column(Integer, ForeignKey("incident.id"))
+ incident = relationship("Incident", foreign_keys=[incident_id])
+
+ case_id = Column(Integer, ForeignKey("case.id"))
+ case = relationship("Case", foreign_keys=[case_id])
+
+
+# Pydantic Models
+class ParticipantActivityBase(DispatchBase):
+ plugin_event: PluginEventRead
+ started_at: datetime | None = None
+ ended_at: datetime | None = None
+ participant: ParticipantRead
+ incident: IncidentRead | None = None
+ case: CaseRead | None = None
+
+
+class ParticipantActivityRead(ParticipantActivityBase):
+ id: PrimaryKey
+
+
+class ParticipantActivityCreate(ParticipantActivityBase):
+ pass
+
+
+class ParticipantActivityUpdate(ParticipantActivityBase):
+ id: PrimaryKey
diff --git a/src/dispatch/participant_activity/service.py b/src/dispatch/participant_activity/service.py
new file mode 100644
index 000000000000..44ad00b4eb8a
--- /dev/null
+++ b/src/dispatch/participant_activity/service.py
@@ -0,0 +1,189 @@
+from datetime import datetime, timedelta
+
+from sqlalchemy.orm import Session
+from dispatch.database.core import SessionLocal
+from dispatch.participant import service as participant_service
+from dispatch.plugin import service as plugin_service
+
+from .models import (
+ ParticipantActivity,
+ ParticipantActivityRead,
+ ParticipantActivityCreate,
+ ParticipantActivityUpdate,
+)
+
+
+def get_all_incident_participant_activities_from_last_update(
+ db_session: SessionLocal,
+ incident_id: int,
+) -> list[ParticipantActivityRead]:
+ """Fetches the most recent recorded participant incident activities for each participant for a given incident."""
+ return (
+ db_session.query(ParticipantActivity)
+ .distinct(ParticipantActivity.participant_id)
+ .filter(ParticipantActivity.incident_id == incident_id)
+ .order_by(ParticipantActivity.participant_id, ParticipantActivity.ended_at.desc())
+ .all()
+ )
+
+
+def create(*, db_session: SessionLocal, activity_in: ParticipantActivityCreate):
+ """Creates a new record for a participant's activity."""
+ incident_id = activity_in.incident.id if activity_in.incident else None
+ case_id = activity_in.case.id if activity_in.case else None
+
+ activity = ParticipantActivity(
+ plugin_event_id=activity_in.plugin_event.id,
+ started_at=activity_in.started_at,
+ ended_at=activity_in.ended_at,
+ participant_id=activity_in.participant.id,
+ incident_id=incident_id,
+ case_id=case_id,
+ )
+
+ db_session.add(activity)
+ db_session.commit()
+
+ return activity
+
+
+def update(
+ *,
+ db_session: SessionLocal,
+ activity: ParticipantActivity,
+ activity_in: ParticipantActivityUpdate,
+) -> ParticipantActivity:
+ """Updates an existing record for a participant's activity."""
+ activity.ended_at = activity_in.ended_at
+ db_session.commit()
+ return activity
+
+
+def get_all_case_participant_activities_for_case(
+ db_session: Session,
+ case_id: int,
+) -> list[ParticipantActivityRead]:
+ """Fetches all recorded participant case activities for a given case."""
+ return (
+ db_session.query(ParticipantActivity)
+ .filter(ParticipantActivity.case_id == case_id)
+ .order_by(ParticipantActivity.started_at.asc())
+ .all()
+ )
+
+
+def get_all_incident_participant_activities_for_incident(
+ db_session: SessionLocal,
+ incident_id: int,
+) -> list[ParticipantActivityRead]:
+ """Fetches all recorded participant incident activities for a given incident."""
+ return (
+ db_session.query(ParticipantActivity)
+ .filter(ParticipantActivity.incident_id == incident_id)
+ .order_by(ParticipantActivity.started_at.asc())
+ .all()
+ )
+
+
+def get_participant_activity_from_last_update(
+ db_session: SessionLocal,
+ participant_id: int,
+    incident_id: int | None = None,
+    case_id: int | None = None,
+) -> ParticipantActivity | None:
+    """Fetches the most recent recorded activity for a given participant and incident or case."""
+ if incident_id:
+ return (
+ db_session.query(ParticipantActivity)
+ .filter(ParticipantActivity.incident_id == incident_id)
+ .filter(ParticipantActivity.participant_id == participant_id)
+ .order_by(ParticipantActivity.ended_at.desc())
+ .first()
+ )
+ if case_id:
+ return (
+ db_session.query(ParticipantActivity)
+ .filter(ParticipantActivity.case_id == case_id)
+ .filter(ParticipantActivity.participant_id == participant_id)
+ .order_by(ParticipantActivity.ended_at.desc())
+ .first()
+ )
+
+
+def create_or_update(db_session: SessionLocal, activity_in: ParticipantActivityCreate) -> timedelta:
+ """Creates or updates a participant activity. Returns the change of the participant's total incident response time."""
+ delta = timedelta(seconds=0)
+
+ if activity_in.incident:
+ prev_activity = get_participant_activity_from_last_update(
+ db_session=db_session,
+ participant_id=activity_in.participant.id,
+ incident_id=activity_in.incident.id,
+ )
+    elif activity_in.case:
+        prev_activity = get_participant_activity_from_last_update(
+            db_session=db_session,
+            participant_id=activity_in.participant.id,
+            case_id=activity_in.case.id,
+        )
+    else:
+        # guard against payloads with neither an incident nor a case
+        prev_activity = None
+
+ # There's continuous participant activity.
+ if prev_activity and activity_in.started_at < prev_activity.ended_at:
+ # Continuation of current plugin event.
+ if activity_in.plugin_event.id == prev_activity.plugin_event.id:
+ delta = activity_in.ended_at - prev_activity.ended_at
+ prev_activity.ended_at = activity_in.ended_at
+ db_session.commit()
+ return delta
+
+ # New activity is associated with a different plugin event.
+ delta += activity_in.started_at - prev_activity.ended_at
+ prev_activity.ended_at = activity_in.started_at
+
+ create(db_session=db_session, activity_in=activity_in)
+ delta += activity_in.ended_at - activity_in.started_at
+ return delta
+
+
+def get_participant_incident_activities_by_individual_contact(
+ db_session: SessionLocal, individual_contact_id: int
+) -> list[ParticipantActivity]:
+ """Fetches all recorded participant incident activities across all incidents for a given individual."""
+ participants = participant_service.get_by_individual_contact_id(
+ db_session=db_session, individual_id=individual_contact_id
+ )
+
+ return (
+ db_session.query(ParticipantActivity)
+ .filter(
+ ParticipantActivity.participant_id.in_([participant.id for participant in participants])
+ )
+ .all()
+ )
+
+
+def get_all_recorded_incident_partcipant_activities_for_plugin(
+ db_session: SessionLocal,
+ incident_id: int,
+ plugin_id: int,
+    started_at: datetime = datetime.min,
+    ended_at: datetime | None = None,
+) -> list[ParticipantActivityRead]:
+    """Fetches all recorded participant incident activities for a given plugin."""
+    if ended_at is None:
+        # resolve at call time; a datetime.utcnow() default would be frozen at import
+        ended_at = datetime.utcnow()
+
+ plugin_events = plugin_service.get_all_events_for_plugin(
+ db_session=db_session, plugin_id=plugin_id
+ )
+ participant_activities_for_plugin = []
+
+ for plugin_event in plugin_events:
+ event_activities = (
+ db_session.query(ParticipantActivity)
+ .filter(ParticipantActivity.incident_id == incident_id)
+ .filter(ParticipantActivity.plugin_event_id == plugin_event.id)
+ .filter(ParticipantActivity.started_at >= started_at)
+ .filter(ParticipantActivity.ended_at <= ended_at)
+ .all()
+ )
+ participant_activities_for_plugin.extend(event_activities)
+
+ return participant_activities_for_plugin
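The interval accounting in `create_or_update` is the subtle part of this file: an overlapping window from the same plugin event only extends the previous record, while an overlapping window from a different event trims the previous record back before adding the new one. A self-contained sketch of that arithmetic (an assumed simplification; the real function also mutates and creates activity rows):

```python
from datetime import datetime, timedelta


def activity_delta(prev_end, started_at, ended_at, same_plugin_event):
    """Net change to a participant's total response time for one new
    activity window, mirroring create_or_update's accounting."""
    delta = timedelta(0)
    if prev_end is not None and started_at < prev_end:
        if same_plugin_event:
            # continuation of the same event: only the extension counts
            return ended_at - prev_end
        # different event: trim the previous window back to started_at
        delta += started_at - prev_end  # negative correction
    # the new activity record covers [started_at, ended_at]
    delta += ended_at - started_at
    return delta


t = datetime(2023, 1, 1, 12, 0)
m = timedelta(minutes=1)
print(activity_delta(t, t + 5 * m, t + 10 * m, False))         # disjoint: 5 min
print(activity_delta(t + 7 * m, t + 5 * m, t + 10 * m, True))  # overlap: 3 min
```

Note that in the overlapping different-event case the trim and the new record cancel to `ended_at - prev_end` as well, so overlapping time is never double-counted.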
diff --git a/src/dispatch/participant_role/enums.py b/src/dispatch/participant_role/enums.py
new file mode 100644
index 000000000000..ca9eb5dfd794
--- /dev/null
+++ b/src/dispatch/participant_role/enums.py
@@ -0,0 +1,11 @@
+from dispatch.enums import DispatchEnum
+
+
+class ParticipantRoleType(DispatchEnum):
+ assignee = "Assignee"
+ incident_commander = "Incident Commander"
+ liaison = "Liaison"
+ scribe = "Scribe"
+ participant = "Participant"
+ observer = "Observer"
+ reporter = "Reporter"
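The enum members above are stored in plain `String` columns (e.g. `role = Column(String, default=ParticipantRoleType.participant)`), which works because `DispatchEnum` is assumed to mix in `str`. A minimal sketch of that behavior (the `DispatchEnum` body here is an assumption, not Dispatch's exact class):

```python
from enum import Enum


class DispatchEnum(str, Enum):
    """Assumed shape of Dispatch's base enum: the str mixin makes
    members compare and serialize as their plain-string values."""


class ParticipantRoleType(DispatchEnum):
    participant = "Participant"
    observer = "Observer"


# an enum member equals its stored string, so DB round-trips
# and comparisons against column values just work
print(ParticipantRoleType.participant == "Participant")
print(ParticipantRoleType.observer.value)
```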
diff --git a/src/dispatch/participant_role/flows.py b/src/dispatch/participant_role/flows.py
index 42a5f6879817..4b8b4064ca07 100644
--- a/src/dispatch/participant_role/flows.py
+++ b/src/dispatch/participant_role/flows.py
@@ -1,18 +1,27 @@
+"""
+.. module: dispatch.participant_role.flows
+ :platform: Unix
+ :copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
+ :license: Apache, see LICENSE for more details.
+"""
+
import logging
+from typing import Any
+from sqlalchemy.orm import Session
-from dispatch.database import SessionLocal
-from dispatch.incident.models import Incident
+from dispatch.database.core import get_table_name_by_class_instance
+from dispatch.event import service as event_service
from dispatch.participant import service as participant_service
+from dispatch.participant_role.models import ParticipantRoleType
+from dispatch.enums import EventType
+
+from .service import get_all_active_roles, add_role, renounce_role
-from .models import ParticipantRoleType, ParticipantRoleCreate
-from .service import create, get_all_active_roles, add_role, renounce_role
log = logging.getLogger(__name__)
-def assign_role_flow(
- incident_id: int, assignee_contact_info: dict, assignee_role: str, db_session: SessionLocal
-):
+def assign_role_flow(subject: Any, assignee_email: str, assignee_role: str, db_session: Session):
"""Attempts to assign a role to a participant.
Returns:
@@ -22,18 +31,60 @@ def assign_role_flow(
- "assignee_has_role", if assignee already has the role.
"""
+ # we get the participant for the assignee
+ subject_type = get_table_name_by_class_instance(subject)
+ if subject_type == "incident":
+ assignee_participant = participant_service.get_by_incident_id_and_email(
+ db_session=db_session, incident_id=subject.id, email=assignee_email
+ )
+ if subject_type == "case":
+ assignee_participant = participant_service.get_by_case_id_and_email(
+ db_session=db_session, case_id=subject.id, email=assignee_email
+ )
+
+    # Cases don't have observers, so this branch only applies to incidents
+ if assignee_role == ParticipantRoleType.observer:
+ # we make the assignee renounce to the participant role
+ participant_active_roles = get_all_active_roles(
+ db_session=db_session, participant_id=assignee_participant.id
+ )
+ for participant_active_role in participant_active_roles:
+ if participant_active_role.role == ParticipantRoleType.participant:
+ renounce_role(db_session=db_session, participant_role=participant_active_role)
+ break
+
+ # we give the assignee the new role
+ add_role(
+ db_session=db_session,
+ participant_id=assignee_participant.id,
+ participant_role=assignee_role,
+ )
+
+ event_service.log_incident_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"{assignee_participant.individual.name} has been assigned the role of {assignee_role}",
+ incident_id=subject.id,
+ type=EventType.participant_updated,
+ )
+
+ return "role_assigned"
+
# we get the participant that holds the role assigned to the assignee
- participant_with_assignee_role = participant_service.get_by_incident_id_and_role(
- db_session=db_session, incident_id=incident_id, role=assignee_role
- )
- # we get the participant for the assignee
- assignee_participant = participant_service.get_by_incident_id_and_email(
- db_session=db_session, incident_id=incident_id, email=assignee_contact_info["email"]
- )
+ if subject_type == "incident":
+ participant_with_assignee_role = participant_service.get_by_incident_id_and_role(
+ db_session=db_session, incident_id=subject.id, role=assignee_role
+ )
+ if subject_type == "case":
+ participant_with_assignee_role = participant_service.get_by_case_id_and_role(
+ db_session=db_session, case_id=subject.id, role=assignee_role
+ )
- if participant_with_assignee_role is assignee_participant:
- return "assignee_has_role"
+ if participant_with_assignee_role and assignee_participant:
+ if participant_with_assignee_role is assignee_participant:
+ log.debug(f"{assignee_participant.individual.email} already has role: {assignee_role}")
+ return "assignee_has_role"
if participant_with_assignee_role:
# we make the participant renounce to the role that has been given to the assignee
@@ -78,12 +129,39 @@ def assign_role_flow(
participant_role=assignee_role,
)
- log.debug(f"We assigned the {assignee_role} role to {assignee_contact_info['fullname']}.")
-
+ # we update the commander, reporter, scribe, or liaison foreign key
+ if assignee_role == ParticipantRoleType.incident_commander:
+ subject.commander_id = assignee_participant.id
+ elif assignee_role == ParticipantRoleType.reporter:
+ subject.reporter_id = assignee_participant.id
+ elif assignee_role == ParticipantRoleType.scribe:
+ subject.scribe_id = assignee_participant.id
+ elif assignee_role == ParticipantRoleType.liaison:
+ subject.liaison_id = assignee_participant.id
+ elif assignee_role == ParticipantRoleType.assignee:
+ subject.assignee_id = assignee_participant.id
+
+ # we add and commit the changes
+ db_session.add(subject)
+ db_session.commit()
+
+ if subject_type == "incident":
+ event_service.log_incident_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"{assignee_participant.individual.name} has been assigned the role of {assignee_role}",
+ incident_id=subject.id,
+ type=EventType.participant_updated,
+ )
+ if subject_type == "case":
+ event_service.log_case_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description=f"{assignee_participant.individual.name} has been assigned the role of {assignee_role}",
+ case_id=subject.id,
+ )
return "role_assigned"
- log.debug(
- f"We were not able to assign the {assignee_role} role to {assignee_contact_info['fullname']}."
- )
+ log.debug(f"We were not able to assign the {assignee_role} role to {assignee_email}.")
return "role_not_assigned"
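The new `elif` chain in `assign_role_flow` that syncs the commander, reporter, scribe, liaison, and assignee foreign keys can equivalently be read as a role-to-attribute lookup table. A sketch (the `Subject` class is a stand-in for the Incident/Case ORM row, not Dispatch's real model):

```python
class Subject:
    """Stand-in for an incident or case row."""
    commander_id = None
    reporter_id = None
    scribe_id = None
    liaison_id = None
    assignee_id = None


# role value -> foreign-key attribute, mirroring the elif chain
ROLE_FK = {
    "Incident Commander": "commander_id",
    "Reporter": "reporter_id",
    "Scribe": "scribe_id",
    "Liaison": "liaison_id",
    "Assignee": "assignee_id",
}


def sync_role_fk(subject, role, participant_id):
    attr = ROLE_FK.get(role)
    if attr:  # roles like Participant/Observer have no dedicated FK
        setattr(subject, attr, participant_id)
    return subject


s = sync_role_fk(Subject(), "Scribe", 42)
print(s.scribe_id)  # 42
```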
diff --git a/src/dispatch/participant_role/models.py b/src/dispatch/participant_role/models.py
index a872f2d6abf3..e742e71d0d67 100644
--- a/src/dispatch/participant_role/models.py
+++ b/src/dispatch/participant_role/models.py
@@ -1,36 +1,30 @@
from datetime import datetime
-from enum import Enum
-from typing import List, Optional
-from sqlalchemy import Column, DateTime, ForeignKey, Integer, String
-from dispatch.database import Base
-from dispatch.models import DispatchBase
+from sqlalchemy import Column, DateTime, ForeignKey, Integer, String
+from dispatch.database.core import Base
+from dispatch.models import DispatchBase, PrimaryKey
-class ParticipantRoleType(str, Enum):
- incident_commander = "Incident Commander"
- scribe = "Scribe"
- liaison = "Liaison"
- participant = "Participant"
- reporter = "Reporter"
+from .enums import ParticipantRoleType
class ParticipantRole(Base):
id = Column(Integer, primary_key=True)
- assume_at = Column(DateTime, default=datetime.utcnow)
- renounce_at = Column(DateTime)
+ assumed_at = Column(DateTime, default=datetime.utcnow)
+ renounced_at = Column(DateTime)
role = Column(String, default=ParticipantRoleType.participant)
- participant_id = Column(Integer, ForeignKey("participant.id"))
+ activity = Column(Integer, default=0)
+ participant_id = Column(Integer, ForeignKey("participant.id", ondelete="CASCADE"))
# Pydantic models...
class ParticipantRoleBase(DispatchBase):
- pass
+ role: str
class ParticipantRoleCreate(ParticipantRoleBase):
- role: Optional[ParticipantRoleType]
+ role: ParticipantRoleType
class ParticipantRoleUpdate(ParticipantRoleBase):
@@ -38,11 +32,15 @@ class ParticipantRoleUpdate(ParticipantRoleBase):
class ParticipantRoleRead(ParticipantRoleBase):
- id: int
- assume_at: Optional[datetime] = None
- renounce_at: Optional[datetime] = None
+ id: PrimaryKey
+ assumed_at: datetime | None = None
+ renounced_at: datetime | None = None
+ activity: int | None = None
+
+
+class ParticipantRoleReadMinimal(ParticipantRoleRead):
+ pass
class ParticipantRolePagination(ParticipantRoleBase):
- total: int
- items: List[ParticipantRoleRead] = []
+ items: list[ParticipantRoleRead] = []
diff --git a/src/dispatch/participant_role/service.py b/src/dispatch/participant_role/service.py
index 6f73cdd0ac3c..b222ab7e58af 100644
--- a/src/dispatch/participant_role/service.py
+++ b/src/dispatch/participant_role/service.py
@@ -1,9 +1,5 @@
from datetime import datetime
-from fastapi.encoders import jsonable_encoder
-
-from typing import List, Optional
-
from dispatch.participant import service as participant_service
from .models import (
@@ -14,27 +10,41 @@
)
-def get(*, db_session, participant_role_id: int) -> Optional[ParticipantRole]:
+def get(*, db_session, participant_role_id: int) -> ParticipantRole | None:
"""Returns a participant role based on the given id."""
return (
db_session.query(ParticipantRole).filter(ParticipantRole.id == participant_role_id).first()
)
-def get_all(*, db_session):
- """Returns all participant roles."""
- return db_session.query(ParticipantRole)
+def get_last_active_role(
+ *,
+ db_session,
+ participant_id: int,
+) -> ParticipantRole | None:
+ """Returns the participant's last active role."""
+ return (
+ db_session.query(ParticipantRole)
+ .filter(ParticipantRole.participant_id == participant_id)
+ .order_by(ParticipantRole.renounced_at.desc())
+ .first()
+ )
-def get_all_active_roles(*, db_session, participant_id: int) -> List[Optional[ParticipantRole]]:
+def get_all_active_roles(*, db_session, participant_id: int) -> list[ParticipantRole | None]:
"""Returns all active roles for the given participant id."""
return (
db_session.query(ParticipantRole)
.filter(ParticipantRole.participant_id == participant_id)
- .filter(ParticipantRole.renounce_at.is_(None))
+ .filter(ParticipantRole.renounced_at.is_(None))
)
+def get_all(*, db_session):
+ """Returns all participant roles."""
+ return db_session.query(ParticipantRole)
+
+
def add_role(
*, db_session, participant_id: int, participant_role: ParticipantRoleType
) -> ParticipantRole:
@@ -42,7 +52,7 @@ def add_role(
participant = participant_service.get(db_session=db_session, participant_id=participant_id)
participant_role_in = ParticipantRoleCreate(role=participant_role)
participant_role = create(db_session=db_session, participant_role_in=participant_role_in)
- participant.participant_role.append(participant_role)
+ participant.participant_roles.append(participant_role)
db_session.add(participant)
db_session.commit()
return participant_role
@@ -50,7 +60,7 @@ def add_role(
def renounce_role(*, db_session, participant_role: ParticipantRole) -> ParticipantRole:
"""Renounces the given role."""
- participant_role.renounce_at = datetime.utcnow()
+ participant_role.renounced_at = datetime.utcnow()
db_session.add(participant_role)
db_session.commit()
return participant_role
@@ -67,26 +77,21 @@ def create(*, db_session, participant_role_in: ParticipantRoleCreate) -> Partici
def update(
*, db_session, participant_role: ParticipantRole, participant_role_in: ParticipantRoleUpdate
) -> ParticipantRole:
- """
- Updates a participant role.
- """
- participant_role_data = jsonable_encoder(participant_role)
+ """Updates a participant role."""
+ participant_role_data = participant_role.dict()
- update_data = participant_role_in.dict(skip_defaults=True)
+ update_data = participant_role_in.dict(exclude_unset=True)
for field in participant_role_data:
if field in update_data:
setattr(participant_role, field, update_data[field])
- db_session.add(participant_role)
db_session.commit()
return participant_role
def delete(*, db_session, participant_role_id: int):
- """
- Deletes a participant role.
- """
+ """Deletes a participant role."""
participant_role = (
db_session.query(ParticipantRole).filter(ParticipantRole.id == participant_role_id).first()
)
diff --git a/src/dispatch/plugin/__init__.py b/src/dispatch/plugin/__init__.py
new file mode 100644
index 000000000000..e69de29bb2d1
diff --git a/src/dispatch/plugin/models.py b/src/dispatch/plugin/models.py
new file mode 100644
index 000000000000..1be6585a0e82
--- /dev/null
+++ b/src/dispatch/plugin/models.py
@@ -0,0 +1,242 @@
+import logging
+import json
+
+from pydantic import SecretStr
+from pydantic.json import pydantic_encoder
+
+from sqlalchemy import Column, Integer, String, Boolean, ForeignKey
+from sqlalchemy.ext.associationproxy import association_proxy
+from sqlalchemy.ext.hybrid import hybrid_property
+from sqlalchemy.orm import relationship
+from sqlalchemy_utils import TSVectorType, StringEncryptedType
+from sqlalchemy_utils.types.encrypted.encrypted_type import AesEngine
+
+from dispatch.config import DISPATCH_ENCRYPTION_KEY
+from dispatch.database.core import Base
+from dispatch.models import DispatchBase, ProjectMixin, Pagination, PrimaryKey, NameStr
+from dispatch.plugins.base import plugins
+from dispatch.project.models import ProjectRead
+from typing import Any
+
+logger = logging.getLogger(__name__)
+
+
+def show_secrets_encoder(obj):
+ if isinstance(obj, SecretStr):
+ return obj.get_secret_value()
+ else:
+ return pydantic_encoder(obj)
+
+
+class Plugin(Base):
+ __table_args__ = {"schema": "dispatch_core"}
+ id = Column(Integer, primary_key=True)
+ title = Column(String)
+ slug = Column(String, unique=True)
+ description = Column(String)
+ version = Column(String)
+ author = Column(String)
+ author_url = Column(String)
+ type = Column(String)
+ multiple = Column(Boolean)
+
+ search_vector = Column(
+ TSVectorType(
+ "title",
+ "slug",
+ "type",
+ "description",
+ weights={"title": "A", "slug": "B", "type": "C", "description": "C"},
+ )
+ )
+
+ @property
+ def configuration_schema(self):
+ """Renders the plugin's schema to JSON Schema."""
+ try:
+ plugin = plugins.get(self.slug)
+ if getattr(plugin, "configuration_schema", None) is not None:
+ return plugin.configuration_schema.schema()
+ return None
+ except Exception as e:
+ logger.warning(
+ f"Error trying to load configuration_schema for plugin with slug {self.slug}: {e}"
+ )
+ return None
+
+
+# SQLAlchemy Model
+class PluginEvent(Base):
+ __table_args__ = {"schema": "dispatch_core"}
+ id = Column(Integer, primary_key=True)
+ name = Column(String)
+ slug = Column(String, unique=True)
+ description = Column(String)
+ plugin_id = Column(Integer, ForeignKey(Plugin.id))
+ plugin = relationship(Plugin, foreign_keys=[plugin_id])
+
+ search_vector = Column(
+ TSVectorType(
+ "name",
+ "slug",
+ "description",
+ weights={"name": "A", "slug": "B", "description": "C"},
+ )
+ )
+
+
+class PluginInstance(Base, ProjectMixin):
+ id = Column(Integer, primary_key=True)
+ enabled = Column(Boolean)
+ _configuration = Column(
+ StringEncryptedType(key=str(DISPATCH_ENCRYPTION_KEY), engine=AesEngine, padding="pkcs5")
+ )
+ plugin_id = Column(Integer, ForeignKey(Plugin.id))
+ plugin = relationship(Plugin, backref="instances")
+
+ # this is some magic that allows us to use the plugin search vectors
+ # against our plugin instances
+ search_vector = association_proxy("plugin", "search_vector")
+
+ @property
+ def instance(self):
+ """Fetches a plugin instance that matches this record."""
+ try:
+ plugin = plugins.get(self.plugin.slug)
+ plugin.configuration = self.configuration
+ plugin.project_id = self.project_id
+ return plugin
+ except Exception as e:
+            logger.warning(f"Error trying to load plugin with slug {self.plugin.slug}: {e}")
+ return self.plugin
+
+ @property
+ def broken(self):
+ try:
+ plugins.get(self.plugin.slug)
+ return False
+ except Exception:
+ return True
+
+ @property
+ def configuration_schema(self):
+ """Renders the plugin's schema to JSON Schema."""
+ try:
+ plugin = plugins.get(self.plugin.slug)
+ if getattr(plugin, "configuration_schema", None) is not None:
+ return plugin.configuration_schema.schema()
+ return None
+ except Exception as e:
+ logger.warning(
+ f"Error trying to load plugin {self.plugin.title} {self.plugin.description} with error {e}"
+ )
+ return None
+
+ @hybrid_property
+ def configuration(self):
+ """Property that correctly returns a plugins configuration object."""
+ try:
+ if self._configuration:
+ plugin = plugins.get(self.plugin.slug)
+ return plugin.configuration_schema.parse_raw(self._configuration)
+ except Exception as e:
+ logger.warning(
+ f"Error trying to load plugin {self.plugin.title} {self.plugin.description} with error {e}"
+ )
+ return None
+
+ @configuration.setter
+ def configuration(self, configuration):
+ """Property that correctly sets a plugins configuration object."""
+ if configuration:
+ plugin = plugins.get(self.plugin.slug)
+ config_object = plugin.configuration_schema.parse_obj(configuration)
+ self._configuration = json.dumps(
+                config_object.dict(), default=show_secrets_encoder
+ )
+
+
+# Pydantic models...
+class PluginBase(DispatchBase):
+ pass
+
+
+class PluginRead(PluginBase):
+ id: PrimaryKey
+ title: str
+ slug: str
+ author: str
+ author_url: str
+ type: str
+ multiple: bool
+ configuration_schema: Any | None = None
+ description: str | None = None
+
+
+class PluginEventBase(DispatchBase):
+ name: NameStr
+ slug: str
+ plugin: PluginRead
+ description: str | None = None
+
+
+class PluginEventRead(PluginEventBase):
+ id: PrimaryKey
+
+
+class PluginEventCreate(PluginEventBase):
+ pass
+
+
+class PluginEventPagination(Pagination):
+ items: list[PluginEventRead] = []
+
+
+class PluginInstanceRead(PluginBase):
+ id: PrimaryKey
+ enabled: bool | None = None
+ configuration: Any | None = None
+ configuration_schema: Any | None = None
+ plugin: PluginRead
+ project: ProjectRead | None = None
+ broken: bool | None = None
+
+
+class PluginInstanceReadMinimal(PluginBase):
+ id: PrimaryKey
+ enabled: bool | None = None
+ configuration_schema: Any | None = None
+ plugin: PluginRead
+ project: ProjectRead | None = None
+ broken: bool | None = None
+
+
+class PluginInstanceCreate(PluginBase):
+ enabled: bool | None = None
+ configuration: Any | None = None
+ plugin: PluginRead
+ project: ProjectRead
+
+
+class PluginInstanceUpdate(PluginBase):
+    id: PrimaryKey | None = None
+ enabled: bool | None = None
+ configuration: Any | None = None
+
+
+class KeyValue(DispatchBase):
+ key: str
+ value: str | list[str] | dict
+
+
+class PluginMetadata(DispatchBase):
+ slug: str
+ metadata: list[KeyValue] = []
+
+
+class PluginPagination(Pagination):
+ items: list[PluginRead] = []
+
+
+class PluginInstancePagination(Pagination):
+ items: list[PluginInstanceReadMinimal] = []
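
The `configuration` hybrid property above persists a typed configuration object as a JSON string and re-hydrates it on read. A minimal standalone sketch of the same round-trip, using stdlib `dataclasses` in place of the plugin's pydantic `configuration_schema` (class and field names here are illustrative, not from the codebase):

```python
import json
from dataclasses import dataclass, asdict


@dataclass
class SlackConfiguration:
    # stand-in for a plugin's pydantic configuration_schema
    api_token: str
    channel: str


def dump_configuration(config: SlackConfiguration) -> str:
    """Serialize the configuration object to a JSON string for storage."""
    return json.dumps(asdict(config))


def load_configuration(raw: str) -> SlackConfiguration:
    """Re-hydrate the stored JSON string back into a typed object."""
    return SlackConfiguration(**json.loads(raw))


stored = dump_configuration(SlackConfiguration(api_token="xoxb-...", channel="#incidents"))
restored = load_configuration(stored)
assert restored.channel == "#incidents"
```

The real implementation additionally runs secrets through `show_secrets_encoder` on write and validates against the plugin's schema on read, so a malformed stored blob surfaces as a logged warning rather than an exception.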
diff --git a/src/dispatch/plugin/service.py b/src/dispatch/plugin/service.py
new file mode 100644
index 000000000000..c87e2600801d
--- /dev/null
+++ b/src/dispatch/plugin/service.py
@@ -0,0 +1,202 @@
+import logging
+from pydantic import ValidationError
+
+from sqlalchemy.orm import Session
+
+from dispatch.plugins.bases import OncallPlugin
+from dispatch.project import service as project_service
+from dispatch.service import service as service_service
+
+from .models import (
+ Plugin,
+ PluginInstance,
+ PluginInstanceCreate,
+ PluginInstanceUpdate,
+ PluginEvent,
+ PluginEventCreate,
+)
+
+
+log = logging.getLogger(__name__)
+
+
+def get(*, db_session: Session, plugin_id: int) -> Plugin | None:
+ """Returns a plugin based on the given plugin id."""
+ return db_session.query(Plugin).filter(Plugin.id == plugin_id).one_or_none()
+
+
+def get_by_slug(*, db_session: Session, slug: str) -> Plugin | None:
+ """Fetches a plugin by slug."""
+ return db_session.query(Plugin).filter(Plugin.slug == slug).one_or_none()
+
+
+def get_all(*, db_session: Session) -> list[Plugin | None]:
+ """Returns all plugins."""
+ return db_session.query(Plugin).all()
+
+
+def get_by_type(*, db_session: Session, plugin_type: str) -> list[Plugin | None]:
+ """Fetches all plugins for a given type."""
+ return db_session.query(Plugin).filter(Plugin.type == plugin_type).all()
+
+
+def get_instance(*, db_session: Session, plugin_instance_id: int) -> PluginInstance | None:
+ """Returns a plugin instance based on the given instance id."""
+ return (
+ db_session.query(PluginInstance)
+ .filter(PluginInstance.id == plugin_instance_id)
+ .one_or_none()
+ )
+
+
+def get_active_instance(
+ *, db_session: Session, plugin_type: str, project_id=None
+) -> PluginInstance | None:
+    """Fetches the current active plugin instance for the given type."""
+ return (
+ db_session.query(PluginInstance)
+ .join(Plugin)
+ .filter(Plugin.type == plugin_type)
+ .filter(PluginInstance.project_id == project_id)
+ .filter(PluginInstance.enabled == True) # noqa
+ .one_or_none()
+ )
+
+
+def get_active_instances(
+    *, db_session: Session, plugin_type: str, project_id=None
+) -> list[PluginInstance]:
+    """Fetches all active plugin instances for the given type."""
+ return (
+ db_session.query(PluginInstance)
+ .join(Plugin)
+ .filter(Plugin.type == plugin_type)
+ .filter(PluginInstance.project_id == project_id)
+ .filter(PluginInstance.enabled == True) # noqa
+ .all()
+ )
+
+
+def get_active_instance_by_slug(
+ *, db_session: Session, slug: str, project_id: int | None = None
+) -> PluginInstance | None:
+    """Fetches the current active plugin instance for the given slug."""
+ return (
+ db_session.query(PluginInstance)
+ .join(Plugin)
+ .filter(Plugin.slug == slug)
+ .filter(PluginInstance.project_id == project_id)
+ .filter(PluginInstance.enabled == True) # noqa
+ .one_or_none()
+ )
+
+
+def get_enabled_instances_by_type(
+ *, db_session: Session, project_id: int, plugin_type: str
+) -> list[PluginInstance | None]:
+ """Fetches all enabled plugins for a given type."""
+ return (
+ db_session.query(PluginInstance)
+ .join(Plugin)
+ .filter(Plugin.type == plugin_type)
+ .filter(PluginInstance.project_id == project_id)
+ .filter(PluginInstance.enabled == True) # noqa
+ .all()
+ )
+
+
+def create_instance(
+ *, db_session: Session, plugin_instance_in: PluginInstanceCreate
+) -> PluginInstance:
+ """Creates a new plugin instance."""
+ project = project_service.get_by_name_or_raise(
+ db_session=db_session, project_in=plugin_instance_in.project
+ )
+ plugin = get(db_session=db_session, plugin_id=plugin_instance_in.plugin.id)
+ plugin_instance = PluginInstance(
+ **plugin_instance_in.dict(exclude={"project", "plugin", "configuration"}),
+ project=project,
+ plugin=plugin,
+ )
+ plugin_instance.configuration = plugin_instance_in.configuration
+
+ db_session.add(plugin_instance)
+ db_session.commit()
+ return plugin_instance
+
+
+def update_instance(
+ *,
+ db_session: Session,
+ plugin_instance: PluginInstance,
+ plugin_instance_in: PluginInstanceUpdate,
+) -> PluginInstance:
+ """Updates a plugin instance."""
+ plugin_instance_data = plugin_instance.dict()
+ update_data = plugin_instance_in.dict(exclude_unset=True)
+
+ if plugin_instance_in.enabled: # user wants to enable the plugin
+ if not plugin_instance.plugin.multiple:
+            # we can't have multiple plugins of this type; disable the currently enabled one
+ enabled_plugin_instances = get_enabled_instances_by_type(
+ db_session=db_session,
+ project_id=plugin_instance.project_id,
+ plugin_type=plugin_instance.plugin.type,
+ )
+ if enabled_plugin_instances:
+ enabled_plugin_instances[0].enabled = False
+
+ if not plugin_instance_in.enabled: # user wants to disable the plugin
+ if plugin_instance.plugin.type == OncallPlugin.type:
+ oncall_services = service_service.get_all_by_type_and_status(
+ db_session=db_session, service_type=plugin_instance.plugin.slug, is_active=True
+ )
+ if oncall_services:
+ raise ValidationError([
+ {
+                        "msg": f"Cannot disable plugin instance {plugin_instance.plugin.title}: one or more oncall services depend on it.",
+ "loc": "plugin_instance",
+ }
+ ])
+
+ for field in plugin_instance_data:
+ if field in update_data:
+ setattr(plugin_instance, field, update_data[field])
+
+ plugin_instance.configuration = plugin_instance_in.configuration
+
+ db_session.commit()
+ return plugin_instance
+
+
+def delete_instance(*, db_session: Session, plugin_instance_id: int):
+ """Deletes a plugin instance."""
+ db_session.query(PluginInstance).filter(PluginInstance.id == plugin_instance_id).delete()
+ db_session.commit()
+
+
+def get_plugin_event_by_id(*, db_session: Session, plugin_event_id: int) -> PluginEvent | None:
+ """Returns a plugin event based on the plugin event id."""
+ return db_session.query(PluginEvent).filter(PluginEvent.id == plugin_event_id).one_or_none()
+
+
+def get_plugin_event_by_slug(*, db_session: Session, slug: str) -> PluginEvent | None:
+    """Returns a plugin event based on the plugin event slug."""
+ return db_session.query(PluginEvent).filter(PluginEvent.slug == slug).one_or_none()
+
+
+def get_all_events_for_plugin(
+ *, db_session: Session, plugin_id: int
+) -> list[PluginEvent | None]:
+ """Returns all plugin events for a given plugin."""
+ return db_session.query(PluginEvent).filter(PluginEvent.plugin_id == plugin_id).all()
+
+
+def create_plugin_event(*, db_session: Session, plugin_event_in: PluginEventCreate) -> PluginEvent:
+ """Creates a new plugin event."""
+ plugin_event = PluginEvent(**plugin_event_in.dict(exclude={"plugin"}))
+ plugin_event.plugin = get(db_session=db_session, plugin_id=plugin_event_in.plugin.id)
+ db_session.add(plugin_event)
+ db_session.commit()
+
+ return plugin_event
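
`update_instance` above enforces that, for plugin types that don't allow multiples, enabling one instance first disables the currently enabled peer of the same type. That invariant can be sketched in isolation; the `Instance` dataclass below is a simplified stand-in for the SQLAlchemy model, not the real schema:

```python
from dataclasses import dataclass


@dataclass
class Instance:
    id: int
    plugin_type: str
    multiple: bool
    enabled: bool = False


def enable(instances: list[Instance], target_id: int) -> None:
    """Enable one instance; for non-multiple types, disable peers of the same type."""
    target = next(i for i in instances if i.id == target_id)
    if not target.multiple:
        for peer in instances:
            if peer.plugin_type == target.plugin_type and peer.enabled:
                peer.enabled = False
    target.enabled = True


instances = [
    Instance(1, "oncall", multiple=False, enabled=True),
    Instance(2, "oncall", multiple=False),
]
enable(instances, 2)
assert [i.enabled for i in instances] == [False, True]
```

The service-layer version does the same via `get_enabled_instances_by_type`, and additionally refuses to disable an oncall plugin that active oncall services still depend on.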
diff --git a/src/dispatch/plugin/views.py b/src/dispatch/plugin/views.py
new file mode 100644
index 000000000000..e676396fb662
--- /dev/null
+++ b/src/dispatch/plugin/views.py
@@ -0,0 +1,112 @@
+from fastapi import APIRouter, Depends, HTTPException, status
+
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.auth.permissions import SensitiveProjectActionPermission, PermissionsDependency
+from dispatch.models import PrimaryKey
+
+from .models import (
+ PluginEventPagination,
+ PluginInstanceRead,
+ PluginInstanceCreate,
+ PluginInstanceUpdate,
+ PluginInstancePagination,
+ PluginPagination,
+)
+from .service import get_instance, update_instance, create_instance, delete_instance
+
+
+router = APIRouter()
+
+
+@router.get("", response_model=PluginPagination)
+def get_plugins(common: CommonParameters):
+ """Get all plugins."""
+ return search_filter_sort_paginate(model="Plugin", **common)
+
+
+@router.get(
+ "/instances",
+ response_model=PluginInstancePagination,
+)
+def get_plugin_instances(common: CommonParameters):
+ """Get all plugin instances."""
+ return search_filter_sort_paginate(model="PluginInstance", **common)
+
+
+@router.get(
+ "/instances/{plugin_instance_id}",
+ response_model=PluginInstanceRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def get_plugin_instance(db_session: DbSession, plugin_instance_id: PrimaryKey):
+ """Get a plugin instance."""
+ plugin = get_instance(db_session=db_session, plugin_instance_id=plugin_instance_id)
+ if not plugin:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A plugin instance with this id does not exist."}],
+ )
+ return plugin
+
+
+@router.post(
+ "/instances",
+ response_model=PluginInstanceRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def create_plugin_instance(db_session: DbSession, plugin_instance_in: PluginInstanceCreate):
+ """Create a new plugin instance."""
+ return create_instance(db_session=db_session, plugin_instance_in=plugin_instance_in)
+
+
+@router.put(
+ "/instances/{plugin_instance_id}",
+    response_model=PluginInstanceRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def update_plugin_instance(
+ db_session: DbSession,
+ plugin_instance_id: PrimaryKey,
+ plugin_instance_in: PluginInstanceUpdate,
+):
+ """Update a plugin instance."""
+ plugin_instance = get_instance(db_session=db_session, plugin_instance_id=plugin_instance_id)
+ if not plugin_instance:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A plugin instance with this id does not exist."}],
+ )
+
+ plugin_instance = update_instance(
+ db_session=db_session,
+ plugin_instance=plugin_instance,
+ plugin_instance_in=plugin_instance_in,
+ )
+
+ return plugin_instance
+
+
+@router.delete(
+ "/instances/{plugin_instance_id}",
+ response_model=None,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def delete_plugin_instance(
+ db_session: DbSession,
+ plugin_instance_id: PrimaryKey,
+):
+ """Deletes an existing plugin instance."""
+ plugin_instance = get_instance(db_session=db_session, plugin_instance_id=plugin_instance_id)
+ if not plugin_instance:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A plugin instance with this id does not exist."}],
+ )
+ delete_instance(db_session=db_session, plugin_instance_id=plugin_instance_id)
+
+
+@router.get("/plugin_events", response_model=PluginEventPagination)
+def get_plugin_events(common: CommonParameters):
+    """Get all plugin events."""
+ return search_filter_sort_paginate(model="PluginEvent", **common)
diff --git a/src/dispatch/plugins/base/__init__.py b/src/dispatch/plugins/base/__init__.py
index fb5dde30e667..024a08b1f407 100644
--- a/src/dispatch/plugins/base/__init__.py
+++ b/src/dispatch/plugins/base/__init__.py
@@ -6,6 +6,7 @@
.. moduleauthor:: Kevin Glisson
"""
+
from __future__ import absolute_import, print_function
from dispatch.plugins.base.manager import PluginManager
diff --git a/src/dispatch/plugins/base/manager.py b/src/dispatch/plugins/base/manager.py
index 8821d0eccea9..99f07c44ecd6 100644
--- a/src/dispatch/plugins/base/manager.py
+++ b/src/dispatch/plugins/base/manager.py
@@ -5,6 +5,7 @@
.. moduleauthor:: Kevin Glisson (kglisson@netflix.com)
"""
+
import logging
from dispatch.common.managers import InstanceManager
@@ -24,8 +25,6 @@ def all(self, version=1, plugin_type=None):
for plugin in sorted(super(PluginManager, self).all(), key=lambda x: x.get_title()):
if not plugin.type == plugin_type and plugin_type:
continue
- if not plugin.is_enabled():
- continue
if version is not None and plugin.__version__ != version:
continue
yield plugin
diff --git a/src/dispatch/plugins/base/v1.py b/src/dispatch/plugins/base/v1.py
index 8d1f5ab7b011..ba4ce45fcdb8 100644
--- a/src/dispatch/plugins/base/v1.py
+++ b/src/dispatch/plugins/base/v1.py
@@ -6,15 +6,25 @@
.. moduleauthor:: Kevin Glisson
"""
+
import logging
from threading import local
-from typing import Any, List, Optional
-from pydantic.schema import schema
+from pydantic import BaseModel
+from typing import Any
logger = logging.getLogger(__name__)
+class PluginConfiguration(BaseModel):
+ pass
+
+
+class IPluginEvent:
+    name: str | None = None
+    slug: str | None = None
+    description: str | None = None
+
+
# stolen from https://github.com/getsentry/sentry/
class PluginMount(type):
def __new__(cls, name, bases, attrs):
@@ -25,6 +35,7 @@ def __new__(cls, name, bases, attrs):
new_cls.title = new_cls.__name__
if not new_cls.slug:
new_cls.slug = new_cls.title.replace(" ", "-").lower()
+
return new_cls
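
`PluginMount` is a registry metaclass: merely subclassing a plugin base registers the class, and a missing `slug` is derived from the title. A reduced standalone sketch of the pattern (the registry list and the example subclass names are illustrative, not from the codebase):

```python
class PluginMount(type):
    """Metaclass that registers concrete plugin subclasses and derives slugs."""

    registry: list[type] = []

    def __new__(cls, name, bases, attrs):
        new_cls = super().__new__(cls, name, bases, attrs)
        if bases:  # skip the abstract base class itself
            if not getattr(new_cls, "title", None):
                new_cls.title = new_cls.__name__
            if not getattr(new_cls, "slug", None):
                new_cls.slug = new_cls.title.replace(" ", "-").lower()
            cls.registry.append(new_cls)
        return new_cls


class BasePlugin(metaclass=PluginMount):
    title = None
    slug = None


class PagerDutyOncallPlugin(BasePlugin):
    title = "PagerDuty Oncall"


assert PagerDutyOncallPlugin.slug == "pagerduty-oncall"
assert PluginMount.registry == [PagerDutyOncallPlugin]
```

The real `IPlugin` hierarchy layers configuration, events, and enable/disable state on top of this; the metaclass only handles identity and registration.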
@@ -44,56 +55,46 @@ class IPlugin(local):
"""
# Generic plugin information
- title: Optional[str] = None
- slug: Optional[str] = None
- description: Optional[str] = None
- version: Optional[str] = None
- author: Optional[str] = None
- author_url: Optional[str] = None
+ title: str | None = None
+ slug: str | None = None
+ description: str | None = None
+ version: str | None = None
+ author: str | None = None
+ author_url: str | None = None
+ configuration: dict | None = None
+ project_id: int | None = None
resource_links = ()
- _schema: Any = None
- commands: List[Any] = []
+ schema: PluginConfiguration
+ commands: list[Any] = []
events: Any = None
+ plugin_events: list[IPluginEvent | None] = []
# Global enabled state
- enabled: bool = True
+ enabled: bool = False
can_disable: bool = True
-
- def validate_options(self, options: dict) -> Any:
- """
- Validates given options against defined schema.
- >>> plugin.validate_options(options)
- """
- return self._schema(**options)
-
- @property
- def schema(self):
- """Returns current plugin schema."""
- return schema([self._schema])
+ multiple: bool = False
def is_enabled(self) -> bool:
"""
Returns a boolean representing if this plugin is enabled.
- If ``project`` is passed, it will limit the scope to that project.
>>> plugin.is_enabled()
"""
if not self.enabled:
return False
if not self.can_disable:
return True
-
return True
- def get_title(self) -> Optional[str]:
+ def get_title(self) -> str | None:
"""
Returns the general title for this plugin.
>>> plugin.get_title()
"""
return self.title
- def get_description(self) -> Optional[str]:
+ def get_description(self) -> str | None:
"""
Returns the description for this plugin. This is shown on the plugin configuration
page.
@@ -101,7 +102,7 @@ def get_description(self) -> Optional[str]:
"""
return self.description
- def get_resource_links(self) -> List[Any]:
+ def get_resource_links(self) -> list[Any]:
"""
Returns a list of tuples pointing to various resources for this plugin.
>>> def get_resource_links(self):
@@ -113,6 +114,14 @@ def get_resource_links(self) -> List[Any]:
"""
return self.resource_links
+ def get_event(self, event) -> IPluginEvent | None:
+ for plugin_event in self.plugin_events:
+ if plugin_event.slug == event.slug:
+ return plugin_event
+
+ def fetch_events(self, **kwargs):
+ raise NotImplementedError
+
class Plugin(IPlugin):
"""
diff --git a/src/dispatch/plugins/bases/__init__.py b/src/dispatch/plugins/bases/__init__.py
index c8d9438b16f4..7979d42c94fd 100644
--- a/src/dispatch/plugins/bases/__init__.py
+++ b/src/dispatch/plugins/bases/__init__.py
@@ -1,15 +1,22 @@
+from .artificial_intelligence import ArtificialIntelligencePlugin # noqa
+from .auth_mfa import MultiFactorAuthenticationPlugin # noqa
+from .auth_provider import AuthenticationProviderPlugin # noqa
from .conference import ConferencePlugin # noqa
from .contact import ContactPlugin # noqa
from .conversation import ConversationPlugin # noqa
from .definition import DefinitionPlugin # noqa
from .document import DocumentPlugin # noqa
+from .email import EmailPlugin # noqa
+from .monitor import MonitorPlugin # noqa
from .oncall import OncallPlugin # noqa
from .participant import ParticipantPlugin # noqa
from .participant_group import ParticipantGroupPlugin # noqa
+from .signal_consumer import SignalConsumerPlugin # noqa
+from .signal_enrichment import SignalEnrichmentPlugin # noqa
+from .source import SourcePlugin # noqa
from .storage import StoragePlugin # noqa
+from .tag import TagPlugin # noqa
from .task import TaskPlugin # noqa
from .term import TermPlugin # noqa
from .ticket import TicketPlugin # noqa
-from .document_resolver import DocumentResolverPlugin # noqa
-from .tag import TagPlugin # noqa
-from .auth_provider import AuthenticationProviderPlugin # noqa
+from .workflow import WorkflowPlugin # noqa
diff --git a/src/dispatch/plugins/bases/artificial_intelligence.py b/src/dispatch/plugins/bases/artificial_intelligence.py
new file mode 100644
index 000000000000..a136122d0f03
--- /dev/null
+++ b/src/dispatch/plugins/bases/artificial_intelligence.py
@@ -0,0 +1,22 @@
+"""
+.. module: dispatch.plugins.bases.artificial_intelligence
+ :platform: Unix
+ :copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
+ :license: Apache, see LICENSE for more details.
+.. moduleauthor:: Marc Vilanova
+"""
+
+from dispatch.plugins.base import Plugin
+
+
+class ArtificialIntelligencePlugin(Plugin):
+ type = "artificial-intelligence"
+
+ def chat_completion(self, items, **kwargs):
+ raise NotImplementedError
+
+ def chat_parse(self, items, **kwargs):
+ raise NotImplementedError
+
+ def list_models(self, items, **kwargs):
+ raise NotImplementedError
diff --git a/src/dispatch/plugins/bases/auth_mfa.py b/src/dispatch/plugins/bases/auth_mfa.py
new file mode 100644
index 000000000000..28500de37b55
--- /dev/null
+++ b/src/dispatch/plugins/bases/auth_mfa.py
@@ -0,0 +1,22 @@
+"""
+.. module: dispatch.plugins.bases.mfa
+ :platform: Unix
+ :copyright: (c) 2023 by Netflix Inc., see AUTHORS for more
+ :license: Apache, see LICENSE for more details.
+.. moduleauthor:: Will Sheldon
+"""
+
+from dispatch.plugins.base import Plugin
+
+
+class MultiFactorAuthenticationPlugin(Plugin):
+ type = "auth-mfa"
+
+ def send_push_notification(self, items, **kwargs):
+ raise NotImplementedError
+
+ def validate_mfa(self, items, **kwargs):
+ raise NotImplementedError
+
+ def create_mfa_challenge(self, items, **kwargs):
+ raise NotImplementedError
diff --git a/src/dispatch/plugins/bases/auth_provider.py b/src/dispatch/plugins/bases/auth_provider.py
index 08da87b3d06d..92b6dda148db 100644
--- a/src/dispatch/plugins/bases/auth_provider.py
+++ b/src/dispatch/plugins/bases/auth_provider.py
@@ -5,14 +5,13 @@
:license: Apache, see LICENSE for more details.
.. moduleauthor:: Kevin Glisson
"""
-from dispatch.models import PluginOptionModel
+
from dispatch.plugins.base import Plugin
from starlette.requests import Request
class AuthenticationProviderPlugin(Plugin):
type = "auth-provider"
- _schema = PluginOptionModel
def get_current_user(self, request: Request, **kwargs):
raise NotImplementedError
diff --git a/src/dispatch/plugins/bases/conference.py b/src/dispatch/plugins/bases/conference.py
index f270c257529d..c3d73f81b8b4 100644
--- a/src/dispatch/plugins/bases/conference.py
+++ b/src/dispatch/plugins/bases/conference.py
@@ -5,13 +5,12 @@
:license: Apache, see LICENSE for more details.
.. moduleauthor:: Kevin Glisson
"""
+
from dispatch.plugins.base import Plugin
-from dispatch.models import PluginOptionModel
class ConferencePlugin(Plugin):
type = "conference"
- _schema = PluginOptionModel
def create(self, items, **kwargs):
raise NotImplementedError
diff --git a/src/dispatch/plugins/bases/contact.py b/src/dispatch/plugins/bases/contact.py
index 759bced930dc..ee9289d27260 100644
--- a/src/dispatch/plugins/bases/contact.py
+++ b/src/dispatch/plugins/bases/contact.py
@@ -5,13 +5,12 @@
:license: Apache, see LICENSE for more details.
.. moduleauthor:: Kevin Glisson
"""
+
from dispatch.plugins.base import Plugin
-from dispatch.models import PluginOptionModel
class ContactPlugin(Plugin):
type = "contact"
- _schema = PluginOptionModel
def get(self, key, **kwargs):
raise NotImplementedError
@@ -24,6 +23,3 @@ def update(self, key, **kwargs):
def delete(self, key, **kwargs):
raise NotImplementedError
-
- def move(self, key, **kwargs):
- raise NotImplementedError
diff --git a/src/dispatch/plugins/bases/conversation.py b/src/dispatch/plugins/bases/conversation.py
index d18cbdbfa1c7..347cd3d188d0 100644
--- a/src/dispatch/plugins/bases/conversation.py
+++ b/src/dispatch/plugins/bases/conversation.py
@@ -5,13 +5,12 @@
:license: Apache, see LICENSE for more details.
.. moduleauthor:: Kevin Glisson
"""
+
from dispatch.plugins.base import Plugin
-from dispatch.models import PluginOptionModel
class ConversationPlugin(Plugin):
type = "conversation"
- _schema = PluginOptionModel
def create(self, items, **kwargs):
raise NotImplementedError
diff --git a/src/dispatch/plugins/bases/definition.py b/src/dispatch/plugins/bases/definition.py
index a5272af5957b..970b2b3294c3 100644
--- a/src/dispatch/plugins/bases/definition.py
+++ b/src/dispatch/plugins/bases/definition.py
@@ -5,13 +5,12 @@
:license: Apache, see LICENSE for more details.
.. moduleauthor:: Kevin Glisson
"""
+
from dispatch.plugins.base import Plugin
-from dispatch.models import PluginOptionModel
class DefinitionPlugin(Plugin):
type = "definition"
- _schema = PluginOptionModel
def get(self, key, **kwargs):
raise NotImplementedError
@@ -24,6 +23,3 @@ def update(self, key, **kwargs):
def delete(self, key, **kwargs):
raise NotImplementedError
-
- def move(self, key, **kwargs):
- raise NotImplementedError
diff --git a/src/dispatch/plugins/bases/document.py b/src/dispatch/plugins/bases/document.py
index 86f470b4555f..f9a0053babb8 100644
--- a/src/dispatch/plugins/bases/document.py
+++ b/src/dispatch/plugins/bases/document.py
@@ -5,13 +5,12 @@
:license: Apache, see LICENSE for more details.
.. moduleauthor:: Kevin Glisson
"""
+
from dispatch.plugins.base import Plugin
-from dispatch.models import PluginOptionModel
class DocumentPlugin(Plugin):
type = "document"
- _schema = PluginOptionModel
def get(self, key, **kwargs):
raise NotImplementedError
@@ -24,6 +23,3 @@ def update(self, key, **kwargs):
def delete(self, key, **kwargs):
raise NotImplementedError
-
- def move(self, key, **kwargs):
- raise NotImplementedError
diff --git a/src/dispatch/plugins/bases/document_resolver.py b/src/dispatch/plugins/bases/document_resolver.py
deleted file mode 100644
index 7dbc76575c3e..000000000000
--- a/src/dispatch/plugins/bases/document_resolver.py
+++ /dev/null
@@ -1,17 +0,0 @@
-"""
-.. module: dispatch.plugins.bases.document
- :platform: Unix
- :copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
- :license: Apache, see LICENSE for more details.
-.. moduleauthor:: Kevin Glisson
-"""
-from dispatch.plugins.base import Plugin
-from dispatch.models import PluginOptionModel
-
-
-class DocumentResolverPlugin(Plugin):
- type = "document-resolver"
- _schema = PluginOptionModel
-
- def get(self, items, **kwargs):
- raise NotImplementedError
diff --git a/src/dispatch/plugins/bases/email.py b/src/dispatch/plugins/bases/email.py
new file mode 100644
index 000000000000..9c00cdf90f14
--- /dev/null
+++ b/src/dispatch/plugins/bases/email.py
@@ -0,0 +1,16 @@
+"""
+.. module: dispatch.plugins.bases.email
+ :platform: Unix
+ :copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
+ :license: Apache, see LICENSE for more details.
+.. moduleauthor:: Kevin Glisson
+"""
+
+from dispatch.plugins.base import Plugin
+
+
+class EmailPlugin(Plugin):
+ type = "email"
+
+ def send(self, items, **kwargs):
+ raise NotImplementedError
diff --git a/src/dispatch/plugins/bases/investigation_tooling.py b/src/dispatch/plugins/bases/investigation_tooling.py
new file mode 100644
index 000000000000..1ef6bf8174d4
--- /dev/null
+++ b/src/dispatch/plugins/bases/investigation_tooling.py
@@ -0,0 +1,28 @@
+"""
+.. module: dispatch.plugins.bases.investigation_tooling
+ :platform: Unix
+ :copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
+ :license: Apache, see LICENSE for more details.
+.. moduleauthor:: Marc Vilanova
+"""
+
+from dispatch.case.models import Case
+from dispatch.plugins.base import Plugin
+
+
+class InvestigationToolingPlugin(Plugin):
+ """Investigation tooling base plugin class."""
+
+ type = "investigation-tooling"
+
+ def create_investigation(self, case: Case, **kwargs):
+ """Creates a new investigation.
+
+ Args:
+ case: Case object
+ kwargs: Optional kwargs.
+
+ Returns:
+ Additional context.
+ """
+ raise NotImplementedError
diff --git a/src/dispatch/plugins/bases/metric.py b/src/dispatch/plugins/bases/metric.py
index 5f315aba6460..68d2e2789abc 100644
--- a/src/dispatch/plugins/bases/metric.py
+++ b/src/dispatch/plugins/bases/metric.py
@@ -5,13 +5,12 @@
:license: Apache, see LICENSE for more details.
.. moduleauthor:: Kevin Glisson
"""
-from dispatch.models import PluginOptionModel
+
from dispatch.plugins.base import Plugin
class MetricPlugin(Plugin):
type = "metric"
- _schema = PluginOptionModel
def gauge(self, name, value, tags=None):
raise NotImplementedError
diff --git a/src/dispatch/plugins/bases/monitor.py b/src/dispatch/plugins/bases/monitor.py
new file mode 100644
index 000000000000..9bee75137155
--- /dev/null
+++ b/src/dispatch/plugins/bases/monitor.py
@@ -0,0 +1,15 @@
+"""
+.. module: dispatch.plugins.bases.monitor
+ :platform: Unix
+ :copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
+ :license: Apache, see LICENSE for more details.
+"""
+
+from dispatch.plugins.base import Plugin
+
+
+class MonitorPlugin(Plugin):
+ type = "monitor"
+
+ def get_status(self, **kwargs):
+ raise NotImplementedError
diff --git a/src/dispatch/plugins/bases/oncall.py b/src/dispatch/plugins/bases/oncall.py
index 2d99c73d0e7e..0a4d727b9089 100644
--- a/src/dispatch/plugins/bases/oncall.py
+++ b/src/dispatch/plugins/bases/oncall.py
@@ -5,13 +5,22 @@
:license: Apache, see LICENSE for more details.
.. moduleauthor:: Kevin Glisson
"""
+
from dispatch.plugins.base import Plugin
-from dispatch.models import PluginOptionModel
class OncallPlugin(Plugin):
type = "oncall"
- _schema = PluginOptionModel
- def get(self, **kwargs):
+ def get(self, service_id: str, **kwargs):
+ raise NotImplementedError
+
+ def page(
+ self,
+ service_id: str,
+ incident_name: str,
+ incident_title: str,
+ incident_description: str,
+ **kwargs,
+ ):
raise NotImplementedError
diff --git a/src/dispatch/plugins/bases/participant.py b/src/dispatch/plugins/bases/participant.py
index 7eb07727f74d..7ce6e2cd124b 100644
--- a/src/dispatch/plugins/bases/participant.py
+++ b/src/dispatch/plugins/bases/participant.py
@@ -5,13 +5,12 @@
:license: Apache, see LICENSE for more details.
.. moduleauthor:: Kevin Glisson
"""
+
from dispatch.plugins.base import Plugin
-from dispatch.models import PluginOptionModel
class ParticipantPlugin(Plugin):
type = "participant"
- _schema = PluginOptionModel
def get(self, items, **kwargs):
raise NotImplementedError
diff --git a/src/dispatch/plugins/bases/participant_group.py b/src/dispatch/plugins/bases/participant_group.py
index 36a5d74d3f9d..8dad03035584 100644
--- a/src/dispatch/plugins/bases/participant_group.py
+++ b/src/dispatch/plugins/bases/participant_group.py
@@ -5,13 +5,12 @@
:license: Apache, see LICENSE for more details.
.. moduleauthor:: Kevin Glisson
"""
+
from dispatch.plugins.base import Plugin
-from dispatch.models import PluginOptionModel
class ParticipantGroupPlugin(Plugin):
- type = "participant_group"
- _schema = PluginOptionModel
+ type = "participant-group"
def create(self, participants, **kwargs):
raise NotImplementedError
@@ -21,3 +20,9 @@ def add(self, participant, **kwargs):
def remove(self, participant, **kwargs):
raise NotImplementedError
+
+ def delete(self, group, **kwargs):
+ raise NotImplementedError
+
+ def list(self, group, **kwargs):
+ raise NotImplementedError
diff --git a/src/dispatch/plugins/bases/signal_consumer.py b/src/dispatch/plugins/bases/signal_consumer.py
new file mode 100644
index 000000000000..d3c1a7edfbde
--- /dev/null
+++ b/src/dispatch/plugins/bases/signal_consumer.py
@@ -0,0 +1,16 @@
+"""
+.. module: dispatch.plugins.bases.signal_consumer
+ :platform: Unix
+ :copyright: (c) 2022 by Netflix Inc., see AUTHORS for more
+ :license: Apache, see LICENSE for more details.
+.. moduleauthor:: Kevin Glisson
+"""
+
+from dispatch.plugins.base import Plugin
+
+
+class SignalConsumerPlugin(Plugin):
+ type = "signal-consumer"
+
+ def consume(self, **kwargs):
+ raise NotImplementedError
diff --git a/src/dispatch/plugins/bases/signal_enrichment.py b/src/dispatch/plugins/bases/signal_enrichment.py
new file mode 100644
index 000000000000..361b9bef6b3b
--- /dev/null
+++ b/src/dispatch/plugins/bases/signal_enrichment.py
@@ -0,0 +1,16 @@
+"""
+.. module: dispatch.plugins.bases.signal_enrichment
+ :platform: Unix
+ :copyright: (c) 2022 by Netflix Inc., see AUTHORS for more
+ :license: Apache, see LICENSE for more details.
+.. moduleauthor:: Kevin Glisson
+"""
+
+from dispatch.plugins.base import Plugin
+
+
+class SignalEnrichmentPlugin(Plugin):
+ type = "signal-enrichment"
+
+ def enrich(self, **kwargs):
+ raise NotImplementedError
diff --git a/src/dispatch/plugins/bases/source.py b/src/dispatch/plugins/bases/source.py
new file mode 100644
index 000000000000..512bfd1495fa
--- /dev/null
+++ b/src/dispatch/plugins/bases/source.py
@@ -0,0 +1,16 @@
+"""
+.. module: dispatch.plugins.bases.source
+ :platform: Unix
+ :copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
+ :license: Apache, see LICENSE for more details.
+.. moduleauthor:: Kevin Glisson
+"""
+
+from dispatch.plugins.base import Plugin
+
+
+class SourcePlugin(Plugin):
+ type = "source"
+
+ def get(self, **kwargs):
+ raise NotImplementedError
diff --git a/src/dispatch/plugins/bases/storage.py b/src/dispatch/plugins/bases/storage.py
index 8932f5381735..764f33a7918b 100644
--- a/src/dispatch/plugins/bases/storage.py
+++ b/src/dispatch/plugins/bases/storage.py
@@ -5,13 +5,12 @@
:license: Apache, see LICENSE for more details.
.. moduleauthor:: Kevin Glisson
"""
-from dispatch.models import PluginOptionModel
+
from dispatch.plugins.base import Plugin
class StoragePlugin(Plugin):
type = "storage"
- _schema = PluginOptionModel
def get(self, **kwargs):
raise NotImplementedError
@@ -34,6 +33,12 @@ def add_participant(self, items, **kwargs):
def remove_participant(self, items, **kwargs):
raise NotImplementedError
+ def open(self, **kwargs):
+ raise NotImplementedError
+
+ def mark_readonly(self, folder_id: str, **kwargs):
+ raise NotImplementedError
+
def add_file(self, **kwargs):
raise NotImplementedError
diff --git a/src/dispatch/plugins/bases/tag.py b/src/dispatch/plugins/bases/tag.py
index 8e1cf6ba5120..a1d39b771234 100644
--- a/src/dispatch/plugins/bases/tag.py
+++ b/src/dispatch/plugins/bases/tag.py
@@ -5,13 +5,12 @@
:license: Apache, see LICENSE for more details.
.. moduleauthor:: Kevin Glisson
"""
-from dispatch.models import PluginOptionModel
+
from dispatch.plugins.base import Plugin
class TagPlugin(Plugin):
type = "tag"
- _schema = PluginOptionModel
def get(self, **kwargs):
raise NotImplementedError
diff --git a/src/dispatch/plugins/bases/task.py b/src/dispatch/plugins/bases/task.py
index f48b261bddf9..eebc513f5ae6 100644
--- a/src/dispatch/plugins/bases/task.py
+++ b/src/dispatch/plugins/bases/task.py
@@ -5,13 +5,12 @@
:license: Apache, see LICENSE for more details.
.. moduleauthor:: Kevin Glisson
"""
+
from dispatch.plugins.base import Plugin
-from dispatch.models import PluginOptionModel
class TaskPlugin(Plugin):
type = "task"
- _schema = PluginOptionModel
def get(self, **kwargs):
raise NotImplementedError
diff --git a/src/dispatch/plugins/bases/term.py b/src/dispatch/plugins/bases/term.py
index e4f4ce610fe4..e98e93794a86 100644
--- a/src/dispatch/plugins/bases/term.py
+++ b/src/dispatch/plugins/bases/term.py
@@ -5,13 +5,12 @@
:license: Apache, see LICENSE for more details.
.. moduleauthor:: Kevin Glisson
"""
-from dispatch.models import PluginOptionModel
+
from dispatch.plugins.base import Plugin
class TermPlugin(Plugin):
type = "term"
- _schema = PluginOptionModel
def get(self, **kwargs):
raise NotImplementedError
diff --git a/src/dispatch/plugins/bases/ticket.py b/src/dispatch/plugins/bases/ticket.py
index 03e127b41f0e..ffb71c1616fd 100644
--- a/src/dispatch/plugins/bases/ticket.py
+++ b/src/dispatch/plugins/bases/ticket.py
@@ -5,16 +5,18 @@
:license: Apache, see LICENSE for more details.
.. moduleauthor:: Kevin Glisson
"""
+
from dispatch.plugins.base import Plugin
-from dispatch.models import PluginOptionModel
class TicketPlugin(Plugin):
type = "ticket"
- _schema = PluginOptionModel
def create(self, ticket_id, **kwargs):
raise NotImplementedError
def update(self, ticket_id, **kwargs):
raise NotImplementedError
+
+ def delete(self, ticket_id, **kwargs):
+ raise NotImplementedError
diff --git a/src/dispatch/plugins/bases/workflow.py b/src/dispatch/plugins/bases/workflow.py
new file mode 100644
index 000000000000..7646400ca7f9
--- /dev/null
+++ b/src/dispatch/plugins/bases/workflow.py
@@ -0,0 +1,19 @@
+"""
+.. module: dispatch.plugins.bases.workflow
+ :platform: Unix
+ :copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
+ :license: Apache, see LICENSE for more details.
+.. moduleauthor:: Kevin Glisson
+"""
+
+from dispatch.plugins.base import Plugin
+
+
+class WorkflowPlugin(Plugin):
+ type = "workflow"
+
+ def get_instance(self, workflow_id: str, instance_id: str, **kwargs):
+ raise NotImplementedError
+
+ def run(self, workflow_id: str, params: dict, **kwargs):
+ raise NotImplementedError
diff --git a/src/dispatch/plugins/dispatch_atlassian_confluence/__init__.py b/src/dispatch/plugins/dispatch_atlassian_confluence/__init__.py
new file mode 100644
index 000000000000..ad5cc752c07b
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_atlassian_confluence/__init__.py
@@ -0,0 +1 @@
+from ._version import __version__ # noqa
diff --git a/src/dispatch/plugins/dispatch_atlassian_confluence/_version.py b/src/dispatch/plugins/dispatch_atlassian_confluence/_version.py
new file mode 100644
index 000000000000..f102a9cadfa8
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_atlassian_confluence/_version.py
@@ -0,0 +1 @@
+__version__ = "0.0.1"
diff --git a/src/dispatch/plugins/dispatch_atlassian_confluence/config.py b/src/dispatch/plugins/dispatch_atlassian_confluence/config.py
new file mode 100644
index 000000000000..fca9fe53c27f
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_atlassian_confluence/config.py
@@ -0,0 +1,28 @@
+from pydantic import Field, SecretStr, AnyHttpUrl
+
+from enum import Enum
+from dispatch.config import BaseConfigurationModel
+
+
+class HostingType(str, Enum):
+ """Type of Atlassian Confluence deployment."""
+
+ cloud = "cloud"
+ server = "server"
+
+
+class ConfluenceConfigurationBase(BaseConfigurationModel):
+ """Atlassian Confluence configuration description."""
+
+ api_url: AnyHttpUrl = Field(
+ title="API URL", description="This URL is used for communication with API."
+ )
+ hosting_type: HostingType = Field(
+ "cloud", title="Hosting Type", description="Defines the type of deployment."
+ )
+ username: str = Field(
+ title="Username", description="Username to use to authenticate to Confluence API."
+ )
+ password: SecretStr = Field(
+ title="Password", description="Password to use to authenticate to Confluence API."
+ )
diff --git a/src/dispatch/plugins/dispatch_atlassian_confluence/docs/__init__.py b/src/dispatch/plugins/dispatch_atlassian_confluence/docs/__init__.py
new file mode 100644
index 000000000000..ad5cc752c07b
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_atlassian_confluence/docs/__init__.py
@@ -0,0 +1 @@
+from ._version import __version__ # noqa
diff --git a/src/dispatch/plugins/dispatch_atlassian_confluence/docs/_version.py b/src/dispatch/plugins/dispatch_atlassian_confluence/docs/_version.py
new file mode 100644
index 000000000000..f102a9cadfa8
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_atlassian_confluence/docs/_version.py
@@ -0,0 +1 @@
+__version__ = "0.0.1"
diff --git a/src/dispatch/plugins/dispatch_atlassian_confluence/docs/plugin.py b/src/dispatch/plugins/dispatch_atlassian_confluence/docs/plugin.py
new file mode 100644
index 000000000000..ae92413df341
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_atlassian_confluence/docs/plugin.py
@@ -0,0 +1,51 @@
+from dispatch.plugins.dispatch_atlassian_confluence import docs as confluence_doc_plugin
+from dispatch.plugins.bases import DocumentPlugin
+from dispatch.plugins.dispatch_atlassian_confluence.config import ConfluenceConfigurationBase
+from atlassian import Confluence
+
+
+def replace_content(client: Confluence, document_id: str, replacements: dict[str, str]) -> dict:
+ # read content based on document_id
+ current_content = client.get_page_by_id(
+ document_id, expand="body.storage", status=None, version=None
+ )
+ current_content_body = current_content["body"]["storage"]["value"]
+ for k, v in replacements.items():
+ if v:
+ current_content_body = current_content_body.replace(k, v)
+
+ updated_content = client.update_page(
+ page_id=document_id,
+ title=current_content["title"],
+ body=current_content_body,
+ representation="storage",
+ type="page",
+ parent_id=None,
+ minor_edit=False,
+ full_width=False,
+ )
+ return updated_content
+
+
+class ConfluencePageDocPlugin(DocumentPlugin):
+ title = "Confluence pages plugin - Document Management"
+ slug = "confluence-docs-document"
+ description = "Use Confluence to update the contents."
+ version = confluence_doc_plugin.__version__
+
+ author = "Cino Jose"
+ author_url = "https://github.com/Netflix/dispatch"
+
+ def __init__(self):
+ self.configuration_schema = ConfluenceConfigurationBase
+
+ def update(self, document_id: str, **kwargs):
+ """Replaces text in document."""
+ kwargs = {"{{" + k + "}}": v for k, v in kwargs.items()}
+ confluence_client = Confluence(
+ url=str(self.configuration.api_url),
+ username=self.configuration.username,
+ password=self.configuration.password.get_secret_value(),
+ cloud=self.configuration.hosting_type == "cloud",  # client expects a boolean
+ )
+ return replace_content(confluence_client, document_id, kwargs)
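The `update()` method above wraps each keyword in `{{...}}` and substitutes it into the page body. The templating step can be exercised without a Confluence client; this standalone sketch (function name and sample body are invented) mirrors the replacement loop in `replace_content`:

```python
def render_template(body: str, **kwargs) -> str:
    # Wrap each key in {{...}} placeholders, as update() does above.
    replacements = {"{{" + k + "}}": v for k, v in kwargs.items()}
    for placeholder, value in replacements.items():
        # Skip empty values so placeholders survive for later fills.
        if value:
            body = body.replace(placeholder, value)
    return body


rendered = render_template(
    "<p>Incident: {{name}} ({{priority}})</p>",
    name="sev1-db-outage",
    priority="high",
)
```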
diff --git a/src/dispatch/plugins/dispatch_atlassian_confluence/plugin.py b/src/dispatch/plugins/dispatch_atlassian_confluence/plugin.py
new file mode 100644
index 000000000000..9d841a91d273
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_atlassian_confluence/plugin.py
@@ -0,0 +1,138 @@
+from dispatch.plugins import dispatch_atlassian_confluence as confluence_plugin
+from dispatch.plugins.bases import StoragePlugin
+from dispatch.plugins.dispatch_atlassian_confluence.config import ConfluenceConfigurationBase
+
+from pydantic import Field
+
+from atlassian import Confluence
+import requests
+from requests.auth import HTTPBasicAuth
+import logging
+
+logger = logging.getLogger(__name__)
+
+
+# TODO : Use the common config from the root directory.
+class ConfluenceConfiguration(ConfluenceConfigurationBase):
+ """Confluence configuration description."""
+
+ template_id: str = Field(
+ title="Incident template ID", description="This is the page id of the template."
+ )
+ root_id: str = Field(
+ title="Default Space ID", description="Defines the default Confluence Space to use."
+ )
+ parent_id: str = Field(
+ title="Parent ID for the pages",
+ description="Define the page id of a parent page where all the incident documents can be kept.",
+ )
+ open_on_close: bool = Field(
+ title="Open On Close",
+ default=False,
+ description="Controls the visibility of resources on incident close. If enabled Dispatch will make all resources visible to the entire workspace.",
+ )
+ read_only: bool = Field(
+ title="Readonly",
+ default=False,
+ description="The incident document will be marked as readonly on incident close. Participants will still be able to interact with the document but any other viewers will not.",
+ )
+
+
+class ConfluencePagePlugin(StoragePlugin):
+ title = "Confluence Plugin - Store your incident details"
+ slug = "confluence"
+ description = "Confluence plugin to create incident documents"
+ version = confluence_plugin.__version__
+
+ author = "Cino Jose"
+ author_url = "https://github.com/Netflix/dispatch"
+
+ def __init__(self):
+ self.configuration_schema = ConfluenceConfiguration
+
+ def create_file(
+ self, drive_id: str, name: str, participants: list[str] = None, file_type: str = "folder"
+ ):
+ """Creates a new Home page for the incident documents.."""
+ try:
+ if file_type not in ["document", "folder"]:
+ return None
+ confluence_client = Confluence(
+ url=str(self.configuration.api_url),
+ username=self.configuration.username,
+ password=self.configuration.password.get_secret_value(),
+ cloud=self.configuration.hosting_type == "cloud",  # client expects a boolean
+ )
+ child_display_body = """Incident Documents: modified
+ true """
+ page_details = confluence_client.create_page(
+ drive_id,
+ name,
+ body=child_display_body,
+ parent_id=self.configuration.parent_id,
+ type="page",
+ representation="storage",
+ editor="v2",
+ full_width=False,
+ )
+ return {
+ "weblink": f"{self.configuration.api_url}wiki/spaces/{drive_id}/pages/{page_details['id']}/{name}",
+ "id": page_details["id"],
+ "name": name,
+ "description": "",
+ }
+ except Exception as e:
+ logger.error(f"Exception happened while creating page: {e}")
+
+ def copy_file(self, folder_id: str, file_id: str, name: str):
+ # TODO: This function is responsible for creating the incident documents.
+ try:
+ confluence_client = Confluence(
+ url=str(self.configuration.api_url),
+ username=self.configuration.username,
+ password=self.configuration.password.get_secret_value(),
+ cloud=self.configuration.hosting_type == "cloud",  # client expects a boolean
+ )
+ logger.info(f"Copy_file function with args {folder_id}, {file_id}, {name}")
+ template_content = confluence_client.get_page_by_id(
+ self.configuration.template_id, expand="body.storage", status=None, version=None
+ )
+ page_details = confluence_client.create_page(
+ space=self.configuration.root_id,
+ parent_id=folder_id,
+ title=name,
+ type="page",
+ body=template_content["body"],
+ representation="storage",
+ editor="v2",
+ full_width=False,
+ )
+ if self.configuration.parent_id:
+ """TODO: Find and fix why the page is not created under the parent_id, folder_id"""
+ self.move_file_confluence(page_id_to_move=page_details["id"], parent_id=folder_id)
+ return {
+ "weblink": f"{self.configuration.api_url}wiki/spaces/{folder_id}/pages/{page_details['id']}/{name}",
+ "id": page_details["id"],
+ "name": name,
+ }
+ except Exception as e:
+ logger.error(f"Exception happened while creating page: {e}")
+
+ def move_file(self, new_folder_id: str, file_id: str, **kwargs):
+ """Moves a file from one place to another. Not used in the plugin,
+ keeping the body as the interface is needed to avoid exceptions."""
+ return {}
+
+ def move_file_confluence(self, page_id_to_move: str, parent_id: str):
+ try:
+ url = f"{self.configuration.api_url}wiki/rest/api/content/{page_id_to_move}/move/append/{parent_id}"
+ auth = HTTPBasicAuth(
+ self.configuration.username, self.configuration.password.get_secret_value()
+ )
+ headers = {"Accept": "application/json"}
+ response = requests.request("PUT", url, headers=headers, auth=auth)
+ return response
+ except Exception as e:
+ logger.error(f"Exception happened while moving page: {e}")
diff --git a/src/dispatch/plugins/dispatch_aws/__init__.py b/src/dispatch/plugins/dispatch_aws/__init__.py
new file mode 100644
index 000000000000..ad5cc752c07b
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_aws/__init__.py
@@ -0,0 +1 @@
+from ._version import __version__ # noqa
diff --git a/src/dispatch/plugins/dispatch_aws/_version.py b/src/dispatch/plugins/dispatch_aws/_version.py
new file mode 100644
index 000000000000..3dc1f76bc69e
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_aws/_version.py
@@ -0,0 +1 @@
+__version__ = "0.1.0"
diff --git a/src/dispatch/plugins/dispatch_aws/config.py b/src/dispatch/plugins/dispatch_aws/config.py
new file mode 100644
index 000000000000..d14e3f58ae2a
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_aws/config.py
@@ -0,0 +1,29 @@
+from pydantic import Field
+from dispatch.config import BaseConfigurationModel
+
+
+class AWSSQSConfiguration(BaseConfigurationModel):
+ """Signal SQS configuration"""
+
+ queue_name: str = Field(
+ title="Queue Name",
+ description="Queue Name, not the ARN.",
+ )
+
+ queue_owner: str = Field(
+ title="Queue Owner",
+ description="Queue Owner Account ID.",
+ )
+
+ region: str = Field(
+ title="AWS Region",
+ description="AWS Region.",
+ default="us-east-1",
+ )
+
+ batch_size: int = Field(
+ title="Batch Size",
+ description="Number of messages to retrieve from SQS.",
+ default=10,
+ le=10,
+ )
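The `le=10` bound on `batch_size` mirrors the SQS limit of 10 messages per `ReceiveMessage` call. A pydantic-free sketch of the same check (function name is invented):

```python
def validate_batch_size(value: int) -> int:
    # SQS caps MaxNumberOfMessages at 10 per ReceiveMessage call.
    if not 1 <= value <= 10:
        raise ValueError("batch_size must be between 1 and 10 (SQS limit)")
    return value
```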
diff --git a/src/dispatch/plugins/dispatch_aws/plugin.py b/src/dispatch/plugins/dispatch_aws/plugin.py
new file mode 100644
index 000000000000..4f66bc358cb9
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_aws/plugin.py
@@ -0,0 +1,155 @@
+"""
+.. module: dispatch.plugins.dispatchaws.plugin
+ :platform: Unix
+ :copyright: (c) 2023 by Netflix Inc., see AUTHORS for more
+ :license: Apache, see LICENSE for more details.
+.. moduleauthor:: Kevin Glisson
+"""
+
+import base64
+import json
+import logging
+import zlib
+from typing import TypedDict
+
+import boto3
+from psycopg2.errors import UniqueViolation
+from pydantic import ValidationError
+from sqlalchemy.exc import IntegrityError, ResourceClosedError
+from sqlalchemy.orm import Session
+
+from dispatch.metrics import provider as metrics_provider
+from dispatch.plugins.bases import SignalConsumerPlugin
+from dispatch.plugins.dispatch_aws.config import AWSSQSConfiguration
+from dispatch.project.models import Project
+from dispatch.signal import service as signal_service
+from dispatch.signal.models import SignalInstanceCreate
+
+from . import __version__
+
+log = logging.getLogger(__name__)
+
+
+def decompress_json(compressed_str: str) -> str:
+ """Decompress a base64 encoded zlibed JSON string."""
+ decoded = base64.b64decode(compressed_str)
+ decompressed = zlib.decompress(decoded)
+ return decompressed.decode("utf-8")
+
+
+class SqsEntries(TypedDict):
+ Id: str
+ ReceiptHandle: str
+
+
+class AWSSQSSignalConsumerPlugin(SignalConsumerPlugin):
+ title = "AWS SQS - Signal Consumer"
+ slug = "aws-sqs-signal-consumer"
+ description = "Uses SQS to consume signals."
+ version = __version__
+
+ author = "Netflix"
+ author_url = "https://github.com/netflix/dispatch.git"
+
+ def __init__(self):
+ self.configuration_schema = AWSSQSConfiguration
+
+ def consume(self, db_session: Session, project: Project) -> None:
+ client = boto3.client("sqs", region_name=self.configuration.region)
+ queue_url: str = client.get_queue_url(
+ QueueName=self.configuration.queue_name,
+ QueueOwnerAWSAccountId=self.configuration.queue_owner,
+ )["QueueUrl"]
+
+ while True:
+ response = client.receive_message(
+ QueueUrl=queue_url,
+ MaxNumberOfMessages=self.configuration.batch_size,
+ VisibilityTimeout=40,
+ WaitTimeSeconds=20,
+ )
+ if not response.get("Messages") or len(response["Messages"]) == 0:
+ log.info("No messages received from SQS.")
+ continue
+
+ entries: list[SqsEntries] = []
+ for message in response["Messages"]:
+ try:
+ message_body = json.loads(message["Body"])
+ message_body_message = message_body.get("Message")
+ message_attributes = message_body.get("MessageAttributes", {})
+
+ if message_attributes.get("compressed", {}).get("Value") == "zlib":
+ # Message is compressed, decompress it
+ message_body_message = decompress_json(message_body_message)
+
+ signal_data = json.loads(message_body_message)
+ except Exception as e:
+ log.exception(f"Unable to extract signal data from SQS message: {e}")
+ continue
+
+ try:
+ signal_instance_in = SignalInstanceCreate(
+ project=project, raw=signal_data, **signal_data
+ )
+ except ValidationError as e:
+ log.warning(
+ f"Received a signal instance that does not conform to the SignalInstanceCreate pydantic model. Skipping creation: {e}"
+ )
+ continue
+
+ # if the signal has an existing uuid we check if it already exists
+ if signal_instance_in.raw and signal_instance_in.raw.get("id"):
+ if signal_service.get_signal_instance(
+ db_session=db_session, signal_instance_id=signal_instance_in.raw["id"]
+ ):
+ log.info(
+ f"Received a signal that already exists in the database. Skipping signal instance creation: {signal_instance_in.raw['id']}"
+ )
+ continue
+
+ try:
+ with db_session.begin_nested():
+ signal_instance = signal_service.create_signal_instance(
+ db_session=db_session,
+ signal_instance_in=signal_instance_in,
+ )
+ except IntegrityError as e:
+ if isinstance(e.orig, UniqueViolation):
+ log.info(
+ f"Received a signal that already exists in the database. Skipping signal instance creation: {e}"
+ )
+ else:
+ log.exception(
+ f"Encountered an integrity error when trying to create a signal instance: {e}"
+ )
+ continue
+ except ResourceClosedError as e:
+ log.warning(
+ f"Encountered an error when trying to create a signal instance. The plugin will retry again as the message hasn't been deleted from the SQS queue. Signal name/variant: {signal_instance_in.raw['name'] if signal_instance_in.raw and signal_instance_in.raw['name'] else signal_instance_in.raw['variant']}. Error: {e}"
+ )
+ db_session.rollback()
+ continue
+ except Exception as e:
+ log.exception(
+ f"Encountered an error when trying to create a signal instance. Signal name/variant: {signal_instance_in.raw['name'] if signal_instance_in.raw and signal_instance_in.raw['name'] else signal_instance_in.raw['variant']}. Error: {e}"
+ )
+ db_session.rollback()
+ continue
+ else:
+ metrics_provider.counter(
+ "aws-sqs-signal-consumer.signal.received",
+ tags={
+ "signalName": signal_instance.signal.name,
+ "externalId": signal_instance.signal.external_id,
+ },
+ )
+ log.debug(
+ f"Received a signal with name {signal_instance.signal.name} and id {signal_instance.signal.id}"
+ )
+ entries.append(
+ {"Id": message["MessageId"], "ReceiptHandle": message["ReceiptHandle"]}
+ )
+
+ if entries:
+ client.delete_message_batch(QueueUrl=queue_url, Entries=entries)
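The consumer above calls `decompress_json` on message bodies whose `compressed` attribute is `zlib`. A producer-side round trip can be sketched standalone (the payload fields are invented):

```python
import base64
import json
import zlib


def decompress_json(compressed_str: str) -> str:
    """Decompress a base64 encoded zlibed JSON string (as in the plugin)."""
    decoded = base64.b64decode(compressed_str)
    return zlib.decompress(decoded).decode("utf-8")


# Build a compressed message body of the shape the consumer accepts when
# MessageAttributes marks it as "zlib": zlib-compress, then base64-encode.
payload = json.dumps({"name": "test-signal", "id": "123"})
compressed = base64.b64encode(zlib.compress(payload.encode("utf-8"))).decode("ascii")
roundtrip = json.loads(decompress_json(compressed))
```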
diff --git a/src/dispatch/plugins/dispatch_core/config.py b/src/dispatch/plugins/dispatch_core/config.py
index 29e4546a3142..9fdda55275bf 100644
--- a/src/dispatch/plugins/dispatch_core/config.py
+++ b/src/dispatch/plugins/dispatch_core/config.py
@@ -1,3 +1,20 @@
+import logging
+
+from pydantic import Field
from starlette.config import Config
+from dispatch.config import BaseConfigurationModel
+
+log = logging.getLogger(__name__)
+
config = Config(".env")
+
+
+class DispatchTicketConfiguration(BaseConfigurationModel):
+ """Dispatch ticket configuration"""
+
+ use_incident_name: bool = Field(
+ True,
+ title="Use Incident Name",
+ description="Use the incident name as the ticket title.",
+ )
diff --git a/src/dispatch/plugins/dispatch_core/exceptions.py b/src/dispatch/plugins/dispatch_core/exceptions.py
new file mode 100644
index 000000000000..cefa6f51672b
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_core/exceptions.py
@@ -0,0 +1,34 @@
+class MfaException(Exception):
+ """Base exception for MFA-related errors."""
+
+ pass
+
+
+class InvalidChallengeError(MfaException):
+ """Raised when the challenge is invalid."""
+
+ pass
+
+
+class UserMismatchError(MfaException):
+ """Raised when the challenge doesn't belong to the current user."""
+
+ pass
+
+
+class ActionMismatchError(MfaException):
+ """Raised when the action doesn't match the challenge."""
+
+ pass
+
+
+class ExpiredChallengeError(MfaException):
+ """Raised when the challenge is no longer valid."""
+
+ pass
+
+
+class InvalidChallengeStateError(MfaException):
+ """Raised when the challenge is in an invalid state."""
+
+ pass
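Because every error above subclasses `MfaException`, callers can handle all MFA failures with a single `except`. A standalone sketch (the classes are redeclared locally so the snippet runs without `dispatch`; `validate_challenge` is an invented helper):

```python
# Local redeclarations; the real classes live in
# dispatch.plugins.dispatch_core.exceptions.
class MfaException(Exception):
    """Base exception for MFA-related errors."""


class ExpiredChallengeError(MfaException):
    """Raised when the challenge is no longer valid."""


def validate_challenge(expired: bool) -> str:
    # Invented helper: raises a subclass, caught via the base class below.
    if expired:
        raise ExpiredChallengeError("challenge expired")
    return "approved"


try:
    result = validate_challenge(expired=True)
except MfaException:
    result = "denied"
```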
diff --git a/src/dispatch/plugins/dispatch_core/plugin.py b/src/dispatch/plugins/dispatch_core/plugin.py
index 2494217ae361..43acb74e1e7d 100644
--- a/src/dispatch/plugins/dispatch_core/plugin.py
+++ b/src/dispatch/plugins/dispatch_core/plugin.py
@@ -4,130 +4,589 @@
:copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
:license: Apache, see LICENSE for more details.
"""
-import logging
+import base64
+import json
+import logging
+import time
+from uuid import UUID
+from typing import Literal
import requests
+from cachetools import cached, TTLCache
from fastapi import HTTPException
from fastapi.security.utils import get_authorization_scheme_param
-
from jose import JWTError, jwt
-from starlette.status import HTTP_401_UNAUTHORIZED
+from jose.exceptions import JWKError
+from sqlalchemy.orm import Session
from starlette.requests import Request
+from starlette.status import HTTP_401_UNAUTHORIZED
+from dispatch.auth.models import DispatchUser, MfaChallenge, MfaChallengeStatus, MfaPayload
+from dispatch.case import service as case_service
+from dispatch.config import (
+ DISPATCH_AUTHENTICATION_PROVIDER_AWS_ALB_ARN,
+ DISPATCH_AUTHENTICATION_PROVIDER_AWS_ALB_EMAIL_CLAIM,
+ DISPATCH_AUTHENTICATION_PROVIDER_AWS_ALB_PUBLIC_KEY_CACHE_SECONDS,
+ DISPATCH_AUTHENTICATION_PROVIDER_HEADER_NAME,
+ DISPATCH_AUTHENTICATION_PROVIDER_PKCE_JWKS,
+ DISPATCH_JWT_AUDIENCE,
+ DISPATCH_JWT_EMAIL_OVERRIDE,
+ DISPATCH_JWT_SECRET,
+ DISPATCH_PKCE_DONT_VERIFY_AT_HASH,
+ DISPATCH_UI_URL,
+)
+from dispatch.database.core import Base
+from dispatch.incident import service as incident_service
from dispatch.individual import service as individual_service
+from dispatch.individual.models import IndividualContact, IndividualContactRead
+from dispatch.plugin import service as plugin_service
from dispatch.plugins import dispatch_core as dispatch_plugin
-from dispatch.plugins.base import plugins
from dispatch.plugins.bases import (
- ParticipantPlugin,
- DocumentResolverPlugin,
AuthenticationProviderPlugin,
+ ContactPlugin,
+ MultiFactorAuthenticationPlugin,
+ ParticipantPlugin,
+ TicketPlugin,
)
-
+from dispatch.plugins.dispatch_core.config import DispatchTicketConfiguration
+from dispatch.plugins.dispatch_core.exceptions import (
+ ActionMismatchError,
+ ExpiredChallengeError,
+ InvalidChallengeError,
+ InvalidChallengeStateError,
+ UserMismatchError,
+)
+from dispatch.plugins.dispatch_core.service import create_resource_id
+from dispatch.project import service as project_service
from dispatch.route import service as route_service
-from dispatch.route.models import RouteRequest
-
-from dispatch.config import DISPATCH_AUTHENTICATION_PROVIDER_PKCE_JWKS
+from dispatch.service import service as service_service
+from dispatch.service.models import Service, ServiceRead
+from dispatch.team import service as team_service
+from dispatch.team.models import TeamContact, TeamContactRead
log = logging.getLogger(__name__)
+class BasicAuthProviderPlugin(AuthenticationProviderPlugin):
+ title = "Dispatch Plugin - Basic Authentication Provider"
+ slug = "dispatch-auth-provider-basic"
+ description = "Generic basic authentication provider."
+ version = dispatch_plugin.__version__
+
+ author = "Netflix"
+ author_url = "https://github.com/netflix/dispatch.git"
+ configuration_schema = None
+
+ def get_current_user(self, request: Request, **kwargs):
+ authorization: str = request.headers.get("Authorization")
+ scheme, param = get_authorization_scheme_param(authorization)
+ if not authorization or scheme.lower() != "bearer":
+ log.exception(
+ f"Malformed authorization header. Scheme: {scheme} Param: {param} Authorization: {authorization}"
+ )
+ return
+
+ token = authorization.split()[1]
+
+ try:
+ data = jwt.decode(token, DISPATCH_JWT_SECRET)
+ except (JWKError, JWTError):
+ raise HTTPException(
+ status_code=HTTP_401_UNAUTHORIZED,
+ detail=[{"msg": "Could not validate credentials"}],
+ ) from None
+ return data["email"]
+
+
class PKCEAuthProviderPlugin(AuthenticationProviderPlugin):
- title = "Dispatch - PKCE Authentication Provider"
+ title = "Dispatch Plugin - PKCE Authentication Provider"
slug = "dispatch-auth-provider-pkce"
description = "Generic PCKE authentication provider."
version = dispatch_plugin.__version__
- author = "Kevin Glisson"
+ author = "Netflix"
author_url = "https://github.com/netflix/dispatch.git"
+ configuration_schema = None
def get_current_user(self, request: Request, **kwargs):
credentials_exception = HTTPException(
- status_code=HTTP_401_UNAUTHORIZED, detail="Could not validate credentials"
+ status_code=HTTP_401_UNAUTHORIZED, detail=[{"msg": "Could not validate credentials"}]
)
- authorization: str = request.headers.get("Authorization")
+ authorization: str = request.headers.get(
+ "Authorization", request.headers.get("authorization")
+ )
scheme, param = get_authorization_scheme_param(authorization)
if not authorization or scheme.lower() != "bearer":
raise credentials_exception
token = authorization.split()[1]
- key = requests.get(DISPATCH_AUTHENTICATION_PROVIDER_PKCE_JWKS).json()["keys"][0]
+
+ # Parse out the Key information. Add padding just in case
+ key_info = json.loads(base64.b64decode(token.split(".")[0] + "=========").decode("utf-8"))
+
+ # Grab all possible keys to account for key rotation and find the right key
+ keys = requests.get(DISPATCH_AUTHENTICATION_PROVIDER_PKCE_JWKS).json()["keys"]
+ key = None
+ for potential_key in keys:
+ if potential_key["kid"] == key_info["kid"]:
+ key = potential_key
+ break
+ if key is None:
+ raise credentials_exception
try:
- data = jwt.decode(token, key)
- except JWTError:
+ jwt_opts = {}
+ if DISPATCH_PKCE_DONT_VERIFY_AT_HASH:
+ jwt_opts = {"verify_at_hash": False}
+ # If DISPATCH_JWT_AUDIENCE is defined, then we must include the audience in the decode
+ if DISPATCH_JWT_AUDIENCE:
+ data = jwt.decode(token, key, audience=DISPATCH_JWT_AUDIENCE, options=jwt_opts)
+ else:
+ data = jwt.decode(token, key, options=jwt_opts)
+ except JWTError as err:
+ log.debug("JWT Decode error: {}".format(err))
+ raise credentials_exception from err
+
+ # Support overriding where email is returned in the id token
+ if DISPATCH_JWT_EMAIL_OVERRIDE:
+ return data[DISPATCH_JWT_EMAIL_OVERRIDE]
+ else:
+ return data["email"]
+
+
+class HeaderAuthProviderPlugin(AuthenticationProviderPlugin):
+ title = "Dispatch Plugin - HTTP Header Authentication Provider"
+ slug = "dispatch-auth-provider-header"
+ description = "Authenticate users based on HTTP request header."
+ version = dispatch_plugin.__version__
+
+ author = "Filippo Giunchedi"
+ author_url = "https://github.com/filippog"
+ configuration_schema = None
+
+ def get_current_user(self, request: Request, **kwargs):
+ value: str = request.headers.get(DISPATCH_AUTHENTICATION_PROVIDER_HEADER_NAME)
+ if not value:
+ log.error(
+ f"Unable to authenticate. Header {DISPATCH_AUTHENTICATION_PROVIDER_HEADER_NAME} not found."
+ )
+ raise HTTPException(status_code=HTTP_401_UNAUTHORIZED)
+ return value
+
+
+class AwsAlbAuthProviderPlugin(AuthenticationProviderPlugin):
+ title = "Dispatch Plugin - AWS ALB Authentication Provider"
+ slug = "dispatch-auth-provider-aws-alb"
+ description = "AWS Application Load Balancer authentication provider."
+ version = dispatch_plugin.__version__
+
+ author = "ManyPets"
+ author_url = "https://manypets.com/"
+ configuration_schema = None
+
+ @cached(cache=TTLCache(maxsize=1024, ttl=DISPATCH_AUTHENTICATION_PROVIDER_AWS_ALB_PUBLIC_KEY_CACHE_SECONDS))
+ def get_public_key(self, kid: str, region: str):
+ log.debug("Cache miss. Requesting key from AWS endpoint.")
+ url = f"https://public-keys.auth.elb.{region}.amazonaws.com/{kid}"
+ req = requests.get(url)
+ return req.text
+
+ def get_current_user(self, request: Request, **kwargs):
+ credentials_exception = HTTPException(
+ status_code=HTTP_401_UNAUTHORIZED, detail=[{"msg": "Could not validate credentials"}]
+ )
+
+ encoded_jwt: str = request.headers.get('x-amzn-oidc-data')
+ if not encoded_jwt:
+ log.error(
+ "Unable to authenticate. Header x-amzn-oidc-data not found."
+ )
raise credentials_exception
- return data["email"]
+ log.debug(f"Header x-amzn-oidc-data header received: {encoded_jwt}")
+
+ # Validate the signer
+ jwt_headers = encoded_jwt.split('.')[0]
+ decoded_jwt_headers = base64.b64decode(jwt_headers)
+ decoded_json = json.loads(decoded_jwt_headers)
+ received_alb_arn = decoded_json['signer']
+
+ if received_alb_arn != DISPATCH_AUTHENTICATION_PROVIDER_AWS_ALB_ARN:
+ log.error(
+ f"Unable to authenticate. ALB ARN {received_alb_arn} does not match expected ARN {DISPATCH_AUTHENTICATION_PROVIDER_AWS_ALB_ARN}"
+ )
+ raise credentials_exception
+
+ # Get the key id from JWT headers (the kid field)
+ kid = decoded_json['kid']
+
+ # Get the region from the ARN
+ region = DISPATCH_AUTHENTICATION_PROVIDER_AWS_ALB_ARN.split(':')[3]
+
+ # Get the public key from regional endpoint
+ log.debug(f"Getting public key for kid {kid} in region {region}.")
+ pub_key = self.get_public_key(kid, region)
+
+ # Get the payload
+ log.debug(f"Decoding {encoded_jwt} with public key {pub_key}.")
+ payload = jwt.decode(encoded_jwt, pub_key, algorithms=['ES256'])
+ return payload[DISPATCH_AUTHENTICATION_PROVIDER_AWS_ALB_EMAIL_CLAIM]
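The ALB plugin above derives the AWS region from field four of the colon-separated load balancer ARN. A standalone check of that parsing (the ARN is a made-up example):

```python
# The region sits in the fourth colon-separated field of an ALB ARN,
# matching DISPATCH_AUTHENTICATION_PROVIDER_AWS_ALB_ARN.split(":")[3] above.
arn = "arn:aws:elasticloadbalancing:eu-west-1:123456789012:loadbalancer/app/demo/abc123"
region = arn.split(":")[3]
```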
-class DispatchDocumentResolverPlugin(DocumentResolverPlugin):
- title = "Dispatch - Document Resolver"
- slug = "dispatch-document-resolver"
- description = "Uses dispatch itself to resolve incident documents."
+
+class DispatchTicketPlugin(TicketPlugin):
+ title = "Dispatch Plugin - Ticket Management"
+ slug = "dispatch-ticket"
+ description = "Uses Dispatch itself to create a ticket."
version = dispatch_plugin.__version__
- author = "Kevin Glisson"
+ author = "Netflix"
author_url = "https://github.com/netflix/dispatch.git"
- def get(
- self, incident_type: str, incident_priority: str, incident_description: str, db_session=None
+ def __init__(self):
+ self.configuration_schema = DispatchTicketConfiguration
+
+ def create(
+ self,
+ incident_id: int,
+ title: str,
+ commander_email: str,
+ reporter_email: str,
+ plugin_metadata: dict,
+ db_session=None,
):
- """Fetches documents from Dispatch."""
- route_in = {
- "text": incident_description,
- "context": {
- "incident_priorities": [incident_priority],
- "incident_types": [incident_type],
- "terms": [],
- },
+ """Creates a Dispatch incident ticket."""
+ incident = incident_service.get(db_session=db_session, incident_id=incident_id)
+
+ if self.configuration and self.configuration.use_incident_name:
+ resource_id = create_resource_id(f"{incident.project.slug}-{title}-{incident.id}")
+ else:
+ resource_id = f"dispatch-{incident.project.organization.slug}-{incident.project.slug}-{incident.id}"
+
+ return {
+ "resource_id": resource_id,
+ "weblink": f"{DISPATCH_UI_URL}/{incident.project.organization.name}/incidents/{resource_id}?project={incident.project.name}",
+ "resource_type": "dispatch-internal-ticket",
}
- route_in = RouteRequest(**route_in)
- recommendation = route_service.get(db_session=db_session, route_in=route_in)
- return recommendation.documents
+ def update(
+ self,
+ ticket_id: str,
+ title: str,
+ description: str,
+ incident_type: str,
+ incident_severity: str,
+ incident_priority: str,
+ status: str,
+ commander_email: str,
+ reporter_email: str,
+ conversation_weblink: str,
+ document_weblink: str,
+ storage_weblink: str,
+ conference_weblink: str,
+ dispatch_weblink: str,
+ cost: float,
+ incident_type_plugin_metadata: dict = None,
+ ):
+ """Updates a Dispatch incident ticket."""
+ return
+ def delete(
+ self,
+ ticket_id: str,
+ ):
+ """Deletes a Dispatch ticket."""
+ return
-class DispatchParticipantPlugin(ParticipantPlugin):
- title = "Dispatch - Participants"
- slug = "dispatch-participants"
- description = "Uses dispatch itself to determine participants."
+ def create_case_ticket(
+ self,
+ case_id: int,
+ title: str,
+ assignee_email: str,
+ # reporter: str,
+ case_type_plugin_metadata: dict,
+ db_session=None,
+ ):
+ """Creates a Dispatch case ticket."""
+ case = case_service.get(db_session=db_session, case_id=case_id)
+
+ resource_id = f"dispatch-{case.project.organization.slug}-{case.project.slug}-{case.id}"
+
+ return {
+ "resource_id": resource_id,
+ "weblink": f"{DISPATCH_UI_URL}/{case.project.organization.name}/cases/{resource_id}?project={case.project.name}",
+ "resource_type": "dispatch-internal-ticket",
+ }
+
+ def update_metadata(
+ self,
+ ticket_id: str,
+ metadata: dict,
+ ):
+ """Updates the metadata of a Dispatch ticket."""
+ return
+
+ def update_case_ticket(
+ self,
+ ticket_id: str,
+ title: str,
+ description: str,
+ resolution: str,
+ case_type: str,
+ case_severity: str,
+ case_priority: str,
+ status: str,
+ assignee_email: str,
+ # reporter_email: str,
+ document_weblink: str,
+ storage_weblink: str,
+ dispatch_weblink: str,
+ case_type_plugin_metadata: dict = None,
+ ):
+ """Updates a Dispatch case ticket."""
+ return
+
+ def create_task_ticket(
+ self,
+ task_id: int,
+ title: str,
+ assignee_email: str,
+ reporter_email: str,
+ incident_ticket_key: str = None,
+ task_plugin_metadata: dict = None,
+ db_session=None,
+ ):
+ """Creates a Dispatch task ticket."""
+ return {
+ "resource_id": "",
+ "weblink": "https://dispatch.example.com",
+ }
+
+
+class DispatchMfaPlugin(MultiFactorAuthenticationPlugin):
+ title = "Dispatch Plugin - Multi Factor Authentication"
+ slug = "dispatch-auth-mfa"
+ description = "Uses dispatch itself to validate external requests."
version = dispatch_plugin.__version__
- author = "Kevin Glisson"
+ author = "Netflix"
author_url = "https://github.com/netflix/dispatch.git"
+ configuration_schema = None
+
+ def wait_for_challenge(
+ self,
+ challenge_id: UUID,
+ db_session: Session,
+ timeout: int = 300,
+ ) -> MfaChallengeStatus:
+ """Waits for a multi-factor authentication challenge."""
+ start_time = time.time()
+
+ while time.time() - start_time < timeout:
+ db_session.expire_all()
+ challenge = db_session.query(MfaChallenge).filter_by(challenge_id=challenge_id).first()
+
+ if not challenge:
+ log.error(f"Challenge not found: {challenge_id}")
+ raise Exception("Challenge not found.")
+
+ if challenge.status == MfaChallengeStatus.APPROVED:
+ return MfaChallengeStatus.APPROVED
+ elif challenge.status == MfaChallengeStatus.DENIED:
+ raise Exception("Challenge denied.")
+
+ time.sleep(1)
+
+ # Timeout reached
+ log.warning(f"Timeout reached for challenge: {challenge_id}")
+
+ # Update the challenge status to EXPIRED if it times out
+ challenge = db_session.query(MfaChallenge).filter_by(challenge_id=challenge_id).first()
+ if challenge:
+ log.info(f"Updating challenge {challenge_id} status to EXPIRED")
+ challenge.status = MfaChallengeStatus.EXPIRED
+ db_session.commit()
+ else:
+ log.error(f"Challenge not found when trying to expire: {challenge_id}")
+
+ return MfaChallengeStatus.EXPIRED
+
+ def create_mfa_challenge(
+ self,
+ action: str,
+ current_user: DispatchUser,
+ db_session: Session,
+ project_id: int,
+ ) -> tuple[MfaChallenge, str]:
+ """Creates a multi-factor authentication challenge."""
+ project = project_service.get(db_session=db_session, project_id=project_id)
+
+ challenge = MfaChallenge(
+ action=action,
+ dispatch_user_id=current_user.id,
+ valid=True,
+ )
+ db_session.add(challenge)
+ db_session.commit()
+
+ org_slug = project.organization.slug if project.organization else "default"
+
+ challenge_url = f"{DISPATCH_UI_URL}/{org_slug}/mfa?project_id={project_id}&challenge_id={challenge.challenge_id}&action={action}"
+ return challenge, challenge_url
+
+ def validate_mfa_token(
+ self,
+ payload: MfaPayload,
+ current_user: DispatchUser,
+ db_session: Session,
+ ) -> Literal[MfaChallengeStatus.APPROVED]:
+ """Validates a multi-factor authentication token."""
+ challenge: MfaChallenge | None = (
+ db_session.query(MfaChallenge)
+ .filter_by(challenge_id=payload.challenge_id)
+ .one_or_none()
+ )
+
+ if not challenge:
+ raise InvalidChallengeError("Invalid challenge ID")
+ if challenge.dispatch_user_id != current_user.id:
+ raise UserMismatchError(
+ f"Challenge does not belong to the current user: {current_user.email}"
+ )
+ if challenge.action != payload.action:
+ raise ActionMismatchError("Action mismatch")
+ if not challenge.valid:
+ raise ExpiredChallengeError("Challenge is no longer valid")
+ if challenge.status == MfaChallengeStatus.APPROVED:
+ # Challenge has already been approved
+ return challenge.status
+ if challenge.status != MfaChallengeStatus.PENDING:
+ raise InvalidChallengeStateError(f"Challenge is in invalid state: {challenge.status}")
+
+ challenge.status = MfaChallengeStatus.APPROVED
+ db_session.add(challenge)
+ db_session.commit()
+
+ return challenge.status
+
+ def send_push_notification(self, items, **kwargs):
+ # Implement this method if needed
+ raise NotImplementedError
+
+ def validate_mfa(self, items, **kwargs):
+ # Implement this method if needed
+ raise NotImplementedError
+
+
+class DispatchContactPlugin(ContactPlugin):
+ title = "Dispatch Plugin - Contact plugin"
+ slug = "dispatch-contact"
+ description = "Uses dispatch itself to fetch incident participants' contact info."
+ version = dispatch_plugin.__version__
+
+ author = "Netflix"
+ author_url = "https://github.com/netflix/dispatch.git"
+ configuration_schema = None
+
+ def get(self, email, db_session=None):
+ individual = individual_service.get_by_email_and_project(
+ db_session=db_session, email=email, project_id=self.project_id
+ )
+ if individual is None:
+ return {"email": email, "fullname": email}
+
+ data = individual.dict()
+ data["fullname"] = data["name"]
+ return data
+
+
+class DispatchParticipantResolverPlugin(ParticipantPlugin):
+ title = "Dispatch Plugin - Participant Resolver"
+ slug = "dispatch-participant-resolver"
+ description = "Uses dispatch itself to resolve incident participants."
+ version = dispatch_plugin.__version__
+
+ author = "Netflix"
+ author_url = "https://github.com/netflix/dispatch.git"
+ configuration_schema = None
def get(
self,
- incident_type: str,
- incident_priority: str,
- incident_description: str,
+ project_id: int,
+ class_instance: Base,
db_session=None,
):
"""Fetches participants from Dispatch."""
- route_in = {
- "text": incident_description,
- "context": {
- "incident_priorities": [incident_priority.__dict__],
- "incident_types": [incident_type.__dict__],
- "terms": [],
- },
- }
-
- route_in = RouteRequest(**route_in)
- recommendation = route_service.get(db_session=db_session, route_in=route_in)
+ models = [
+ (IndividualContact, IndividualContactRead),
+ (Service, ServiceRead),
+ (TeamContact, TeamContactRead),
+ ]
+ recommendation = route_service.get(
+ db_session=db_session,
+ project_id=project_id,
+ class_instance=class_instance,
+ models=models,
+ )
log.debug(f"Recommendation: {recommendation}")
- # we need to resolve our service contacts to individuals
- for s in recommendation.service_contacts:
- p = plugins.get(s.type)
- log.debug(f"Resolving service contact. ServiceContact: {s}")
- individual_email = p.get(s.external_id)
-
- individual = individual_service.get_or_create(
- db_session=db_session, email=individual_email,
- )
- recommendation.individual_contacts.append(individual)
+
+ individual_contacts = []
+ team_contacts = []
+ for match in recommendation.matches:
+ if match.resource_type == TeamContact.__name__:
+ team = team_service.get_or_create(
+ db_session=db_session,
+ email=match.resource_state["email"],
+ project=class_instance.project,
+ )
+ team_contacts.append(team)
+
+ if match.resource_type == IndividualContact.__name__:
+ individual = individual_service.get_or_create(
+ db_session=db_session,
+ email=match.resource_state["email"],
+ project=class_instance.project,
+ )
+
+ individual_contacts.append((individual, None))
+
+ # we need to do more work when we have a service
+ if match.resource_type == Service.__name__:
+ plugin_instance = plugin_service.get_active_instance_by_slug(
+ db_session=db_session,
+ slug=match.resource_state["type"],
+ project_id=project_id,
+ )
+
+ if plugin_instance:
+ if plugin_instance.enabled:
+ log.debug(
+ f"Resolving service contact. ServiceContact: {match.resource_state}"
+ )
+ # ensure that service is enabled
+ service = service_service.get_by_external_id_and_project_id(
+ db_session=db_session,
+ external_id=match.resource_state["external_id"],
+ project_id=project_id,
+ )
+ if service.is_active:
+ individual_email = plugin_instance.instance.get(
+ match.resource_state["external_id"]
+ )
+
+ individual = individual_service.get_or_create(
+ db_session=db_session,
+ email=individual_email,
+ project=class_instance.project,
+ )
+
+ individual_contacts.append((individual, match.resource_state["id"]))
+ else:
+ log.warning(
+ f"Skipping service contact. Service: {match.resource_state['name']} Reason: Associated service plugin not enabled."
+ )
+ else:
+ log.warning(
+ f"Skipping service contact. Service: {match.resource_state['name']} Reason: Associated service plugin not found."
+ )
db_session.commit()
- return list(recommendation.individual_contacts), list(recommendation.team_contacts)
+ return individual_contacts, team_contacts
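Among the methods above, `DispatchMfaPlugin.wait_for_challenge` is a classic poll-with-timeout loop: refresh the session cache, re-read the row, return on a terminal status, otherwise sleep and retry, expiring the challenge when the deadline passes. Stripped of the ORM, the control flow reduces to this sketch (the `wait_for` helper and its return values are illustrative, not part of the plugin API):

```python
import time


def wait_for(check, timeout: float = 5.0, interval: float = 0.01):
    """Poll `check()` until it returns a terminal value or the timeout elapses."""
    start = time.time()
    while time.time() - start < timeout:
        result = check()
        if result is not None:
            return result
        time.sleep(interval)
    return "expired"  # mirrors marking the challenge EXPIRED on timeout


# A challenge that is approved on the third poll:
polls = iter([None, None, "approved"])
print(wait_for(lambda: next(polls)))  # approved
```

The one-second sleep in the real method bounds database load while keeping approval latency acceptable for an interactive MFA prompt.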
diff --git a/src/dispatch/plugins/dispatch_core/service.py b/src/dispatch/plugins/dispatch_core/service.py
new file mode 100644
index 000000000000..b349849b53d7
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_core/service.py
@@ -0,0 +1,20 @@
+import re
+
+
+def create_resource_id(title: str) -> str:
+ """Creates a Slack-friendly resource id from the incident title."""
+ resource_id = title.lower()
+
+ # Replace any character that is not a lowercase letter or number with a hyphen
+ resource_id = re.sub(r"[^a-z0-9]", "-", resource_id)
+
+ # Replace multiple consecutive hyphens with a single hyphen
+ resource_id = re.sub(r"-+", "-", resource_id)
+
+ # Ensure the channel name is not longer than 80 characters
+ resource_id = resource_id[:80]
+
+ # Remove leading or trailing hyphens
+ resource_id = resource_id.strip("-")
+
+ return resource_id
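The slug logic in `create_resource_id` is deterministic and easy to check. A quick standalone sketch, re-declaring the helper so it runs outside the repo:

```python
import re


def create_resource_id(title: str) -> str:
    """Creates a Slack-friendly resource id from the incident title."""
    resource_id = title.lower()
    # Replace any character that is not a lowercase letter or number with a hyphen
    resource_id = re.sub(r"[^a-z0-9]", "-", resource_id)
    # Collapse consecutive hyphens, cap at 80 chars, trim edge hyphens
    resource_id = re.sub(r"-+", "-", resource_id)[:80].strip("-")
    return resource_id


print(create_resource_id("dispatch-My Incident: DB outage!!-42"))
# → dispatch-my-incident-db-outage-42
```

Note the truncation happens before the edge-hyphen strip, so a title cut mid-word can never leave a trailing hyphen in the channel name.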
diff --git a/src/dispatch/plugins/dispatch_duo/__init__.py b/src/dispatch/plugins/dispatch_duo/__init__.py
new file mode 100644
index 000000000000..ad5cc752c07b
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_duo/__init__.py
@@ -0,0 +1 @@
+from ._version import __version__ # noqa
diff --git a/src/dispatch/plugins/dispatch_duo/_version.py b/src/dispatch/plugins/dispatch_duo/_version.py
new file mode 100644
index 000000000000..3dc1f76bc69e
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_duo/_version.py
@@ -0,0 +1 @@
+__version__ = "0.1.0"
diff --git a/src/dispatch/plugins/dispatch_duo/config.py b/src/dispatch/plugins/dispatch_duo/config.py
new file mode 100644
index 000000000000..80098c69873f
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_duo/config.py
@@ -0,0 +1,18 @@
+from pydantic import Field, SecretStr
+from dispatch.config import BaseConfigurationModel
+
+
+class DuoConfiguration(BaseConfigurationModel):
+ """Duo configuration description."""
+
+ integration_key: SecretStr = Field(
+ title="Integration Key", description="Admin API integration key ('DI...'):"
+ )
+ integration_secret_key: SecretStr = Field(
+ title="Integration Secret Key",
+ description="Secret token used in conjunction with integration key.",
+ )
+ host: str = Field(
+ title="API Hostname",
+ description="API hostname ('api-....duosecurity.com'): ",
+ )
diff --git a/src/dispatch/plugins/dispatch_duo/enums.py b/src/dispatch/plugins/dispatch_duo/enums.py
new file mode 100644
index 000000000000..0cfd1362f4ab
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_duo/enums.py
@@ -0,0 +1,10 @@
+from dispatch.enums import DispatchEnum
+
+
+class PushResponseResult(DispatchEnum):
+ allow = "allow"
+ deny = "deny"
+ fraud = "fraud"
+ failed = "push_failed"
+ timeout = "timeout"
+ user_not_found = "user_not_found"
diff --git a/src/dispatch/plugins/dispatch_duo/plugin.py b/src/dispatch/plugins/dispatch_duo/plugin.py
new file mode 100644
index 000000000000..cecd15fe1601
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_duo/plugin.py
@@ -0,0 +1,120 @@
+"""
+.. module: dispatch.plugins.dispatch_duo.plugin
+ :platform: Unix
+ :copyright: (c) 2023 by Netflix Inc., see AUTHORS for more
+ :license: Apache, see LICENSE for more details.
+.. moduleauthor:: Will Sheldon
+"""
+
+import logging
+from typing import NewType
+
+from dispatch.decorators import apply, counter, timer
+from dispatch.plugins.bases import MultiFactorAuthenticationPlugin
+from dispatch.plugins.dispatch_duo.enums import PushResponseResult
+from dispatch.plugins.dispatch_duo import service as duo_service
+from dispatch.plugins.dispatch_duo.config import DuoConfiguration
+from . import __version__
+
+log = logging.getLogger(__name__)
+
+
+DuoAuthResponse = NewType(
+ "DuoAuthResponse",
+ dict[
+ str,
+ dict[str, str] | str,
+ ],
+)
+
+
+@apply(timer, exclude=["__init__"])
+@apply(counter, exclude=["__init__"])
+class DuoMfaPlugin(MultiFactorAuthenticationPlugin):
+ title = "Duo Plugin - Multi Factor Authentication"
+ slug = "duo-auth-mfa"
+ description = "Uses Duo to validate user actions with multi-factor authentication."
+ version = __version__
+
+ author = "Netflix"
+ author_url = "https://github.com/netflix/dispatch.git"
+
+ def __init__(self):
+ self.configuration_schema = DuoConfiguration
+
+ def send_push_notification(
+ self,
+ username: str,
+ type: str,
+ device: str = "auto",
+ ) -> DuoAuthResponse:
+ """Create a new push notification for authentication.
+
+ This function sends a push notification to a Duo-enabled device for multi-factor authentication.
+
+ Args:
+ username (str): The unique identifier for the user, commonly specified by your application during user
+ creation (e.g. wshel@netflix.com). This value may also represent a username alias assigned to a user (e.g. wshel).
+
+ type (str): A string that is displayed in the Duo Mobile app push notification and UI. The notification text
+ changes to "Verify request" and shows your customized string followed by a colon and the application's name,
+ and the request details screen also shows your customized string and the application's name.
+
+ device (str, optional): The ID of the device. This device must have the "push" capability. Defaults to "auto"
+ to use the first of the user's devices with the "push" capability.
+
+ Returns:
+ DuoAuthResponse: The response from the Duo API. A successful response would appear as:
+ {"response": {"result": "allow", "status": "allow", "status_msg": "Success. Logging you in..."}, "stat": "OK"}
+
+ Example:
+ >>> plugin = DuoMfaPlugin()
+ >>> result = plugin.send_push_notification(username='wshel@netflix.com', type='Login Request')
+ >>> result
+ {'response': {'result': 'allow', 'status': 'allow', 'status_msg': 'Success. Logging you in...'}, 'stat': 'OK'}
+
+ Notes:
+ For more information, see https://duo.com/docs/authapi#/auth
+ """
+ duo_client = duo_service.create_duo_auth_client(self.configuration)
+ userstatus = duo_client.preauth(username=username)
+ response = {}
+
+ if userstatus["result"] == "enroll" and "@" in username:
+ username, domain = username.split("@")
+ userstatus = duo_client.preauth(username=username)
+
+ if userstatus["result"] == "enroll":
+ log.warning(f"Sending push notification failed. Unable to find {username} in Duo")
+ return PushResponseResult.user_not_found
+ elif userstatus["result"] == "deny":
+ return PushResponseResult.deny
+ elif userstatus["result"] == "allow":
+ return PushResponseResult.allow
+ elif userstatus["result"] == "auth":
+ push_devs = [
+ row.get("device")
+ for row in userstatus.get("devices")
+ if "push" in row.get("capabilities", [])
+ ]
+ if len(push_devs) < 1:
+ log.error(f"ERROR: Duo account found for {username}, but no devices support Push")
+ return PushResponseResult.deny
+ try:
+ response = duo_client.auth(
+ factor="push", username=username, device=device, type=type
+ )
+ except RuntimeError as e:
+ log.error(f"ERROR: Runtime Error during Duo Push: {e}")
+ return PushResponseResult.deny
+ else:
+ log.error(f"ERROR: Unexpected user status from Duo during push: {userstatus}")
+ return PushResponseResult.deny
+
+ if response.get("result") == PushResponseResult.allow:
+ return PushResponseResult.allow
+
+ if response.get("status") == PushResponseResult.timeout:
+ return PushResponseResult.timeout
+
+ return response
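The branching on Duo's `preauth` result in `send_push_notification` reduces to a small decision table. A standalone sketch of just that mapping (the string values mirror `PushResponseResult`; `"push_required"` is a placeholder of this sketch, since the real plugin goes on to call `auth(factor="push", ...)`, retries `enroll` with the username's local part, and inspects device capabilities):

```python
def map_preauth_result(result: str, has_push_device: bool = True) -> str:
    """Map a Duo preauth 'result' to the plugin's response, per the logic above."""
    if result == "enroll":
        return "user_not_found"  # user unknown to Duo even after the alias retry
    if result == "deny":
        return "deny"
    if result == "allow":
        return "allow"  # policy bypass: no push needed
    if result == "auth":
        # a push is only attempted when a device advertises the "push" capability
        return "push_required" if has_push_device else "deny"
    return "deny"  # unexpected status: fail closed


print(map_preauth_result("auth"))         # push_required
print(map_preauth_result("auth", False))  # deny
```

Failing closed on unexpected statuses matches the plugin's behavior of returning `PushResponseResult.deny` for anything it does not recognize.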
diff --git a/src/dispatch/plugins/dispatch_duo/service.py b/src/dispatch/plugins/dispatch_duo/service.py
new file mode 100644
index 000000000000..1ae3750d8ee6
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_duo/service.py
@@ -0,0 +1,13 @@
+import duo_client
+from duo_client.auth import Auth
+
+from dispatch.plugins.dispatch_duo.config import DuoConfiguration
+
+
+def create_duo_auth_client(config: DuoConfiguration) -> Auth:
+ """Creates a Duo Auth API client."""
+ return duo_client.Auth(
+ ikey=config.integration_key.get_secret_value(),
+ skey=config.integration_secret_key.get_secret_value(),
+ host=config.host,
+ )
diff --git a/src/dispatch/plugins/dispatch_github/__init__.py b/src/dispatch/plugins/dispatch_github/__init__.py
new file mode 100644
index 000000000000..ad5cc752c07b
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_github/__init__.py
@@ -0,0 +1 @@
+from ._version import __version__ # noqa
diff --git a/src/dispatch/plugins/dispatch_github/_version.py b/src/dispatch/plugins/dispatch_github/_version.py
new file mode 100644
index 000000000000..3dc1f76bc69e
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_github/_version.py
@@ -0,0 +1 @@
+__version__ = "0.1.0"
diff --git a/src/dispatch/plugins/dispatch_github/plugin.py b/src/dispatch/plugins/dispatch_github/plugin.py
new file mode 100644
index 000000000000..ceb0a21eaf8b
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_github/plugin.py
@@ -0,0 +1,110 @@
+"""
+.. module: dispatch.plugins.dispatch_github.plugin
+ :platform: Unix
+ :copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
+ :license: Apache, see LICENSE for more details.
+"""
+
+import platform
+import re
+from re import Pattern
+import sys
+import requests
+from datetime import datetime
+from tenacity import TryAgain, retry, stop_after_attempt, wait_fixed
+
+from . import __version__
+from dispatch.config import BaseConfigurationModel
+from dispatch.decorators import apply, counter, timer
+from dispatch.plugins import dispatch_github as github_plugin
+from dispatch.plugins.bases.monitor import MonitorPlugin
+
+
+class GithubConfiguration(BaseConfigurationModel):
+ pass
+
+
+def create_ua_string():
+ client_name = __name__.split(".")[0]
+ client_version = __version__ # Version is returned from _version.py
+
+ # Collect the package info, Python version and OS version.
+ package_info = {
+ "client": "{0}/{1}".format(client_name, client_version),
+ "python": "Python/{v.major}.{v.minor}.{v.micro}".format(v=sys.version_info),
+ "system": "{0}/{1}".format(platform.system(), platform.release()),
+ }
+
+ # Concatenate and format the user-agent string to be passed into request headers
+ ua_string = []
+ for _, val in package_info.items():
+ ua_string.append(val)
+
+ return " ".join(ua_string)
+
+
+# NOTE we don't yet support enterprise github
+@apply(counter, exclude=["__init__"])
+@apply(timer, exclude=["__init__"])
+class GithubMonitorPlugin(MonitorPlugin):
+ title = "Github Plugin - Github Monitoring"
+ slug = "github-monitor"
+ description = "Allows for the monitoring of Github issues and PRs."
+ version = github_plugin.__version__
+
+ author = "Netflix"
+ author_url = "https://github.com/netflix/dispatch.git"
+
+ def __init__(self):
+ self.configuration_schema = GithubConfiguration
+
+ def get_matchers(self, *args, **kwargs) -> list[Pattern]:
+ """Returns a list of regexes that this monitor plugin should look for in chat messages."""
+ matchers = [
+ r"(?P<weblink>https:\/\/github.com\/(?P<organization>[a-zA-Z0-9-_]+)*\/(?P<repo>[a-zA-Z0-9-_]+)*\/(?P<type>pull|issues)*\/(?P<id>\w+)*)"
+ ]
+ return [re.compile(r) for r in matchers]
+
+ @retry(stop=stop_after_attempt(3), wait=wait_fixed(2))
+ def get_match_status(self, weblink: str, last_modified: datetime = None, **kwargs) -> dict:
+ """Fetches the match and attempts to determine current status."""
+ # determine what kind of link we have
+ base_url = "https://api.github.com/repos"
+
+ # NOTE I'm not sure about this logic
+ match_data = None
+ for matcher in self.get_matchers():
+ for match in matcher.finditer(f"<{weblink}>"):
+ match_data = match.groupdict()
+ if match_data:
+ break
+
+ if match_data["type"] == "pull":
+ # for some reason the api and the front end differ for PRs
+ match_data["type"] = match_data["type"].replace("pull", "pulls")
+
+ request_url = f"{base_url}/{match_data['organization']}/{match_data['repo']}/{match_data['type']}/{match_data['id']}"
+
+ # use conditional requests to avoid rate limits
+ # https://docs.github.com/en/rest/overview/resources-in-the-rest-api#conditional-requests
+ headers = {"User-Agent": create_ua_string()}
+ if last_modified:
+ headers.update({"If-Modified-Since": str(last_modified)})
+
+ resp = requests.get(request_url, headers=headers)
+
+ if resp.status_code == 304:
+ # no updates
+ return
+
+ if resp.status_code == 403:
+ raise TryAgain
+
+ if resp.status_code == 200:
+ data = resp.json()
+ monitor_data = {
+ "title": data["title"],
+ "state": data["state"],
+ }
+
+ return monitor_data
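The matcher returned by `get_matchers` relies on Python named groups (the angle-bracketed group names render poorly in some diff viewers). A quick check that a PR link decomposes into exactly the fields `get_match_status` consumes:

```python
import re

pattern = re.compile(
    r"(?P<weblink>https:\/\/github.com\/(?P<organization>[a-zA-Z0-9-_]+)*"
    r"\/(?P<repo>[a-zA-Z0-9-_]+)*\/(?P<type>pull|issues)*\/(?P<id>\w+)*)"
)

# get_match_status wraps the weblink in <> before scanning, as Slack does
match = next(pattern.finditer("<https://github.com/netflix/dispatch/pull/1234>"))
print(match.groupdict())
# organization=netflix, repo=dispatch, type=pull, id=1234
```

The `type` group captures `pull` or `issues`, which is why the plugin rewrites `pull` to `pulls` before building the REST URL: the web UI and the API disagree on that path segment.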
diff --git a/src/dispatch/plugins/dispatch_google/calendar/__init__.py b/src/dispatch/plugins/dispatch_google/calendar/__init__.py
index e69de29bb2d1..ad5cc752c07b 100644
--- a/src/dispatch/plugins/dispatch_google/calendar/__init__.py
+++ b/src/dispatch/plugins/dispatch_google/calendar/__init__.py
@@ -0,0 +1 @@
+from ._version import __version__ # noqa
diff --git a/src/dispatch/plugins/dispatch_google/calendar/config.py b/src/dispatch/plugins/dispatch_google/calendar/config.py
deleted file mode 100644
index f505d44ba1c0..000000000000
--- a/src/dispatch/plugins/dispatch_google/calendar/config.py
+++ /dev/null
@@ -1,5 +0,0 @@
-from starlette.config import Config
-
-config = Config(".env")
-
-GOOGLE_CALENDAR_ROOM_EMAIL = config("GOOGLE_CALENDAR_ROOM_EMAIL")
diff --git a/src/dispatch/plugins/dispatch_google/calendar/plugin.py b/src/dispatch/plugins/dispatch_google/calendar/plugin.py
index e561f95312a9..49266717e832 100644
--- a/src/dispatch/plugins/dispatch_google/calendar/plugin.py
+++ b/src/dispatch/plugins/dispatch_google/calendar/plugin.py
@@ -5,11 +5,12 @@
:license: Apache, see LICENSE for more details.
.. moduleauthor:: Kevin Glisson
"""
+
import logging
import time
import uuid
from datetime import datetime, timedelta
-from typing import Any, List
+from typing import Any
from googleapiclient.errors import HttpError
from pytz import timezone
@@ -17,7 +18,10 @@
from dispatch.decorators import apply, counter, timer
from dispatch.plugins.bases import ConferencePlugin
+from dispatch.plugins.dispatch_google import calendar as google_calendar_plugin
from dispatch.plugins.dispatch_google.common import get_service
+from dispatch.plugins.dispatch_google.config import GoogleConfiguration
+
log = logging.getLogger(__name__)
@@ -37,7 +41,7 @@ def make_call(client: Any, func: Any, delay: int = None, propagate_errors: bool
return data
except HttpError:
- raise TryAgain
+ raise TryAgain from None
def get_event(client: Any, event_id: str):
@@ -73,11 +77,14 @@ def create_event(
name: str,
description: str = None,
title: str = None,
- participants: List[str] = [],
+ participants: list[str] = None,
start_time: str = None,
duration: int = 60000, # duration in mins ~6 weeks
):
- participants = [{"email": x} for x in participants]
+ if participants:
+ participants = [{"email": x} for x in participants]
+ else:
+ participants = []
request_id = str(uuid.uuid4())
body = {
@@ -116,23 +123,30 @@ def create_event(
@apply(timer, exclude=["__init__"])
@apply(counter, exclude=["__init__"])
class GoogleCalendarConferencePlugin(ConferencePlugin):
- title = "Google Calendar - Conference"
+ title = "Google Calendar Plugin - Conference Management"
slug = "google-calendar-conference"
- description = "Uses google calendar to manage conference rooms/meets."
+ description = "Uses Google calendar to manage conference rooms/meets."
+ version = google_calendar_plugin.__version__
- author = "Kevin Glisson"
+ author = "Netflix"
author_url = "https://github.com/netflix/dispatch.git"
def __init__(self):
self.scopes = ["https://www.googleapis.com/auth/calendar"]
+ self.configuration_schema = GoogleConfiguration
def create(
- self, name: str, description: str = None, title: str = None, participants: List[str] = []
+ self, name: str, description: str = None, title: str = None, participants: list[str] = None
):
"""Create a new event."""
- client = get_service("calendar", "v3", self.scopes)
+ client = get_service(self.configuration, "calendar", "v3", self.scopes)
conference = create_event(
- client, name, description=description, participants=participants, title=title
+ client,
+ name,
+ description=description,
+ participants=participants,
+ title=title,
+ duration=self.configuration.default_duration_minutes,
)
meet_url = ""
@@ -144,15 +158,15 @@ def create(
def delete(self, event_id: str):
"""Deletes an existing event."""
- client = get_service("calendar", "v3", self.scopes)
+ client = get_service(self.configuration, "calendar", "v3", self.scopes)
return delete_event(client, event_id)
def add_participant(self, event_id: str, participant: str):
"""Adds a new participant to event."""
- client = get_service("calendar", "v3", self.scopes)
+ client = get_service(self.configuration, "calendar", "v3", self.scopes)
return add_participant(client, event_id, participant)
def remove_participant(self, event_id: str, participant: str):
"""Removes a participant from event."""
- client = get_service("calendar", "v3", self.scopes)
+ client = get_service(self.configuration, "calendar", "v3", self.scopes)
return remove_participant(client, event_id, participant)
diff --git a/src/dispatch/plugins/dispatch_google/common.py b/src/dispatch/plugins/dispatch_google/common.py
index 0633925e4f65..e9bd0b75cff2 100644
--- a/src/dispatch/plugins/dispatch_google/common.py
+++ b/src/dispatch/plugins/dispatch_google/common.py
@@ -5,30 +5,24 @@
from google.oauth2 import service_account
import googleapiclient.discovery
-from .config import (
- GOOGLE_SERVICE_ACCOUNT_CLIENT_EMAIL,
- GOOGLE_SERVICE_ACCOUNT_CLIENT_ID,
- GOOGLE_SERVICE_ACCOUNT_DELEGATED_ACCOUNT,
- GOOGLE_SERVICE_ACCOUNT_PRIVATE_KEY,
- GOOGLE_SERVICE_ACCOUNT_PRIVATE_KEY_ID,
- GOOGLE_SERVICE_ACCOUNT_PROJECT_ID,
- GOOGLE_DEVELOPER_KEY,
-)
+from .config import GoogleConfiguration
TIMEOUT = 300
socket.setdefaulttimeout(TIMEOUT)
-def get_service(service_name: str, version: str, scopes: list):
+def get_service(
+ config: GoogleConfiguration, service_name: str, version: str, scopes: list
+) -> googleapiclient.discovery.Resource:
"""Formats specified credentials for Google clients."""
data = {
"type": "service_account",
- "project_id": GOOGLE_SERVICE_ACCOUNT_PROJECT_ID,
- "private_key_id": GOOGLE_SERVICE_ACCOUNT_PRIVATE_KEY_ID,
- "private_key": str(GOOGLE_SERVICE_ACCOUNT_PRIVATE_KEY).replace("\\n", "\n"),
- "client_email": GOOGLE_SERVICE_ACCOUNT_CLIENT_EMAIL,
- "client_id": GOOGLE_SERVICE_ACCOUNT_CLIENT_ID,
+ "project_id": config.service_account_project_id,
+ "private_key_id": config.service_account_private_key_id,
+ "private_key": config.service_account_private_key.get_secret_value().replace("\\n", "\n"),
+ "client_email": config.service_account_client_email,
+ "client_id": config.service_account_client_id,
"token_uri": "https://oauth2.googleapis.com/token",
}
@@ -40,12 +34,12 @@ def get_service(service_name: str, version: str, scopes: list):
service_account_file.name, scopes=scopes
)
- delegated_credentials = credentials.with_subject(GOOGLE_SERVICE_ACCOUNT_DELEGATED_ACCOUNT)
+ delegated_credentials = credentials.with_subject(config.service_account_delegated_account)
return googleapiclient.discovery.build(
service_name,
version,
credentials=delegated_credentials,
cache_discovery=False,
- developerKey=GOOGLE_DEVELOPER_KEY,
+ developerKey=config.developer_key.get_secret_value(),
)
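One subtlety in `get_service` worth noting: private keys pasted into env vars or a config UI usually arrive with literal backslash-n sequences rather than real newlines, which is why the code calls `.replace("\\n", "\n")` on the secret before handing it to the credentials loader. A minimal illustration:

```python
# A PEM key as it often appears after round-tripping through an env var:
stored = "-----BEGIN PRIVATE KEY-----\\nMIIEvQIB...\\n-----END PRIVATE KEY-----"

# The Google credentials loader needs real newlines, so normalize the escaped ones
normalized = stored.replace("\\n", "\n")

print(normalized.count("\n"))  # 2 real newlines restored
```

Without this normalization the service-account credentials fail to parse, since PEM framing is newline-sensitive.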
diff --git a/src/dispatch/plugins/dispatch_google/config.py b/src/dispatch/plugins/dispatch_google/config.py
index c778cc29c84a..0768aad217f0 100644
--- a/src/dispatch/plugins/dispatch_google/config.py
+++ b/src/dispatch/plugins/dispatch_google/config.py
@@ -1,14 +1,45 @@
-from dispatch.config import config, Secret
+from pydantic import Field, SecretStr
+from dispatch.config import BaseConfigurationModel
-GOOGLE_DEVELOPER_KEY = config("GOOGLE_DEVELOPER_KEY", cast=Secret)
-GOOGLE_SERVICE_ACCOUNT_CLIENT_EMAIL = config("GOOGLE_SERVICE_ACCOUNT_CLIENT_EMAIL")
-GOOGLE_SERVICE_ACCOUNT_CLIENT_ID = config("GOOGLE_SERVICE_ACCOUNT_CLIENT_ID")
-GOOGLE_SERVICE_ACCOUNT_DELEGATED_ACCOUNT = config("GOOGLE_SERVICE_ACCOUNT_DELEGATED_ACCOUNT")
-GOOGLE_SERVICE_ACCOUNT_PRIVATE_KEY = config("GOOGLE_SERVICE_ACCOUNT_PRIVATE_KEY", cast=Secret)
-GOOGLE_SERVICE_ACCOUNT_PRIVATE_KEY_ID = config("GOOGLE_SERVICE_ACCOUNT_PRIVATE_KEY_ID")
-GOOGLE_SERVICE_ACCOUNT_PROJECT_ID = config("GOOGLE_SERVICE_ACCOUNT_PROJECT_ID")
-GOOGLE_DOMAIN = config("GOOGLE_DOMAIN")
+class GoogleConfiguration(BaseConfigurationModel):
+ """Google configuration"""
-GOOGLE_USER_OVERRIDE = config("GOOGLE_USER_OVERRIDE", default=None)
+ developer_key: SecretStr = Field(
+ title="Developer Key",
+ description="This is used by the Google API Discovery Service and prevents rate limiting.",
+ )
+ service_account_client_email: str = Field(
+ title="Service Account Client Email",
+ description="The client_email value from your Google Cloud Platform (GCP) service account configuration file.",
+ )
+ service_account_client_id: str = Field(
+ title="Service Account Client Id",
+ description="The client_id value from your Google Cloud Platform (GCP) service account configuration file.",
+ )
+ service_account_private_key: SecretStr = Field(
+ title="Service Account Private Key",
+ description="The private_key value from your Google Cloud Platform (GCP) service account configuration file.",
+ )
+ service_account_private_key_id: str = Field(
+ title="Service Account Private Key Id",
+ description="The private_key_id value from your Google Cloud Platform (GCP) service account configuration file.",
+ )
+ service_account_delegated_account: str = Field(
+ title="Service Account Delegated Account",
+ description="Account to delegate to from the Google Cloud Platform (GCP) service account. Outgoing emails and other artifacts will appear to be from this account.",
+ )
+ service_account_project_id: str = Field(
+ title="Service Account Project Id",
+ description="The project_id value from your Google Cloud Platform (GCP) service account configuration file.",
+ )
+ google_domain: str = Field(
+ title="Google Workspace Domain",
+ description="Base domain for which this Google Cloud Platform (GCP) service account resides.",
+ )
+ default_duration_minutes: int = Field(
+ default=1440, # 1 day
+ title="Default Event Duration (Minutes)",
+ description="Default duration in minutes for conference events. Defaults to 1440 minutes (1 day).",
+ )
diff --git a/src/dispatch/plugins/dispatch_google/docs/__init__.py b/src/dispatch/plugins/dispatch_google/docs/__init__.py
index 2d0b548e6531..ad5cc752c07b 100644
--- a/src/dispatch/plugins/dispatch_google/docs/__init__.py
+++ b/src/dispatch/plugins/dispatch_google/docs/__init__.py
@@ -1 +1 @@
-from dispatch import __version__ # noqa
+from ._version import __version__ # noqa
diff --git a/src/dispatch/plugins/dispatch_google/docs/plugin.py b/src/dispatch/plugins/dispatch_google/docs/plugin.py
index 6d1cbe9f2c75..b4121d098711 100644
--- a/src/dispatch/plugins/dispatch_google/docs/plugin.py
+++ b/src/dispatch/plugins/dispatch_google/docs/plugin.py
@@ -5,108 +5,302 @@
:license: Apache, see LICENSE for more details.
.. moduleauthor:: Kevin Glisson
"""
+
+import logging
+from collections.abc import Generator
import unicodedata
-from typing import Any, List
+from typing import Any
+
+from googleapiclient.discovery import Resource
+from googleapiclient.errors import HttpError
from dispatch.decorators import apply, counter, timer
from dispatch.plugins.bases import DocumentPlugin
+from dispatch.plugins.dispatch_google import docs as google_docs_plugin
from dispatch.plugins.dispatch_google.common import get_service
-from ._version import __version__
+from dispatch.plugins.dispatch_google.config import GoogleConfiguration
+
+
+log = logging.getLogger(__name__)
def remove_control_characters(s):
return "".join(ch for ch in s if unicodedata.category(ch)[0] != "C")
-def replace_text(client: Any, document_id: str, replacements: List[str]):
- """Replaces text in specified document."""
- requests = []
- for k, v in replacements.items():
- requests.append(
- {"replaceAllText": {"containsText": {"text": k, "matchCase": "true"}, "replaceText": v}}
- )
+def find_links(obj: dict, find_key: str) -> Generator[list[Any], None, None]:
+ """Enumerate all the links found.
+ Returns a path of object, from leaf to parents to root.
- body = {"requests": requests}
- return client.batchUpdate(documentId=document_id, body=body).execute()
+ Parameters:
+ obj (dict): The object to search for links.
+ find_key (str): The key to search for.
+
+ Yields:
+ list: The matched value followed by its chain of parent objects, leaf to root.
+
+ This method was originally implemented in the open source library `Beancount`.
+ The original source code can be found at
+ https://github.com/beancount/beancount/blob/master/tools/transform_links_in_docs.py.
+ Beancount is licensed under the GNU GPL v2.0 license.
+ """
+ if isinstance(obj, dict):
+ for key, value in obj.items():
+ if key == find_key:
+ yield [value, obj]
+ else:
+ for found in find_links(value, find_key):
+ found.append(obj)
+ yield found
+ elif isinstance(obj, list):
+ for value in obj:
+ for found in find_links(value, find_key):
+ found.append(obj)
+ yield found
+
+
+def iter_links(document_content: list) -> Generator[tuple[str, dict], None, None]:
+ """Finds all the links and yields them.
+ Parameters:
+ document_content (list): The contents of the body of a Google Doc (googleapiclient.discovery.Resource).
+
+ Yields:
+ tuple: The hyperlink url (str) and its corresponding hyperlink element (dict),
+ e.g. ('https://www.netflix.com', {...{"textRun": {"textStyle": {"link": {"url": "https://www.netflix.com"}}}}})
+
+ See https://developers.google.com/docs/api/samples/output-json.
+
+ This method was originally implemented in the open source library `Beancount`.
+ The original source code can be found at
+ https://github.com/beancount/beancount/blob/master/tools/transform_links_in_docs.py.
+ Beancount is licensed under the GNU GPL v2.0 license.
+ """
+ for jpath in find_links(document_content, "link"):
+ for item in jpath:
+ if "textRun" in item:
+ link = item["textRun"]["textStyle"]["link"]
+ if "url" not in link:
+ continue
+ url = link["url"]
+ yield (url, item)
+
+
+def replace_weblinks(client: Resource, document_id: str, replacements: dict[str, str]) -> int:
+ """Replaces hyperlinks in specified document.
+
+ If the url contains a placeholder, it will be replaced with the value in the replacements list.
+
+ Parameters:
+ client (Resource): The Google API client.
+ document_id (str): The document id.
+ replacements (dict[str, str]): A mapping of placeholder text to replacement value.
+ Returns:
+ int: The number of hyperlink update requests made.
+ """
+ document = client.get(documentId=document_id).execute()
-def insert_into_table(
- client: Any, document_id: str, table_index: int, cell_index: int, rows: List[dict]
-):
- """Inserts rows and text into a table in an existing document."""
- # We insert as many rows as we need using the insertTableRow object.
+ if not document:
+ log.warning(f"Document with id {document_id} not found.")
+ return 0
+
+ document_content = document.get("body").get("content")
requests = []
- for _ in range(0, len(rows) - 1):
- requests.append(
- {
- "insertTableRow": {
- "tableCellLocation": {
- "tableStartLocation": {"index": table_index},
- "rowIndex": 1,
- "columnIndex": 1,
- },
- "insertBelow": "true",
- }
- }
- )
- # We insert the text into the table cells using the insertText object.
- index = cell_index # set index to first empty cell
- for row in rows:
- for _, value in row.items():
- requests.append({"insertText": {"location": {"index": index}, "text": str(value)}})
- index += len(str(value)) + 2 # set index to next cell
- index += 1 # set index to new row
+ for url, item in iter_links(document_content):
+ for k, v in replacements.items():
+ if k in url and v:
+ requests.append(
+ {
+ "updateTextStyle": {
+ "range": {
+ "startIndex": item["startIndex"],
+ "endIndex": (item["endIndex"]),
+ },
+ "textStyle": {"link": {"url": url.replace(k, v)}},
+ "fields": "link",
+ }
+ }
+ )
+
+ if not requests:
+ return 0
body = {"requests": requests}
- return client.batchUpdate(documentId=document_id, body=body).execute()
+ client.batchUpdate(documentId=document_id, body=body).execute()
+
+ return len(requests)
-def insert_incident_data(client: Any, document_id: str, index: int, incident_data: List[dict]):
- """Inserts incident data in an existing document."""
+def replace_text(client: Resource, document_id: str, replacements: dict[str, str]) -> int:
+ """Replaces text in specified document.
+
+ Parameters:
+ client (Resource): The Google API client.
+ document_id (str): The document id.
+ replacements (dict[str, str]): A mapping of placeholder text to replacement value.
+
+ Returns:
+ int: The number of replacement requests made.
+ """
requests = []
- for data in incident_data:
- key = data["key"]
- title = data["title"]
- summary = remove_control_characters(data["summary"])
- requests.append({"insertText": {"location": {"index": index}, "text": f"{key}: {title}\n"}})
+ for k, v in replacements.items():
requests.append(
- {
- "updateTextStyle": {
- "range": {"startIndex": index, "endIndex": index + len(f"{key}: {title}")},
- "textStyle": {"bold": True},
- "fields": "bold",
- }
- }
+ {"replaceAllText": {"containsText": {"text": k, "matchCase": "true"}, "replaceText": v}}
)
- index += len(f"{key}: {title}\n")
- requests.append({"insertText": {"location": {"index": index}, "text": f"{summary}\n\n"}})
- index += len(f"{summary}\n\n")
-
body = {"requests": requests}
- return client.batchUpdate(documentId=document_id, body=body).execute()
+ client.batchUpdate(documentId=document_id, body=body).execute()
+
+ return len(requests)
@apply(timer, exclude=["__init__"])
@apply(counter, exclude=["__init__"])
class GoogleDocsDocumentPlugin(DocumentPlugin):
- title = "Google Docs - Document"
+ title = "Google Docs Plugin - Document Management"
slug = "google-docs-document"
- description = "Uses google docs to manage document contents."
- version = __version__
+ description = "Uses Google docs to manage document contents."
+ version = google_docs_plugin.__version__
- author = "Kevin Glisson"
+ author = "Netflix"
author_url = "https://github.com/netflix/dispatch.git"
def __init__(self):
+ self.configuration_schema = GoogleConfiguration
self.scopes = [
"https://www.googleapis.com/auth/documents",
"https://www.googleapis.com/auth/drive",
]
- def update(self, document_id: str, **kwargs):
+ def update(self, document_id: str, **kwargs) -> None:
"""Replaces text in document."""
# TODO escape and use f strings? (kglisson)
kwargs = {"{{" + k + "}}": v for k, v in kwargs.items()}
- client = get_service("docs", "v1", self.scopes).documents()
- return replace_text(client, document_id, kwargs)
+ client = get_service(self.configuration, "docs", "v1", self.scopes).documents()
+ replace_weblinks(client, document_id, kwargs)
+ replace_text(client, document_id, kwargs)
+
+ def insert(self, document_id: str, request) -> bool | None:
+ """Executes a batch update request against the document; returns True on success, False on error."""
+ client = get_service(self.configuration, "docs", "v1", self.scopes).documents()
+ body = {"requests": request}
+ try:
+ response = client.batchUpdate(documentId=document_id, body=body).execute()
+ if "replies" in response and response["replies"]:
+ return True
+
+ except HttpError as error:
+ log.exception(error)
+ return False
+ except Exception as e:
+ log.exception(e)
+ return False
+
+ def get_table_details(
+ self, document_id: str, header: str, doc_name: str
+ ) -> tuple[bool, int, int, list[int]]:
+ client = get_service(self.configuration, "docs", "v1", self.scopes).documents()
+ try:
+ document_content = (
+ client.get(documentId=document_id).execute().get("body").get("content")
+ )
+ start_index = 0
+ end_index = 0
+ header_index = 0
+ past_header = False
+ header_section = False
+ table_exists = False
+ table_indices = []
+ headingId = ""
+ for element in document_content:
+ if "paragraph" in element and "elements" in element["paragraph"]:
+ for item in element["paragraph"]["elements"]:
+ if "textRun" in item:
+ if item["textRun"]["content"].strip() == header:
+ header_index = element["endIndex"]
+ header_section = True
+ headingId = element["paragraph"].get("paragraphStyle")["headingId"]
+
+ elif header_section:
+ # Gets the end index of any text below the header
+ if (
+ any(
+ "headingId" in style
+ for style in element["paragraph"].get("paragraphStyle", {})
+ )
+ and element["paragraph"].get("paragraphStyle")["headingId"]
+ != headingId
+ ):
+ if header_index == element["startIndex"]:
+ requests = [
+ {
+ "insertText": {
+ "location": {
+ "index": header_index,
+ },
+ "text": "\n",
+ }
+ },
+ {
+ "updateParagraphStyle": {
+ "range": {
+ "startIndex": header_index,
+ "endIndex": header_index,
+ },
+ "paragraphStyle": {
+ "namedStyleType": "NORMAL_TEXT",
+ },
+ "fields": "namedStyleType",
+ }
+ },
+ ]
+ if self.insert(
+ document_id=document_id, request=requests
+ ):
+ past_header = True
+ header_section = False
+ break
+
+ if header_section and item["textRun"]["content"].strip():
+ header_index = item["endIndex"]
+
+ # checking if we are past header in question
+
+ # Checking for table under the header
+ elif header_section and "table" in element and not past_header:
+ table_exists = True
+ start_index = element["startIndex"]
+ end_index = element["endIndex"]
+ table = element["table"]
+ for row in table["tableRows"]:
+ for cell in row["tableCells"]:
+ table_indices.append(cell["content"][0]["startIndex"])
+ return table_exists, start_index, end_index, table_indices
+ except HttpError as error:
+ log.exception(error)
+ return table_exists, header_index, -1, table_indices
+ except Exception as e:
+ log.exception(e)
+ return table_exists, header_index, -1, table_indices
+ if header_index == 0:
+ log.error(
+ f"Could not find Timeline header in the {doc_name} document with id {document_id}"
+ )
+ raise Exception(f"Timeline header does not exist in the {doc_name} document")
+ return table_exists, header_index, -1, table_indices
+
+ def delete_table(self, document_id: str, request) -> bool | None:
+ """Deletes a table from the document via a batch update request; returns True on success, False on error."""
+ try:
+ client = get_service(self.configuration, "docs", "v1", self.scopes).documents()
+ body = {"requests": request}
+ response = client.batchUpdate(documentId=document_id, body=body).execute()
+ if "replies" in response and response["replies"]:
+ return True
+
+ except HttpError as error:
+ log.exception(error)
+ return False
+ except Exception as e:
+ log.exception(e)
+ return False
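The `find_links` helper added in this file walks an arbitrarily nested Docs body and yields each match together with its chain of enclosing objects. The traversal can be exercised standalone; the function below reproduces the logic from the diff, while the miniature document is invented for illustration:

```python
def find_links(obj, find_key):
    """Yield [value, leaf, ...parents] for every occurrence of find_key."""
    if isinstance(obj, dict):
        for key, value in obj.items():
            if key == find_key:
                yield [value, obj]
            else:
                for found in find_links(value, find_key):
                    found.append(obj)
                    yield found
    elif isinstance(obj, list):
        for value in obj:
            for found in find_links(value, find_key):
                found.append(obj)
                yield found

# Invented miniature of a Google Docs body: one textRun carrying a hyperlink.
doc = {"body": [{"textRun": {"textStyle": {"link": {"url": "https://example.com"}}}}]}
paths = list(find_links(doc, "link"))
print(paths[0][0])          # the matched link value itself
print(paths[0][-1] is doc)  # the last element of the path is the root
```

This leaf-to-root path is what lets `iter_links` find the enclosing `textRun` element for each url before building `updateTextStyle` requests.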
diff --git a/src/dispatch/plugins/dispatch_google/drive/drive.py b/src/dispatch/plugins/dispatch_google/drive/drive.py
index 5db0d1d6e575..17fb7c9cb723 100644
--- a/src/dispatch/plugins/dispatch_google/drive/drive.py
+++ b/src/dispatch/plugins/dispatch_google/drive/drive.py
@@ -5,33 +5,34 @@
:license: Apache, see LICENSE for more details.
.. moduleauthor:: Kevin Glisson
"""
+
import functools
import io
import json
import logging
-import tempfile
-import time
-import uuid
-from enum import Enum
-from typing import Any, List
+from datetime import datetime, timedelta, timezone
+from typing import Any
from googleapiclient.errors import HttpError
-from googleapiclient.http import MediaFileUpload, MediaIoBaseDownload
+from googleapiclient.http import MediaIoBaseDownload
+from http import HTTPStatus
+from ssl import SSLError
from tenacity import TryAgain, retry, retry_if_exception_type, stop_after_attempt, wait_exponential
-from dispatch.plugins.dispatch_google.config import GOOGLE_DOMAIN
+from dispatch.enums import DispatchEnum
log = logging.getLogger(__name__)
-class UserTypes(str, Enum):
+class UserTypes(DispatchEnum):
user = "user"
group = "group"
domain = "domain"
anyone = "anyone"
-class Roles(str, Enum):
+class Roles(DispatchEnum):
+ # NOTE: https://developers.google.com/drive/api/guides/ref-roles
owner = "owner"
organizer = "organizer"
file_organizer = "fileOrganizer"
@@ -40,6 +41,10 @@ class Roles(str, Enum):
reader = "reader"
+class Activity(DispatchEnum):
+ comment = "COMMENT"
+
+
def paginated(data_key):
def decorator(func):
@functools.wraps(func)
@@ -56,6 +61,9 @@ def decorated_function(*args, **kwargs):
kwargs["fields"] = ",".join(fields)
response = func(*args, **kwargs)
+ if not response.get(data_key):
+ break
+
results += response.get(data_key)
# stop if we hit an empty string
@@ -75,26 +83,35 @@ def decorated_function(*args, **kwargs):
return decorator
-# google sometimes has transient errors
@retry(
stop=stop_after_attempt(5),
retry=retry_if_exception_type(TryAgain),
wait=wait_exponential(multiplier=1, min=2, max=5),
)
def make_call(client: Any, func: Any, propagate_errors: bool = False, **kwargs):
- """Make an google client api call."""
+ """Makes a Google API call."""
try:
return getattr(client, func)(**kwargs).execute()
except HttpError as e:
- if e.resp.status in [300, 429, 500, 502, 503, 504]:
- log.debug("Google encountered an error retrying...")
- raise TryAgain
+ if e.resp.status in [
+ HTTPStatus.MULTIPLE_CHOICES,
+ HTTPStatus.TOO_MANY_REQUESTS,
+ HTTPStatus.INTERNAL_SERVER_ERROR,
+ HTTPStatus.BAD_GATEWAY,
+ HTTPStatus.SERVICE_UNAVAILABLE,
+ HTTPStatus.GATEWAY_TIMEOUT,
+ ]:
+ log.debug("We encountered an HTTP error. Retrying...")
+ raise TryAgain from None
if propagate_errors:
- raise HttpError
+ raise
- errors = json.loads(e.content.decode())
- raise Exception(f"Request failed. Errors: {errors}")
+ error = json.loads(e.content.decode())
+ raise Exception(f"Google request failed. Error: {error}") from None
+ except SSLError as e:
+ log.debug("We encountered an SSL error. Error: {e}")
+ raise Exception(f"Google request failed. Error: {e}") from None
@retry(wait=wait_exponential(multiplier=1, max=10))
@@ -103,63 +120,44 @@ def upload_chunk(request: Any):
try:
return request.next_chunk()
except HttpError as e:
- if e.resp.status in [500, 502, 503, 504]:
- # Call next_chunk() agai, but use an exponential backoff for repeated errors.
+ if e.resp.status in [
+ HTTPStatus.INTERNAL_SERVER_ERROR,
+ HTTPStatus.BAD_GATEWAY,
+ HTTPStatus.SERVICE_UNAVAILABLE,
+ HTTPStatus.GATEWAY_TIMEOUT,
+ ]:
+ # Call next_chunk() again, but use an exponential backoff for repeated errors.
raise e
# TODO add retry
-def upload_file(client: Any, path: str, name: str, mimetype: str):
- """Uploads a file."""
- media = MediaFileUpload(path, mimetype=mimetype, resumable=True)
-
- try:
- request = client.files().create(media_body=media, body={"name": name})
- response = None
-
- while not response:
- _, response = upload_chunk(request)
- return response
- except HttpError as e:
- if e.resp.status in [404]:
- # Start the upload all over again.
- raise TryAgain
- else:
- raise Exception(
- f"Failed to upload file. Name: {name} Path: {path} MIMEType: {mimetype}"
- )
-
-
def get_file(client: Any, file_id: str):
"""Gets a file's metadata."""
return make_call(
client.files(),
"get",
fileId=file_id,
- supportsTeamDrives=True,
fields="id, name, parents, webViewLink",
+ supportsAllDrives=True,
)
-# TODO add retry
-def download_file(client: Any, file_id: str):
- """Downloads a file."""
- request = client.files().get_media(fileId=file_id)
- fp = tempfile.NamedTemporaryFile()
- downloader = MediaIoBaseDownload(fp, request)
-
- response = False
- try:
- while not response:
- _, response = downloader.next_chunk()
- return fp
- except HttpError:
- # Do not retry. Log the error and fail.
- raise Exception(f"Failed to download file. Id: {file_id}")
+@paginated("activities")
+def get_activity(
+ client: Any, file_id: str, activity: Activity = Activity.comment, lookback: int = 60
+):
+ """Fetches file activity."""
+ lookback_time = datetime.now(timezone.utc) - timedelta(seconds=lookback)
+ activity_filter = (
+ f"time >= {int(lookback_time.timestamp() * 1000)} AND detail.action_detail_case: {activity}"
+ )
+ return make_call(
+ client.activity(), "query", body={"filter": activity_filter, "itemName": f"items/{file_id}"}
+ )
def download_google_document(client: Any, file_id: str, mime_type: str = "text/plain"):
- """Downloads a google document."""
+ """Downloads a Google document."""
request = client.files().export_media(fileId=file_id, mimeType=mime_type)
fp = io.BytesIO()
@@ -171,91 +169,42 @@ def download_google_document(client: Any, file_id: str, mime_type: str = "text/p
while not response:
_, response = downloader.next_chunk()
return fp.getvalue().decode("utf-8")
- except HttpError:
- # Do no retry. Log the error fail.
- raise Exception(f"Failed to export the file. Id: {file_id} MimeType: {mime_type}")
+ except (HttpError, OSError):
+ # Do not retry; raise an exception instead.
+ raise Exception(f"Failed to export the file. Id: {file_id} MimeType: {mime_type}") from None
-@retry(stop=stop_after_attempt(5), retry=retry_if_exception_type(TryAgain))
-def create_team_drive(client: Any, name: str, members: List[str], role: Roles):
- """Creates a new team drive."""
- request_id = str(uuid.uuid4())
- meta = {"name": name}
- try:
- drive_data = make_call(client.drives(), "create", body=meta, requestId=request_id)
- except HttpError as e:
- if e.resp.status in [409]:
- raise TryAgain
-
- for member in members:
- add_permission(client, member, drive_data["id"], role, "user")
-
- return drive_data
-
-
-def restrict_team_drive(client: Any, team_drive_id: str):
- """Applies a set of restrictions and capabilities to the shared drive."""
- body = {
- "restrictions": {
- "domainUsersOnly": True,
- "driveMembersOnly": True,
- "copyRequiresWriterPermission": True,
- },
- "capabilities": {
- "canChangeDomainUsersOnlyRestriction": False,
- "canChangeDriveMembersOnlyRestriction": False,
- "canChangeCopyRequiresWriterPermissionRestriction": False,
- "canShare": False,
- },
- }
- return make_call(client.drives(), "update", driveId=team_drive_id, body=body)
-
-
-def create_file(client: Any, parent_id: str, name: str, file_type: str = "folder"):
+def create_file(
+ client: Any,
+ parent_id: str,
+ name: str,
+ members: list[str],
+ role: Roles = Roles.writer,
+ file_type: str = "folder",
+):
"""Creates a new folder with the specified parents."""
- if file_type == "folder":
+ if file_type == "document":
+ mimetype = "application/vnd.google-apps.document"
+ elif file_type == "sheet":
+ mimetype = "application/vnd.google-apps.spreadsheet"
+ elif file_type == "folder":
mimetype = "application/vnd.google-apps.folder"
file_metadata = {"name": name, "mimeType": mimetype, "parents": [parent_id]}
- return make_call(
+ file_data = make_call(
client.files(),
"create",
body=file_metadata,
- supportsTeamDrives=True,
fields="id, name, parents, webViewLink",
+ supportsAllDrives=True,
)
+ if members:
+ for member in members:
+ add_permission(client, member, file_data["id"], role, "user")
-def delete_team_drive(client: Any, team_drive_id: str, empty: bool = True):
- """Deletes a team drive"""
- if empty:
- files = list_files(client, team_drive_id)
-
- time.sleep(5)
- for f in files:
- delete_file(client, team_drive_id, f["id"])
-
- return make_call(client.teamdrives(), "delete", teamDriveId=team_drive_id)
-
-
-def archive_team_drive(
- client: Any, source_team_drive_id: str, dest_team_drive_id: str, folder_name: str
-):
- """Archives a google team drive to a specified folder."""
- folder = create_file(client, parent_id=dest_team_drive_id, name=folder_name)
-
- files = list_files(
- client,
- team_drive_id=source_team_drive_id,
- q="mimeType != 'application/vnd.google-apps.folder'",
- )
-
- for f in files:
- add_domain_permission(client, f["id"], domain=GOOGLE_DOMAIN)
- move_file(client, folder["id"], f["id"])
-
- delete_team_drive(client, source_team_drive_id)
+ return file_data
@paginated("files")
@@ -273,33 +222,74 @@ def list_files(client: any, team_drive_id: str, q: str = None, **kwargs):
)
-@paginated("teamDrives")
-def list_team_drives(client, **kwargs):
- """Lists all available team drives."""
- return make_call(client.teamdrives(), "list", **kwargs)
-
-
@paginated("comments")
def list_comments(client: Any, file_id: str, **kwargs):
"""Lists all available comments on file."""
return make_call(client.comments(), "list", fileId=file_id, fields="*", **kwargs)
-def copy_file(client: Any, team_drive_id: str, file_id: str, new_file_name: str):
+def get_comment(client: Any, file_id: str, comment_id: str, **kwargs):
+ """Gets a specific comment."""
+ return make_call(
+ client.comments(), "get", fileId=file_id, commentId=comment_id, fields="*", **kwargs
+ )
+
+
+def get_person(client: Any, person_id: str, **kwargs):
+ """Gets a person's metadata given their people id."""
+ return make_call(
+ client.people(),
+ "get",
+ resourceName=person_id,
+ personFields="emailAddresses",
+ **kwargs,
+ )
+
+
+def add_reply(
+ client: Any, file_id: str, comment_id: str, content: str, resolved: bool = False, **kwargs
+):
+ """Adds a reply to a comment."""
+ if resolved:
+ action = "resolve"
+ content = content if content else "Resolved by Dispatch"
+ else:
+ action = "reopen"
+ content = content if content else "Reopened by Dispatch"
+
+ body = {"content": content, "action": action}
+ return make_call(
+ client.replies(), "create", fileId=file_id, commentId=comment_id, fields="id", body=body
+ )
+
+
+def copy_file(client: Any, folder_id: str, file_id: str, new_file_name: str):
"""Copies a given file."""
return make_call(
client.files(),
"copy",
- body={"name": new_file_name, "teamDriveId": team_drive_id},
+ body={"name": new_file_name, "driveId": folder_id},
fileId=file_id,
fields="id, name, parents, webViewLink",
- supportsTeamDrives=True,
+ supportsAllDrives=True,
)
-def delete_file(client: Any, team_drive_id: str, file_id: str):
- """Deletes a file from a teamdrive."""
- return make_call(client.files(), "delete", fileId=file_id, supportsTeamDrives=True)
+def delete_file(client: Any, file_id: str):
+ """Moves a folder or file to Trash in a Google Drive."""
+ property = {"trashed": True}
+ return make_call(
+ client.files(), "update", fileId=file_id, supportsAllDrives=True, body=property
+ )
+
+
+def mark_as_readonly(
+ client: Any,
+ file_id: str,
+):
+ """Adds the 'copyRequiresWriterPermission' capability to the given file."""
+ capability = {"copyRequiresWriterPermission": True}
+ return make_call(client.files(), "update", fileId=file_id, body=capability)
def add_domain_permission(
@@ -309,7 +299,7 @@ def add_domain_permission(
role: Roles = Roles.commenter,
user_type: UserTypes = UserTypes.domain,
):
- """Adds a domain permission to team drive or file."""
+ """Adds a domain permission to a team drive or file."""
permission = {"type": user_type, "role": role, "domain": domain}
return make_call(
client.permissions(),
@@ -318,7 +308,7 @@ def add_domain_permission(
body=permission,
sendNotificationEmail=False,
fields="id",
- supportsTeamDrives=True,
+ supportsAllDrives=True,
)
@@ -329,7 +319,7 @@ def add_permission(
role: Roles = Roles.owner,
user_type: UserTypes = UserTypes.user,
):
- """Adds a permission to team drive"""
+ """Adds a permission to a team drive or file."""
permission = {"type": user_type, "role": role, "emailAddress": email}
return make_call(
client.permissions(),
@@ -338,18 +328,18 @@ def add_permission(
body=permission,
sendNotificationEmail=False,
fields="id",
- supportsTeamDrives=True,
+ supportsAllDrives=True,
)
-def remove_permission(client: Any, email: str, team_drive_id: str):
- """Removes permission from team drive or file."""
+def remove_permission(client: Any, email: str, team_drive_or_file_id: str):
+ """Removes permission from a team drive or file."""
permissions = make_call(
client.permissions(),
"list",
- fileId=team_drive_id,
- supportsTeamDrives=True,
+ fileId=team_drive_or_file_id,
fields="permissions(id, emailAddress)",
+ supportsAllDrives=True,
)
for p in permissions["permissions"]:
@@ -357,15 +347,15 @@ def remove_permission(client: Any, email: str, team_drive_id: str):
make_call(
client.permissions(),
"delete",
- fileId=team_drive_id,
+ fileId=team_drive_or_file_id,
permissionId=p["id"],
- supportsTeamDrives=True,
+ supportsAllDrives=True,
)
-def move_file(client: Any, team_drive_id: str, file_id: str):
- """Moves a file from one team drive to another"""
- f = make_call(client.files(), "get", fileId=file_id, fields="parents", supportsTeamDrives=True)
+def move_file(client: Any, folder_id: str, file_id: str):
+ """Moves a file from one team drive to another."""
+ f = make_call(client.files(), "get", fileId=file_id, fields="parents", supportsAllDrives=True)
previous_parents = ",".join(f.get("parents"))
@@ -373,13 +363,8 @@ def move_file(client: Any, team_drive_id: str, file_id: str):
client.files(),
"update",
fileId=file_id,
- addParents=team_drive_id,
+ addParents=folder_id,
removeParents=previous_parents,
- supportsTeamDrives=True,
fields="id, name, parents, webViewLink",
+ supportsAllDrives=True,
)
-
-
-def list_permissions(client: Any, **kwargs):
- """List all permissions for file."""
- return make_call(client.files(), "list", **kwargs)
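One behavioral change in this file is the new guard in the `paginated` decorator: it now breaks out as soon as a page comes back without the expected data key, instead of concatenating a missing value. The pattern can be sketched standalone (the fake pager below stands in for the Drive files API; names are illustrative):

```python
import functools

def paginated(data_key):
    """Collect data_key items across pages until no nextPageToken is returned."""
    def decorator(func):
        @functools.wraps(func)
        def decorated(*args, **kwargs):
            results = []
            while True:
                response = func(*args, **kwargs)
                # New guard from the diff: stop if the page has no data.
                if not response.get(data_key):
                    break
                results += response[data_key]
                token = response.get("nextPageToken")
                if not token:
                    break
                kwargs["pageToken"] = token
            return results
        return decorated
    return decorator

# Invented two-page response sequence standing in for the Drive files API.
pages = iter([
    {"files": [{"id": "1"}], "nextPageToken": "t1"},
    {"files": [{"id": "2"}]},
])

@paginated("files")
def list_files(**kwargs):
    return next(pages)

files = list_files()
print(files)  # [{'id': '1'}, {'id': '2'}]
```

The real decorator also threads a `fields` kwarg through to the API call; that detail is omitted here for brevity.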
diff --git a/src/dispatch/plugins/dispatch_google/drive/plugin.py b/src/dispatch/plugins/dispatch_google/drive/plugin.py
index e1a1e725c98f..1f370d9d1bb3 100644
--- a/src/dispatch/plugins/dispatch_google/drive/plugin.py
+++ b/src/dispatch/plugins/dispatch_google/drive/plugin.py
@@ -1,161 +1,177 @@
-"""
-.. module: dispatch.plugins.google_drive.plugin
- :platform: Unix
- :copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
- :license: Apache, see LICENSE for more details.
-.. moduleauthor:: Kevin Glisson
-"""
-from typing import List
+from pydantic import Field
from dispatch.decorators import apply, counter, timer
from dispatch.plugins.bases import StoragePlugin, TaskPlugin
from dispatch.plugins.dispatch_google import drive as google_drive_plugin
from dispatch.plugins.dispatch_google.common import get_service
+from dispatch.plugins.dispatch_google.config import GoogleConfiguration
from .drive import (
Roles,
+ UserTypes,
+ add_domain_permission,
add_permission,
- archive_team_drive,
+ add_reply,
copy_file,
create_file,
- create_team_drive,
delete_file,
- delete_team_drive,
download_google_document,
list_files,
- list_team_drives,
+ mark_as_readonly,
move_file,
remove_permission,
- restrict_team_drive,
)
-from .task import list_tasks
+from .task import get_task_activity
+
+
+class GoogleDriveConfiguration(GoogleConfiguration):
+ """Google drive configuration."""
+
+ root_id: str = Field(
+ title="Root Incident Storage FolderId",
+ description="This is the default folder for all incident data. Dispatch will create subfolders for each incident in this folder.",
+ )
+
+ open_on_close: bool = Field(
+ title="Open On Close",
+ default=False,
+ description="Controls the visibility of resources on incident close. If enabled Dispatch will make all resources visible to the entire workspace.",
+ )
+
+ read_only: bool = Field(
+ title="Readonly",
+ default=False,
+ description="The incident document will be marked as readonly on incident close. Participants will still be able to interact with the document but any other viewers will not.",
+ )
@apply(timer, exclude=["__init__"])
@apply(counter, exclude=["__init__"])
class GoogleDriveStoragePlugin(StoragePlugin):
- title = "Google Drive - Storage"
+ title = "Google Drive Plugin - Storage Management"
slug = "google-drive-storage"
description = "Uses Google Drive to help manage incident storage."
version = google_drive_plugin.__version__
- author = "Kevin Glisson"
+ author = "Netflix"
author_url = "https://github.com/netflix/dispatch.git"
- _schema = None
-
def __init__(self):
+ self.configuration_schema = GoogleDriveConfiguration
self.scopes = ["https://www.googleapis.com/auth/drive"]
def get(self, file_id: str, mime_type=None):
"""Fetches document text."""
- client = get_service("drive", "v3", self.scopes)
+ client = get_service(self.configuration, "drive", "v3", self.scopes)
return download_google_document(client, file_id, mime_type=mime_type)
- def create(self, name: str, participants: List[str], role: str = Roles.file_organizer.value):
- """Creates a new Google Drive."""
- client = get_service("drive", "v3", self.scopes)
- response = create_team_drive(client, name, participants, role)
- response["weblink"] = f"https://drive.google.com/drive/folders/{response['id']}"
- return response
-
- def delete(self, team_drive_id: str, empty: bool = True):
- """Deletes a Google Drive."""
- client = get_service("drive", "v3", self.scopes)
- return delete_team_drive(client, team_drive_id, empty)
-
- def list(self, **kwargs):
- """Lists all available team drives."""
- client = get_service("drive", "v3", self.scopes)
- return list_team_drives(client, **kwargs)
-
def add_participant(
self,
team_drive_or_file_id: str,
- participants: List[str],
- role: str = "owner",
- user_type: str = "user",
+ participants: list[str],
+ role: str = Roles.writer,
+ user_type: str = UserTypes.user,
):
- """Adds participants to existing Google Drive."""
- client = get_service("drive", "v3", self.scopes)
+ """Adds participants to an existing Google Drive."""
+ client = get_service(self.configuration, "drive", "v3", self.scopes)
for p in participants:
add_permission(client, p, team_drive_or_file_id, role, user_type)
- def remove_participant(self, team_drive_id: str, participants: List[str]):
- """Removes participants from existing Google Drive."""
- client = get_service("drive", "v3", self.scopes)
+ def remove_participant(self, team_drive_or_file_id: str, participants: list[str]):
+ """Removes participants from an existing Google Drive."""
+ client = get_service(self.configuration, "drive", "v3", self.scopes)
for p in participants:
- remove_permission(client, p, team_drive_id)
+ remove_permission(client, p, team_drive_or_file_id)
- def create_file(self, team_drive_id: str, name: str, file_type: str = "folder"):
- """Creates a new file in existing Google Drive."""
- client = get_service("drive", "v3", self.scopes)
- response = create_file(client, team_drive_id, name, file_type)
- response["weblink"] = response["webViewLink"]
- return response
+ def open(self, folder_id: str):
+ """Adds the domain permission to the folder."""
+ client = get_service(self.configuration, "drive", "v3", self.scopes)
+ add_domain_permission(client, folder_id, self.configuration.google_domain)
+
+ def mark_readonly(self, file_id: str):
+ """Adds the read only permission to the folder."""
+ client = get_service(self.configuration, "drive", "v3", self.scopes)
+ mark_as_readonly(client, file_id)
- def delete_file(self, team_drive_id: str, file_id: str):
- """Removes a file from existing Google Drive."""
- client = get_service("drive", "v3", self.scopes)
- response = delete_file(client, team_drive_id, file_id)
+ def create_file(
+ self,
+ parent_id: str,
+ name: str,
+ participants: list[str] = None,
+ role: str = Roles.writer,
+ file_type: str = "folder",
+ ):
+ """Creates a new file in an existing Google Drive."""
+ client = get_service(self.configuration, "drive", "v3", self.scopes)
+ response = create_file(client, parent_id, name, participants, role, file_type)
response["weblink"] = response["webViewLink"]
return response
- def copy_file(self, team_drive_id: str, file_id: str, name: str):
- """Creates a copy of the given file and places it in the specified team drive."""
- client = get_service("drive", "v3", self.scopes)
- response = copy_file(client, team_drive_id, file_id, name)
- response["weblink"] = response["webViewLink"]
+ def delete_file(self, file_id: str):
+ """Deletes a file or folder from an existing Google Drive."""
+ client = get_service(self.configuration, "drive", "v3", self.scopes)
+ response = delete_file(client, file_id)
return response
- def move_file(self, new_team_drive_id: str, file_id: str):
- """Moves a file from one team drive to another."""
- client = get_service("drive", "v3", self.scopes)
- response = move_file(client, new_team_drive_id, file_id)
+ def copy_file(self, folder_id: str, file_id: str, name: str):
+ """Creates a copy of the given file and places it in the specified Google Drive."""
+ client = get_service(self.configuration, "drive", "v3", self.scopes)
+ response = copy_file(client, folder_id, file_id, name)
response["weblink"] = response["webViewLink"]
return response
- def archive(self, source_team_drive_id: str, dest_team_drive_id: str, folder_name: str):
- """Archives a shared team drive to a specific folder."""
- client = get_service("drive", "v3", self.scopes)
- response = archive_team_drive(client, source_team_drive_id, dest_team_drive_id, folder_name)
+ def move_file(self, new_folder_id: str, file_id: str):
+ """Moves a file from one Google drive to another."""
+ client = get_service(self.configuration, "drive", "v3", self.scopes)
+ response = move_file(client, new_folder_id, file_id)
+ response["weblink"] = response["webViewLink"]
return response
- def list_files(self, team_drive_id: str, q: str = None):
- """Lists all files in team drive."""
- client = get_service("drive", "v3", self.scopes)
- return list_files(client, team_drive_id, q)
-
- def restrict(self, team_drive_id: str):
- """Applies a set of restrictions and capabilities to the team drive."""
- client = get_service("drive", "v3", self.scopes)
- response = restrict_team_drive(client, team_drive_id)
- return response
+    def list_files(self, folder_id: str, q: str | None = None):
+        """Lists all files in a Google Drive."""
+ client = get_service(self.configuration, "drive", "v3", self.scopes)
+ return list_files(client, folder_id, q)
class GoogleDriveTaskPlugin(TaskPlugin):
- title = "Google Drive - Task"
+ title = "Google Drive Plugin - Task Management"
slug = "google-drive-task"
description = "Uses Google Drive to help manage incident tasks."
version = google_drive_plugin.__version__
- author = "Marc Vilanova"
+ author = "Netflix"
author_url = "https://github.com/netflix/dispatch.git"
- _schema = None
-
def __init__(self):
+ self.configuration_schema = GoogleConfiguration
self.scopes = ["https://www.googleapis.com/auth/drive"]
def create(self, file_id: str, text: str):
"""Creates a new task."""
pass
- def update(self, file_id: str, task_id):
+    def update(self, file_id: str, task_id: str, content: str | None = None, resolved: bool = False):
"""Updates an existing task."""
- pass
+ client = get_service(
+ self.configuration, "drive", "v3", ["https://www.googleapis.com/auth/drive"]
+ )
+ return add_reply(client, file_id, task_id, content, resolved)
- def list(self, file_id: str, **kwargs):
+ def list(self, file_id: str, lookback: int = 60, **kwargs):
"""Lists all available tasks."""
- client = get_service("drive", "v3", self.scopes)
- return list_tasks(client, file_id)
+ activity_client = get_service(
+ self.configuration,
+ "driveactivity",
+ "v2",
+ ["https://www.googleapis.com/auth/drive.activity.readonly"],
+ )
+ comment_client = get_service(
+ self.configuration, "drive", "v3", ["https://www.googleapis.com/auth/drive"]
+ )
+ people_client = get_service(
+ self.configuration,
+ "people",
+ "v1",
+ ["https://www.googleapis.com/auth/contacts.readonly"],
+ )
+ return get_task_activity(activity_client, comment_client, people_client, file_id, lookback)
diff --git a/src/dispatch/plugins/dispatch_google/drive/task.py b/src/dispatch/plugins/dispatch_google/drive/task.py
index a74f8b9b1e4e..cbb4401da71c 100644
--- a/src/dispatch/plugins/dispatch_google/drive/task.py
+++ b/src/dispatch/plugins/dispatch_google/drive/task.py
@@ -4,73 +4,145 @@
:copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
:license: Apache, see LICENSE for more details.
.. moduleauthor:: Kevin Glisson
+.. moduleauthor:: Marc Vilanova
"""
+
import re
-from typing import Any, List, Dict
+import logging
+from typing import Any
+
+from dispatch.task.enums import TaskStatus
+from enum import Enum
+
+from .drive import get_activity, get_comment, get_person
+
+log = logging.getLogger(__name__)
-from dispatch.plugins.dispatch_google.config import GOOGLE_DOMAIN
-from .drive import get_file, list_comments
+class CommentTypes(str, Enum):
+ assignment = "assignment"
+ post = "post"
-def get_assignees(content: str) -> List[str]:
- """Gets assignees from comment."""
- regex = r"(?<=\+).*?(?=\@)"
- matches = re.finditer(regex, content, re.DOTALL)
- return [f"{m.group()}@{GOOGLE_DOMAIN}" for m in matches]
+class PostSubTypes(str, Enum):
+ subtype_unspecified = "SUBTYPE_UNSPECIFIED"
+ added = "ADDED"
+ deleted = "DELETED"
+ reply_added = "REPLY_ADDED"
+ reply_deleted = "REPLY_DELETED"
+ resolved = "RESOLVED"
+ reopened = "REOPENED"
-def parse_comment(content: str) -> Dict:
- """Parses a comment into it's various parts."""
- assignees = get_assignees(content)
- return {"assignees": assignees}
+class AssignmentSubTypes(str, Enum):
+ subtype_unspecified = "SUBTYPE_UNSPECIFIED"
+ added = "ADDED"
+ deleted = "DELETED"
+ reply_added = "REPLY_ADDED"
+ reply_deleted = "REPLY_DELETED"
+ resolved = "RESOLVED"
+ reopened = "REOPENED"
+ reassigned = "REASSIGNED"
-def get_task_status(task: dict):
- """Gets the current status from task."""
- status = {"resolved": False, "resolved_at": "", "resolved_by": ""}
- if task.get("resolved"):
- for r in task["replies"]:
- if r.get("action") == "resolve":
- status["resolved_by"] = r["author"]["displayName"]
- status["resolved_at"] = r["createdTime"]
- return status
+def find_urls(text: str) -> list[str]:
+ """Finds a url in a text blob."""
+ # findall() has been used
+ # with valid conditions for urls in string
+ regex = r"(?i)\b((?:https?://|www\d{0,3}[.]|[a-z0-9.\-]+[.][a-z]{2,4}/)(?:[^\s()<>]+|\(([^\s()<>]+|(\([^\s()<>]+\)))*\))+(?:\(([^\s()<>]+|(\([^\s()<>]+\)))*\)|[^\s`!()\[\]{};:'\".,<>?ÂĢÂģ""'']))"
+ url = re.findall(regex, text)
+ return [x[0] for x in url]
-def filter_comments(comments: List[Any]):
- """Filters comments for tasks."""
- return [c for c in comments if parse_comment(c["content"])["assignees"]]
+def get_tickets(replies: list[dict]):
+ """Fetches urls/tickets from task replies."""
+ tickets = []
+ for r in replies:
+ if r.get("content"):
+ for url in find_urls(r["content"]):
+ tickets.append({"web_link": url})
+ return tickets
-# NOTE We have to use `displayName` instead of `emailAddress` because it's
-# not visible to us. We should ask rcerda about why that might be.
-def list_tasks(client: Any, file_id: str):
- """Returns all tasks in file."""
- doc = get_file(client, file_id)
+def get_user_email(client: Any, person_id: str) -> str:
+ """Resolves the email address for the actor of the activity."""
+ try:
+ # fetch the email from the people api
+ person_data = get_person(client, person_id)
+ email_address = person_data["emailAddresses"][0]["value"]
+    except (KeyError, IndexError):
+ return "unknown@example.com"
- document_meta = {"document": {"id": file_id, "name": doc["name"]}}
+ return email_address
- all_comments = list_comments(client, file_id)
- task_comments = filter_comments(all_comments)
+
+def get_task_activity(
+ activity_client: Any, comment_client: Any, people_client: Any, file_id: str, lookback: int = 60
+):
+ """Gets a files comment activity and filters for task related events."""
+ activities = get_activity(activity_client, file_id, lookback=lookback)
tasks = []
- for t in task_comments:
- status = get_task_status(t)
- assignees = get_assignees(t["content"])
- description = (t.get("quotedFileContent", {}).get("value", ""),)
-
- task_meta = {
- "task": {
- "id": t["id"],
- "status": status,
- "description": description,
- "owner": t["author"]["displayName"],
- "created_at": t["createdTime"],
- "assignees": assignees,
- "web_link": f'https://docs.google.com/a/{GOOGLE_DOMAIN}/document/d/{file_id}/edit?disco={t["id"]}',
+    for a in sorted(activities, key=lambda activity: activity["timestamp"]):
+ # process an assignment activity
+ if a["primaryActionDetail"]["comment"].get(CommentTypes.assignment):
+ subtype = a["primaryActionDetail"]["comment"][CommentTypes.assignment]["subtype"]
+ discussion_id = a["targets"][0]["fileComment"]["legacyDiscussionId"]
+
+ task = {"resource_id": discussion_id}
+
+            # we assume the person doing the assignment is the creator of the task
+ creator_person_id = a["actors"][0]["user"]["knownUser"]["personName"]
+ task["creator"] = {
+ "individual": {"email": get_user_email(people_client, creator_person_id)}
}
- }
- tasks.append({**document_meta, **task_meta})
+            # we create a new task when a comment has an assignment added to it
+ if subtype == AssignmentSubTypes.added:
+                # we need to fetch the comment data
+ comment = get_comment(comment_client, file_id, discussion_id)
+
+ task["description"] = comment.get("quotedFileContent", {}).get("value", "")
+
+ task["tickets"] = get_tickets(comment["replies"])
+
+                # we only associate the current assignee if multiple people are mentioned (NOTE: should we also associate other mentions?)
+ assignee_person_id = a["primaryActionDetail"]["comment"][CommentTypes.assignment][
+ "assignedUser"
+ ]["knownUser"]["personName"]
+ task["assignees"] = [
+ {"individual": {"email": get_user_email(people_client, assignee_person_id)}}
+ ]
+
+                # this is when the user was assigned (turning the comment into a task), not when the initial comment was created
+ task["created_at"] = a["timestamp"]
+
+ # this is the deep link to the associated comment
+ task["weblink"] = a["targets"][0]["fileComment"]["linkToDiscussion"]
+
+ elif subtype == AssignmentSubTypes.reply_added:
+ # check to see if there are any linked tickets
+ comment_id = a["targets"][0]["fileComment"]["legacyDiscussionId"]
+ comment = get_comment(comment_client, file_id, comment_id)
+ task["tickets"] = get_tickets(comment["replies"])
+
+ elif subtype == AssignmentSubTypes.deleted:
+ task["status"] = TaskStatus.resolved
+
+ elif subtype == AssignmentSubTypes.resolved:
+ task["status"] = TaskStatus.resolved
+
+ elif subtype == AssignmentSubTypes.reassigned:
+ assignee_person_id = a["primaryActionDetail"]["comment"][CommentTypes.assignment][
+ "assignedUser"
+ ]["knownUser"]["personName"]
+ task["assignees"] = [
+ {"individual": {"email": get_user_email(people_client, assignee_person_id)}}
+ ]
+
+ elif subtype == AssignmentSubTypes.reopened:
+ task["status"] = TaskStatus.open
+ tasks.append(task)
return tasks
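The ticket extraction above reduces to a regex scan over each reply body. A standalone sketch of the same idea, using a simplified scheme-only URL pattern (the plugin's full regex also handles bare domains and parentheses):

```python
import re

# simplified version of the plugin's URL regex: scheme-prefixed URLs only
URL_REGEX = r"(?i)\bhttps?://[^\s()<>'\"]+"


def find_urls(text: str) -> list[str]:
    """Finds all URLs in a text blob."""
    return re.findall(URL_REGEX, text)


def get_tickets(replies: list[dict]) -> list[dict]:
    """Collects URLs from task replies as ticket weblinks."""
    tickets = []
    for reply in replies:
        # replies may lack a content key entirely
        for url in find_urls(reply.get("content") or ""):
            tickets.append({"web_link": url})
    return tickets


replies = [
    {"content": "Tracking in https://example.com/browse/SEC-123"},
    {"content": "no links here"},
    {},
]
print(get_tickets(replies))  # [{'web_link': 'https://example.com/browse/SEC-123'}]
```

The example domain and issue key are placeholders; only the extraction logic mirrors the code above.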
diff --git a/src/dispatch/plugins/dispatch_google/gmail/filters.py b/src/dispatch/plugins/dispatch_google/gmail/filters.py
deleted file mode 100644
index 8c9d37d01f4f..000000000000
--- a/src/dispatch/plugins/dispatch_google/gmail/filters.py
+++ /dev/null
@@ -1,14 +0,0 @@
-import os
-from datetime import datetime
-
-from jinja2 import Environment, FileSystemLoader
-
-here = os.path.dirname(os.path.realpath(__file__))
-env = Environment(loader=FileSystemLoader(here))
-
-
-def format_datetime(value):
- return datetime.fromisoformat(value).strftime("%A %d. %B %Y")
-
-
-env.filters["datetime"] = format_datetime
diff --git a/src/dispatch/plugins/dispatch_google/gmail/plugin.py b/src/dispatch/plugins/dispatch_google/gmail/plugin.py
index e861104ba7d3..5cc53c41c8b7 100644
--- a/src/dispatch/plugins/dispatch_google/gmail/plugin.py
+++ b/src/dispatch/plugins/dispatch_google/gmail/plugin.py
@@ -5,158 +5,117 @@
:license: Apache, see LICENSE for more details.
.. moduleauthor:: Kevin Glisson
"""
+
+import time
+from email.mime.text import MIMEText
import base64
import logging
-import os
-import platform
-from email.mime.text import MIMEText
-from typing import Dict, List, Optional
from tenacity import retry, stop_after_attempt
-from dispatch.config import DISPATCH_HELP_EMAIL, DISPATCH_HELP_SLACK_CHANNEL
from dispatch.decorators import apply, counter, timer
-from dispatch.messaging import (
- INCIDENT_TASK_REMINDER_DESCRIPTION,
+from dispatch.messaging.strings import (
MessageType,
- render_message_template,
)
-from dispatch.plugins.bases import ConversationPlugin
+from dispatch.plugins.bases import EmailPlugin
from dispatch.plugins.dispatch_google import gmail as google_gmail_plugin
from dispatch.plugins.dispatch_google.common import get_service
-from dispatch.plugins.dispatch_google.config import (
- GOOGLE_USER_OVERRIDE,
- GOOGLE_SERVICE_ACCOUNT_DELEGATED_ACCOUNT,
-)
+from dispatch.plugins.dispatch_google.config import GoogleConfiguration
-from .filters import env
+from dispatch.messaging.email.utils import create_message_body, create_multi_message_body
log = logging.getLogger(__name__)
@retry(stop=stop_after_attempt(3))
-def send_message(service, message):
+def send_message(service, message: dict) -> bool:
"""Sends an email message."""
- return service.users().messages().send(userId="me", body=message).execute()
+ sent_message_thread_id = (
+ service.users().messages().send(userId="me", body=message).execute()["threadId"]
+ )
+
+ # wait for a bounce
+ time.sleep(1)
+
+ messages = (
+ service.users()
+ .messages()
+ .list(userId="me", q="from=mailer-daemon@googlemail.com", maxResults=10)
+ .execute()
+ ).get("messages", [])
+
+ for message in messages:
+ if message["threadId"] == sent_message_thread_id:
+ return False
+ return True
-def create_html_message(recipient: str, subject: str, body: str) -> Dict:
+def create_html_message(sender: str, recipient: str, cc: str, subject: str, body: str) -> dict:
"""Creates a message for an email."""
message = MIMEText(body, "html")
- if GOOGLE_USER_OVERRIDE:
- recipient = GOOGLE_USER_OVERRIDE
- log.warning("GOOGLE_USER_OVERIDE set. Using override.")
-
message["to"] = recipient
- message["from"] = GOOGLE_SERVICE_ACCOUNT_DELEGATED_ACCOUNT
+ message["cc"] = cc
+ message["from"] = sender
message["subject"] = subject
return {"raw": base64.urlsafe_b64encode(message.as_bytes()).decode()}
-def get_template(message_type: MessageType):
- """Fetches the correct template based on the message type."""
- template_map = {
- MessageType.incident_notification: ("notification.html", None),
- MessageType.incident_status_report: ("status_report.html", None),
- MessageType.incident_participant_welcome: ("notification.html", None),
- MessageType.incident_task_reminder: (
- "task_notification.html",
- INCIDENT_TASK_REMINDER_DESCRIPTION,
- ),
- }
-
- template_path, description = template_map.get(message_type, (None, None))
-
- if not template_path:
- raise Exception(f"Unable to determine template. MessageType: {message_type}")
-
- return env.get_template(os.path.join("templates", template_path)), description
-
-
-def create_multi_message_body(
- message_template: dict, message_type: MessageType, items: list, **kwargs
-):
- """Creates a multi message message body based on message type."""
- template, description = get_template(message_type)
-
- master_map = []
- for item in items:
- data = item.__dict__
- data.update({"name": item.incident.name, "title": item.incident.title})
- master_map.append(render_message_template(message_template, **data))
-
- kwargs.update({"items": master_map, "description": description})
- return template.render(**kwargs)
-
-
-def create_message_body(message_template: dict, message_type: MessageType, **kwargs):
- """Creates the correct message body based on message type."""
- template, description = get_template(message_type)
- kwargs.update(
- {
- "dispatch_help_email": DISPATCH_HELP_EMAIL,
- "dispatch_help_slack_channel": DISPATCH_HELP_SLACK_CHANNEL,
- }
- )
- rendered = render_message_template(message_template, **kwargs)
- kwargs.update({"items": rendered, "description": description})
- return template.render(**kwargs)
-
-
-def render_email(name, message):
- """Helper function to preview your email."""
- with open(name, "wb") as fp:
- fp.write(message.encode("utf-8"))
-
- if platform.system() == "Linux":
- cwd = os.getcwd()
- print(f"file:/{cwd}/{name}")
- else:
- print(
- rf"/Applications/Google\ Chrome.app/Contents/MacOS/Google\ Chrome {name}"
- ) # noqa: W605
-
-
@apply(timer, exclude=["__init__"])
@apply(counter, exclude=["__init__"])
-class GoogleGmailConversationPlugin(ConversationPlugin):
- title = "Google Gmail - Conversation"
- slug = "google-gmail-conversation"
- description = "Uses gmail to facilitate conversations."
+class GoogleGmailEmailPlugin(EmailPlugin):
+ title = "Google Gmail Plugin - Email Management"
+ slug = "google-gmail-email"
+ description = "Uses Gmail to facilitate emails."
version = google_gmail_plugin.__version__
- author = "Kevin Glisson"
+ author = "Netflix"
author_url = "https://github.com/netflix/dispatch.git"
def __init__(self):
+ self.configuration_schema = GoogleConfiguration
self.scopes = ["https://mail.google.com/"]
def send(
self,
- user: str,
- message_template: dict,
+ recipient: str,
+ notification_text: str,
+ notification_template: dict,
notification_type: MessageType,
- items: Optional[List] = None,
+ items: list | None = None,
**kwargs,
):
"""Sends an html email based on the type."""
# TODO allow for bulk sending (kglisson)
- client = get_service("gmail", "v1", self.scopes)
+ client = get_service(self.configuration, "gmail", "v1", self.scopes)
+
+ subject = notification_text
+
+ if kwargs.get("name"):
+ subject = f"{kwargs['name'].upper()} - {notification_text}"
if kwargs.get("subject"):
subject = kwargs["subject"]
- else:
- subject = f"{kwargs['name'].upper()} - Incident Notification"
+
+ cc = ""
+ if kwargs.get("cc"):
+ cc = kwargs["cc"]
if not items:
- message_body = create_message_body(message_template, notification_type, **kwargs)
+ message_body = create_message_body(
+ notification_template, notification_type, self.project_id, **kwargs
+ )
else:
message_body = create_multi_message_body(
- message_template, notification_type, items, **kwargs
+ notification_template, notification_type, items, self.project_id, **kwargs
)
- # render_email("task-reminder.html", message_body)
- html_message = create_html_message(user, subject, message_body)
+ html_message = create_html_message(
+ self.configuration.service_account_delegated_account,
+ recipient,
+ cc,
+ subject,
+ message_body,
+ )
return send_message(client, html_message)
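The Gmail API's send endpoint expects the raw RFC 2822 message as a URL-safe base64 string, which is what `create_html_message` builds. A minimal stdlib round-trip sketch of that encoding (addresses and subject are illustrative placeholders):

```python
import base64
from email import message_from_bytes
from email.mime.text import MIMEText


def create_html_message(sender: str, recipient: str, cc: str, subject: str, body: str) -> dict:
    """Builds the {"raw": ...} payload the Gmail API send endpoint expects."""
    message = MIMEText(body, "html")
    message["to"] = recipient
    message["cc"] = cc
    message["from"] = sender
    message["subject"] = subject
    # url-safe base64 of the full RFC 2822 message bytes
    return {"raw": base64.urlsafe_b64encode(message.as_bytes()).decode()}


payload = create_html_message(
    "dispatch@example.com", "oncall@example.com", "", "TEST-1 - Notification", "<p>hi</p>"
)

# decoding the payload recovers the original headers and content type
decoded = message_from_bytes(base64.urlsafe_b64decode(payload["raw"]))
print(decoded["subject"])  # TEST-1 - Notification
```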
diff --git a/src/dispatch/plugins/dispatch_google/gmail/templates/base.html b/src/dispatch/plugins/dispatch_google/gmail/templates/base.html
deleted file mode 100644
index 0acdcbd1521f..000000000000
--- a/src/dispatch/plugins/dispatch_google/gmail/templates/base.html
+++ /dev/null
@@ -1,183 +0,0 @@
- [183 deleted lines: HTML base email template; the markup was stripped during extraction and only the {{name}} placeholder is recoverable]
diff --git a/src/dispatch/plugins/dispatch_google/gmail/templates/notification.html b/src/dispatch/plugins/dispatch_google/gmail/templates/notification.html
deleted file mode 100644
index 2f7bc3cb5596..000000000000
--- a/src/dispatch/plugins/dispatch_google/gmail/templates/notification.html
+++ /dev/null
@@ -1,10 +0,0 @@
-{% extends "templates/base.html" %} {% block items %} {% for item in items %}
-
-{% endfor %} {% endblock %}
diff --git a/src/dispatch/plugins/dispatch_google/gmail/templates/status_report.html b/src/dispatch/plugins/dispatch_google/gmail/templates/status_report.html
deleted file mode 100644
index 095e68ba55a7..000000000000
--- a/src/dispatch/plugins/dispatch_google/gmail/templates/status_report.html
+++ /dev/null
@@ -1,23 +0,0 @@
-{% extends 'templates/base.html' %} {% block description %}
-
- This message is for notification purposes only. No direct action of you
- is currently required.
-
-
- Please be aware that this is a point in time update and the details below will
- likely change.
-
-{% endblock %} {% block items %}
-
- Conditions
- {{ conditions }}
-
-
- Actions
- {{ actions }}
-
-
-{% endblock %}
diff --git a/src/dispatch/plugins/dispatch_google/gmail/templates/task_notification.html b/src/dispatch/plugins/dispatch_google/gmail/templates/task_notification.html
deleted file mode 100644
index 7a08d7b67131..000000000000
--- a/src/dispatch/plugins/dispatch_google/gmail/templates/task_notification.html
+++ /dev/null
@@ -1,24 +0,0 @@
-{% extends 'templates/base.html' %}{% block description %}
-
- {{ description }}
-
-{% endblock %} {% block items %} {% for item in items %}
-
-
-{% for row in item %}
-
-
- {% if row.title_link %}
- {{ row.title }}
- {% elif row.datetime %}
- {{ row.title }} {{ row.datetime | datetime }}
- {% else %}
- {{ row.title }} {{ row.text }}
- {% endif %}
-
-{% endfor %}
-
-
-{% endfor %} {% endblock %}
diff --git a/src/dispatch/plugins/dispatch_google/groups/plugin.py b/src/dispatch/plugins/dispatch_google/groups/plugin.py
index c8ab1255ebfe..aac6dedac399 100644
--- a/src/dispatch/plugins/dispatch_google/groups/plugin.py
+++ b/src/dispatch/plugins/dispatch_google/groups/plugin.py
@@ -4,9 +4,10 @@
:copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
:license: Apache, see LICENSE for more details.
"""
+
import logging
import time
-from typing import Any, List
+from typing import Any
from googleapiclient.errors import HttpError
from tenacity import TryAgain, retry, retry_if_exception_type, stop_after_attempt, wait_exponential
@@ -15,7 +16,7 @@
from dispatch.plugins.bases import ParticipantGroupPlugin
from dispatch.plugins.dispatch_google import groups as google_group_plugin
from dispatch.plugins.dispatch_google.common import get_service
-from dispatch.plugins.dispatch_google.config import GOOGLE_USER_OVERRIDE, GOOGLE_DOMAIN
+from dispatch.plugins.dispatch_google.config import GoogleConfiguration
log = logging.getLogger(__name__)
@@ -25,7 +26,7 @@
retry=retry_if_exception_type(TryAgain),
wait=wait_exponential(multiplier=1, min=2, max=5),
)
-def make_call(client: Any, func: Any, delay: int = None, propagate_errors: bool = False, **kwargs):
+def make_call(client: Any, func: Any, delay: int | None = None, propagate_errors: bool = False, **kwargs):
"""Make an google client api call."""
try:
data = getattr(client, func)(**kwargs).execute()
@@ -35,12 +36,12 @@ def make_call(client: Any, func: Any, delay: int = None, propagate_errors: bool
return data
except HttpError as e:
- if e.resp.status in [409]:
+ if propagate_errors:
+ raise e
+ else:
log.error(e.content.decode())
- if propagate_errors:
- raise e
- raise TryAgain
+ raise TryAgain from None
def expand_group(client: Any, group_key: str):
@@ -67,10 +68,6 @@ def add_member(client: Any, group_key: str, email: str, role: str):
for m in members:
body = {"email": m, "role": role}
- if GOOGLE_USER_OVERRIDE:
- log.warning("GOOGLE_USER_OVERIDE set. Using override.")
- body["email"] = GOOGLE_USER_OVERRIDE
-
try:
make_call(
client.members(), "insert", groupKey=group_key, body=body, propagate_errors=True
@@ -78,6 +75,9 @@ def add_member(client: Any, group_key: str, email: str, role: str):
except HttpError as e:
# we are okay with duplication errors upon insert
if e.resp.status in [409]:
+ log.debug(
+ f"Member already exists in google group. GroupKey={group_key} Body={body}"
+ )
continue
log.debug(f"Error adding group member. GroupKey={group_key} Body={body} ")
@@ -85,12 +85,33 @@ def add_member(client: Any, group_key: str, email: str, role: str):
def remove_member(client: Any, group_key: str, email: str):
"""Removes member from google group."""
- return make_call(client.members(), "delete", groupKey=group_key, memberKey=email)
+ try:
+ return make_call(
+ client.members(), "delete", groupKey=group_key, memberKey=email, propagate_errors=True
+ )
+ except HttpError as e:
+ if e.resp.status in [409]:
+ log.debug(
+ f"Member does not exist in google group. GroupKey={group_key} MemberKey={email}"
+ )
+ return
+ elif e.resp.status in [404]:
+ log.debug(
+ f"Group does not exist. GroupKey={group_key} Trying to remove MemberKey={email}"
+ )
+ return
def list_members(client: Any, group_key: str, **kwargs):
"""Lists all members of google group."""
- return make_call(client.members(), "list", groupKey=group_key, **kwargs)
+ try:
+ return make_call(client.members(), "list", groupKey=group_key, **kwargs)
+ except HttpError as e:
+ if e.resp.status in [404]:
+ log.debug(f"Group does not exist. GroupKey={group_key} Trying to list members.")
+ return
+ else:
+ raise e
def create_group(client: Any, name: str, email: str, description: str):
@@ -104,36 +125,31 @@ def delete_group(client: Any, group_key: str, **kwargs):
return make_call(client.groups(), "delete", groupKey=group_key, **kwargs)
-def list_groups(client: Any, **kwargs):
- """Lists all google groups available."""
- return make_call(client.groups(), "list", **kwargs)
-
-
@apply(timer, exclude=["__init__"])
@apply(counter, exclude=["__init__"])
class GoogleGroupParticipantGroupPlugin(ParticipantGroupPlugin):
- title = "Google Group - Participant Group"
+ title = "Google Group Plugin - Participant Group Management"
slug = "google-group-participant-group"
description = "Uses Google Groups to help manage participant membership."
version = google_group_plugin.__version__
- author = "Kevin Glisson"
+ author = "Netflix"
author_url = "https://github.com/netflix/dispatch.git"
- _schema = None
-
def __init__(self):
+ self.configuration_schema = GoogleConfiguration
self.scopes = [
"https://www.googleapis.com/auth/admin.directory.group",
"https://www.googleapis.com/auth/apps.groups.settings",
]
def create(
- self, name: str, participants: List[str], description: str = None, role: str = "MEMBER"
+        self, name: str, participants: list[str], description: str | None = None, role: str = "MEMBER"
):
"""Creates a new Google Group."""
- client = get_service("admin", "directory_v1", self.scopes)
- group_key = f"{name.lower()}@{GOOGLE_DOMAIN}"
+ client = get_service(self.configuration, "admin", "directory_v1", self.scopes)
+ # note: group username is limited to 60 characters
+ group_key = f"{name.lower()[:60]}@{self.configuration.google_domain}"
if not description:
description = "Group automatically created by Dispatch."
@@ -145,24 +161,35 @@ def create(
group.update(
{
- "weblink": f"https://groups.google.com/a/{GOOGLE_DOMAIN}/forum/#!forum/{group['name']}"
+ "weblink": f"https://groups.google.com/a/{self.configuration.google_domain}/forum/#!forum/{group['name']}"
}
)
return group
- def add(self, email: str, participants: List[str], role: str = "MEMBER"):
- """Adds participants to existing Google Group."""
- client = get_service("admin", "directory_v1", self.scopes)
+ def add(self, email: str, participants: list[str], role: str = "MEMBER"):
+ """Adds participants to an existing Google Group."""
+ client = get_service(self.configuration, "admin", "directory_v1", self.scopes)
for p in participants:
add_member(client, email, p, role)
- def remove(self, email: str, participants: List[str]):
- """Removes participants from existing Google Group."""
- client = get_service("admin", "directory_v1", self.scopes)
+ def remove(self, email: str, participants: list[str]):
+ """Removes participants from an existing Google Group."""
+ client = get_service(self.configuration, "admin", "directory_v1", self.scopes)
for p in participants:
remove_member(client, email, p)
+ def list(self, email: str) -> list[str]:
+ """Lists members from an existing Google Group."""
+ client = get_service(self.configuration, "admin", "directory_v1", self.scopes)
+        try:
+            members = list_members(client, email)
+            # list_members returns None when the group does not exist
+            if not members:
+                return []
+            return [m["email"] for m in members.get("members", [])]
+        except HttpError as e:
+            if e.resp.status == 404:
+                log.warning(f"Group does not exist. GroupKey={email} Unable to list members.")
+            return []
+
def delete(self, email: str):
- """Deletes an existing google group."""
- client = get_service("admin", "directory_v1", self.scopes)
+ """Deletes an existing Google group."""
+ client = get_service(self.configuration, "admin", "directory_v1", self.scopes)
delete_group(client, email)
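`make_call` above leans on tenacity to retry transient Google API failures, while re-raising immediately when `propagate_errors` is set. A dependency-free sketch of that retry contract (`FakeHttpError` stands in for googleapiclient's `HttpError`; attempt counts and delays are illustrative):

```python
import time


class FakeHttpError(Exception):
    """Stand-in for googleapiclient.errors.HttpError."""


def make_call(func, *, attempts: int = 3, delay: float = 0.0, propagate_errors: bool = False):
    """Retries transient failures; re-raises immediately if propagate_errors is set."""
    last_error = None
    for _ in range(attempts):
        try:
            data = func()
            if delay:
                # some google api calls are eventually consistent; wait before returning
                time.sleep(delay)
            return data
        except FakeHttpError as e:
            if propagate_errors:
                raise
            last_error = e
    raise last_error


calls = {"n": 0}


def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise FakeHttpError("transient")
    return {"ok": True}


print(make_call(flaky))  # {'ok': True} after two retried failures
```

The real implementation adds exponential backoff between attempts via tenacity's `wait_exponential`; this sketch only shows the retry-versus-propagate decision.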
diff --git a/src/dispatch/plugins/dispatch_jira/config.py b/src/dispatch/plugins/dispatch_jira/config.py
deleted file mode 100644
index cc08db6441a2..000000000000
--- a/src/dispatch/plugins/dispatch_jira/config.py
+++ /dev/null
@@ -1,11 +0,0 @@
-from starlette.datastructures import URL
-
-from dispatch.config import config, Secret
-
-
-JIRA_BROWSER_URL = config("JIRA_BROWSER_URL", cast=URL)
-JIRA_API_URL = config("JIRA_API_URL", cast=URL)
-JIRA_USERNAME = config("JIRA_USERNAME")
-JIRA_PASSWORD = config("JIRA_PASSWORD", cast=Secret)
-JIRA_PROJECT_KEY = config("JIRA_PROJECT_KEY")
-JIRA_ISSUE_TYPE_ID = config("JIRA_ISSUE_TYPE_ID")
diff --git a/src/dispatch/plugins/dispatch_jira/plugin.py b/src/dispatch/plugins/dispatch_jira/plugin.py
index fb1e414e4ff3..a0f1246949cf 100644
--- a/src/dispatch/plugins/dispatch_jira/plugin.py
+++ b/src/dispatch/plugins/dispatch_jira/plugin.py
@@ -1,139 +1,260 @@
"""
.. module: dispatch.plugins.dispatch_jira.plugin
- :platform: Unix
- :copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
- :license: Apache, see LICENSE for more details.
+:platform: Unix
+:copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
+:license: Apache, see LICENSE for more details.
"""
+
+import json
+import logging
+from typing import Any
+
+import requests
+from requests.auth import HTTPBasicAuth
+
+from pydantic import Field, SecretStr, AnyHttpUrl
+
from jinja2 import Template
from jira import JIRA
-from typing import Any, List
+from dispatch.config import BaseConfigurationModel
from dispatch.decorators import apply, counter, timer
+from dispatch.enums import DispatchEnum
from dispatch.plugins import dispatch_jira as jira_plugin
+from dispatch.case import service as case_service
+from dispatch.incident import service as incident_service
from dispatch.plugins.bases import TicketPlugin
+from dispatch.project.models import Project
-from .config import (
- JIRA_BROWSER_URL,
- JIRA_API_URL,
- JIRA_USERNAME,
- JIRA_PASSWORD,
- JIRA_PROJECT_KEY,
- JIRA_ISSUE_TYPE_ID,
+from .templates import (
+ CASE_ISSUE_SUMMARY_TEMPLATE,
+ INCIDENT_ISSUE_SUMMARY_NO_RESOURCES_TEMPLATE,
+ INCIDENT_ISSUE_SUMMARY_TEMPLATE,
)
-INCIDENT_TEMPLATE = """
-{color:red}*CONFIDENTIAL -- Internal use only{color}*
+from dispatch.config import DISPATCH_UI_URL
-*Summary*
-{{description}}
+log = logging.getLogger(__name__)
-*Incident Commander*
-[~{{commander_username}}]
-*Incident Resources*
-[Incident Conversation|{{conversation_weblink}}]
-[Incident Document|{{document_weblink}}]
-[Incident Storage|{{storage_weblink}}]
-[Incident Conference|{{conference_weblink}}]
-"""
+class HostingType(DispatchEnum):
+ """Type of Jira deployment."""
+
+ cloud = "cloud"
+ server = "server"
+
+
+class JiraConfiguration(BaseConfigurationModel):
+ """Jira configuration description."""
+
+ default_project_id: str = Field(
+ title="Default Project Id or Key", description="Defines the default Jira Project to use."
+ )
+ default_issue_type_name: str = Field(
+ title="Default Issue Type Name",
+ description="Defines the default Jira issue type name to use.",
+ )
+ hosting_type: HostingType = Field(
+ "cloud", title="Hosting Type", description="Defines the type of deployment."
+ )
+ username: str = Field(
+ title="Username", description="Username to use to authenticate to Jira API."
+ )
+ password: SecretStr = Field(
+ title="Password", description="Password to use to authenticate to Jira API."
+ )
+ api_url: AnyHttpUrl = Field(
+ title="API URL", description="This URL is used for communication with API."
+ )
+ browser_url: AnyHttpUrl = Field(
+ title="Browser URL", description="This URL is used to construct browser weblinks."
+ )
+
+
+def get_email_username(email: str) -> str:
+ """Returns the username part of an email address, if a valid email is provided."""
+ if "@" in email:
+ return email.split("@")[0]
+ return email
+
+
+def get_cloud_user_account_id_by_email(configuration: JiraConfiguration, user_email: str) -> str:
+ """Gets the cloud Jira user account id by email address."""
+ endpoint = "groupuserpicker"
+ url = f"{configuration.api_url}/rest/api/2/{endpoint}?query={user_email}&maxResults=1"
+ auth = (configuration.username, configuration.password.get_secret_value())
-INCIDENT_PRIORITY_MAP = {
- "low": {"id": "28668"},
- "medium": {"id": "28669"},
- "high": {"id": "28670"},
- "info": {"id": "40461"},
-}
+ headers = {"Accept": "application/json"}
+ response = requests.request("GET", url, headers=headers, auth=HTTPBasicAuth(*auth))
+ users = json.loads(response.text)
+ if users["users"]["users"]:
+ return users["users"]["users"][0]["accountId"]
-def create_sec_issue(
- client: Any,
+ # we get and return the account id of the default Jira account
+ url = (
+ f"{configuration.api_url}/rest/api/2/{endpoint}?query={configuration.username}&maxResults=1"
+ )
+ response = requests.request("GET", url, headers=headers, auth=HTTPBasicAuth(*auth))
+ users = json.loads(response.text)
+ return users["users"]["users"][0]["accountId"]
+
+
+def get_user_field(client: JIRA, configuration: JiraConfiguration, user_email: str) -> dict:
+ """Returns correct Jira user field based on Jira hosting type."""
+ if configuration.hosting_type == HostingType.server:
+ username = get_email_username(user_email)
+ users = client.search_users(user=username)
+ for user in users:
+ if user.name == username:
+ return {"name": user.name}
+
+ # we default to the Jira user we use for managing issues
+ # if we can't find the user in Jira
+ return {"name": configuration.username}
+ if configuration.hosting_type == HostingType.cloud:
+ user_account_id = get_cloud_user_account_id_by_email(configuration, user_email)
+ return {"id": user_account_id}
+
+
+def process_plugin_metadata(plugin_metadata: dict):
+ """Processes plugin metadata."""
+ project_id = None
+ issue_type_name = None
+ if plugin_metadata:
+ for key_value in plugin_metadata["metadata"]:
+ if key_value["key"] == "project_id":
+ project_id = key_value["value"]
+ if key_value["key"] == "issue_type_name":
+ issue_type_name = key_value["value"]
+
+ return project_id, issue_type_name
+
+
+def create_dict_from_plugin_metadata(plugin_metadata: dict):
+ """Creates a dictionary from plugin metadata, excluding project_id and issue_type_name."""
+ metadata_dict = {}
+ if plugin_metadata:
+ for key_value in plugin_metadata["metadata"]:
+ if key_value["key"] != "project_id" and key_value["key"] != "issue_type_name":
+ metadata_dict[key_value["key"]] = key_value["value"]
+
+ return metadata_dict
+
+
+def create_client(configuration: JiraConfiguration) -> JIRA:
+ """Creates a Jira client."""
+ return JIRA(
+ str(configuration.api_url),
+ basic_auth=(configuration.username, configuration.password.get_secret_value()),
+ )
+
+
+def create_incident_issue_fields(
title: str,
- priority: str,
+ description: str,
incident_type: str,
+ incident_severity: str,
+ incident_priority: str,
+ assignee: dict,
+ reporter: dict,
commander_username: str,
- reporter_username: str,
-):
- issue_fields = {
- "project": {"key": JIRA_PROJECT_KEY},
- "issuetype": {"id": JIRA_ISSUE_TYPE_ID},
- "summary": title,
- "assignee": {"name": commander_username},
- "components": [{"name": incident_type}],
- "reporter": {"name": reporter_username},
- "customfield_10551": INCIDENT_PRIORITY_MAP[priority.lower()],
- }
-
- return create(client, issue_fields, type=JIRA_PROJECT_KEY)
-
-
-def create_issue_fields(
- title: str = None,
- description: str = None,
- incident_type: str = None,
- priority: str = None,
- commander_username: str = None,
- reporter_username: str = None,
- conversation_weblink: str = None,
- document_weblink: str = None,
- storage_weblink: str = None,
- conference_weblink: str = None,
- labels: List[str] = None,
- cost: str = None,
+ conversation_weblink: str,
+ document_weblink: str,
+ storage_weblink: str,
+ conference_weblink: str,
+ dispatch_weblink: str,
+ cost: float,
):
"""Creates Jira issue fields."""
- issue_fields = {}
+ cost = f"${cost:,.2f}"
- if title:
- issue_fields.update({"summary": title})
+ issue_fields = {}
+ issue_fields.update({"summary": title})
+ issue_fields.update({"assignee": assignee})
+ issue_fields.update({"reporter": reporter})
if (
- description
- and commander_username
- and document_weblink
- and conversation_weblink
- and storage_weblink
- and conference_weblink
+ conversation_weblink is None
+ and document_weblink is None
+ and storage_weblink is None
+ and conference_weblink is None
):
- description = Template(INCIDENT_TEMPLATE).render(
+ # the incident was opened as closed and we didn't create resources
+ description = Template(INCIDENT_ISSUE_SUMMARY_NO_RESOURCES_TEMPLATE).render(
+ description=description,
+ incident_type=incident_type,
+ incident_severity=incident_severity,
+ incident_priority=incident_priority,
+ cost=cost,
+ commander_username=commander_username,
+ )
+ else:
+ description = Template(INCIDENT_ISSUE_SUMMARY_TEMPLATE).render(
description=description,
+ incident_type=incident_type,
+ incident_severity=incident_severity,
+ incident_priority=incident_priority,
+ cost=cost,
commander_username=commander_username,
document_weblink=document_weblink,
conference_weblink=conference_weblink,
conversation_weblink=conversation_weblink,
storage_weblink=storage_weblink,
+ dispatch_weblink=dispatch_weblink,
)
- issue_fields.update({"description": description})
-
- if commander_username:
- issue_fields.update({"assignee": {"name": commander_username}})
-
- if reporter_username:
- issue_fields.update({"reporter": {"name": reporter_username}})
+ issue_fields.update({"description": description})
- if incident_type:
- issue_fields.update({"components": [{"name": incident_type}]})
-
- if priority:
- issue_fields.update({"customfield_10551": INCIDENT_PRIORITY_MAP[priority.lower()]})
+ return issue_fields
- if labels:
- issue_fields.update({"labels": labels})
- if cost:
- issue_fields.update({"customfield_20250": cost})
+def create_case_issue_fields(
+ title: str,
+ description: str,
+ resolution: str,
+ case_type: str,
+ case_severity: str,
+ case_priority: str,
+ assignee: dict,
+ reporter: dict,
+ assignee_username: str,
+ document_weblink: str,
+ storage_weblink: str,
+ dispatch_weblink: str,
+):
+ """Creates case Jira issue fields."""
+ issue_fields = {}
+ issue_fields.update({"summary": title})
+ issue_fields.update({"assignee": assignee})
+ issue_fields.update({"reporter": reporter})
+
+ description = Template(CASE_ISSUE_SUMMARY_TEMPLATE).render(
+ assignee_username=assignee_username,
+ case_priority=case_priority,
+ case_severity=case_severity,
+ case_type=case_type,
+ description=description,
+ document_weblink=document_weblink,
+ resolution=resolution,
+ storage_weblink=storage_weblink,
+ dispatch_weblink=dispatch_weblink,
+ )
+ issue_fields.update({"description": description})
return issue_fields
-def create(client: Any, issue_fields: dict, type: str = JIRA_PROJECT_KEY) -> dict:
+def create(configuration: JiraConfiguration, client: Any, issue_fields: dict) -> dict:
"""Creates a Jira issue."""
issue = client.create_issue(fields=issue_fields)
- return {"resource_id": issue.key, "weblink": f"{JIRA_BROWSER_URL}/browse/{issue.key}"}
+ return {"resource_id": issue.key, "weblink": f"{configuration.browser_url}/browse/{issue.key}"}
-def update(client: Any, issue: Any, issue_fields: dict, transition: str = None) -> dict:
+def update(
+ configuration: JiraConfiguration, client: Any, issue: Any, issue_fields: dict, transition: str = None
+) -> dict:
"""Updates a Jira issue."""
- data = {"resource_id": issue.key, "link": f"{JIRA_BROWSER_URL}/browse/{issue.key}"}
+ data = {"resource_id": issue.key, "link": f"{configuration.browser_url}/browse/{issue.key}"}
if issue_fields:
issue.update(fields=issue_fields)
@@ -148,94 +269,338 @@ def update(client: Any, issue: Any, issue_fields: dict, transition: str = None)
return data
-def link_issues(client: Any, link_type: str, issue_id_a: str, issue_id_b: str):
- """Links two Jira issues."""
- issue_key_a = client.issue(issue_id_b).key
- issue_key_b = client.issue(issue_id_b).key
-
- body = (
- f"Creating link of type {link_type} between Jira issue keys {issue_key_a} and {issue_key_b}"
- )
-
- client.create_issue_link(
- type=link_type, inwardIssue=issue_key_a, outwardIssue=issue_key_b, comment={"body": body}
- )
+def create_fallback_ticket_incident(incident_id: int, project: Project):
+ resource_id = f"dispatch-{project.organization.slug}-{project.slug}-{incident_id}"
return {
- "key_a": issue_key_a,
- "link_a": f"{JIRA_BROWSER_URL}/browse/{issue_key_a}",
- "key_b": issue_key_b,
- "link_b": f"{JIRA_BROWSER_URL}/browse/{issue_key_b}",
+ "resource_id": resource_id,
+ "weblink": f"{DISPATCH_UI_URL}/{project.organization.name}/incidents/{resource_id}?project={project.name}",
+ "resource_type": "jira-error-ticket",
}
-def get_user_name(email):
- """Returns username part of email, if valid email is provided."""
- if "@" in email:
- return email.split("@")[0]
- return email
+def create_fallback_ticket_case(case_id: int, project: Project):
+ resource_id = f"dispatch-{project.organization.slug}-{project.slug}-{case_id}"
+
+ return {
+ "resource_id": resource_id,
+ "weblink": f"{DISPATCH_UI_URL}/{project.organization.name}/cases/{resource_id}?project={project.name}",
+ "resource_type": "jira-error-ticket",
+ }
@apply(counter, exclude=["__init__"])
@apply(timer, exclude=["__init__"])
class JiraTicketPlugin(TicketPlugin):
- title = "Jira - Ticket"
+ title = "Jira Plugin - Ticket Management"
slug = "jira-ticket"
- description = "Uses Jira as an external ticket creator."
+ description = "Uses Jira to help manage external tickets."
version = jira_plugin.__version__
- author = "Kevin Glisson"
+ author = "Netflix"
author_url = "https://github.com/netflix/dispatch.git"
- _schema = None
+ def __init__(self):
+ self.configuration_schema = JiraConfiguration
def create(
- self, title: str, incident_type: str, incident_priority: str, commander: str, reporter: str
+ self,
+ incident_id: int,
+ title: str,
+ commander_email: str,
+ reporter_email: str,
+ incident_type_plugin_metadata: dict = None,
+ db_session=None,
):
- """Creates a Jira ticket."""
- client = JIRA(str(JIRA_API_URL), basic_auth=(JIRA_USERNAME, str(JIRA_PASSWORD)))
- commander_username = get_user_name(commander)
- reporter_username = get_user_name(reporter)
- return create_sec_issue(
- client, title, incident_priority, incident_type, commander_username, reporter_username
- )
+ """Creates an incident Jira issue."""
+ try:
+ client = create_client(self.configuration)
+
+ assignee = get_user_field(client, self.configuration, commander_email)
+
+ reporter = assignee
+ if reporter_email != commander_email:
+ reporter = get_user_field(client, self.configuration, reporter_email)
+
+ project_id, issue_type_name = process_plugin_metadata(incident_type_plugin_metadata)
+ other_fields = create_dict_from_plugin_metadata(incident_type_plugin_metadata)
+
+ if not project_id:
+ project_id = self.configuration.default_project_id
+
+ # NOTE: to support issue creation by project id or key
+ project = {"id": project_id}
+ if not project_id.isdigit():
+ project = {"key": project_id}
+
+ if not issue_type_name:
+ issue_type_name = self.configuration.default_issue_type_name
+
+ issuetype = {"name": issue_type_name}
+
+ issue_fields = {
+ "project": project,
+ "issuetype": issuetype,
+ "assignee": assignee,
+ "reporter": reporter,
+ "summary": title.replace("\n", ""),
+ **other_fields,
+ }
+
+ ticket = create(self.configuration, client, issue_fields)
+ except Exception as e:
+ log.exception(
+ f"Failed to create Jira ticket for incident_id: {incident_id}. "
+ f"Creating incident ticket with core plugin instead. Error: {e}"
+ )
+ # fall back to creating a ticket without the plugin
+ incident = incident_service.get(db_session=db_session, incident_id=incident_id)
+ ticket = create_fallback_ticket_incident(
+ incident_id=incident.id, project=incident.project
+ )
+
+ return ticket
def update(
self,
ticket_id: str,
- title: str = None,
- description: str = None,
- incident_type: str = None,
- priority: str = None,
- status: str = None,
- commander_email: str = None,
- reporter_email: str = None,
- conversation_weblink: str = None,
- conference_weblink: str = None,
- document_weblink: str = None,
- storage_weblink: str = None,
- labels: List[str] = None,
- cost: str = None,
+ title: str,
+ description: str,
+ incident_type: str,
+ incident_severity: str,
+ incident_priority: str,
+ status: str,
+ commander_email: str,
+ reporter_email: str,
+ conversation_weblink: str,
+ document_weblink: str,
+ storage_weblink: str,
+ conference_weblink: str,
+ dispatch_weblink: str,
+ cost: float,
+ incident_type_plugin_metadata: dict = None,
):
- """Updates Jira ticket fields."""
- commander_username = get_user_name(commander_email) if commander_email else None
- reporter_username = get_user_name(reporter_email) if reporter_email else None
+ """Updates an incident Jira issue."""
+ client = create_client(self.configuration)
+
+ assignee = get_user_field(client, self.configuration, commander_email)
- client = JIRA(str(JIRA_API_URL), basic_auth=(JIRA_USERNAME, str(JIRA_PASSWORD)))
+ reporter = assignee
+ if reporter_email != commander_email:
+ reporter = get_user_field(client, self.configuration, reporter_email)
+
+ commander_username = get_email_username(commander_email)
issue = client.issue(ticket_id)
- issue_fields = create_issue_fields(
+ issue_fields = create_incident_issue_fields(
title=title,
description=description,
incident_type=incident_type,
- priority=priority,
+ incident_severity=incident_severity,
+ incident_priority=incident_priority,
+ assignee=assignee,
+ reporter=reporter,
commander_username=commander_username,
- reporter_username=reporter_username,
conversation_weblink=conversation_weblink,
- conference_weblink=conference_weblink,
document_weblink=document_weblink,
storage_weblink=storage_weblink,
- labels=labels,
+ conference_weblink=conference_weblink,
+ dispatch_weblink=dispatch_weblink,
cost=cost,
)
- return update(client, issue, issue_fields, status)
+
+ return update(self.configuration, client, issue, issue_fields, status)
+
+ def update_metadata(
+ self,
+ ticket_id: str,
+ metadata: dict,
+ ):
+ """Updates the metadata of a Jira issue."""
+ client = create_client(self.configuration)
+ issue = client.issue(ticket_id)
+
+ # check to make sure project id matches metadata
+ project_id, issue_type_name = process_plugin_metadata(metadata)
+ if project_id and issue.fields.project.key != project_id:
+ log.warning(
+ f"Project key mismatch between Jira issue {issue.fields.project.key} and metadata {project_id} for ticket {ticket_id}"
+ )
+ return
+ other_fields = create_dict_from_plugin_metadata(metadata)
+ issue_fields = {
+ **other_fields,
+ }
+ if issue_type_name:
+ issue_fields["issuetype"] = {"name": issue_type_name}
+
+ issue.update(fields=issue_fields)
+
+ def create_case_ticket(
+ self,
+ case_id: int,
+ title: str,
+ assignee_email: str,
+ # reporter_email: str,
+ case_type_plugin_metadata: dict = None,
+ db_session=None,
+ ):
+ """Creates a case Jira issue."""
+ try:
+ client = create_client(self.configuration)
+
+ assignee = get_user_field(client, self.configuration, assignee_email)
+ # TODO(mvilanova): enable reporter email and replace assignee email
+ # reporter = get_user_field(client, self.configuration, reporter_email)
+ reporter = assignee
+
+ project_id, issue_type_name = process_plugin_metadata(case_type_plugin_metadata)
+
+ if not project_id:
+ project_id = self.configuration.default_project_id
+
+ project = {"id": project_id}
+ if not project_id.isdigit():
+ project = {"key": project_id}
+
+ if not issue_type_name:
+ issue_type_name = self.configuration.default_issue_type_name
+
+ issuetype = {"name": issue_type_name}
+
+ issue_fields = {
+ "project": project,
+ "issuetype": issuetype,
+ "assignee": assignee,
+ "reporter": reporter,
+ "summary": title.replace("\n", ""),
+ }
+
+ ticket = create(self.configuration, client, issue_fields)
+ except Exception as e:
+ log.exception(
+ (
+ f"Failed to create Jira ticket for case_id: {case_id}. "
+ f"Creating case ticket with core plugin instead. Error: {e}"
+ )
+ )
+ # fall back to creating a ticket without the plugin
+ case = case_service.get(db_session=db_session, case_id=case_id)
+ ticket = create_fallback_ticket_case(case_id=case.id, project=case.project)
+
+ return ticket
+
+ def create_task_ticket(
+ self,
+ task_id: int,
+ title: str,
+ assignee_email: str,
+ reporter_email: str,
+ incident_ticket_key: str = None,
+ task_plugin_metadata: dict = None,
+ db_session=None,
+ ):
+ """Creates a task Jira issue."""
+ client = create_client(self.configuration)
+
+ assignee = get_user_field(client, self.configuration, assignee_email)
+ reporter = get_user_field(client, self.configuration, reporter_email)
+
+ project_id, issue_type_name = process_plugin_metadata(task_plugin_metadata)
+ other_fields = create_dict_from_plugin_metadata(task_plugin_metadata)
+
+ if not project_id:
+ project_id = self.configuration.default_project_id
+
+ project = {"id": project_id}
+ if not project_id.isdigit():
+ project = {"key": project_id}
+
+ if not issue_type_name:
+ issue_type_name = self.configuration.default_issue_type_name
+
+ issuetype = {"name": issue_type_name}
+
+ issue_fields = {
+ "project": project,
+ "issuetype": issuetype,
+ "assignee": assignee,
+ "reporter": reporter,
+ "summary": title.replace("\n", ""),
+ **other_fields,
+ }
+
+ issue = client.create_issue(fields=issue_fields)
+
+ if incident_ticket_key:
+ update = {
+ "issuelinks": [
+ {
+ "add": {
+ "type": {
+ "name": "Relates",
+ "inward": "is related to",
+ "outward": "relates to",
+ },
+ "outwardIssue": {"key": incident_ticket_key},
+ }
+ }
+ ]
+ }
+ issue.update(update=update)
+
+ return {
+ "resource_id": issue.key,
+ "weblink": f"{self.configuration.browser_url}/browse/{issue.key}",
+ }
+
+ def update_case_ticket(
+ self,
+ ticket_id: str,
+ title: str,
+ description: str,
+ resolution: str,
+ case_type: str,
+ case_severity: str,
+ case_priority: str,
+ status: str,
+ assignee_email: str,
+ # reporter_email: str,
+ document_weblink: str,
+ storage_weblink: str,
+ dispatch_weblink: str,
+ case_type_plugin_metadata: dict = None,
+ ):
+ """Updates a case Jira issue."""
+ client = create_client(self.configuration)
+
+ assignee = get_user_field(client, self.configuration, assignee_email)
+ # TODO(mvilanova): enable reporter email and replace assignee email
+ # reporter = get_user_field(client, self.configuration, reporter_email)
+ reporter = assignee
+
+ assignee_username = get_email_username(assignee_email)
+
+ issue = client.issue(ticket_id)
+ issue_fields = create_case_issue_fields(
+ title=title,
+ description=description,
+ resolution=resolution,
+ case_type=case_type,
+ case_severity=case_severity,
+ case_priority=case_priority,
+ assignee=assignee,
+ reporter=reporter,
+ assignee_username=assignee_username,
+ document_weblink=document_weblink,
+ storage_weblink=storage_weblink,
+ dispatch_weblink=dispatch_weblink,
+ )
+
+ return update(self.configuration, client, issue, issue_fields, status)
+
+ def delete(self, ticket_id: str):
+ """Deletes a Jira issue."""
+ client = create_client(self.configuration)
+ issue = client.issue(ticket_id)
+ issue.delete()
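The metadata handling above is easy to exercise in isolation. Below, `process_plugin_metadata` is reproduced verbatim from the plugin, with hypothetical sample metadata, to show how it feeds the project/issue-type selection in `create`:

```python
def process_plugin_metadata(plugin_metadata: dict):
    """Extracts project_id and issue_type_name from a plugin metadata key/value list."""
    project_id = None
    issue_type_name = None
    if plugin_metadata:
        for key_value in plugin_metadata["metadata"]:
            if key_value["key"] == "project_id":
                project_id = key_value["value"]
            if key_value["key"] == "issue_type_name":
                issue_type_name = key_value["value"]

    return project_id, issue_type_name


# hypothetical incident-type plugin metadata, as Dispatch would pass it in
metadata = {
    "metadata": [
        {"key": "project_id", "value": "SEC"},
        {"key": "issue_type_name", "value": "Task"},
        {"key": "labels", "value": "incident"},
    ]
}

project_id, issue_type_name = process_plugin_metadata(metadata)
# "SEC" is not numeric, so the issue is created by project key rather than by id
project = {"id": project_id} if project_id.isdigit() else {"key": project_id}
```

Any key other than `project_id` or `issue_type_name` (here, `labels`) is forwarded as an extra issue field by `create_dict_from_plugin_metadata`.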
diff --git a/src/dispatch/plugins/dispatch_jira/templates.py b/src/dispatch/plugins/dispatch_jira/templates.py
new file mode 100644
index 000000000000..1437478766fa
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_jira/templates.py
@@ -0,0 +1,50 @@
+INCIDENT_ISSUE_SUMMARY_TEMPLATE = """
+{color:red}*Confidential Information - For Internal Use Only*{color}
+
+*Incident Details*
+Description: {{description}}
+Type: {{incident_type}}
+Severity: {{incident_severity}}
+Priority: {{incident_priority}}
+Cost: {{cost}}
+
+*Incident Resources*
+[Dispatch Link|{{dispatch_weblink}}]
+[Conversation|{{conversation_weblink}}]
+[Investigation Document|{{document_weblink}}]
+[Storage|{{storage_weblink}}]
+[Conference|{{conference_weblink}}]
+
+Incident Commander: [~{{commander_username}}]
+"""
+
+INCIDENT_ISSUE_SUMMARY_NO_RESOURCES_TEMPLATE = """
+{color:red}*Confidential Information - For Internal Use Only*{color}
+
+*Incident Details*
+Description: {{description}}
+Type: {{incident_type}}
+Severity: {{incident_severity}}
+Priority: {{incident_priority}}
+Cost: {{cost}}
+
+Incident Commander: [~{{commander_username}}]
+"""
+
+CASE_ISSUE_SUMMARY_TEMPLATE = """
+{color:red}*Confidential Information - For Internal Use Only*{color}
+
+*Case Details*
+Description: {{description}}
+Resolution: {{resolution}}
+Type: {{case_type}}
+Severity: {{case_severity}}
+Priority: {{case_priority}}
+
+*Case Resources*
+[Dispatch Link|{{dispatch_weblink}}]
+[Investigation Document|{{document_weblink}}]
+[Storage|{{storage_weblink}}]
+
+Investigator: [~{{assignee_username}}]
+"""
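The plugin renders these templates with Jinja2's `Template`. As a stand-in for readers without Jinja2 installed, a minimal regex-based substitution (with hypothetical values) shows how the `{{...}}` placeholders are filled while Jira wiki markup such as `{color:red}` passes through untouched:

```python
import re

# abbreviated excerpt of the incident template above
SNIPPET = """\
{color:red}*Confidential Information - For Internal Use Only*{color}

*Incident Details*
Description: {{description}}
Priority: {{incident_priority}}
Cost: {{cost}}
"""


def render(template: str, **context) -> str:
    # minimal stand-in for jinja2.Template(template).render(**context)
    return re.sub(r"\{\{(\w+)\}\}", lambda m: str(context.get(m.group(1), "")), template)


text = render(SNIPPET, description="Leaked API key", incident_priority="High", cost="$1,250.00")
```

Single-brace tokens like `{color:red}` do not match the double-brace pattern, so Jira's wiki markup survives rendering.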
diff --git a/src/dispatch/plugins/dispatch_microsoft_teams/__init__.py b/src/dispatch/plugins/dispatch_microsoft_teams/__init__.py
new file mode 100644
index 000000000000..e69de29bb2d1
diff --git a/src/dispatch/plugins/dispatch_microsoft_teams/conference/__init__.py b/src/dispatch/plugins/dispatch_microsoft_teams/conference/__init__.py
new file mode 100644
index 000000000000..ad5cc752c07b
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_microsoft_teams/conference/__init__.py
@@ -0,0 +1 @@
+from ._version import __version__ # noqa
diff --git a/src/dispatch/plugins/dispatch_microsoft_teams/conference/_version.py b/src/dispatch/plugins/dispatch_microsoft_teams/conference/_version.py
new file mode 100644
index 000000000000..3dc1f76bc69e
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_microsoft_teams/conference/_version.py
@@ -0,0 +1 @@
+__version__ = "0.1.0"
diff --git a/src/dispatch/plugins/dispatch_microsoft_teams/conference/client.py b/src/dispatch/plugins/dispatch_microsoft_teams/conference/client.py
new file mode 100644
index 000000000000..d22ce7e25f60
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_microsoft_teams/conference/client.py
@@ -0,0 +1,75 @@
+import msal
+import logging
+import requests
+import json
+from typing import Any
+
+logger = logging.getLogger(__name__)
+
+
+class MSTeamsClient:
+ def __init__(
+ self,
+ client_id: str,
+ authority: str,
+ credential: str,
+ record_automatically: bool,
+ user_id: str,
+ scope="https://graph.microsoft.com/.default",
+ ):
+ """
+ "authority": "https://login.microsoftonline.com/Enter_the_Tenant_Id_Here",
+ "client_id": "Enter_the_Application_Id_Here",
+ "secret": "Enter_the_Client_Secret_Here"
+
+ Enter_the_Application_Id_Here - the Application (client) ID for the application you registered.
+ Enter_the_Tenant_Id_Here - the tenant ID or tenant name (for example, contoso.microsoft.com).
+ Enter_the_Client_Secret_Here - the client secret created in step 1.
+ """
+ self.client_id = client_id
+ self.authority = authority
+ self.client_credential = credential
+ self.scope = scope
+ self.record_automatically = record_automatically
+ self.user_id = user_id
+
+ def _do_authenticate(self) -> Any:
+ app = msal.ConfidentialClientApplication(
+ self.client_id, authority=self.authority, client_credential=self.client_credential
+ )
+ """
+ Contains the scopes requested. For confidential clients, this should use the format
+ similar to {Application ID URI}/.default to indicate that the scopes being requested
+ are the ones statically defined in the app object set in the Azure portal
+ (for Microsoft Graph, {Application ID URI} points to https://graph.microsoft.com).
+ For custom web APIs, {Application ID URI} is defined under the Expose an API section
+ in App registrations in the Azure portal.
+ """
+ result = None
+ logger.info("Completed app initialization")
+ result = app.acquire_token_silent([self.scope], account=None)
+ logger.info(f"Token found in cache: {result is not None}")
+ if not result:
+ logger.info("No suitable token exists in cache. Let's get a new one from AAD.")
+ result = app.acquire_token_for_client(scopes=[self.scope])
+ return result
+
+ def create_meeting(self, incident: str) -> dict:
+ result = self._do_authenticate()
+ if "access_token" in result:
+ # Calling graph using the access token
+ data = {
+ "subject": f"{incident}",
+ "recordAutomatically": str(self.record_automatically).lower(),
+ "joinMeetingIdSettings": {"isPasscodeRequired": "false"},
+ }
+ graph_data = requests.post( # Use token to call downstream service
+ url=f"https://graph.microsoft.com/v1.0/users/{self.user_id}/onlineMeetings",
+ headers={"Authorization": "Bearer " + result["access_token"]},
+ json=data,
+ ).json()
+ logger.info("Graph API call result: ")
+ logger.info(json.dumps(graph_data, indent=2))
+ return graph_data
+
+ return {}
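The request body built inside `create_meeting` is worth pulling out on its own: the recording flag is serialized as a lowercase string before being posted to the Graph `onlineMeetings` endpoint. A sketch of just the payload construction (the subject value is hypothetical):

```python
def build_meeting_payload(subject: str, record_automatically: bool) -> dict:
    # mirrors the body create_meeting posts to /users/{user_id}/onlineMeetings
    return {
        "subject": subject,
        "recordAutomatically": str(record_automatically).lower(),
        "joinMeetingIdSettings": {"isPasscodeRequired": "false"},
    }


payload = build_meeting_payload("SEC-1234: keychain dump investigation", True)
```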
diff --git a/src/dispatch/plugins/dispatch_microsoft_teams/conference/config.py b/src/dispatch/plugins/dispatch_microsoft_teams/conference/config.py
new file mode 100644
index 000000000000..0379db416c98
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_microsoft_teams/conference/config.py
@@ -0,0 +1,28 @@
+from pydantic import Field, SecretStr
+
+from dispatch.config import BaseConfigurationModel
+
+
+class MicrosoftTeamsConfiguration(BaseConfigurationModel):
+ """MS teams configuration details."""
+
+ authority: str = Field(
+ title="MS Teams Authority URL",
+ description="Uses the format https://login.microsoftonline.com/Enter_the_Tenant_Id_Here.",
+ )
+ client_id: str = Field(
+ title="Client Id",
+ description="The Application (client) ID for the registered application.",
+ )
+ secret: SecretStr = Field(
+ title="Azure Client Secret", description="This is the client secret created via Azure AD."
+ )
+ allow_auto_recording: bool = Field(
+ False,
+ title="Allow Auto Recording",
+ description="Enable if you would like to record the meetings by default.",
+ )
+ user_id: str = Field(
+ title="User Id",
+ description="The user ID on whose behalf the application will create meetings.",
+ )
diff --git a/src/dispatch/plugins/dispatch_microsoft_teams/conference/plugin.py b/src/dispatch/plugins/dispatch_microsoft_teams/conference/plugin.py
new file mode 100644
index 000000000000..4f2f5165c2f0
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_microsoft_teams/conference/plugin.py
@@ -0,0 +1,46 @@
+import logging
+
+from dispatch.plugins.bases import ConferencePlugin
+from dispatch.plugins.dispatch_microsoft_teams import conference as teams_plugin
+from .config import MicrosoftTeamsConfiguration
+from .client import MSTeamsClient
+from dispatch.decorators import apply, counter, timer
+
+
+logger = logging.getLogger(__name__)
+
+
+class MicrosoftTeamsConferencePlugin(ConferencePlugin):
+ title = "Microsoft Teams Plugin - Conference Management"
+ slug = "microsoft-teams-conference"
+ description = "Uses MS Teams to manage conference meetings."
+ version = teams_plugin.__version__
+
+ author = "Cino Jose"
+ author_url = "https://github.com/netflix/dispatch.git"
+
+ def __init__(self):
+ self.configuration_schema = MicrosoftTeamsConfiguration
+
+ @apply(counter, exclude=["__init__"])
+ @apply(timer, exclude=["__init__"])
+ def create(
+ self, name: str, description: str = None, title: str = None, participants: list[str] = None
+ ):
+ try:
+ client = MSTeamsClient(
+ client_id=self.configuration.client_id,
+ authority=self.configuration.authority,
+ credential=self.configuration.secret.get_secret_value(),
+ user_id=self.configuration.user_id,
+ record_automatically=self.configuration.allow_auto_recording,
+ )
+ meeting_info = client.create_meeting(name)
+
+ return {
+ "weblink": meeting_info["joinWebUrl"],
+ "id": meeting_info["id"],
+ "challenge": "",
+ }
+ except Exception as e:
+ logger.error(f"There was an error when attempting to create the meeting: {e}")
diff --git a/src/dispatch/plugins/dispatch_openai/__init__.py b/src/dispatch/plugins/dispatch_openai/__init__.py
new file mode 100644
index 000000000000..ad5cc752c07b
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_openai/__init__.py
@@ -0,0 +1 @@
+from ._version import __version__ # noqa
diff --git a/src/dispatch/plugins/dispatch_openai/_version.py b/src/dispatch/plugins/dispatch_openai/_version.py
new file mode 100644
index 000000000000..3dc1f76bc69e
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_openai/_version.py
@@ -0,0 +1 @@
+__version__ = "0.1.0"
diff --git a/src/dispatch/plugins/dispatch_openai/config.py b/src/dispatch/plugins/dispatch_openai/config.py
new file mode 100644
index 000000000000..c2abe129fd3b
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_openai/config.py
@@ -0,0 +1,19 @@
+from pydantic import Field, SecretStr
+
+from dispatch.config import BaseConfigurationModel
+
+
+class OpenAIConfiguration(BaseConfigurationModel):
+ """OpenAI configuration description."""
+
+ api_key: SecretStr = Field(title="API Key", description="Your secret OpenAI API key.")
+ model: str = Field(
+ "gpt-4o",
+ title="Model",
+ description="Available models can be found at https://platform.openai.com/docs/models",
+ )
+ system_message: str = Field(
+ "You are a helpful assistant.",
+ title="System Message",
+ description="The system message to help set the behavior of the assistant.",
+ )
diff --git a/src/dispatch/plugins/dispatch_openai/plugin.py b/src/dispatch/plugins/dispatch_openai/plugin.py
new file mode 100644
index 000000000000..a7103fd12cf3
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_openai/plugin.py
@@ -0,0 +1,86 @@
+"""
+.. module: dispatch.plugins.openai.plugin
+ :platform: Unix
+ :copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
+ :license: Apache, see LICENSE for more details.
+.. moduleauthor:: Marc Vilanova
+"""
+
+import logging
+
+from openai import OpenAI
+from typing import TypeVar, Type
+
+from dispatch.decorators import apply, counter, timer
+from dispatch.plugins import dispatch_openai as openai_plugin
+from dispatch.plugins.bases import ArtificialIntelligencePlugin
+from dispatch.plugins.dispatch_openai.config import (
+ OpenAIConfiguration,
+)
+from pydantic import BaseModel
+
+logger = logging.getLogger(__name__)
+
+
+@apply(counter, exclude=["__init__"])
+@apply(timer, exclude=["__init__"])
+class OpenAIPlugin(ArtificialIntelligencePlugin):
+ title = "OpenAI Plugin - Generative Artificial Intelligence"
+ slug = "openai-artificial-intelligence"
+ description = "Uses OpenAI's platform to allow users to ask questions in natural language."
+ version = openai_plugin.__version__
+
+ author = "Netflix"
+ author_url = "https://github.com/netflix/dispatch.git"
+
+ def __init__(self):
+ self.configuration_schema = OpenAIConfiguration
+
+ def chat_completion(self, prompt: str) -> dict:
+ client = OpenAI(api_key=self.configuration.api_key.get_secret_value())
+
+ try:
+ completion = client.chat.completions.create(
+ model=self.configuration.model,
+ messages=[
+ {
+ "role": "system",
+ "content": self.configuration.system_message,
+ },
+ {
+ "role": "user",
+ "content": prompt,
+ },
+ ],
+ )
+ except Exception as e:
+ logger.error(e)
+ raise
+
+ return completion.choices[0].message
+
+ T = TypeVar("T", bound=BaseModel)
+
+ def chat_parse(self, prompt: str, response_model: Type[T]) -> T:
+ client = OpenAI(api_key=self.configuration.api_key.get_secret_value())
+
+ try:
+ completion = client.chat.completions.parse(
+ model=self.configuration.model,
+ response_format=response_model,
+ messages=[
+ {
+ "role": "system",
+ "content": self.configuration.system_message,
+ },
+ {
+ "role": "user",
+ "content": prompt,
+ },
+ ],
+ )
+ except Exception as e:
+ logger.error(e)
+ raise
+
+ return completion.choices[0].message.parsed
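`chat_parse` takes a Pydantic model as `response_format` and returns the parsed instance. A hypothetical caller-side model (the names here are illustrative and not part of the plugin):

```python
from pydantic import BaseModel


class IncidentSummary(BaseModel):
    # hypothetical structured output for an incident-summary prompt
    title: str
    severity: str
    action_items: list[str]


# usage sketch: plugin.chat_parse(prompt="Summarize this incident...", response_model=IncidentSummary)
example = IncidentSummary(
    title="Keychain dump on host1",
    severity="high",
    action_items=["rotate credentials"],
)
```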
diff --git a/src/dispatch/plugins/dispatch_opsgenie/__init__.py b/src/dispatch/plugins/dispatch_opsgenie/__init__.py
new file mode 100644
index 000000000000..e69de29bb2d1
diff --git a/src/dispatch/plugins/dispatch_opsgenie/plugin.py b/src/dispatch/plugins/dispatch_opsgenie/plugin.py
new file mode 100644
index 000000000000..9d36b43597dc
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_opsgenie/plugin.py
@@ -0,0 +1,62 @@
+"""
+.. module: dispatch.plugins.dispatch_opsgenie.plugin
+ :platform: Unix
+ :copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
+ :license: Apache, see LICENSE for more details.
+"""
+
+import logging
+
+from pydantic import Field, SecretStr
+
+from dispatch.config import BaseConfigurationModel
+from dispatch.decorators import apply, counter, timer
+
+from dispatch.plugins.bases import OncallPlugin
+from .service import get_oncall, page_oncall
+
+
+__version__ = "0.1.0"
+
+log = logging.getLogger(__name__)
+
+
+class OpsgenieConfiguration(BaseConfigurationModel):
+ """Opsgenie configuration description."""
+
+ api_key: SecretStr = Field(
+ title="API Key", description="This is the key used to talk to the Opsgenie API."
+ )
+
+
+@apply(counter, exclude=["__init__"])
+@apply(timer, exclude=["__init__"])
+class OpsGenieOncallPlugin(OncallPlugin):
+ title = "OpsGenie Plugin - Oncall Management"
+ slug = "opsgenie-oncall"
+ author = "stefanm8"
+ author_url = "https://github.com/Netflix/dispatch"
+ description = "Uses Opsgenie to resolve and page oncall teams."
+ version = __version__
+
+ def __init__(self):
+ self.configuration_schema = OpsgenieConfiguration
+
+ def get(self, service_id: str, **kwargs):
+ return get_oncall(self.configuration.api_key, service_id)
+
+ def page(
+ self,
+ service_id: str,
+ incident_name: str,
+ incident_title: str,
+ incident_description: str,
+ **kwargs,
+ ):
+ return page_oncall(
+ self.configuration.api_key,
+ service_id,
+ incident_name,
+ incident_title,
+ incident_description,
+ )
diff --git a/src/dispatch/plugins/dispatch_opsgenie/service.py b/src/dispatch/plugins/dispatch_opsgenie/service.py
new file mode 100644
index 000000000000..e240b980dae0
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_opsgenie/service.py
@@ -0,0 +1,51 @@
+import json
+import requests
+
+from dispatch.exceptions import DispatchPluginException
+
+
+def get_auth(api_key: str) -> dict:
+ return {"Authorization": f"GenieKey {api_key.get_secret_value()}"}
+
+
+def get_oncall(api_key: str, team_id: str) -> str:
+ schedule_api = "https://api.opsgenie.com/v2/schedules"
+ response = requests.get(
+ f"{schedule_api}/{team_id}/on-calls",
+ headers=get_auth(api_key),
+ )
+
+ if response.status_code != 200:
+ raise DispatchPluginException(response.text)
+
+ body = response.json().get("data")
+
+ if not body:
+ raise DispatchPluginException
+
+ return body["onCallParticipants"][0].get("name")
+
+
+def page_oncall(
+ api_key: str,
+ service_id: str,
+ incident_name: str,
+ incident_title: str,
+ incident_description: str,
+) -> str:
+ data = {
+ "message": incident_title,
+ "alias": f"{incident_name}-{incident_title}",
+ "description": incident_description,
+ "responders": [{"id": service_id, "type": "schedule"}],
+ }
+
+ response = requests.post(
+ "https://api.opsgenie.com/v2/alerts",
+ headers={**get_auth(api_key), "content-type": "application/json"},
+ data=json.dumps(data),
+ )
+ if response.status_code != 202:
+ raise DispatchPluginException
+
+ return response.json().get("requestId")
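Review note: a dependency-free sketch of the request shapes `service.py` builds, using a plain string in place of pydantic's `SecretStr` (all values below are illustrative):

```python
def get_auth(api_key: str) -> dict:
    # Mirrors get_auth: Opsgenie expects a "GenieKey <key>" Authorization header.
    return {"Authorization": f"GenieKey {api_key}"}


def build_alert(
    incident_name: str, incident_title: str, incident_description: str, service_id: str
) -> dict:
    # Same payload shape page_oncall posts to /v2/alerts; the alias
    # deduplicates repeated pages for the same incident.
    return {
        "message": incident_title,
        "alias": f"{incident_name}-{incident_title}",
        "description": incident_description,
        "responders": [{"id": service_id, "type": "schedule"}],
    }


headers = get_auth("abc123")
alert = build_alert("dispatch-42", "Keychain dump", "Suspicious access", "sched-1")
```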
diff --git a/src/dispatch/plugins/dispatch_pagerduty/config.py b/src/dispatch/plugins/dispatch_pagerduty/config.py
deleted file mode 100644
index cbd4ff8678bf..000000000000
--- a/src/dispatch/plugins/dispatch_pagerduty/config.py
+++ /dev/null
@@ -1,4 +0,0 @@
-from dispatch.config import config, Secret
-
-PAGERDUTY_API_KEY = config("PAGERDUTY_API_KEY", cast=Secret)
-PAGERDUTY_API_FROM_EMAIL = config("PAGERDUTY_API_FROM_EMAIL")
diff --git a/src/dispatch/plugins/dispatch_pagerduty/plugin.py b/src/dispatch/plugins/dispatch_pagerduty/plugin.py
index aada43bddc02..2cca42bda652 100644
--- a/src/dispatch/plugins/dispatch_pagerduty/plugin.py
+++ b/src/dispatch/plugins/dispatch_pagerduty/plugin.py
@@ -4,32 +4,139 @@
:copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
:license: Apache, see LICENSE for more details.
"""
+
+from pdpyras import APISession
+from pydantic import Field, SecretStr, EmailStr
import logging
+from dispatch.config import BaseConfigurationModel
from dispatch.decorators import apply, counter, timer
from dispatch.plugins import dispatch_pagerduty as pagerduty_oncall_plugin
from dispatch.plugins.bases import OncallPlugin
-from .service import get_oncall, page_oncall
+from .service import (
+ get_oncall_email,
+ page_oncall,
+ oncall_shift_check,
+ get_escalation_policy,
+ get_service,
+ get_next_oncall,
+)
log = logging.getLogger(__name__)
-@apply(timer)
-@apply(counter)
+class PagerdutyConfiguration(BaseConfigurationModel):
+ """The values below are the available configurations for Dispatch's PagerDuty plugin."""
+
+ api_key: SecretStr = Field(
+ title="API Key",
+ description="This is the key used to talk to the PagerDuty API. See: https://support.pagerduty.com/docs/generating-api-keys",
+ )
+ from_email: EmailStr = Field(
+ title="From Email",
+ description="This is the email to put into the 'From' field of any page requests.",
+ )
+ pagerduty_api_url: str = Field(
+ "https://api.pagerduty.com",
+ title="Instance API URL",
+ description="Enter the URL for your API (defaults to US)",
+ )
+
+
+@apply(counter, exclude=["__init__"])
+@apply(timer, exclude=["__init__"])
class PagerDutyOncallPlugin(OncallPlugin):
- title = "PagerDuty - Oncall"
+ title = "PagerDuty Plugin - Oncall Management"
slug = "pagerduty-oncall"
- description = "Uses PagerDuty to resolve oncall identities"
+ author = "Netflix"
+ author_url = "https://github.com/Netflix/dispatch.git"
+ description = "Uses PagerDuty to resolve and page oncall teams."
version = pagerduty_oncall_plugin.__version__
- def get(self, service_id: str = None, service_name: str = None):
- """Gets the oncall person."""
- return get_oncall(service_id=service_id, service_name=service_name)
+ def __init__(self):
+ self.configuration_schema = PagerdutyConfiguration
+
+ def get(self, service_id: str) -> str:
+ """Gets the current oncall person's email."""
+ client = APISession(self.configuration.api_key.get_secret_value())
+ client.url = str(self.configuration.pagerduty_api_url)
+ return get_oncall_email(client=client, service_id=service_id)
def page(
- self, service_id: str, incident_name: str, incident_title: str, incident_description: str
- ):
+ self,
+ service_id: str,
+ incident_name: str,
+ incident_title: str,
+ incident_description: str,
+ **kwargs,
+ ) -> dict:
"""Pages the oncall person."""
- return page_oncall(service_id, incident_name, incident_title, incident_description)
+ client = APISession(self.configuration.api_key.get_secret_value())
+ client.url = str(self.configuration.pagerduty_api_url)
+ return page_oncall(
+ client=client,
+ from_email=self.configuration.from_email,
+ service_id=service_id,
+ incident_name=incident_name,
+ incident_title=incident_title,
+ incident_description=incident_description,
+ event_type=kwargs.get("event_type", "incident"),
+ )
+
+ def did_oncall_just_go_off_shift(self, schedule_id: str, hour: int) -> dict | None:
+ client = APISession(self.configuration.api_key.get_secret_value())
+ client.url = str(self.configuration.pagerduty_api_url)
+ return oncall_shift_check(
+ client=client,
+ schedule_id=schedule_id,
+ hour=hour,
+ )
+
+ def get_schedule_id_from_service_id(self, service_id: str) -> str | None:
+ if not service_id:
+ return None
+
+ try:
+ client = APISession(self.configuration.api_key.get_secret_value())
+ client.url = str(self.configuration.pagerduty_api_url)
+ service = get_service(
+ client=client,
+ service_id=service_id,
+ )
+ if service:
+ escalation_policy_id = service["escalation_policy"]["id"]
+ escalation_policy = get_escalation_policy(
+ client=client,
+ escalation_policy_id=escalation_policy_id,
+ )
+ if escalation_policy:
+ return escalation_policy["escalation_rules"][0]["targets"][0]["id"]
+ except Exception as e:
+ log.error("Error trying to retrieve schedule_id from service_id")
+ log.exception(e)
+
+ def get_service_url(self, service_id: str) -> str | None:
+ if not service_id:
+ return None
+
+ client = APISession(self.configuration.api_key.get_secret_value())
+ client.url = str(self.configuration.pagerduty_api_url)
+ try:
+ service = get_service(client, service_id)
+ return service.get("html_url")
+ except Exception as e:
+ log.error(f"Error retrieving service URL for service_id {service_id}")
+ log.exception(e)
+ return None
+
+ def get_next_oncall(self, service_id: str) -> str | None:
+ schedule_id = self.get_schedule_id_from_service_id(service_id)
+
+ client = APISession(self.configuration.api_key.get_secret_value())
+ client.url = str(self.configuration.pagerduty_api_url)
+ return get_next_oncall(
+ client=client,
+ schedule_id=schedule_id,
+ )
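Review note: `get_schedule_id_from_service_id` walks two nested API responses (service, then escalation policy) to reach a schedule id. A self-contained sketch with made-up fixture ids shows the traversal, plus the filter-name derivation used later in `get_oncall_email`:

```python
# Fixtures shaped like the PagerDuty API responses the plugin walks;
# ids are invented for illustration.
service = {"escalation_policy": {"id": "EP1"}}
escalation_policy = {
    "escalation_rules": [{"targets": [{"id": "SCHED1", "type": "schedule_reference"}]}]
}

escalation_policy_id = service["escalation_policy"]["id"]
target = escalation_policy["escalation_rules"][0]["targets"][0]
schedule_id = target["id"]

# get_oncall_email derives the /oncalls filter name from the target type:
# "schedule_reference" -> "schedule_ids[]".
filter_name = f"{target['type'].split('_')[0]}_ids[]"
```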
diff --git a/src/dispatch/plugins/dispatch_pagerduty/service.py b/src/dispatch/plugins/dispatch_pagerduty/service.py
index 7797548a4e34..cc84c51d329c 100644
--- a/src/dispatch/plugins/dispatch_pagerduty/service.py
+++ b/src/dispatch/plugins/dispatch_pagerduty/service.py
@@ -1,70 +1,220 @@
-import pypd
-from pypd.models.service import Service
+from datetime import datetime, timedelta
+from http import HTTPStatus
+import logging
+
+from pdpyras import APISession, PDHTTPError, PDClientError
+
+
+class PDNotFoundError(Exception):
+ """Raised when a PagerDuty object is not found."""
+
+
+log = logging.getLogger(__name__)
+
+
+def get_user(client: APISession, user_id: str) -> dict:
+ """Gets an oncall user by id."""
+ try:
+ user = client.rget(f"/users/{user_id}")
+ except PDHTTPError as e:
+ if e.response.status_code == HTTPStatus.NOT_FOUND.value:
+ message = f"User with id {user_id} not found."
+ log.error(message)
+ raise PDNotFoundError(message) from e
+ else:
+ raise e
+ except PDClientError as e:
+ log.error(f"Non-transient network or client error: {e}")
+ raise e
+
+ return user
+
+
+def get_service(client: APISession, service_id: str) -> dict:
+ """Gets an oncall service by id."""
+ try:
+ service = client.rget(f"/services/{service_id}")
+ except PDHTTPError as e:
+ if e.response.status_code == HTTPStatus.NOT_FOUND.value:
+ message = f"Service with id {service_id} not found."
+ log.error(message)
+ raise PDNotFoundError(message) from e
+ else:
+ raise e
+ except PDClientError as e:
+ log.error(f"Non-transient network or client error: {e}")
+ raise e
+
+ return service
+
+
+def get_escalation_policy(client: APISession, escalation_policy_id: str) -> dict:
+ """Gets an escalation policy by id."""
+ try:
+ escalation_policy = client.rget(f"/escalation_policies/{escalation_policy_id}")
+ except PDHTTPError as e:
+ if e.response.status_code == HTTPStatus.NOT_FOUND.value:
+ message = f"Escalation policy with id {escalation_policy_id} not found."
+ log.error(message)
+ raise PDNotFoundError(message) from e
+ else:
+ raise e
+ except PDClientError as e:
+ log.error(f"Non-transient network or client error: {e}")
+ raise e
+
+ return escalation_policy
+
+
+def create_incident(client: APISession, headers: dict, data: dict) -> dict:
+ """Creates an incident and pages the oncall person."""
+ try:
+ incident = client.rpost("/incidents", headers=headers, json=data)
+ except PDClientError as e:
+ log.error(
+ f"Error creating incident for service id {data['service']['id']} and escalation_policy id {data['escalation_policy']['id']}: {e}."
+ )
+ raise e
-from dispatch.exceptions import DispatchPluginException
-
-from .config import PAGERDUTY_API_KEY, PAGERDUTY_API_FROM_EMAIL
-
-
-pypd.api_key = PAGERDUTY_API_KEY
+ return incident
-def get_oncall_email(service: Service):
+def get_oncall_email(client: APISession, service_id: str) -> str:
"""Fetches the oncall's email for a given service."""
- escalation_policy_id = service.json["escalation_policy"]["id"]
- escalation_policy = pypd.EscalationPolicy.fetch(escalation_policy_id)
-
- schedule_id = escalation_policy.json["escalation_rules"][0]["targets"][0]["id"]
-
- oncall = pypd.OnCall.find_one(
- escalation_policy_ids=[escalation_policy_id], schedule_ids=[schedule_id]
+ service = get_service(client=client, service_id=service_id)
+ escalation_policy_id = service["escalation_policy"]["id"]
+ escalation_policy = get_escalation_policy(
+ client=client, escalation_policy_id=escalation_policy_id
)
- user_id = oncall.json["user"]["id"]
- user = pypd.User.fetch(user_id)
-
- return user.json["email"]
-
-
-def get_oncall(service_id: str = None, service_name: str = None):
- """Gets the oncall for a given service id or name."""
- if service_id:
- service = pypd.Service.fetch(service_id)
- return get_oncall_email(service)
- elif service_name:
- service = pypd.Service.find(query=service_name)
-
- if len(service) > 1:
- raise DispatchPluginException(
- f"More than one PagerDuty service found with service name: {service_name}"
- )
-
- if not service:
- raise DispatchPluginException(
- f"No on-call service found with service name: {service_name}"
+ # Iterate over all escalation rules and targets to find the oncall user
+ for rule in escalation_policy["escalation_rules"]:
+ for target in rule["targets"]:
+ filter_name = f"{target['type'].split('_')[0]}_ids[]"
+ filter_value = target["id"]
+
+ oncalls = list(
+ client.iter_all(
+ "oncalls", # method
+ {
+ filter_name: [filter_value],
+ "escalation_policy_ids[]": [escalation_policy_id],
+ }, # params
+ )
)
- return get_oncall_email(service[0])
+ if oncalls:
+ user_id = oncalls[0]["user"]["id"]
+ user = get_user(client=client, user_id=user_id)
+ return user["email"]
- DispatchPluginException(f"Cannot fetch oncall. Must specify service_id or service_name.")
+ # If we reach this point, we couldn't find the oncall user
+ raise Exception(
+ f"No users could be found for PagerDuty escalation policy {escalation_policy_id}. Is there a schedule associated with it?"
+ )
def page_oncall(
- service_id: str, incident_name: str, incident_title: str, incident_description: str
-):
+ client: APISession,
+ from_email: str,
+ service_id: str,
+ incident_name: str,
+ incident_title: str,
+ incident_description: str,
+ event_type: str = "incident",
+) -> dict:
"""Pages the oncall for a given service id."""
- service = pypd.Service.fetch(service_id)
- escalation_policy_id = service.json["escalation_policy"]["id"]
+ service = get_service(client, service_id)
+ escalation_policy_id = service["escalation_policy"]["id"]
+ headers = {"from": from_email}
data = {
- "type": "incident",
+ "type": event_type,
"title": f"{incident_name} - {incident_title}",
"service": {"id": service_id, "type": "service_reference"},
- "incident_key": incident_name,
"body": {"type": "incident_body", "details": incident_description},
"escalation_policy": {"id": escalation_policy_id, "type": "escalation_policy_reference"},
}
- incident = pypd.Incident.create(data=data, add_headers={"from": PAGERDUTY_API_FROM_EMAIL})
+ return create_incident(client, headers, data)
- return incident
+
+def get_oncall_at_time(client: APISession, schedule_id: str, utctime: str) -> dict | None:
+ """Retrieves the email of the oncall person at the utc time given."""
+ try:
+ oncalls = list(
+ client.iter_all(
+ "oncalls", # method
+ {
+ "schedule_ids[]": schedule_id,
+ "since": utctime,
+ "until": utctime,
+ }, # params
+ )
+ )
+ if not oncalls:
+ raise Exception(
+ f"No users could be found for PagerDuty schedule id {schedule_id}. Is anyone currently on the schedule?"
+ )
+
+ user_id = oncalls[0]["user"]["id"]
+ user = get_user(client, user_id)
+ user_email = user["email"]
+ shift_end = oncalls[0]["end"]
+ shift_start = oncalls[0]["start"]
+ schedule_name = oncalls[0]["schedule"]["summary"]
+ return {
+ "email": user_email,
+ "shift_end": shift_end,
+ "shift_start": shift_start,
+ "schedule_name": schedule_name,
+ }
+
+ except PDHTTPError as e:
+ if e.response.status_code == HTTPStatus.NOT_FOUND.value:
+ message = f"Schedule with id {schedule_id} not found."
+ log.error(message)
+ raise PDNotFoundError(message) from e
+ else:
+ raise e
+ except PDClientError as e:
+ log.error(f"Non-transient network or client error: {e}")
+ raise e
+
+
+def oncall_shift_check(client: APISession, schedule_id: str, hour: int) -> dict | None:
+ """Determines whether the oncall person just went off shift and returns their email."""
+ now = datetime.utcnow()
+ # in case scheduler is late, replace hour with exact one for shift comparison
+ now = now.replace(hour=hour, minute=0, second=0, microsecond=0)
+
+ # compare oncall person scheduled 18 hours ago vs 2 hours from now
+ previous_shift = (now - timedelta(hours=18)).isoformat(timespec="minutes") + "Z"
+ next_shift = (now + timedelta(hours=2)).isoformat(timespec="minutes") + "Z"
+
+ previous_oncall = get_oncall_at_time(
+ client=client, schedule_id=schedule_id, utctime=previous_shift
+ )
+ next_oncall = get_oncall_at_time(client=client, schedule_id=schedule_id, utctime=next_shift)
+
+ if previous_oncall["email"] != next_oncall["email"]:
+ # find the beginning of previous_oncall's shift by going back in time in 24 hour increments
+ hours = 18
+ prior_oncall = previous_oncall
+ while prior_oncall["email"] == previous_oncall["email"]:
+ hours += 24
+ previous_shift = (now - timedelta(hours=hours)).isoformat(timespec="minutes") + "Z"
+ prior_oncall = get_oncall_at_time(
+ client=client, schedule_id=schedule_id, utctime=previous_shift
+ )
+ previous_oncall["shift_start"] = prior_oncall["shift_start"]
+ return previous_oncall
+
+
+def get_next_oncall(client: APISession, schedule_id: str) -> str | None:
+ """Retrieves the email of the next oncall person. Assumes 12-hour shifts."""
+ now = datetime.utcnow()
+
+ next_shift = (now + timedelta(hours=13)).isoformat(timespec="minutes") + "Z"
+ next_oncall = get_oncall_at_time(client=client, schedule_id=schedule_id, utctime=next_shift)
+
+ return None if not next_oncall else next_oncall["email"]
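Review note: `oncall_shift_check` compares who was on call 18 hours ago against who is on call 2 hours from now, around an exact hour boundary. The timestamp arithmetic can be reproduced standalone (the example wall-clock time is arbitrary; the trailing "Z" marks the strings as UTC for the PagerDuty API):

```python
from datetime import datetime, timedelta

now = datetime(2023, 5, 1, 14, 30)  # arbitrary example time
hour = 14

# Snap to the exact hour in case the scheduler fired late, as the plugin does.
now = now.replace(hour=hour, minute=0, second=0, microsecond=0)

# Window endpoints: 18 hours back vs 2 hours ahead.
previous_shift = (now - timedelta(hours=18)).isoformat(timespec="minutes") + "Z"
next_shift = (now + timedelta(hours=2)).isoformat(timespec="minutes") + "Z"
```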
diff --git a/src/dispatch/plugins/dispatch_slack/__init__.py b/src/dispatch/plugins/dispatch_slack/__init__.py
index 2d0b548e6531..ad5cc752c07b 100644
--- a/src/dispatch/plugins/dispatch_slack/__init__.py
+++ b/src/dispatch/plugins/dispatch_slack/__init__.py
@@ -1 +1 @@
-from dispatch import __version__ # noqa
+from ._version import __version__ # noqa
diff --git a/src/dispatch/plugins/dispatch_slack/_version.py b/src/dispatch/plugins/dispatch_slack/_version.py
new file mode 100644
index 000000000000..3dc1f76bc69e
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_slack/_version.py
@@ -0,0 +1 @@
+__version__ = "0.1.0"
diff --git a/src/dispatch/plugins/dispatch_slack/ack.py b/src/dispatch/plugins/dispatch_slack/ack.py
new file mode 100644
index 000000000000..d6b17c2cc88e
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_slack/ack.py
@@ -0,0 +1,14 @@
+from blockkit import Modal, Section
+from slack_bolt import Ack
+
+
+def ack_submission_event(ack: Ack, title: str, close: str, text: str) -> None:
+ """Handles event acknowledgment."""
+ ack(
+ response_action="update",
+ view=Modal(
+ title=title,
+ close=close,
+ blocks=[Section(text=text)],
+ ).build(),
+ )
diff --git a/src/dispatch/plugins/dispatch_slack/bolt.py b/src/dispatch/plugins/dispatch_slack/bolt.py
new file mode 100644
index 000000000000..12e1dfca313d
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_slack/bolt.py
@@ -0,0 +1,158 @@
+import logging
+import uuid
+from http import HTTPStatus
+from typing import Any
+
+from blockkit import Context, MarkdownText, Modal
+from slack_bolt.app import App
+from slack_bolt import Ack, BoltContext, BoltRequest, Respond
+from slack_bolt.response import BoltResponse
+from slack_sdk.web.client import WebClient
+from slack_sdk.errors import SlackApiError
+from sqlalchemy.orm import Session
+
+from dispatch.auth.models import DispatchUser
+
+from .decorators import message_dispatcher
+from .exceptions import BotNotPresentError, ContextError, DispatchException, RoleError, CommandError
+from .messaging import (
+ build_command_error_message,
+ build_bot_not_present_message,
+ build_context_error_message,
+ build_role_error_message,
+ build_slack_api_error_message,
+ build_unexpected_error_message,
+)
+from .middleware import (
+ configuration_middleware,
+ message_context_middleware,
+ user_middleware,
+ select_context_middleware,
+)
+
+app = App(token="xoxb-valid", request_verification_enabled=False, token_verification_enabled=False)
+logging.basicConfig(level=logging.DEBUG)
+
+
+@app.error
+def app_error_handler(
+ error: Any,
+ client: WebClient,
+ context: BoltContext,
+ body: dict,
+ payload: dict,
+ logger: logging.Logger,
+ respond: Respond,
+) -> BoltResponse:
+ if body:
+ logger.info(f"Request body: {body}")
+
+ message = build_and_log_error(client, error, logger, payload, context)
+
+ # if we have a parent view available
+ if context.get("parentView"):
+ modal = Modal(
+ title="Error",
+ close="Close",
+ blocks=[Context(elements=[MarkdownText(text=message)])],
+ ).build()
+
+ client.views_update(
+ view_id=context["parentView"]["id"],
+ view=modal,
+ )
+ return
+
+ # the user is within a modal flow
+ if body.get("view"):
+ modal = Modal(
+ title="Error",
+ close="Close",
+ blocks=[Context(elements=[MarkdownText(text=message)])],
+ ).build()
+
+ client.views_update(
+ view_id=body["view"]["id"],
+ view=modal,
+ )
+ return
+
+ # the user is in a message flow
+ if body.get("response_url"):
+ # the user is in a thread
+ if thread := body.get("container", {}).get("thread_ts"):
+ client.chat_postEphemeral(
+ channel=context["channel_id"],
+ text=message,
+ thread_ts=thread,
+ user=context["user_id"],
+ )
+ else:
+ respond(text=message, response_type="ephemeral", replace_original=False)
+
+ if not isinstance(error, DispatchException):
+ return BoltResponse(body=body, status=HTTPStatus.INTERNAL_SERVER_ERROR.value)
+
+ # for known exceptions we return OK, prevents error messages from Slackbot
+ return BoltResponse(status=HTTPStatus.OK.value)
+
+
+def build_and_log_error(
+ client: WebClient,
+ error: Any,
+ logger: logging.Logger,
+ payload: dict,
+ context: BoltContext,
+) -> str:
+ if isinstance(error, RoleError):
+ message = build_role_error_message(payload)
+ logger.info(error)
+
+ elif isinstance(error, CommandError):
+ message = build_command_error_message(payload, error)
+ logger.info(error)
+
+ elif isinstance(error, ContextError):
+ message = build_context_error_message(payload, error)
+ logger.info(error)
+
+ elif isinstance(error, BotNotPresentError):
+ message = build_bot_not_present_message(
+ client, payload["command"], context["conversations"]
+ )
+ logger.info(error)
+
+ elif isinstance(error, SlackApiError):
+ message = build_slack_api_error_message(error)
+ logger.exception(error)
+
+ else:
+ guid = str(uuid.uuid4())
+ message = build_unexpected_error_message(guid)
+ logger.exception(error, extra={"slack_interaction_guid": guid})
+
+ return message
+
+
+@app.event(
+ {"type": "message"},
+ middleware=[
+ message_context_middleware,
+ user_middleware,
+ configuration_middleware,
+ select_context_middleware,
+ ],
+)
+def handle_message_events(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+ payload: dict,
+ respond: Respond,
+ request: BoltRequest,
+ user: DispatchUser,
+) -> None:
+ """Container function for all message functions."""
+ message_dispatcher.dispatch(**locals())
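Review note: `build_and_log_error` is an isinstance chain that maps known exception types to targeted messages and tags everything else with a GUID. A stripped-down sketch with stand-in exception classes (the real ones live in `dispatch.plugins.dispatch_slack.exceptions`) captures the shape:

```python
import uuid


# Stand-ins for the plugin's exception hierarchy (illustrative only).
class DispatchException(Exception): ...
class RoleError(DispatchException): ...
class CommandError(DispatchException): ...


def classify(error: Exception) -> str:
    # Known errors get targeted messages; unexpected ones get a GUID so the
    # user-facing message can be correlated with the logged traceback.
    if isinstance(error, RoleError):
        return "role"
    if isinstance(error, CommandError):
        return "command"
    return f"unexpected:{uuid.uuid4()}"


kind = classify(RoleError("not allowed"))
```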
diff --git a/src/dispatch/plugins/dispatch_slack/case/__init__.py b/src/dispatch/plugins/dispatch_slack/case/__init__.py
new file mode 100644
index 000000000000..e69de29bb2d1
diff --git a/src/dispatch/plugins/dispatch_slack/case/enums.py b/src/dispatch/plugins/dispatch_slack/case/enums.py
new file mode 100644
index 000000000000..a55b211df837
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_slack/case/enums.py
@@ -0,0 +1,67 @@
+from dispatch.conversation.enums import ConversationButtonActions
+from dispatch.enums import DispatchEnum
+
+
+class CaseNotificationActions(DispatchEnum):
+ add_user = "case-add-user"
+ do_nothing = "case-do-not-add-user"
+ edit = "case-notification-edit"
+ escalate = "case-notification-escalate"
+ investigate = "case-notification-investigate"
+ invite_user_case = ConversationButtonActions.invite_user_case
+ join_incident = "case-notification-join-incident"
+ migrate = "case-notification-migrate"
+ reopen = "case-notification-reopen"
+ resolve = "case-notification-resolve"
+ triage = "case-notification-triage"
+ user_mfa = "case-notification-user-mfa"
+ update = "case-update"
+
+
+class CasePaginateActions(DispatchEnum):
+ list_signal_next = "case-list-signal-next"
+ list_signal_previous = "case-list-signal-previous"
+
+
+class CaseEditActions(DispatchEnum):
+ submit = "case-notification-edit-submit"
+
+
+class CaseResolveActions(DispatchEnum):
+ submit = "case-notification-resolve-submit"
+
+
+class CaseEscalateActions(DispatchEnum):
+ submit = "case-notification-escalate-submit"
+ project_select = "case-notification-escalate-project-select"
+
+
+class CaseMigrateActions(DispatchEnum):
+ submit = "case-notification-migrate-submit"
+
+
+class CaseReportActions(DispatchEnum):
+ submit = "case-report-submit"
+ project_select = "case-report-project-select"
+ case_type_select = "case-report-case-type-select"
+ assignee_select = "case-report-assignee-select"
+
+
+class CaseShortcutCallbacks(DispatchEnum):
+ report = "case-report"
+
+
+class SignalNotificationActions(DispatchEnum):
+ snooze = "signal-notification-snooze"
+
+
+class SignalSnoozeActions(DispatchEnum):
+ preview = "case-notification-snooze-preview"
+ submit = "case-notification-snooze-submit"
+
+
+class SignalEngagementActions(DispatchEnum):
+ approve = "signal-engagement-approve"
+ deny = "signal-engagement-deny"
+ approve_submit = "signal-engagement-approve-submit"
+ deny_submit = "signal-engagement-deny-submit"
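Review note: these action ids work because `DispatchEnum` has str-Enum semantics, so members compare directly against the raw strings in Slack payloads. A minimal stand-in (assuming str-Enum behavior; the real base class lives in `dispatch.enums`) illustrates:

```python
from enum import Enum


class DispatchEnum(str, Enum):
    """Minimal stand-in: members are strings and render as their values."""

    def __str__(self) -> str:
        return str.__str__(self)


class CaseEditActions(DispatchEnum):
    submit = "case-notification-edit-submit"


action_id = "case-notification-edit-submit"  # as received from a Slack payload
matches = action_id == CaseEditActions.submit
```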
diff --git a/src/dispatch/plugins/dispatch_slack/case/interactive.py b/src/dispatch/plugins/dispatch_slack/case/interactive.py
new file mode 100644
index 000000000000..5e16a44d2f03
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_slack/case/interactive.py
@@ -0,0 +1,3212 @@
+import json
+import logging
+import re
+from datetime import datetime, timedelta, timezone
+from functools import partial
+from uuid import UUID
+
+import pytz
+from blockkit import (
+ Actions,
+ Button,
+ Context,
+ Divider,
+ Input,
+ MarkdownText,
+ Message,
+ Modal,
+ Section,
+ UsersSelect,
+)
+from slack_bolt import Ack, BoltContext, Respond, BoltRequest
+from slack_sdk.errors import SlackApiError
+from slack_sdk.web.client import WebClient
+from sqlalchemy.exc import IntegrityError
+from sqlalchemy.orm import Session
+
+from dispatch.auth.models import DispatchUser, MfaChallengeStatus
+from dispatch.case import flows as case_flows
+from dispatch.case import service as case_service
+from dispatch.case.enums import CaseResolutionReason, CaseStatus, CaseResolutionReasonDescription
+from dispatch.case.models import Case, CaseCreate, CaseRead, CaseUpdate
+from dispatch.case.priority import service as case_priority_service
+from dispatch.case.type import service as case_type_service
+from dispatch.config import DISPATCH_UI_URL
+from dispatch.conversation import flows as conversation_flows
+from dispatch.entity import service as entity_service
+from dispatch.enums import EventType, SubjectNames, UserRoles, Visibility
+from dispatch.event import service as event_service
+from dispatch.exceptions import ExistsError
+from dispatch.incident.type.service import get_by_name as get_type_by_name
+from dispatch.incident.priority.service import get_by_name as get_priority_by_name
+from dispatch.individual.models import IndividualContactRead
+from dispatch.participant import flows as participant_flows
+from dispatch.participant import service as participant_service
+from dispatch.participant.models import ParticipantUpdate
+from dispatch.participant_role import service as participant_role_service
+from dispatch.participant_role.models import ParticipantRoleType
+from dispatch.plugin import service as plugin_service
+from dispatch.plugins.dispatch_slack import service as dispatch_slack_service
+from dispatch.plugins.dispatch_slack.bolt import app
+from dispatch.plugins.dispatch_slack.case.enums import (
+ CaseEditActions,
+ CaseEscalateActions,
+ CaseMigrateActions,
+ CaseNotificationActions,
+ CasePaginateActions,
+ CaseReportActions,
+ CaseResolveActions,
+ CaseShortcutCallbacks,
+ SignalEngagementActions,
+ SignalNotificationActions,
+ SignalSnoozeActions,
+)
+from dispatch.plugins.dispatch_slack.case.messages import (
+ create_case_message,
+ create_case_user_not_in_slack_workspace_message,
+ create_manual_engagement_message,
+ create_signal_engagement_message,
+)
+from dispatch.plugins.dispatch_slack.config import SlackConversationConfiguration
+from dispatch.plugins.dispatch_slack.decorators import message_dispatcher
+from dispatch.plugins.dispatch_slack.enums import SlackAPIErrorCode
+from dispatch.plugins.dispatch_slack.fields import (
+ DefaultActionIds,
+ DefaultBlockIds,
+ case_priority_select,
+ case_resolution_reason_select,
+ case_status_select,
+ case_type_select,
+ case_visibility_select,
+ description_input,
+ entity_select,
+ extension_request_checkbox,
+ incident_priority_select,
+ incident_type_select,
+ project_select,
+ relative_date_picker_input,
+ resolution_input,
+ title_input,
+)
+from dispatch.plugins.dispatch_slack.middleware import (
+ action_context_middleware,
+ add_user_middleware,
+ button_context_middleware,
+ command_context_middleware,
+ configuration_middleware,
+ db_middleware,
+ engagement_button_context_middleware,
+ modal_submit_middleware,
+ shortcut_context_middleware,
+ subject_middleware,
+ user_middleware,
+ is_bot,
+)
+from dispatch.plugins.dispatch_slack.modals.common import send_success_modal
+from dispatch.plugins.dispatch_slack.models import (
+ AddUserMetadata,
+ CaseSubjects,
+ FormData,
+ FormMetadata,
+ SignalSubjects,
+ SubjectMetadata,
+)
+from dispatch.project import service as project_service
+from dispatch.search.utils import create_filter_expression
+from dispatch.service import flows as service_flows
+from dispatch.service.models import Service
+from dispatch.signal import service as signal_service
+from dispatch.signal.enums import SignalEngagementStatus
+from dispatch.signal.models import (
+ Signal,
+ SignalEngagement,
+ SignalFilter,
+ SignalFilterCreate,
+ SignalInstance,
+)
+from dispatch.ticket import flows as ticket_flows
+
+log = logging.getLogger(__name__)
+
+
+def configure(config: SlackConversationConfiguration):
+ """Maps commands/events to their functions."""
+ case_command_context_middleware = partial(
+ command_context_middleware,
+ expected_subject=SubjectNames.CASE,
+ )
+
+ # don't need an incident context
+ app.command(config.slack_command_list_signals, middleware=[db_middleware])(
+ handle_list_signals_command
+ )
+
+ middleware = [
+ subject_middleware,
+ configuration_middleware,
+ case_command_context_middleware,
+ ]
+
+ app.command(config.slack_command_create_case, middleware=[db_middleware])(report_issue)
+
+ app.command(config.slack_command_escalate_case, middleware=middleware)(
+ handle_escalate_case_command
+ )
+
+ # non-sensitive commands
+ middleware = [
+ subject_middleware,
+ configuration_middleware,
+ case_command_context_middleware,
+ user_middleware,
+ ]
+
+ app.command(config.slack_command_update_case, middleware=middleware)(handle_update_case_command)
+
+ app.command(config.slack_command_engage_user, middleware=middleware)(handle_engage_user_command)
+
+
+# Commands
+
+
+def handle_escalate_case_command(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+) -> None:
+ """Handles list participants command."""
+ ack()
+ case = case_service.get(db_session=db_session, case_id=int(context["subject"].id))
+ already_escalated = True if case.escalated_at else False
+ if already_escalated:
+ modal = Modal(
+ title="Already Escalated",
+ blocks=[Section(text="This case has already been escalated to an incident.")],
+ close="Close",
+ ).build()
+
+ return client.views_open(
+ trigger_id=body["trigger_id"],
+ view=modal,
+ )
+
+ default_title = case.title
+ default_description = case.description
+ default_project = {"text": case.project.display_name, "value": case.project.id}
+
+ blocks = [
+ Context(elements=[MarkdownText(text="Accept the defaults or adjust as needed.")]),
+ title_input(initial_value=default_title),
+ description_input(initial_value=default_description),
+ project_select(
+ db_session=db_session,
+ initial_option=default_project,
+ action_id=CaseEscalateActions.project_select,
+ dispatch_action=True,
+ ),
+ incident_type_select(
+ db_session=db_session,
+ initial_option=None,
+ project_id=case.project.id,
+ block_id=DefaultBlockIds.incident_type_select,
+ ),
+ incident_priority_select(
+ db_session=db_session,
+ project_id=case.project.id,
+ initial_option=None,
+ optional=True,
+ block_id=DefaultBlockIds.incident_priority_select,
+ ),
+ ]
+
+ modal = Modal(
+ title="Escalate Case",
+ submit="Escalate",
+ blocks=blocks,
+ close="Close",
+ callback_id=CaseEscalateActions.submit,
+ private_metadata=context["subject"].json(),
+ ).build()
+
+ client.views_open(
+ trigger_id=body["trigger_id"],
+ view=modal,
+ )
+
+
+def handle_update_case_command(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+) -> None:
+    """Handles the update case command."""
+    ack()
+
+ case = case_service.get(db_session=db_session, case_id=int(context["subject"].id))
+
+    try:
+        user_lookup_response = client.users_lookupByEmail(email=case.assignee.individual.email)
+        assignee_initial_user = user_lookup_response["user"]["id"]
+    except SlackApiError as e:
+        # re-raise unexpected Slack errors; otherwise assignee_initial_user would be unbound below
+        if e.response["error"] != SlackAPIErrorCode.USERS_NOT_FOUND:
+            raise
+        log.warning(f"Unable to fetch default assignee for {case.assignee.individual.email}: {e}")
+        assignee_initial_user = None
+
+ statuses = [{"text": str(s), "value": str(s)} for s in CaseStatus if s != CaseStatus.escalated]
+
+ blocks = [
+ title_input(initial_value=case.title),
+ description_input(initial_value=case.description),
+ case_resolution_reason_select(optional=True),
+ resolution_input(initial_value=case.resolution),
+ assignee_select(initial_user=assignee_initial_user),
+ case_status_select(
+ initial_option={"text": case.status, "value": case.status}, statuses=statuses
+ ),
+ Context(
+ elements=[
+ MarkdownText(
+ text=(
+ "Note: Cases cannot be escalated here. Please use the "
+ f"{SlackConversationConfiguration.model_json_schema()['properties']['slack_command_escalate_case']['default']} "
+ "slash command."
+ )
+ )
+ ]
+ ),
+ case_type_select(
+ db_session=db_session,
+ initial_option={"text": case.case_type.name, "value": case.case_type.id},
+ project_id=case.project.id,
+ ),
+ case_priority_select(
+ db_session=db_session,
+ initial_option={"text": case.case_priority.name, "value": case.case_priority.id},
+ project_id=case.project.id,
+ optional=True,
+ ),
+ case_visibility_select(
+ initial_option={"text": case.visibility, "value": case.visibility},
+ ),
+ ]
+
+ modal = Modal(
+ title="Edit Case",
+ blocks=blocks,
+ submit="Update",
+ close="Close",
+ callback_id=CaseEditActions.submit,
+ private_metadata=context["subject"].json(),
+ ).build()
+ client.views_open(trigger_id=body["trigger_id"], view=modal)
+
+
+def ack_engage_oncall_submission_event(ack: Ack) -> None:
+ """Handles engage oncall acknowledgment."""
+ modal = Modal(
+ title="Escalate Case",
+ close="Close",
+ blocks=[Section(text="Escalating case to an incident...")],
+ ).build()
+ ack(response_action="update", view=modal)
+
+
+def handle_list_signals_command(
+ ack: Ack,
+ body: dict,
+ db_session: Session,
+ context: BoltContext,
+ client: WebClient,
+) -> None:
+    """Handles the list signals command."""
+    ack()
+
+ projects = project_service.get_all(db_session=db_session)
+ conversation_name = dispatch_slack_service.get_conversation_name_by_id(
+ client, context.channel_id
+ )
+
+ signals = []
+ for project in projects:
+ signals.extend(
+ signal_service.get_all_by_conversation_target(
+ db_session=db_session, project_id=project.id, conversation_target=conversation_name
+ )
+ )
+
+ if not signals:
+ modal = Modal(
+ title="Signal Definition List",
+ blocks=[
+                Context(
+                    elements=[
+                        MarkdownText(text=f"There are no signals configured for {conversation_name}")
+                    ]
+                ),
+ ],
+ close="Close",
+ ).build()
+
+ return client.views_open(trigger_id=body["trigger_id"], view=modal)
+
+ limit = 25
+ current_page = 0
+ total_pages = len(signals) // limit + (1 if len(signals) % limit > 0 else 0)
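+    # e.g. 26 signals with a limit of 25 -> 26 // 25 + 1 = 2 pages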
+
+ _draw_list_signal_modal(
+ client=client,
+ body=body,
+ db_session=db_session,
+ conversation_name=conversation_name,
+ current_page=current_page,
+ total_pages=total_pages,
+ first_render=True,
+ )
+
+
+def handle_engage_user_command(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+) -> None:
+ """Handles engage user command."""
+ ack()
+
+ default_engagement = "We'd like to verify your identity. Can you please confirm this is you?"
+
+ blocks = [
+ Context(
+ elements=[
+ MarkdownText(
+ text="Accept the defaults or adjust as needed. Person to engage must already be a participant in the case."
+ )
+ ]
+ ),
+ assignee_select(label="Person to engage", placeholder="Select user"),
+ description_input(label="Engagement text", initial_value=default_engagement),
+ ]
+
+ modal = Modal(
+ title="Engage user via MFA",
+ submit="Engage",
+ blocks=blocks,
+ close="Close",
+ callback_id="manual-engage-mfa",
+ private_metadata=context["subject"].json(),
+ ).build()
+
+ client.views_open(
+ trigger_id=body["trigger_id"],
+ view=modal,
+ )
+
+
+@app.view(
+ "manual-engage-mfa",
+ middleware=[
+ action_context_middleware,
+ db_middleware,
+ modal_submit_middleware,
+ ],
+)
+def engage(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+ form_data: dict,
+) -> None:
+ """Handles the engage user action."""
+ ack()
+
+ case = case_service.get(db_session=db_session, case_id=int(context["subject"].id))
+ if not case:
+ log.error("Case not found when trying to engage user")
+ return
+
+ if not form_data.get(DefaultBlockIds.case_assignee_select):
+ log.error("Case assignee not found")
+ return
+
+ user_email = client.users_info(user=form_data[DefaultBlockIds.case_assignee_select]["value"])[
+ "user"
+ ]["profile"]["email"]
+
+ if not user_email:
+ log.error("Email for case assignee not found")
+ return
+
+ try:
+ conversation_flows.add_case_participants(case, [user_email], db_session)
+    except SlackApiError as e:
+ if e.response["error"] == SlackAPIErrorCode.ORG_USER_NOT_IN_TEAM:
+ blocks = create_case_user_not_in_slack_workspace_message(user_email=user_email)
+ client.chat_update(
+ blocks=blocks,
+ channel=case.conversation.channel_id,
+ thread_ts=case.conversation.thread_id if case.has_thread else None,
+ )
+ log.error(
+ f"Failed to add participant to case threat. Participant is not in the Slack workspace: {e}"
+ )
+ return
+
+ participant = participant_service.get_by_case_id_and_email(
+ db_session=db_session, case_id=case.id, email=user_email
+ )
+ if not participant:
+ participant_flows.add_participant(
+ user_email,
+ case,
+ db_session,
+ roles=[ParticipantRoleType.participant],
+ )
+
+ if not form_data.get(DefaultBlockIds.description_input):
+ log.warning("Engagement text not found")
+ return
+
+ engagement = form_data[DefaultBlockIds.description_input]
+
+ try:
+ user = client.users_lookupByEmail(email=user_email)
+ except SlackApiError as e:
+ if e.response.get("error") == SlackAPIErrorCode.USERS_NOT_FOUND:
+ log.warning(
+ f"Failed to find Slack user for email {user_email}. "
+ "User may have been deactivated or never had Slack access."
+ )
+ client.chat_postMessage(
+ text=f"Unable to engage user {user_email} - user not found in Slack workspace.",
+ channel=case.conversation.channel_id,
+ thread_ts=case.conversation.thread_id if case.has_thread else None,
+ )
+ return
+ else:
+ raise
+
+ result = client.chat_postMessage(
+ text="Engaging user...",
+ channel=case.conversation.channel_id,
+ thread_ts=case.conversation.thread_id if case.has_thread else None,
+ )
+ thread_ts = result.data.get("ts")
+
+ blocks = create_manual_engagement_message(
+ case=case,
+ channel_id=case.conversation.channel_id,
+ engagement=engagement,
+ user_email=user_email,
+ user_id=user["user"]["id"],
+ thread_ts=thread_ts,
+ )
+ client.chat_update(
+ blocks=blocks,
+ channel=case.conversation.channel_id,
+ ts=thread_ts,
+ )
+
+
+def _draw_list_signal_modal(
+ client: WebClient,
+ body: dict,
+ db_session: Session,
+ conversation_name: str,
+ current_page: int,
+ total_pages: int,
+ first_render: bool,
+) -> None:
+ """Draw the signal definition list modal.
+
+ Args:
+ client (WebClient): A Slack WebClient object that provides a convenient interface to the Slack API.
+ body (dict): A dictionary that contains the original request payload from Slack.
+ db_session (Session): A SQLAlchemy database session.
+ conversation_name (str): The name of the Slack conversation.
+ current_page (int): The current page number.
+ total_pages (int): The total number of pages.
+ first_render (bool): A boolean indicating whether the modal is being rendered for the first time.
+
+ Returns:
+ None
+
+ Raises:
+ None
+
+ Example:
+        client = WebClient(token="xoxb-...")  # placeholder bot token
+ body = {
+ "trigger_id": "836215173894.4768581721.6f8ab1fcee0478f0e6c0c2b0dc9f0c7a",
+ }
+ db_session = Session()
+ conversation_name = "test_conversation"
+ current_page = 0
+ total_pages = 3
+ first_render = True
+ _draw_list_signal_modal(
+ client, body, db_session, conversation_name, current_page, total_pages, first_render
+ )
+ """
+ modal = Modal(
+ title="Signal Definition List",
+ blocks=_build_signal_list_modal_blocks(
+ db_session=db_session,
+ conversation_name=conversation_name,
+ current_page=current_page,
+ total_pages=total_pages,
+ ),
+ close="Close",
+ private_metadata=json.dumps(
+ {
+ "conversation_name": conversation_name,
+ "current_page": current_page,
+ "total_pages": total_pages,
+ }
+ ),
+ ).build()
+
+    if first_render:
+        client.views_open(trigger_id=body["trigger_id"], view=modal)
+    else:
+        client.views_update(view_id=body["view"]["id"], view=modal)
+
+
+def _build_signal_list_modal_blocks(
+ db_session: Session,
+ conversation_name: str,
+ current_page: int,
+ total_pages: int,
+) -> list:
+ """Builds a list of blocks for a modal view displaying signals.
+
+ This function creates a list of blocks that represent signals that are filtered by conversation_name. The list of signals
+ is paginated and limited to 25 signals per page.
+
+ The function returns the blocks with pagination controls that display the current page and allows navigation to the previous
+ and next pages.
+
+ Args:
+ db_session (Session): The database session.
+ conversation_name (str): The name of the conversation to filter signals by.
+ current_page (int): The current page being displayed.
+ total_pages (int): The total number of pages.
+
+ Returns:
+ list: A list of blocks representing the signals and pagination controls.
+
+ Example:
+ >>> blocks = _build_signal_list_modal_blocks(db_session, "conversation_name", 1, 2)
+ >>> len(blocks)
+ 26
+ """
+
+ blocks = []
+ limit = 25
+ start_index = current_page * limit
+ end_index = start_index + limit - 1
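+    # e.g. current_page=1, limit=25 -> start_index=25, end_index=49 (inclusive)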
+
+ projects = project_service.get_all(db_session=db_session)
+ signals = []
+ for project in projects:
+ signals.extend(
+ signal_service.get_all_by_conversation_target(
+ db_session=db_session, project_id=project.id, conversation_target=conversation_name
+ )
+ )
+
+    for idx, signal in enumerate(signals[start_index : end_index + 1], start_index + 1):  # noqa
+
+ button_metadata = SubjectMetadata(
+ type=SignalSubjects.signal,
+ organization_slug=signal.project.organization.slug,
+ id=signal.id,
+ project_id=signal.project.id,
+ ).json()
+
+ blocks.extend(
+ [
+ Section(
+ text=signal.name,
+ accessory=Button(
+ text="Snooze",
+ value=button_metadata,
+ action_id=SignalNotificationActions.snooze,
+ ),
+ ),
+ Context(
+ elements=[MarkdownText(text=f"{signal.variant}" if signal.variant else "N/A")]
+ ),
+ ]
+ )
+ # Don't add a divider if we are at the last signal
+ if idx != len(signals[start_index : end_index + 1]): # noqa
+ blocks.extend([Divider()])
+
+ pagination_blocks = [
+ Actions(
+ block_id="pagination",
+ elements=[
+ Button(
+ text="Previous",
+ action_id=CasePaginateActions.list_signal_previous,
+ style="danger" if current_page == 0 else "primary",
+ ),
+ Button(
+ text="Next",
+ action_id=CasePaginateActions.list_signal_next,
+ style="danger" if current_page == total_pages - 1 else "primary",
+ ),
+ ],
+ )
+ ]
+
+ return blocks + pagination_blocks if len(signals) > limit else blocks
+
+
+@app.action(
+ CasePaginateActions.list_signal_next, middleware=[action_context_middleware, db_middleware]
+)
+def handle_next_action(ack: Ack, body: dict, client: WebClient, db_session: Session):
+ """Handle the 'next' action in the signal list modal.
+
+ This function is called when the user clicks the 'next' button in the signal list modal. It increments the current page
+ of the modal and updates the view with the new page.
+
+ Args:
+ ack (function): The function to acknowledge the action request.
+ db_session (Session): The database session to query for signal data.
+ body (dict): The request payload from the action.
+ client (WebClient): The Slack API WebClient to interact with the Slack API.
+ """
+ ack()
+
+ metadata = json.loads(body["view"]["private_metadata"])
+
+ current_page = metadata["current_page"]
+ total_pages = metadata["total_pages"]
+ conversation_name = metadata["conversation_name"]
+
+ if current_page < total_pages - 1:
+ current_page += 1
+
+ _draw_list_signal_modal(
+ client=client,
+ body=body,
+ db_session=db_session,
+ conversation_name=conversation_name,
+ current_page=current_page,
+ total_pages=total_pages,
+ first_render=False,
+ )
+
+
+@app.action(
+ CasePaginateActions.list_signal_previous, middleware=[action_context_middleware, db_middleware]
+)
+def handle_previous_action(ack: Ack, body: dict, client: WebClient, db_session: Session):
+ """Handle the 'previous' action in the signal list modal.
+
+ This function is called when the user clicks the 'previous' button in the signal list modal. It decrements the current page
+ of the modal and updates the view with the new page.
+
+ Args:
+ ack (function): The function to acknowledge the action request.
+ db_session (Session): The database session to query for signal data.
+ body (dict): The request payload from the action.
+ client (WebClient): The Slack API WebClient to interact with the Slack API.
+ """
+ ack()
+
+ metadata = json.loads(body["view"]["private_metadata"])
+
+ current_page = metadata["current_page"]
+ total_pages = metadata["total_pages"]
+ conversation_name = metadata["conversation_name"]
+
+ if current_page > 0:
+ current_page -= 1
+
+ _draw_list_signal_modal(
+ client=client,
+ body=body,
+ db_session=db_session,
+ conversation_name=conversation_name,
+ current_page=current_page,
+ total_pages=total_pages,
+ first_render=False,
+ )
+
+
+@app.action(SignalNotificationActions.snooze, middleware=[button_context_middleware, db_middleware])
+def snooze_button_click(
+ ack: Ack, body: dict, client: WebClient, context: BoltContext, db_session: Session
+) -> None:
+ """Handles the snooze button click event."""
+ ack()
+
+ subject = context["subject"]
+
+ case_id = None
+ case_url = None
+ if subject.type == SignalSubjects.signal_instance:
+ instance = signal_service.get_signal_instance(
+ db_session=db_session, signal_instance_id=subject.id
+ )
+ subject.id = instance.signal.id
+ elif subject.type == CaseSubjects.case:
+ case = case_service.get(db_session=db_session, case_id=subject.id)
+ case_id = case.id
+ subject.type = SignalSubjects.signal_instance
+ subject.id = case.signal_instances[0].signal.id
+ case_url = (
+ f"{DISPATCH_UI_URL}/{case.project.organization.slug}/cases/{case.name}/"
+ f"signal/{case.signal_instances[0].id}"
+ )
+
+ signal = signal_service.get(db_session=db_session, signal_id=subject.id)
+ blocks = [
+ Context(elements=[MarkdownText(text=f"{signal.name}")]),
+ Divider(),
+ title_input(placeholder="A name for your snooze filter."),
+ description_input(placeholder="Provide a description for your snooze filter."),
+ relative_date_picker_input(label="Expiration"),
+ extension_request_checkbox(),
+ ]
+
+ # not all signals will have entities and slack doesn't like empty selects
+ entity_select_block = entity_select(
+ db_session=db_session,
+ signal_id=signal.id,
+ optional=True,
+ case_id=case_id,
+ )
+
+ if entity_select_block:
+ blocks.append(entity_select_block)
+
+ if case_url:
+ blocks.append(
+ Actions(
+ elements=[
+ Button(
+ text="â Add entities",
+ action_id="button-link",
+ style="primary",
+ url=case_url,
+ )
+ ]
+ ),
+ )
+ blocks.append(
+ Context(
+ elements=[
+ MarkdownText(
+ text="Signals that contain all selected entities will be snoozed for the configured timeframe."
+ )
+ ]
+ ),
+ )
+
+ modal = Modal(
+ title="Snooze Signal",
+ blocks=blocks,
+ submit="Preview",
+ close="Close",
+ callback_id=SignalSnoozeActions.preview,
+ private_metadata=context["subject"].json(),
+ ).build()
+
+ if view_id := body.get("view", {}).get("id"):
+ client.views_update(view_id=view_id, view=modal)
+ else:
+ client.views_open(trigger_id=body["trigger_id"], view=modal)
+
+
+@app.view(
+ SignalSnoozeActions.preview,
+ middleware=[
+ action_context_middleware,
+ db_middleware,
+ modal_submit_middleware,
+ ],
+)
+def handle_snooze_preview_event(
+ ack: Ack,
+ context: BoltContext,
+ db_session: Session,
+ form_data: dict,
+) -> None:
+ """Handles the snooze preview event."""
+ if form_data.get(DefaultBlockIds.title_input):
+ title = form_data[DefaultBlockIds.title_input]
+
+ name_taken = signal_service.get_signal_filter_by_name(
+ db_session=db_session, project_id=int(context["subject"].project_id), name=title
+ )
+ if name_taken:
+ modal = Modal(
+ title="Name Taken",
+ close="Close",
+ blocks=[
+ Context(
+ elements=[
+ MarkdownText(
+ text=f"A signal filter with the name '{title}' already exists."
+ )
+ ]
+ )
+ ],
+ ).build()
+ return ack(response_action="update", view=modal)
+
+ if form_data.get(DefaultBlockIds.entity_select):
+ entity_ids = [int(entity["value"]) for entity in form_data[DefaultBlockIds.entity_select]]
+
+ preview_signal_instances = entity_service.get_signal_instances_with_entities(
+ db_session=db_session,
+ signal_id=int(context["subject"].id),
+ entity_ids=entity_ids,
+ days_back=90,
+ )
+
+ text = (
+ "Examples matching your filter:"
+ if preview_signal_instances
+ else "No signals matching your filter."
+ )
+ else:
+ preview_signal_instances = None
+ text = "No entities selected. All instances of this signal will be snoozed."
+
+ blocks = [Context(elements=[MarkdownText(text=text)])]
+
+ if preview_signal_instances:
+ # Only show 5 examples
+ for signal_instance in preview_signal_instances[:5]:
+ blocks.extend(
+ [
+ Section(text=signal_instance.signal.name),
+ Context(
+ elements=[
+ MarkdownText(
+ text=f" Case: {signal_instance.case.name if signal_instance.case else 'N/A'}"
+ )
+ ]
+ ),
+ Context(
+ elements=[
+ MarkdownText(
+ text=f" Created: {signal_instance.case.created_at if signal_instance.case else 'N/A'}"
+ )
+ ]
+ ),
+ ]
+ )
+
+ private_metadata = FormMetadata(
+ form_data=form_data,
+ **context["subject"].dict(),
+ ).json()
+ modal = Modal(
+ title="Add Snooze",
+ submit="Create",
+ close="Close",
+ blocks=blocks,
+ callback_id=SignalSnoozeActions.submit,
+ private_metadata=private_metadata,
+ ).build()
+ ack(response_action="update", view=modal)
+
+
+@app.view(
+ SignalSnoozeActions.submit,
+ middleware=[
+ action_context_middleware,
+ db_middleware,
+ user_middleware,
+ ],
+)
+def handle_snooze_submission_event(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+ user: DispatchUser,
+) -> None:
+ """Handle the submission event of the snooze modal.
+
+ This function is executed when a user submits the snooze modal. It first
+ sends an MFA push notification to the user to confirm the action. If the
+ user accepts the MFA prompt, the function retrieves the relevant information
+ from the form data and creates a new signal filter. The new filter is then
+ added to the existing filters for the signal. Finally, the function updates
+ the modal view to show the result of the operation.
+
+ Args:
+ ack (Ack): The acknowledgement function.
+ body (dict): The request body.
+ client (WebClient): The Slack API client.
+ context (BoltContext): The context data.
+ db_session (Session): The database session.
+ user (DispatchUser): The Dispatch user who submitted the form.
+ """
+ mfa_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=int(context["subject"].project_id), plugin_type="auth-mfa"
+ )
+    mfa_enabled = bool(mfa_plugin)
+
+ form_data: FormData = context["subject"].form_data
+ extension_request_data = form_data.get(DefaultBlockIds.extension_request_checkbox)
+ extension_request_value = extension_request_data[0].value if extension_request_data else None
+    extension_requested = extension_request_value == "Yes"
+
+ def _create_snooze_filter(
+ db_session: Session,
+ subject: SubjectMetadata,
+ user: DispatchUser,
+ ) -> SignalFilter:
+ form_data: FormData = subject.form_data
+ # Get the existing filters for the signal
+ signal = signal_service.get(db_session=db_session, signal_id=subject.id)
+ # Create the new filter from the form data
+ if form_data.get(DefaultBlockIds.entity_select):
+ entities = [
+ {"id": int(entity.value)} for entity in form_data[DefaultBlockIds.entity_select]
+ ]
+ else:
+ entities = []
+
+ description = form_data[DefaultBlockIds.description_input]
+ name = form_data[DefaultBlockIds.title_input]
+ delta: str = form_data[DefaultBlockIds.relative_date_picker_input].value
+ # Check if the 'delta' string contains days
+ # Example: '1 day, 0:00:00' contains days, while '0:01:00' does not
+ if ", " in delta:
+ # Split the 'delta' string into days and time parts
+ # Example: '1 day, 0:00:00' -> days: '1 day' and time_str: '0:00:00'
+ days, time_str = delta.split(", ")
+
+ # Extract the integer value of days from the days string
+ # Example: '1 day' -> 1
+ days = int(days.split(" ")[0])
+ else:
+ # If the 'delta' string does not contain days, set days to 0
+ days = 0
+
+ # Directly assign the 'delta' string to the time_str variable
+ time_str = delta
+
+ # Split the 'time_str' variable into hours, minutes, and seconds
+ # Convert each part to an integer
+ # Example: '0:01:00' -> hours: 0, minutes: 1, seconds: 0
+ hours, minutes, seconds = [int(x) for x in time_str.split(":")]
+
+ # Create a timedelta object using the extracted days, hours, minutes, and seconds
+ delta = timedelta(
+ days=days,
+ hours=hours,
+ minutes=minutes,
+ seconds=seconds,
+ )
+
+ # Calculate the new date by adding the timedelta object to the current date and time
+ date = datetime.now(tz=timezone.utc) + delta
+
+ project = project_service.get(db_session=db_session, project_id=signal.project_id)
+
+ # None expression is for cases when no entities are selected, in which case
+ # the filter will apply to all instances of the signal
+ if entities:
+ filters = {
+ "entity": entities,
+ }
+ expression = create_filter_expression(filters, "Entity")
+ else:
+ expression = []
+
+ # Create a new filter with the selected entities and entity types
+ filter_in = SignalFilterCreate(
+ name=name,
+ description=description,
+ expiration=date,
+ expression=expression,
+ project=project,
+ )
+ try:
+ new_filter = signal_service.create_signal_filter(
+ db_session=db_session, creator=user, signal_filter_in=filter_in
+ )
+ except IntegrityError:
+ raise ExistsError("A signal filter with this name already exists.") from None
+
+ signal.filters.append(new_filter)
+ db_session.commit()
+ return new_filter
+
+ channel_id = context["subject"].channel_id
+ thread_id = context["subject"].thread_id
+
+    # If MFA is not enabled, create the snooze filter immediately
+ if not mfa_enabled:
+ new_filter = _create_snooze_filter(
+ db_session=db_session,
+ user=user,
+ subject=context["subject"],
+ )
+ signal = signal_service.get(db_session=db_session, signal_id=int(context["subject"].id))
+ post_snooze_message(
+ db_session=db_session,
+ client=client,
+ channel=channel_id,
+ user=user,
+ signal=signal,
+ new_filter=new_filter,
+ thread_ts=thread_id,
+ extension_requested=extension_requested,
+ oncall_service=new_filter.project.snooze_extension_oncall_service,
+ )
+ send_success_modal(
+ client=client,
+ view_id=body["view"]["id"],
+ title="Add Snooze",
+ message="Snooze Filter added successfully.",
+ )
+ else:
+ challenge, challenge_url = mfa_plugin.instance.create_mfa_challenge(
+ action="signal-snooze",
+ current_user=user,
+ db_session=db_session,
+ project_id=int(context["subject"].project_id),
+ )
+ ack_mfa_required_submission_event(
+ ack=ack, mfa_enabled=mfa_enabled, challenge_url=challenge_url
+ )
+
+ # wait for the mfa challenge
+ response = mfa_plugin.instance.wait_for_challenge(
+ challenge_id=challenge.challenge_id,
+ db_session=db_session,
+ )
+
+ if response == MfaChallengeStatus.APPROVED:
+ new_filter = _create_snooze_filter(
+ db_session=db_session,
+ user=user,
+ subject=context["subject"],
+ )
+ signal = signal_service.get(db_session=db_session, signal_id=int(context["subject"].id))
+ post_snooze_message(
+ db_session=db_session,
+ client=client,
+ channel=channel_id,
+ user=user,
+ signal=signal,
+ new_filter=new_filter,
+ thread_ts=thread_id,
+ extension_requested=extension_requested,
+ oncall_service=new_filter.project.snooze_extension_oncall_service,
+ )
+ send_success_modal(
+ client=client,
+ view_id=body["view"]["id"],
+ title="Add Snooze",
+ message="Snooze Filter added successfully.",
+ )
+ user.last_mfa_time = datetime.now()
+ db_session.commit()
+ else:
+ if response == MfaChallengeStatus.EXPIRED:
+ text = "Adding Snooze failed, the MFA request timed out."
+ elif response == MfaChallengeStatus.DENIED:
+ text = "Adding Snooze failed, challenge did not complete succsfully."
+ else:
+ text = "Adding Snooze failed, you must accept the MFA prompt."
+
+ modal = Modal(
+ title="Add Snooze",
+ close="Close",
+ blocks=[Section(text=text)],
+ ).build()
+
+ client.views_update(
+ view_id=body["view"]["id"],
+ view=modal,
+ )
+
+
+def get_user_id_from_oncall_service(
+ client: WebClient,
+ db_session: Session,
+ oncall_service: Service | None,
+) -> str | None:
+ if not oncall_service:
+ return None
+
+ oncall_email = service_flows.resolve_oncall(service=oncall_service, db_session=db_session)
+ if oncall_email:
+ # Get the Slack user ID for the current oncall
+ try:
+ return client.users_lookupByEmail(email=oncall_email)["user"]["id"]
+ except SlackApiError:
+ log.error(f"Failed to find Slack user for email: {oncall_email}")
+ return None
+
+
+def post_snooze_message(
+ client: WebClient,
+ channel: str,
+ user: DispatchUser,
+ signal: Signal,
+ db_session: Session,
+ new_filter: SignalFilter,
+ thread_ts: str | None = None,
+ extension_requested: bool = False,
+ oncall_service: Service | None = None,
+):
+ def extract_entity_ids(expression: list[dict]) -> list[int]:
+ entity_ids = []
+ for item in expression:
+ if isinstance(item, dict) and "or" in item:
+ for condition in item["or"]:
+ if condition.get("model") == "Entity" and condition.get("field") == "id":
+ entity_ids.append(int(condition.get("value")))
+ return entity_ids
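+
+    # Example of the expression shape extract_entity_ids handles (assumed to match
+    # create_filter_expression output; exact keys may differ):
+    #   [{"or": [{"model": "Entity", "field": "id", "value": "123"}]}]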
+
+ entity_ids = extract_entity_ids(new_filter.expression)
+
+ if entity_ids:
+ entities = []
+ for entity_id in entity_ids:
+ entity = entity_service.get(db_session=db_session, entity_id=entity_id)
+ if entity:
+ entities.append(entity)
+ entities_text = ", ".join([f"{entity.value} ({entity.id})" for entity in entities])
+ else:
+ entities_text = "All"
+
+ message = (
+ f":zzz: *New Signal Snooze Added*\n"
+ f"âĸ Created by: {user.email}\n"
+ f"âĸ Signal: {signal.name}\n"
+ f"âĸ Snooze Name: {new_filter.name}\n"
+ f"âĸ Description: {new_filter.description}\n"
+ f"âĸ Expiration: {new_filter.expiration}\n"
+ f"âĸ Entities: {entities_text}"
+ )
+ if extension_requested:
+ message += "\nâĸ *Extension Requested*"
+ if user_id := get_user_id_from_oncall_service(
+ client=client, db_session=db_session, oncall_service=oncall_service
+ ):
+ message += f" - notifying oncall: <@{user_id}>"
+
+ client.chat_postMessage(channel=channel, text=message, thread_ts=thread_ts)
+
+
+def assignee_select(
+ placeholder: str = "Select Assignee",
+    initial_user: str | None = None,
+    action_id: str | None = None,
+ block_id: str = DefaultBlockIds.case_assignee_select,
+ label: str = "Assignee",
+ **kwargs,
+):
+ """Builds a assignee select block."""
+ return Input(
+ element=UsersSelect(
+ placeholder=placeholder, action_id=action_id, initial_user=initial_user
+ ),
+ block_id=block_id,
+ label=label,
+ **kwargs,
+ )
+
+
+@message_dispatcher.add(
+ subject=CaseSubjects.case, exclude={"subtype": ["channel_join", "channel_leave"]}
+) # we ignore channel join and leave messages
+def handle_new_participant_message(
+ ack: Ack, user: DispatchUser, context: BoltContext, db_session: Session, client: WebClient
+) -> None:
+ """Looks for new participants that have starting chatting for the first time."""
+ ack()
+ participant = case_flows.case_add_or_reactivate_participant_flow(
+ case_id=int(context["subject"].id),
+ user_email=user.email,
+ db_session=db_session,
+ add_to_conversation=False,
+ )
+ participant.user_conversation_id = context["user_id"]
+
+ for participant_role in participant.active_roles:
+ participant_role.activity += 1
+
+ # re-assign role once threshold is reached
+ if participant_role.role == ParticipantRoleType.observer:
+ if participant_role.activity >= 3: # three messages sent to the case channel
+ # we change the participant's role to the participant one
+ participant_role_service.renounce_role(
+ db_session=db_session, participant_role=participant_role
+ )
+ participant_role_service.add_role(
+ db_session=db_session,
+ participant_id=participant.id,
+ participant_role=ParticipantRoleType.participant,
+ )
+
+ # we log the event
+ event_service.log_case_event(
+ db_session=db_session,
+ source="Slack Plugin - Conversation Management",
+ description=(
+ f"{participant.individual.name}'s role changed from {participant_role.role} to "
+ f"{ParticipantRoleType.participant} due to activity in the case channel"
+ ),
+ case_id=int(context["subject"].id),
+ type=EventType.participant_updated,
+ )
+
+ db_session.commit()
+
+
+@message_dispatcher.add(
+ subject=CaseSubjects.case, exclude={"subtype": ["channel_join", "channel_leave"]}
+) # we ignore channel join and leave messages
+def handle_case_participant_role_activity(
+ ack: Ack, db_session: Session, context: BoltContext, user: DispatchUser
+) -> None:
+ ack()
+
+ participant = participant_service.get_by_case_id_and_email(
+ db_session=db_session, case_id=int(context["subject"].id), email=user.email
+ )
+
+ if participant:
+ for participant_role in participant.active_roles:
+ participant_role.activity += 1
+ else:
+        # we have a new active participant, let's add them
+ participant = case_flows.case_add_or_reactivate_participant_flow(
+ case_id=int(context["subject"].id), user_email=user.email, db_session=db_session
+ )
+ participant.user_conversation_id = context["user_id"]
+
+ # if a participant is active mark the case as being in the triaged state
+ case = case_service.get(db_session=db_session, case_id=int(context["subject"].id))
+
+ if case.status == CaseStatus.new:
+ case_flows.case_status_transition_flow_dispatcher(
+ case=case,
+ current_status=CaseStatus.triage,
+ db_session=db_session,
+ previous_status=case.status,
+ organization_slug=context["subject"].organization_slug,
+ )
+ case.status = CaseStatus.triage
+
+ # we update the ticket
+ ticket_flows.update_case_ticket(case=case, db_session=db_session)
+
+ case_flows.update_conversation(case, db_session)
+ db_session.commit()
+
+
+@message_dispatcher.add(
+ subject=CaseSubjects.case, exclude={"subtype": ["channel_join", "group_join"]}
+) # we ignore user channel and group join messages
+def handle_case_after_hours_message(
+ ack: Ack,
+ context: BoltContext,
+ client: WebClient,
+ db_session: Session,
+ respond: Respond,
+ payload: dict,
+ user: DispatchUser,
+) -> None:
+ """Notifies the user that this case is currently in after hours mode."""
+ ack()
+
+ # Check if the message is in a thread
+ # We should not attempt to raise this message when a message is sent to the main channel.
+ if "thread_ts" not in payload:
+ return
+
+ case = case_service.get(db_session=db_session, case_id=int(context["subject"].id))
+ owner_email = case.assignee.individual.email
+ participant = participant_service.get_by_case_id_and_email(
+ db_session=db_session, case_id=int(context["subject"].id), email=user.email
+ )
+
+ # get case priority settings and if delayed message warning is disabled, log and return
+ case_priority_data = case_priority_service.get(
+ db_session=db_session, case_priority_id=case.case_priority_id
+ )
+
+ if case_priority_data.disable_delayed_message_warning:
+ log.debug("delayed messaging is disabled, not sending a warning")
+ return
+
+ # handle no participant found
+ if not participant:
+ log.warning(
+ f"Participant not found for {user.email} in case {case.id}. Skipping after hours notification."
+ )
+ return
+
+ # get their timezone from slack
+ owner_tz = (dispatch_slack_service.get_user_info_by_email(client, email=owner_email))["tz"]
+ message = f"Responses may be delayed. The current case priority is *{case.case_priority.name}* and your message was sent outside of the Assignee's working hours (Weekdays, 9am-5pm, {owner_tz} timezone)."
+
+ now = datetime.now(pytz.timezone(owner_tz))
+ is_business_hours = now.weekday() not in [5, 6] and 9 <= now.hour < 17
+ if not is_business_hours:
+ if not participant.after_hours_notification:
+ participant.after_hours_notification = True
+ db_session.add(participant)
+ db_session.commit()
+
+ client.chat_postEphemeral(
+ text=message,
+ channel=payload["channel"],
+ thread_ts=payload["thread_ts"],
+ user=payload["user"],
+ )
+
+
+@message_dispatcher.add(subject=CaseSubjects.case)
+def handle_thread_creation(
+ ack: Ack,
+ client: WebClient,
+ payload: dict,
+ db_session: Session,
+ context: BoltContext,
+ request: BoltRequest,
+) -> None:
+ """Sends the user an ephemeral message if they use threads in a dedicated case channel."""
+ ack()
+
+ if not context["config"].ban_threads:
+ return
+
+ case = case_service.get(db_session=db_session, case_id=context["subject"].id)
+ if not case.dedicated_channel:
+ return
+
+ if payload.get("thread_ts") and not is_bot(request):
+ message = "Please refrain from using threads in case channels. Threads make it harder for case participants to maintain context."
+ client.chat_postEphemeral(
+ text=message,
+ channel=payload["channel"],
+ thread_ts=payload["thread_ts"],
+ user=payload["user"],
+ )
+
+
+@app.action("button-link")
+def ack_button_link(ack: Ack):
+ """Handles noop button link action."""
+ ack()
+
+
+@app.action(CaseNotificationActions.reopen, middleware=[button_context_middleware, db_middleware])
+def reopen_button_click(
+ ack: Ack,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+):
+ ack()
+ case = case_service.get(db_session=db_session, case_id=int(context["subject"].id))
+ case.status = CaseStatus.triage
+
+ # we update the ticket
+ ticket_flows.update_case_ticket(case=case, db_session=db_session)
+
+ db_session.commit()
+
+ # update case message
+ blocks = create_case_message(case=case, channel_id=context["subject"].channel_id)
+ client.chat_update(
+ blocks=blocks, ts=case.conversation.thread_id, channel=case.conversation.channel_id
+ )
+
+
+@app.action(
+ CaseNotificationActions.escalate,
+ middleware=[button_context_middleware, db_middleware, user_middleware],
+)
+def escalate_button_click(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+):
+ ack()
+ case = case_service.get(db_session=db_session, case_id=int(context["subject"].id))
+ blocks = [
+ Context(elements=[MarkdownText(text="Accept the defaults or adjust as needed.")]),
+ title_input(initial_value=case.title),
+ description_input(initial_value=case.description),
+ project_select(
+ db_session=db_session,
+ initial_option={"text": case.project.display_name, "value": case.project.id},
+ action_id=CaseEscalateActions.project_select,
+ dispatch_action=True,
+ ),
+ incident_type_select(
+ db_session=db_session,
+ initial_option=(
+ {
+ "text": case.case_type.incident_type.name,
+ "value": case.case_type.incident_type.id,
+ }
+ if case.case_type.incident_type
+ else None
+ ),
+ project_id=case.project.id,
+ ),
+ incident_priority_select(db_session=db_session, project_id=case.project.id, optional=True),
+ ]
+
+ modal = Modal(
+ title="Escalate Case",
+ blocks=blocks,
+ submit="Escalate",
+ close="Close",
+ callback_id=CaseEscalateActions.submit,
+ private_metadata=context["subject"].json(),
+ ).build()
+ client.views_open(trigger_id=body["trigger_id"], view=modal)
+
+
+@app.action(
+ CaseEscalateActions.project_select, middleware=[action_context_middleware, db_middleware]
+)
+def handle_project_select_action(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+):
+ ack()
+ values = body["view"]["state"]["values"]
+
+ project_id = int(
+ values[DefaultBlockIds.project_select][CaseEscalateActions.project_select][
+ "selected_option"
+ ]["value"]
+ )
+
+ project = project_service.get(db_session=db_session, project_id=project_id)
+
+ blocks = [
+ Context(elements=[MarkdownText(text="Accept the defaults or adjust as needed.")]),
+ title_input(),
+ description_input(),
+ project_select(
+ db_session=db_session,
+ initial_option={"text": project.display_name, "value": project.id},
+ action_id=CaseEscalateActions.project_select,
+ dispatch_action=True,
+ ),
+ incident_type_select(
+ db_session=db_session, initial_option=None, project_id=project.id, block_id=None
+ ),
+ incident_priority_select(
+ db_session=db_session,
+ project_id=project.id,
+ initial_option=None,
+ optional=True,
+ block_id=None, # ensures state is reset
+ ),
+ ]
+
+ modal = Modal(
+ title="Escalate Case",
+ blocks=blocks,
+ submit="Submit",
+ close="Close",
+ callback_id=CaseEscalateActions.submit,
+ private_metadata=context["subject"].json(),
+ ).build()
+
+ client.views_update(
+ view_id=body["view"]["id"],
+ trigger_id=body["trigger_id"],
+ view=modal,
+ )
+
+
+def ack_handle_escalation_submission_event(ack: Ack, case: Case) -> None:
+ """Handles the escalation submission event."""
+
+ msg = (
+ "The case has been escalated to an incident. This channel will be reused for the incident."
+ if case.dedicated_channel
+ else "The case has been escalated to an incident. All further triage work will take place in the incident channel."
+ )
+ modal = Modal(
+ title="Escalating Case",
+ close="Close",
+ blocks=[Section(text=msg)],
+ ).build()
+ ack(response_action="update", view=modal)
+
+
+@app.view(
+ CaseEscalateActions.submit,
+ middleware=[
+ action_context_middleware,
+ user_middleware,
+ db_middleware,
+ modal_submit_middleware,
+ ],
+)
+def handle_escalation_submission_event(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+ form_data: dict,
+ user: DispatchUser,
+):
+ """Handles the escalation submission event."""
+
+ case = case_service.get(db_session=db_session, case_id=int(context["subject"].id))
+ ack_handle_escalation_submission_event(ack=ack, case=case)
+
+ case.status = CaseStatus.escalated
+ db_session.commit()
+
+ modal = Modal(
+ title="Case Escalated",
+ close="Close",
+ blocks=[Section(text="Running case escalation flows...")],
+ ).build()
+
+ result = client.views_update(
+ view_id=body["view"]["id"],
+ trigger_id=body["trigger_id"],
+ view=modal,
+ )
+
+ incident_type = None
+ if form_data.get(DefaultBlockIds.incident_type_select):
+ incident_type = get_type_by_name(
+ db_session=db_session,
+ project_id=case.project.id,
+ name=form_data[DefaultBlockIds.incident_type_select]["name"],
+ )
+
+ incident_priority = None
+ if form_data.get(DefaultBlockIds.incident_priority_select):
+ incident_priority = get_priority_by_name(
+ db_session=db_session,
+ project_id=case.project.id,
+ name=form_data[DefaultBlockIds.incident_priority_select]["name"],
+ )
+ incident_description = form_data.get(DefaultBlockIds.description_input, case.description)
+ title = form_data.get(DefaultBlockIds.title_input, case.title)
+ case_flows.case_escalated_status_flow(
+ case=case,
+ organization_slug=context["subject"].organization_slug,
+ db_session=db_session,
+ title=title,
+ incident_priority=incident_priority,
+ incident_type=incident_type,
+ incident_description=incident_description,
+ )
+ incident = case.incidents[0]
+
+ blocks = create_case_message(case=case, channel_id=context["subject"].channel_id)
+ if case.has_thread:
+ client.chat_update(
+ blocks=blocks,
+ ts=case.conversation.thread_id,
+ channel=case.conversation.channel_id,
+ )
+ client.chat_postMessage(
+ text=f"This case has been escalated to incident {incident.name}. All further triage work will take place in the incident channel.",
+ channel=case.conversation.channel_id,
+ thread_ts=case.conversation.thread_id if case.has_thread else None,
+ )
+
+ # Add all case participants to the incident
+ conversation_flows.add_incident_participants_to_conversation(
+ incident=incident,
+ participant_emails=case.participant_emails,
+ db_session=db_session,
+ )
+
+ blocks = [
+ Section(
+ text="This is a confirmation that you have reported an incident with the following information. You will be invited to an incident Slack conversation shortly."
+ ),
+ Section(text=f"*Title*\n {incident.title}"),
+ Section(text=f"*Description*\n {incident.description}"),
+ Section(
+ fields=[
+ MarkdownText(
+ text=f"*Commander*\n<{incident.commander.individual.weblink}|{incident.commander.individual.name}>"
+ ),
+ MarkdownText(text=f"*Type*\n {incident.incident_type.name}"),
+ MarkdownText(text=f"*Severity*\n {incident.incident_severity.name}"),
+ MarkdownText(text=f"*Priority*\n {incident.incident_priority.name}"),
+ ]
+ ),
+ ]
+
+ send_success_modal(
+ client=client,
+ view_id=body["view"]["id"],
+ trigger_id=result["trigger_id"],
+ title="Case Escalated",
+ message="Case escalated successfully.",
+ )
+
+
+@app.action(
+ CaseNotificationActions.migrate,
+ middleware=[button_context_middleware, db_middleware, user_middleware],
+)
+def create_channel_button_click(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+):
+ ack()
+
+ blocks = [
+ Section(text="Migrate the thread conversation to a dedicated channel?"),
+ Context(elements=[MarkdownText(text="This action will remove the case from this thread.")]),
+ ]
+
+ modal = Modal(
+ title="Create Case Channel",
+ blocks=blocks,
+ submit="Create Channel",
+ close="Close",
+ callback_id=CaseMigrateActions.submit,
+ private_metadata=context["subject"].json(),
+ ).build()
+ client.views_open(trigger_id=body["trigger_id"], view=modal)
+
+
+@app.action(
+ CaseNotificationActions.update,
+ middleware=[button_context_middleware, db_middleware, user_middleware],
+)
+def update_case_button_click(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+):
+ return handle_update_case_command(
+ ack=ack,
+ body=body,
+ client=client,
+ context=context,
+ db_session=db_session,
+ )
+
+
+@app.action(
+ CaseNotificationActions.user_mfa,
+ middleware=[button_context_middleware, db_middleware, user_middleware],
+)
+def user_mfa_button_click(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+):
+ return handle_engage_user_command(
+ ack=ack,
+ body=body,
+ client=client,
+ context=context,
+ db_session=db_session,
+ )
+
+
+def ack_handle_create_channel_event(ack: Ack, case: Case) -> None:
+ """Handles the case channel creation event."""
+ msg = (
+ "The case already has a dedicated channel. No actions will be performed."
+ if case.has_channel
+ else "Creating a dedicated case channel..."
+ )
+
+ modal = Modal(
+ title="Creating Case Channel",
+ close="Close",
+ blocks=[Section(text=msg)],
+ ).build()
+
+ ack(response_action="update", view=modal)
+
+
+@app.view(
+ CaseMigrateActions.submit,
+ middleware=[
+ action_context_middleware,
+ db_middleware,
+ ],
+)
+def handle_create_channel_event(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+):
+ """Handles the escalation submission event."""
+ case = case_service.get(db_session=db_session, case_id=int(context["subject"].id))
+ ack_handle_create_channel_event(ack=ack, case=case)
+
+ case.dedicated_channel = True
+ db_session.commit()
+
+ msg = (
+ "Creating a dedicated case channel..."
+ if not case.has_channel
+ else "The case already has a dedicated channel. No actions will be performed."
+ )
+
+ modal = Modal(
+ title="Creating Case Channel",
+ close="Close",
+ blocks=[Section(text=msg)],
+ ).build()
+
+ result = client.views_update(
+ view_id=body["view"]["id"],
+ trigger_id=body["trigger_id"],
+ view=modal,
+ )
+
+ channel_id = case.conversation.channel_id
+ thread_id = case.conversation.thread_id
+
+ # Add all case participants to the case channel
+ case_flows.case_create_conversation_flow(
+ db_session=db_session,
+ case=case,
+ participant_emails=case.participant_emails,
+ conversation_target=None,
+ )
+
+    # update the original case message to reflect the new dedicated channel
+ blocks = create_case_message(case=case, channel_id=channel_id)
+ client.chat_update(blocks=blocks, ts=thread_id, channel=channel_id)
+
+ send_success_modal(
+ client=client,
+ view_id=body["view"]["id"],
+ trigger_id=result["trigger_id"],
+ title="Channel Created",
+ message="Case channel created successfully.",
+ )
+
+
+def extract_mentioned_users(text: str) -> list[str]:
+ """Extracts mentioned users from a message."""
+ return re.findall(r"<@(\w+)>", text)
+
+
+def format_emails(emails: list[str]) -> str:
+ """Format a list of names into a string with commas and 'and' before the last name."""
+ usernames = [email.split("@")[0] for email in emails]
+
+ if not usernames:
+ return ""
+ elif len(usernames) == 1:
+ return f"@{usernames[0]}"
+ elif len(usernames) == 2:
+ return f"@{usernames[0]} and @{usernames[1]}"
+ else:
+ return ", ".join(f"@{username}" for username in usernames[:-1]) + f", and @{usernames[-1]}"
+
+
+@message_dispatcher.add(
+ subject=CaseSubjects.case, exclude={"subtype": ["channel_join", "group_join"]}
+) # we ignore user channel and group join messages
+def handle_user_mention(
+ ack: Ack,
+ context: BoltContext,
+ client: WebClient,
+ db_session: Session,
+ payload: dict,
+) -> None:
+ """Handles user posted message events."""
+ ack()
+
+ case = case_service.get(db_session=db_session, case_id=int(context["subject"].id))
+ if not case or case.dedicated_channel:
+        # nothing to do if the case doesn't exist or has a dedicated channel
+ return
+
+ mentioned_users = extract_mentioned_users(payload["text"])
+ users_not_in_case = []
+ for user_id in mentioned_users:
+ user_email = dispatch_slack_service.get_user_email(client, user_id)
+ if user_email and not participant_service.get_by_case_id_and_email(
+ db_session=db_session, case_id=int(context["subject"].id), email=user_email
+ ):
+ users_not_in_case.append(user_email)
+
+ if users_not_in_case:
+ # send a private message to the user who posted the message to see
+ # if they want to add the mentioned user(s) to the case
+ button_metadata = AddUserMetadata(
+ **dict(context["subject"]),
+ users=users_not_in_case,
+ ).json()
+ blocks = [
+ Section(
+ text=f"You mentioned {format_emails(users_not_in_case)}, but they're not in this case."
+ ),
+ Actions(
+ block_id=DefaultBlockIds.add_user_actions,
+ elements=[
+ Button(
+ text="Add Them",
+ style="primary",
+ action_id=CaseNotificationActions.add_user,
+ value=button_metadata,
+ ),
+ Button(
+ text="Do Nothing",
+ action_id=CaseNotificationActions.do_nothing,
+ ),
+ ],
+ ),
+ ]
+ blocks = Message(blocks=blocks).build()["blocks"]
+ client.chat_postEphemeral(
+ channel=payload["channel"],
+ thread_ts=payload.get("thread_ts"),
+ user=payload["user"],
+ blocks=blocks,
+ )
+
+
+@app.action(
+ CaseNotificationActions.add_user,
+ middleware=[add_user_middleware, button_context_middleware, db_middleware, user_middleware],
+)
+def add_users_to_case(
+ ack: Ack,
+ db_session: Session,
+ context: BoltContext,
+ respond: Respond,
+):
+ ack()
+
+ case_id = int(context["subject"].id)
+
+ case = case_service.get(db_session=db_session, case_id=case_id)
+ if not case:
+ log.error(f"Could not find case with id: {case_id}")
+ return
+
+ users = context["users"]
+ if users:
+ for user_email in users:
+ conversation_flows.add_case_participants(case, [user_email], db_session)
+ participant = participant_service.get_by_case_id_and_email(
+ db_session=db_session, case_id=case.id, email=user_email
+ )
+ if not participant:
+ participant_flows.add_participant(
+ user_email,
+ case,
+ db_session,
+ roles=[ParticipantRoleType.participant],
+ )
+
+ # Delete the ephemeral message
+ respond(delete_original=True)
+
+
+@app.action(CaseNotificationActions.do_nothing)
+def handle_do_nothing_button(
+ ack: Ack,
+ respond: Respond,
+):
+ # Acknowledge the action
+ ack()
+
+ try:
+ # Delete the ephemeral message
+ respond(delete_original=True)
+ except SlackApiError as e:
+ log.error(f"Error deleting ephemeral message: {e.response['error']}")
+
+
+@app.action(
+ CaseNotificationActions.join_incident,
+ middleware=[button_context_middleware, db_middleware, user_middleware],
+)
+def join_incident_button_click(
+ ack: Ack, user: DispatchUser, db_session: Session, context: BoltContext
+):
+ ack()
+ case = case_service.get(db_session=db_session, case_id=int(context["subject"].id))
+
+ # we add the user to the incident conversation
+ conversation_flows.add_incident_participants_to_conversation(
+ # TODO: handle case where there are multiple related incidents
+ incident=case.incidents[0],
+ participant_emails=[user.email],
+ db_session=db_session,
+ )
+
+
+@app.action(
+ CaseNotificationActions.invite_user_case,
+ middleware=[button_context_middleware, db_middleware, user_middleware],
+)
+def handle_case_notification_join_button_click(
+ ack: Ack,
+ user: DispatchUser,
+ client: WebClient,
+ respond: Respond,
+ db_session: Session,
+ context: BoltContext,
+):
+ """Handles the case join button click event."""
+ ack()
+ case = case_service.get(db_session=db_session, case_id=int(context["subject"].id))
+
+ if not case:
+ message = "Sorry, we can't invite you to this case. The case does not exist."
+ elif case.visibility == Visibility.restricted:
+ message = "Sorry, we can't invite you to this case. The case's visibility is restricted. Please, reach out to the case assignee if you have any questions."
+ elif case.status == CaseStatus.closed:
+ message = "Sorry, you can't join this case. The case has already been marked as closed. Please, reach out to the case assignee if you have any questions."
+ elif case.status == CaseStatus.escalated:
+ conversation_flows.add_incident_participants_to_conversation(
+ incident=case.incidents[0],
+ participant_emails=[user.email],
+ db_session=db_session,
+ )
+ message = f"The case has already been escalated to incident {case.incidents[0].name}. We've added you to the incident conversation. Please, check your Slack sidebar for the new incident channel."
+ else:
+ user_id = context["user_id"]
+        try:
+            client.conversations_invite(channel=case.conversation.channel_id, users=[user_id])
+            message = f"Success! We've added you to case {case.name}. Please, check your Slack sidebar for the new case channel."
+        except SlackApiError as e:
+            if e.response.get("error") == SlackAPIErrorCode.ALREADY_IN_CHANNEL:
+                message = f"Sorry, we can't invite you to this case - you're already a member. Search for a channel called {case.name.lower()} in your Slack sidebar."
+            else:
+                message = f"Sorry, we ran into an error while inviting you to case {case.name}. Please, reach out to the case assignee."
+
+ respond(text=message, response_type="ephemeral", replace_original=False, delete_original=False)
+
+
+@app.action(CaseNotificationActions.edit, middleware=[button_context_middleware, db_middleware])
+def edit_button_click(
+ ack: Ack, body: dict, db_session: Session, context: BoltContext, client: WebClient
+):
+ ack()
+ case = case_service.get(db_session=db_session, case_id=int(context["subject"].id))
+
+ try:
+ assignee_initial_user = client.users_lookupByEmail(email=case.assignee.individual.email)[
+ "user"
+ ]["id"]
+ except SlackApiError as e:
+ if e.response.get("error") == SlackAPIErrorCode.USERS_NOT_FOUND:
+ log.warning(
+ f"Assignee {case.assignee.individual.email} not found in Slack workspace. "
+ "Using None for initial assignee selection."
+ )
+ assignee_initial_user = None
+ else:
+ raise
+
+ blocks = [
+ title_input(initial_value=case.title),
+ description_input(initial_value=case.description),
+ case_resolution_reason_select(optional=True),
+ resolution_input(initial_value=case.resolution),
+ assignee_select(initial_user=assignee_initial_user),
+ case_status_select(initial_option={"text": case.status, "value": case.status}),
+ case_type_select(
+ db_session=db_session,
+ initial_option={"text": case.case_type.name, "value": case.case_type.id},
+ project_id=case.project.id,
+ ),
+ case_priority_select(
+ db_session=db_session,
+ initial_option={"text": case.case_priority.name, "value": case.case_priority.id},
+ project_id=case.project.id,
+ optional=True,
+ ),
+ ]
+
+ modal = Modal(
+ title="Edit Case",
+ blocks=blocks,
+ submit="Update",
+ close="Close",
+ callback_id=CaseEditActions.submit,
+ private_metadata=context["subject"].json(),
+ ).build()
+ client.views_open(trigger_id=body["trigger_id"], view=modal)
+
+
+@app.view(
+ CaseEditActions.submit,
+ middleware=[
+ action_context_middleware,
+ db_middleware,
+ user_middleware,
+ modal_submit_middleware,
+ ],
+)
+def handle_edit_submission_event(
+ ack: Ack,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+ form_data: dict,
+ user: DispatchUser,
+):
+ ack()
+ case = case_service.get(db_session=db_session, case_id=int(context["subject"].id))
+ previous_case = CaseRead.from_orm(case)
+
+ case_priority = None
+ if form_data.get(DefaultBlockIds.case_priority_select):
+ case_priority = {"name": form_data[DefaultBlockIds.case_priority_select]["name"]}
+
+ case_type = None
+ if form_data.get(DefaultBlockIds.case_type_select):
+ case_type = {"name": form_data[DefaultBlockIds.case_type_select]["name"]}
+
+ case_visibility = case.visibility
+ if form_data.get(DefaultBlockIds.case_visibility_select):
+ case_visibility = form_data[DefaultBlockIds.case_visibility_select]["value"]
+
+ assignee_email = None
+ if form_data.get(DefaultBlockIds.case_assignee_select):
+ assignee_email = client.users_info(
+ user=form_data[DefaultBlockIds.case_assignee_select]["value"]
+ )["user"]["profile"]["email"]
+
+ resolution_reason = None
+ if form_data.get(DefaultBlockIds.case_resolution_reason_select):
+ resolution_reason = form_data[DefaultBlockIds.case_resolution_reason_select]["value"]
+
+ case_in = CaseUpdate(
+ title=form_data[DefaultBlockIds.title_input],
+ description=form_data[DefaultBlockIds.description_input],
+ resolution=form_data[DefaultBlockIds.resolution_input],
+ resolution_reason=resolution_reason,
+ status=form_data[DefaultBlockIds.case_status_select]["name"],
+ visibility=case_visibility,
+ case_priority=case_priority,
+ case_type=case_type,
+ )
+
+ case = case_service.update(db_session=db_session, case=case, case_in=case_in, current_user=user)
+
+ case_flows.case_update_flow(
+ case_id=case.id,
+ previous_case=previous_case,
+ db_session=db_session,
+ reporter_email=case.reporter.individual.email if case.reporter else None,
+ assignee_email=assignee_email,
+ organization_slug=context["subject"].organization_slug,
+ )
+
+ return case
+
+
+@app.action(CaseNotificationActions.resolve, middleware=[button_context_middleware, db_middleware])
+def resolve_button_click(
+ ack: Ack, body: dict, db_session: Session, context: BoltContext, client: WebClient
+):
+ ack()
+ case = case_service.get(db_session=db_session, case_id=int(context["subject"].id))
+
+ reason = case.resolution_reason
+ blocks = [
+ (
+ case_resolution_reason_select(
+ initial_option={"text": reason, "value": reason}, dispatch_action=True
+ )
+ if reason
+ else case_resolution_reason_select(dispatch_action=True)
+ ),
+ Context(elements=[MarkdownText(text="Select a resolution reason to see its description")]),
+ resolution_input(initial_value=case.resolution),
+ ]
+
+ modal = Modal(
+ title="Resolve Case",
+ blocks=blocks,
+ submit="Resolve",
+ close="Close",
+ callback_id=CaseResolveActions.submit,
+ private_metadata=context["subject"].json(),
+ ).build()
+ client.views_open(trigger_id=body["trigger_id"], view=modal)
+
+
+@app.action(
+ DefaultActionIds.case_resolution_reason_select,
+ middleware=[action_context_middleware, db_middleware],
+)
+def handle_resolution_reason_select_action(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+):
+ """Handles the resolution reason select action."""
+ ack()
+
+ # Get the selected resolution reason
+ values = body["view"]["state"]["values"]
+ block_id = DefaultBlockIds.case_resolution_reason_select
+ action_id = DefaultActionIds.case_resolution_reason_select
+ resolution_reason = values[block_id][action_id]["selected_option"]["value"]
+
+ # Get the description for the selected reason
+ try:
+ # Map the resolution reason string to the enum key
+ reason_key = resolution_reason.lower().replace(" ", "_")
+ description = CaseResolutionReasonDescription[reason_key].value
+ except KeyError:
+ description = "No description available"
+
+ # Get the current case
+ case = case_service.get(db_session=db_session, case_id=int(context["subject"].id))
+
+ # Rebuild the modal with the updated description
+ blocks = [
+ case_resolution_reason_select(
+ initial_option={"text": resolution_reason, "value": resolution_reason},
+ dispatch_action=True,
+ ),
+ Context(elements=[MarkdownText(text=f"*Description:* {description}")]),
+ resolution_input(initial_value=case.resolution),
+ ]
+
+ modal = Modal(
+ title="Resolve Case",
+ blocks=blocks,
+ submit="Resolve",
+ close="Close",
+ callback_id=CaseResolveActions.submit,
+ private_metadata=context["subject"].json(),
+ ).build()
+
+ client.views_update(
+ view_id=body["view"]["id"],
+ view=modal,
+ )
+
+
+@app.action(CaseNotificationActions.triage, middleware=[button_context_middleware, db_middleware])
+def triage_button_click(
+ ack: Ack, body: dict, db_session: Session, context: BoltContext, client: WebClient
+):
+ ack()
+ case = case_service.get(db_session=db_session, case_id=int(context["subject"].id))
+ # we run the case status transition flow
+ case_flows.case_status_transition_flow_dispatcher(
+ case=case,
+ current_status=CaseStatus.triage,
+ db_session=db_session,
+ previous_status=case.status,
+ organization_slug=context["subject"].organization_slug,
+ )
+ case.status = CaseStatus.triage
+ db_session.commit()
+
+ # we update the ticket
+ ticket_flows.update_case_ticket(case=case, db_session=db_session)
+
+ case_flows.update_conversation(case, db_session)
+
+
+@app.view(
+ CaseResolveActions.submit,
+ middleware=[action_context_middleware, db_middleware, user_middleware, modal_submit_middleware],
+)
+def handle_resolve_submission_event(
+ ack: Ack,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+ form_data: dict,
+ user: DispatchUser,
+):
+ ack()
+ # we get the current case and store it as previous case
+ current_case = case_service.get(db_session=db_session, case_id=int(context["subject"].id))
+ previous_case = CaseRead.from_orm(current_case)
+
+ # we update the case with the new resolution, resolution reason and status
+ case_in = CaseUpdate(
+ title=current_case.title,
+ resolution_reason=form_data[DefaultBlockIds.case_resolution_reason_select]["value"],
+ resolution=form_data[DefaultBlockIds.resolution_input],
+ visibility=current_case.visibility,
+ status=CaseStatus.closed,
+ )
+ updated_case = case_service.update(
+ db_session=db_session,
+ case=current_case,
+ case_in=case_in,
+ current_user=user,
+ )
+
+ # we update the case notification with the resolution, resolution reason and status
+ blocks = create_case_message(case=updated_case, channel_id=context["subject"].channel_id)
+ client.chat_update(
+ blocks=blocks,
+ ts=updated_case.conversation.thread_id,
+ channel=updated_case.conversation.channel_id,
+ )
+
+ try:
+ # we run the case update flow
+ case_flows.case_update_flow(
+ case_id=updated_case.id,
+ previous_case=previous_case,
+ db_session=db_session,
+ reporter_email=(
+ updated_case.reporter.individual.email if updated_case.reporter else None
+ ),
+ assignee_email=(
+ updated_case.assignee.individual.email if updated_case.assignee else None
+ ),
+ organization_slug=context["subject"].organization_slug,
+ )
+ except Exception as e:
+ log.error(f"Error running case update flow from Slack plugin: {e}")
+
+
+@app.shortcut(CaseShortcutCallbacks.report, middleware=[db_middleware, shortcut_context_middleware])
+def report_issue(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+):
+ ack()
+ initial_description = None
+ if body.get("message"):
+ permalink = (
+ client.chat_getPermalink(
+ channel=context["subject"].channel_id, message_ts=body["message"]["ts"]
+ )
+ )["permalink"]
+ initial_description = f"{body['message']['text']}\n\n{permalink}"
+
+ blocks = [
+ Context(
+ elements=[
+ MarkdownText(
+ text="Cases are meant for triaging events that do not raise to the level of incidents, but can be escalated to incidents if necessary."
+ )
+ ]
+ ),
+ title_input(),
+ description_input(initial_value=initial_description),
+ project_select(
+ db_session=db_session,
+ action_id=CaseReportActions.project_select,
+ dispatch_action=True,
+ ),
+ ]
+
+ modal = Modal(
+ title="Open a Case",
+ blocks=blocks,
+ submit="Report",
+ close="Close",
+ callback_id=CaseReportActions.submit,
+ private_metadata=context["subject"].json(),
+ ).build()
+
+ client.views_open(trigger_id=body["trigger_id"], view=modal)
+
+
+@app.action(CaseReportActions.project_select, middleware=[db_middleware, action_context_middleware])
+def handle_report_project_select_action(
+ ack: Ack,
+ body: dict,
+ db_session: Session,
+ context: BoltContext,
+ client: WebClient,
+) -> None:
+    """Rebuilds the case report modal when a project is selected."""
+ ack()
+ values = body["view"]["state"]["values"]
+
+ project_id = int(
+ values[DefaultBlockIds.project_select][CaseReportActions.project_select]["selected_option"][
+ "value"
+ ]
+ )
+
+ project = project_service.get(
+ db_session=db_session,
+ project_id=project_id,
+ )
+
+ blocks = [
+ title_input(),
+ description_input(),
+ project_select(
+ db_session=db_session,
+ initial_option={"text": project.display_name, "value": project.id},
+ action_id=CaseReportActions.project_select,
+ dispatch_action=True,
+ ),
+ case_type_select(
+ db_session=db_session,
+ initial_option=None,
+ project_id=project.id,
+ action_id=CaseReportActions.case_type_select,
+ dispatch_action=True,
+ ),
+ Context(
+ elements=[
+ MarkdownText(
+ text="đĄ Case Types determine the initial assignee based on their configured on-call schedule."
+ )
+ ]
+ ),
+ ]
+
+ modal = Modal(
+ title="Open a Case",
+ blocks=blocks,
+ submit="Report",
+ close="Close",
+ callback_id=CaseReportActions.submit,
+ private_metadata=context["subject"].json(),
+ ).build()
+
+ client.views_update(
+ view_id=body["view"]["id"],
+ trigger_id=body["trigger_id"],
+ view=modal,
+ )
+
+
+@app.action(
+ CaseReportActions.case_type_select,
+ middleware=[
+ db_middleware,
+ action_context_middleware,
+ ],
+)
+def handle_report_case_type_select_action(
+ ack: Ack,
+ body: dict,
+ db_session: Session,
+ context: BoltContext,
+ client: WebClient,
+) -> None:
+ ack()
+ values = body["view"]["state"]["values"]
+
+ project_id = int(
+ values[DefaultBlockIds.project_select][CaseReportActions.project_select]["selected_option"][
+ "value"
+ ]
+ )
+
+ case_type_id = int(
+ values[DefaultBlockIds.case_type_select][CaseReportActions.case_type_select][
+ "selected_option"
+ ]["value"]
+ )
+
+ project = project_service.get(
+ db_session=db_session,
+ project_id=project_id,
+ )
+
+ case_type = case_type_service.get(
+ db_session=db_session,
+ case_type_id=case_type_id,
+ )
+
+ assignee_email = None
+ assignee_slack_id = None
+ oncall_service_name = None
+ service_url = None
+
+ if case_type.oncall_service:
+ assignee_email = service_flows.resolve_oncall(
+ service=case_type.oncall_service, db_session=db_session
+ )
+ oncall_service_name = case_type.oncall_service.name
+
+ oncall_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=project.id, plugin_type="oncall"
+ )
+ if not oncall_plugin:
+ log.debug("Unable to resolve the on-call service URL since the oncall plugin is not active.")
+ else:
+ service_url = oncall_plugin.instance.get_service_url(
+ case_type.oncall_service.external_id
+ )
+
+ if assignee_email:
+ # Get the Slack user ID for the assignee
+ try:
+ assignee_slack_id = client.users_lookupByEmail(email=assignee_email)["user"]["id"]
+ except SlackApiError:
+ log.error(f"Failed to find Slack user for email: {assignee_email}")
+ assignee_slack_id = None
+
+ blocks = [
+ title_input(),
+ description_input(),
+ project_select(
+ db_session=db_session,
+ initial_option={"text": project.display_name, "value": project.id},
+ action_id=CaseReportActions.project_select,
+ dispatch_action=True,
+ ),
+ case_type_select(
+ db_session=db_session,
+ initial_option={"text": case_type.name, "value": case_type.id},
+ project_id=project.id,
+ action_id=CaseReportActions.case_type_select,
+ dispatch_action=True,
+ ),
+ Context(
+ elements=[
+ MarkdownText(
+ text="💡 Case Types determine the initial assignee based on their configured on-call schedule."
+ )
+ ]
+ ),
+ ]
+
+ # Create a new assignee_select block with a unique block_id
+ new_block_id = f"{DefaultBlockIds.case_assignee_select}_{case_type_id}"
+ blocks.append(
+ assignee_select(
+ initial_user=assignee_slack_id if assignee_slack_id else None,
+ action_id=CaseReportActions.assignee_select,
+ block_id=new_block_id,
+ ),
+ )
+
+ # Conditionally add context blocks
+ if oncall_service_name and assignee_email:
+ if service_url:
+ oncall_text = (
+ f"👩‍🚒 {assignee_email} is on-call for <{service_url}|{oncall_service_name}>"
+ )
+ else:
+ oncall_text = f"👩‍🚒 {assignee_email} is on-call for {oncall_service_name}"
+
+ blocks.extend(
+ [
+ Context(elements=[MarkdownText(text=oncall_text)]),
+ Divider(),
+ Context(
+ elements=[
+ MarkdownText(
+ text="Not who you're looking for? You can override the assignee for this case."
+ )
+ ]
+ ),
+ ]
+ )
+ else:
+ blocks.extend(
+ [
+ Context(
+ elements=[
+ MarkdownText(
+ text="There is no on-call service associated with this case type."
+ )
+ ]
+ ),
+ Context(elements=[MarkdownText(text="Please select an assignee for this case.")]),
+ ]
+ )
+
+ blocks.append(
+ case_priority_select(
+ db_session=db_session,
+ project_id=project.id,
+ initial_option=None,
+ optional=True,
+ block_id=None, # ensures state is reset
+ ),
+ )
+
+ modal = Modal(
+ title="Open a Case",
+ blocks=blocks,
+ submit="Report",
+ close="Close",
+ callback_id=CaseReportActions.submit,
+ private_metadata=context["subject"].json(),
+ ).build()
+
+ client.views_update(
+ view_id=body["view"]["id"],
+ view=modal,
+ )
+
+
+def ack_report_case_submission_event(ack: Ack) -> None:
+ """Handles the report case submission event acknowledgment."""
+ modal = Modal(
+ title="Open a Case",
+ close="Close",
+ blocks=[Section(text="Creating case resources...")],
+ ).build()
+ ack(response_action="update", view=modal)
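Because case creation takes longer than Slack's three-second acknowledgement window, the handler above acks immediately with `response_action: "update"` to swap in a progress modal. A plain-dict sketch of the payload shape Slack receives; the real code builds the view with blockkit, and the field values here are illustrative:

```python
def build_ack_update(title: str, message: str) -> dict:
    """Build the payload shape Slack expects when a view submission is
    acknowledged with an in-place update.

    Plain-dict sketch of what blockkit's Modal(...).build() produces wrapped
    by Bolt's ack(); field values are illustrative, not Dispatch's constants.
    """
    return {
        "response_action": "update",
        "view": {
            "type": "modal",
            "title": {"type": "plain_text", "text": title},
            "close": {"type": "plain_text", "text": "Close"},
            "blocks": [
                {"type": "section", "text": {"type": "mrkdwn", "text": message}},
            ],
        },
    }


payload = build_ack_update("Open a Case", "Creating case resources...")
```

Slack replaces the open modal with this view, so the user sees immediate progress while the slower resource creation continues in the handler.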
+
+
+@app.view(
+ CaseReportActions.submit,
+ middleware=[db_middleware, action_context_middleware, modal_submit_middleware, user_middleware],
+)
+def handle_report_submission_event(
+ ack: Ack,
+ body: dict,
+ context: BoltContext,
+ form_data: dict,
+ db_session: Session,
+ user: DispatchUser,
+ client: WebClient,
+):
+ ack_report_case_submission_event(ack=ack)
+
+ case_priority = None
+ if form_data.get(DefaultBlockIds.case_priority_select):
+ case_priority = {"name": form_data[DefaultBlockIds.case_priority_select]["name"]}
+
+ case_type = None
+ if form_data.get(DefaultBlockIds.case_type_select):
+ case_type = {"name": form_data[DefaultBlockIds.case_type_select]["name"]}
+
+ assignee_block_id = next(
+ (key for key in form_data.keys() if key.startswith(DefaultBlockIds.case_assignee_select)),
+ None,
+ )
+
+ if not assignee_block_id:
+ raise ValueError("Assignee block not found in form data")
+
+ assignee_email = client.users_info(user=form_data[assignee_block_id]["value"])["user"][
+ "profile"
+ ]["email"]
+
+ case_in = CaseCreate(
+ title=form_data[DefaultBlockIds.title_input],
+ description=form_data[DefaultBlockIds.description_input],
+ status=CaseStatus.new,
+ case_priority=case_priority,
+ case_type=case_type,
+ dedicated_channel=True,
+ reporter=ParticipantUpdate(individual=IndividualContactRead(email=user.email)),
+ assignee=ParticipantUpdate(individual=IndividualContactRead(email=assignee_email)),
+ )
+
+ case = case_service.create(db_session=db_session, case_in=case_in, current_user=user)
+
+ modal = Modal(
+ title="Case Created",
+ close="Close",
+ blocks=[Section(text="Running case execution flows...")],
+ ).build()
+
+ result = client.views_update(
+ view_id=body["view"]["id"],
+ trigger_id=body["trigger_id"],
+ view=modal,
+ )
+
+ case_flows.case_new_create_flow(
+ case_id=case.id,
+ db_session=db_session,
+ organization_slug=context["subject"].organization_slug,
+ )
+
+ send_success_modal(
+ client=client,
+ view_id=body["view"]["id"],
+ trigger_id=result["trigger_id"],
+ title="Case Created",
+ message="Case created successfully.",
+ )
+
+
+@app.action(
+ SignalEngagementActions.approve,
+ middleware=[
+ engagement_button_context_middleware,
+ db_middleware,
+ user_middleware,
+ ],
+)
+def engagement_button_approve_click(
+ ack: Ack,
+ body: dict,
+ db_session: Session,
+ context: BoltContext,
+ client: WebClient,
+ user: DispatchUser,
+):
+ ack()
+
+ # Engaged user is extracted from the context of the engagement button, which stores the email
+ # address of the user who was engaged, and is parsed by the engagement_button_context_middleware.
+ engaged_user = context["subject"].user
+
+ user_who_clicked_button = user
+
+ # We check the role of the user who clicked the button to ensure they are authorized to approve
+ role = user_who_clicked_button.get_organization_role(
+ organization_slug=context["subject"].organization_slug
+ )
+
+ # If the user who clicked the button is not the engaged user or a Dispatch admin,
+ # we return a modal informing them that they are not authorized to approve the signal.
+ if engaged_user != user_who_clicked_button.email and role not in (
+ UserRoles.admin,
+ UserRoles.owner,
+ ):
+ modal = Modal(
+ title="Not Authorized",
+ close="Close",
+ blocks=[
+ Section(
+ text=f"Sorry, only {engaged_user} or Dispatch administrators can approve this signal."
+ )
+ ],
+ ).build()
+ return client.views_open(trigger_id=body["trigger_id"], view=modal)
+
+ engagement = signal_service.get_signal_engagement(
+ db_session=db_session,
+ signal_engagement_id=int(context["subject"].engagement_id),
+ )
+
+ mfa_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=int(context["subject"].project_id), plugin_type="auth-mfa"
+ )
+
+ require_mfa = engagement.require_mfa if engagement else True
+ mfa_enabled = bool(mfa_plugin and require_mfa)
+
+ blocks = [
+ Section(text="Confirm that this is expected and that it is not suspicious behavior."),
+ Divider(),
+ description_input(label="Additional Context", optional=False),
+ ]
+
+ if mfa_enabled:
+ blocks.append(Section(text=" "))
+ blocks.append(
+ Context(
+ elements=[
+ MarkdownText(
+ text="💡 After submission, you will be asked to validate your identity by completing a Multi-Factor Authentication challenge."
+ )
+ ]
+ ),
+ )
+
+ modal = Modal(
+ submit="Submit",
+ close="Cancel",
+ title="Confirmation",
+ callback_id=SignalEngagementActions.approve_submit,
+ private_metadata=context["subject"].json(),
+ blocks=blocks,
+ ).build()
+ client.views_open(trigger_id=body["trigger_id"], view=modal)
+
+
+def ack_mfa_required_submission_event(
+ ack: Ack, mfa_enabled: bool, challenge_url: str | None = None
+) -> None:
+ """Handles the MFA-required submission event acknowledgement."""
+
+ blocks = []
+
+ if mfa_enabled:
+ blocks.extend(
+ [
+ Section(
+ text="To complete this action, you need to verify your identity through Multi-Factor Authentication (MFA).\n\n"
+ "Please click the verify button to open the MFA verification page."
+ ),
+ Actions(
+ elements=[
+ Button(
+ text="🔐 Verify",
+ action_id="button-link",
+ style="primary",
+ url=challenge_url,
+ )
+ ]
+ ),
+ ]
+ )
+ else:
+ blocks.append(
+ Section(
+ text="✅ No additional verification required. You can proceed with the confirmation."
+ )
+ )
+
+ blocks.extend(
+ [
+ Divider(),
+ Context(
+ elements=[
+ MarkdownText(
+ text="💡 This step protects against unauthorized confirmation if your account is compromised."
+ )
+ ]
+ ),
+ ]
+ )
+
+ modal = Modal(
+ title="Verify Your Identity",
+ close="Cancel",
+ blocks=blocks,
+ ).build()
+
+ ack(response_action="update", view=modal)
+
+
+@app.view(
+ SignalEngagementActions.approve_submit,
+ middleware=[
+ action_context_middleware,
+ db_middleware,
+ user_middleware,
+ ],
+)
+def handle_engagement_submission_event(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+ user: DispatchUser,
+) -> None:
+ """Handles the engagement approve submission event."""
+ metadata = json.loads(body["view"]["private_metadata"])
+ engaged_user: str = metadata["user"]
+
+ # we reassign for clarity
+ user_who_clicked_button = user
+
+ engagement = signal_service.get_signal_engagement(
+ db_session=db_session,
+ signal_engagement_id=int(metadata["engagement_id"]),
+ )
+
+ mfa_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=int(context["subject"].project_id), plugin_type="auth-mfa"
+ )
+ if not mfa_plugin:
+ log.error("Unable to engage user. No enabled MFA plugin found.")
+ return
+
+ require_mfa = engagement.require_mfa if engagement else True
+ mfa_enabled = bool(mfa_plugin and require_mfa)
+
+ challenge, challenge_url = mfa_plugin.instance.create_mfa_challenge(
+ action="signal-engagement-confirmation",
+ current_user=user,
+ db_session=db_session,
+ project_id=int(context["subject"].project_id),
+ )
+
+ ack_mfa_required_submission_event(ack=ack, mfa_enabled=mfa_enabled, challenge_url=challenge_url)
+
+ case = case_service.get(db_session=db_session, case_id=int(metadata["id"]))
+ signal_instance = (
+ signal_service.get_signal_instance(
+ db_session=db_session, signal_instance_id=UUID(metadata["signal_instance_id"])
+ )
+ if metadata["signal_instance_id"]
+ else None
+ )
+ # Get context provided by the user
+ context_from_user = body["view"]["state"]["values"][DefaultBlockIds.description_input][
+ DefaultBlockIds.description_input
+ ]["value"]
+
+ # wait for the mfa challenge
+ response = mfa_plugin.instance.wait_for_challenge(
+ challenge_id=challenge.challenge_id,
+ db_session=db_session,
+ )
+ if response == MfaChallengeStatus.APPROVED:
+ send_engagement_response(
+ case=case,
+ client=client,
+ context_from_user=context_from_user,
+ db_session=db_session,
+ engagement=engagement,
+ engaged_user=engaged_user,
+ response=response,
+ signal_instance=signal_instance,
+ user=user_who_clicked_button,
+ view_id=body["view"]["id"],
+ thread_id=context["subject"].thread_id,
+ )
+ db_session.commit()
+ return
+ else:
+ return send_engagement_response(
+ case=case,
+ client=client,
+ context_from_user=context_from_user,
+ db_session=db_session,
+ engagement=engagement,
+ engaged_user=engaged_user,
+ response=response,
+ signal_instance=signal_instance,
+ user=user_who_clicked_button,
+ view_id=body["view"]["id"],
+ thread_id=context["subject"].thread_id,
+ )
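`wait_for_challenge` blocks the handler until the MFA challenge resolves, and its return value selects the approved or failed response path. A minimal sketch of how such a polling wait might look, assuming a hypothetical `fetch_status` callable in place of the plugin's database read keyed by `challenge_id`:

```python
import time
from enum import Enum


class MfaChallengeStatus(str, Enum):
    """Hypothetical status values; Dispatch defines its own enum."""

    PENDING = "pending"
    APPROVED = "approved"
    DENIED = "denied"
    EXPIRED = "expired"


def wait_for_challenge(fetch_status, timeout: float = 300.0, interval: float = 1.0) -> MfaChallengeStatus:
    """Poll the challenge status until it leaves PENDING or the timeout elapses.

    fetch_status stands in for a database read of the challenge row; this is
    a sketch of the blocking behavior, not Dispatch's implementation.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = fetch_status()
        if status != MfaChallengeStatus.PENDING:
            return status
        time.sleep(interval)
    return MfaChallengeStatus.EXPIRED
```

Timing out maps to `EXPIRED`, which the response sender above translates into the "MFA request timed out" message.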
+
+
+def send_engagement_response(
+ case: Case,
+ client: WebClient,
+ context_from_user: str,
+ db_session: Session,
+ engagement: SignalEngagement | None,
+ engaged_user: str,
+ response: str,
+ signal_instance: SignalInstance | None,
+ user: DispatchUser,
+ view_id: str,
+ thread_id: str,
+):
+ if response == MfaChallengeStatus.APPROVED:
+ title = "Approve"
+ text = "Confirmation... Success!"
+ message_text = f"{engaged_user} provided the following context:\n```{context_from_user}```"
+ engagement_status = SignalEngagementStatus.approved
+ else:
+ title = "MFA Failed"
+ engagement_status = SignalEngagementStatus.denied
+
+ if response == MfaChallengeStatus.EXPIRED:
+ text = "Confirmation failed, the MFA request timed out. Please try again and complete the MFA verification within the given time frame."
+ elif response == MfaChallengeStatus.DENIED:
+ text = f"We couldn't find {engaged_user} in our MFA system."
+ else:
+ text = "Confirmation failed. You must accept the MFA prompt."
+
+ message_text = f":warning: {engaged_user} attempted to confirm the behavior as expected, but we ran into an error during MFA validation (`{response}`)\n\n{text}\n\n *Context Provided* \n```{context_from_user}```\n\n"
+
+ send_success_modal(
+ client=client,
+ view_id=view_id,
+ title=title,
+ message=text,
+ )
+ client.chat_postMessage(
+ text=message_text,
+ channel=case.conversation.channel_id,
+ thread_ts=case.conversation.thread_id,
+ )
+
+ if response == MfaChallengeStatus.APPROVED:
+ # We only update the engagement message (which removes the Confirm/Deny buttons) on
+ # success; this allows the user to retry the confirmation if the MFA check failed.
+ if not engagement:
+ # assume the message is from a manual MFA challenge
+ blocks = create_manual_engagement_message(
+ case=case,
+ channel_id=case.conversation.channel_id,
+ user_email=engaged_user,
+ engagement_status=engagement_status,
+ )
+ else:
+ blocks = create_signal_engagement_message(
+ case=case,
+ channel_id=case.conversation.channel_id,
+ engagement=engagement,
+ signal_instance=signal_instance,
+ user_email=engaged_user,
+ engagement_status=engagement_status,
+ )
+ if signal_instance:
+ client.chat_update(
+ blocks=blocks,
+ channel=case.conversation.channel_id,
+ ts=signal_instance.engagement_thread_ts,
+ )
+ resolve_case(
+ case=case,
+ channel_id=case.conversation.channel_id,
+ client=client,
+ db_session=db_session,
+ context_from_user=context_from_user,
+ user=user,
+ )
+ else:
+ client.chat_update(
+ blocks=blocks,
+ channel=case.conversation.channel_id,
+ ts=thread_id,
+ )
+
+
+def resolve_case(
+ case: Case,
+ channel_id: str,
+ client: WebClient,
+ db_session: Session,
+ context_from_user: str,
+ user: DispatchUser,
+) -> None:
+ previous_case = CaseRead.from_orm(case)
+ case_flows.case_status_transition_flow_dispatcher(
+ case=case,
+ current_status=CaseStatus.closed,
+ db_session=db_session,
+ previous_status=case.status,
+ organization_slug=case.project.organization.slug,
+ )
+ case_in = CaseUpdate(
+ title=case.title,
+ resolution_reason=CaseResolutionReason.user_acknowledged,
+ resolution=context_from_user,
+ visibility=case.visibility,
+ status=CaseStatus.closed,
+ closed_at=datetime.utcnow(),
+ )
+ case = case_service.update(db_session=db_session, case=case, case_in=case_in, current_user=user)
+
+ case_flows.case_update_flow(
+ case_id=case.id,
+ previous_case=previous_case,
+ db_session=db_session,
+ reporter_email=case.reporter.individual.email if case.reporter else None,
+ assignee_email=case.assignee.individual.email if case.assignee else None,
+ organization_slug=case.project.organization.slug,
+ )
+
+ blocks = create_case_message(case=case, channel_id=channel_id)
+ client.chat_update(
+ blocks=blocks, ts=case.conversation.thread_id, channel=case.conversation.channel_id
+ )
+
+
+@app.action(
+ SignalEngagementActions.deny,
+ middleware=[engagement_button_context_middleware, db_middleware, user_middleware],
+)
+def engagement_button_deny_click(
+ ack: Ack,
+ body: dict,
+ context: BoltContext,
+ client: WebClient,
+ user: DispatchUser,
+):
+ ack()
+ engaged_user = context["subject"].user
+
+ role = user.get_organization_role(organization_slug=context["subject"].organization_slug)
+ if engaged_user != user.email and role not in (
+ UserRoles.admin,
+ UserRoles.owner,
+ ):
+ modal = Modal(
+ title="Not Authorized",
+ close="Close",
+ blocks=[
+ Section(
+ text=f"Sorry, only {engaged_user} or Dispatch administrators can deny this signal."
+ )
+ ],
+ ).build()
+ return client.views_open(trigger_id=body["trigger_id"], view=modal)
+
+ modal = Modal(
+ submit="Submit",
+ close="Cancel",
+ title="Not expected",
+ callback_id=SignalEngagementActions.deny_submit,
+ private_metadata=context["subject"].json(),
+ blocks=[
+ Section(text="Confirm that this is not expected and that the activity is suspicious."),
+ Divider(),
+ description_input(label="Additional Context", optional=False),
+ ],
+ ).build()
+ client.views_open(trigger_id=body["trigger_id"], view=modal)
+
+
+def ack_engagement_deny_submission_event(ack: Ack) -> None:
+ """Handles the deny engagement submission event acknowledgement."""
+ modal = Modal(
+ title="Confirm",
+ close="Close",
+ blocks=[Section(text="Confirming event is not expected...")],
+ ).build()
+ ack(response_action="update", view=modal)
+
+
+@app.view(
+ SignalEngagementActions.deny_submit,
+ middleware=[
+ action_context_middleware,
+ db_middleware,
+ user_middleware,
+ ],
+)
+def handle_engagement_deny_submission_event(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+ user: DispatchUser,
+) -> None:
+ """Handles the engagement deny submission event."""
+ ack_engagement_deny_submission_event(ack=ack)
+
+ metadata = json.loads(body["view"]["private_metadata"])
+ engaged_user = metadata["user"]
+ case = case_service.get(db_session=db_session, case_id=metadata["id"])
+ signal_instance = (
+ signal_service.get_signal_instance(
+ db_session=db_session, signal_instance_id=UUID(metadata["signal_instance_id"])
+ )
+ if metadata["signal_instance_id"]
+ else None
+ )
+
+ engagement = signal_service.get_signal_engagement(
+ db_session=db_session,
+ signal_engagement_id=int(metadata["engagement_id"]),
+ )
+ send_success_modal(
+ client=client,
+ view_id=body["view"]["id"],
+ title="Confirm",
+ message="Event has been confirmed as not expected.",
+ )
+
+ context_from_user = body["view"]["state"]["values"][DefaultBlockIds.description_input][
+ DefaultBlockIds.description_input
+ ]["value"]
+
+ client.chat_postMessage(
+ text=f":warning: {engaged_user} confirmed the behavior was *not expected*.\n\n *Context Provided* \n```{context_from_user}```",
+ channel=case.conversation.channel_id,
+ thread_ts=case.conversation.thread_id,
+ )
+
+ thread_ts = (
+ signal_instance.engagement_thread_ts if signal_instance else context["subject"].thread_id
+ )
+ blocks = create_signal_engagement_message(
+ case=case,
+ channel_id=case.conversation.channel_id,
+ engagement=engagement,
+ signal_instance=signal_instance,
+ user_email=user.email,
+ engagement_status=SignalEngagementStatus.denied,
+ )
+ client.chat_update(
+ blocks=blocks,
+ channel=case.conversation.channel_id,
+ ts=thread_ts,
+ )
+
+
+@app.action(
+ CaseNotificationActions.investigate, middleware=[button_context_middleware, db_middleware]
+)
+def investigate_button_click(
+ ack: Ack, body: dict, db_session: Session, context: BoltContext, client: WebClient
+):
+ ack()
+
+ case = case_service.get(db_session=db_session, case_id=int(context["subject"].id))
+
+ if not case:
+ log.error("Unable to open an investigation. Case not found.")
+ client.chat_postMessage(
+ text=":warning: Unable to open an investigation. Case not found.",
+ channel=case.conversation.channel_id,
+ thread_ts=case.conversation.thread_id,
+ )
+ return
+
+ investigation_plugin = plugin_service.get_active_instance(
+ db_session=db_session,
+ project_id=case.project.id,
+ plugin_type="investigation-tooling",
+ )
+
+ if not investigation_plugin:
+ log.error("Unable to open an investigation. No investigation tooling plugin found.")
+ client.chat_postMessage(
+ text=f":warning: Unable to open an investigation. Investigation tooling plugin is not enabled for project {case.project.name}.",
+ channel=case.conversation.channel_id,
+ thread_ts=case.conversation.thread_id,
+ )
+ return
+
+ result = investigation_plugin.instance.create_investigation(case=case)
+
+ if not result:
+ log.error(
+ "Unable to open an investigation. Investigation tooling plugin failed to create an investigation."
+ )
+ client.chat_postMessage(
+ text=":warning: Unable to open an investigation. Investigation tooling plugin failed to create an investigation.",
+ channel=case.conversation.channel_id,
+ thread_ts=case.conversation.thread_id,
+ )
+ return
+
+ client.chat_postMessage(
+ text=f":mag: {result}",
+ channel=case.conversation.channel_id,
+ thread_ts=case.conversation.thread_id,
+ )
diff --git a/src/dispatch/plugins/dispatch_slack/case/messages.py b/src/dispatch/plugins/dispatch_slack/case/messages.py
new file mode 100644
index 000000000000..16803a846604
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_slack/case/messages.py
@@ -0,0 +1,584 @@
+import logging
+from typing import NamedTuple
+
+from blockkit import (
+ Actions,
+ Button,
+ Context,
+ Divider,
+ MarkdownText,
+ Message,
+ Section,
+)
+from blockkit.surfaces import Block
+from dispatch.plugins.dispatch_slack.service import create_genai_message_metadata_blocks
+from sqlalchemy.orm import Session
+
+from dispatch.ai import service as ai_service
+from dispatch.case import service as case_service
+from dispatch.case.enums import CaseStatus
+from dispatch.case.models import Case
+from dispatch.config import DISPATCH_UI_URL
+from dispatch.plugin import service as plugin_service
+from dispatch.plugins.dispatch_slack.case.enums import (
+ CaseNotificationActions,
+ SignalEngagementActions,
+ SignalNotificationActions,
+)
+from dispatch.plugins.dispatch_slack.config import (
+ MAX_SECTION_TEXT_LENGTH,
+)
+from dispatch.plugins.dispatch_slack.models import (
+ CaseSubjects,
+ EngagementMetadata,
+ SignalSubjects,
+ SubjectMetadata,
+)
+from dispatch.signal import service as signal_service
+from dispatch.signal.enums import SignalEngagementStatus
+from dispatch.signal.models import (
+ SignalEngagement,
+ SignalInstance,
+)
+
+log = logging.getLogger(__name__)
+
+
+def map_priority_color(color: str) -> str:
+ """Maps a priority color to its corresponding emoji symbol."""
+ if not color:
+ return ""
+
+ # TODO we should probably restrict the possible colors to make this work
+ priority_color_mapping = {
+ "#9e9e9e": "⚪",
+ "#8bc34a": "🟢",
+ "#ffeb3b": "🟡",
+ "#ff9800": "🟠",
+ "#f44336": "🔴",
+ "#9c27b0": "🟣",
+ }
+
+ return priority_color_mapping.get(color.lower(), "")
+
+
+def create_case_message(case: Case, channel_id: str) -> list[Block]:
+ """
+ Creates a Slack message for a given case.
+
+ Args:
+ case (Case): The case object containing details to be included in the message.
+ channel_id (str): The ID of the Slack channel where the message will be sent.
+
+ Returns:
+ list[Block]: A list of Block objects representing the structure of the Slack message.
+ """
+ priority_color = map_priority_color(color=case.case_priority.color)
+
+ title_prefix = "*Detection*" if case.signal_instances else "*Title*"
+ title = f"{title_prefix} \n {case.title}."
+
+ fields = [
+ f"*Assignee* \n {case.assignee.individual.email}",
+ f"*Status* \n {case.status}",
+ f"*Case Type* \n {case.case_type.name}",
+ f"*Case Priority* \n {priority_color} {case.case_priority.name}",
+ ]
+
+ if case.signal_instances:
+ if variant := case.signal_instances[0].signal.variant:
+ fields.append(f"*Variant* \n {variant}")
+
+ case_description = (
+ case.description if len(case.description) <= 2500 else f"{case.description[:2500]}..."
+ )
+
+ blocks = [
+ Context(elements=[MarkdownText(text=f"*{case.name} - Case Details*")]),
+ Section(
+ text=title,
+ accessory=Button(
+ text="View in Dispatch",
+ action_id="button-link",
+ url=f"{DISPATCH_UI_URL}/{case.project.organization.slug}/cases/{case.name}",
+ ),
+ ),
+ Section(text=f"*Description* \n {case_description}"),
+ Section(fields=fields),
+ Section(text="*Actions*"),
+ ]
+
+ button_metadata = SubjectMetadata(
+ type=CaseSubjects.case,
+ organization_slug=case.project.organization.slug,
+ id=case.id,
+ project_id=case.project.id,
+ channel_id=channel_id,
+ ).json()
+
+ if case.has_channel:
+ action_buttons = [
+ Button(
+ text=":slack: Case Channel",
+ style="primary",
+ url=case.conversation.weblink if case.conversation else "",
+ )
+ ]
+ blocks.extend([Actions(elements=action_buttons)])
+ elif case.status == CaseStatus.escalated:
+ blocks.extend(
+ [
+ Actions(
+ elements=[
+ Button(
+ text=":siren: Join Incident",
+ action_id=CaseNotificationActions.join_incident,
+ style="primary",
+ value=button_metadata,
+ )
+ ]
+ )
+ ]
+ )
+ elif case.status == CaseStatus.closed:
+ blocks.extend(
+ [
+ Section(text=f"*Resolution reason* \n {case.resolution_reason}"),
+ Section(
+ text=f"*Resolution description* \n {case.resolution}"[:MAX_SECTION_TEXT_LENGTH]
+ ),
+ ]
+ )
+ if case.resolved_by:
+ blocks.append(Section(text=f"*Resolved by* \n {case.resolved_by.email}"))
+ blocks.append(
+ Actions(
+ elements=[
+ Button(
+ text="Re-open",
+ action_id=CaseNotificationActions.reopen,
+ style="primary",
+ value=button_metadata,
+ )
+ ]
+ ),
+ )
+ else:
+ action_buttons = [
+ Button(
+ text=":white_check_mark: Resolve",
+ action_id=CaseNotificationActions.resolve,
+ value=button_metadata,
+ ),
+ Button(
+ text=":pencil: Edit",
+ action_id=CaseNotificationActions.edit,
+ value=button_metadata,
+ ),
+ Button(
+ text=":slack: Create Channel",
+ action_id=CaseNotificationActions.migrate,
+ value=button_metadata,
+ ),
+ Button(
+ text=":fire: Escalate",
+ action_id=CaseNotificationActions.escalate,
+ value=button_metadata,
+ ),
+ ]
+ if case.status == CaseStatus.new:
+ action_buttons.insert(
+ 0,
+ Button(
+ text=":mag: Triage",
+ action_id=CaseNotificationActions.triage,
+ value=button_metadata,
+ ),
+ )
+ blocks.extend([Actions(elements=action_buttons)])
+
+ return Message(blocks=blocks).build()["blocks"]
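`create_case_message` truncates long descriptions to 2,500 characters before placing them in a section block, keeping the text under Slack's ~3,000-character section limit. The truncation rule in isolation (the 2,500 limit mirrors the hard-coded value above):

```python
def truncate_description(description: str, limit: int = 2500) -> str:
    """Trim long case descriptions so the Slack section block stays under
    Slack's section text limit, appending an ellipsis when cut."""
    return description if len(description) <= limit else f"{description[:limit]}..."
```

Short descriptions pass through untouched; anything longer is cut at the limit and suffixed with `...`.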
+
+
+class EntityGroup(NamedTuple):
+ value: str
+ related_case_count: int
+
+
+def create_signal_message(case_id: int, channel_id: str, db_session: Session) -> list[Message]:
+ """
+ Creates a signal message for a given case.
+
+ This function generates a signal message for a specific case by fetching the first signal instance
+ associated with the case and creating metadata blocks for the message.
+
+ Args:
+ case_id (int): The ID of the case for which to create the signal message.
+ channel_id (str): The ID of the Slack channel where the message will be sent.
+ db_session (Session): The database session to use for querying signal instances.
+
+ Returns:
+ list[Message]: A list of Message objects representing the structure of the Slack messages.
+ """
+ # we fetch the first instance to get the organization slug and project id
+ instances = signal_service.get_instances_in_case(db_session=db_session, case_id=case_id)
+ (first_instance_id, first_instance_signal) = instances.first()
+
+ case = case_service.get(db_session=db_session, case_id=case_id)
+
+ # we create the signal metadata blocks
+ signal_metadata_blocks = [
+ Section(text="*Alerts*"),
+ Section(
+ text=f"We observed <{DISPATCH_UI_URL}/{first_instance_signal.project.organization.slug}/cases/{case.name}/signal/{first_instance_id}|{instances.count()} alert(s)> in this case. The first alert for this case can be seen below."
+ ),
+ ]
+
+ return Message(blocks=signal_metadata_blocks).build()["blocks"]
+
+
+def create_action_buttons_message(
+ case: Case, channel_id: str, db_session: Session
+) -> list[Message]:
+ """
+ Creates a message with action buttons for a given case.
+
+ This function generates a message containing action buttons for a specific case by fetching the first signal instance
+ associated with the case and creating metadata blocks for the message.
+
+ Args:
+ case (Case): The case object for which to create the action buttons message.
+ channel_id (str): The ID of the Slack channel where the message will be sent.
+ db_session (Session): The database session to use for querying signal instances.
+
+ Returns:
+ list[Message]: A list of Message objects representing the structure of the Slack messages.
+ """
+ # we fetch the first instance to get the organization slug and project id
+ instances = signal_service.get_instances_in_case(db_session=db_session, case_id=case.id)
+ (first_instance_id, first_instance_signal) = instances.first()
+
+ organization_slug = first_instance_signal.project.organization.slug
+ project_id = first_instance_signal.project.id
+ button_metadata = SubjectMetadata(
+ type=SignalSubjects.signal_instance,
+ organization_slug=organization_slug,
+ id=str(first_instance_id),
+ project_id=project_id,
+ channel_id=channel_id,
+ ).json()
+
+ case_button_metadata = SubjectMetadata(
+ type=CaseSubjects.case,
+ organization_slug=organization_slug,
+ id=case.id,
+ project_id=project_id,
+ channel_id=channel_id,
+ ).json()
+
+ # we create the response plan and the snooze buttons
+ elements = []
+
+ if first_instance_signal.external_url:
+ elements.append(
+ Button(
+ text="🔖 View Response Plan",
+ action_id="button-link",
+ url=first_instance_signal.external_url,
+ )
+ )
+
+ elements.extend(
+ [
+ Button(
+ text="💤 Snooze Alert",
+ action_id=SignalNotificationActions.snooze,
+ value=case_button_metadata,
+ ),
+ Button(
+ text="🤖 User MFA Challenge",
+ action_id=CaseNotificationActions.user_mfa,
+ value=case_button_metadata,
+ ),
+ ]
+ )
+
+ investigation_plugin = plugin_service.get_active_instance(
+ db_session=db_session,
+ project_id=case.project.id,
+ plugin_type="investigation-tooling",
+ )
+
+ if investigation_plugin:
+ elements.append(
+ Button(
+ text=":mag: Investigate",
+ action_id=CaseNotificationActions.investigate,
+ value=button_metadata,
+ )
+ )
+
+ # we create the signal metadata blocks
+ signal_metadata_blocks = [
+ Divider(),
+ Section(text="*Actions*"),
+ Actions(elements=elements),
+ Divider(),
+ ]
+
+ return Message(blocks=signal_metadata_blocks).build()["blocks"]
+
+
+def create_genai_signal_analysis_message(
+ case: Case,
+ db_session: Session,
+) -> tuple[str | dict[str, str], list[Block]]:
+ """
+ Generates a GenAI signal analysis message for a given case.
+
+ This function generates a GenAI signal analysis message for a specific case by creating metadata blocks
+ for the message and attempting to generate a case signal summary using the AI service.
+
+ Args:
+ case (Case): The case object for which to create the GenAI signal analysis message.
+ db_session (Session): The database session to use for querying and generating the case signal summary.
+
+ Returns:
+ tuple[str | dict[str, str], list[Block]]: A tuple containing the GenAI analysis message (either as a string or a dictionary)
+ and the updated list of signal metadata blocks with the GenAI analysis section appended.
+ """
+ signal_metadata_blocks = []
+ try:
+ response = ai_service.generate_case_signal_summary(case, db_session)
+ if s := response.summary:
+ summary = {
+ "Summary": s.summary,
+ "Historical Summary": s.historical_summary,
+ "Critical Analysis": s.critical_analysis,
+ "Recommendation": s.recommendation,
+ }
+ elif response.error_message:
+ summary = response.error_message
+ else:
+ summary = (
+ "We encountered an error while generating the GenAI analysis summary for this case."
+ )
+ except Exception as e:
+ summary = (
+ "We encountered an error while generating the GenAI analysis summary for this case."
+ )
+ log.warning(f"Error generating GenAI analysis summary for case {case.id}. Error: {e}")
+ return summary, create_genai_message_metadata_blocks(
+ title="GenAI Alert Analysis", blocks=signal_metadata_blocks, message=summary
+ )
+
+
+def create_signal_engagement_message(
+ case: Case,
+ channel_id: str,
+ engagement: SignalEngagement | None,
+ signal_instance: SignalInstance | None,
+ user_email: str,
+ engagement_status: SignalEngagementStatus = SignalEngagementStatus.new,
+) -> list[Block]:
+ """
+ Generate a list of blocks for a signal engagement message.
+
+ Args:
+ case (Case): The case object related to the signal instance.
+ channel_id (str): The ID of the Slack channel where the message will be sent.
+        engagement (SignalEngagement | None): The engagement object, which may carry context for the message.
+        signal_instance (SignalInstance | None): The signal instance object related to the engagement.
+        user_email (str): The email of the user being engaged.
+        engagement_status (SignalEngagementStatus): The current engagement status.
+
+ Returns:
+ list[Block]: A list of blocks representing the message structure for the engagement message.
+ """
+ button_metadata = EngagementMetadata(
+ id=case.id,
+ type=CaseSubjects.case,
+ organization_slug=case.project.organization.slug,
+ project_id=case.project.id,
+ channel_id=channel_id,
+ signal_instance_id=str(signal_instance.id) if signal_instance else "",
+ engagement_id=engagement.id if engagement else 0,
+ user=user_email,
+ ).json()
+
+ username, _ = user_email.split("@")
+ blocks = [
+ Section(
+ text=f"{engagement.message if engagement and engagement.message else 'No context provided for this alert.'}"
+ ),
+ ]
+
+ if engagement_status == SignalEngagementStatus.new:
+ blocks.extend(
+ [
+ Section(
+ text="Can you please confirm this was you and whether the behavior was expected?"
+ ),
+ Actions(
+ elements=[
+ Button(
+ text="Confirm",
+ style="primary",
+ action_id=SignalEngagementActions.approve,
+ value=button_metadata,
+ ),
+ Button(
+ text="Deny",
+ style="danger",
+ action_id=SignalEngagementActions.deny,
+ value=button_metadata,
+ ),
+ ]
+ ),
+ ]
+ )
+
+ elif engagement_status == SignalEngagementStatus.approved:
+ blocks.extend(
+ [
+ Section(text=f":white_check_mark: @{username} confirmed the behavior as expected."),
+ ]
+ )
+ else:
+ blocks.extend(
+ [
+ Section(
+                    text=f":warning: @{username} denied that the behavior was expected. Please investigate the case and escalate to an incident if necessary."
+ ),
+ ]
+ )
+
+ return Message(blocks=blocks).build()["blocks"]
+
+
+def create_manual_engagement_message(
+ case: Case,
+ channel_id: str,
+ user_email: str,
+ engagement_status: SignalEngagementStatus = SignalEngagementStatus.new,
+ user_id: str = "",
+ engagement: str = "",
+    thread_ts: str | None = None,
+) -> list[Block]:
+ """
+ Generate a list of blocks for a manual engagement message.
+
+ Args:
+ case (Case): The case object related to the engagement.
+ channel_id (str): The ID of the Slack channel where the message will be sent.
+        user_email (str): The email of the user being engaged.
+        engagement_status (SignalEngagementStatus): The current engagement status.
+        user_id (str): The Slack user ID to mention in the message.
+        engagement (str): The engagement text.
+        thread_ts (str): The timestamp of the thread to reply in, if any.
+
+ Returns:
+ list[Block]: A list of blocks representing the message structure for the engagement message.
+ """
+ button_metadata = EngagementMetadata(
+ id=case.id,
+ type=CaseSubjects.case,
+ organization_slug=case.project.organization.slug,
+ project_id=case.project.id,
+ channel_id=channel_id,
+ signal_instance_id="",
+ engagement_id=0,
+ user=user_email,
+ thread_id=thread_ts,
+ ).json()
+
+ username, _ = user_email.split("@")
+ if engagement:
+ blocks = [
+ Section(text=f"<@{user_id}>: {engagement}"),
+ ]
+ else:
+ blocks = []
+
+ if engagement_status == SignalEngagementStatus.new:
+ blocks.extend(
+ [
+ Actions(
+ elements=[
+ Button(
+ text="Confirm",
+ style="primary",
+ action_id=SignalEngagementActions.approve,
+ value=button_metadata,
+ ),
+ Button(
+ text="Deny",
+ style="danger",
+ action_id=SignalEngagementActions.deny,
+ value=button_metadata,
+ ),
+ ]
+ ),
+ ]
+ )
+
+ elif engagement_status == SignalEngagementStatus.approved:
+ blocks.extend(
+ [
+ Section(text=f":white_check_mark: @{username} confirmed the behavior as expected."),
+ ]
+ )
+ else:
+ blocks.extend(
+ [
+ Section(
+                    text=f":warning: @{username} denied that the behavior was expected. Please investigate the case and escalate to an incident if necessary."
+ ),
+ ]
+ )
+
+ return Message(blocks=blocks).build()["blocks"]
+
+
+def create_case_thread_migration_message(channel_weblink: str) -> list[Block]:
+ blocks = [
+ Context(
+ elements=[
+ f"This conversation has been migrated to a dedicated Case channel. All future updates and discussions will take place <{channel_weblink}|here>."
+ ]
+ ),
+ Divider(),
+ ]
+
+ return Message(blocks=blocks).build()["blocks"]
+
+
+def create_case_channel_migration_message(thread_weblink: str) -> list[Block]:
+ blocks = [
+ Context(
+ elements=[
+ f"Migrated Case conversation from the <{thread_weblink}|original Case thread>."
+ ]
+ ),
+ Divider(),
+ ]
+
+ return Message(blocks=blocks).build()["blocks"]
+
+
+def create_case_user_not_in_slack_workspace_message(user_email: str) -> list[Block]:
+ """
+ Creates a message indicating that a user identified in an alert is not a member of the Slack workspace.
+
+ Args:
+ user_email (str): The email of the user who is not in the Slack workspace.
+
+ Returns:
+ list[Block]: A list of blocks representing the message structure.
+ """
+ blocks = [
+ Context(
+ elements=[
+ f"Individual identified in the alert ({user_email}) is not a member of the Slack workspace. Please reach out to them via email or other means to resolve the alert."
+ ]
+ ),
+ ]
+
+ return Message(blocks=blocks).build()["blocks"]
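All of the helpers in this module funnel through blockkit's `Message(blocks=...).build()["blocks"]`. As a rough standalone sketch of the Slack Block Kit JSON those calls ultimately emit (hand-rolled dicts standing in for the blockkit classes; the weblink is illustrative):

```python
# Hand-rolled stand-ins for the blockkit Context/Divider classes used above;
# these emit the raw Slack Block Kit JSON the helpers ultimately return.
def context_block(text: str) -> dict:
    return {"type": "context", "elements": [{"type": "mrkdwn", "text": text}]}


def divider_block() -> dict:
    return {"type": "divider"}


def migration_blocks(channel_weblink: str) -> list[dict]:
    return [
        context_block(
            "This conversation has been migrated to a dedicated Case channel. "
            f"All future updates and discussions will take place <{channel_weblink}|here>."
        ),
        divider_block(),
    ]


blocks = migration_blocks("https://example.slack.com/archives/C123")
print(blocks[0]["type"], blocks[1]["type"])  # context divider
```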
diff --git a/src/dispatch/plugins/dispatch_slack/config.py b/src/dispatch/plugins/dispatch_slack/config.py
index 1f9a81197143..ce42fbc1b374 100644
--- a/src/dispatch/plugins/dispatch_slack/config.py
+++ b/src/dispatch/plugins/dispatch_slack/config.py
@@ -1,40 +1,172 @@
-from dispatch.config import config, Secret
-
-
-SLACK_APP_USER_SLUG = config("SLACK_APP_USER_SLUG")
-SLACK_WORKSPACE_NAME = config("SLACK_WORKSPACE_NAME")
-
-SLACK_API_BOT_TOKEN = config("SLACK_API_BOT_TOKEN", cast=Secret)
-SLACK_SIGNING_SECRET = config("SLACK_SIGNING_SECRET", cast=Secret)
-SLACK_USER_ID_OVERRIDE = config("SLACK_USER_ID_OVERRIDE", default=None)
-
-SLACK_COMMAND_MARK_ACTIVE_SLUG = config(
- "SLACK_COMMAND_MARK_ACTIVE_SLUG", default="/dispatch-mark-active"
-)
-SLACK_COMMAND_MARK_STABLE_SLUG = config(
- "SLACK_COMMAND_MARK_STABLE_SLUG", default="/dispatch-mark-stable"
-)
-SLACK_COMMAND_MARK_CLOSED_SLUG = config(
- "SLACK_COMMAND_MARK_CLOSED_SLUG", default="/dispatch-mark-closed"
-)
-SLACK_COMMAND_STATUS_REPORT_SLUG = config(
- "SLACK_COMMAND_STATUS_REPORT_SLUG", default="/dispatch-status-report"
-)
-SLACK_COMMAND_LIST_TASKS_SLUG = config(
- "SLACK_COMMAND_LIST_TASKS_SLUG", default="/dispatch-list-tasks"
-)
-SLACK_COMMAND_LIST_PARTICIPANTS_SLUG = config(
- "SLACK_COMMAND_LIST_PARTICIPANTS_SLUG", default="/dispatch-list-participants"
-)
-SLACK_COMMAND_ASSIGN_ROLE_SLUG = config(
- "SLACK_COMMAND_ASSIGN_ROLE_SLUG", default="/dispatch-assign-role"
-)
-SLACK_COMMAND_UPDATE_INCIDENT_SLUG = config(
- "SLACK_COMMAND_UPDATE_INCIDENT_SLUG", default="/dispatch-update-incident"
-)
-SLACK_COMMAND_ENGAGE_ONCALL_SLUG = config(
- "SLACK_COMMAND_ENGAGE_ONCALL_SLUG", default="/dispatch-engage-oncall"
-)
-SLACK_COMMAND_LIST_RESOURCES_SLUG = config(
- "SLACK_COMMAND_LIST_RESOURCES_SLUG", default="/dispatch-list-resources"
-)
+from pydantic import Field, SecretStr
+from dispatch.config import BaseConfigurationModel
+
+
+MAX_SECTION_TEXT_LENGTH = 2999
+
+
+class SlackConfiguration(BaseConfigurationModel):
+ """Slack configuration description."""
+
+ api_bot_token: SecretStr = Field(
+ title="API Bot Token", description="Token to use when plugin is in http/api mode."
+ )
+ socket_mode_app_token: SecretStr | None = Field(
+ title="Socket Mode App Token", description="Token used when plugin is in socket mode."
+ )
+ signing_secret: SecretStr = Field(
+ title="Signing Secret",
+ description="Secret used to validate incoming messages from the Slack events API.",
+ )
+
+
+class SlackContactConfiguration(SlackConfiguration):
+ """Slack contact configuration."""
+
+ profile_department_field_id: str | None = Field(
+ None,
+ title="Profile Department Field Id",
+        description="Defines the field in the Slack profile where Dispatch should fetch the user's department.",
+ )
+ profile_team_field_id: str | None = Field(
+ title="Profile Team Field Id",
+        description="Defines the field in the Slack profile where Dispatch should fetch the user's team.",
+ )
+ profile_weblink_field_id: str | None = Field(
+ title="Profile Weblink Field Id",
+        description="Defines the field in the Slack profile where Dispatch should fetch the user's weblink.",
+ )
+
+
+class SlackConversationConfiguration(SlackConfiguration):
+ """Slack conversation configuration."""
+
+ app_user_slug: str = Field(
+ title="App User Id",
+ description="Defines the user id of the Slack app in your environment. You can use Slack's tester endpoint auth.test to find the user id.",
+ )
+ private_channels: bool = Field(
+ True,
+ title="Private Channels",
+        description="The visibility of the Slack channel created by Dispatch.",
+ )
+ ban_threads: bool = Field(
+ True,
+ title="Ban Threads",
+ description="If enabled, Dispatch will message users reminding them to not use threads in incident channels.",
+ )
+ timeline_event_reaction: str = Field(
+ "stopwatch",
+ title="Timeline Event Reaction",
+        description="Defines the emoji that Dispatch will monitor for adding Slack messages to the timeline.",
+ )
+ slack_command_list_tasks: str = Field(
+ "/dispatch-list-tasks",
+ title="List Tasks Command String",
+ description="Defines the string used to list all tasks in an incident. Must match what is defined in Slack.",
+ )
+ slack_command_list_my_tasks: str = Field(
+ "/dispatch-list-my-tasks",
+ title="List My Tasks Command String",
+ description="Defines the string used to list a caller's tasks in an incident. Must match what is defined in Slack.",
+ )
+ slack_command_list_participants: str = Field(
+ "/dispatch-list-participants",
+ title="List Participants Command String",
+ description="Defines the string used to list all incident participants. Must match what is defined in Slack.",
+ )
+ slack_command_list_signals: str = Field(
+ "/dispatch-list-signals",
+ title="List Signals Command String",
+        description="Defines the string used to list all signals for the conversation where the command was run. Must match what is defined in Slack.",
+ )
+ slack_command_assign_role: str = Field(
+ "/dispatch-assign-role",
+ title="Assign Role Command String",
+ description="Defines the string used to assign a role in an incident. Must match what is defined in Slack.",
+ )
+ slack_command_update_incident: str = Field(
+ "/dispatch-update-incident",
+ title="Update Incident Command String",
+ description="Defines the string used to update an incident. Must match what is defined in Slack.",
+ )
+ slack_command_update_participant: str = Field(
+ "/dispatch-update-participant",
+ title="Update Participant Command String",
+ description="Defines the string used to update a participant. Must match what is defined in Slack.",
+ )
+ slack_command_engage_oncall: str = Field(
+ "/dispatch-engage-oncall",
+ title="Engage Oncall Command String",
+ description="Defines the string used to engage an oncall. Must match what is defined in Slack.",
+ )
+ slack_command_create_case: str = Field(
+ "/dispatch-create-case",
+ title="Create Case Command String",
+ description="Defines the string used to create a case. Must match what is defined in Slack.",
+ )
+ slack_command_update_case: str = Field(
+ "/dispatch-update-case",
+ title="Update Case Command String",
+ description="Defines the string used to update a case. Must match what is defined in Slack.",
+ )
+ slack_command_escalate_case: str = Field(
+ "/dispatch-escalate-case",
+        title="Escalate Case Command String",
+        description="Defines the string used to escalate a case to an incident. Only works from within a channel-based case. Must match what is defined in Slack.",
+ )
+ slack_command_report_incident: str = Field(
+ "/dispatch-report-incident",
+ title="Report Incident Command String",
+ description="Defines the string used to report an incident. Must match what is defined in Slack.",
+ )
+ slack_command_report_tactical: str = Field(
+ "/dispatch-report-tactical",
+ title="Report Tactical Command String",
+        description="Defines the string used to create a tactical report. Must match what is defined in Slack.",
+ )
+ slack_command_report_executive: str = Field(
+ "/dispatch-report-executive",
+ title="Report Executive Command String",
+ description="Defines the string used to create an executive report. Must match what is defined in Slack.",
+ )
+ slack_command_update_notifications_group: str = Field(
+ "/dispatch-notifications-group",
+ title="Update Notifications Group Command String",
+ description="Defines the string used to update the incident notification group. Must match what is defined in Slack.",
+ )
+ slack_command_add_timeline_event: str = Field(
+ "/dispatch-add-timeline-event",
+ title="Add Timeline Event Command String",
+        description="Defines the string used to add a new event to the timeline. Must match what is defined in Slack.",
+ )
+ slack_command_list_incidents: str = Field(
+ "/dispatch-list-incidents",
+ title="List Incidents Command String",
+ description="Defines the string used to list current active and stable incidents, and closed incidents in the last 24 hours. Must match what is defined in Slack.",
+ )
+ slack_command_run_workflow: str = Field(
+ "/dispatch-run-workflow",
+ title="Run Workflow Command String",
+ description="Defines the string used to run a workflow. Must match what is defined in Slack.",
+ )
+ slack_command_list_workflows: str = Field(
+ "/dispatch-list-workflows",
+ title="List Workflows Command String",
+        description="Defines the string used to list all available workflows. Must match what is defined in Slack.",
+ )
+ slack_command_create_task: str = Field(
+ "/dispatch-create-task",
+ title="Create Task Command String",
+ description="Defines the string used to create a task. Must match what is defined in Slack.",
+ )
+ slack_command_engage_user: str = Field(
+ "/dispatch-engage-user",
+ title="Engage User Command String",
+ description="Defines the string used to engage a user via MFA prompt. Must match what is defined in Slack.",
+ )
+ slack_command_summary: str = Field(
+ "/dispatch-summary",
+ title="Generate Summary Command String",
+ description="Defines the string used to generate a summary. Must match what is defined in Slack.",
+ )
diff --git a/src/dispatch/plugins/dispatch_slack/decorators.py b/src/dispatch/plugins/dispatch_slack/decorators.py
new file mode 100644
index 000000000000..50ebb3c72c32
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_slack/decorators.py
@@ -0,0 +1,61 @@
+import logging
+import inspect
+
+log = logging.getLogger(__name__)
+
+
+class MessageDispatcher:
+ """Dispatches current message to any registered function: https://github.com/slackapi/bolt-python/issues/786"""
+
+ registered_funcs = []
+
+ def add(self, *args, **kwargs):
+ """Adds a function to the dispatcher."""
+
+ def decorator(func):
+ if not kwargs.get("name"):
+ name = func.__name__
+ else:
+ name = kwargs.pop("name")
+
+ self.registered_funcs.append(
+ {
+ "name": name,
+ "func": func,
+ "subject": kwargs.pop("subject"),
+                    "exclude": kwargs.pop("exclude", {}),  # dict, e.g. {"subtype": [...]}; dispatch() reads it with .get
+ }
+ )
+            return func  # hand the function back so the decorated name stays callable
+
+        return decorator
+
+ def dispatch(self, *args, **kwargs):
+ """Runs all registered functions."""
+ for f in self.registered_funcs:
+ # only inject the args the function cares about
+ func_args = inspect.getfullargspec(inspect.unwrap(f["func"])).args
+ injected_args = (kwargs[a] for a in func_args)
+
+            if subject := f["subject"]:
+                subject_meta = kwargs.get("context", {}).get("subject")
+                if subject_meta and subject != subject_meta.type:
+                    log.debug(
+                        f"Skipping dispatched function due to subject exclusion. ({f['name']})"
+                    )
+                    continue
+
+ if exclude := f["exclude"]:
+ subtype: str = kwargs.get("body", {}).get("event", {}).get("subtype", "")
+ if subtype in exclude.get("subtype", []):
+ log.debug(f"Skipping dispatched function due to event exclusion. ({f['name']})")
+ continue
+
+ try:
+ f["func"](*injected_args)
+ except Exception as e:
+ log.exception(e)
+ log.debug(f"Failed to run dispatched function {f['name']}. Reason: ({e})")
+
+
+message_dispatcher = MessageDispatcher()
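A minimal, self-contained sketch of the register-then-dispatch pattern above (names are illustrative; note the decorator hands the function back so the decorated name remains usable):

```python
import inspect


class MiniDispatcher:
    """Stripped-down sketch of MessageDispatcher: register handlers, then
    call each one with only the keyword arguments it declares."""

    def __init__(self):
        self.registered_funcs = []

    def add(self, **kwargs):
        def decorator(func):
            self.registered_funcs.append(
                {"name": kwargs.get("name", func.__name__), "func": func}
            )
            return func  # keep the decorated name callable

        return decorator

    def dispatch(self, **kwargs):
        for f in self.registered_funcs:
            # inject only the args the handler cares about
            func_args = inspect.getfullargspec(f["func"]).args
            f["func"](*(kwargs[a] for a in func_args))


dispatcher = MiniDispatcher()
seen = []


@dispatcher.add(name="greeter")
def greet(body):
    seen.append(body["text"])


dispatcher.dispatch(body={"text": "hello"}, context={})
print(seen)  # ['hello']
```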
diff --git a/src/dispatch/plugins/dispatch_slack/endpoints.py b/src/dispatch/plugins/dispatch_slack/endpoints.py
new file mode 100644
index 000000000000..9ccdf425c215
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_slack/endpoints.py
@@ -0,0 +1,145 @@
+from http import HTTPStatus
+import json
+
+from fastapi import APIRouter, HTTPException, Depends
+from starlette.background import BackgroundTask
+from starlette.responses import JSONResponse
+from slack_sdk.signature import SignatureVerifier
+from sqlalchemy import true
+from starlette.requests import Request, Headers
+
+from dispatch.database.core import refetch_db_session
+from dispatch.plugin.models import Plugin, PluginInstance
+
+from .bolt import app
+from .case.interactive import configure as case_configure
+from .handler import SlackRequestHandler
+from .incident.interactive import configure as incident_configure
+from .feedback.interactive import configure as feedback_configure
+from .workflow import configure as workflow_configure
+from .messaging import get_incident_conversation_command_message
+
+router = APIRouter()
+
+
+async def get_body(request: Request):
+ return await request.body()
+
+
+async def parse_request(request: Request):
+ request_body_form = await request.form()
+ try:
+        payload = json.loads(request_body_form.get("payload"))
+    except Exception:
+        raise HTTPException(
+            status_code=HTTPStatus.BAD_REQUEST, detail=[{"msg": "Bad Request"}]
+        ) from None
+    return payload
+
+
+def is_current_configuration(
+ body: bytes, headers: Headers, plugin_instance: PluginInstance
+) -> bool:
+ """Uses the signing secret to determine which configuration to use."""
+
+ verifier = SignatureVerifier(
+ signing_secret=plugin_instance.configuration.signing_secret.get_secret_value()
+ )
+
+ return verifier.is_valid_request(body, headers)
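`SignatureVerifier.is_valid_request` implements Slack's v0 request-signing scheme: an HMAC-SHA256 over `v0:{timestamp}:{raw body}` keyed with the signing secret, compared against the `X-Slack-Signature` header. A standalone stdlib sketch of the same check (hypothetical helper, not the slack_sdk code):

```python
import hashlib
import hmac


def verify_slack_signature(signing_secret: str, timestamp: str, body: bytes, signature: str) -> bool:
    """Recompute Slack's v0 signature and compare in constant time."""
    basestring = f"v0:{timestamp}:{body.decode()}".encode()
    expected = "v0=" + hmac.new(signing_secret.encode(), basestring, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)


secret = "test-signing-secret"  # illustrative value
ts = "1618698901"
body = b"payload=%7B%7D"
good_sig = "v0=" + hmac.new(
    secret.encode(), f"v0:{ts}:{body.decode()}".encode(), hashlib.sha256
).hexdigest()

print(verify_slack_signature(secret, ts, body, good_sig))       # True
print(verify_slack_signature(secret, ts, body, "v0=deadbeef"))  # False
```

Trying each enabled plugin instance's secret this way is what lets one endpoint route requests to the right per-organization configuration.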
+
+
+def get_request_handler(request: Request, body: bytes, organization: str) -> SlackRequestHandler:
+ """Creates a slack request handler for use by the api."""
+ session = refetch_db_session(organization)
+ plugin_instances: list[PluginInstance] = (
+ session.query(PluginInstance)
+ .join(Plugin)
+ .filter(PluginInstance.enabled == true(), Plugin.slug == "slack-conversation")
+ .all()
+ )
+ for p in plugin_instances:
+ if is_current_configuration(body=body, headers=request.headers, plugin_instance=p):
+ case_configure(p.configuration)
+ feedback_configure(p.configuration)
+ incident_configure(p.configuration)
+ workflow_configure(p.configuration)
+ app._configuration = p.configuration
+ app._token = p.configuration.api_bot_token.get_secret_value()
+ app._signing_secret = p.configuration.signing_secret.get_secret_value()
+ session.close()
+ return SlackRequestHandler(app)
+
+ session.close()
+ raise HTTPException(
+ status_code=HTTPStatus.FORBIDDEN.value, detail=[{"msg": "Invalid request signature"}]
+ )
+
+
+@router.post(
+ "/slack/event",
+)
+async def slack_events(request: Request, organization: str, body: bytes = Depends(get_body)):
+ """Handle all incoming Slack events."""
+
+ handler = get_request_handler(request=request, body=body, organization=organization)
+ try:
+ body_json = json.loads(body)
+ # if we're getting the url verification request,
+ # handle it synchronously so that slack api verification works
+ if body_json.get("type") == "url_verification":
+ return handler.handle(req=request, body=body)
+ except json.JSONDecodeError:
+ pass
+
+ # otherwise, handle it asynchronously
+ task = BackgroundTask(handler.handle, req=request, body=body)
+ return JSONResponse(
+ background=task,
+ content=HTTPStatus.OK.phrase,
+ status_code=HTTPStatus.OK,
+ )
+
+
+@router.post(
+ "/slack/command",
+)
+async def slack_commands(organization: str, request: Request, body: bytes = Depends(get_body)):
+ """Handle all incoming Slack commands."""
+ # We build the background task
+ handler = get_request_handler(request=request, body=body, organization=organization)
+ task = BackgroundTask(
+ handler.handle,
+ req=request,
+ body=body,
+ )
+
+ # We get the name of command that was run
+ request_body_form = await request.form()
+    command = request_body_form.get("command")
+ message = get_incident_conversation_command_message(
+ config=app._configuration, command_string=command
+ )
+ return JSONResponse(
+ background=task,
+ content=message,
+ status_code=HTTPStatus.OK,
+ )
+
+
+@router.post(
+ "/slack/action",
+)
+async def slack_actions(request: Request, organization: str, body: bytes = Depends(get_body)):
+ """Handle all incoming Slack actions."""
+ handler = get_request_handler(request=request, body=body, organization=organization)
+ return handler.handle(req=request, body=body)
+
+
+@router.post(
+ "/slack/menu",
+)
+async def slack_menus(request: Request, organization: str, body: bytes = Depends(get_body)):
+ """Handle all incoming Slack menus."""
+ handler = get_request_handler(request=request, body=body, organization=organization)
+ return handler.handle(req=request, body=body)
diff --git a/src/dispatch/plugins/dispatch_slack/enums.py b/src/dispatch/plugins/dispatch_slack/enums.py
new file mode 100644
index 000000000000..6ca5a17d4851
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_slack/enums.py
@@ -0,0 +1,51 @@
+from dispatch.enums import DispatchEnum
+
+
+class SlackAPIGetEndpoints(DispatchEnum):
+ chat_permalink = "chat.getPermalink"
+ conversations_history = "conversations.history"
+ conversations_replies = "conversations.replies"
+ conversations_info = "conversations.info"
+ team_info = "team.info"
+ users_conversations = "users.conversations"
+ users_info = "users.info"
+ users_lookup_by_email = "users.lookupByEmail"
+ users_profile_get = "users.profile.get"
+ conversations_members = "conversations.members"
+
+
+class SlackAPIPostEndpoints(DispatchEnum):
+ bookmarks_add = "bookmarks.add"
+ canvas_access_set = "canvases.access.set"
+ canvas_create = "canvases.create"
+ canvas_delete = "canvases.delete"
+ canvas_update = "canvases.edit"
+ chat_post_message = "chat.postMessage"
+ chat_post_ephemeral = "chat.postEphemeral"
+ chat_update = "chat.update"
+ conversations_archive = "conversations.archive"
+ conversations_create = "conversations.create"
+ conversations_invite = "conversations.invite"
+ conversations_kick = "conversations.kick"
+ conversations_rename = "conversations.rename"
+ conversations_set_topic = "conversations.setTopic"
+ conversations_set_purpose = "conversations.setPurpose"
+ conversations_unarchive = "conversations.unarchive"
+ pins_add = "pins.add"
+
+
+class SlackAPIErrorCode(DispatchEnum):
+ ALREADY_IN_CHANNEL = "already_in_channel"
+ CHANNEL_NOT_ARCHIVED = "not_archived"
+ CHANNEL_NOT_FOUND = "channel_not_found"
+ FATAL_ERROR = "fatal_error"
+ IS_ARCHIVED = "is_archived" # Channel is archived
+ MISSING_SCOPE = "missing_scope"
+ NOT_IN_CHANNEL = "not_in_channel"
+ ORG_USER_NOT_IN_TEAM = "org_user_not_in_team"
+ USERS_NOT_FOUND = "users_not_found"
+ USER_IN_CHANNEL = "user_in_channel"
+ USER_NOT_FOUND = "user_not_found"
+ USER_NOT_IN_CHANNEL = "user_not_in_channel"
+ VIEW_EXPIRED = "expired_trigger_id" # The provided trigger_id is no longer valid
+ VIEW_NOT_FOUND = "not_found" # Could not find corresponding view for the provided view_id
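These codes are most useful as string-valued enums because (assuming `DispatchEnum` mixes in `str`, as is common for this pattern) members compare equal to the raw `error` strings in Slack API responses:

```python
from enum import Enum


class SlackAPIErrorCodeSketch(str, Enum):
    # a couple of the codes defined above, as a str-mixin enum sketch
    CHANNEL_NOT_FOUND = "channel_not_found"
    USER_NOT_IN_CHANNEL = "user_not_in_channel"


# a str-mixin enum member compares equal to the raw string Slack returns
api_error = "channel_not_found"
print(api_error == SlackAPIErrorCodeSketch.CHANNEL_NOT_FOUND)    # True
print(api_error == SlackAPIErrorCodeSketch.USER_NOT_IN_CHANNEL)  # False
```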
diff --git a/src/dispatch/plugins/dispatch_slack/events.py b/src/dispatch/plugins/dispatch_slack/events.py
new file mode 100644
index 000000000000..4a16cc49d5a1
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_slack/events.py
@@ -0,0 +1,67 @@
+import logging
+from slack_sdk import WebClient
+
+from dispatch.plugins.base import IPluginEvent
+
+from .service import (
+ get_channel_activity,
+ get_thread_activity,
+)
+
+log = logging.getLogger(__name__)
+
+
+class SlackPluginEvent(IPluginEvent):
+ def fetch_activity(self):
+ raise NotImplementedError
+
+
+class ChannelActivityEvent(SlackPluginEvent):
+ name = "Slack Channel Activity"
+ slug = "slack-channel-activity"
+ description = "Analyzes incident/case activity within a specific Slack channel.\n \
+ By periodically polling channel messages, this gathers insights into the \
+ activity and engagement levels of each participant."
+
+ def fetch_activity(self, client: WebClient, subject: None, oldest: str = "0") -> list:
+ if not subject:
+ log.warning("No subject provided. Cannot fetch channel activity.")
+ elif not subject.conversation:
+ log.info("No conversation provided. Cannot fetch channel activity.")
+ elif not subject.conversation.channel_id:
+ log.info("No channel id provided. Cannot fetch channel activity.")
+ elif subject.conversation.thread_id:
+ log.info(
+ "Subject is a thread, not a channel. Fetching channel activity is not applicable for threads."
+ )
+ else:
+ return get_channel_activity(
+ client, conversation_id=subject.conversation.channel_id, oldest=oldest
+ )
+ return []
+
+
+class ThreadActivityEvent(SlackPluginEvent):
+ name = "Slack Thread Activity"
+ slug = "slack-thread-activity"
+ description = "Analyzes incident/case activity within a specific Slack thread.\n \
+ By periodically polling thread replies, this gathers insights \
+ into the activity and engagement levels of each participant."
+
+ def fetch_activity(self, client: WebClient, subject: None, oldest: str = "0") -> list:
+ if not subject:
+ log.warning("No subject provided. Cannot fetch thread activity.")
+ elif not subject.conversation:
+ log.info("No conversation provided. Cannot fetch thread activity.")
+ elif not subject.conversation.channel_id:
+ log.info("No channel id provided. Cannot fetch thread activity.")
+ elif not subject.conversation.thread_id:
+ log.info("No thread id provided. Cannot fetch thread activity.")
+ else:
+ return get_thread_activity(
+ client,
+ conversation_id=subject.conversation.channel_id,
+ ts=subject.conversation.thread_id,
+ oldest=oldest,
+ )
+ return []
diff --git a/src/dispatch/plugins/dispatch_slack/exceptions.py b/src/dispatch/plugins/dispatch_slack/exceptions.py
new file mode 100644
index 000000000000..050eded8ef81
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_slack/exceptions.py
@@ -0,0 +1,31 @@
+from dispatch.exceptions import DispatchException
+
+
+class BotNotPresentError(DispatchException):
+ code = "bot_not_present"
+ msg_template = "{msg}"
+
+
+class CommandError(DispatchException):
+ code = "command"
+ msg_template = "{msg}"
+
+
+class ContextError(DispatchException):
+ code = "context"
+ msg_template = "{msg}"
+
+
+class EventError(DispatchException):
+    code = "event"
+ msg_template = "{msg}"
+
+
+class RoleError(DispatchException):
+ code = "role"
+ msg_template = "{msg}"
+
+
+class SubmissionError(DispatchException):
+ code = "submission"
+ msg_template = "{msg}"
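Each exception carries a `code` and a `msg_template`, suggesting a base class that formats the template with keyword arguments at raise time. A stdlib stand-in for that pattern (the base-class behavior here is assumed, not taken from `DispatchException` itself):

```python
class SketchDispatchError(Exception):
    """Stand-in base: format msg_template with the kwargs passed at raise time."""

    code = "unknown"
    msg_template = "{msg}"

    def __init__(self, **kwargs):
        super().__init__(self.msg_template.format(**kwargs))


class SketchRoleError(SketchDispatchError):
    code = "role"


error_text = ""
try:
    raise SketchRoleError(msg="User does not have the Incident Commander role.")
except SketchDispatchError as e:
    error_text = f"[{e.code}] {e}"

print(error_text)  # [role] User does not have the Incident Commander role.
```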
diff --git a/src/dispatch/plugins/dispatch_slack/feedback/enums.py b/src/dispatch/plugins/dispatch_slack/feedback/enums.py
new file mode 100644
index 000000000000..162e8f839480
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_slack/feedback/enums.py
@@ -0,0 +1,55 @@
+from dispatch.conversation.enums import ConversationButtonActions
+from dispatch.enums import DispatchEnum
+
+
+class IncidentFeedbackNotificationBlockIds(DispatchEnum):
+ feedback_input = "incident-feedback-notification-feedback-input"
+ rating_select = "incident-feedback-notification-rating-select"
+ anonymous_checkbox = "incident-feedback-notification-anonymous-checkbox"
+
+
+class CaseFeedbackNotificationBlockIds(DispatchEnum):
+ feedback_input = "case-feedback-notification-feedback-input"
+ rating_select = "case-feedback-notification-rating-select"
+ anonymous_checkbox = "case-feedback-notification-anonymous-checkbox"
+
+
+class IncidentFeedbackNotificationActionIds(DispatchEnum):
+ feedback_input = "incident-feedback-notification-feedback-input"
+ rating_select = "incident-feedback-notification-rating-select"
+ anonymous_checkbox = "incident-feedback-notification-anonymous-checkbox"
+
+
+class CaseFeedbackNotificationActionIds(DispatchEnum):
+ feedback_input = "case-feedback-notification-feedback-input"
+ rating_select = "case-feedback-notification-rating-select"
+ anonymous_checkbox = "case-feedback-notification-anonymous-checkbox"
+
+
+class IncidentFeedbackNotificationActions(DispatchEnum):
+ submit = "incident-feedback-notification-submit"
+ provide = ConversationButtonActions.feedback_notification_provide
+
+
+class CaseFeedbackNotificationActions(DispatchEnum):
+ submit = "case-feedback-notification-submit"
+ provide = ConversationButtonActions.case_feedback_notification_provide
+
+
+class ServiceFeedbackNotificationBlockIds(DispatchEnum):
+ anonymous_checkbox = "service-feedback-notification-anonymous-checkbox"
+ feedback_input = "service-feedback-notification-feedback-input"
+ hours_input = "service-feedback-notification-hours-input"
+ rating_select = "service-feedback-notification-rating-select"
+
+
+class ServiceFeedbackNotificationActionIds(DispatchEnum):
+ anonymous_checkbox = "service-feedback-notification-anonymous-checkbox"
+ feedback_input = "service-feedback-notification-feedback-input"
+ hours_input = "service-feedback-notification-hours-input"
+ rating_select = "service-feedback-notification-rating-select"
+
+
+class ServiceFeedbackNotificationActions(DispatchEnum):
+ provide = ConversationButtonActions.service_feedback
+ submit = "service-feedback-notification-submit"
diff --git a/src/dispatch/plugins/dispatch_slack/feedback/interactive.py b/src/dispatch/plugins/dispatch_slack/feedback/interactive.py
new file mode 100644
index 000000000000..258fc6fb7840
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_slack/feedback/interactive.py
@@ -0,0 +1,579 @@
+import logging
+import ast
+
+from blockkit import (
+ Checkboxes,
+ Context,
+ Input,
+ MarkdownText,
+ Modal,
+ PlainOption,
+ PlainTextInput,
+ Section,
+)
+from slack_bolt import Ack, BoltContext, Respond
+from slack_sdk.web.client import WebClient
+from sqlalchemy.orm import Session
+from datetime import datetime
+
+from dispatch.auth.models import DispatchUser
+from dispatch.feedback.incident import service as subject_feedback_service
+from dispatch.feedback.incident.enums import FeedbackRating
+from dispatch.feedback.incident.models import FeedbackCreate
+from dispatch.feedback.service import service as feedback_service
+from dispatch.individual import service as individual_service
+from dispatch.feedback.service.models import ServiceFeedbackRating, ServiceFeedbackCreate
+from dispatch.incident import service as incident_service
+from dispatch.case import service as case_service
+from dispatch.participant import service as participant_service
+from dispatch.feedback.service.reminder import service as reminder_service
+from dispatch.plugin import service as plugin_service
+from dispatch.project import service as project_service
+from dispatch.plugins.dispatch_slack.bolt import app
+from dispatch.plugins.dispatch_slack.fields import static_select_block
+from dispatch.plugins.dispatch_slack.middleware import (
+ action_context_middleware,
+ button_context_middleware,
+ db_middleware,
+ modal_submit_middleware,
+ user_middleware,
+)
+
+from .enums import (
+ IncidentFeedbackNotificationActionIds,
+ IncidentFeedbackNotificationActions,
+ IncidentFeedbackNotificationBlockIds,
+ ServiceFeedbackNotificationActionIds,
+ ServiceFeedbackNotificationActions,
+ ServiceFeedbackNotificationBlockIds,
+ CaseFeedbackNotificationActionIds,
+ CaseFeedbackNotificationActions,
+ CaseFeedbackNotificationBlockIds,
+)
+from dispatch.messaging.strings import (
+ ONCALL_SHIFT_FEEDBACK_RECEIVED,
+ MessageType,
+)
+
+log = logging.getLogger(__name__)
+
+
+def configure(config):
+ """Placeholder configure function."""
+ pass
+
+
+# Incident Feedback
+
+
+def rating_select(
+ action_id: str = IncidentFeedbackNotificationActionIds.rating_select,
+ block_id: str = IncidentFeedbackNotificationBlockIds.rating_select,
+ initial_option: dict = None,
+ label: str = "Rate your experience",
+ **kwargs,
+):
+ rating_options = [{"text": r.value, "value": r.value} for r in FeedbackRating]
+ return static_select_block(
+ action_id=action_id,
+ block_id=block_id,
+ initial_option=initial_option,
+ label=label,
+ options=rating_options,
+ placeholder="Select a rating",
+ **kwargs,
+ )
+
+
+def feedback_input(
+ action_id: str = IncidentFeedbackNotificationActionIds.feedback_input,
+ block_id: str = IncidentFeedbackNotificationBlockIds.feedback_input,
+ initial_value: str = None,
+ label: str = "Give us feedback",
+ **kwargs,
+):
+ return Input(
+ block_id=block_id,
+ element=PlainTextInput(
+ action_id=action_id,
+ initial_value=initial_value,
+ multiline=True,
+ placeholder="How would you describe your experience?",
+ ),
+ label=label,
+ **kwargs,
+ )
+
+
+def anonymous_checkbox(
+ action_id: str = IncidentFeedbackNotificationActionIds.anonymous_checkbox,
+ block_id: str = IncidentFeedbackNotificationBlockIds.anonymous_checkbox,
+ initial_value: str = None,
+ label: str = "Check the box if you wish to provide your feedback anonymously",
+ **kwargs,
+):
+ options = [PlainOption(text="Anonymize my feedback", value="anonymous")]
+ return Input(
+ block_id=block_id,
+ element=Checkboxes(options=options, action_id=action_id),
+ label=label,
+ optional=True,
+ **kwargs,
+ )
+
+
+@app.action(
+ IncidentFeedbackNotificationActions.provide,
+ middleware=[button_context_middleware, db_middleware],
+)
+def handle_incident_feedback_direct_message_button_click(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ respond: Respond,
+ db_session: Session,
+ context: BoltContext,
+):
+ """Handles the feedback button in the feedback direct message."""
+ ack()
+ incident = incident_service.get(db_session=db_session, incident_id=context["subject"].id)
+
+ if not incident:
+ message = (
+ "Sorry, you cannot submit feedback about this incident. The incident does not exist."
+ )
+ respond(message=message, ephemeral=True)
+ return
+
+ blocks = [
+ Context(
+ elements=[
+            MarkdownText(text="Use this form to rate your experience with the incident.")
+ ]
+ ),
+ rating_select(),
+ feedback_input(),
+ anonymous_checkbox(),
+ ]
+
+ modal = Modal(
+ title="Incident Feedback",
+ blocks=blocks,
+ submit="Submit",
+ close="Cancel",
+ callback_id=IncidentFeedbackNotificationActions.submit,
+ private_metadata=context["subject"].json(),
+ ).build()
+
+ client.views_open(trigger_id=body["trigger_id"], view=modal)
+
+
+def ack_incident_feedback_submission_event(ack: Ack) -> None:
+ """Handles the feedback submission event acknowledgement."""
+ modal = Modal(
+ title="Incident Feedback", close="Close", blocks=[Section(text="Submitting feedback...")]
+ ).build()
+ ack(response_action="update", view=modal)
+
+
+@app.view(
+ IncidentFeedbackNotificationActions.submit,
+ middleware=[action_context_middleware, db_middleware, user_middleware, modal_submit_middleware],
+)
+def handle_incident_feedback_submission_event(
+ ack: Ack,
+ body: dict,
+ context: BoltContext,
+ user: DispatchUser,
+ client: WebClient,
+ db_session: Session,
+ form_data: dict,
+):
+ # TODO: handle multiple organizations during submission
+ ack_incident_feedback_submission_event(ack=ack)
+ incident = incident_service.get(db_session=db_session, incident_id=context["subject"].id)
+
+ feedback = form_data.get(IncidentFeedbackNotificationBlockIds.feedback_input, "")
+ rating = form_data.get(IncidentFeedbackNotificationBlockIds.rating_select, {}).get("value")
+
+ feedback_in = FeedbackCreate(
+ rating=rating, feedback=feedback, project=incident.project, incident=incident
+ )
+ feedback = subject_feedback_service.create(db_session=db_session, feedback_in=feedback_in)
+ incident.feedback.append(feedback)
+
+    # the anonymous checkbox key is only present in the form data when it was checked
+ if not form_data.get(IncidentFeedbackNotificationBlockIds.anonymous_checkbox):
+ participant = participant_service.get_by_incident_id_and_email(
+ db_session=db_session, incident_id=context["subject"].id, email=user.email
+ )
+ participant.feedback.append(feedback)
+ db_session.add(participant)
+
+ db_session.add(incident)
+ db_session.commit()
+
+ modal = Modal(
+ title="Incident Feedback",
+ close="Close",
+ blocks=[Section(text="Submitting feedback... Success!")],
+ ).build()
+
+ client.views_update(
+ view_id=body["view"]["id"],
+ view=modal,
+ )
+
+
+# Oncall Shift Feedback
+
+
+def oncall_shift_feedback_rating_select(
+ action_id: str = ServiceFeedbackNotificationActionIds.rating_select,
+ block_id: str = ServiceFeedbackNotificationBlockIds.rating_select,
+ initial_option: dict = None,
+ label: str = "When you consider the whole of the past shift, how much 'mental and emotional effort' did you dedicate toward incident response?",
+ **kwargs,
+):
+ rating_options = [{"text": r.value, "value": r.value} for r in ServiceFeedbackRating]
+ return static_select_block(
+ action_id=action_id,
+ block_id=block_id,
+ initial_option=initial_option,
+ label=label,
+ options=rating_options,
+ placeholder="Select a rating",
+ **kwargs,
+ )
+
+
+def oncall_shift_feedback_hours_input(
+ action_id: str = ServiceFeedbackNotificationActionIds.hours_input,
+ block_id: str = ServiceFeedbackNotificationBlockIds.hours_input,
+ initial_value: str = None,
+ label: str = "Please estimate the number of 'off hours' you spent on incident response tasks during this shift.",
+ placeholder: str = "Provide a number of hours",
+ **kwargs,
+):
+ return Input(
+ block_id=block_id,
+ element=PlainTextInput(
+ action_id=action_id,
+ initial_value=initial_value,
+ multiline=False,
+ placeholder=placeholder,
+ ),
+ label=label,
+ **kwargs,
+ )
+
+
+def oncall_shift_feedback_input(
+ action_id: str = ServiceFeedbackNotificationActionIds.feedback_input,
+ block_id: str = ServiceFeedbackNotificationBlockIds.feedback_input,
+ initial_value: str = None,
+ label: str = "Describe your experience.",
+ **kwargs,
+):
+ return Input(
+ block_id=block_id,
+ element=PlainTextInput(
+ action_id=action_id,
+ initial_value=initial_value,
+ multiline=True,
+ placeholder="How would you describe your experience?",
+ ),
+ optional=True,
+ label=label,
+ **kwargs,
+ )
+
+
+def oncall_shift_feedback_anonymous_checkbox(
+ action_id: str = ServiceFeedbackNotificationActionIds.anonymous_checkbox,
+ block_id: str = ServiceFeedbackNotificationBlockIds.anonymous_checkbox,
+ initial_value: str = None,
+ label: str = "Check this box if you wish to provide your feedback anonymously.",
+ **kwargs,
+):
+ options = [PlainOption(text="Anonymize my feedback", value="anonymous")]
+ return Input(
+ block_id=block_id,
+ element=Checkboxes(options=options, action_id=action_id),
+ label=label,
+ optional=True,
+ **kwargs,
+ )
+
+
+@app.action(
+ ServiceFeedbackNotificationActions.provide,
+ middleware=[db_middleware],
+)
+def handle_oncall_shift_feedback_direct_message_button_click(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ respond: Respond,
+ db_session: Session,
+ context: BoltContext,
+):
+ """Handles the feedback button in the oncall shift feedback direct message."""
+ ack()
+
+ metadata = body["actions"][0]["value"]
+
+ blocks = [
+ Context(
+ elements=[
+ MarkdownText(
+ text="Help us understand the impact of your on-call shift. Use this form to provide feedback."
+ )
+ ]
+ ),
+ oncall_shift_feedback_rating_select(),
+ oncall_shift_feedback_hours_input(),
+ oncall_shift_feedback_input(),
+ oncall_shift_feedback_anonymous_checkbox(),
+ ]
+
+ modal = Modal(
+ title="Oncall Shift Feedback",
+ blocks=blocks,
+ submit="Submit",
+ close="Cancel",
+ callback_id=ServiceFeedbackNotificationActions.submit,
+ private_metadata=metadata,
+ ).build()
+
+ client.views_open(trigger_id=body["trigger_id"], view=modal)
+
+
+def ack_oncall_shift_feedback_submission_event(ack: Ack) -> None:
+ """Handles the oncall shift feedback submission event acknowledgement."""
+ modal = Modal(
+ title="Oncall Shift Feedback",
+ close="Close",
+ blocks=[Section(text="Submitting feedback...")],
+ ).build()
+ ack(response_action="update", view=modal)
+
+
+def ack_with_error(ack: Ack) -> None:
+ """Handles the oncall shift feedback submission form validation."""
+ ack(
+ response_action="errors",
+ errors={
+ ServiceFeedbackNotificationBlockIds.hours_input: "The number of hours field must be numeric"
+ },
+ )
+
+
+@app.view(
+ ServiceFeedbackNotificationActions.submit,
+ middleware=[db_middleware, user_middleware, modal_submit_middleware],
+)
+def handle_oncall_shift_feedback_submission_event(
+ ack: Ack,
+ body: dict,
+ context: BoltContext,
+ user: DispatchUser,
+ client: WebClient,
+ db_session: Session,
+ form_data: dict,
+):
+ hours = form_data.get(ServiceFeedbackNotificationBlockIds.hours_input, "")
+ if not hours.replace(".", "", 1).isdigit():
+ ack_with_error(ack=ack)
+ return
+
+ ack_oncall_shift_feedback_submission_event(ack=ack)
+
+ feedback = form_data.get(ServiceFeedbackNotificationBlockIds.feedback_input, "")
+ rating = form_data.get(ServiceFeedbackNotificationBlockIds.rating_select, {}).get("value")
+
+ # metadata is organization_slug|project_id|schedule_id|shift_end_at|reminder_id|details
+ metadata = body["view"]["private_metadata"].split("|")
+ project_id = metadata[1]
+ schedule_id = metadata[2]
+ shift_end_raw = metadata[3]
+ shift_end_at = (
+ datetime.strptime(shift_end_raw, "%Y-%m-%dT%H:%M:%SZ")
+ if "T" in shift_end_raw
+ else datetime.strptime(shift_end_raw, "%Y-%m-%d %H:%M:%S")
+ )
+    # if there's a reminder id, delete the reminder
+    details = None
+    if len(metadata) > 4:
+        reminder_id = metadata[4]
+        if reminder_id.isnumeric():
+            reminder_service.delete(db_session=db_session, reminder_id=reminder_id)
+        if len(metadata) > 5:
+            # if there are other details, store those
+            details = ast.literal_eval(metadata[5])
+
+ individual = (
+ None
+ if form_data.get(ServiceFeedbackNotificationBlockIds.anonymous_checkbox)
+ else individual_service.get_by_email_and_project(
+ db_session=db_session, email=user.email, project_id=project_id
+ )
+ )
+
+ project = project_service.get(db_session=db_session, project_id=project_id)
+
+ service_feedback = ServiceFeedbackCreate(
+ feedback=feedback,
+ hours=hours,
+ individual=individual,
+ rating=ServiceFeedbackRating(rating),
+ schedule=schedule_id,
+ shift_end_at=shift_end_at,
+ project=project,
+ details=details,
+ )
+
+ service_feedback = feedback_service.create(
+ db_session=db_session, service_feedback_in=service_feedback
+ )
+
+ modal = Modal(
+ title="Oncall Shift Feedback",
+ close="Close",
+ blocks=[Section(text="Submitting feedback... Success!")],
+ ).build()
+
+ client.views_update(
+ view_id=body["view"]["id"],
+ view=modal,
+ )
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=project_id, plugin_type="conversation"
+ )
+
+ if plugin:
+ notification_text = "Oncall Shift Feedback Received"
+ notification_template = ONCALL_SHIFT_FEEDBACK_RECEIVED
+ items = [
+ {
+ "shift_end_at": shift_end_at,
+ }
+ ]
+ try:
+ plugin.instance.send_direct(
+ user.email,
+ notification_text,
+ notification_template,
+ MessageType.service_feedback,
+ items=items,
+ )
+ except Exception as e:
+ log.exception(e)
+
+
+@app.action(
+ CaseFeedbackNotificationActions.provide,
+ middleware=[button_context_middleware, db_middleware],
+)
+def handle_case_feedback_direct_message_button_click(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ respond: Respond,
+ db_session: Session,
+ context: BoltContext,
+):
+ """Handles the feedback button in the feedback direct message."""
+ ack()
+ case = case_service.get(db_session=db_session, case_id=context["subject"].id)
+
+ if not case:
+ message = "Sorry, you cannot submit feedback about this case. The case does not exist."
+ respond(message=message, ephemeral=True)
+ return
+
+ blocks = [
+ Context(
+            elements=[MarkdownText(text="Use this form to rate your experience with the case.")]
+ ),
+ rating_select(
+ action_id=CaseFeedbackNotificationActionIds.rating_select,
+ block_id=CaseFeedbackNotificationBlockIds.rating_select,
+ ),
+ feedback_input(
+ action_id=CaseFeedbackNotificationActionIds.feedback_input,
+ block_id=CaseFeedbackNotificationBlockIds.feedback_input,
+ ),
+ anonymous_checkbox(
+ action_id=CaseFeedbackNotificationActionIds.anonymous_checkbox,
+ block_id=CaseFeedbackNotificationBlockIds.anonymous_checkbox,
+ ),
+ ]
+
+ modal = Modal(
+ title="Case Feedback",
+ blocks=blocks,
+ submit="Submit",
+ close="Cancel",
+ callback_id=CaseFeedbackNotificationActions.submit,
+ private_metadata=context["subject"].json(),
+ ).build()
+
+ client.views_open(trigger_id=body["trigger_id"], view=modal)
+
+
+def ack_case_feedback_submission_event(ack: Ack) -> None:
+ """Handles the feedback submission event acknowledgement."""
+ modal = Modal(
+ title="Case Feedback", close="Close", blocks=[Section(text="Submitting feedback...")]
+ ).build()
+ ack(response_action="update", view=modal)
+
+
+@app.view(
+ CaseFeedbackNotificationActions.submit,
+ middleware=[action_context_middleware, db_middleware, user_middleware, modal_submit_middleware],
+)
+def handle_case_feedback_submission_event(
+ ack: Ack,
+ body: dict,
+ context: BoltContext,
+ user: DispatchUser,
+ client: WebClient,
+ db_session: Session,
+ form_data: dict,
+):
+ # TODO: handle multiple organizations during submission
+ ack_case_feedback_submission_event(ack=ack)
+ case = case_service.get(db_session=db_session, case_id=context["subject"].id)
+
+ feedback = form_data.get(CaseFeedbackNotificationBlockIds.feedback_input, "")
+ rating = form_data.get(CaseFeedbackNotificationBlockIds.rating_select, {}).get("value")
+
+ feedback_in = FeedbackCreate(rating=rating, feedback=feedback, project=case.project, case=case)
+ feedback = subject_feedback_service.create(db_session=db_session, feedback_in=feedback_in)
+ case.feedback.append(feedback)
+
+    # the anonymous checkbox key is only present in the form data when it was checked
+ if not form_data.get(CaseFeedbackNotificationBlockIds.anonymous_checkbox):
+ participant = participant_service.get_by_case_id_and_email(
+ db_session=db_session, case_id=context["subject"].id, email=user.email
+ )
+ participant.feedback.append(feedback)
+ db_session.add(participant)
+
+ db_session.add(case)
+ db_session.commit()
+
+ modal = Modal(
+ title="Case Feedback",
+ close="Close",
+ blocks=[Section(text="Submitting feedback... Success!")],
+ ).build()
+
+ client.views_update(
+ view_id=body["view"]["id"],
+ view=modal,
+ )
diff --git a/src/dispatch/plugins/dispatch_slack/fields.py b/src/dispatch/plugins/dispatch_slack/fields.py
new file mode 100644
index 000000000000..888e6cd505cd
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_slack/fields.py
@@ -0,0 +1,815 @@
+import logging
+from datetime import timedelta
+from sqlalchemy.orm import Session
+
+from blockkit import (
+ Checkboxes,
+ DatePicker,
+ Input,
+ MultiExternalSelect,
+ MultiStaticSelect,
+ PlainOption,
+ PlainTextInput,
+ StaticSelect,
+)
+
+from dispatch.case.enums import CaseResolutionReason, CaseStatus
+from dispatch.case.priority import service as case_priority_service
+from dispatch.case.severity import service as case_severity_service
+from dispatch.case.type import service as case_type_service
+from dispatch.entity import service as entity_service
+from dispatch.enums import DispatchEnum, Visibility
+from dispatch.incident.enums import IncidentStatus
+from dispatch.incident.priority import service as incident_priority_service
+from dispatch.incident.severity import service as incident_severity_service
+from dispatch.incident.type import service as incident_type_service
+from dispatch.participant.models import Participant
+from dispatch.plugins.dispatch_slack.config import MAX_SECTION_TEXT_LENGTH
+from dispatch.project import service as project_service
+from dispatch.signal.models import Signal
+
+log = logging.getLogger(__name__)
+
+
+class DefaultBlockIds(DispatchEnum):
+ add_user_actions = "add-user-actions"
+ date_picker_input = "date-picker-input"
+ description_input = "description-input"
+ hour_picker_input = "hour-picker-input"
+ minute_picker_input = "minute-picker-input"
+ project_select = "project-select"
+ relative_date_picker_input = "relative-date-picker-input"
+ resolution_input = "resolution-input"
+ timezone_picker_input = "timezone-picker-input"
+ title_input = "title-input"
+
+ # incidents
+ incident_priority_select = "incident-priority-select"
+ incident_status_select = "incident-status-select"
+ incident_severity_select = "incident-severity-select"
+ incident_type_select = "incident-type-select"
+
+ # cases
+ case_priority_select = "case-priority-select"
+ case_resolution_reason_select = "case-resolution-reason-select"
+ case_status_select = "case-status-select"
+ case_severity_select = "case-severity-select"
+ case_type_select = "case-type-select"
+ case_visibility_select = "case-visibility-select"
+ case_assignee_select = "case-assignee-select"
+
+ # entities
+ entity_select = "entity-select"
+
+ # participants
+ participant_select = "participant-select"
+
+ # signals
+ signal_definition_select = "signal-definition-select"
+ extension_request_checkbox = "extension_request_checkbox"
+
+ # tags
+ tags_multi_select = "tag-multi-select"
+
+
+class DefaultActionIds(DispatchEnum):
+ date_picker_input = "date-picker-input"
+ description_input = "description-input"
+ hour_picker_input = "hour-picker-input"
+ minute_picker_input = "minute-picker-input"
+ project_select = "project-select"
+ relative_date_picker_input = "relative-date-picker-input"
+ resolution_input = "resolution-input"
+ timezone_picker_input = "timezone-picker-input"
+ title_input = "title-input"
+
+ # incidents
+ incident_priority_select = "incident-priority-select"
+ incident_status_select = "incident-status-select"
+ incident_severity_select = "incident-severity-select"
+ incident_type_select = "incident-type-select"
+
+ # cases
+ case_resolution_reason_select = "case-resolution-reason-select"
+ case_priority_select = "case-priority-select"
+ case_status_select = "case-status-select"
+ case_severity_select = "case-severity-select"
+ case_type_select = "case-type-select"
+ case_visibility_select = "case-visibility-select"
+
+ # entities
+ entity_select = "entity-select"
+
+ # participants
+ participant_select = "participant-select"
+
+ # signals
+ signal_definition_select = "signal-definition-select"
+ extension_request_checkbox = "extension_request_checkbox"
+
+ # tags
+ tags_multi_select = "tag-multi-select"
+
+
+class TimezoneOptions(DispatchEnum):
+ local = "Local Time (based on your Slack profile)"
+ utc = "UTC"
+
+
+def relative_date_picker_input(
+ action_id: str = DefaultActionIds.relative_date_picker_input,
+ block_id: str = DefaultBlockIds.relative_date_picker_input,
+ initial_option: dict = None,
+ label: str = "Date",
+ **kwargs,
+):
+ """Builds a relative date picker input."""
+ relative_dates = [
+ {"text": "1 hour", "value": str(timedelta(hours=1))},
+ {"text": "3 hours", "value": str(timedelta(hours=3))},
+ {"text": "1 day", "value": str(timedelta(days=1))},
+ {"text": "3 days", "value": str(timedelta(days=3))},
+ {"text": "1 week", "value": str(timedelta(weeks=1))},
+ {"text": "2 weeks", "value": str(timedelta(weeks=2))},
+ ]
+
+ return static_select_block(
+ action_id=action_id,
+ block_id=block_id,
+ initial_option=initial_option,
+ options=relative_dates,
+ label=label,
+ placeholder="Relative Time",
+ **kwargs,
+ )
+
+
+def date_picker_input(
+ action_id: str = DefaultActionIds.date_picker_input,
+ block_id: str = DefaultBlockIds.date_picker_input,
+ initial_date: str = None,
+ label: str = "Date",
+ **kwargs,
+):
+ """Builds a date picker input."""
+ return Input(
+ element=DatePicker(
+ action_id=action_id, initial_date=initial_date, placeholder="Select Date"
+ ),
+ block_id=block_id,
+ label=label,
+ **kwargs,
+ )
+
+
+def hour_picker_input(
+ action_id: str = DefaultActionIds.hour_picker_input,
+ block_id: str = DefaultBlockIds.hour_picker_input,
+ initial_option: dict = None,
+ label: str = "Hour",
+ **kwargs,
+):
+ """Builds an hour picker input."""
+ hours = [{"text": str(h).zfill(2), "value": str(h).zfill(2)} for h in range(0, 24)]
+    return static_select_block(
+        action_id=action_id,
+        block_id=block_id,
+        initial_option=initial_option,
+        options=hours,
+        label=label,
+        placeholder="Hour",
+        **kwargs,
+    )
+
+
+def minute_picker_input(
+ action_id: str = DefaultActionIds.minute_picker_input,
+ block_id: str = DefaultBlockIds.minute_picker_input,
+ initial_option: dict = None,
+ label: str = "Minute",
+ **kwargs,
+):
+ """Builds a minute picker input."""
+ minutes = [{"text": str(m).zfill(2), "value": str(m).zfill(2)} for m in range(0, 60)]
+    return static_select_block(
+        action_id=action_id,
+        block_id=block_id,
+        initial_option=initial_option,
+        options=minutes,
+        label=label,
+        placeholder="Minute",
+        **kwargs,
+    )
+
+
+def timezone_picker_input(
+ action_id: str = DefaultActionIds.timezone_picker_input,
+ block_id: str = DefaultBlockIds.timezone_picker_input,
+ initial_option: dict = None,
+ label: str = "Timezone",
+ **kwargs,
+):
+ """Builds a timezone picker input."""
+ if not initial_option:
+ initial_option = {
+ "text": TimezoneOptions.local.value,
+ "value": TimezoneOptions.local.value,
+ }
+    return static_select_block(
+        action_id=action_id,
+        block_id=block_id,
+        initial_option=initial_option,
+        options=[{"text": tz.value, "value": tz.value} for tz in TimezoneOptions],
+        label=label,
+        placeholder="Timezone",
+        **kwargs,
+    )
+
+
+def datetime_picker_block(
+ action_id: str = None,
+ block_id: str = None,
+ initial_option: str = None,
+ label: str = None,
+ **kwargs,
+):
+    """Builds a datetime picker block."""
+    date = None
+    hour = None
+    minute = None
+
+    if initial_option:
+        date_part, time_part = initial_option.split("|")
+        date = date_part if date_part else None
+        if time_part:
+            # zero-pad the hour if the time was entered in h:mm rather than hh:mm format
+            h, m = time_part.split(":")
+            h = h.zfill(2)
+            hour = {"text": h, "value": h}
+            minute = {"text": m, "value": m}
+
+    return [
+        date_picker_input(initial_date=date),
+        hour_picker_input(initial_option=hour),
+        minute_picker_input(initial_option=minute),
+        timezone_picker_input(),
+    ]
+
+
+def static_select_block(
+ options: list[dict[str, str]],
+ placeholder: str,
+ action_id: str = None,
+ block_id: str = None,
+ initial_option: dict[str, str] = None,
+ label: str = None,
+ **kwargs,
+):
+ """Builds a static select block."""
+ # Ensure all values in options are strings
+ processed_options = []
+ if options:
+ for x in options:
+ option_dict = {k: str(v) if k == "value" else v for k, v in x.items()}
+ processed_options.append(option_dict)
+
+ # Ensure value in initial_option is a string
+ processed_initial_option = None
+ if initial_option:
+ processed_initial_option = {
+ k: str(v) if k == "value" else v for k, v in initial_option.items()
+ }
+
+ return Input(
+ element=StaticSelect(
+ placeholder=placeholder,
+ options=[PlainOption(**x) for x in processed_options] if processed_options else None,
+ initial_option=(
+ PlainOption(**processed_initial_option) if processed_initial_option else None
+ ),
+ action_id=action_id,
+ ),
+ block_id=block_id,
+ label=label,
+ **kwargs,
+ )
+
+
+def multi_select_block(
+ options: list[dict[str, str]],
+ placeholder: str,
+ action_id: str = None,
+ block_id: str = None,
+ label: str = None,
+ **kwargs,
+):
+ """Builds a multi select block."""
+ # Ensure all values in options are strings
+ processed_options = []
+ if options:
+ for x in options:
+ option_dict = {k: str(v) if k == "value" else v for k, v in x.items()}
+ processed_options.append(option_dict)
+
+ return Input(
+ element=MultiStaticSelect(
+ placeholder=placeholder,
+ options=[PlainOption(**x) for x in processed_options] if processed_options else None,
+ action_id=action_id,
+ ),
+ block_id=block_id,
+ label=label,
+ **kwargs,
+ )
+
+
+def project_select(
+ db_session: Session,
+ action_id: str = DefaultActionIds.project_select,
+ block_id: str = DefaultBlockIds.project_select,
+ label: str = "Project",
+ initial_option: dict = None,
+ **kwargs,
+):
+ """Creates a project select."""
+ projects = [
+ {"text": p.display_name, "value": p.id}
+ for p in project_service.get_all(db_session=db_session)
+ if p.enabled
+ ]
+ if not projects:
+ log.warning("Unable to create a select block for projects. No projects found.")
+ return
+
+ return static_select_block(
+ placeholder="Select Project",
+ options=projects,
+ initial_option=initial_option,
+ action_id=action_id,
+ block_id=block_id,
+ label=label,
+ **kwargs,
+ )
+
+
+def title_input(
+ label: str = "Title",
+ placeholder: str = "A brief explanatory title. You can change this later.",
+ action_id: str = DefaultActionIds.title_input,
+ block_id: str = DefaultBlockIds.title_input,
+ initial_value: str = None,
+ **kwargs,
+):
+ """Builds a title input."""
+ return Input(
+ element=PlainTextInput(
+ placeholder=placeholder,
+ initial_value=initial_value,
+ action_id=action_id,
+ max_length=MAX_SECTION_TEXT_LENGTH,
+ ),
+ label=label,
+ block_id=block_id,
+ **kwargs,
+ )
+
+
+def description_input(
+ label: str = "Description",
+ placeholder: str = "A summary of what you know so far. It's okay if this is incomplete.",
+ action_id: str = DefaultActionIds.description_input,
+ block_id: str = DefaultBlockIds.description_input,
+ initial_value: str = None,
+ **kwargs,
+):
+ """Builds a description input."""
+ return Input(
+ element=PlainTextInput(
+ placeholder=placeholder,
+ initial_value=initial_value,
+ multiline=True,
+ action_id=action_id,
+ max_length=MAX_SECTION_TEXT_LENGTH,
+ ),
+ block_id=block_id,
+ label=label,
+ **kwargs,
+ )
+
+
+def resolution_input(
+ label: str = "Resolution",
+ action_id: str = DefaultActionIds.resolution_input,
+ block_id: str = DefaultBlockIds.resolution_input,
+ initial_value: str = None,
+ **kwargs,
+):
+ """Builds a resolution input."""
+ return Input(
+ element=PlainTextInput(
+ placeholder="A description of the actions you have taken toward resolution.",
+ initial_value=initial_value,
+ multiline=True,
+ action_id=action_id,
+ max_length=MAX_SECTION_TEXT_LENGTH,
+ ),
+ block_id=block_id,
+ label=label,
+ **kwargs,
+ )
+
+
+def case_resolution_reason_select(
+ action_id: str = DefaultActionIds.case_resolution_reason_select,
+ block_id: str = DefaultBlockIds.case_resolution_reason_select,
+ label: str = "Resolution Reason",
+ initial_option: dict = None,
+ **kwargs,
+):
+    """Creates a case resolution reason select."""
+ reasons = [{"text": str(s), "value": str(s)} for s in CaseResolutionReason]
+
+ return static_select_block(
+ placeholder="Select Resolution Reason",
+ options=reasons,
+ initial_option=initial_option,
+ block_id=block_id,
+ action_id=action_id,
+ label=label,
+ **kwargs,
+ )
+
+
+def incident_status_select(
+    block_id: str = DefaultBlockIds.incident_status_select,
+    action_id: str = DefaultActionIds.incident_status_select,
+ label: str = "Incident Status",
+ initial_option: dict = None,
+ **kwargs,
+):
+ """Creates an incident status select."""
+ statuses = [{"text": s.value, "value": s.value} for s in IncidentStatus]
+ return static_select_block(
+ placeholder="Select Status",
+ options=statuses,
+ initial_option=initial_option,
+ action_id=action_id,
+ block_id=block_id,
+ label=label,
+ **kwargs,
+ )
+
+
+def incident_priority_select(
+ db_session: Session,
+ action_id: str = DefaultActionIds.incident_priority_select,
+ block_id: str = DefaultBlockIds.incident_priority_select,
+ label: str = "Incident Priority",
+ initial_option: dict = None,
+ project_id: int = None,
+ **kwargs,
+):
+ """Creates an incident priority select."""
+ priorities = [
+ {"text": p.name, "value": p.id}
+ for p in incident_priority_service.get_all_enabled(
+ db_session=db_session, project_id=project_id
+ )
+ ]
+ if not priorities:
+ log.warning(
+ "Unable to create a select block for incident priorities. No incident priorities found."
+ )
+ return
+
+ return static_select_block(
+ placeholder="Select Priority",
+ options=priorities,
+ initial_option=initial_option,
+ block_id=block_id,
+ action_id=action_id,
+ label=label,
+ **kwargs,
+ )
+
+
+def incident_severity_select(
+ db_session: Session,
+ action_id: str = DefaultActionIds.incident_severity_select,
+ block_id: str = DefaultBlockIds.incident_severity_select,
+    label: str = "Incident Severity",
+ initial_option: dict = None,
+ project_id: int = None,
+ **kwargs,
+):
+ """Creates an incident severity select."""
+ severities = [
+ {"text": s.name, "value": s.id}
+ for s in incident_severity_service.get_all_enabled(
+ db_session=db_session, project_id=project_id
+ )
+ ]
+ if not severities:
+ log.warning(
+ "Unable to create a select block for incident severities. No incident severities found."
+ )
+ return
+
+ return static_select_block(
+ placeholder="Select Severity",
+ options=severities,
+ initial_option=initial_option,
+ action_id=action_id,
+ block_id=block_id,
+ label=label,
+ **kwargs,
+ )
+
+
+def incident_type_select(
+ db_session: Session,
+ action_id: str = DefaultActionIds.incident_type_select,
+ block_id: str = DefaultBlockIds.incident_type_select,
+    label: str = "Incident Type",
+ initial_option: dict = None,
+ project_id: int = None,
+ **kwargs,
+):
+ """Creates an incident type select."""
+ types = [
+ {"text": t.name, "value": t.id}
+ for t in incident_type_service.get_all_enabled(db_session=db_session, project_id=project_id)
+ ]
+ if not types:
+ log.warning("Unable to create a select block for incident types. No incident types found.")
+ return
+
+ return static_select_block(
+ placeholder="Select Type",
+ options=types,
+ initial_option=initial_option,
+ action_id=action_id,
+ block_id=block_id,
+ label=label,
+ **kwargs,
+ )
+
+
+def tag_multi_select(
+ action_id: str = DefaultActionIds.tags_multi_select,
+ block_id: str = DefaultBlockIds.tags_multi_select,
+    label: str = "Tags",
+    initial_options: list = None,
+ **kwargs,
+):
+ """Creates an incident tag select."""
+ return Input(
+ element=MultiExternalSelect(
+ placeholder="Select Tag(s)", action_id=action_id, initial_options=initial_options
+ ),
+ block_id=block_id,
+ label=label,
+ **kwargs,
+ )
+
+
+def case_status_select(
+ action_id: str = DefaultActionIds.case_status_select,
+ block_id: str = DefaultBlockIds.case_status_select,
+ label: str = "Status",
+ initial_option: dict | None = None,
+ statuses: list[dict[str, str]] | None = None,
+ **kwargs,
+):
+ """Creates a case status select."""
+ if not statuses:
+ statuses = [{"text": str(s), "value": str(s)} for s in CaseStatus]
+
+ return static_select_block(
+ placeholder="Select Status",
+ options=statuses,
+ initial_option=initial_option,
+ action_id=action_id,
+ block_id=block_id,
+ label=label,
+ **kwargs,
+ )
+
+
+def case_priority_select(
+ db_session: Session,
+ action_id: str = DefaultActionIds.case_priority_select,
+ block_id: str = DefaultBlockIds.case_priority_select,
+    label: str = "Case Priority",
+ initial_option: dict = None,
+ project_id: int = None,
+ **kwargs,
+):
+ """Creates a case priority select."""
+ priorities = [
+ {"text": p.name, "value": p.id}
+ for p in case_priority_service.get_all_enabled(db_session=db_session, project_id=project_id)
+ ]
+ if not priorities:
+ log.warning(
+ "Unable to create a select block for case priorities. No case priorities found."
+ )
+ return
+
+ return static_select_block(
+ placeholder="Select Priority",
+ options=priorities,
+ initial_option=initial_option,
+ action_id=action_id,
+ block_id=block_id,
+ label=label,
+ **kwargs,
+ )
+
+
+def case_severity_select(
+ db_session: Session,
+ action_id: str = DefaultActionIds.case_severity_select,
+ block_id: str = DefaultBlockIds.case_severity_select,
+ label: str = "Case Severity",
+ initial_option: dict | None = None,
+ project_id: int | None = None,
+ **kwargs,
+):
+ """Creates a case severity select."""
+ severities = [
+ {"text": s.name, "value": s.id}
+ for s in case_severity_service.get_all_enabled(db_session=db_session, project_id=project_id)
+ ]
+ if not severities:
+ log.warning(
+ "Unable to create a select block for case severities. No case severities found."
+ )
+ return
+
+ return static_select_block(
+ placeholder="Select Severity",
+ options=severities,
+ initial_option=initial_option,
+ action_id=action_id,
+ block_id=block_id,
+ label=label,
+ **kwargs,
+ )
+
+
+def case_type_select(
+ db_session: Session,
+ action_id: str = DefaultActionIds.case_type_select,
+ block_id: str = DefaultBlockIds.case_type_select,
+ label: str = "Case Type",
+ initial_option: dict | None = None,
+ project_id: int | None = None,
+ **kwargs,
+):
+ """Creates a case type select."""
+ types = [
+ {"text": t.name, "value": t.id}
+ for t in case_type_service.get_all_enabled(db_session=db_session, project_id=project_id)
+ ]
+
+ if not types:
+ log.warning("Unable to create a select block for case types. No case types found.")
+ return
+
+ return static_select_block(
+ placeholder="Select Type",
+ options=types,
+ initial_option=initial_option,
+ action_id=action_id,
+ block_id=block_id,
+ label=label,
+ **kwargs,
+ )
+
+
+def case_visibility_select(
+ action_id: str = DefaultActionIds.case_visibility_select,
+ block_id: str = DefaultBlockIds.case_visibility_select,
+ label: str = "Case Visibility",
+ initial_option: dict | None = None,
+ **kwargs,
+):
+ """Creates a case visibility select."""
+ visibility = [
+ {"text": Visibility.restricted, "value": Visibility.restricted},
+ {"text": Visibility.open, "value": Visibility.open},
+ ]
+
+ return static_select_block(
+ placeholder="Select Visibility",
+ options=visibility,
+ initial_option=initial_option,
+ action_id=action_id,
+ block_id=block_id,
+ label=label,
+ **kwargs,
+ )
+
+
+def entity_select(
+ signal_id: int,
+ db_session: Session,
+ action_id: str = DefaultActionIds.entity_select,
+ block_id: str = DefaultBlockIds.entity_select,
+ label="Entities",
+ case_id: int = None,
+ **kwargs,
+):
+ """Creates an entity select."""
+ entity_options = [
+ {"text": entity.value[:75], "value": entity.id}
+ for entity in entity_service.get_all_desc_by_signal(
+ db_session=db_session, signal_id=signal_id, case_id=case_id
+ )
+ if entity.value
+ ]
+
+ if not entity_options:
+ log.warning("Unable to create a select block for entities. No entities found.")
+ return
+
+ return multi_select_block(
+ placeholder="Select Entities",
+ options=entity_options[:100], # Limit the entities to the first 100 most recent
+ action_id=action_id,
+ block_id=block_id,
+ label=label,
+ **kwargs,
+ )
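The slicing in `entity_select` reflects two Slack Block Kit limits: option text is capped at 75 characters and a select menu at 100 options. A standalone sketch of that truncation, with hypothetical entity dicts in place of the ORM objects:

```python
# Hypothetical entities standing in for entity_service results; the real code
# truncates each option's text to 75 chars and keeps at most 100 options.
entities = [{"value": i, "text": "x" * 200} for i in range(250)]
options = [{"text": e["text"][:75], "value": e["value"]} for e in entities][:100]
print(len(options), len(options[0]["text"]))  # 100 75
```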
+
+
+def participant_select(
+ participants: list[Participant],
+ action_id: str = DefaultActionIds.participant_select,
+ block_id: str = DefaultBlockIds.participant_select,
+ label: str = "Participant",
+ initial_option: Participant | None = None,
+ **kwargs,
+):
+ """Creates a static select of available participants."""
+ participants = [{"text": p.individual.name, "value": p.id} for p in participants]
+ if not participants:
+ log.warning("Unable to create a select block for participants. No participants found.")
+ return
+
+ return static_select_block(
+ placeholder="Select Participant",
+ options=participants,
+ initial_option=initial_option,
+ action_id=action_id,
+ block_id=block_id,
+ label=label,
+ **kwargs,
+ )
+
+
+def signal_definition_select(
+ signals: list[Signal],
+ action_id: str = DefaultActionIds.signal_definition_select,
+ block_id: str = DefaultBlockIds.signal_definition_select,
+ label: str = "Signal Definitions",
+ initial_option: dict | None = None,
+ **kwargs,
+):
+ """Creates a static select of available signal definitions."""
+ signals = [{"text": s.name, "value": s.id} for s in signals]
+ if not signals:
+ log.warning(
+ "Unable to create a select block for signal definitions. No signal definitions found."
+ )
+ return
+
+ return static_select_block(
+ placeholder="Select Signal Definition",
+ options=signals,
+ initial_option=initial_option,
+ action_id=action_id,
+ block_id=block_id,
+ label=label,
+ **kwargs,
+ )
+
+
+def extension_request_checkbox(
+ action_id: str = DefaultActionIds.extension_request_checkbox,
+ block_id: str = DefaultBlockIds.extension_request_checkbox,
+ label: str = "Request longer expiration",
+ **kwargs,
+):
+ options = [
+ PlainOption(
+ text="Check this box to request an expiration longer than 2 weeks.",
+ value="Yes",
+ )
+ ]
+ return Input(
+ block_id=block_id,
+ element=Checkboxes(options=options, action_id=action_id),
+ label=label,
+ optional=True,
+ **kwargs,
+ )
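Several of these builders derive their default options from a str-valued enum, e.g. `case_status_select` with `CaseStatus`. A minimal standalone sketch of that pattern, using a hypothetical `CaseStatus` stand-in (assuming Dispatch's enum stringifies to its value):

```python
from enum import Enum

# Hypothetical stand-in for dispatch's CaseStatus; assumed to be a str enum
# whose str() form is the human-readable value.
class CaseStatus(str, Enum):
    new = "New"
    triage = "Triage"
    closed = "Closed"

    def __str__(self) -> str:
        return self.value

# Mirrors how case_status_select builds its default option dicts.
statuses = [{"text": str(s), "value": str(s)} for s in CaseStatus]
print(statuses[0])  # {'text': 'New', 'value': 'New'}
```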
diff --git a/src/dispatch/plugins/dispatch_slack/handler.py b/src/dispatch/plugins/dispatch_slack/handler.py
new file mode 100644
index 000000000000..2bbcc5b272e7
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_slack/handler.py
@@ -0,0 +1,84 @@
+"""
+https://github.com/slackapi/bolt-python/blob/c99c23fd056d26f8b1e39436bd1fcd2c83a3e1bd/slack_bolt/adapter/starlette/handler.py
+
+Fork of the built-in Bolt Starlette adapter. Removes async to allow instant acknowledgment of interactivity payloads from Slack.
+"""
+
+from http import HTTPStatus
+from typing import Any
+
+from starlette.requests import Request
+from starlette.responses import Response
+
+from slack_bolt import BoltRequest, App, BoltResponse
+from slack_bolt.oauth import OAuthFlow
+
+
+def to_bolt_request(
+ req: Request,
+ body: bytes,
+ addition_context_properties: dict[str, Any] | None = None,
+) -> BoltRequest:
+ request = BoltRequest(
+ body=body.decode("utf-8"),
+ query=req.query_params,
+ headers=req.headers,
+ )
+ if addition_context_properties is not None:
+ for k, v in addition_context_properties.items():
+ request.context[k] = v
+ return request
+
+
+def to_starlette_response(bolt_resp: BoltResponse) -> Response:
+ resp = Response(
+ status_code=bolt_resp.status,
+ content=bolt_resp.body,
+ headers=bolt_resp.first_headers_without_set_cookie(),
+ )
+ for cookie in bolt_resp.cookies():
+ for name, c in cookie.items():
+ resp.set_cookie(
+ key=name,
+ value=c.value,
+ max_age=c.get("max-age"),
+ expires=c.get("expires"),
+ path=c.get("path"),
+ domain=c.get("domain"),
+ secure=True,
+ httponly=True,
+ )
+ return resp
+
+
+class SlackRequestHandler:
+ def __init__(self, app: App): # type: ignore
+ self.app = app
+
+ def handle(
+ self,
+ req: Request,
+ body: bytes,
+ addition_context_properties: dict[str, Any] | None = None,
+ ) -> Response:
+ if req.method == "GET":
+ if self.app.oauth_flow is not None:
+ oauth_flow: OAuthFlow = self.app.oauth_flow
+ if req.url.path == oauth_flow.install_path:
+ bolt_resp = oauth_flow.handle_installation(
+ to_bolt_request(req, body, addition_context_properties)
+ )
+ return to_starlette_response(bolt_resp)
+ elif req.url.path == oauth_flow.redirect_uri_path:
+ bolt_resp = oauth_flow.handle_callback(
+ to_bolt_request(req, body, addition_context_properties)
+ )
+ return to_starlette_response(bolt_resp)
+ elif req.method == "POST":
+ bolt_resp = self.app.dispatch(to_bolt_request(req, body, addition_context_properties))
+ return to_starlette_response(bolt_resp)
+
+ return Response(
+ status_code=HTTPStatus.NOT_FOUND.value,
+ content=HTTPStatus.NOT_FOUND.phrase,
+ )
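The routing in `SlackRequestHandler.handle` can be summarized without the Starlette/Bolt machinery: GET requests are only served for the two OAuth paths, POST requests are dispatched to the Bolt app, and everything else is a 404. A self-contained sketch of that decision (path names hypothetical):

```python
from http import HTTPStatus

# Stand-in sketch of SlackRequestHandler.handle's routing logic; the real
# handler builds a BoltRequest and converts the BoltResponse back.
def route(method: str, path: str, oauth_installed: bool,
          install_path: str = "/slack/install",
          redirect_path: str = "/slack/oauth_redirect") -> int:
    if method == "GET" and oauth_installed:
        if path in (install_path, redirect_path):
            return HTTPStatus.OK.value  # handled by the OAuth flow
    elif method == "POST":
        return HTTPStatus.OK.value  # dispatched to the Bolt app
    return HTTPStatus.NOT_FOUND.value

print(route("POST", "/slack/events", oauth_installed=False))  # 200
print(route("GET", "/slack/install", oauth_installed=True))   # 200
print(route("GET", "/anything", oauth_installed=False))       # 404
```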
diff --git a/src/dispatch/plugins/dispatch_slack/incident/enums.py b/src/dispatch/plugins/dispatch_slack/incident/enums.py
new file mode 100644
index 000000000000..2acc336b88ac
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_slack/incident/enums.py
@@ -0,0 +1,115 @@
+from dispatch.conversation.enums import ConversationButtonActions
+from dispatch.enums import DispatchEnum
+
+
+class AddTimelineEventActions(DispatchEnum):
+ submit = "add-timeline-event-submit"
+
+
+class IncidentNotificationActions(DispatchEnum):
+ invite_user = ConversationButtonActions.invite_user
+ subscribe_user = ConversationButtonActions.subscribe_user
+
+
+class TaskNotificationActionIds(DispatchEnum):
+ update_status = "update-task-status"
+
+
+class LinkMonitorActionIds(DispatchEnum):
+ monitor = "link-monitor-monitor"
+ ignore = "link-monitor-ignore"
+
+
+class LinkMonitorBlockIds(DispatchEnum):
+ monitor = "link-monitor-monitor"
+
+
+class UpdateParticipantActions(DispatchEnum):
+ submit = "update-participant-submit"
+
+
+class UpdateParticipantBlockIds(DispatchEnum):
+ reason = "update-participant-reason"
+ participant = "update-participant-participant"
+
+
+class AssignRoleActions(DispatchEnum):
+ submit = "assign-role-submit"
+
+
+class AssignRoleBlockIds(DispatchEnum):
+ user = "assign-role-user"
+ role = "assign-role-role"
+
+
+class EngageOncallActions(DispatchEnum):
+ submit = "engage-oncall-submit"
+
+
+class EngageOncallActionIds(DispatchEnum):
+ service = "engage-oncall-service"
+ page = "engage-oncall-page"
+
+
+class EngageOncallBlockIds(DispatchEnum):
+ service = "engage-oncall-service"
+ page = "engage-oncall-page"
+
+
+class ReportTacticalActions(DispatchEnum):
+ submit = "report-tactical-submit"
+
+
+class ReportTacticalBlockIds(DispatchEnum):
+ needs = "report-tactical-needs"
+ actions = "report-tactical-actions"
+ conditions = "report-tactical-conditions"
+
+
+class ReportExecutiveActions(DispatchEnum):
+ submit = "report-executive-submit"
+
+
+class ReportExecutiveBlockIds(DispatchEnum):
+ current_status = "report-executive-current-status"
+ overview = "report-executive-overview"
+ next_steps = "report-executive-next-steps"
+
+
+class CreateTaskBlockIds(DispatchEnum):
+ assignee_select = "create-task-assignee-select"
+ description = "create-task-description"
+
+
+class CreateTaskActionIds(DispatchEnum):
+ submit = "create-task-submit"
+
+
+class IncidentUpdateActions(DispatchEnum):
+ submit = "incident-update-submit"
+ project_select = "incident-update-project-select"
+
+
+class IncidentReportActions(DispatchEnum):
+ submit = "incident-report-submit"
+ project_select = "incident-report-project-select"
+
+
+class IncidentShortcutCallbacks(DispatchEnum):
+ report = "incident-report"
+
+
+class UpdateNotificationGroupActions(DispatchEnum):
+ submit = "update-notification-group-submit"
+
+
+class UpdateNotificationGroupActionIds(DispatchEnum):
+ members = "update-notification-group-members"
+
+
+class UpdateNotificationGroupBlockIds(DispatchEnum):
+ members = "update-notification-group-members"
+
+
+class RemindAgainActions(DispatchEnum):
+ submit = ConversationButtonActions.remind_again
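These enums work as Slack `callback_id`/`action_id` values because `DispatchEnum` is assumed to behave like a str-valued enum, so members compare and serialize as plain strings. A tiny sketch of that behavior:

```python
from enum import Enum

# Assumption: DispatchEnum is effectively a str enum, so a member can be used
# anywhere Slack expects a plain string identifier.
class UpdateParticipantActions(str, Enum):
    submit = "update-participant-submit"

print(UpdateParticipantActions.submit == "update-participant-submit")  # True
```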
diff --git a/src/dispatch/plugins/dispatch_slack/incident/interactive.py b/src/dispatch/plugins/dispatch_slack/incident/interactive.py
new file mode 100644
index 000000000000..7a2a19f9410f
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_slack/incident/interactive.py
@@ -0,0 +1,3299 @@
+import logging
+import re
+import time
+import uuid
+from functools import partial
+from datetime import datetime, timedelta
+from typing import Any
+
+import pytz
+from blockkit import (
+ Actions,
+ Button,
+ Checkboxes,
+ Context,
+ Divider,
+ Image,
+ Input,
+ MarkdownText,
+ Message,
+ Modal,
+ PlainOption,
+ PlainTextInput,
+ Section,
+ UsersSelect,
+)
+from dispatch.ai.constants import TACTICAL_REPORT_SLACK_ACTION
+from slack_bolt import Ack, BoltContext, BoltRequest, Respond
+from slack_sdk.errors import SlackApiError
+from slack_sdk.web.client import WebClient
+from sqlalchemy.orm import Session
+
+from dispatch.ai import service as ai_service
+from dispatch.ai.models import ReadInSummary
+from dispatch.auth.models import DispatchUser
+from dispatch.case import service as case_service
+from dispatch.case import flows as case_flows
+from dispatch.config import DISPATCH_UI_URL
+from dispatch.database.service import search_filter_sort_paginate
+from dispatch.enums import Visibility, EventType, SubjectNames
+from dispatch.event import service as event_service
+from dispatch.exceptions import DispatchException
+from dispatch.group import flows as group_flows
+from dispatch.group.enums import GroupAction
+from dispatch.incident import flows as incident_flows
+from dispatch.incident import service as incident_service
+from dispatch.incident.enums import IncidentStatus
+from dispatch.incident.models import IncidentCreate, IncidentRead, IncidentUpdate
+from dispatch.incident.priority import service as incident_priority_service
+from dispatch.individual import service as individual_service
+from dispatch.individual.models import IndividualContactRead
+from dispatch.monitor import service as monitor_service
+from dispatch.monitor.models import MonitorCreate
+from dispatch.participant import service as participant_service
+from dispatch.participant.models import ParticipantUpdate
+from dispatch.participant_role import service as participant_role_service
+from dispatch.incident.severity import service as incident_severity_service
+from dispatch.participant_role.enums import ParticipantRoleType
+from dispatch.plugin import service as plugin_service
+from dispatch.plugins.dispatch_slack import service as dispatch_slack_service
+from dispatch.plugins.dispatch_slack.bolt import app
+from dispatch.plugins.dispatch_slack.decorators import message_dispatcher
+from dispatch.plugins.dispatch_slack.enums import SlackAPIErrorCode
+from dispatch.plugins.dispatch_slack.exceptions import CommandError, EventError
+from dispatch.plugins.dispatch_slack.fields import (
+ DefaultActionIds,
+ DefaultBlockIds,
+ TimezoneOptions,
+ datetime_picker_block,
+ description_input,
+ incident_priority_select,
+ incident_severity_select,
+ incident_status_select,
+ incident_type_select,
+ participant_select,
+ project_select,
+ resolution_input,
+ static_select_block,
+ tag_multi_select,
+ title_input,
+)
+from dispatch.plugins.dispatch_slack.incident.enums import (
+ AddTimelineEventActions,
+ AssignRoleActions,
+ AssignRoleBlockIds,
+ CreateTaskActionIds,
+ CreateTaskBlockIds,
+ EngageOncallActionIds,
+ EngageOncallActions,
+ EngageOncallBlockIds,
+ IncidentNotificationActions,
+ IncidentReportActions,
+ IncidentUpdateActions,
+ IncidentShortcutCallbacks,
+ LinkMonitorActionIds,
+ LinkMonitorBlockIds,
+ RemindAgainActions,
+ ReportExecutiveActions,
+ ReportExecutiveBlockIds,
+ ReportTacticalActions,
+ ReportTacticalBlockIds,
+ TaskNotificationActionIds,
+ UpdateNotificationGroupActionIds,
+ UpdateNotificationGroupActions,
+ UpdateNotificationGroupBlockIds,
+ UpdateParticipantActions,
+ UpdateParticipantBlockIds,
+)
+from dispatch.plugins.dispatch_slack.middleware import (
+ action_context_middleware,
+ button_context_middleware,
+ command_context_middleware,
+ configuration_middleware,
+ db_middleware,
+ is_bot,
+ message_context_middleware,
+ modal_submit_middleware,
+ reaction_context_middleware,
+ restricted_command_middleware,
+ subject_middleware,
+ user_middleware,
+ select_context_middleware,
+ shortcut_context_middleware,
+)
+from dispatch.plugins.dispatch_slack.modals.common import send_success_modal
+from dispatch.plugins.dispatch_slack.models import (
+ MonitorMetadata,
+ TaskMetadata,
+ IncidentSubjects,
+ CaseSubjects,
+)
+from dispatch.plugins.dispatch_slack.service import (
+ get_user_email,
+ get_user_profile_by_email,
+)
+from dispatch.project import service as project_service
+from dispatch.report import flows as report_flows
+from dispatch.report import service as report_service
+from dispatch.report.enums import ReportTypes
+from dispatch.report.models import ExecutiveReportCreate, TacticalReportCreate
+from dispatch.service import service as service_service
+from dispatch.tag import service as tag_service
+from dispatch.task import service as task_service
+from dispatch.task.enums import TaskStatus
+from dispatch.task.models import Task, TaskCreate
+from dispatch.ticket import flows as ticket_flows
+from dispatch.messaging.strings import reminder_select_values
+from dispatch.plugins.dispatch_slack.messaging import build_unexpected_error_message
+from dispatch.auth import service as auth_service
+
+log = logging.getLogger(__file__)
+
+
+def is_target_reaction(reaction: str):
+ """Returns a matcher that is True when an event's reaction matches the given reaction."""
+
+ def is_target(event) -> bool:
+ return event["reaction"] == reaction
+
+ return is_target
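`is_target_reaction` is a matcher factory: the outer call captures the configured reaction and returns a predicate that Bolt evaluates per event. The closure pattern in isolation:

```python
# Self-contained sketch of the matcher-factory pattern used above: the outer
# function captures `reaction`; the inner predicate is called once per event.
def is_target_reaction(reaction: str):
    def is_target(event: dict) -> bool:
        return event.get("reaction") == reaction
    return is_target

matcher = is_target_reaction("stopwatch")
print(matcher({"reaction": "stopwatch"}))  # True
print(matcher({"reaction": "thumbsup"}))   # False
```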
+
+
+def configure(config):
+ """Maps commands/events to their functions."""
+ incident_command_context_middleware = partial(
+ command_context_middleware,
+ expected_subject=SubjectNames.INCIDENT,
+ )
+
+ middleware = [
+ subject_middleware,
+ configuration_middleware,
+ ]
+
+ # don't need an incident context
+ app.command(config.slack_command_report_incident, middleware=middleware)(
+ handle_report_incident_command
+ )
+ app.command(config.slack_command_list_incidents, middleware=middleware)(
+ handle_list_incidents_command
+ )
+
+ # non-sensitive-commands
+ middleware = [
+ subject_middleware,
+ configuration_middleware,
+ incident_command_context_middleware,
+ ]
+
+ app.command(config.slack_command_list_tasks, middleware=middleware)(handle_list_tasks_command)
+ app.command(config.slack_command_list_my_tasks, middleware=middleware)(
+ handle_list_tasks_command
+ )
+ app.command(config.slack_command_list_participants, middleware=middleware)(
+ handle_list_participants_command
+ )
+ app.command(config.slack_command_update_participant, middleware=middleware)(
+ handle_update_participant_command
+ )
+ app.command(
+ config.slack_command_engage_oncall,
+ middleware=[message_context_middleware, subject_middleware, configuration_middleware],
+ )(handle_engage_oncall_command)
+
+ # sensitive commands
+ middleware = [
+ subject_middleware,
+ configuration_middleware,
+ incident_command_context_middleware,
+ user_middleware,
+ restricted_command_middleware,
+ ]
+
+ app.command(config.slack_command_assign_role, middleware=middleware)(handle_assign_role_command)
+ app.command(config.slack_command_update_incident, middleware=middleware)(
+ handle_update_incident_command
+ )
+ app.command(config.slack_command_update_notifications_group, middleware=middleware)(
+ handle_update_notifications_group_command
+ )
+ app.command(config.slack_command_report_tactical, middleware=middleware)(
+ handle_report_tactical_command
+ )
+ app.command(config.slack_command_report_executive, middleware=middleware)(
+ handle_report_executive_command
+ )
+ app.command(config.slack_command_add_timeline_event, middleware=middleware)(
+ handle_add_timeline_event_command
+ )
+ app.command(config.slack_command_create_task, middleware=middleware)(handle_create_task_command)
+
+ app.command(
+ config.slack_command_summary,
+ middleware=[
+ message_context_middleware,
+ subject_middleware,
+ configuration_middleware,
+ user_middleware,
+ ],
+ )(handle_summary_command)
+
+ app.event(
+ event="reaction_added",
+ matchers=[is_target_reaction(config.timeline_event_reaction)],
+ middleware=[reaction_context_middleware],
+ )(handle_timeline_added_event)
+
+ app.event(
+ event="reaction_added",
+ middleware=[reaction_context_middleware],
+ )(handle_not_configured_reaction_event)
+
+
+@app.options(
+ DefaultActionIds.tags_multi_select, middleware=[action_context_middleware, db_middleware]
+)
+def handle_tag_search_action(
+ ack: Ack, payload: dict, context: BoltContext, db_session: Session
+) -> None:
+ """Handles tag lookup actions."""
+ query_str = payload["value"]
+
+ filter_spec = {
+ "and": [
+ {
+ "model": "Project",
+ "op": "==",
+ "field": "id",
+ "value": int(context["subject"].project_id),
+ }
+ ]
+ }
+
+ if "/" in query_str:
+ # first check to make sure there's only one slash
+ if query_str.count("/") > 1:
+ ack()
+ return
+
+ tag_type, query_str = query_str.split("/")
+ filter_spec["and"].append(
+ {"model": "TagType", "op": "==", "field": "name", "value": tag_type}
+ )
+
+ tags = search_filter_sort_paginate(
+ db_session=db_session, model="Tag", query_str=query_str, filter_spec=filter_spec
+ )
+
+ options = []
+ for t in tags["items"]:
+ options.append(
+ {
+ "text": {"type": "plain_text", "text": f"{t.tag_type.name}/{t.name}"},
+ "value": str(t.id), # NOTE: slack doesn't accept int's as values (fails silently)
+ }
+ )
+
+ ack(options=options)
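The query parsing in `handle_tag_search_action` treats a single `/` as a `TagType/name` separator and rejects queries with more than one slash (the handler acks with no options). Extracted as a pure function for illustration:

```python
# Sketch of the tag-query parsing above; `filter_spec` mirrors the
# search_filter_sort_paginate filter structure used by the handler.
def parse_tag_query(query_str: str, project_id: int):
    filter_spec = {"and": [
        {"model": "Project", "op": "==", "field": "id", "value": project_id}
    ]}
    if "/" in query_str:
        if query_str.count("/") > 1:
            return None, None  # ambiguous query; no options are returned
        tag_type, query_str = query_str.split("/")
        filter_spec["and"].append(
            {"model": "TagType", "op": "==", "field": "name", "value": tag_type}
        )
    return query_str, filter_spec

q, spec = parse_tag_query("owner/platform", project_id=1)
print(q, len(spec["and"]))  # platform 2
```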
+
+
+@app.action(
+ IncidentUpdateActions.project_select, middleware=[action_context_middleware, db_middleware]
+)
+def handle_update_incident_project_select_action(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+) -> None:
+ ack()
+ values = body["view"]["state"]["values"]
+
+ project_id = int(
+ values[DefaultBlockIds.project_select][IncidentUpdateActions.project_select][
+ "selected_option"
+ ]["value"]
+ )
+
+ context["subject"].project_id = project_id
+
+ project = project_service.get(
+ db_session=db_session,
+ project_id=project_id,
+ )
+
+ incident = incident_service.get(db_session=db_session, incident_id=int(context["subject"].id))
+
+ blocks = [
+ Context(elements=[MarkdownText(text="Use this form to update the incident's details.")]),
+ title_input(initial_value=incident.title),
+ description_input(initial_value=incident.description),
+ resolution_input(initial_value=incident.resolution),
+ incident_status_select(initial_option={"text": incident.status, "value": incident.status}),
+ project_select(
+ db_session=db_session,
+ initial_option={"text": project.display_name, "value": project.id},
+ action_id=IncidentUpdateActions.project_select,
+ dispatch_action=True,
+ ),
+ incident_type_select(
+ db_session=db_session,
+ initial_option={
+ "text": incident.incident_type.name,
+ "value": incident.incident_type.id,
+ },
+ project_id=project.id,
+ ),
+ incident_severity_select(
+ db_session=db_session,
+ initial_option={
+ "text": incident.incident_severity.name,
+ "value": incident.incident_severity.id,
+ },
+ project_id=project.id,
+ ),
+ incident_priority_select(
+ db_session=db_session,
+ initial_option={
+ "text": incident.incident_priority.name,
+ "value": incident.incident_priority.id,
+ },
+ project_id=project.id,
+ ),
+ tag_multi_select(
+ optional=True,
+ initial_options=[t.name for t in incident.tags],
+ ),
+ ]
+ modal = Modal(
+ title="Update Incident",
+ blocks=blocks,
+ submit="Update",
+ close="Cancel",
+ callback_id=IncidentUpdateActions.submit,
+ private_metadata=context["subject"].json(),
+ ).build()
+
+ client.views_update(
+ view_id=body["view"]["id"],
+ hash=body["view"]["hash"],
+ trigger_id=body["trigger_id"],
+ view=modal,
+ )
+
+
+# COMMANDS
+def handle_list_incidents_command(
+ ack: Ack,
+ body: dict,
+ payload: dict,
+ db_session: Session,
+ context: BoltContext,
+ client: WebClient,
+) -> None:
+ """Handles the list incidents command."""
+ ack()
+
+ projects = []
+
+ if context["subject"].type == IncidentSubjects.incident:
+ # command was run in an incident conversation
+ incident = incident_service.get(
+ db_session=db_session, incident_id=int(context["subject"].id)
+ )
+ projects.append(incident.project)
+ else:
+ # command was run in a non-incident conversation
+ args = payload["command"].split(" ")
+
+ if len(args) == 2:
+ project = project_service.get_by_name(db_session=db_session, name=args[1])
+
+ if project:
+ projects.append(project)
+ else:
+ raise CommandError(
+ f"Project name '{args[1]}' in organization '{args[0]}' not found. Check your spelling.",
+ )
+ else:
+ projects = project_service.get_all(db_session=db_session)
+
+ incidents = []
+ for project in projects:
+ # We fetch active incidents
+ incidents.extend(
+ incident_service.get_all_by_status(
+ db_session=db_session, project_id=project.id, status=IncidentStatus.active
+ )
+ )
+ # We fetch stable incidents
+ incidents.extend(
+ incident_service.get_all_by_status(
+ db_session=db_session,
+ project_id=project.id,
+ status=IncidentStatus.stable,
+ )
+ )
+ # We fetch closed incidents in the last 24 hours
+ incidents.extend(
+ incident_service.get_all_last_x_hours_by_status(
+ db_session=db_session,
+ project_id=project.id,
+ status=IncidentStatus.closed,
+ hours=24,
+ )
+ )
+
+ blocks = []
+
+ if incidents:
+ open_incidents = [i for i in incidents if i.visibility == Visibility.open]
+ if len(open_incidents) > 50:
+ blocks.extend(
+ [
+ Context(
+ elements=[
+ "💡 There are more than 50 open incidents, which is the max we can display."
+ ]
+ ),
+ Divider(),
+ ]
+ )
+
+ for idx, incident in enumerate(open_incidents[:50], 1):
+ incident_weblink = f"{DISPATCH_UI_URL}/{incident.project.organization.name}/incidents/{incident.name}?project={incident.project.name}"
+ blocks.extend(
+ [
+ Section(
+ fields=[
+ f"*<{incident_weblink}|{incident.name}>*\n {incident.title}",
+ f"*Commander*\n<{incident.commander.individual.weblink}|{incident.commander.individual.name}>",
+ f"*Project*\n{incident.project.name}",
+ f"*Status*\n{incident.status}",
+ f"*Type*\n {incident.incident_type.name}",
+ f"*Severity*\n {incident.incident_severity.name}",
+ f"*Priority*\n {incident.incident_priority.name}",
+ ]
+ )
+ ]
+ )
+ # Don't add a divider if we are at the last incident
+ if idx != len(open_incidents):
+ blocks.extend([Divider()])
+ else:
+ blocks.append(Section(text="No incidents found."))
+
+ modal = Modal(
+ title="Incident List",
+ blocks=blocks,
+ close="Close",
+ ).build()
+
+ client.views_open(trigger_id=body["trigger_id"], view=modal)
+
+
+def handle_list_participants_command(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+) -> None:
+ """Handles list participants command."""
+ ack()
+ blocks = []
+
+ participants = participant_service.get_all_by_incident_id(
+ db_session=db_session, incident_id=int(context["subject"].id)
+ )
+
+ incident = incident_service.get(db_session=db_session, incident_id=int(context["subject"].id))
+
+ contact_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="contact"
+ )
+ if not contact_plugin:
+ raise CommandError(
+ "Contact plugin is not enabled. Unable to list participants.",
+ )
+
+ active_participants = [p for p in participants if p.active_roles]
+ for idx, participant in enumerate(active_participants, 1):
+ participant_email = participant.individual.email
+ participant_info = contact_plugin.instance.get(participant_email, db_session=db_session)
+ participant_name = participant_info.get("fullname", participant.individual.email)
+ participant_team = participant_info.get("team", "Unknown")
+ participant_department = participant_info.get("department", "Unknown")
+ participant_location = participant_info.get("location", "Unknown")
+ participant_weblink = participant_info.get("weblink")
+
+ participant_active_roles = participant_role_service.get_all_active_roles(
+ db_session=db_session, participant_id=participant.id
+ )
+ participant_roles = []
+ for role in participant_active_roles:
+ participant_roles.append(role.role)
+
+ accessory = None
+ # don't load avatars for large incidents
+ if len(participants) < 20:
+ participant_avatar_url = dispatch_slack_service.get_user_avatar_url(
+ client, participant_email
+ )
+ accessory = Image(image_url=participant_avatar_url, alt_text=participant_name)
+
+ blocks.extend(
+ [
+ Section(
+ fields=[
+ f"*Name* \n<{participant_weblink}|{participant_name}>",
+ f"*Team*\n {participant_team}, {participant_department}",
+ f"*Location* \n{participant_location}",
+ f"*Incident Role(s)* \n{(', ').join(participant_roles)}",
+ (
+ f"*Added By* \n{participant.added_by.individual.name}"
+ if participant.added_by
+ else "*Added By* \nUnknown"
+ ),
+ (
+ f"*Added Reason* \n{participant.added_reason}"
+ if participant.added_reason
+ else "*Added Reason* \nUnknown"
+ ),
+ ],
+ accessory=accessory,
+ ),
+ ]
+ )
+
+ # Don't add a divider if we are at the last participant
+ if idx != len(active_participants):
+ blocks.extend([Divider()])
+
+ modal = Modal(
+ title="Incident Participants",
+ blocks=blocks,
+ close="Close",
+ ).build()
+
+ client.views_open(trigger_id=body["trigger_id"], view=modal)
+
+
+def filter_tasks_by_assignee_and_creator(
+ tasks: list[Task], by_assignee: str, by_creator: str
+) -> list[Task]:
+ """Filters a list of tasks looking for a given creator or assignee."""
+ filtered_tasks = []
+ for t in tasks:
+ if by_creator:
+ creator_email = t.creator.individual.email
+ if creator_email == by_creator:
+ filtered_tasks.append(t)
+ # avoid duplicating the task when the creator is also an assignee
+ continue
+
+ if by_assignee:
+ assignee_emails = [a.individual.email for a in t.assignees]
+ if by_assignee in assignee_emails:
+ filtered_tasks.append(t)
+
+ return filtered_tasks
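Because `filter_tasks_by_assignee_and_creator` is a pure function, its dedup behavior is easy to exercise with stub models. A sketch using hypothetical dataclasses in place of the ORM objects:

```python
from dataclasses import dataclass

# Stub models standing in for dispatch's Task/Participant ORM objects,
# just enough to exercise the filter logic above.
@dataclass
class Individual:
    email: str

@dataclass
class Person:
    individual: Individual

@dataclass
class Task:
    creator: Person
    assignees: list

def filter_tasks(tasks, by_assignee, by_creator):
    filtered = []
    for t in tasks:
        if by_creator and t.creator.individual.email == by_creator:
            filtered.append(t)
            continue  # avoid double-append when creator is also an assignee
        if by_assignee and by_assignee in [a.individual.email for a in t.assignees]:
            filtered.append(t)
    return filtered

me = Person(Individual("me@example.com"))
other = Person(Individual("other@example.com"))
tasks = [Task(creator=me, assignees=[other]), Task(creator=other, assignees=[me])]
print(len(filter_tasks(tasks, "me@example.com", "me@example.com")))  # 2
```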
+
+
+def handle_list_tasks_command(
+ ack: Ack,
+ body: dict,
+ payload: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+) -> None:
+ """Handles the list tasks command."""
+ ack()
+
+ tasks = task_service.get_all_by_incident_id(
+ db_session=db_session,
+ incident_id=int(context["subject"].id),
+ )
+
+ caller_only = False
+ email = None
+ if body["command"] == context["config"].slack_command_list_my_tasks:
+ caller_only = True
+
+ if caller_only:
+ email = (client.users_info(user=payload["user_id"]))["user"]["profile"]["email"]
+ tasks = filter_tasks_by_assignee_and_creator(tasks, email, email)
+
+ draw_task_modal(
+ channel_id=context.channel_id,
+ client=client,
+ first_open=True,
+ tasks=tasks,
+ view_id=body["trigger_id"],
+ )
+
+
+def draw_task_message(
+ channel_id: str, client: WebClient, first_open: bool, task: Task, thread_id: str
+):
+ """Draws a task message in a Slack channel.
+
+ Args:
+ channel_id (str): The channel id.
+ client (WebClient): The Slack client.
+ first_open (bool): Whether this is the first time the message is being drawn.
+ task (Task): The task to draw.
+ thread_id (str): The Slack thread timestamp of the task message.
+
+ Overwrites the existing message if it already exists.
+ """
+ button_text = "Resolve" if task.status == TaskStatus.open else "Re-open"
+ action_type = "resolve" if task.status == TaskStatus.open else "reopen"
+
+ # If this is the first time the message is being drawn, we post a new message.
+ if first_open:
+ result = client.chat_postMessage(
+ text=f"*<{task.creator.individual.weblink}|{task.creator.individual.name}>* created a new task.",
+ channel=channel_id,
+ )
+ thread_id = result.data.get("ts")
+
+ button_metadata = TaskMetadata(
+ type=IncidentSubjects.incident,
+ action_type=action_type,
+ organization_slug=task.project.organization.slug,
+ id=task.incident.id,
+ task_id=task.id,
+ project_id=task.project.id,
+ resource_id=task.resource_id,
+ channel_id=channel_id,
+ thread_id=thread_id,
+ ).json()
+
+ assignees = [f"<{a.individual.weblink}|{a.individual.name}>" for a in task.assignees]
+ blocks = {
+ "type": "section",
+ "text": {
+ "type": "mrkdwn",
+ "text": f"*<{task.creator.individual.weblink}|{task.creator.individual.name}>* created a new task.",
+ },
+ "fields": [
+ {"type": "mrkdwn", "text": "*Assignees:*"},
+ {"type": "mrkdwn", "text": "*Description:*"},
+ {
+ "type": "mrkdwn",
+ "text": ", ".join(assignees),
+ },
+ {"type": "plain_text", "text": task.description},
+ ],
+ "accessory": {
+ "type": "button",
+ "text": {"type": "plain_text", "text": button_text},
+ "style": "primary",
+ "value": button_metadata,
+ "action_id": TaskNotificationActionIds.update_status,
+ },
+ }
+
+ # Update the message with the task details and resolve/re-open button.
+ client.chat_update(channel=channel_id, ts=thread_id, blocks=[blocks])
+
+
+def draw_task_modal(
+ channel_id: str,
+ client: WebClient,
+ first_open: bool,
+ tasks: list[Task],
+ view_id: str,
+) -> None:
+ """Builds and draws the list tasks modal on first open and after each button click."""
+ blocks = []
+
+ for status in TaskStatus:
+ blocks.append(Section(text=f"*{status} Incident Tasks*"))
+ button_text = "Resolve" if status == TaskStatus.open else "Re-open"
+ action_type = "resolve" if status == TaskStatus.open else "reopen"
+
+ tasks_for_status = [task for task in tasks if task.status == status]
+
+ if not tasks_for_status:
+ blocks.append(Section(text="No tasks."))
+
+ for idx, task in enumerate(tasks_for_status, 1):
+ assignees = [f"<{a.individual.weblink}|{a.individual.name}>" for a in task.assignees]
+
+ button_metadata = TaskMetadata(
+ type=IncidentSubjects.incident,
+ action_type=action_type,
+ organization_slug=task.project.organization.slug,
+ id=task.incident.id,
+ task_id=task.id,
+ project_id=task.project.id,
+ resource_id=task.resource_id,
+ channel_id=channel_id,
+ ).json()
+
+ blocks.append(
+ Section(
+ fields=[
+ (
+ f"*Description:* \n <{task.weblink}|{task.description}>"
+ if task.weblink
+ else f"*Description:* \n {task.description}"
+ ),
+ f"*Creator:* \n <{task.creator.individual.weblink}|{task.creator.individual.name}>",
+ f"*Assignees:* \n {', '.join(assignees)}",
+ ],
+ accessory=Button(
+ text=button_text,
+ value=button_metadata,
+ action_id=TaskNotificationActionIds.update_status,
+ ),
+ )
+ )
+ # Don't add a divider if we are at the last task
+ if idx != len(tasks_for_status):
+ blocks.extend([Divider()])
+
+ modal = Modal(
+ title="Incident Tasks",
+ blocks=blocks,
+ close="Close",
+ ).build()
+
+ if first_open is True:
+ client.views_open(trigger_id=view_id, view=modal)
+ else:
+ client.views_update(view_id=view_id, view=modal)
+
+
+# EVENTS
+
+
+def handle_not_configured_reaction_event(
+ ack: Ack, client: Any, context: BoltContext, payload: Any, db_session: Session
+) -> None:
+ """Ignores reaction_added events for reactions that are not configured and mapped to a handler function."""
+ ack()
+
+
+def get_user_name_from_id(client: Any, user_id: str) -> str:
+ """Returns the user's name given their user ID."""
+ try:
+ user = client.users_info(user=user_id)
+ return user["user"]["profile"]["real_name"]
+ except SlackApiError:
+ # if can't find user, just return the original text
+ return user_id
+
+
+def replace_slack_users_in_message(client: Any, message: str) -> str:
+ """Replaces slack user ids in a message with their names."""
+ return re.sub(r"<@([^>]+)>", lambda x: f"@{get_user_name_from_id(client, x.group(1))}", message)
+
+
+def create_read_in_summary_blocks(summary: ReadInSummary) -> list:
+ """Creates Slack blocks from a structured read-in summary."""
+ blocks = []
+
+ # Add AI disclaimer at the top
+ blocks.append(
+ Context(
+ elements=[
+ MarkdownText(
+ text=":sparkles: *This entire block is AI-generated and may contain errors or inaccuracies. Please verify the information before relying on it.*"
+ )
+ ]
+ ).build()
+ )
+
+ # Add AI-generated summary if available
+ if summary.summary:
+ blocks.append(
+ Section(text=":magic_wand: *AI-Generated Summary*\n{0}".format(summary.summary)).build()
+ )
+
+ # Add timeline events if available
+ if summary.timeline:
+ timeline_text = "\n".join([f"• {event}" for event in summary.timeline])
+ blocks.append(
+ Section(
+ text=":alarm_clock: *Timeline* _(times in UTC)_\n{0}".format(timeline_text)
+ ).build()
+ )
+
+ # Add actions taken if available
+ if summary.actions_taken:
+ actions_text = "\n".join([f"• {action}" for action in summary.actions_taken])
+ blocks.append(
+ Section(text=":white_check_mark: *Actions Taken*\n{0}".format(actions_text)).build()
+ )
+
+ # Add current status if available
+ if summary.current_status:
+ blocks.append(
+ Section(text=":bar_chart: *Current Status*\n{0}".format(summary.current_status)).build()
+ )
+
+ return blocks
+
+
+def handle_timeline_added_event(
+ ack: Ack, client: Any, context: BoltContext, payload: Any, db_session: Session
+) -> None:
+ """Handles an event where the configured timeline reaction is added to a message for incidents or cases."""
+ ack()
+
+ conversation_id = context["channel_id"]
+ message_ts = payload["item"]["ts"]
+ message_ts_utc = datetime.utcfromtimestamp(float(message_ts))
+
+ response = dispatch_slack_service.list_conversation_messages(
+ client, conversation_id, latest=message_ts, limit=1, inclusive=1
+ )
+ message_text = replace_slack_users_in_message(client, response["messages"][0]["text"])
+ message_sender_id = response["messages"][0]["user"]
+
+ subject_type = context["subject"].type
+ individual = None
+ source = "Slack message"
+ event_id = None
+
+ if subject_type == IncidentSubjects.incident:
+ incident = incident_service.get(
+ db_session=db_session, incident_id=int(context["subject"].id)
+ )
+ try:
+ message_sender_email = get_user_email(client=client, user_id=message_sender_id)
+ if message_sender_email:
+ individual = individual_service.get_by_email_and_project(
+ db_session=db_session,
+ email=message_sender_email,
+ project_id=incident.project.id,
+ )
+ except Exception as e:
+ log.error(f"Error getting user email: {e}")
+ individual = None
+ if not individual:
+ if bot_user_id := context.get("bot_user_id"):
+ try:
+ bot = dispatch_slack_service.get_user_info_by_id(client, bot_user_id)
+ bot_name = bot["profile"]["real_name"]
+ source = f"Slack message from {bot_name}"
+ except Exception as e:
+ log.error(f"Error getting bot info: {e}")
+ else:
+ source = f"Slack message from {individual.name}"
+ event = event_service.log_incident_event(
+ db_session=db_session,
+ source=source,
+ description=message_text,
+ incident_id=int(context["subject"].id),
+ individual_id=individual.id if individual else None,
+ started_at=message_ts_utc,
+ type=EventType.imported_message,
+ owner=individual.name if individual else None,
+ )
+ db_session.commit()
+ event_id = event.id
+ log.info(f"Logged incident event with ID: {event_id}")
+ elif subject_type == CaseSubjects.case:
+ case = case_service.get(db_session=db_session, case_id=int(context["subject"].id))
+ try:
+ message_sender_email = dispatch_slack_service.get_user_email(
+ client=client, user_id=message_sender_id
+ )
+ if message_sender_email:
+ individual = individual_service.get_by_email_and_project(
+ db_session=db_session,
+ email=message_sender_email,
+ project_id=case.project.id,
+ )
+ except Exception as e:
+ log.error(f"Error getting user email: {e}")
+ individual = None
+ if not individual:
+ if bot_user_id := context.get("bot_user_id"):
+ try:
+ bot = dispatch_slack_service.get_user_info_by_id(client, bot_user_id)
+ bot_name = bot["profile"]["real_name"]
+ source = f"Slack message from {bot_name}"
+ except Exception as e:
+ log.error(f"Error getting bot info: {e}")
+ else:
+ source = f"Slack message from {individual.name}"
+ dispatch_user_id = None
+ if individual:
+ dispatch_user = auth_service.get_by_email(db_session=db_session, email=individual.email)
+ dispatch_user_id = dispatch_user.id if dispatch_user else None
+ event = event_service.log_case_event(
+ db_session=db_session,
+ source=source,
+ description=message_text,
+ case_id=int(context["subject"].id),
+ dispatch_user_id=dispatch_user_id,
+ started_at=message_ts_utc,
+ type=EventType.imported_message,
+ owner=individual.name if individual else "Unknown",
+ )
+ db_session.commit()
+ event_id = event.id
+ log.info(f"Logged case event with ID: {event_id}")
+ else:
+ log.info(f"TIMELINE HANDLER: Unknown subject type: {subject_type}")
+
+
+@message_dispatcher.add(
+ subject=IncidentSubjects.incident, exclude={"subtype": ["channel_join", "channel_leave"]}
+) # we ignore channel join and leave messages
+def handle_participant_role_activity(
+ ack: Ack, db_session: Session, context: BoltContext, user: DispatchUser
+) -> None:
+ """
+ Increments the participant role's activity counter and assesses whether the
+ participant's role should change based on that activity, changing it if needed.
+ """
+ ack()
+ participant = participant_service.get_by_incident_id_and_email(
+ db_session=db_session, incident_id=int(context["subject"].id), email=user.email
+ )
+
+ if participant:
+ for participant_role in participant.active_roles:
+ participant_role.activity += 1
+
+ # re-assign role once threshold is reached
+ if participant_role.role == ParticipantRoleType.observer:
+ if participant_role.activity >= 3: # three messages sent to the incident channel
+ # we change the participant's role to the participant one
+ participant_role_service.renounce_role(
+ db_session=db_session, participant_role=participant_role
+ )
+ participant_role_service.add_role(
+ db_session=db_session,
+ participant_id=participant.id,
+ participant_role=ParticipantRoleType.participant,
+ )
+
+ # we log the event
+ event_service.log_incident_event(
+ db_session=db_session,
+ source="Slack Plugin - Conversation Management",
+ description=(
+ f"{participant.individual.name}'s role changed from {participant_role.role} to "
+ f"{ParticipantRoleType.participant} due to activity in the incident channel"
+ ),
+ incident_id=int(context["subject"].id),
+ type=EventType.participant_updated,
+ )
+
+ db_session.commit()
+
+
+@message_dispatcher.add(
+ subject=IncidentSubjects.incident, exclude={"subtype": ["channel_join", "group_join"]}
+) # we ignore user channel and group join messages
+def handle_after_hours_message(
+ ack: Ack,
+ context: BoltContext,
+ client: WebClient,
+ db_session: Session,
+ payload: dict,
+ user: DispatchUser,
+) -> None:
+ """Notifies the user that this incident is currently in after hours mode."""
+ ack()
+
+ incident = incident_service.get(db_session=db_session, incident_id=int(context["subject"].id))
+ owner_email = incident.commander.individual.email
+ participant = participant_service.get_by_incident_id_and_email(
+ db_session=db_session, incident_id=int(context["subject"].id), email=user.email
+ )
+
+ # get incident priority settings and if delayed message warning is disabled, log and return
+ incident_priority_data = incident_priority_service.get(
+ db_session=db_session, incident_priority_id=incident.incident_priority_id
+ )
+
+ if incident_priority_data.disable_delayed_message_warning:
+ log.debug("delayed messaging is disabled, not sending a warning")
+ return
+
+ # handle no participant found
+ if not participant:
+ log.warning(
+ f"Participant not found for {user.email} in incident {incident.id}. Skipping after hours notification."
+ )
+ return
+
+ # get their timezone from slack
+ try:
+ owner_tz = (dispatch_slack_service.get_user_info_by_email(client, email=owner_email))["tz"]
+ except SlackApiError as e:
+ if e.response["error"] == SlackAPIErrorCode.USERS_NOT_FOUND:
+ e.add_note(
+ "This error usually indicates that the incident commander's Slack account is deactivated."
+ )
+
+ log.warning(f"Failed to fetch timezone from Slack API: {e}")
+ owner_tz = (
+ "UTC" # set a default timezone value (change this to your preferred default timezone)
+ )
+
+ message = f"Responses may be delayed. The current incident priority is *{incident.incident_priority.name}* and your message was sent outside of the Incident Commander's working hours (Weekdays, 9am-5pm, {owner_tz} timezone)."
+
+ now = datetime.now(pytz.timezone(owner_tz))
+ is_business_hours = now.weekday() not in [5, 6] and 9 <= now.hour < 17
+
+ if not is_business_hours:
+ if not participant.after_hours_notification:
+ participant.after_hours_notification = True
+ db_session.add(participant)
+ db_session.commit()
+ client.chat_postEphemeral(
+ text=message,
+ channel=payload["channel"],
+ user=payload["user"],
+ )
+
+
+@message_dispatcher.add(subject=IncidentSubjects.incident)
+def handle_thread_creation(
+ ack: Ack, client: WebClient, payload: dict, context: BoltContext, request: BoltRequest
+) -> None:
+ """Sends the user an ephemeral message if they use threads."""
+ ack()
+
+ if not context["config"].ban_threads:
+ return
+
+ if payload.get("thread_ts") and not is_bot(request):
+ message = "Please refrain from using threads in incident channels. Threads make it harder for incident participants to maintain context."
+ client.chat_postEphemeral(
+ text=message,
+ channel=payload["channel"],
+ thread_ts=payload["thread_ts"],
+ user=payload["user"],
+ )
+
+
+@message_dispatcher.add(subject=IncidentSubjects.incident)
+def handle_message_monitor(
+ ack: Ack,
+ payload: dict,
+ context: BoltContext,
+ client: WebClient,
+ db_session: Session,
+) -> None:
+ """Looks for strings that are available for monitoring (e.g. links)."""
+ ack()
+
+ incident = incident_service.get(db_session=db_session, incident_id=int(context["subject"].id))
+ project_id = incident.project.id
+
+ plugins = plugin_service.get_active_instances(
+ db_session=db_session, project_id=project_id, plugin_type="monitor"
+ )
+
+ for p in plugins:
+ for matcher in p.instance.get_matchers():
+ for match in matcher.finditer(payload["text"]):
+ match_data = match.groupdict()
+ monitor = monitor_service.get_by_weblink(
+ db_session=db_session, weblink=match_data["weblink"]
+ )
+
+ # silence ignored matches
+ if monitor:
+ continue
+
+ current_status = p.instance.get_match_status(match_data)
+ if current_status:
+ status_text = ""
+ for k, v in current_status.items():
+ status_text += f"*{k.title()}*:\n{v.title()}\n"
+
+ button_metadata = MonitorMetadata(
+ type=IncidentSubjects.incident,
+ organization_slug=incident.project.organization.slug,
+ id=incident.id,
+ plugin_instance_id=p.id,
+ project_id=incident.project.id,
+ channel_id=context["channel_id"],
+ weblink=match_data["weblink"],
+ ).json()
+
+ blocks = [
+ Section(
+ text=f"Hi! Dispatch is able to monitor the status of the following resource: \n {match_data['weblink']} \n\n Would you like to be notified about changes in its status in the incident channel?"
+ ),
+ Section(text=status_text),
+ Actions(
+ block_id=LinkMonitorBlockIds.monitor,
+ elements=[
+ Button(
+ text="Monitor",
+ action_id=LinkMonitorActionIds.monitor,
+ style="primary",
+ value=button_metadata,
+ ),
+ Button(
+ text="Ignore",
+ action_id=LinkMonitorActionIds.ignore,
+ style="primary",
+ value=button_metadata,
+ ),
+ ],
+ ),
+ ]
+ blocks = Message(blocks=blocks).build()["blocks"]
+ client.chat_postEphemeral(
+ text="Link Monitor",
+ channel=payload["channel"],
+ thread_ts=payload.get("thread_ts"),
+ blocks=blocks,
+ user=payload["user"],
+ )
+
+
+@app.event(
+ "member_joined_channel",
+ middleware=[
+ message_context_middleware,
+ user_middleware,
+ configuration_middleware,
+ ],
+)
+def handle_member_joined_channel(
+ ack: Ack,
+ user: DispatchUser,
+ body: dict,
+ client: WebClient,
+ db_session: Session,
+ context: BoltContext,
+) -> None:
+ """Handles the member_joined_channel Slack event."""
+ ack()
+
+ if not user:
+ raise EventError(
+ "Unable to handle member_joined_channel Slack event. Dispatch user unknown."
+ )
+
+ # sleep for a second to allow the participant to be added to the incident
+ time.sleep(1)
+
+ generate_read_in_summary = False
+ subject_type = context["subject"].type
+ project = None
+
+ if subject_type == IncidentSubjects.incident:
+ participant = incident_flows.incident_add_or_reactivate_participant_flow(
+ user_email=user.email, incident_id=int(context["subject"].id), db_session=db_session
+ )
+
+ if not participant:
+ # Participant is already in the incident channel.
+ return
+
+ participant.user_conversation_id = context["user_id"]
+
+ incident = incident_service.get(
+ db_session=db_session, incident_id=int(context["subject"].id)
+ )
+ project = incident.project
+ generate_read_in_summary = getattr(
+ incident.incident_type, "generate_read_in_summary", False
+ )
+ if incident.visibility == Visibility.restricted:
+ generate_read_in_summary = False
+
+ # If the user was invited, the message will include an inviter property containing the user ID of the inviting user.
+ # The property will be absent when a user manually joins a channel, or a user is added by default (e.g. #general channel).
+ inviter = body.get("event", {}).get("inviter", None)
+ inviter_is_user = (
+ dispatch_slack_service.is_user(context["config"], inviter) if inviter else None
+ )
+
+ if inviter and inviter_is_user:
+ # Participant is added into the incident channel using an @ message or /invite command.
+ inviter_email = get_user_email(client=client, user_id=inviter)
+ if inviter_email:
+ added_by_participant = participant_service.get_by_incident_id_and_email(
+ db_session=db_session,
+ incident_id=int(context["subject"].id),
+ email=inviter_email,
+ )
+ participant.added_by = added_by_participant
+
+ if not participant.added_by:
+ # User joins via the `join` button on Web Application or Slack.
+ # We default to the incident commander when we don't know who added the user or the user is the Dispatch bot.
+ incident = incident_service.get(
+ db_session=db_session, incident_id=int(context["subject"].id)
+ )
+ participant.added_by = incident.commander
+
+ if participant.added_by:
+ # Message text when someone @'s a user is not available in body, use generic added by reason
+ participant.added_reason = (
+ f"Participant added by {participant.added_by.individual.name}"
+ )
+ else:
+ # We couldn't find a user to attribute the addition to, add generic reason
+ participant.added_reason = "Participant added by Dispatch"
+
+ db_session.add(participant)
+ db_session.commit()
+
+ if subject_type == CaseSubjects.case:
+ subject_type = "case"
+ case = case_service.get(db_session=db_session, case_id=int(context["subject"].id))
+
+ if not case.dedicated_channel:
+ return
+
+ participant = case_flows.case_add_or_reactivate_participant_flow(
+ user_email=user.email,
+ case_id=int(context["subject"].id),
+ db_session=db_session,
+ )
+
+ if not participant:
+ # Participant is already in the case channel.
+ return
+
+ project = case.project
+ generate_read_in_summary = getattr(case.case_type, "generate_read_in_summary", False)
+ if case.visibility == Visibility.restricted:
+ generate_read_in_summary = False
+ if not case.dedicated_channel:
+ generate_read_in_summary = False
+
+ participant.user_conversation_id = context["user_id"]
+
+ # If the user was invited, the message will include an inviter property containing the user ID of the inviting user.
+ # The property will be absent when a user manually joins a channel, or a user is added by default (e.g. #general channel).
+ inviter = body.get("event", {}).get("inviter", None)
+ inviter_is_user = (
+ dispatch_slack_service.is_user(context["config"], inviter) if inviter else None
+ )
+
+ if inviter and inviter_is_user:
+ # Participant is added into the incident channel using an @ message or /invite command.
+ inviter_email = get_user_email(client=client, user_id=inviter)
+ if inviter_email:
+ added_by_participant = participant_service.get_by_case_id_and_email(
+ db_session=db_session,
+ case_id=int(context["subject"].id),
+ email=inviter_email,
+ )
+ participant.added_by = added_by_participant
+
+ if not participant.added_by:
+ # User joins via the `join` button on Web Application or Slack.
+ # We default to the incident commander when we don't know who added the user or the user is the Dispatch bot.
+ participant.added_by = case.assignee
+
+ if participant.added_by:
+ # Message text when someone @'s a user is not available in body, use generic added by reason
+ participant.added_reason = (
+ f"Participant added by {participant.added_by.individual.name}"
+ )
+ else:
+ # We couldn't find a user to attribute the addition to, add generic reason
+ participant.added_reason = "Participant added by Dispatch"
+
+ db_session.add(participant)
+ db_session.commit()
+
+ if not generate_read_in_summary:
+ return
+
+ # Generate read-in summary for user
+ summary_response = ai_service.generate_read_in_summary(
+ db_session=db_session,
+ subject=context["subject"],
+ project=project,
+ channel_id=context["channel_id"],
+ important_reaction=context["config"].timeline_event_reaction,
+ participant_email=user.email,
+ )
+
+ if summary_response and summary_response.summary:
+ blocks = create_read_in_summary_blocks(summary_response.summary)
+ blocks.append(
+ Context(
+ elements=[
+ MarkdownText(
+ text="NOTE: The block above was AI-generated and may contain errors or inaccuracies. Please verify the information before relying on it."
+ )
+ ]
+ ).build()
+ )
+ dispatch_slack_service.send_ephemeral_message(
+ client=client,
+ conversation_id=context["channel_id"],
+ user_id=context["user_id"],
+ text=f"Here is a summary of what has happened so far in this {subject_type}",
+ blocks=blocks,
+ )
+ elif summary_response and summary_response.error_message:
+ # Log the error but don't show it to the user to avoid confusion
+ log.warning(f"Failed to generate read-in summary: {summary_response.error_message}")
+
+
+@app.event(
+ "member_left_channel",
+ middleware=[
+ message_context_middleware,
+ user_middleware,
+ ],
+)
+def handle_member_left_channel(
+ ack: Ack, context: BoltContext, db_session: Session, user: DispatchUser
+) -> None:
+ ack()
+
+ if context["subject"].type == IncidentSubjects.incident:
+ incident_flows.incident_remove_participant_flow(
+ user.email, context["subject"].id, db_session=db_session
+ )
+
+ if context["subject"].type == CaseSubjects.case:
+ case = case_service.get(db_session=db_session, case_id=int(context["subject"].id))
+
+ if not case.dedicated_channel:
+ return
+
+ case_flows.case_remove_participant_flow(
+ user_email=user.email,
+ case_id=int(context["subject"].id),
+ db_session=db_session,
+ )
+
+
+# MODALS
+
+
+def handle_add_timeline_event_command(
+ ack: Ack, body: dict, client: WebClient, context: BoltContext
+) -> None:
+ """Handles the add timeline event command."""
+ ack()
+ description = None
+ date = ""
+ time = ""
+ if re.match(".*DESC\\s*(.+?)(?: DATE|$|TIME)", body["text"], re.IGNORECASE):
+ description = (
+ re.match(".*DESC\\s*(.+?)(?: DATE|$|TIME)", body["text"], re.IGNORECASE).group(1)
+ ).strip()
+ if re.match(
+ ".*DATE\\s*(\\d{4}\\-\\d{2}\\-\\d{2})(?: TIME|$|DESC)", body["text"], re.IGNORECASE
+ ):
+ date = (
+ re.match(
+ ".*DATE\\s*(\\d{4}\\-\\d{2}\\-\\d{2})(?: TIME|$|DESC)", body["text"], re.IGNORECASE
+ ).group(1)
+ ).strip()
+ if re.match(
+ ".*TIME\\s*(([01]?[0-9]|2[0-3]):[0-5][0-9])(?: |DATE|$|DESC)", body["text"], re.IGNORECASE
+ ):
+ time = (
+ re.match(
+ ".*TIME\\s*(([01]?[0-9]|2[0-3]):[0-5][0-9])(?: |DATE|$|DESC)",
+ body["text"],
+ re.IGNORECASE,
+ ).group(1)
+ ).strip()
+
+ blocks = [
+ Context(
+ elements=[
+ MarkdownText(text="Use this form to add an event to the incident's timeline.")
+ ]
+ ),
+ description_input(initial_value=description),
+ ]
+
+ blocks.extend(datetime_picker_block(initial_option=date + "|" + time))
+
+ modal = Modal(
+ title="Add Timeline Event",
+ blocks=blocks,
+ submit="Add",
+ close="Close",
+ callback_id=AddTimelineEventActions.submit,
+ private_metadata=context["subject"].json(),
+ ).build()
+
+ client.views_open(trigger_id=body["trigger_id"], view=modal)
+
+
+def ack_add_timeline_submission_event(ack: Ack) -> None:
+ """Handles the add timeline submission event acknowledgement."""
+ modal = Modal(
+ title="Add Timeline Event", close="Close", blocks=[Section(text="Adding timeline event...")]
+ ).build()
+ ack(response_action="update", view=modal)
+
+
+@app.view(
+ AddTimelineEventActions.submit,
+ middleware=[action_context_middleware, db_middleware, user_middleware, modal_submit_middleware],
+)
+def handle_add_timeline_submission_event(
+ ack: Ack,
+ body: dict,
+ user: DispatchUser,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+ form_data: dict,
+):
+ """Handles the add timeline submission event."""
+ ack_add_timeline_submission_event(ack=ack)
+
+ event_date = form_data.get(DefaultBlockIds.date_picker_input)
+ event_hour = form_data.get(DefaultBlockIds.hour_picker_input)["value"]
+ event_minute = form_data.get(DefaultBlockIds.minute_picker_input)["value"]
+ event_timezone_selection = form_data.get(DefaultBlockIds.timezone_picker_input)["value"]
+ event_description = form_data.get(DefaultBlockIds.description_input)
+
+ participant = participant_service.get_by_incident_id_and_email(
+ db_session=db_session, incident_id=int(context["subject"].id), email=user.email
+ )
+
+ event_timezone = event_timezone_selection
+ if event_timezone_selection == TimezoneOptions.local:
+ participant_profile = get_user_profile_by_email(client, user.email)
+ if participant_profile.get("tz"):
+ event_timezone = participant_profile.get("tz")
+
+ event_dt = datetime.fromisoformat(f"{event_date}T{event_hour}:{event_minute}")
+ event_dt_utc = pytz.timezone(event_timezone).localize(event_dt).astimezone(pytz.utc)
+
+ event_service.log_incident_event(
+ db_session=db_session,
+ source=f"Slack message from {participant.individual.name}",
+ started_at=event_dt_utc,
+ description=event_description,
+ incident_id=int(context["subject"].id),
+ individual_id=participant.individual.id,
+ type=EventType.imported_message,
+ owner=participant.individual.name,
+ )
+
+ send_success_modal(
+ client=client,
+ view_id=body["view"]["id"],
+ title="Add Timeline Event",
+ message="Timeline event added successfully.",
+ )
+
+
+def handle_update_participant_command(
+ ack: Ack,
+ body: dict,
+ context: BoltContext,
+ client: WebClient,
+) -> None:
+ """Handles the update participant command."""
+ ack()
+
+ if context["subject"].type == CaseSubjects.case:
+ raise CommandError("Command is not currently available for cases.")
+
+ incident = incident_service.get(
+ db_session=context["db_session"], incident_id=int(context["subject"].id)
+ )
+
+ blocks = [
+ Context(
+ elements=[
+ MarkdownText(
+ text="Use this form to update the reason why the participant was added to the incident."
+ )
+ ]
+ ),
+ participant_select(
+ block_id=UpdateParticipantBlockIds.participant,
+ participants=incident.participants,
+ ),
+ Input(
+ element=PlainTextInput(placeholder="Reason for addition"),
+ label="Reason added",
+ block_id=UpdateParticipantBlockIds.reason,
+ ),
+ ]
+
+ modal = Modal(
+ title="Update Participant",
+ blocks=blocks,
+ submit="Update",
+ close="Cancel",
+ callback_id=UpdateParticipantActions.submit,
+ private_metadata=context["subject"].json(),
+ ).build()
+
+ client.views_open(trigger_id=body["trigger_id"], view=modal)
+
+
+def ack_update_participant_submission_event(ack: Ack):
+ """Handles the update participant submission event acknowledgement."""
+ modal = Modal(
+ title="Update Participant", close="Close", blocks=[Section(text="Updating participant...")]
+ ).build()
+ ack(response_action="update", view=modal)
+
+
+@app.view(
+ UpdateParticipantActions.submit,
+ middleware=[action_context_middleware, db_middleware, user_middleware, modal_submit_middleware],
+)
+def handle_update_participant_submission_event(
+ body: dict,
+ ack: Ack,
+ client: WebClient,
+ db_session: Session,
+ form_data: dict,
+) -> None:
+ """Handles the update participant submission event."""
+ ack_update_participant_submission_event(ack=ack)
+
+ added_reason = form_data.get(UpdateParticipantBlockIds.reason)
+ participant_id = int(form_data.get(UpdateParticipantBlockIds.participant)["value"])
+ selected_participant = participant_service.get(
+ db_session=db_session, participant_id=participant_id
+ )
+ participant_service.update(
+ db_session=db_session,
+ participant=selected_participant,
+ participant_in=ParticipantUpdate(added_reason=added_reason),
+ )
+
+ send_success_modal(
+ client=client,
+ view_id=body["view"]["id"],
+ title="Update Participant",
+ message="Participant updated successfully.",
+ )
+
+
+def handle_update_notifications_group_command(
+ ack: Ack, body: dict, context: BoltContext, client: WebClient, db_session: Session
+) -> None:
+ """Handles the update notification group command."""
+ ack()
+
+ # TODO handle cases
+ if context["subject"].type == CaseSubjects.case:
+ raise CommandError("Command is not currently available for cases.")
+
+ incident = incident_service.get(db_session=db_session, incident_id=int(context["subject"].id))
+
+ group_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="participant-group"
+ )
+ if not group_plugin:
+ raise CommandError(
+ "Group plugin is not enabled. Unable to update notifications group.",
+ )
+
+ if not incident.notifications_group:
+ raise CommandError("No notification group available for this incident.")
+
+ members = group_plugin.instance.list(incident.notifications_group.email)
+ blocks = [
+ Context(
+ elements=[
+ MarkdownText(
+ text="Use this form to update the membership of the notifications group."
+ )
+ ]
+ ),
+ Input(
+ label="Members",
+ element=PlainTextInput(
+ initial_value=", ".join(members) if members else None,
+ multiline=True,
+ action_id=UpdateNotificationGroupActionIds.members,
+ ),
+ block_id=UpdateNotificationGroupBlockIds.members,
+ ),
+ Context(elements=[MarkdownText(text="Separate email addresses with commas")]),
+ ]
+
+ modal = Modal(
+ title="Update Group Members", # 24 Char Limit
+ blocks=blocks,
+ close="Cancel",
+ submit="Update",
+ callback_id=UpdateNotificationGroupActions.submit,
+ private_metadata=context["subject"].json(),
+ ).build()
+
+ client.views_open(trigger_id=body["trigger_id"], view=modal)
+
+
+def ack_update_notifications_group_submission_event(ack: Ack):
+ """Handles the update notifications group submission acknowledgement."""
+ modal = Modal(
+ title="Update Group Members",
+ close="Close",
+ blocks=[Section(text="Updating notifications group...")],
+ ).build()
+ ack(response_action="update", view=modal)
+
+
+@app.view(
+ UpdateNotificationGroupActions.submit,
+ middleware=[action_context_middleware, db_middleware, user_middleware, modal_submit_middleware],
+)
+def handle_update_notifications_group_submission_event(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+ form_data: dict,
+) -> None:
+ """Handles the update notifications group submission event."""
+ ack_update_notifications_group_submission_event(ack=ack)
+
+ if initial_value := body["view"]["blocks"][1]["element"].get("initial_value"):
+ current_members = initial_value.replace(" ", "").split(",")
+ else:
+ current_members = []
+
+ updated_members = (
+ form_data.get(UpdateNotificationGroupBlockIds.members).replace(" ", "").split(",")
+ )
+ members_added = list(set(updated_members) - set(current_members))
+ members_removed = list(set(current_members) - set(updated_members))
+
+ incident = incident_service.get(db_session=db_session, incident_id=int(context["subject"].id))
+
+ group_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="participant-group"
+ )
+
+ group_plugin.instance.add(incident.notifications_group.email, members_added)
+ group_plugin.instance.remove(incident.notifications_group.email, members_removed)
+
+ send_success_modal(
+ client=client,
+ view_id=body["view"]["id"],
+ title="Update Group Members",
+ message="Notification group members updated successfully.",
+ )
+
+
+def handle_assign_role_command(
+ ack: Ack, body: dict, context: BoltContext, client: WebClient
+) -> None:
+ """Handles the assign role command."""
+ ack()
+
+ roles = [
+ {"text": r.value, "value": r.value}
+ for r in ParticipantRoleType
+ if r != ParticipantRoleType.participant
+ ]
+
+ blocks = [
+ Context(
+ elements=[
+ MarkdownText(
+ text="Assign a role to a participant. Note: The participant will be invited to the incident channel if they are not yet a member."
+ )
+ ]
+ ),
+ Input(
+ block_id=AssignRoleBlockIds.user,
+ label="Participant",
+ element=UsersSelect(placeholder="Participant"),
+ ),
+ static_select_block(
+ placeholder="Select Role", label="Role", options=roles, block_id=AssignRoleBlockIds.role
+ ),
+ ]
+
+ modal = Modal(
+ title="Assign Role",
+ submit="Assign",
+ close="Cancel",
+ blocks=blocks,
+ callback_id=AssignRoleActions.submit,
+ private_metadata=context["subject"].json(),
+ ).build()
+ client.views_open(trigger_id=body["trigger_id"], view=modal)
+
+
+def ack_assign_role_submission_event(ack: Ack):
+ """Handles the assign role submission acknowledgement."""
+ modal = Modal(
+ title="Assign Role", close="Close", blocks=[Section(text="Assigning role...")]
+ ).build()
+ ack(response_action="update", view=modal)
+
+
+@app.view(
+ AssignRoleActions.submit,
+ middleware=[action_context_middleware, db_middleware, user_middleware, modal_submit_middleware],
+)
+def handle_assign_role_submission_event(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+ user: DispatchUser,
+ form_data: dict,
+) -> None:
+ """Handles the assign role submission."""
+ ack_assign_role_submission_event(ack=ack)
+ assignee_user_id = form_data[AssignRoleBlockIds.user]["value"]
+ assignee_role = form_data[AssignRoleBlockIds.role]["value"]
+ assignee_email = (
+ get_user_email(client=client, user_id=assignee_user_id) or "unknown@unknown.com"
+ )
+
+ # we assign the role
+ incident_flows.incident_assign_role_flow(
+ incident_id=int(context["subject"].id),
+ assigner_email=user.email,
+ assignee_email=assignee_email,
+ assignee_role=assignee_role,
+ db_session=db_session,
+ )
+
+    if assignee_role in (
+        ParticipantRoleType.reporter,
+        ParticipantRoleType.incident_commander,
+    ):
+ # we update the external ticket
+ ticket_flows.update_incident_ticket(
+ incident_id=int(context["subject"].id), db_session=db_session
+ )
+
+ send_success_modal(
+ client=client,
+ view_id=body["view"]["id"],
+ title="Assign Role",
+ message="Role assigned successfully.",
+ )
+
+
+def handle_create_task_command(
+ ack: Ack,
+    body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+) -> None:
+ """Displays a modal for task creation."""
+ ack()
+
+ initial_modal = Modal(
+ title="Create Task",
+ close="Close",
+ blocks=[Section(text="Opening a dialog to create a new incident task...")],
+ ).build()
+ response = client.views_open(trigger_id=body["trigger_id"], view=initial_modal)
+
+ if context["subject"].type == CaseSubjects.case:
+ modal = Modal(
+ title="Invalid Command",
+ close="Close",
+ blocks=[Section(text="Create Task command is not currently available for cases.")],
+ ).build()
+ return client.views_update(view_id=response.get("view").get("id"), view=modal)
+
+ participants = participant_service.get_all_by_incident_id(
+ db_session=db_session, incident_id=int(context["subject"].id)
+ )
+
+ incident = incident_service.get(db_session=db_session, incident_id=int(context["subject"].id))
+
+ contact_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="contact"
+ )
+ if not contact_plugin:
+ modal = Modal(
+ title="Plugin Not Enabled",
+ close="Close",
+ blocks=[Section(text="Contact plugin is not enabled. Unable to list participants.")],
+ ).build()
+ return client.views_update(view_id=response.get("view").get("id"), view=modal)
+
+ active_participants = [p for p in participants if p.active_roles]
+ participant_list = []
+
+ for participant in active_participants:
+ participant_email = participant.individual.email
+ participant_info = contact_plugin.instance.get(participant_email, db_session=db_session)
+ participant_name = participant_info.get("fullname", participant.individual.email)
+ participant_list.append({"text": participant_name, "value": participant_email})
+
+ blocks = [
+ static_select_block(
+ label="Assignee",
+ block_id=CreateTaskBlockIds.assignee_select,
+ placeholder="Select Assignee",
+ options=participant_list,
+ ),
+ Input(
+ label="Task Description",
+ element=PlainTextInput(placeholder="Task description", multiline=True),
+ block_id=CreateTaskBlockIds.description,
+ ),
+ ]
+
+ modal = Modal(
+ title="Create Task",
+ blocks=blocks,
+ submit="Create",
+ close="Close",
+ callback_id=CreateTaskActionIds.submit,
+ private_metadata=context["subject"].json(),
+ ).build()
+ return client.views_update(view_id=response.get("view").get("id"), view=modal)
+
+
+def ack_create_task_submission_event(ack: Ack) -> None:
+ """Handles task creation acknowledgment."""
+ modal = Modal(
+ title="Create Task", close="Close", blocks=[Section(text="Creating task...")]
+ ).build()
+ ack(response_action="update", view=modal)
+
+
+@app.view(
+ CreateTaskActionIds.submit,
+ middleware=[action_context_middleware, db_middleware, user_middleware, modal_submit_middleware],
+)
+def handle_create_task_submission_event(
+ ack: Ack,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+ form_data: dict,
+ user: DispatchUser,
+) -> None:
+ """Handles the create task submission."""
+ ack()
+
+    participant_email = form_data.get(CreateTaskBlockIds.assignee_select, {}).get("value", "")
+ assignee = participant_service.get_by_incident_id_and_email(
+ db_session=db_session, incident_id=int(context["subject"].id), email=participant_email
+ )
+ creator = participant_service.get_by_incident_id_and_email(
+ db_session=db_session, incident_id=int(context["subject"].id), email=user.email
+ )
+ incident = incident_service.get(db_session=db_session, incident_id=int(context["subject"].id))
+
+ task_in = TaskCreate(
+ assignees=[ParticipantUpdate.from_orm(assignee)],
+ creator=ParticipantUpdate.from_orm(creator),
+ description=form_data.get(CreateTaskBlockIds.description, ""),
+ incident=IncidentRead.from_orm(incident),
+ )
+ task = task_service.create(db_session=db_session, task_in=task_in)
+ draw_task_message(
+ channel_id=incident.conversation.channel_id,
+ client=client,
+ task=task,
+ first_open=True,
+ thread_id=None,
+ )
+
+
+def handle_engage_oncall_command(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+) -> None:
+ """Handles the engage oncall command."""
+ ack()
+
+ project_id = None
+ if context["subject"].type == CaseSubjects.case:
+ case = case_service.get(db_session=db_session, case_id=int(context["subject"].id))
+ if not case.dedicated_channel:
+ raise CommandError("Command is not currently available for threaded cases.")
+ project_id = case.project.id
+ else:
+ incident = incident_service.get(
+ db_session=db_session, incident_id=int(context["subject"].id)
+ )
+ project_id = incident.project.id
+
+ oncall_services = service_service.get_all_by_project_id_and_status(
+ db_session=db_session, project_id=project_id, is_active=True
+ )
+
+ if not oncall_services.count():
+ raise CommandError(
+ "No oncall services have been defined. You can define them in the Dispatch UI at /services."
+ )
+
+ services = [{"text": s.name, "value": s.external_id} for s in oncall_services]
+
+ blocks = [
+ static_select_block(
+ label="Service",
+ action_id=EngageOncallActionIds.service,
+ block_id=EngageOncallBlockIds.service,
+ placeholder="Select Service",
+ options=services,
+ ),
+ Input(
+ block_id=EngageOncallBlockIds.page,
+ label="Page",
+ element=Checkboxes(
+ options=[PlainOption(text="Page", value="Yes")],
+ action_id=EngageOncallActionIds.page,
+ ),
+ optional=True,
+ ),
+ ]
+
+ modal = Modal(
+ title="Engage Oncall",
+ blocks=blocks,
+ submit="Engage",
+ close="Close",
+ callback_id=EngageOncallActions.submit,
+ private_metadata=context["subject"].json(),
+ ).build()
+ client.views_open(trigger_id=body["trigger_id"], view=modal)
+
+
+def ack_engage_oncall_submission_event(ack: Ack) -> None:
+ """Handles engage oncall acknowledgment."""
+ modal = Modal(
+ title="Engage Oncall", close="Close", blocks=[Section(text="Engaging oncall...")]
+ ).build()
+ ack(response_action="update", view=modal)
+
+
+@app.view(
+ EngageOncallActions.submit,
+ middleware=[action_context_middleware, db_middleware, user_middleware, modal_submit_middleware],
+)
+def handle_engage_oncall_submission_event(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+ form_data: dict,
+ user: DispatchUser,
+) -> None:
+ """Handles the engage oncall submission.
+
+ Notes:
+ If the page checkbox is checked, form_data will contain the `engage-oncall-page` key. For example:
+
+ "engage-oncall-service": {
+ "name": "Security Incident Response Team (SIRT) - TEST",
+ "value": "1337WOW",
+ },
+ "engage-oncall-page": [{"name": "Page", "value": "Yes"}],
+
+ Otherwise, the `engage-oncall-page` key is omitted. For example:
+
+ "engage-oncall-service": {
+ "name": "Security Incident Response Team (SIRT)",
+ "value": "1337WOW",
+ }
+ """
+ ack_engage_oncall_submission_event(ack=ack)
+ oncall_service_external_id = form_data[EngageOncallBlockIds.service]["value"]
+ page_block = form_data.get(EngageOncallBlockIds.page)
+ page = page_block[0]["value"] if page_block else None # page_block[0]["value"] == "Yes"
+
+ oncall_individual, oncall_service = (
+ case_flows.case_engage_oncall_flow(
+ user_email=user.email,
+ case_id=int(context["subject"].id),
+ oncall_service_external_id=oncall_service_external_id,
+ page=page,
+ db_session=db_session,
+ )
+ if context["subject"].type == CaseSubjects.case
+ else incident_flows.incident_engage_oncall_flow(
+ user_email=user.email,
+ incident_id=int(context["subject"].id),
+ oncall_service_external_id=oncall_service_external_id,
+ page=page,
+ db_session=db_session,
+ )
+ )
+
+    if oncall_individual and oncall_service:
+        message = f"You have successfully engaged {oncall_individual.name} from the {oncall_service.name} oncall rotation."
+    elif oncall_service:
+        message = f"A member of {oncall_service.name} is already in the conversation."
+    else:
+        message = "Could not engage oncall. Oncall service plugin not enabled."
+
+ send_success_modal(
+ client=client,
+ view_id=body["view"]["id"],
+ title="Engagement",
+ message=message,
+ )
+
+
+def tactical_report_modal(
+ context: BoltContext,
+ conditions: str | None = None,
+ actions: str | None = None,
+ needs: str | None = None,
+ genai_loading: bool = False,
+):
+ """
+ Reusable skeleton for auto-populating the fields of the tactical report modal.
+ """
+ blocks = [
+ Input(
+ label="Conditions",
+ element=PlainTextInput(
+ placeholder="Current incident conditions", initial_value=conditions, multiline=True
+ ),
+ block_id=ReportTacticalBlockIds.conditions,
+ ),
+ Input(
+ label="Actions",
+ element=PlainTextInput(
+ placeholder="Current incident actions", initial_value=actions, multiline=True
+ ),
+ block_id=ReportTacticalBlockIds.actions,
+ ),
+ Input(
+ label="Needs",
+ element=PlainTextInput(
+ placeholder="Current incident needs", initial_value=needs, multiline=True
+ ),
+ block_id=ReportTacticalBlockIds.needs,
+ ),
+ ]
+
+ if genai_loading:
+ blocks.append(
+ Section(
+ text=MarkdownText(
+ text=":hourglass_flowing_sand: This may take a moment. Be sure to verify all information before relying on it!"
+ )
+ )
+ )
+ else:
+ blocks.append(
+ Actions(
+ elements=[
+ Button(
+ text=":sparkles: Draft with GenAI",
+ action_id=TACTICAL_REPORT_SLACK_ACTION,
+ style="primary",
+ value=context["subject"].json(),
+ )
+ ]
+ )
+ )
+
+ modal = Modal(
+ title="Tactical Report",
+ blocks=blocks,
+ submit="Create",
+ close="Close",
+ callback_id=ReportTacticalActions.submit,
+ private_metadata=context["subject"].json(),
+ ).build()
+
+ return modal
+
+
+def handle_report_tactical_command(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+) -> None:
+ """Handles the report tactical command."""
+ ack()
+
+ if context["subject"].type == CaseSubjects.case:
+ raise CommandError("Command is not available outside of incident channels.")
+
+ # we load the most recent tactical report
+ tactical_report = report_service.get_most_recent_by_incident_id_and_type(
+ db_session=db_session,
+ incident_id=int(context["subject"].id),
+ report_type=ReportTypes.tactical_report,
+ )
+
+ conditions = actions = needs = None
+ if tactical_report:
+ conditions = tactical_report.details.get("conditions")
+ actions = tactical_report.details.get("actions")
+ needs = tactical_report.details.get("needs")
+
+ incident = incident_service.get(db_session=db_session, incident_id=int(context["subject"].id))
+    outstanding_actions = "" if actions is None else actions
+    open_tasks = [
+        "- " + task.description
+        for task in incident.tasks
+        if task.status != TaskStatus.resolved
+    ]
+    if open_tasks:
+        # add the header once, then list each unresolved task on its own line
+        outstanding_actions += "\n\nOutstanding Incident Tasks:\n" + "\n".join(open_tasks)
+
+    if outstanding_actions:
+        actions = outstanding_actions
+
+ modal = tactical_report_modal(context, conditions, actions, needs, genai_loading=False)
+
+ client.views_open(trigger_id=body["trigger_id"], view=modal)
+
+
+@app.action(
+ TACTICAL_REPORT_SLACK_ACTION,
+ middleware=[
+ button_context_middleware,
+ configuration_middleware,
+ db_middleware,
+ user_middleware,
+ ],
+)
+def handle_tactical_report_draft_with_genai(
+    ack: Ack, body: dict, client: WebClient, context: BoltContext, user: DispatchUser
+) -> None:
+    """Handles the draft-with-GenAI button click for the tactical report modal."""
+    ack()
+
+ client.views_update(
+ view_id=body["view"]["id"],
+ view=tactical_report_modal(
+ context,
+ conditions="Drafting...",
+ actions="Drafting...",
+ needs="Drafting...",
+ genai_loading=True,
+ ),
+ )
+
+ db_session = context["db_session"]
+    incident_id = int(context["subject"].id)
+
+ incident = incident_service.get(db_session=db_session, incident_id=incident_id)
+
+ if not incident:
+ log.error(
+ f"Unable to retrieve incident with id {incident_id} to generate tactical report via Slack"
+ )
+
+ client.views_update(
+ view_id=body["view"]["id"],
+ view=Modal(
+ title="Tactical Report",
+ blocks=[
+ Section(
+ text=MarkdownText(
+ text=f":exclamation: Unable to retrieve incident with id {incident_id}. Please contact your Dispatch admin."
+ )
+ )
+ ],
+ close="Close",
+ private_metadata=context["subject"].json(),
+ ).build(),
+ )
+ return
+
+ draft_report = ai_service.generate_tactical_report(
+ db_session=db_session,
+ project=incident.project,
+ incident=incident,
+ important_reaction=context["config"].timeline_event_reaction,
+ )
+
+ tactical_report = draft_report.tactical_report
+ if not tactical_report:
+ error_message = (
+ draft_report.error_message
+ if draft_report.error_message
+ else "Unexpected error encountered generating tactical report."
+ )
+ log.error(error_message)
+
+ client.views_update(
+ view_id=body["view"]["id"],
+ view=Modal(
+ title="Tactical Report",
+ blocks=[Section(text=MarkdownText(text=f":exclamation: {error_message}"))],
+ close="Close",
+ private_metadata=context["subject"].json(),
+ ).build(),
+ )
+ return
+
+ conditions, actions, needs = (
+ tactical_report.conditions,
+ tactical_report.actions,
+ tactical_report.needs,
+ )
+ actions = "- " + "\n- ".join(actions)
+ needs = "- " + "\n- ".join(needs)
+
+ client.views_update(
+ view_id=body["view"]["id"],
+ view=tactical_report_modal(
+ context=context,
+ conditions=conditions,
+ actions=actions,
+ needs=needs
+ + "\n\nThis report was generated with AI. Please verify all information before relying on it!",
+ genai_loading=False,
+ ),
+ )
+
+
+def ack_report_tactical_submission_event(ack: Ack) -> None:
+ """Handles report tactical acknowledgment."""
+ modal = Modal(
+ title="Report Tactical", close="Close", blocks=[Section(text="Creating tactical report...")]
+ ).build()
+ ack(response_action="update", view=modal)
+
+
+@app.view(
+ ReportTacticalActions.submit,
+ middleware=[action_context_middleware, db_middleware, user_middleware, modal_submit_middleware],
+)
+def handle_report_tactical_submission_event(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ form_data: dict,
+ user: DispatchUser,
+) -> None:
+    """Handles the report tactical submission."""
+ ack_report_tactical_submission_event(ack=ack)
+ tactical_report_in = TacticalReportCreate(
+ conditions=form_data[ReportTacticalBlockIds.conditions],
+ actions=form_data[ReportTacticalBlockIds.actions],
+ needs=form_data[ReportTacticalBlockIds.needs],
+ )
+
+ report_flows.create_tactical_report(
+ user_email=user.email,
+ incident_id=int(context["subject"].id),
+ tactical_report_in=tactical_report_in,
+ organization_slug=context["subject"].organization_slug,
+ )
+
+ send_success_modal(
+ client=client,
+ view_id=body["view"]["id"],
+ title="Tactical Report",
+ message="Tactical report successfully created.",
+ )
+
+
+def handle_report_executive_command(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+) -> None:
+ """Handles executive report command."""
+ ack()
+
+ if context["subject"].type == CaseSubjects.case:
+ raise CommandError("Command is not available outside of incident channels.")
+
+ executive_report = report_service.get_most_recent_by_incident_id_and_type(
+ db_session=db_session,
+ incident_id=int(context["subject"].id),
+ report_type=ReportTypes.executive_report,
+ )
+
+ current_status = overview = next_steps = None
+ if executive_report:
+ current_status = executive_report.details.get("current_status")
+ overview = executive_report.details.get("overview")
+ next_steps = executive_report.details.get("next_steps")
+
+ blocks = [
+ Input(
+ label="Current Status",
+ element=PlainTextInput(
+ placeholder="Current status", initial_value=current_status, multiline=True
+ ),
+ block_id=ReportExecutiveBlockIds.current_status,
+ ),
+ Input(
+ label="Overview",
+ element=PlainTextInput(placeholder="Overview", initial_value=overview, multiline=True),
+ block_id=ReportExecutiveBlockIds.overview,
+ ),
+ Input(
+ label="Next Steps",
+ element=PlainTextInput(
+ placeholder="Next steps", initial_value=next_steps, multiline=True
+ ),
+ block_id=ReportExecutiveBlockIds.next_steps,
+ ),
+ Context(
+ elements=[
+ MarkdownText(
+ text=f"Use {context['config'].slack_command_update_notifications_group} to update the list of recipients of this report."
+ )
+ ]
+ ),
+ ]
+
+ modal = Modal(
+ title="Executive Report",
+ blocks=blocks,
+ submit="Create",
+ close="Close",
+ callback_id=ReportExecutiveActions.submit,
+ private_metadata=context["subject"].json(),
+ ).build()
+
+ client.views_open(trigger_id=body["trigger_id"], view=modal)
+
+
+def ack_report_executive_submission_event(ack: Ack) -> None:
+ """Handles executive submission acknowledgement."""
+ modal = Modal(
+ title="Executive Report",
+ close="Close",
+ blocks=[Section(text="Creating executive report...")],
+ ).build()
+ ack(response_action="update", view=modal)
+
+
+@app.view(
+ ReportExecutiveActions.submit,
+ middleware=[
+ action_context_middleware,
+ db_middleware,
+ user_middleware,
+ modal_submit_middleware,
+ configuration_middleware,
+ ],
+)
+def handle_report_executive_submission_event(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+ form_data: dict,
+ user: DispatchUser,
+) -> None:
+ """Handles the report executive submission."""
+ ack_report_executive_submission_event(ack=ack)
+
+ incident = incident_service.get(db_session=db_session, incident_id=int(context["subject"].id))
+
+ executive_report_in = ExecutiveReportCreate(
+ current_status=form_data[ReportExecutiveBlockIds.current_status],
+ overview=form_data[ReportExecutiveBlockIds.overview],
+ next_steps=form_data[ReportExecutiveBlockIds.next_steps],
+ )
+
+ executive_report = report_flows.create_executive_report(
+ user_email=user.email,
+ incident_id=incident.id,
+ executive_report_in=executive_report_in,
+ organization_slug=context["subject"].organization_slug,
+ )
+
+ blocks = []
+ if executive_report and incident.notifications_group:
+ blocks = [
+ Section(text="Creating executive report... Success!"),
+ Section(
+ text=f"The executive report document has been created and can be found in the incident storage here: {executive_report.document.weblink}"
+ ),
+ Section(
+ text=f"The executive report has been emailed to the incident notifications group: {incident.notifications_group.email}",
+ ),
+ ]
+ else:
+ blocks = [
+ Section(text="Creating executive report... Failed!"),
+ Section(
+ text="The executive report document was not created successfully or the incident notifications group does not exist."
+ ),
+ ]
+
+ send_success_modal(
+ client=client,
+ view_id=body["view"]["id"],
+ title="Executive Report",
+ blocks=blocks,
+ )
+
+
+def handle_update_incident_command(
+ ack: Ack, body: dict, client: WebClient, context: BoltContext, db_session: Session
+) -> None:
+ """Creates the incident update modal."""
+ ack()
+
+ incident = incident_service.get(db_session=db_session, incident_id=int(context["subject"].id))
+
+ blocks = [
+ Context(elements=[MarkdownText(text="Use this form to update the incident's details.")]),
+ title_input(initial_value=incident.title),
+ description_input(initial_value=incident.description),
+ resolution_input(initial_value=incident.resolution),
+ incident_status_select(initial_option={"text": incident.status, "value": incident.status}),
+ incident_type_select(
+ db_session=db_session,
+ initial_option={
+ "text": incident.incident_type.name,
+ "value": incident.incident_type.id,
+ },
+ project_id=incident.project.id,
+ ),
+ Section(text=f"*Project*: {incident.project.display_name}"),
+ Context(elements=[MarkdownText(text="Project is read-only")]),
+ incident_severity_select(
+ db_session=db_session,
+ initial_option={
+ "text": incident.incident_severity.name,
+ "value": incident.incident_severity.id,
+ },
+ project_id=incident.project.id,
+ ),
+ incident_priority_select(
+ db_session=db_session,
+ initial_option={
+ "text": incident.incident_priority.name,
+ "value": incident.incident_priority.id,
+ },
+ project_id=incident.project.id,
+ ),
+ tag_multi_select(
+ optional=True,
+ initial_options=[{"text": t.name, "value": str(t.id)} for t in incident.tags],
+ ),
+ ]
+
+ modal = Modal(
+ title="Update Incident",
+ blocks=blocks,
+ submit="Update",
+ close="Cancel",
+ callback_id=IncidentUpdateActions.submit,
+ private_metadata=context["subject"].json(),
+ ).build()
+
+ client.views_open(trigger_id=body["trigger_id"], view=modal)
+
+
+def ack_incident_update_submission_event(ack: Ack) -> None:
+ """Handles incident update submission event."""
+ modal = Modal(
+ title="Incident Update",
+ close="Close",
+ blocks=[Section(text="Updating incident...")],
+ ).build()
+ ack(response_action="update", view=modal)
+
+
+@app.view(
+ IncidentUpdateActions.submit,
+ middleware=[action_context_middleware, db_middleware, user_middleware, modal_submit_middleware],
+)
+def handle_update_incident_submission_event(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+ form_data: dict,
+ user: DispatchUser,
+) -> None:
+    """Handles the update incident submission."""
+ incident_severity_id = int(form_data[DefaultBlockIds.incident_severity_select]["value"])
+ incident_severity = incident_severity_service.get(
+ db_session=db_session, incident_severity_id=incident_severity_id
+ )
+ status = form_data[DefaultBlockIds.incident_status_select]["name"]
+    if not incident_severity.allowed_for_stable_incidents and status in (
+        IncidentStatus.stable,
+        IncidentStatus.closed,
+    ):
+ errors = {
+ DefaultBlockIds.incident_severity_select: f"Severity cannot be {incident_severity.name} for {status} incidents"
+ }
+ ack(response_action="errors", errors=errors)
+ return
+
+ ack_incident_update_submission_event(ack=ack)
+ incident = incident_service.get(db_session=db_session, incident_id=int(context["subject"].id))
+
+ tags = []
+ for t in form_data.get(DefaultBlockIds.tags_multi_select, []):
+ # we have to fetch as only the IDs are embedded in slack
+ tag = tag_service.get(db_session=db_session, tag_id=int(t["value"]))
+ tags.append(tag)
+
+ incident_in = IncidentUpdate(
+ title=form_data[DefaultBlockIds.title_input],
+ description=form_data[DefaultBlockIds.description_input],
+ resolution=form_data[DefaultBlockIds.resolution_input],
+ incident_type={"name": form_data[DefaultBlockIds.incident_type_select]["name"]},
+ incident_severity={"name": form_data[DefaultBlockIds.incident_severity_select]["name"]},
+ incident_priority={"name": form_data[DefaultBlockIds.incident_priority_select]["name"]},
+ status=form_data[DefaultBlockIds.incident_status_select]["name"],
+ tags=tags,
+ )
+
+ previous_incident = IncidentRead.from_orm(incident)
+
+ # we currently don't allow users to update the incident's visibility,
+ # costs, terms, or duplicates via Slack, so we copy them over
+ incident_in.visibility = incident.visibility
+ incident_in.incident_costs = incident.incident_costs
+ incident_in.terms = incident.terms
+ incident_in.duplicates = incident.duplicates
+
+ updated_incident = incident_service.update(
+ db_session=db_session, incident=incident, incident_in=incident_in
+ )
+
+ commander_email = updated_incident.commander.individual.email
+ reporter_email = updated_incident.reporter.individual.email
+
+ incident_flows.incident_update_flow(
+ user.email,
+ commander_email,
+ reporter_email,
+ context["subject"].id,
+ previous_incident,
+ db_session=db_session,
+ )
+
+ send_success_modal(
+ client=client,
+ view_id=body["view"]["id"],
+ title="Update Incident",
+ message="Incident updated successfully.",
+ )
+
+
+@app.shortcut(
+ IncidentShortcutCallbacks.report, middleware=[db_middleware, shortcut_context_middleware]
+)
+def report_incident(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+ shortcut: dict,
+):
+ ack()
+ initial_description = None
+ if body.get("message"):
+ permalink = (
+ client.chat_getPermalink(
+ channel=context["subject"].channel_id, message_ts=body["message"]["ts"]
+ )
+ )["permalink"]
+ initial_description = f"{body['message']['text']}\n\n{permalink}"
+
+ blocks = [
+ Context(
+ elements=[
+ MarkdownText(
+ text="If you suspect an incident and need help, please fill out this form to the best of your abilities."
+ )
+ ]
+ ),
+ title_input(),
+ description_input(initial_value=initial_description),
+ project_select(
+ db_session=db_session,
+ action_id=IncidentReportActions.project_select,
+ dispatch_action=True,
+ ),
+ ]
+
+ modal = Modal(
+ title="Report Incident",
+ blocks=blocks,
+ submit="Report",
+ close="Cancel",
+ callback_id=IncidentReportActions.submit,
+ private_metadata=context["subject"].json(),
+ ).build()
+
+ client.views_open(trigger_id=shortcut["trigger_id"], view=modal)
+
+
+def handle_report_incident_command(
+ ack: Ack,
+ body: dict,
+ context: BoltContext,
+ client: WebClient,
+ db_session: Session,
+) -> None:
+ """Handles the report incident command."""
+ ack()
+
+ if body.get("channel_id"):
+ context["subject"].channel_id = body["channel_id"]
+
+ blocks = [
+ Context(
+ elements=[
+ MarkdownText(
+ text="If you suspect an incident and need help, please fill out this form to the best of your abilities."
+ )
+ ]
+ ),
+ title_input(),
+ description_input(),
+ project_select(
+ db_session=db_session,
+ action_id=IncidentReportActions.project_select,
+ dispatch_action=True,
+ ),
+ ]
+
+ modal = Modal(
+ title="Report Incident",
+ blocks=blocks,
+ submit="Report",
+ close="Cancel",
+ callback_id=IncidentReportActions.submit,
+ private_metadata=context["subject"].json(),
+ ).build()
+
+ client.views_open(trigger_id=body["trigger_id"], view=modal)
+
+
+def ack_report_incident_submission_event(ack: Ack) -> None:
+ """Handles the report incident submission event acknowledgment."""
+ modal = Modal(
+ title="Report Incident",
+ close="Close",
+ blocks=[Section(text="Creating incident resources...")],
+ ).build()
+ ack(response_action="update", view=modal)
+
+
+@app.view(
+ IncidentReportActions.submit,
+ middleware=[action_context_middleware, db_middleware, user_middleware, modal_submit_middleware],
+)
+def handle_report_incident_submission_event(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ db_session: Session,
+ form_data: dict,
+ user: DispatchUser,
+) -> None:
+    """Handles the report incident submission."""
+ ack_report_incident_submission_event(ack=ack)
+ tags = []
+ for t in form_data.get(DefaultBlockIds.tags_multi_select, []):
+ # we have to fetch as only the IDs are embedded in Slack
+ tag = tag_service.get(db_session=db_session, tag_id=int(t["value"]))
+ tags.append(tag)
+
+ project = {"name": form_data[DefaultBlockIds.project_select]["name"]}
+
+ incident_type = None
+ if form_data.get(DefaultBlockIds.incident_type_select):
+ incident_type = {"name": form_data[DefaultBlockIds.incident_type_select]["name"]}
+
+ incident_priority = None
+ if form_data.get(DefaultBlockIds.incident_priority_select):
+ incident_priority = {"name": form_data[DefaultBlockIds.incident_priority_select]["name"]}
+
+ incident_severity = None
+ if form_data.get(DefaultBlockIds.incident_severity_select):
+ incident_severity = {"name": form_data[DefaultBlockIds.incident_severity_select]["name"]}
+
+ incident_in = IncidentCreate(
+ title=form_data[DefaultBlockIds.title_input],
+ description=form_data[DefaultBlockIds.description_input],
+ incident_type=incident_type,
+ incident_priority=incident_priority,
+ incident_severity=incident_severity,
+ project=project,
+ reporter=ParticipantUpdate(individual=IndividualContactRead(email=user.email)),
+ tags=tags,
+ )
+
+ blocks = [
+ Section(text="Creating your incident..."),
+ ]
+
+ modal = Modal(title="Incident Report", blocks=blocks, close="Close").build()
+
+    result = client.views_update(
+        view_id=body["view"]["id"],
+        view=modal,
+    )
+
+ # Create the incident
+ incident = incident_service.create(db_session=db_session, incident_in=incident_in)
+
+ incident_description = (
+ incident.description
+ if len(incident.description) <= 2500
+ else f"{incident.description[:2500]}..."
+ )
+
+ blocks = [
+ Section(
+ text="This is a confirmation that you have reported an incident with the following information. You will be invited to an incident Slack conversation shortly."
+ ),
+ Section(text=f"*Title*\n {incident.title}"),
+ Section(text=f"*Description*\n {incident_description}"),
+ Section(
+ fields=[
+ MarkdownText(
+ text=f"*Commander*\n<{incident.commander.individual.weblink}|{incident.commander.individual.name}>"
+ ),
+ MarkdownText(text=f"*Type*\n {incident.incident_type.name}"),
+ MarkdownText(text=f"*Severity*\n {incident.incident_severity.name}"),
+ MarkdownText(text=f"*Priority*\n {incident.incident_priority.name}"),
+ ]
+ ),
+ ]
+ modal = Modal(title="Incident Report", blocks=blocks, close="Close").build()
+
+    client.views_update(
+        view_id=result["view"]["id"],
+        view=modal,
+    )
+
+ incident_flows.incident_create_flow(
+ incident_id=incident.id,
+ db_session=db_session,
+ organization_slug=incident.project.organization.slug,
+ )
+
+
+@app.action(
+ IncidentReportActions.project_select, middleware=[action_context_middleware, db_middleware]
+)
+def handle_report_incident_project_select_action(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+) -> None:
+ ack()
+ values = body["view"]["state"]["values"]
+
+ project_id = int(
+ values[DefaultBlockIds.project_select][IncidentReportActions.project_select][
+ "selected_option"
+ ]["value"]
+ )
+
+ context["subject"].project_id = project_id
+
+ project = project_service.get(db_session=db_session, project_id=project_id)
+
+ blocks = [
+ Context(elements=[MarkdownText(text="Use this form to update the incident's details.")]),
+ title_input(),
+ description_input(),
+ project_select(
+ db_session=db_session,
+ action_id=IncidentReportActions.project_select,
+ dispatch_action=True,
+ ),
+ incident_type_select(db_session=db_session, project_id=project.id, optional=True),
+ incident_severity_select(db_session=db_session, project_id=project.id, optional=True),
+ incident_priority_select(db_session=db_session, project_id=project.id, optional=True),
+ tag_multi_select(optional=True),
+ ]
+
+ modal = Modal(
+ title="Report Incident",
+ blocks=blocks,
+ submit="Report",
+ close="Cancel",
+ callback_id=IncidentReportActions.submit,
+ private_metadata=context["subject"].json(),
+ ).build()
+
+ client.views_update(
+ view_id=body["view"]["id"],
+ hash=body["view"]["hash"],
+ trigger_id=body["trigger_id"],
+ view=modal,
+ )
+
+
+# BUTTONS
+@app.action(
+ IncidentNotificationActions.invite_user, middleware=[button_context_middleware, db_middleware]
+)
+def handle_incident_notification_join_button_click(
+ ack: Ack,
+ client: WebClient,
+ respond: Respond,
+ db_session: Session,
+ context: BoltContext,
+):
+ """Handles the incident join button click event."""
+ ack()
+ incident = incident_service.get(db_session=db_session, incident_id=int(context["subject"].id))
+
+ if not incident:
+ message = "Sorry, we can't invite you to this incident. The incident does not exist."
+ elif incident.visibility == Visibility.restricted:
+ message = "Sorry, we can't invite you to this incident. The incident's visibility is restricted. Please, reach out to the incident commander if you have any questions."
+ elif incident.status == IncidentStatus.closed:
+ message = "Sorry, you can't join this incident. The incident has already been marked as closed. Please, reach out to the incident commander if you have any questions."
+ else:
+ user_id = context["user_id"]
+ try:
+ client.conversations_invite(channel=incident.conversation.channel_id, users=[user_id])
+ message = f"Success! We've added you to incident {incident.name}. Please, check your Slack sidebar for the new incident channel."
+ except SlackApiError as e:
+ if e.response.get("error") == SlackAPIErrorCode.ALREADY_IN_CHANNEL:
+ message = f"Sorry, we can't invite you to this incident - you're already a member. Search for a channel called {incident.name.lower()} in your Slack sidebar."
+
+ respond(text=message, response_type="ephemeral", replace_original=False, delete_original=False)
+
+
+@app.action(
+ IncidentNotificationActions.subscribe_user,
+ middleware=[button_context_middleware, db_middleware],
+)
+def handle_incident_notification_subscribe_button_click(
+ ack: Ack,
+ client: WebClient,
+ respond: Respond,
+ db_session: Session,
+ context: BoltContext,
+):
+ """Handles the incident subscribe button click event."""
+ ack()
+ incident = incident_service.get(db_session=db_session, incident_id=int(context["subject"].id))
+
+ if not incident:
+        message = "Sorry, we can't subscribe you to this incident. The incident does not exist."
+ elif incident.visibility == Visibility.restricted:
+        message = "Sorry, we can't subscribe you to this incident. The incident's visibility is restricted. Please, reach out to the incident commander if you have any questions."
+ elif incident.status == IncidentStatus.closed:
+ message = "Sorry, you can't subscribe to this incident. The incident has already been marked as closed. Please, reach out to the incident commander if you have any questions."
+ else:
+ user_id = context["user_id"]
+ user_email = get_user_email(client=client, user_id=user_id)
+
+ if not user_email:
+            message = "Sorry, we can't subscribe you to this incident. There was a problem finding your user."
+ else:
+ if incident.tactical_group:
+ group_flows.update_group(
+ subject=incident,
+ group=incident.tactical_group,
+ group_action=GroupAction.add_member,
+ group_member=user_email,
+ db_session=db_session,
+ )
+ message = f"Success! We've subscribed you to incident {incident.name}. You will start receiving all tactical reports about this incident via email."
+
+ respond(text=message, response_type="ephemeral", replace_original=False, delete_original=False)
+
+
+@app.action(
+ LinkMonitorActionIds.monitor,
+ middleware=[button_context_middleware, db_middleware, user_middleware],
+)
+def handle_monitor_link_monitor_button_click(
+ ack: Ack,
+ body: dict,
+ context: BoltContext,
+ db_session: Session,
+ respond: Respond,
+ user: DispatchUser,
+) -> None:
+ """Handles the monitor button in the handle_message_monitor() event."""
+ ack()
+
+ message = monitor_link_button_click(
+ body=body,
+ context=context,
+ db_session=db_session,
+ enabled=True,
+ email=user.email,
+ )
+
+ respond(
+ text=message,
+ response_type="ephemeral",
+ delete_original=False,
+ replace_original=False,
+ )
+
+
+@app.action(
+ LinkMonitorActionIds.ignore,
+ middleware=[button_context_middleware, db_middleware, user_middleware],
+)
+def handle_monitor_link_ignore_button_click(
+ ack: Ack,
+ body: dict,
+ context: BoltContext,
+ db_session: Session,
+ respond: Respond,
+ user: DispatchUser,
+) -> None:
+ """Handles the ignore button in the handle_message_monitor() event."""
+ ack()
+
+ message = monitor_link_button_click(
+ body=body,
+ context=context,
+ db_session=db_session,
+ enabled=False,
+ email=user.email,
+ )
+
+ respond(
+ text=message,
+ response_type="ephemeral",
+ delete_original=False,
+ replace_original=False,
+ )
+
+
+def monitor_link_button_click(
+ body: dict,
+ context: BoltContext,
+ db_session: Session,
+ enabled: bool,
+ email: str,
+) -> str:
+ """Core logic for handle_message_monitor() button click that builds MonitorCreate object and message."""
+
+ button = MonitorMetadata.parse_raw((body["actions"][0]["value"]))
+ incident = incident_service.get(db_session=db_session, incident_id=int(context["subject"].id))
+ plugin_instance = plugin_service.get_instance(
+ db_session=db_session, plugin_instance_id=button.plugin_instance_id
+ )
+
+ creator = participant_service.get_by_incident_id_and_email(
+ db_session=db_session, incident_id=incident.id, email=email
+ )
+
+ monitor_in = MonitorCreate(
+ incident=incident,
+ enabled=enabled,
+ plugin_instance=plugin_instance,
+ creator=creator,
+ weblink=button.weblink,
+ )
+
+ monitor_service.create_or_update(db_session=db_session, monitor_in=monitor_in)
+
+ return (
+ f"A new monitor instance has been created.\n\n *Weblink:* {button.weblink}"
+ if enabled is True
+ else f"This monitor is now ignored. Dispatch won't remind this incident channel about it again.\n\n *Weblink:* {button.weblink}"
+ )
+
+
+@app.action(
+ TaskNotificationActionIds.update_status,
+ middleware=[button_context_middleware, db_middleware],
+)
+def handle_update_task_status_button_click(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+):
+ """Handles the update task button in the list-my-tasks message."""
+ ack()
+
+ button = TaskMetadata.parse_raw(body["actions"][0]["value"])
+
+ resolve = True
+ if button.action_type == "reopen":
+ resolve = False
+
+ # When the task originated from the Dispatch front-end.
+ # Tasks created in the front-end do not have an associated "resource".
+ # Thus, no "resource_id" entry is necessary.
+    from_drive = button.resource_id is not None
+
+ if from_drive:
+ # When the task originated from the drive plugin
+ task = task_service.get_by_resource_id(
+ db_session=db_session, resource_id=button.resource_id
+ )
+ else:
+ task = task_service.get(db_session=db_session, task_id=int(button.task_id))
+
+ # avoid external calls if we are already in the desired state
+ if resolve and task.status == TaskStatus.resolved:
+ return
+
+ if not resolve and task.status == TaskStatus.open:
+ return
+
+ if from_drive:
+        # we don't currently have a good way to get the correct file_id (we don't store a
+        # task <-> document relationship), so let's try both the incident doc and the PIR doc
+ drive_task_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=task.incident.project.id, plugin_type="task"
+ )
+ try:
+ file_id = task.incident.incident_document.resource_id
+ drive_task_plugin.instance.update(file_id, button.resource_id, resolved=resolve)
+ except DispatchException:
+ file_id = task.incident.incident_review_document.resource_id
+ drive_task_plugin.instance.update(file_id, button.resource_id, resolved=resolve)
+ else:
+ status = TaskStatus.resolved if resolve is True else TaskStatus.open
+ task_service.resolve_or_reopen(db_session=db_session, task_id=task.id, status=status)
+
+ tasks = task_service.get_all_by_incident_id(
+ db_session=db_session,
+ incident_id=int(context["subject"].id),
+ )
+
+ if not button.thread_id:
+ # we are in a modal
+ draw_task_modal(
+ channel_id=button.channel_id,
+ client=client,
+ first_open=False,
+ view_id=body["view"]["id"],
+ tasks=tasks,
+ )
+ else:
+ # We are in a message
+ task = task_service.get(db_session=db_session, task_id=button.task_id)
+ draw_task_message(
+ channel_id=button.channel_id,
+ client=client,
+ task=task,
+ first_open=False,
+ thread_id=button.thread_id,
+ )
+
+
+@app.action(RemindAgainActions.submit, middleware=[select_context_middleware, db_middleware])
+def handle_remind_again_select_action(
+ ack: Ack,
+ body: dict,
+ context: BoltContext,
+ db_session: Session,
+ respond: Respond,
+ user: DispatchUser,
+) -> None:
+ """Handles remind again select event."""
+ ack()
+ try:
+ incident = incident_service.get(
+ db_session=db_session, incident_id=int(context["subject"].id)
+ )
+
+ # User-selected option as org-id-report_type-delay
+ value = body["actions"][0]["selected_option"]["value"]
+
+ # Parse out report type and selected delay
+ *_, report_type, selection = value.split("-")
+ selection_as_message = reminder_select_values[selection]["message"]
+ hours = float(reminder_select_values[selection]["value"])
+
+ # Get new remind time
+ delay_to_time = datetime.utcnow() + timedelta(hours=hours)
+
+ # Store in incident
+ if report_type == ReportTypes.tactical_report:
+ incident.delay_tactical_report_reminder = delay_to_time
+ elif report_type == ReportTypes.executive_report:
+ incident.delay_executive_report_reminder = delay_to_time
+
+ db_session.add(incident)
+ db_session.commit()
+
+ message = f"Success! We'll remind you again in {selection_as_message}."
+ respond(
+ text=message, response_type="ephemeral", replace_original=False, delete_original=False
+ )
+ except Exception as e:
+ guid = str(uuid.uuid4())
+ log.error(f"ERROR trying to save reminder delay with guid {guid}.")
+ log.exception(e)
+ message = build_unexpected_error_message(guid)
+ respond(
+ text=message, response_type="ephemeral", replace_original=False, delete_original=False
+ )
+
+
+def handle_summary_command(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+ user: DispatchUser,
+) -> None:
+ """Handles the summary command to generate a read-in summary."""
+ ack()
+
+ try:
+ if context["subject"].type == IncidentSubjects.incident:
+ incident = incident_service.get(
+ db_session=db_session, incident_id=int(context["subject"].id)
+ )
+ project = incident.project
+ subject_type = "incident"
+
+ if incident.visibility == Visibility.restricted:
+ dispatch_slack_service.send_ephemeral_message(
+ client=client,
+ conversation_id=context["channel_id"],
+ user_id=context["user_id"],
+ text=":x: Cannot generate summary for restricted incidents.",
+ )
+ return
+
+ if not incident.incident_type.generate_read_in_summary:
+ dispatch_slack_service.send_ephemeral_message(
+ client=client,
+ conversation_id=context["channel_id"],
+ user_id=context["user_id"],
+ text=":x: Read-in summaries are not enabled for this incident type.",
+ )
+ return
+
+ elif context["subject"].type == CaseSubjects.case:
+ case = case_service.get(db_session=db_session, case_id=int(context["subject"].id))
+ project = case.project
+ subject_type = "case"
+
+ if case.visibility == Visibility.restricted:
+ dispatch_slack_service.send_ephemeral_message(
+ client=client,
+ conversation_id=context["channel_id"],
+ user_id=context["user_id"],
+ text=":x: Cannot generate summary for restricted cases.",
+ )
+ return
+
+ if not case.case_type.generate_read_in_summary:
+ dispatch_slack_service.send_ephemeral_message(
+ client=client,
+ conversation_id=context["channel_id"],
+ user_id=context["user_id"],
+ text=":x: Read-in summaries are not enabled for this case type.",
+ )
+ return
+
+ if not case.dedicated_channel:
+ dispatch_slack_service.send_ephemeral_message(
+ client=client,
+ conversation_id=context["channel_id"],
+ user_id=context["user_id"],
+ text=":x: Read-in summaries are only available for cases with a dedicated channel.",
+ )
+ return
+ else:
+ dispatch_slack_service.send_ephemeral_message(
+ client=client,
+ conversation_id=context["channel_id"],
+ user_id=context["user_id"],
+ text=":x: Error: Unable to determine subject type for summary generation.",
+ )
+ return
+
+ # All validations passed
+ dispatch_slack_service.send_ephemeral_message(
+ client=client,
+ conversation_id=context["channel_id"],
+ user_id=context["user_id"],
+ text=":hourglass_flowing_sand: Generating read-in summary... This may take a moment.",
+ )
+
+ summary_response = ai_service.generate_read_in_summary(
+ db_session=db_session,
+ subject=context["subject"],
+ project=project,
+ channel_id=context["channel_id"],
+ important_reaction=context["config"].timeline_event_reaction,
+ participant_email=user.email,
+ )
+
+ if summary_response and summary_response.summary:
+ blocks = create_read_in_summary_blocks(summary_response.summary)
+ blocks.append(
+ Context(
+ elements=[
+ MarkdownText(
+ text="NOTE: The block above was AI-generated and may contain errors or inaccuracies. Please verify the information before relying on it."
+ )
+ ]
+ ).build()
+ )
+
+ dispatch_slack_service.send_ephemeral_message(
+ client=client,
+ conversation_id=context["channel_id"],
+ user_id=context["user_id"],
+ text=f"Here is a summary of what has happened so far in this {subject_type}",
+ blocks=blocks,
+ )
+ elif summary_response and summary_response.error_message:
+ log.warning(f"Failed to generate read-in summary: {summary_response.error_message}")
+
+ dispatch_slack_service.send_ephemeral_message(
+ client=client,
+ conversation_id=context["channel_id"],
+ user_id=context["user_id"],
+ text=":x: Unable to generate summary at this time. Please try again later.",
+ )
+ else:
+ # No summary generated
+ dispatch_slack_service.send_ephemeral_message(
+ client=client,
+ conversation_id=context["channel_id"],
+ user_id=context["user_id"],
+ text=":x: No summary could be generated. There may not be enough information available.",
+ )
+
+ except Exception as e:
+ log.error(f"Error generating summary: {e}")
+
+ dispatch_slack_service.send_ephemeral_message(
+ client=client,
+ conversation_id=context["channel_id"],
+ user_id=context["user_id"],
+ text=":x: An error occurred while generating the summary. Please try again later.",
+ )
diff --git a/src/dispatch/plugins/dispatch_slack/incident/messages.py b/src/dispatch/plugins/dispatch_slack/incident/messages.py
new file mode 100644
index 000000000000..00a9c1b0ebea
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_slack/incident/messages.py
@@ -0,0 +1,19 @@
+"""Slack message building for incidents, leveraging the Bolt and Blockkit SDKs instead of Jinja templates and raw Slack API calls."""
+
+from blockkit import (
+ Context,
+ Message,
+ Divider,
+)
+from blockkit.surfaces import Block
+
+
+def create_incident_channel_escalate_message() -> list[Block]:
+    """Generates the case-escalation message blocks."""
+
+ blocks = [
+ Context(elements=["This Case has been escalated to an Incident"]),
+ Divider(),
+ ]
+
+ return Message(blocks=blocks).build()["blocks"]
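For reference, `Message(blocks=blocks).build()["blocks"]` serializes the blockkit objects above into plain Block Kit dicts. A hand-rolled sketch of the expected shape (field names follow Slack's Block Kit JSON format; the real blockkit output may include additional generated fields):

```python
def escalate_blocks() -> list[dict]:
    # Plain-dict equivalent of the blockkit-built escalation message:
    # a context block followed by a divider.
    return [
        {
            "type": "context",
            "elements": [
                {"type": "mrkdwn", "text": "This Case has been escalated to an Incident"}
            ],
        },
        {"type": "divider"},
    ]
```

The resulting list can be passed directly as the `blocks` argument of a Slack message post.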
diff --git a/src/dispatch/plugins/dispatch_slack/messaging.py b/src/dispatch/plugins/dispatch_slack/messaging.py
index 0c8a9601cad0..1f23682053db 100644
--- a/src/dispatch/plugins/dispatch_slack/messaging.py
+++ b/src/dispatch/plugins/dispatch_slack/messaging.py
@@ -4,156 +4,282 @@
:copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
:license: Apache, see LICENSE for more details.
"""
+
import logging
-from typing import List, Optional
-from jinja2 import Template
+from typing import Any
+from blockkit import (
+ Actions,
+ Button,
+ Context,
+ Divider,
+ MarkdownText,
+ Section,
+ StaticSelect,
+ PlainOption,
+)
+from slack_sdk.web.client import WebClient
+from slack_sdk.errors import SlackApiError
-from dispatch.messaging import (
+from dispatch.messaging.strings import (
+ EVERGREEN_REMINDER_DESCRIPTION,
+ INCIDENT_PARTICIPANT_SUGGESTED_READING_DESCRIPTION,
INCIDENT_TASK_LIST_DESCRIPTION,
INCIDENT_TASK_REMINDER_DESCRIPTION,
MessageType,
render_message_template,
)
+from dispatch.plugins.dispatch_slack.config import SlackConfiguration
+from dispatch.plugins.dispatch_slack.enums import SlackAPIErrorCode
-from .config import (
- SLACK_COMMAND_ASSIGN_ROLE_SLUG,
- SLACK_COMMAND_UPDATE_INCIDENT_SLUG,
- SLACK_COMMAND_ENGAGE_ONCALL_SLUG,
- SLACK_COMMAND_LIST_PARTICIPANTS_SLUG,
- SLACK_COMMAND_LIST_RESOURCES_SLUG,
- SLACK_COMMAND_LIST_TASKS_SLUG,
- SLACK_COMMAND_MARK_ACTIVE_SLUG,
- SLACK_COMMAND_MARK_CLOSED_SLUG,
- SLACK_COMMAND_MARK_STABLE_SLUG,
- SLACK_COMMAND_STATUS_REPORT_SLUG,
-)
+log = logging.getLogger(__name__)
-logger = logging.getLogger(__name__)
+def get_template(message_type: MessageType):
+ """Fetches the correct template based on message type."""
+ template_map = {
+ MessageType.evergreen_reminder: (
+ default_notification,
+ EVERGREEN_REMINDER_DESCRIPTION,
+ ),
+ MessageType.incident_participant_suggested_reading: (
+ default_notification,
+ INCIDENT_PARTICIPANT_SUGGESTED_READING_DESCRIPTION,
+ ),
+ MessageType.incident_task_list: (default_notification, INCIDENT_TASK_LIST_DESCRIPTION),
+ MessageType.incident_task_reminder: (
+ default_notification,
+ INCIDENT_TASK_REMINDER_DESCRIPTION,
+ ),
+ }
+ return template_map.get(message_type, (default_notification, None))
-INCIDENT_CONVERSATION_STATUS_REPORT_SUGGESTION = (
- f"Consider providing a status report using the `{SLACK_COMMAND_STATUS_REPORT_SLUG}` command"
-)
-INCIDENT_CONVERSATION_COMMAND_MESSAGE = {
- SLACK_COMMAND_MARK_ACTIVE_SLUG: {
- "response_type": "ephemeral",
- "text": f"The command `{SLACK_COMMAND_MARK_ACTIVE_SLUG}` has been deprecated. Please use `{SLACK_COMMAND_UPDATE_INCIDENT_SLUG}` instead.",
- },
- SLACK_COMMAND_MARK_STABLE_SLUG: {
- "response_type": "ephemeral",
- "text": f"The command `{SLACK_COMMAND_MARK_STABLE_SLUG}` has been deprecated. Please use `{SLACK_COMMAND_UPDATE_INCIDENT_SLUG}` instead.",
- },
- SLACK_COMMAND_MARK_CLOSED_SLUG: {
- "response_type": "ephemeral",
- "text": f"The command `{SLACK_COMMAND_MARK_CLOSED_SLUG}` has been deprecated. Please use `{SLACK_COMMAND_UPDATE_INCIDENT_SLUG}` instead.",
- },
- SLACK_COMMAND_STATUS_REPORT_SLUG: {
- "response_type": "ephemeral",
- "text": "Opening a dialog to write a status report...",
- },
- SLACK_COMMAND_LIST_TASKS_SLUG: {
- "response_type": "ephemeral",
- "text": "Fetching the list of incident tasks...",
- },
- SLACK_COMMAND_LIST_PARTICIPANTS_SLUG: {
- "response_type": "ephemeral",
- "text": "Fetching the list of incident participants...",
- },
- SLACK_COMMAND_ASSIGN_ROLE_SLUG: {
- "response_type": "ephemeral",
- "text": "Opening a dialog to assign a role to a participant...",
- },
- SLACK_COMMAND_UPDATE_INCIDENT_SLUG: {
- "response_type": "ephemeral",
- "text": "Opening a dialog to update incident information...",
- },
- SLACK_COMMAND_ENGAGE_ONCALL_SLUG: {
- "response_type": "ephemeral",
- "text": "Opening a dialog to engage an oncall person...",
- },
- SLACK_COMMAND_LIST_RESOURCES_SLUG: {
+def get_incident_conversation_command_message(
+ command_string: str, config: SlackConfiguration | None = None
+) -> dict[str, str]:
+ """Fetches a custom message and response type for each respective slash command."""
+
+ default = {
"response_type": "ephemeral",
- "text": "Listing all incident resources...",
- },
-}
+ "text": f"Running command... `{command_string}`",
+ }
-INCIDENT_CONVERSATION_NON_INCIDENT_CONVERSATION_COMMAND_ERROR = """
-Looks like you tried to run `{{command}}` in an non-incident conversation. You can only run Dispatch commands in incident conversations.""".replace(
- "\n", " "
-).strip()
+ if not config:
+ return default
+
+ command_messages = {
+ config.slack_command_run_workflow: {
+ "response_type": "ephemeral",
+ "text": "Opening a modal to run a workflow...",
+ },
+ config.slack_command_report_tactical: {
+ "response_type": "ephemeral",
+ "text": "Opening a dialog to write a tactical report...",
+ },
+ config.slack_command_list_tasks: {
+ "response_type": "ephemeral",
+ "text": "Fetching the list of incident tasks...",
+ },
+ config.slack_command_list_my_tasks: {
+ "response_type": "ephemeral",
+ "text": "Fetching your incident tasks...",
+ },
+ config.slack_command_list_participants: {
+ "response_type": "ephemeral",
+ "text": "Fetching the list of incident participants...",
+ },
+ config.slack_command_assign_role: {
+ "response_type": "ephemeral",
+ "text": "Opening a dialog to assign a role to a participant...",
+ },
+ config.slack_command_update_incident: {
+ "response_type": "ephemeral",
+ "text": "Opening a dialog to update incident information...",
+ },
+ config.slack_command_update_participant: {
+ "response_type": "ephemeral",
+ "text": "Opening a dialog to update participant information...",
+ },
+ config.slack_command_engage_oncall: {
+ "response_type": "ephemeral",
+ "text": "Opening a dialog to engage an oncall person...",
+ },
+ config.slack_command_report_incident: {
+ "response_type": "ephemeral",
+ "text": "Opening a dialog to report an incident...",
+ },
+ config.slack_command_report_executive: {
+ "response_type": "ephemeral",
+ "text": "Opening a dialog to write an executive report...",
+ },
+ config.slack_command_update_notifications_group: {
+ "response_type": "ephemeral",
+ "text": "Opening a dialog to update the membership of the notifications group...",
+ },
+ config.slack_command_add_timeline_event: {
+ "response_type": "ephemeral",
+ "text": "Opening a dialog to add an event to the incident timeline...",
+ },
+ config.slack_command_list_incidents: {
+ "response_type": "ephemeral",
+ "text": "Fetching the list of incidents...",
+ },
+ config.slack_command_list_workflows: {
+ "response_type": "ephemeral",
+ "text": "Fetching the list of workflows...",
+ },
+ config.slack_command_create_task: {
+ "response_type": "ephemeral",
+ "text": "Opening a dialog to create a new incident task...",
+ },
+ config.slack_command_create_case: {
+ "response_type": "ephemeral",
+ "text": "Opening a dialog to create a new case...",
+ },
+ }
+ return command_messages.get(command_string, default)
-def render_non_incident_conversation_command_error_message(command: str):
- """Renders a non-incident conversation command error ephemeral message."""
- return {
- "response_type": "ephemeral",
- "text": Template(INCIDENT_CONVERSATION_NON_INCIDENT_CONVERSATION_COMMAND_ERROR).render(
- command=command
- ),
- }
+def build_command_error_message(payload: dict, error: Any) -> str:
+ message = f"""Unfortunately we couldn't run `{payload['command']}` due to the following reason: {str(error)} """
+ return message
+
+
+def build_role_error_message(payload: dict) -> str:
+ message = f"""I see you tried to run `{payload['command']}`. This is a sensitive command and cannot be run with the incident role you are currently assigned."""
+ return message
-def get_template(message_type: MessageType):
- """Fetches the correct template based on message type."""
- template_map = {
- MessageType.incident_notification: (default_notification, None),
- MessageType.incident_participant_welcome: (default_notification, None),
- MessageType.incident_resources_message: (default_notification, None),
- MessageType.incident_status_report: (default_notification, None),
- MessageType.incident_task_reminder: (
- default_notification,
- INCIDENT_TASK_REMINDER_DESCRIPTION,
- ),
- MessageType.incident_task_list: (default_notification, INCIDENT_TASK_LIST_DESCRIPTION),
- }
- template_func, description = template_map.get(message_type, (None, None))
+def build_context_error_message(payload: dict, error: Any) -> str:
+ message = (
+        f"""I see you tried to run `{payload['command']}` in a non-incident conversation. Incident-specific commands can only be run in incident conversations."""  # command_context_middleware()
+ if payload.get("command")
+ else str(error) # everything else
+ )
+ return message
- if not template_func:
- raise Exception(f"Unable to determine template. MessageType: {message_type}")
- return template_func, description
+def build_bot_not_present_message(client: WebClient, command: str, conversations: dict) -> str:
+    team_id = client.team_info()["team"]["id"]
+
+ deep_links = [
+        f"<slack://channel?team={team_id}&id={c['id']}|{c['name']}>" for c in conversations
+ ]
+
+ message = f"""
+ Looks like you tried to run `{command}` in a conversation where the Dispatch bot is not present. Add the bot to your conversation or run the command in one of the following conversations:\n\n {(", ").join(deep_links)}"""
+ return message
+
+
+def build_slack_api_error_message(error: SlackApiError) -> str:
+ return (
+ "Sorry, the request to Slack timed out. Try running your command again."
+ if error.response.get("error") == SlackAPIErrorCode.VIEW_EXPIRED
+ else "Sorry, we've run into an unexpected error with Slack."
+ )
+
+
+def build_unexpected_error_message(guid: str) -> str:
+ message = f"""Sorry, we've run into an unexpected error. \
+For help please reach out to your Dispatch admins and provide them with the following token: `{guid}`"""
+ return message
def format_default_text(item: dict):
- """Creates the correct slack text string based on the item context."""
+ """Creates the correct Slack text string based on the item context."""
if item.get("title_link"):
return f"*<{item['title_link']}|{item['title']}>*\n{item['text']}"
if item.get("datetime"):
-        return f"*{item['title']}* \n <!date^{int(item['datetime'].timestamp())}^{{date}} | {item['datetime']}>"
+def resolve_context_from_conversation(channel_id: str, thread_id: str | None = None) -> Subject | None:
+ """Attempts to resolve a conversation based on the channel id and thread_id."""
+ organization_slugs = []
+ with get_session() as db_session:
+ organization_slugs = [o.slug for o in organization_service.get_all(db_session=db_session)]
+
+ for slug in organization_slugs:
+ with get_organization_session(slug) as scoped_db_session:
+ conversation = conversation_service.get_by_channel_id_ignoring_channel_type(
+ db_session=scoped_db_session, channel_id=channel_id, thread_id=thread_id
+ )
+
+ if conversation:
+ if conversation.incident:
+ subject = SubjectMetadata(
+ type=IncidentSubjects.incident,
+ id=conversation.incident_id,
+ organization_slug=slug,
+ project_id=conversation.incident.project_id,
+ )
+ else:
+ subject = SubjectMetadata(
+ type=CaseSubjects.case,
+ id=conversation.case_id,
+ organization_slug=slug,
+ project_id=conversation.case.project_id,
+ )
+ return Subject(subject, db_session=scoped_db_session)
+
+
+def select_context_middleware(payload: dict, context: BoltContext, next: Callable) -> None:
+ """Attempt to determine the current context of the selection."""
+ if not payload.get("selected_option"):
+ return next()
+
+ organization_slug, incident_id, *_ = payload["selected_option"]["value"].split("-")
+ subject_data = SubjectMetadata(
+ organization_slug=organization_slug, id=incident_id, type="Incident"
+ )
+
+ context.update({"subject": subject_data})
+ next()
+
+
+def shortcut_context_middleware(context: BoltContext, next: Callable) -> None:
+ """Attempts to determine the current context of the event."""
+ context.update({"subject": SubjectMetadata(channel_id=context.channel_id)})
+ next()
+
+
+def button_context_middleware(payload: dict, context: BoltContext, next: Callable) -> None:
+ """Attempt to determine the current context of the event."""
+ try:
+ subject_data = SubjectMetadata.parse_raw(payload["value"])
+ except Exception:
+ log.debug("Extracting context data from legacy template...")
+ organization_slug, incident_id = payload["value"].split("-")
+ subject_data = SubjectMetadata(
+ organization_slug=organization_slug, id=incident_id, type="Incident"
+ )
+
+ context.update({"subject": subject_data})
+ next()
+
+
+def engagement_button_context_middleware(
+ payload: dict, context: BoltContext, next: Callable
+) -> None:
+ """Attempt to determine the current context of the event. Payload is populated by
+ the `create_signal_engagement_message` function in `dispatch_slack/case/messages.py`.
+
+ Example Payload:
+ {
+ 'action_id': 'signal-engagement-approve',
+ 'block_id': 'Zqr+v',
+ 'text': {
+ 'type': 'plain_text',
+ 'text': 'Approve',
+ 'emoji': True
+ },
+ 'value': '{
+ "id": "5362",
+ "type": "case",
+        "organization_slug": "default",
+        "project_id": "1",
+ "channel_id": "C04KJP0BLUT",
+ "signal_instance_id": "21077511-c53c-4d10-a2eb-998a5c972e09",
+ "engagement_id": 5,
+ "user": "wshel@netflix.com"
+ }',
+ 'style': 'primary',
+        'style': 'primary',
+        'type': 'button',
+        'action_ts': '1689348164.534992'
+ }
+ """
+ subject_data = EngagementMetadata.parse_raw(payload["value"])
+ context.update({"subject": subject_data})
+ next()
+
+
+def action_context_middleware(body: dict, context: BoltContext, next: Callable) -> None:
+ """Attempt to determine the current context of the event."""
+ private_metadata = json.loads(body["view"]["private_metadata"])
+ if private_metadata.get("form_data"):
+ context.update({"subject": FormMetadata(**private_metadata)})
+ else:
+ context.update({"subject": SubjectMetadata(**private_metadata)})
+ next()
+
+
+def message_context_middleware(
+ request: BoltRequest, payload: dict, context: BoltContext, next: Callable
+) -> None:
+ """Attempts to determine the current context of the event."""
+ if is_bot(request):
+ return context.ack()
+
+ if subject := resolve_context_from_conversation(
+ channel_id=context.channel_id, thread_id=payload.get("thread_ts")
+ ):
+ context.update(subject._asdict())
+ else:
+ raise ContextError("Unable to determine context for message.")
+
+ next()
+
+
+# TODO should we support reactions for cases?
+def reaction_context_middleware(context: BoltContext, next: Callable) -> None:
+ """Attempts to determine the current context of a reaction event."""
+ if subject := resolve_context_from_conversation(channel_id=context.channel_id):
+ context.update(subject._asdict())
+ else:
+ raise ContextError("Unable to determine context for reaction.")
+ next()
+
+
+@timer
+def restricted_command_middleware(
+ context: BoltContext, db_session, user: DispatchUser, next: Callable, payload: dict
+):
+ """Rejects commands from unauthorized individuals."""
+ allowed_roles = [ParticipantRoleType.incident_commander, ParticipantRoleType.scribe]
+ participant = participant_service.get_by_incident_id_and_email(
+ db_session=db_session, incident_id=context["subject"].id, email=user.email
+ )
+ if not participant:
+ raise RoleError(
+ f"User is not a participant in the incident and does not have permission to run `{payload['command']}`."
+ )
+
+ # if any required role is active, allow command
+ for active_role in participant.active_roles:
+ for allowed_role in allowed_roles:
+ if active_role.role == allowed_role:
+ return next()
+
+ raise RoleError(
+ f"Participant does not have permission to run `{payload['command']}`. Roles with permission: {','.join([r.name for r in allowed_roles])}",
+ )
+
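The role gate in `restricted_command_middleware` boils down to an intersection check between the participant's active roles and the allowed roles; a distilled, dependency-free sketch (names here are illustrative, not from the codebase):

```python
def has_allowed_role(active_roles: list[str], allowed_roles: list[str]) -> bool:
    # Allow the command if any of the participant's active roles
    # matches one of the allowed roles.
    return any(role in allowed_roles for role in active_roles)
```

If no active role matches, the middleware raises a `RoleError` instead of calling `next()`.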
+
+# Filter out bot member-join events, since slack-bolt's built-in ignoring_self_events middleware doesn't catch them.
+# https://github.com/slackapi/bolt-python/blob/main/slack_bolt/middleware/ignoring_self_events/ignoring_self_events.py#L37
+def is_bot(request: BoltRequest) -> bool:
+ body = request.body
+ user = body.get("event", {}).get("user")
+ if user == "USLACKBOT":
+ return True
+
+ auth_result = request.context.authorize_result
+ user_id = request.context.user_id
+ bot_id = body.get("event", {}).get("bot_id")
+
+ return (
+ auth_result is not None
+ and ( # noqa
+ (user_id is not None and user_id == auth_result.bot_user_id)
+ or ( # noqa
+ bot_id is not None and bot_id == auth_result.bot_id # for bot_message events
+ )
+ )
+ and body.get("event") is not None # noqa
+ )
+
+
+@timer
+def user_middleware(
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+ request: BoltRequest,
+ next: Callable,
+ payload: dict,
+) -> None:
+ """Attempts to determine the user making the request."""
+ if is_bot(request):
+ return context.ack()
+
+ user_id = None
+
+ # for modals
+ if body.get("user"):
+ user_id = body["user"]["id"]
+
+ # for messages
+ if payload.get("user"):
+ user_id = payload["user"]
+
+ # for commands
+ if payload.get("user_id"):
+ user_id = payload["user_id"]
+
+ if not user_id:
+ raise ContextError("Unable to determine user from context.")
+
+ context["user_id"] = user_id
+
+ if not db_session:
+ slug = (
+ context["subject"].organization_slug
+ if context["subject"].organization_slug
+ else get_default_org_slug()
+ )
+ db_session = refetch_db_session(slug)
+
+ participant = None
+ # in the case of creating new incidents or cases we don't have a subject yet
+ if context["subject"].id:
+ if context["subject"].type == "incident":
+ participant = participant_service.get_by_incident_id_and_conversation_id(
+ db_session=db_session,
+ incident_id=context["subject"].id,
+ user_conversation_id=user_id,
+ )
+ else:
+ participant = participant_service.get_by_case_id_and_conversation_id(
+ db_session=db_session, case_id=context["subject"].id, user_conversation_id=user_id
+ )
+
+ user = None
+ if participant:
+ user = user_service.get_or_create(
+ db_session=db_session,
+ organization=context["subject"].organization_slug,
+ user_in=UserRegister(email=participant.individual.email),
+ )
+ else:
+ user_info = client.users_info(user=user_id).get("user", {})
+
+ if user_info.get("is_bot", False):
+ return context.ack()
+
+ email = user_info.get("profile", {}).get("email")
+
+ if not email:
+ raise ContextError("Unable to get user email address.")
+
+ user = user_service.get_or_create(
+ db_session=db_session,
+ organization=context["subject"].organization_slug,
+ user_in=UserRegister(email=email),
+ )
+
+ if not user:
+ raise ContextError("Unable to determine user from context.")
+
+ context["user"] = user
+ return next()
+
+
+def modal_submit_middleware(body: dict, context: BoltContext, next: Callable) -> None:
+ """Parses view data into a reasonable data structure."""
+ parsed_data = {}
+ state_elem = body["view"].get("state")
+ state_values = state_elem.get("values")
+
+ for state in state_values:
+ state_key_value_pair = state_values[state]
+
+ for elem_key in state_key_value_pair:
+ elem_key_value_pair = state_values[state][elem_key]
+
+ if elem_key_value_pair.get("selected_option") and elem_key_value_pair.get(
+ "selected_option"
+ ).get("value"):
+ parsed_data[state] = {
+ "name": elem_key_value_pair.get("selected_option").get("text").get("text"),
+ "value": elem_key_value_pair.get("selected_option").get("value"),
+ }
+ elif elem_key_value_pair.get("selected_user"):
+ parsed_data[state] = {
+ "name": "user",
+ "value": elem_key_value_pair.get("selected_user"),
+ }
+ elif "selected_options" in elem_key_value_pair.keys():
+ name = "No option selected"
+ value = ""
+
+ if elem_key_value_pair.get("selected_options"):
+ options = []
+ for selected in elem_key_value_pair["selected_options"]:
+ name = selected.get("text").get("text")
+ value = selected.get("value")
+ options.append({"name": name, "value": value})
+
+ parsed_data[state] = options
+ elif elem_key_value_pair.get("selected_date"):
+ parsed_data[state] = elem_key_value_pair.get("selected_date")
+ else:
+ parsed_data[state] = elem_key_value_pair.get("value")
+
+ context["form_data"] = parsed_data
+ next()
+
+
+# NOTE we don't need to handle cases because commands are not available in threads.
+def command_context_middleware(
+ context: BoltContext,
+ payload: SlackCommandPayload,
+ next: Callable,
+ expected_subject: SubjectNames = SubjectNames.INCIDENT,
+) -> None:
+ subject = resolve_context_from_conversation(channel_id=context.channel_id)
+ if not subject:
+ raise ContextError(
+ f"Sorry, we were unable to determine the correct context to run the command `{payload['command']}`. Are you running this command in a {expected_subject.lower()} channel?"
+ )
+
+ if not subject.subject.type == expected_subject.lower():
+ raise CommandError(
+ f"This command is only available with {expected_subject}s. {payload.get('channel_name', 'This channel')} is a {subject.subject.type}."
+ )
+
+ context.update(subject._asdict())
+ next()
+
+
+def add_user_middleware(payload: dict, context: BoltContext, next: Callable):
+ """Attempts to determine the user to add to the incident."""
+ value = payload.get("value")
+ if value:
+ context["users"] = json.loads(value).get("users")
+ next()
+
+
+def db_middleware(context: BoltContext, next: Callable):
+ if not context.get("subject"):
+ slug = get_default_org_slug()
+ context.update({"subject": SubjectMetadata(organization_slug=slug)})
+ else:
+ slug = context["subject"].organization_slug
+
+ with get_organization_session(slug) as db_session:
+ context["db_session"] = db_session
+ next()
+
+
+def subject_middleware(context: BoltContext, next: Callable):
+ """Ensures the context contains a subject, creating a default one if missing."""
+ if not context.get("subject"):
+ slug = get_default_org_slug()
+ context.update({"subject": SubjectMetadata(organization_slug=slug)})
+ next()
+
+
+def configuration_middleware(context: BoltContext, next: Callable):
+ if context.get("config"):
+ return next()
+
+ if not context.get("db_session"):
+ slug = get_default_org_slug()
+ db_session = refetch_db_session(slug)
+ context["db_session"] = db_session
+ else:
+ # handle_message_events() flow can contain a db_session in context
+ db_session = context["db_session"]
+
+ project_id = project_service.get_default(db_session=db_session).id
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session,
+ project_id=project_id,
+ plugin_type="conversation",
+ )
+
+ context["config"] = plugin.configuration
+ next()
+
+
+def get_default_org_slug() -> str:
+ with get_session() as db_session:
+ slug = organization_service.get_default(db_session=db_session).slug
+ return slug
diff --git a/src/dispatch/plugins/dispatch_slack/modals/common.py b/src/dispatch/plugins/dispatch_slack/modals/common.py
new file mode 100644
index 000000000000..76c4c1bc6651
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_slack/modals/common.py
@@ -0,0 +1,80 @@
+import logging
+from blockkit import Modal, Section
+from pydantic import ValidationError
+from slack_sdk.errors import SlackApiError
+from slack_sdk.web.client import WebClient
+
+from dispatch.plugins.dispatch_slack.enums import SlackAPIErrorCode
+
+log = logging.getLogger(__name__)
+
+
+def send_success_modal(
+ client: WebClient,
+ view_id: str,
+ title: str,
+ trigger_id: str | None = None,
+ message: str | None = "Success!",
+ blocks: list[Section] | None = None,
+ close_title: str = "Close",
+) -> None:
+ """Send a success modal to a user in Slack.
+
+ Args:
+ client (WebClient): A Slack WebClient instance.
+ view_id (str): The ID of the view to update.
+ title (str): The title of the modal.
+ trigger_id (str | None, optional): The trigger ID used to identify the source of the modal action. Defaults to None.
+ message (str | None, optional): The success message to display in the modal. Defaults to "Success!".
+ blocks (list[Section] | None, optional): A list of additional blocks to display in the modal. Defaults to None.
+ close_title (str, optional): The title of the close button in the modal. Defaults to "Close".
+
+ Note:
+ This function catches the SlackApiError exception "not_found" and logs a warning message instead of raising the exception.
+ This error usually indicates that the user closed the loading modal early and is transparent to the end user.
+
+ Example:
+ from slack_sdk import WebClient
+
+ slack_client = WebClient(token="your_slack_bot_token")
+ view_id = "your_view_id"
+ title = "Success Modal"
+ trigger_id = "your_trigger_id"
+
+ send_success_modal(
+ client=slack_client,
+ view_id=view_id,
+ title=title,
+ trigger_id=trigger_id,
+ message="Your request was processed successfully!",
+ close_title="Close",
+ )
+ """
+ try:
+ modal = Modal(
+ title=title,
+ close=close_title,
+ blocks=[Section(text=message)] if not blocks else blocks,
+ ).build()
+ except ValidationError as e:
+ log.error(
+ f"Blockkit raised an exception building success modal, falling back to default: {e}"
+ )
+ modal = Modal(
+ title="Done",
+ close="Close",
+ blocks=[Section(text="Success!")],
+ ).build()
+
+ try:
+ client.views_update(
+ view_id=view_id,
+ trigger_id=trigger_id if trigger_id else None,
+ view=modal,
+ )
+ except SlackApiError as e:
+ if e.response["error"] == SlackAPIErrorCode.VIEW_NOT_FOUND:
+ e.add_note(
+ "This error usually indicates that the user closed the loading modal early and is transparent to the end user."
+ )
+ log.warning(f"Failed to send success Modal: {e}")
diff --git a/src/dispatch/plugins/dispatch_slack/models.py b/src/dispatch/plugins/dispatch_slack/models.py
new file mode 100644
index 000000000000..d3391ed3aba1
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_slack/models.py
@@ -0,0 +1,112 @@
+"""Models for Slack command payloads in the Dispatch application."""
+
+from typing import TypedDict, NewType
+from pydantic import BaseModel, AnyHttpUrl, ConfigDict
+import logging
+from dispatch.enums import DispatchEnum
+
+log = logging.getLogger(__name__)
+
+
+class SlackCommandPayload(TypedDict):
+ """TypedDict for Slack command payload values."""
+
+ token: str
+ team_id: str
+ team_domain: str
+ channel_id: str
+ channel_name: str
+ user_id: str
+ user_name: str
+ command: str
+ text: str
+ api_app_id: str
+ is_enterprise_install: str
+ response_url: str
+ trigger_id: str
+
+
+class SubjectMetadata(BaseModel):
+ """Base model for subject metadata in Slack payloads."""
+
+ id: str | None = None
+ type: str | None = None
+ organization_slug: str = "default"
+
+ project_id: str | None = None
+ channel_id: str | None = None
+ thread_id: str | None = None
+
+ model_config = ConfigDict(
+ coerce_numbers_to_str=True
+ ) # allow coercion of id from number to string
+
+
+class AddUserMetadata(SubjectMetadata):
+ """Model for metadata when adding users."""
+
+ users: list[str]
+
+
+class EngagementMetadata(SubjectMetadata):
+ """Model for engagement-related metadata."""
+
+ signal_instance_id: str
+ engagement_id: int
+ user: str | None = None
+
+
+class TaskMetadata(SubjectMetadata):
+ """Model for task-related metadata."""
+
+ task_id: str | None = None
+ resource_id: str | None = None
+ action_type: str
+
+
+class MonitorMetadata(SubjectMetadata):
+ """Model for monitor-related metadata."""
+
+ weblink: AnyHttpUrl | None = None
+ plugin_instance_id: int
+
+
+class BlockSelection(BaseModel):
+ """Model for a block selection in Slack forms."""
+
+ name: str
+ value: str
+
+
+FormData = NewType(
+ "FormData",
+ dict[
+ str,
+ str | BlockSelection | list[BlockSelection],
+ ],
+)
+
+
+class FormMetadata(SubjectMetadata):
+ """Model for form metadata in Slack payloads."""
+
+ form_data: FormData
+
+
+class CaseSubjects(DispatchEnum):
+ """Enum for case subjects."""
+
+ case = "case"
+
+
+class IncidentSubjects(DispatchEnum):
+ """Enum for incident subjects."""
+
+ incident = "incident"
+
+
+class SignalSubjects(DispatchEnum):
+ """Enum for signal subjects."""
+
+ signal = "signal"
+ signal_instance = "signal_instance"
diff --git a/src/dispatch/plugins/dispatch_slack/plugin.py b/src/dispatch/plugins/dispatch_slack/plugin.py
index 9ee6def276aa..44f8eff6511a 100644
--- a/src/dispatch/plugins/dispatch_slack/plugin.py
+++ b/src/dispatch/plugins/dispatch_slack/plugin.py
@@ -5,107 +5,299 @@
:license: Apache, see LICENSE for more details.
.. moduleauthor:: Kevin Glisson
"""
-from joblib import Memory
-from typing import List, Optional
-import logging
-import os
-import re
-import slack
+import io
+import json
+import logging
+from typing import Any
+from blockkit import Message
+from blockkit.surfaces import Block
+from slack_sdk.errors import SlackApiError
+from sqlalchemy.orm import Session
+
+from dispatch.auth.models import DispatchUser
+from dispatch.case.models import Case
from dispatch.conversation.enums import ConversationCommands
from dispatch.decorators import apply, counter, timer
-from dispatch.exceptions import DispatchPluginException
+from dispatch.plugin import service as plugin_service
from dispatch.plugins import dispatch_slack as slack_plugin
-from dispatch.plugins.bases import ConversationPlugin, DocumentPlugin, ContactPlugin
-
-from .config import (
- SLACK_API_BOT_TOKEN,
- SLACK_COMMAND_ASSIGN_ROLE_SLUG,
- SLACK_COMMAND_UPDATE_INCIDENT_SLUG,
- SLACK_COMMAND_ENGAGE_ONCALL_SLUG,
- SLACK_COMMAND_LIST_PARTICIPANTS_SLUG,
- SLACK_COMMAND_LIST_RESOURCES_SLUG,
- SLACK_COMMAND_LIST_TASKS_SLUG,
- SLACK_COMMAND_MARK_ACTIVE_SLUG,
- SLACK_COMMAND_MARK_CLOSED_SLUG,
- SLACK_COMMAND_MARK_STABLE_SLUG,
- SLACK_COMMAND_STATUS_REPORT_SLUG,
+from dispatch.plugins.bases import ContactPlugin, ConversationPlugin
+from dispatch.plugins.dispatch_slack.config import (
+ SlackContactConfiguration,
+ SlackConversationConfiguration,
+)
+from dispatch.signal.enums import SignalEngagementStatus
+from dispatch.signal.models import SignalEngagement, SignalInstance
+
+from .case.messages import (
+ create_action_buttons_message,
+ create_case_message,
+ create_genai_signal_analysis_message,
+ create_signal_engagement_message,
+ create_signal_message,
)
-from .views import router as slack_event_router
+from .endpoints import router as slack_event_router
+from .enums import SlackAPIErrorCode
+from .events import ChannelActivityEvent, ThreadActivityEvent
from .messaging import create_message_blocks
from .service import (
+ add_conversation_bookmark,
add_users_to_conversation,
+ add_users_to_conversation_thread,
archive_conversation,
+ chunks,
+ conversation_archived,
create_conversation,
- get_conversation_by_name,
+ create_slack_client,
+ does_user_exist,
+ emails_to_user_ids,
+ get_channel_activity,
get_user_avatar_url,
- get_user_email,
get_user_info_by_id,
- get_user_info_by_email,
- get_user_username,
- list_conversation_messages,
- list_conversations,
- message_filter,
- open_dialog_with_user,
+ get_user_profile_by_email,
+ is_user,
+ remove_member_from_channel,
+ rename_conversation,
resolve_user,
send_ephemeral_message,
send_message,
+ set_conversation_description,
set_conversation_topic,
+ unarchive_conversation,
+ update_message,
+ create_canvas,
+ update_canvas,
+ delete_canvas,
)
-
logger = logging.getLogger(__name__)
-command_mappings = {
- ConversationCommands.mark_active: SLACK_COMMAND_MARK_ACTIVE_SLUG,
- ConversationCommands.mark_stable: SLACK_COMMAND_MARK_STABLE_SLUG,
- ConversationCommands.mark_closed: SLACK_COMMAND_MARK_CLOSED_SLUG,
- ConversationCommands.status_report: SLACK_COMMAND_STATUS_REPORT_SLUG,
- ConversationCommands.list_tasks: SLACK_COMMAND_LIST_TASKS_SLUG,
- ConversationCommands.list_participants: SLACK_COMMAND_LIST_PARTICIPANTS_SLUG,
- ConversationCommands.assign_role: SLACK_COMMAND_ASSIGN_ROLE_SLUG,
- ConversationCommands.edit_incident: SLACK_COMMAND_UPDATE_INCIDENT_SLUG,
- ConversationCommands.engage_oncall: SLACK_COMMAND_ENGAGE_ONCALL_SLUG,
- ConversationCommands.list_resources: SLACK_COMMAND_LIST_RESOURCES_SLUG,
-}
-
@apply(counter, exclude=["__init__"])
@apply(timer, exclude=["__init__"])
class SlackConversationPlugin(ConversationPlugin):
- title = "Slack - Conversation"
+ title = "Slack Plugin - Conversation Management"
slug = "slack-conversation"
- description = "Uses slack to facilitate conversations."
+ description = "Uses Slack to facilitate conversations."
version = slack_plugin.__version__
events = slack_event_router
+ plugin_events = [ChannelActivityEvent, ThreadActivityEvent]
- author = "Kevin Glisson"
+ author = "Netflix"
author_url = "https://github.com/netflix/dispatch.git"
def __init__(self):
- self.client = slack.WebClient(token=SLACK_API_BOT_TOKEN)
+ self.configuration_schema = SlackConversationConfiguration
+
+ def create(self, name: str):
+ """Creates a new Slack conversation."""
+ client = create_slack_client(self.configuration)
+ return create_conversation(client, name, self.configuration.private_channels)
+
+ def create_threaded(self, case: Case, conversation_id: str, db_session: Session):
+ """Creates a new threaded conversation."""
+ client = create_slack_client(self.configuration)
+ blocks = create_case_message(case=case, channel_id=conversation_id)
+ response = send_message(client=client, conversation_id=conversation_id, blocks=blocks)
+ response_timestamp = response["timestamp"]
+
+ if case.signal_instances:
+ signal_response = None
+
+ # we try to generate a GenAI signal analysis message
+ try:
+ message, message_blocks = create_genai_signal_analysis_message(
+ case=case,
+ db_session=db_session,
+ )
+ if message and isinstance(message, dict):
+ # we update the genai_analysis field in the case model with the message if it's a dict
+ # if the message is a string, it means there was an error generating the analysis
+ case.genai_analysis = message
+
+ if message_blocks:
+ signal_response = send_message(
+ client=client,
+ conversation_id=conversation_id,
+ ts=response_timestamp,
+ blocks=message_blocks,
+ )
+ except Exception as e:
+ logger.exception(f"Error generating GenAI signal analysis message: {e}")
+
+ case.signal_thread_ts = (
+ signal_response.get("timestamp") if signal_response else response_timestamp
+ )
+
+ # we try to generate a signal message
+ try:
+ message = create_signal_message(
+ case_id=case.id, channel_id=conversation_id, db_session=db_session
+ )
+ signal_response = send_message(
+ client=client,
+ conversation_id=conversation_id,
+ ts=case.signal_thread_ts,
+ blocks=message,
+ )
+ if signal_response:
+ case.signal_thread_ts = signal_response.get("timestamp")
+ except Exception as e:
+ logger.exception(f"Error generating signal message: {e}")
+
+ # we try to upload the alert JSON to the case thread
+ try:
+ client.files_upload_v2(
+ channel=signal_response.get(
+ "id"
+ ), # we need the conversation ID not the name here
+ thread_ts=case.signal_thread_ts,
+ file=io.BytesIO(json.dumps(case.signal_instances[0].raw, indent=4).encode()),
+ )
+ except SlackApiError as e:
+ if e.response["error"] == SlackAPIErrorCode.MISSING_SCOPE:
+ exception_message = (
+ "Error uploading alert JSON to the case thread due to a missing scope"
+ )
+ else:
+ exception_message = "Error uploading alert JSON to the case thread"
+ logger.exception(f"{exception_message}: {e}")
+
+ except Exception as e:
+ logger.exception(f"Error uploading alert JSON to the case thread: {e}")
+
+ # we try to generate action buttons
+ try:
+ message = create_action_buttons_message(
+ case=case, channel_id=conversation_id, db_session=db_session
+ )
+ send_message(
+ client=client,
+ conversation_id=conversation_id,
+ ts=case.signal_thread_ts,
+ blocks=message,
+ )
+ except Exception as e:
+ logger.exception(f"Error generating action buttons message: {e}")
+
+ db_session.commit()
+ return response
- def create(self, name: str, participants: List[dict], is_private: bool = True):
- """Creates a new slack conversation."""
- participants = [resolve_user(self.client, p)["id"] for p in participants]
- return create_conversation(self.client, name, participants, is_private)
+ def create_engagement_threaded(
+ self,
+ case: Case,
+ conversation_id: str,
+ thread_id: str,
+ user: DispatchUser,
+ engagement: SignalEngagement,
+ signal_instance: SignalInstance,
+ engagement_status: SignalEngagementStatus = SignalEngagementStatus.new,
+ ):
+ """Creates a new engagement message."""
+ client = create_slack_client(self.configuration)
+ if not does_user_exist(client=client, email=user.email):
+ not_found_msg = (
+ f"Unable to engage user: {user.email}. User not found in the Slack workspace."
+ )
+ return send_message(
+ client=client,
+ conversation_id=conversation_id,
+ text=not_found_msg,
+ ts=thread_id,
+ )
+
+ blocks = create_signal_engagement_message(
+ case=case,
+ channel_id=conversation_id,
+ user_email=user.email,
+ engagement=engagement,
+ signal_instance=signal_instance,
+ engagement_status=engagement_status,
+ )
+ return send_message(
+ client=client,
+ conversation_id=conversation_id,
+ blocks=blocks,
+ ts=thread_id,
+ )
+
+ def update_thread(self, case: Case, conversation_id: str, ts: str):
+ """Updates an existing threaded conversation."""
+ client = create_slack_client(self.configuration)
+ blocks = create_case_message(case=case, channel_id=conversation_id)
+ return update_message(client=client, conversation_id=conversation_id, ts=ts, blocks=blocks)
+
+ def update_signal_message(
+ self,
+ case_id: int,
+ conversation_id: str,
+ db_session: Session,
+ thread_id: str,
+ ):
+ """Updates the signal message."""
+ client = create_slack_client(self.configuration)
+ blocks = create_signal_message(
+ case_id=case_id, channel_id=conversation_id, db_session=db_session
+ )
+ return update_message(
+ client=client, conversation_id=conversation_id, blocks=blocks, ts=thread_id
+ )
+
+ def send_message(self, conversation_id: str, blocks: list[Block]):
+ """Sends a message with the given blocks to a conversation."""
+ client = create_slack_client(self.configuration)
+ return send_message(
+ client=client,
+ conversation_id=conversation_id,
+ blocks=blocks,
+ )
def send(
self,
conversation_id: str,
text: str,
- message_template: dict,
+ message_template: list[dict],
notification_type: str,
- items: Optional[List] = None,
- blocks: Optional[List] = None,
+ items: list | None = None,
+ blocks: list | None = None,
+ ts: str | None = None,
persist: bool = False,
**kwargs,
):
"""Sends a new message based on data and type."""
- if not blocks:
- blocks = create_message_blocks(message_template, notification_type, items, **kwargs)
-
- return send_message(self.client, conversation_id, text, blocks, persist)
+ try:
+ client = create_slack_client(self.configuration)
+ messages = []
+ if not blocks:
+ blocks = create_message_blocks(message_template, notification_type, items, **kwargs)
+
+ for c in chunks(blocks, 50):
+ messages.append(
+ send_message(
+ client,
+ conversation_id,
+ text,
+ ts,
+ Message(blocks=c).build()["blocks"],
+ persist,
+ )
+ )
+ else:
+ for c in chunks(blocks, 50):
+ messages.append(send_message(client, conversation_id, text, ts, c, persist))
+ return messages
+ except SlackApiError as exception:
+ error = exception.response["error"]
+ if error == SlackAPIErrorCode.IS_ARCHIVED:
+ # swallow send errors if the channel is archived
+ message = (
+ f"SlackAPIError trying to send: {exception.response}. "
+ f"Message: {text}. Type: {notification_type}. "
+ f"Template: {message_template}"
+ )
+ logger.error(message)
+ else:
+ raise exception
def send_direct(
self,
@@ -113,17 +305,23 @@ def send_direct(
text: str,
message_template: dict,
notification_type: str,
- items: Optional[List] = None,
- blocks: Optional[List] = None,
+ items: list | None = None,
+ ts: str | None = None,
+ blocks: list | None = None,
**kwargs,
):
- """Sends a message directly to a user."""
- user_id = resolve_user(self.client, user)["id"]
+ """Sends a message directly to a user if the user exists."""
+ client = create_slack_client(self.configuration)
+ if not does_user_exist(client, user):
+ return {}
+ user_id = resolve_user(client, user)["id"]
if not blocks:
- blocks = create_message_blocks(message_template, notification_type, items, **kwargs)
+ blocks = Message(
+ blocks=create_message_blocks(message_template, notification_type, items, **kwargs)
+ ).build()["blocks"]
- return send_message(self.client, user_id, text, blocks)
+ return send_message(client, user_id, text, ts, blocks)
def send_ephemeral(
self,
@@ -132,144 +330,346 @@ def send_ephemeral(
text: str,
message_template: dict = None,
notification_type: str = None,
- items: Optional[List] = None,
- blocks: Optional[List] = None,
+ items: list | None = None,
+ blocks: list | None = None,
**kwargs,
):
- """Sends an ephemeral message to a user in a channel."""
- user_id = resolve_user(self.client, user)["id"]
+ """Sends an ephemeral message to a user in a channel if the user exists."""
+ client = create_slack_client(self.configuration)
+ if not does_user_exist(client, user):
+ return {}
+ user_id = resolve_user(client, user)["id"]
if not blocks:
- blocks = create_message_blocks(message_template, notification_type, items, **kwargs)
-
- return send_ephemeral_message(self.client, conversation_id, user_id, text, blocks)
-
- def add(self, conversation_id: str, participants: List[str]):
- """Adds users to conversation."""
- participants = [resolve_user(self.client, p)["id"] for p in participants]
- return add_users_to_conversation(self.client, conversation_id, participants)
-
- def open_dialog(self, trigger_id: str, dialog: dict):
- """Opens a dialog with a user."""
- return open_dialog_with_user(self.client, trigger_id, dialog)
+ blocks = Message(
+ blocks=create_message_blocks(message_template, notification_type, items, **kwargs)
+ ).build()["blocks"]
+
+ archived = conversation_archived(client, conversation_id)
+ if not archived:
+ send_ephemeral_message(client, conversation_id, user_id, text, blocks)
+
+ def add(self, conversation_id: str, participants: list[str]):
+ """Adds users to conversation if it is not archived."""
+ client = create_slack_client(self.configuration)
+ archived = conversation_archived(client, conversation_id)
+ if not archived:
+ participants = [resolve_user(client, p)["id"] for p in set(participants)]
+ add_users_to_conversation(client, conversation_id, participants)
+
+ def add_to_thread(self, conversation_id: str, thread_id: str, participants: list[str]):
+ """Adds users to a thread conversation."""
+ client = create_slack_client(self.configuration)
+ user_ids = emails_to_user_ids(client=client, participants=participants)
+ add_users_to_conversation_thread(client, conversation_id, thread_id, user_ids)
def archive(self, conversation_id: str):
- """Archives conversation."""
- return archive_conversation(self.client, conversation_id)
+ """Archives a conversation."""
+ client = create_slack_client(self.configuration)
+
+ archived = conversation_archived(client, conversation_id)
+ if not archived:
+ archive_conversation(client, conversation_id)
- def get_participant_username(self, participant_id: str):
- """Gets the participant's username."""
- return get_user_username(self.client, participant_id)
+ def unarchive(self, conversation_id: str):
+ """Unarchives a conversation."""
+ client = create_slack_client(self.configuration)
+ return unarchive_conversation(client, conversation_id)
- def get_participant_email(self, participant_id: str):
- """Gets the participant's email."""
- return get_user_email(self.client, participant_id)
+ def rename(self, conversation_id: str, name: str):
+ """Renames a conversation."""
+ client = create_slack_client(self.configuration)
+ return rename_conversation(client, conversation_id, name)
def get_participant_avatar_url(self, participant_id: str):
"""Gets the participant's avatar url."""
- return get_user_avatar_url(self.client, participant_id)
+ client = create_slack_client(self.configuration)
+ return get_user_avatar_url(client, participant_id)
def set_topic(self, conversation_id: str, topic: str):
"""Sets the conversation topic."""
- return set_conversation_topic(self.client, conversation_id, topic)
+ client = create_slack_client(self.configuration)
+ return set_conversation_topic(client, conversation_id, topic)
+
+ def set_description(self, conversation_id: str, description: str):
+ """Sets the conversation description."""
+ client = create_slack_client(self.configuration)
+ return set_conversation_description(client, conversation_id, description)
+
+ def remove_user(self, conversation_id: str, user_email: str):
+ """Removes a user from a conversation.
+
+ Args:
+ conversation_id: The Slack conversation/channel ID
+ user_email: The email address of the user to remove
+
+ Returns:
+ The API response if successful, None if user not found
+
+ Raises:
+ SlackApiError: For non-recoverable Slack API errors
+ """
+ client = create_slack_client(self.configuration)
+
+ try:
+ user_info = resolve_user(client, user_email)
+ user_id = user_info.get("id")
+
+ if user_id:
+ return remove_member_from_channel(
+ client=client, conversation_id=conversation_id, user_id=user_id
+ )
+ else:
+ logger.warning(
+ "Cannot remove user %s from conversation %s: "
+ "User ID not found in resolve_user response",
+ user_email,
+ conversation_id,
+ )
+ return None
+
+ except SlackApiError as e:
+ if e.response.get("error") == SlackAPIErrorCode.USERS_NOT_FOUND:
+ logger.warning(
+ "User %s not found in Slack workspace. "
+ "Cannot remove from conversation %s. "
+ "User may have been deactivated or never had Slack access.",
+ user_email,
+ conversation_id,
+ )
+ return None
+ else:
+ # Re-raise for other Slack API errors
+ raise
+
+ def add_bookmark(self, conversation_id: str, weblink: str, title: str):
+ """Adds a bookmark to the conversation."""
+ client = create_slack_client(self.configuration)
+ return add_conversation_bookmark(client, conversation_id, weblink, title)
def get_command_name(self, command: str):
"""Gets the command name."""
+ command_mappings = {
+ ConversationCommands.assign_role: self.configuration.slack_command_assign_role,
+ ConversationCommands.update_incident: self.configuration.slack_command_update_incident,
+ ConversationCommands.engage_oncall: self.configuration.slack_command_engage_oncall,
+ ConversationCommands.executive_report: self.configuration.slack_command_report_executive,
+ ConversationCommands.list_participants: self.configuration.slack_command_list_participants,
+ ConversationCommands.list_tasks: self.configuration.slack_command_list_tasks,
+ ConversationCommands.tactical_report: self.configuration.slack_command_report_tactical,
+ ConversationCommands.escalate_case: self.configuration.slack_command_escalate_case,
+ }
return command_mappings.get(command, [])
+ def fetch_events(
+ self, db_session: Session, subject: Any, plugin_event_id: int, oldest: str = "0", **kwargs
+ ):
+ """Fetches incident or case events from the Slack plugin.
+
+ Args:
+ subject: An Incident or Case object.
+ plugin_event_id: The plugin event id.
+ oldest: The oldest timestamp to fetch events from.
+
+ Returns:
+ A sorted list of tuples (utc_dt, user_id).
+ """
+ try:
+ client = create_slack_client(self.configuration)
+ plugin_event = plugin_service.get_plugin_event_by_id(
+ db_session=db_session, plugin_event_id=plugin_event_id
+ )
+ event = self.get_event(plugin_event)
+ if event is None:
+ raise ValueError(f"No event found for Slack plugin event: {plugin_event}")
+
+ event_instance = event()
+ activity = event_instance.fetch_activity(client, subject, oldest)
+ return activity
+ except Exception as e:
+ logger.exception(
+ "An error occurred while fetching incident or case events from the Slack plugin.",
+ exc_info=e,
+ )
+ raise
+
+ def get_conversation(
+ self,
+ conversation_id: str,
+ oldest: str = "0",
+ include_user_details=False,
+ important_reaction: str | None = None,
+ ) -> list:
+ """
+ Fetches the top-level posts from a Slack conversation.
+
+ Args:
+ conversation_id (str): The ID of the Slack conversation.
+ oldest (str): The oldest timestamp to fetch messages from.
+ include_user_details (bool): Whether to resolve user name and email information.
+ important_reaction (str): Emoji reaction indicating important messages.
+
+ Returns:
+ list: A list of tuples containing the timestamp and user ID of each message.
+ """
+ client = create_slack_client(self.configuration)
+ return get_channel_activity(
+ client,
+ conversation_id,
+ oldest,
+ include_message_text=True,
+ include_user_details=include_user_details,
+ important_reaction=important_reaction,
+ )
+
+ def get_conversation_replies(self, conversation_id: str, thread_ts: str) -> list[str]:
+ """
+ Fetches replies from a specific thread in a Slack conversation.
+
+ Args:
+ conversation_id (str): The ID of the Slack conversation.
+ thread_ts (str): The timestamp of the thread to fetch replies from.
+
+ Returns:
+ list[str]: A list of replies from users in the specified thread.
+ """
+ client = create_slack_client(self.configuration)
+ conversation_replies = client.conversations_replies(
+ channel=conversation_id,
+ ts=thread_ts,
+ )["messages"]
+
+ replies = []
+ for reply in conversation_replies:
+ if is_user(config=self.configuration, user_id=reply.get("user")):
+ # we only include messages from users
+ replies.append(f"{reply['text']}")
+ return replies
+
+ def get_all_member_emails(self, conversation_id: str) -> list[str]:
+ """
+ Fetches all members of a Slack conversation.
+
+ Args:
+ conversation_id (str): The ID of the Slack conversation.
+
+ Returns:
+ list[str]: A list of the emails for all members in the conversation.
+ """
+ client = create_slack_client(self.configuration)
+ member_ids = client.conversations_members(channel=conversation_id).get("members", [])
+
+ member_emails = []
+ for member_id in member_ids:
+ if is_user(config=self.configuration, user_id=member_id):
+ user = get_user_info_by_id(client, member_id)
+ if user and (profile := user.get("profile")) and (email := profile.get("email")):
+ member_emails.append(email)
+
+ return member_emails
+
+ def create_canvas(
+ self, conversation_id: str, title: str, user_emails: list[str] | None = None, content: str | None = None
+ ) -> str | None:
+ """
+ Creates a new Slack canvas in the specified conversation.
+
+ Args:
+ conversation_id (str): The ID of the Slack conversation where the canvas will be created.
+ title (str): The title of the canvas.
+ user_emails (list[str], optional): List of email addresses to grant editing permissions to.
+ content (str, optional): The markdown content of the canvas. Defaults to None.
+
+ Returns:
+ str | None: The ID of the created canvas, or None if creation failed.
+ """
+ if user_emails is None:
+ user_emails = []
+
+ client = create_slack_client(self.configuration)
+
+ user_ids = emails_to_user_ids(client, user_emails)
+
+ result = create_canvas(
+ client=client,
+ conversation_id=conversation_id,
+ title=title,
+ user_ids=user_ids,
+ content=content,
+ )
+ if result is None:
+ # not inside an except block, so log an error rather than an exception
+ logger.error(f"Failed to create canvas in conversation {conversation_id}")
+ return result
+
+ def edit_canvas(self, canvas_id: str, content: str) -> bool:
+ """
+ Edits an existing Slack canvas.
+
+ Args:
+ canvas_id (str): The ID of the canvas to edit.
+ content (str): The new markdown content for the canvas.
+
+ Returns:
+ bool: True if the canvas was successfully edited, False otherwise.
+ """
+ client = create_slack_client(self.configuration)
+ return update_canvas(client=client, canvas_id=canvas_id, content=content)
+
+ def delete_canvas(self, canvas_id: str) -> bool:
+ """
+ Deletes a Slack canvas.
+
+ Args:
+ canvas_id (str): The ID of the canvas to delete.
+
+ Returns:
+ bool: True if the canvas was successfully deleted, False otherwise.
+ """
+ client = create_slack_client(self.configuration)
+ return delete_canvas(client=client, canvas_id=canvas_id)
+
@apply(counter, exclude=["__init__"])
@apply(timer, exclude=["__init__"])
class SlackContactPlugin(ContactPlugin):
- title = "Slack - Contact"
+ title = "Slack Plugin - Contact Information Resolver"
slug = "slack-contact"
- description = "Uses slack to resolve user details."
+ description = "Uses Slack to resolve contact information details."
version = slack_plugin.__version__
- author = "Kevin Glisson"
+ author = "Netflix"
author_url = "https://github.com/netflix/dispatch.git"
def __init__(self):
- self.client = slack.WebClient(token=SLACK_API_BOT_TOKEN)
+ self.configuration_schema = SlackContactConfiguration
- def get(self, email: str):
+ def get(self, email: str, **kwargs):
"""Fetch user info by email."""
- info = get_user_info_by_email(self.client, email)
- profile = info["profile"]
+ client = create_slack_client(self.configuration)
+ team = department = "Unknown"
+ weblink = ""
+
+ profile = get_user_profile_by_email(client, email)
+ profile_fields = profile.get("fields")
+ if profile_fields:
+ team = profile_fields.get(self.configuration.profile_team_field_id, {}).get(
+ "value", "Unknown"
+ )
+ department = profile_fields.get(self.configuration.profile_department_field_id, {}).get(
+ "value", "Unknown"
+ )
+ weblink = str(
+ profile_fields.get(self.configuration.profile_weblink_field_id, {}).get("value", "")
+ )
return {
"fullname": profile["real_name"],
- "email": profile["email"],
- "title": "",
- "team": "",
- "department": "",
- "location": info["tz"],
- "weblink": "",
+ # https://api.slack.com/methods/users.profile.get#email-addresses
+ "email": profile.get("email", email),
+ "title": profile["title"],
+ "team": team,
+ "department": department,
+ "location": profile["tz"],
+ "weblink": weblink,
"thumbnail": profile["image_512"],
}
-
-
-class SlackDocumentPlugin(DocumentPlugin):
- title = "Slack - Document"
- slug = "slack-document"
- description = "Uses slack as a document source"
- version = slack_plugin.__version__
-
- author = "Kevin Glisson"
- author_url = "https://github.com/netflix/dispatch.git"
-
- def __init__(self):
- self.cachedir = os.path.dirname(os.path.realpath(__file__))
- self.memory = Memory(cachedir=self.cachedir, verbose=0)
- self.client = slack.WebClient(token=SLACK_API_BOT_TOKEN)
-
- def get(self, **kwargs) -> dict:
- """Queries slack for documents."""
- conversations = []
-
- if kwargs["channels"]:
- logger.debug(f"Querying slack for documents. Channels: {kwargs['channels']}")
-
- channels = kwargs["channels"].split(",")
- for c in channels:
- conversations.append(get_conversation_by_name(self.client, c))
-
- if kwargs["channel_match_pattern"]:
- try:
- regex = kwargs["channel_match_pattern"]
- pattern = re.compile(regex)
- except re.error as e:
- raise DispatchPluginException(
- message=f"Invalid regex. Is everything escaped properly? Regex: '{regex}' Message: {e}"
- )
-
- logger.debug(
- f"Querying slack for documents. ChannelsPattern: {kwargs['channel_match_pattern']}"
- )
- for c in list_conversations(self.client):
- if pattern.match(c["name"]):
- conversations.append(c)
-
- for c in conversations:
- logger.info(f'Fetching channel messages. Channel Name: {c["name"]}')
-
- messages = list_conversation_messages(self.client, c["id"], lookback=kwargs["lookback"])
-
- logger.info(f'Found {len(messages)} messages in slack. Channel Name: {c["name"]}')
-
- for m in messages:
- if not message_filter(m):
- continue
-
- user_email = get_user_info_by_id(self.client, m["user"])["user"]["profile"]["email"]
-
- yield {
- "person": {"email": user_email},
- "doc": {
- "text": m["text"],
- "is_private": c["is_private"],
- "subject": c["name"],
- "source": "slack",
- },
- "ref": {"timestamp": m["ts"]},
- }
diff --git a/src/dispatch/plugins/dispatch_slack/service.py b/src/dispatch/plugins/dispatch_slack/service.py
index db463a3d074b..31f990625645 100644
--- a/src/dispatch/plugins/dispatch_slack/service.py
+++ b/src/dispatch/plugins/dispatch_slack/service.py
@@ -1,373 +1,935 @@
-"""
-.. module: dispatch.plugins.dispatch_slack.service
- :platform: Unix
- :copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
- :license: Apache, see LICENSE for more details.
-.. moduleauthor:: Kevin Glisson
-"""
-from datetime import datetime, timezone
-from tenacity import TryAgain, retry, retry_if_exception_type, stop_after_attempt
-from typing import Any, Dict, List, Optional
import functools
+import heapq
import logging
import re
-import slack
-import time
-
-from .config import SLACK_API_BOT_TOKEN, SLACK_USER_ID_OVERRIDE, SLACK_APP_USER_SLUG
-
+from datetime import datetime
+
+from blockkit.surfaces import Block
+from blockkit import Divider, Message, Section
+from requests import Timeout
+from slack_sdk.errors import SlackApiError
+from slack_sdk.web.client import WebClient
+from slack_sdk.web.slack_response import SlackResponse
+from tenacity import (
+ RetryCallState,
+ retry,
+ retry_if_exception,
+ stop_after_attempt,
+ wait_exponential,
+)
+
+from .config import SlackConversationConfiguration
+from .enums import SlackAPIErrorCode, SlackAPIGetEndpoints, SlackAPIPostEndpoints
+
+Conversation = dict[str, str]
log = logging.getLogger(__name__)
-class NoConversationFoundException(Exception):
- pass
+class WebClientWrapper:
+ """A wrapper for WebClient that makes all instances with the same token compare equal, for caching."""
+ def __init__(self, client):
+ self._client = client
-def create_slack_client(run_async=False):
- return slack.WebClient(token=SLACK_API_BOT_TOKEN, run_async=run_async)
+ @property
+ def client(self):
+ return self._client
+ def __eq__(self, other):
+ return other._client.token == self._client.token
-def contains_numbers(string):
- return any(char.isdigit() for char in string)
+ def __hash__(self):
+ return hash(self._client.token)
-def resolve_user(client: Any, user_id: str):
- """Attempts to resolve a user object regardless if email, id, or prefix."""
- if SLACK_USER_ID_OVERRIDE:
- log.warning("SLACK_USER_ID_OVERIDE set. Using override.")
- return {"id": SLACK_USER_ID_OVERRIDE}
+def create_slack_client(config: SlackConversationConfiguration) -> WebClient:
+ """Creates a Slack Web API client."""
+ return WebClient(token=config.api_bot_token.get_secret_value())
+
+def resolve_user(client: WebClient, user_id: str) -> dict:
+ """Attempts to resolve a user object from an email, an id, or a prefix."""
if "@" in user_id:
return get_user_info_by_email(client, user_id)
-
return {"id": user_id}
-def chunks(l, n):
- """Yield successive n-sized chunks from l."""
- for i in range(0, len(l), n):
- yield l[i : i + n]
-
-
-def paginated(data_key):
- def decorator(func):
- @functools.wraps(func)
- def decorated_function(*args, **kwargs):
- results = []
-
- while True:
- response = func(*args, **kwargs)
- results += response[data_key]
-
- # stop if we hit an empty string
- next_cursor = response["response_metadata"]["next_cursor"]
- if not next_cursor:
- break
+def emails_to_user_ids(client: WebClient, participants: list[str]) -> list[str]:
+ """
+ Resolves a list of email addresses to Slack user IDs.
+
+ This function takes a list of email addresses and attempts to resolve them to Slack user IDs.
+ If a user cannot be found for a given email address, it logs a warning and continues with the next email.
+ If an error other than a user not found occurs, it logs the exception.
+
+ Args:
+ client (WebClient): A Slack WebClient object used to interact with the Slack API.
+ participants (list[str]): A list of participant email addresses to resolve.
+
+ Returns:
+ list[str]: A list of resolved user IDs.
+
+ Raises:
+ SlackApiError: If an error other than a user not found occurs.
+
+ Example:
+ >>> from slack_sdk import WebClient
+ >>> client = WebClient(token="your-slack-token")
+ >>> emails = ["user1@example.com", "user2@example.com"]
+ >>> user_ids = emails_to_user_ids(client, emails)
+ >>> print(user_ids)
+ ["U01ABCDE1", "U01ABCDE2"]
+ """
+ user_ids = []
+
+ for participant in set(participants):
+ try:
+ user_id = resolve_user(client, participant)["id"]
+ except SlackApiError as e:
+ msg = f"Unable to resolve Slack participant {participant}: {e}"
+
+ if e.response["error"] == SlackAPIErrorCode.USERS_NOT_FOUND:
+ log.warning(msg)
+ continue
+ else:
+ log.exception(msg)
+ continue
+ else:
+ user_ids.append(user_id)
- kwargs.update({"cursor": next_cursor})
+ return user_ids
- return results
- return decorated_function
+def chunks(ids, n):
+ """Yield successive n-sized chunks from ids."""
+ for i in range(0, len(ids), n):
+ yield ids[i : i + n]
+
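Since `conversations.invite` accepts at most 30 users per call, `chunks` batches the ids before inviting. A quick illustration of the batching:

```python
def chunks(ids, n):
    """Yield successive n-sized chunks from ids."""
    for i in range(0, len(ids), n):
        yield ids[i : i + n]


user_ids = [f"U{i:03d}" for i in range(7)]
batches = list(chunks(user_ids, 3))
# batches -> [['U000', 'U001', 'U002'], ['U003', 'U004', 'U005'], ['U006']]
```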
+
+def should_retry(exception: Exception) -> bool:
+ """
+ Determine if a retry should be attempted based on the exception type.
+
+ Args:
+ exception (Exception): The exception that was raised.
+
+ Returns:
+ bool: True if a retry should be attempted, False otherwise.
+ """
+ match exception:
+ case SlackApiError():
+ # Don't retry for exceptions we have defined.
+ return exception.response["error"] not in SlackAPIErrorCode.__members__.values()
+ case TimeoutError() | Timeout():
+ # Always retry on timeout errors
+ return True
+ case _:
+ # Don't retry for other types of exceptions
+ return False
+
+
+def get_wait_time(retry_state: RetryCallState) -> int | float:
+ """
+ Determine the wait time before the next retry attempt.
+
+ Args:
+ retry_state (RetryCallState): The current state of the retry process.
+
+ Returns:
+ int | float: The number of seconds to wait before the next retry.
+ """
+ exception = retry_state.outcome.exception()
+ match exception:
+ case SlackApiError() if "Retry-After" in exception.response.headers:
+ # Use the Retry-After header value if present
+ return int(exception.response.headers["Retry-After"])
+ case _:
+ # Use exponential backoff for other cases
+ return wait_exponential(multiplier=1, min=1, max=60)(retry_state)
+
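The wait policy prefers Slack's `Retry-After` header and otherwise backs off exponentially. A simplified standalone version of that decision (the real code delegates the backoff arithmetic to tenacity's `wait_exponential`, so the exact curve below is only an approximation):

```python
def wait_seconds(headers: dict, attempt: int) -> int:
    # honor Slack's Retry-After header when present...
    if "Retry-After" in headers:
        return int(headers["Retry-After"])
    # ...otherwise back off exponentially, clamped to the 1..60 second range
    return min(60, max(1, 2 ** (attempt - 1)))


wait_seconds({"Retry-After": "30"}, 1)  # -> 30
wait_seconds({}, 3)                     # -> 4
wait_seconds({}, 10)                    # -> 60 (capped)
```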
+
+@retry(
+ stop=stop_after_attempt(5),
+ retry=retry_if_exception(should_retry),
+ wait=get_wait_time,
+)
+def make_call(
+ client: WebClient,
+ endpoint: str,
+ **kwargs,
+) -> SlackResponse:
+ """
+ Make a call to the Slack API with built-in retry logic.
+
+ Args:
+ client (WebClient): The Slack WebClient instance.
+ endpoint (str): The Slack API endpoint to call.
+ **kwargs: Additional keyword arguments to pass to the API call.
+
+ Returns:
+ SlackResponse: The response from the Slack API.
+
+ Raises:
+ SlackApiError: If there's an error from the Slack API.
+ TimeoutError: If the request times out.
+ Timeout: If the request times out (from requests library).
+ """
+ try:
+ if endpoint in SlackAPIGetEndpoints:
+ # Use GET method for specific endpoints
+ return client.api_call(endpoint, http_verb="GET", params=kwargs)
+ # Use POST method (default) for other endpoints
+ return client.api_call(endpoint, json=kwargs)
+ except (SlackApiError, TimeoutError, Timeout) as exc:
+ log.warning(
+ f"{type(exc).__name__} for Slack API. Endpoint: {endpoint}. Kwargs: {kwargs}",
+ exc_info=exc if isinstance(exc, SlackApiError) else None,
+ )
+ raise
+
+
+def list_conversation_messages(client: WebClient, conversation_id: str, **kwargs) -> SlackResponse:
+ """Returns a list of conversation messages."""
+ return make_call(
+ client, SlackAPIGetEndpoints.conversations_history, channel=conversation_id, **kwargs
+ )
- return decorator
+@functools.lru_cache()
+def _get_domain(wrapper: WebClientWrapper) -> str:
+ """Gets the team's Slack domain."""
+ return make_call(wrapper.client, SlackAPIGetEndpoints.team_info)["team"]["domain"]
-def time_pagination(data_key):
- def decorator(func):
- @functools.wraps(func)
- def decorated_function(*args, **kwargs):
- results = []
- while True:
- response = func(*args, **kwargs)
- results += response[data_key]
+def get_domain(client: WebClient) -> str:
+ """Gets the team's Slack domain."""
+ return _get_domain(WebClientWrapper(client))
- # stop if we hit an empty string
- if not response["has_more"]:
- break
- kwargs.update({"latest": response["messages"][0]["ts"]})
+@functools.lru_cache()
+def _get_user_info_by_id(wrapper: WebClientWrapper, user_id: str) -> dict:
+ return make_call(wrapper.client, SlackAPIGetEndpoints.users_info, user=user_id)["user"]
- return results
- return decorated_function
+def get_user_info_by_id(client: WebClient, user_id: str) -> dict:
+ """Gets profile information about a user by id."""
+ return _get_user_info_by_id(WebClientWrapper(client), user_id)
- return decorator
+@functools.lru_cache()
+def _get_user_info_by_email(wrapper: WebClientWrapper, email: str) -> dict:
+ """Gets profile information about a user by email."""
+ return make_call(wrapper.client, SlackAPIGetEndpoints.users_lookup_by_email, email=email)[
+ "user"
+ ]
-# NOTE I don't like this but slack client is annoying (kglisson)
-SLACK_GET_ENDPOINTS = ["users.lookupByEmail", "users.info"]
+def get_user_info_by_email(client: WebClient, email: str) -> dict:
+ """Gets profile information about a user by email."""
+ return _get_user_info_by_email(WebClientWrapper(client), email)
-@retry(stop=stop_after_attempt(5), retry=retry_if_exception_type(TryAgain))
-def make_call(client: Any, endpoint: str, **kwargs):
- """Make an slack client api call."""
+@functools.lru_cache()
+def _does_user_exist(wrapper: WebClientWrapper, email: str) -> bool:
+ """Checks if a user exists in the Slack workspace by their email."""
try:
- if endpoint in SLACK_GET_ENDPOINTS:
- response = client.api_call(endpoint, http_verb="GET", params=kwargs)
- else:
- response = client.api_call(endpoint, json=kwargs)
- except slack.errors.SlackApiError as e:
- log.error(f"SlackError. Response: {e.response} Endpoint: {endpoint} kwargs: {kwargs}")
-
- # NOTE we've seen some eventual consistency problems with channel creation
- if e.response["error"] == "channel_not_found":
- raise TryAgain
-
- # NOTE we've seen some eventual consistency problems after adding users to a channel
- if e.response["error"] == "user_not_in_channel":
- raise TryAgain
-
- if e.response.headers.get("Retry-After"):
- wait = int(e.response.headers["Retry-After"])
- log.info(f"SlackError: Rate limit hit. Waiting {wait} seconds.")
- time.sleep(wait)
- raise TryAgain
+ get_user_info_by_email(wrapper.client, email)
+ return True
+ except SlackApiError as e:
+ if e.response["error"] == SlackAPIErrorCode.USERS_NOT_FOUND:
+ return False
else:
- raise e
+ raise
- return response
-
-
-async def make_call_async(client: Any, endpoint: str, **kwargs):
- """Make an slack client api call."""
-
- try:
- if endpoint in SLACK_GET_ENDPOINTS:
- response = await client.api_call(endpoint, http_verb="GET", params=kwargs)
- else:
- response = await client.api_call(endpoint, json=kwargs)
- except slack.errors.SlackApiError as e:
- log.error(f"SlackError. Response: {e.response} Endpoint: {endpoint} kwargs: {kwargs}")
-
- if e.response.headers.get("Retry-After"):
- wait = int(response.headers["Retry-After"])
- log.info(f"SlackError: Rate limit hit. Waiting {wait} seconds.")
- time.sleep(wait)
- raise TryAgain
- else:
- raise e
- return response
-
-
-@paginated("channels")
-def list_conversations(client: Any, **kwargs):
- return make_call(client, "conversations.list", types="private_channel", **kwargs)
-
-
-@time_pagination("messages")
-def list_conversation_messages(client: Any, conversation_id: str, **kwargs):
- return make_call(client, "conversations.history", channel=conversation_id, **kwargs)
+def does_user_exist(client: WebClient, email: str) -> bool:
+ """Checks if a user exists in the Slack workspace by their email."""
+ return _does_user_exist(WebClientWrapper(client), email)
@functools.lru_cache()
-def get_user_info_by_id(client: Any, user_id: str):
+def _get_user_profile_by_id(wrapper: WebClientWrapper, user_id: str) -> dict:
"""Gets profile information about a user by id."""
- return make_call(client, "users.info", user=user_id)["user"]
+ return make_call(wrapper.client, SlackAPIGetEndpoints.users_profile_get, user_id=user_id)[
+ "profile"
+ ]
-@functools.lru_cache()
-async def get_user_info_by_id_async(client: Any, user_id: str):
+def get_user_profile_by_id(client: WebClient, user_id: str) -> dict:
"""Gets profile information about a user by id."""
- return (await make_call_async(client, "users.info", user=user_id))["user"]
+ return _get_user_profile_by_id(WebClientWrapper(client), user_id)
@functools.lru_cache()
-def get_user_info_by_email(client: Any, email: str):
- """Gets profile information about a user by email."""
- return make_call(client, "users.lookupByEmail", email=email)["user"]
+def _get_user_profile_by_email(wrapper: WebClientWrapper, email: str) -> dict:
+ """Gets extended profile information about a user by email."""
+ user = get_user_info_by_email(wrapper.client, email)
+ profile = get_user_profile_by_id(wrapper.client, user["id"])
+ profile["tz"] = user["tz"]
+ return profile
-def get_user_email(client: Any, user_id: str):
- """Gets the user's email."""
- return get_user_info_by_id(client, user_id)["profile"]["email"]
+def get_user_profile_by_email(client: WebClient, email: str) -> dict:
+ """Gets extended profile information about a user by email."""
+ return _get_user_profile_by_email(WebClientWrapper(client), email)
-async def get_user_email_async(client: Any, user_id: str):
+def get_user_email(client: WebClient, user_id: str) -> str | None:
"""Gets the user's email."""
- return (await get_user_info_by_id_async(client, user_id))["profile"]["email"]
+ user_info = get_user_info_by_id(client, user_id)
+ return user_info["profile"].get("email")
-def get_user_username(client: Any, user_id: str):
- """Gets the user's username."""
- return get_user_email(client, user_id).split("@")[0]
-
-
-def get_user_avatar_url(client: Any, email: str):
+def get_user_avatar_url(client: WebClient, email: str) -> str:
"""Gets the user's avatar url."""
return get_user_info_by_email(client, email)["profile"]["image_512"]
-def get_escaped_user_from_command(command_text: str):
- """Gets escaped user sent to Slack command."""
- return re.match(r"<@(?P\w+)\|(?P\w+)>", command_text).group("user_id")
+def get_conversations_by_user_id(client: WebClient, user_id: str, type: str) -> list[Conversation]:
+ result = make_call(
+ client,
+ SlackAPIGetEndpoints.users_conversations,
+ user=user_id,
+ types=f"{type}_channel",
+ exclude_archived="true",
+ )
+
+ conversations = []
+ for channel in result["channels"]:
+ conversations.append({k: v for (k, v) in channel.items() if k in ("id", "name")})
+
+ return conversations
# note this will get slower over time, we might exclude archived to make it sane
-def get_conversation_by_name(client: Any, name: str):
- """Fetches a conversation by name."""
- for c in list_conversations(client):
- if c["name"] == name:
- return c
-
-
-def get_conversation_messages_by_reaction(client: Any, conversation_id: str, reaction: str):
- """Fetches messages from a conversation by reaction type."""
- messages = []
- for m in list_conversation_messages(client, conversation_id):
- if "reactions" in m and m["reactions"][0]["name"] == reaction and m["text"].strip():
- messages.insert(
- 0,
- {
- "datetime": datetime.fromtimestamp(float(m["ts"]))
- .astimezone(timezone("America/Los_Angeles"))
- .strftime("%Y-%m-%d %H:%M:%S"),
- "message": m["text"],
- "user": get_user_info_by_id(client, m["user"])["user"]["real_name"],
- },
- )
- return messages
+def get_conversation_name_by_id(client: WebClient, conversation_id: str) -> str | None:
+ """Fetches a conversation by id and returns its name."""
+ try:
+ return make_call(client, SlackAPIGetEndpoints.conversations_info, channel=conversation_id)[
+ "channel"
+ ]["name"]
+ except SlackApiError as e:
+ if e.response["error"] == SlackAPIErrorCode.CHANNEL_NOT_FOUND:
+ return None
+ else:
+ raise e
+
+
+def set_conversation_topic(client: WebClient, conversation_id: str, topic: str) -> SlackResponse:
+ """Sets the topic of the specified conversation."""
+ return make_call(
+ client, SlackAPIPostEndpoints.conversations_set_topic, channel=conversation_id, topic=topic
+ )
-def set_conversation_topic(client: Any, conversation_id: str, topic: str):
+def set_conversation_description(
+ client: WebClient, conversation_id: str, description: str
+) -> SlackResponse:
- """Sets the topic of the specified conversation."""
+ """Sets the description (purpose) of the specified conversation."""
- return make_call(client, "conversations.setTopic", channel=conversation_id, topic=topic)
+ return make_call(
+ client,
+ SlackAPIPostEndpoints.conversations_set_purpose,
+ channel=conversation_id,
+ purpose=description,
+ )
-def set_conversation_purpose(client: Any, conversation_id: str, purpose: str):
- """Sets the purpose of the specified conversation."""
- return make_call(client, "conversations.setPurpose", channel=conversation_id, purpose=purpose)
+def add_conversation_bookmark(
+ client: WebClient, conversation_id: str, weblink: str, title: str
+) -> SlackResponse:
+ """Adds a bookmark for the specified conversation."""
+ return make_call(
+ client,
+ SlackAPIPostEndpoints.bookmarks_add,
+ channel_id=conversation_id,
+ title=title,
+ type="link",
+ link=weblink,
+ )
+
+def remove_member_from_channel(client: WebClient, conversation_id: str, user_id: str) -> None:
+ """Removes a user from a channel."""
+ log.info(f"Attempting to remove user {user_id} from channel {conversation_id}")
-def create_conversation(client: Any, name: str, participants: List[str], is_private: bool = False):
- """Make a new slack conversation."""
- participants = list(set(participants))
+ # Check if user is actually in the channel before attempting removal
+ if not is_member_in_channel(client, conversation_id, user_id):
+ log.info(f"User {user_id} is not in channel {conversation_id}, skipping removal")
+ return
+
+ return make_call(
+ client, SlackAPIPostEndpoints.conversations_kick, channel=conversation_id, user=user_id
+ )
+
+
+def create_conversation(client: WebClient, name: str, is_private: bool = False) -> dict:
+ """Make a new Slack conversation."""
response = make_call(
client,
- "conversations.create",
+ SlackAPIPostEndpoints.conversations_create,
name=name.lower(), # slack disallows upperCase
is_group=is_private,
is_private=is_private,
- # user_ids=participants, # NOTE this allows for 30 folks max
)["channel"]
- add_users_to_conversation(client, response["id"], participants)
-
return {
"id": response["id"],
"name": response["name"],
- "weblink": f"https://slack.com/app_redirect?channel={response['id']}",
+ "weblink": f"https://{get_domain(client)}.slack.com/app_redirect?channel={response['id']}",
}
-def close_conversation(client: Any, conversation_id):
- """Closes an existing conversation."""
- return make_call(client, "conversations.close", channel=conversation_id)
+def archive_conversation(client: WebClient, conversation_id: str) -> SlackResponse:
+ """Archives an existing conversation."""
+ return make_call(client, SlackAPIPostEndpoints.conversations_archive, channel=conversation_id)
-def archive_conversation(client: Any, conversation_id: str):
- """Archives an existing conversation."""
- return make_call(client, "conversations.archive", channel=conversation_id)
+def unarchive_conversation(client: WebClient, conversation_id: str) -> SlackResponse:
+ """Unarchives an existing conversation."""
+ try:
+ return make_call(
+ client, SlackAPIPostEndpoints.conversations_unarchive, channel=conversation_id
+ )
+ except SlackApiError as e:
+ # if the channel isn't archived, that's okay
+ if e.response["error"] != SlackAPIErrorCode.CHANNEL_NOT_ARCHIVED:
+ raise e
-def add_users_to_conversation(client: Any, conversation_id: str, user_ids: List[str]):
- """Add users to conversation."""
- # NOTE this will trigger a member_joined_channel event, which we will capture and run the incident.incident_add_or_reactivate_participant_flow() as a result
- for c in chunks(user_ids, 30): # NOTE api only allows 30 at a time.
- make_call(client, "conversations.invite", users=c, channel=conversation_id)
+def rename_conversation(client: WebClient, conversation_id: str, name: str) -> SlackResponse:
+ """Renames an existing conversation."""
+ return make_call(
+ client,
+ SlackAPIPostEndpoints.conversations_rename,
+ channel=conversation_id,
+ name=name.lower(),
+ )
-@paginated("members")
-def get_conversation_members(
- client: Any, conversation_id: str, include_bots: bool = False, **kwargs
-):
- response = make_call(client, "conversations.members", channel=conversation_id, **kwargs)
+def conversation_archived(client: WebClient, conversation_id: str) -> bool | None:
+ """Returns whether a given conversation has been archived or not."""
+ try:
+ return make_call(client, SlackAPIGetEndpoints.conversations_info, channel=conversation_id)[
+ "channel"
+ ]["is_archived"]
+ except SlackApiError as e:
+ if e.response["error"] == SlackAPIErrorCode.CHANNEL_NOT_FOUND:
+ return None
+ else:
+ raise e
- details = []
- for m in response["members"]:
- details.append(make_call(client, "users.info", user=m)["user"])
- response["members"] = details
- return response
+def add_users_to_conversation_thread(
+ client: WebClient,
+ conversation_id: str,
+ thread_id: str,
+ user_ids: list[str],
+) -> None:
+ """Adds user to a threaded conversation."""
+
+ users = [f"<@{user_id}>" for user_id in user_ids]
+ if users:
+ # @'ing them isn't enough if they aren't already in the channel
+ add_users_to_conversation(client=client, conversation_id=conversation_id, user_ids=user_ids)
+ blocks = Message(
+ blocks=[
+ Section(
+ text="Adding the following individuals to help resolve this case:", fields=users
+ )
+ ]
+ ).build()["blocks"]
+ send_message(client=client, conversation_id=conversation_id, blocks=blocks, ts=thread_id)
+
+
+def add_users_to_conversation(client: WebClient, conversation_id: str, user_ids: list[str]) -> None:
+ """Add users to conversation."""
+ # NOTE this will trigger a member_joined_channel event, which we will capture and run
+ # the incident.incident_add_or_reactivate_participant_flow() as a result
+ for c in chunks(user_ids, 30): # NOTE api only allows 30 at a time.
+ try:
+ make_call(
+ client, SlackAPIPostEndpoints.conversations_invite, users=c, channel=conversation_id
+ )
+ except SlackApiError as e:
+ # sometimes slack sends duplicate member_join events
+ # that result in folks already existing in the channel.
+ if e.response["error"] not in (
+ SlackAPIErrorCode.USER_IN_CHANNEL,
+ SlackAPIErrorCode.ALREADY_IN_CHANNEL,
+ ):
+ raise
-def get_conversation_details(client: Any, conversation_id):
- """Get conversation details."""
- return make_call(client, "conversations.info", channel=conversation_id)
+def get_message_permalink(client: WebClient, conversation_id: str, ts: str) -> str:
+ return make_call(
+ client,
+ SlackAPIGetEndpoints.chat_permalink,
+ channel=conversation_id,
+ message_ts=ts,
+ )["permalink"]
def send_message(
- client: Any, conversation_id: str, text: str = None, blocks: Dict = None, persist: bool = False
-):
+ client: WebClient,
+ conversation_id: str,
+ text: str | None = None,
+ ts: str | None = None,
+ blocks: list[dict] | None = None,
+ persist: bool = False,
+) -> dict:
"""Sends a message to the given conversation."""
response = make_call(
- client, "chat.postMessage", channel=conversation_id, text=text, blocks=blocks
+ client,
+ SlackAPIPostEndpoints.chat_post_message,
+ channel=conversation_id,
+ text=text,
+ thread_ts=ts,
+ blocks=blocks,
+ unfurl_links=False,
)
if persist:
add_pin(client, response["channel"], response["ts"])
- return {"id": response["channel"], "timestamp": response["ts"]}
+ return {
+ "id": response["channel"],
+ "timestamp": response["ts"],
+ "weblink": get_message_permalink(client, response["channel"], response["ts"]),
+ }
-def send_ephemeral_message(
- client: Any, conversation_id: str, user_id: str, text: str, blocks: Optional[List] = None
-):
- """Sends an ephemeral message to a user in a channel."""
+def update_message(
+ client: WebClient,
+ conversation_id: str,
+ text: str | None = None,
+ ts: str | None = None,
+ blocks: list[dict] | None = None,
+) -> dict:
+ """Updates a message for the given conversation."""
response = make_call(
client,
- "chat.postEphemeral",
+ SlackAPIPostEndpoints.chat_update,
channel=conversation_id,
- user=user_id,
text=text,
+ ts=ts,
blocks=blocks,
)
- return {"id": response["channel"], "timestamp": response["ts"]}
+ return {
+ "id": response["channel"],
+ "timestamp": response["ts"],
+ "weblink": get_message_permalink(client, response["channel"], response["ts"]),
+ }
-# TODO what about pagination?
-def list_pins(client: Any, conversation_id: str):
- """Lists all pins for conversation."""
- return make_call(client, "pins.list", channel=conversation_id)
+def send_ephemeral_message(
+ client: WebClient,
+ conversation_id: str,
+ user_id: str,
+ text: str,
+ blocks: list | None = None,
+ thread_ts: str | None = None,
+) -> dict:
+ """Sends an ephemeral message to a user in a channel or thread."""
+ if thread_ts:
+ response = make_call(
+ client,
+ SlackAPIPostEndpoints.chat_post_ephemeral,
+ channel=conversation_id,
+ user=user_id,
+ text=text,
+ thread_ts=thread_ts,
+ blocks=blocks,
+ )
+ else:
+ response = make_call(
+ client,
+ SlackAPIPostEndpoints.chat_post_ephemeral,
+ channel=conversation_id,
+ user=user_id,
+ text=text,
+ blocks=blocks,
+ )
+
+ return {"id": response["channel"], "timestamp": response["ts"]}
-def add_pin(client: Any, conversation_id: str, timestamp: str):
+def add_pin(client: WebClient, conversation_id: str, timestamp: str) -> SlackResponse:
"""Adds a pin to a conversation."""
- return make_call(client, "pins.add", channel=conversation_id, timestamp=timestamp)
+ return make_call(
+ client, SlackAPIPostEndpoints.pins_add, channel=conversation_id, timestamp=timestamp
+ )
-def remove_pin(client: Any, conversation_id: str, timestamp: str):
- """Removed pin from conversation."""
- return make_call(client, "pins.remove", channel=conversation_id, timestamp=timestamp)
+def is_user(config: SlackConversationConfiguration, user_id: str) -> bool:
+ """Returns True for a regular user, False for the Dispatch app or Slackbot."""
+ return user_id != config.app_user_slug and user_id != "USLACKBOT"
+
+
+def get_thread_activity(
+ client: WebClient, conversation_id: str, ts: str, oldest: str = "0"
+) -> list:
+ """Gets all messages for a given Slack thread.
+
+ Returns:
+ A sorted list of tuples (utc_dt, user_id) of each thread reply.
+ """
+ result = []
+ cursor = None
+ while True:
+ response = make_call(
+ client,
+ SlackAPIGetEndpoints.conversations_replies,
+ channel=conversation_id,
+ ts=ts,
+ cursor=cursor,
+ oldest=oldest,
+ )
+ if not response["ok"] or "messages" not in response:
+ break
+
+ for message in response["messages"]:
+ if "bot_id" in message:
+ continue
+
+ # Resolves users for messages.
+ if "user" in message:
+ user_id = resolve_user(client, message["user"])["id"]
+ heapq.heappush(result, (datetime.utcfromtimestamp(float(message["ts"])), user_id))
+
+ if not response["has_more"]:
+ break
+ cursor = response["response_metadata"]["next_cursor"]
+
+ return heapq.nsmallest(len(result), result)
+
+
+def has_important_reaction(message: dict, important_reaction: str | None) -> bool:
+ """Returns True if the message carries the designated important reaction."""
+ if not important_reaction:
+ return False
+ for reaction in message.get("reactions", []):
+ if reaction["name"] == important_reaction:
+ return True
+ return False
+
+
+def get_channel_activity(
+ client: WebClient,
+ conversation_id: str,
+ oldest: str = "0",
+ include_message_text: bool = False,
+ include_user_details: bool = False,
+ important_reaction: str | None = None,
+) -> list:
+ """Gets all top-level messages for a given Slack channel.
+
+ Args:
+ client (WebClient): Slack client responsible for API calls
+ conversation_id (str): Channel ID to reference
+ oldest (str): Oldest timestamp to fetch messages from
+ include_message_text (bool): Include message text (in addition to datetime and user id)
+ include_user_details (bool): Include user name and email information
+ important_reaction (str): Optional emoji reaction designating important messages
+
+ Returns:
+ A sorted list of tuples (utc_dt, user_id) of each message in the channel,
+ or (utc_dt, user_id, message_text), depending on include_message_text.
+ """
+ result = []
+ cursor = None
+
+ def mention_resolver(user_match):
+ """
+ Helper function to extract user information from @-mentions in messages.
+ """
+ user_id = user_match.group(1)
+ try:
+ user_info = get_user_info_by_id(client, user_id)
+ return user_info.get("real_name", f"{user_id} (name not found)")
+ except SlackApiError as e:
+ log.warning(f"Error resolving mentioned Slack user: {e}")
+ # fall back on id
+ return user_id
+
+ while True:
+ response = make_call(
+ client,
+ SlackAPIGetEndpoints.conversations_history,
+ channel=conversation_id,
+ cursor=cursor,
+ oldest=oldest,
+ )
+
+ if not response["ok"] or "messages" not in response:
+ break
+
+ for message in response["messages"]:
+ if "bot_id" in message:
+ continue
+
+ # Resolves users for messages.
+ if "user" in message:
+ user_id = resolve_user(client, message["user"])["id"]
+ utc_dt = datetime.utcfromtimestamp(float(message["ts"]))
+
+ message_result = [utc_dt, user_id]
+
+ if include_message_text:
+ message_text = message.get("text", "")
+ if has_important_reaction(message, important_reaction):
+ message_text = f"IMPORTANT!: {message_text}"
+
+ if include_user_details: # attempt to resolve mentioned users
+ message_text = re.sub(r"<@(\w+)>", mention_resolver, message_text)
+
+ message_result.append(message_text)
+
+ if include_user_details:
+ user_details = get_user_info_by_id(client, user_id)
+ user_name = user_details.get("real_name", "Name not found")
+ user_profile = user_details.get("profile", {})
+ user_display_name = user_profile.get(
+ "display_name_normalized", "DisplayName not found"
+ )
+ user_email = user_profile.get("email", "Email not found")
+ message_result.extend([user_name, user_display_name, user_email])
+
+ heapq.heappush(result, tuple(message_result))
+
+ if not response["has_more"]:
+ break
+ cursor = response["response_metadata"]["next_cursor"]
+
+ return heapq.nsmallest(len(result), result)
+
+
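The message-collection helpers above share a cursor-paging plus heap pattern: each page of replies is walked, `(utc_dt, user_id)` tuples are pushed onto a heap, and `heapq.nsmallest` pops them back in ascending time order. A minimal standalone sketch of that pattern (fake message dicts stand in for Slack API pages; not part of the diff):

```python
import heapq
from datetime import datetime, timezone


def collect_sorted(pages: list[list[dict]]) -> list[tuple[datetime, str]]:
    """Push (utc_dt, user_id) per message, then return them in ascending time order."""
    result = []
    for page in pages:  # each page mimics one paginated API response
        for message in page:
            utc_dt = datetime.fromtimestamp(float(message["ts"]), tz=timezone.utc)
            heapq.heappush(result, (utc_dt, message["user"]))
    return heapq.nsmallest(len(result), result)


pages = [
    [{"ts": "1618698903.0", "user": "U2"}],
    [{"ts": "1618698901.0", "user": "U1"}],
]
print([user for _, user in collect_sorted(pages)])  # → ['U1', 'U2']
```

Because the tuples sort on their first element, the heap keeps messages ordered by timestamp regardless of the order pages arrive in.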
+def json_to_slack_format(json_message: dict[str, str]) -> str:
+ """
+ Converts a JSON dictionary to Slack markup format.
+
+ Args:
+ json_message (dict[str, str]): The JSON dictionary to convert.
+
+ Returns:
+ str: A string formatted with Slack markup.
+ """
+ slack_message = ""
+ for key, value in json_message.items():
+ slack_message += f"*{key}*\n{value}\n\n"
+ return slack_message.strip()
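The conversion above renders each key as a bold mrkdwn heading over its value. A standalone sketch of the same behavior (the function is reproduced here for illustration only):

```python
def json_to_slack_format(json_message: dict[str, str]) -> str:
    # Each key becomes a bold (mrkdwn) heading followed by its value,
    # separated by blank lines; the trailing separator is stripped.
    slack_message = ""
    for key, value in json_message.items():
        slack_message += f"*{key}*\n{value}\n\n"
    return slack_message.strip()


print(json_to_slack_format({"Summary": "Possible keychain dump", "Host": "host1"}))
```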
+
+
+def create_genai_message_metadata_blocks(
+ title: str, blocks: list[Block], message: str | dict[str, str]
+) -> list[Block]:
+ """
+ Appends a GenAI section to any existing metadata blocks.
+
+ Args:
+ title (str): The title displayed above the GenAI section.
+ blocks (list[Block]): The list of existing metadata blocks.
+ message (str | dict[str, str]): The GenAI message, either as a string or a dictionary.
+
+ Returns:
+ list[Block]: The updated list of metadata blocks with the GenAI section appended.
+ """
+ if isinstance(message, dict):
+ message = json_to_slack_format(message)
+
+ # Truncate the text if it exceeds Block Kit's maximum length of 3000 characters
+ text = f"đĒ *{title}*\n\n{message}"
+ text = f"{text[:2997]}..." if len(text) > 3000 else text
+ blocks.append(
+ Section(text=text),
+ )
+ blocks.append(Divider())
+ return Message(blocks=blocks).build()["blocks"]
-def message_filter(message):
- """Some messages are not useful, we filter them here."""
- if not message["text"]: # sometimes for file upload there is no text only files
- return
+def is_member_in_channel(client: WebClient, conversation_id: str, user_id: str) -> bool:
+ """
+ Check if a user is a member of a specific Slack channel.
- if message["type"] != "message":
- return
+ Args:
+ client (WebClient): A Slack WebClient object used to interact with the Slack API.
+ conversation_id (str): The ID of the Slack channel/conversation to check.
+ user_id (str): The ID of the user to check for membership.
- if message.get("subtype"):
- return
+ Returns:
+ bool: True if the user is a member of the channel, False otherwise.
- if message.get("bot_id"):
- return
+ Raises:
+ SlackApiError: If there's an error from the Slack API (e.g., channel not found).
+ """
+ try:
+ response = make_call(
+ client,
+ SlackAPIGetEndpoints.conversations_members,
+ channel=conversation_id,
+ )
+
+ # Check if the user_id is in the list of members
+ return user_id in response.get("members", [])
+
+ except SlackApiError as e:
+ if e.response["error"] == SlackAPIErrorCode.CHANNEL_NOT_FOUND:
+ log.warning(
+ f"Channel {conversation_id} not found when checking membership for user {user_id}"
+ )
+ return False
+ elif e.response["error"] == SlackAPIErrorCode.USER_NOT_IN_CHANNEL:
+ # The bot itself is not in the channel, so it can't check membership
+ log.warning(
+ f"Bot not in channel {conversation_id}, cannot check membership for user {user_id}"
+ )
+ return False
+ else:
+ log.exception(
+ f"Error checking channel membership for user {user_id} in channel {conversation_id}: {e}"
+ )
+ raise
- return message
+def canvas_set_access(
+ client: WebClient, conversation_id: str, canvas_id: str, user_ids: list[str] | None = None
+) -> bool:
+ """
+ Locks the canvas to read-only for the channel while still allowing the Dispatch bot to edit it.
-def is_user(slack_user: str):
- """Returns true if it's a regular user, false if dispatch bot'."""
- return slack_user != SLACK_APP_USER_SLUG
+ Args:
+ client (WebClient): A Slack WebClient object used to interact with the Slack API.
+ conversation_id (str): The ID of the Slack conversation where the canvas will be created.
+ canvas_id (str): The ID of the canvas to update.
+ user_ids (list[str]): The IDs of the users to allow to edit the canvas.
+
+ Returns:
+ bool: True if the canvas was successfully updated, False otherwise.
+ """
+ if user_ids is None:
+ user_ids = []
-def open_dialog_with_user(client: Any, trigger_id: str, dialog: dict):
- """Opens a dialog with a user."""
- return make_call(client, "dialog.open", trigger_id=trigger_id, dialog=dialog)
+ try:
+ make_call(
+ client,
+ SlackAPIPostEndpoints.canvas_access_set,
+ access_level="read",
+ canvas_id=canvas_id,
+ channel_ids=[conversation_id],
+ )
+ if user_ids:
+ make_call(
+ client,
+ SlackAPIPostEndpoints.canvas_access_set,
+ access_level="write",
+ canvas_id=canvas_id,
+ user_ids=user_ids,
+ )
+
+ return True
+ except SlackApiError as e:
+ log.exception(f"Error setting canvas access for canvas {canvas_id}: {e}")
+ return False
+
+
+def create_canvas(
+ client: WebClient,
+ conversation_id: str,
+ title: str,
+ user_ids: list[str] | None = None,
+ content: str | None = None,
+) -> str | None:
+ """
+ Creates a new Slack canvas in the specified conversation.
+
+ Args:
+ client (WebClient): A Slack WebClient object used to interact with the Slack API.
+ conversation_id (str): The ID of the Slack conversation where the canvas will be created.
+ title (str): The title of the canvas.
+ user_ids (list[str]): The IDs of the user(s) who will have write access to the canvas.
+ content (str, optional): The markdown content of the canvas. Defaults to None.
+
+ Returns:
+ str | None: The ID of the created canvas, or None if creation failed.
+ """
+ if user_ids is None:
+ user_ids = []
+
+ try:
+ kwargs = {
+ "channel_id": conversation_id,
+ "title": title,
+ }
+ if content is not None:
+ kwargs["document_content"] = {"type": "markdown", "markdown": content}
+
+ response = make_call(
+ client,
+ SlackAPIPostEndpoints.canvas_create,
+ **kwargs,
+ )
+ canvas_id = response.get("canvas_id")
+ canvas_set_access(
+ client=client, conversation_id=conversation_id, canvas_id=canvas_id, user_ids=user_ids
+ )
+ return canvas_id
+ except SlackApiError as e:
+ log.exception(f"Error creating canvas in conversation {conversation_id}: {e}")
+ return None
+
+
+def update_canvas(
+ client: WebClient,
+ canvas_id: str,
+ content: str,
+) -> bool:
+ """
+ Updates an existing Slack canvas.
+
+ Args:
+ client (WebClient): A Slack WebClient object used to interact with the Slack API.
+ canvas_id (str): The ID of the canvas to update.
+ content (str): The new markdown content for the canvas.
+
+ Returns:
+ bool: True if the canvas was successfully updated, False otherwise.
+ """
+ try:
+ changes = [
+ {
+ "operation": "replace",
+ "document_content": {"type": "markdown", "markdown": content},
+ }
+ ]
+
+ make_call(
+ client,
+ SlackAPIPostEndpoints.canvas_update,
+ canvas_id=canvas_id,
+ changes=changes,
+ )
+ return True
+ except SlackApiError as e:
+ log.exception(f"Error updating canvas {canvas_id}: {e}")
+ return False
+
+
+def delete_canvas(client: WebClient, canvas_id: str) -> bool:
+ """
+ Deletes a Slack canvas.
+
+ Args:
+ client (WebClient): A Slack WebClient object used to interact with the Slack API.
+ canvas_id (str): The ID of the canvas to delete.
+
+ Returns:
+ bool: True if the canvas was successfully deleted, False otherwise.
+ """
+ try:
+ make_call(
+ client,
+ SlackAPIPostEndpoints.canvas_delete,
+ canvas_id=canvas_id,
+ )
+ return True
+ except SlackApiError as e:
+ log.exception(f"Error deleting canvas {canvas_id}: {e}")
+ return False
diff --git a/src/dispatch/plugins/dispatch_slack/views.py b/src/dispatch/plugins/dispatch_slack/views.py
deleted file mode 100644
index e09071a8b128..000000000000
--- a/src/dispatch/plugins/dispatch_slack/views.py
+++ /dev/null
@@ -1,772 +0,0 @@
-import hashlib
-import hmac
-import json
-import logging
-import platform
-import sys
-from time import time
-from typing import List
-
-import arrow
-from cachetools import TTLCache
-from fastapi import APIRouter, BackgroundTasks, Depends, Header, HTTPException
-from pydantic import BaseModel
-from sqlalchemy.orm import Session
-from starlette.requests import Request
-from starlette.responses import Response
-
-from dispatch.config import INCIDENT_PLUGIN_CONTACT_SLUG
-from dispatch.conversation.enums import ConversationButtonActions
-from dispatch.conversation.service import get_by_channel_id
-from dispatch.database import get_db, SessionLocal
-from dispatch.decorators import background_task
-from dispatch.enums import Visibility
-from dispatch.incident import flows as incident_flows
-from dispatch.incident import service as incident_service
-from dispatch.incident.models import IncidentUpdate, IncidentRead, IncidentStatus
-from dispatch.incident_priority import service as incident_priority_service
-from dispatch.incident_priority.models import IncidentPriorityType
-from dispatch.incident_type import service as incident_type_service
-from dispatch.participant import service as participant_service
-from dispatch.participant_role import service as participant_role_service
-from dispatch.participant_role.models import ParticipantRoleType
-from dispatch.plugins.base import plugins
-from dispatch.plugins.dispatch_slack import service as dispatch_slack_service
-from dispatch.service import service as service_service
-from dispatch.status_report import flows as status_report_flows
-from dispatch.status_report import service as status_report_service
-from dispatch.task import service as task_service
-from dispatch.task.models import TaskStatus
-
-from . import __version__
-from .config import (
- SLACK_COMMAND_ASSIGN_ROLE_SLUG,
- SLACK_COMMAND_UPDATE_INCIDENT_SLUG,
- SLACK_COMMAND_ENGAGE_ONCALL_SLUG,
- SLACK_COMMAND_LIST_PARTICIPANTS_SLUG,
- SLACK_COMMAND_LIST_RESOURCES_SLUG,
- SLACK_COMMAND_LIST_TASKS_SLUG,
- SLACK_COMMAND_MARK_ACTIVE_SLUG,
- SLACK_COMMAND_MARK_CLOSED_SLUG,
- SLACK_COMMAND_MARK_STABLE_SLUG,
- SLACK_COMMAND_STATUS_REPORT_SLUG,
- SLACK_SIGNING_SECRET,
-)
-from .messaging import (
- INCIDENT_CONVERSATION_COMMAND_MESSAGE,
- render_non_incident_conversation_command_error_message,
-)
-
-from .service import get_user_email
-
-once_a_day_cache = TTLCache(maxsize=1000, ttl=60 * 60 * 24)
-
-
-router = APIRouter()
-slack_client = dispatch_slack_service.create_slack_client()
-log = logging.getLogger(__name__)
-
-
-class SlackEventAppException(Exception):
- pass
-
-
-class EventBody(BaseModel):
- """Body of the Slack event."""
-
- channel: str = None
- channel_type: str = None
- channel_id: str = None
- file_id: str = None
- deleted_ts: float = None
- event_ts: float = None
- hidden: bool = None
- inviter: str = None
- team: str = None
- text: str = None
- type: str
- subtype: str = None
- user: str = None
- user_id: str = None
-
-
-class EventEnvelope(BaseModel):
- """Envelope of the Slack event."""
-
- challenge: str = None
- token: str = None
- team_id: str = None
- enterprise_id: str = None
- api_app_id: str = None
- event: EventBody = None
- type: str
- event_id: str = None
- event_time: int = None
- authed_users: List[str] = []
-
-
-@background_task
-def add_message_to_timeline(event: EventEnvelope, incident_id: int, db_session=None):
- """Uses reaction to add messages to timelines."""
- pass
-
-
-@background_task
-def remove_message_from_timeline(event: EventEnvelope, incident_id: int, db_session=None):
- """Uses reaction to remove messages from timeline."""
- pass
-
-
-@background_task
-def add_evidence_to_storage(event: EventEnvelope, incident_id: int, db_session=None):
- """Adds evidence (e.g. files) added/shared in the conversation to storage."""
- pass
-
-
-def is_business_hours(commander_tz: str):
- """Determines if it's currently office hours where the incident commander is located."""
- now = arrow.utcnow().to(commander_tz)
- return now.weekday() not in [5, 6] and now.hour < 9 and now.hour > 16
-
-
-def create_cache_key(user_id: str, channel_id: str):
- """Uses information in the evenvelope to construct a caching key."""
- return f"{channel_id}-{user_id}"
-
-
-@background_task
-def after_hours(user_email: str, incident_id: int, db_session=None):
- """Notifies the user that this incident is current in after hours mode."""
- incident = incident_service.get(db_session=db_session, incident_id=incident_id)
-
- user_id = dispatch_slack_service.resolve_user(slack_client, user_email)["id"]
-
- # NOTE Limitations: Does not sync across instances. Does not survive webserver restart
- cache_key = create_cache_key(user_id, incident.conversation.channel_id)
- try:
- once_a_day_cache[cache_key]
- return
- except Exception:
- pass # we don't care if there is nothing here
-
- # bail early if we don't care for a given severity
- priority_types = [
- IncidentPriorityType.info,
- IncidentPriorityType.low,
- IncidentPriorityType.medium,
- ]
- if incident.incident_priority.name.lower() not in priority_types:
- return
-
- # get their timezone from slack
- commander_info = dispatch_slack_service.get_user_info_by_email(
- slack_client, email=incident.commander.email
- )
-
- commander_tz = commander_info["tz"]
-
- if not is_business_hours(commander_tz):
- # send ephermal message
- blocks = [
- {
- "type": "section",
- "text": {
- "type": "mrkdwn",
- "text": (
- (
- f"Responses may be delayed. The current incident severity is *{incident.incident_severity.name}*"
- f" and your message was sent outside of the incident commander's working hours (Weekdays, 9am-5pm, {commander_tz})."
- )
- ),
- },
- }
- ]
- dispatch_slack_service.send_ephemeral_message(
- slack_client, incident.conversation.channel_id, user_id, "", blocks=blocks
- )
- once_a_day_cache[cache_key] = True
-
-
-@background_task
-def list_tasks(incident_id: int, command: dict = None, db_session=None):
- """Returns the list of incident tasks to the user as an ephemeral message."""
- blocks = []
- for status in TaskStatus:
- blocks.append(
- {
- "type": "section",
- "text": {"type": "mrkdwn", "text": f"*{status.value} Incident Tasks*"},
- }
- )
-
- tasks = task_service.get_all_by_incident_id_and_status(
- db_session=db_session, incident_id=incident_id, status=status.value
- )
-
- for task in tasks:
- blocks.append(
- {
- "type": "section",
- "text": {
- "type": "mrkdwn",
- "text": (
- f"*Description:* <{task.weblink}|{task.description}>\n"
- f"*Assignees:* {task.assignees}"
- ),
- },
- }
- )
- blocks.append({"type": "divider"})
-
- dispatch_slack_service.send_ephemeral_message(
- slack_client,
- command["channel_id"],
- command["user_id"],
- "Incident List Tasks",
- blocks=blocks,
- )
-
-
-@background_task
-def list_participants(incident_id: int, command: dict = None, db_session=None):
- """Returns the list of incident participants to the user as an ephemeral message."""
- blocks = []
- blocks.append(
- {"type": "section", "text": {"type": "mrkdwn", "text": f"*Incident Participants*"}}
- )
-
- participants = participant_service.get_all_by_incident_id(
- db_session=db_session, incident_id=incident_id
- )
-
- contact_plugin = plugins.get(INCIDENT_PLUGIN_CONTACT_SLUG)
-
- for participant in participants:
- if participant.is_active:
- participant_email = participant.individual.email
- participant_info = contact_plugin.get(participant_email)
- participant_name = participant_info["fullname"]
- participant_team = participant_info["team"]
- participant_department = participant_info["department"]
- participant_location = participant_info["location"]
- participant_weblink = participant_info["weblink"]
- participant_avatar_url = dispatch_slack_service.get_user_avatar_url(
- slack_client, participant_email
- )
- participant_active_roles = participant_role_service.get_all_active_roles(
- db_session=db_session, participant_id=participant.id
- )
- participant_roles = []
- for role in participant_active_roles:
- participant_roles.append(role.role)
-
- blocks.append(
- {
- "type": "section",
- "text": {
- "type": "mrkdwn",
- "text": (
- f"*Name:* <{participant_weblink}|{participant_name}>\n"
- f"*Team*: {participant_team}, {participant_department}\n"
- f"*Location*: {participant_location}\n"
- f"*Incident Role(s)*: {(', ').join(participant_roles)}\n"
- ),
- },
- "accessory": {
- "type": "image",
- "image_url": participant_avatar_url,
- "alt_text": participant_name,
- },
- }
- )
- blocks.append({"type": "divider"})
-
- dispatch_slack_service.send_ephemeral_message(
- slack_client,
- command["channel_id"],
- command["user_id"],
- "Incident List Participants",
- blocks=blocks,
- )
-
-
-def create_assign_role_dialog(incident_id: int, command: dict = None):
- """Creates a dialog for assigning a role."""
- role_options = []
- for role in ParticipantRoleType:
- if role != ParticipantRoleType.participant:
- role_options.append({"label": role.value, "value": role.value})
-
- dialog = {
- "callback_id": command["command"],
- "title": "Assign Role",
- "submit_label": "Assign",
- "elements": [
- {
- "label": "Participant",
- "type": "select",
- "name": "participant",
- "data_source": "users",
- },
- {"label": "Role", "type": "select", "name": "role", "options": role_options},
- ],
- }
-
- dispatch_slack_service.open_dialog_with_user(slack_client, command["trigger_id"], dialog)
-
-
-@background_task
-def create_update_incident_dialog(incident_id: int, command: dict = None, db_session=None):
- """Creates a dialog for updating incident information."""
- incident = incident_service.get(db_session=db_session, incident_id=incident_id)
-
- type_options = []
- for t in incident_type_service.get_all(db_session=db_session):
- type_options.append({"label": t.name, "value": t.name})
-
- priority_options = []
- for priority in incident_priority_service.get_all(db_session=db_session):
- priority_options.append({"label": priority.name, "value": priority.name})
-
- status_options = []
- for status in IncidentStatus:
- status_options.append({"label": status.value, "value": status.value})
-
- visibility_options = []
- for visibility in Visibility:
- visibility_options.append({"label": visibility.value, "value": visibility.value})
-
- notify_options = [{"label": "Yes", "value": "Yes"}, {"label": "No", "value": "No"}]
-
- dialog = {
- "callback_id": command["command"],
- "title": "Update Incident",
- "submit_label": "Save",
- "elements": [
- {"type": "textarea", "label": "Title", "name": "title", "value": incident.title},
- {
- "type": "textarea",
- "label": "Description",
- "name": "description",
- "value": incident.description,
- },
- {
- "label": "Type",
- "type": "select",
- "name": "type",
- "value": incident.incident_type.name,
- "options": type_options,
- },
- {
- "label": "Priority",
- "type": "select",
- "name": "priority",
- "value": incident.incident_priority.name,
- "options": priority_options,
- },
- {
- "label": "Status",
- "type": "select",
- "name": "status",
- "value": incident.status,
- "options": status_options,
- },
- {
- "label": "Visibility",
- "type": "select",
- "name": "visibility",
- "value": incident.visibility,
- "options": visibility_options,
- },
- {
- "label": "Notify on change",
- "type": "select",
- "name": "notify",
- "value": "Yes",
- "options": notify_options,
- },
- ],
- }
-
- dispatch_slack_service.open_dialog_with_user(slack_client, command["trigger_id"], dialog)
-
-
-@background_task
-def create_engage_oncall_dialog(incident_id: int, command: dict = None, db_session=None):
- """Creates a dialog to engage an oncall person."""
- oncall_services = service_service.get_all_by_status(db_session=db_session, is_active=True)
-
- if not oncall_services.count():
- blocks = [
- {
- "type": "section",
- "text": {
- "type": "mrkdwn",
- "text": f"No oncall services have been defined. You can define them in the Dispatch UI at /services",
- },
- }
- ]
- dispatch_slack_service.send_ephemeral_message(
- slack_client,
- command["channel_id"],
- command["user_id"],
- "No Oncall Services Defined",
- blocks=blocks,
- )
- return
-
- oncall_service_options = []
- for oncall_service in oncall_services:
- oncall_service_options.append(
- {"label": oncall_service.name, "value": oncall_service.external_id}
- )
-
- page_options = [{"label": "Yes", "value": "Yes"}, {"label": "No", "value": "No"}]
-
- dialog = {
- "callback_id": command["command"],
- "title": "Engage Oncall",
- "submit_label": "Engage",
- "elements": [
- {
- "label": "Oncall Service",
- "type": "select",
- "name": "oncall_service_id",
- "options": oncall_service_options,
- },
- {
- "label": "Page",
- "type": "select",
- "name": "page",
- "value": "No",
- "options": page_options,
- },
- ],
- }
-
- dispatch_slack_service.open_dialog_with_user(slack_client, command["trigger_id"], dialog)
-
-
-@background_task
-def create_status_report_dialog(incident_id: int, command: dict = None, db_session=None):
- """Fetches the last status report and creates a dialog."""
- # we load the most recent status report
- status_report = status_report_service.get_most_recent_by_incident_id(
- db_session=db_session, incident_id=incident_id
- )
-
- conditions = actions = needs = ""
- if status_report:
- conditions = status_report.conditions
- actions = status_report.actions
- needs = status_report.needs
-
- dialog = {
- "callback_id": command["command"],
- "title": "Status Report",
- "submit_label": "Submit",
- "elements": [
- {"type": "textarea", "label": "Conditions", "name": "conditions", "value": conditions},
- {"type": "textarea", "label": "Actions", "name": "actions", "value": actions},
- {"type": "textarea", "label": "Needs", "name": "needs", "value": needs},
- ],
- }
-
- dispatch_slack_service.open_dialog_with_user(slack_client, command["trigger_id"], dialog)
-
-
-@background_task
-def add_user_to_conversation(
- user_id: str, user_email: str, incident_id: int, action: dict, db_session=None
-):
- """Adds a user to a conversation."""
- incident = incident_service.get(db_session=db_session, incident_id=incident_id)
-
- if incident.status == IncidentStatus.closed:
- message = f"Sorry, we cannot add you to a closed incident. Please reach out to the incident commander ({incident.commander.name}) for details."
- dispatch_slack_service.send_ephemeral_message(
- slack_client, action["container"]["channel_id"], user_id, message
- )
- else:
- dispatch_slack_service.add_users_to_conversation(
- slack_client, incident.conversation.channel_id, [user_id]
- )
-
-
-def event_functions(event: EventEnvelope):
- """Interprets the events and routes it the appropriate function."""
- event_mappings = {
- "file_created": [add_evidence_to_storage],
- "file_shared": [add_evidence_to_storage],
- "link_shared": [],
- "member_joined_channel": [incident_flows.incident_add_or_reactivate_participant_flow],
- "message": [after_hours],
- "member_left_channel": [incident_flows.incident_remove_participant_flow],
- "message.groups": [],
- "message.im": [],
- "reaction_added": [add_message_to_timeline],
- "reaction_removed": [remove_message_from_timeline],
- }
-
- return event_mappings.get(event.event.type, [])
-
-
-def command_functions(command: str):
- """Interprets the command and routes it the appropriate function."""
- command_mappings = {
- SLACK_COMMAND_ASSIGN_ROLE_SLUG: [create_assign_role_dialog],
- SLACK_COMMAND_UPDATE_INCIDENT_SLUG: [create_update_incident_dialog],
- SLACK_COMMAND_LIST_PARTICIPANTS_SLUG: [list_participants],
- SLACK_COMMAND_LIST_RESOURCES_SLUG: [incident_flows.incident_list_resources_flow],
- SLACK_COMMAND_LIST_TASKS_SLUG: [list_tasks],
- SLACK_COMMAND_MARK_ACTIVE_SLUG: [],
- SLACK_COMMAND_MARK_CLOSED_SLUG: [],
- SLACK_COMMAND_MARK_STABLE_SLUG: [],
- SLACK_COMMAND_STATUS_REPORT_SLUG: [create_status_report_dialog],
- SLACK_COMMAND_ENGAGE_ONCALL_SLUG: [create_engage_oncall_dialog],
- }
-
- return command_mappings.get(command, [])
-
-
-@background_task
-def handle_update_incident_action(user_id, user_email, incident_id, action, db_session=None):
- """Messages slack dialog data into something that Dispatch can use."""
- submission = action["submission"]
- notify = True if submission["notify"] == "Yes" else False
- incident_in = IncidentUpdate(
- title=submission["title"],
- description=submission["description"],
- incident_type={"name": submission["type"]},
- incident_priority={"name": submission["priority"]},
- status=submission["status"],
- visibility=submission["visibility"],
- )
-
- incident = incident_service.get(db_session=db_session, incident_id=incident_id)
- existing_incident = IncidentRead.from_orm(incident)
- incident_service.update(db_session=db_session, incident=incident, incident_in=incident_in)
- incident_flows.incident_update_flow(user_email, incident_id, existing_incident, notify)
-
-
-@background_task
-def handle_assign_role_action(user_id, user_email, incident_id, action, db_session=None):
- """Messages slack dialog data into some thing that Dispatch can use."""
- assignee_user_id = action["submission"]["participant"]
- assignee_role = action["submission"]["role"]
- assignee_email = get_user_email(client=slack_client, user_id=assignee_user_id)
- incident_flows.incident_assign_role_flow(user_email, incident_id, assignee_email, assignee_role)
-
-
-def action_functions(action: str):
- """Interprets the action and routes it the appropriate function."""
- action_mappings = {
- SLACK_COMMAND_STATUS_REPORT_SLUG: [status_report_flows.new_status_report_flow],
- SLACK_COMMAND_ASSIGN_ROLE_SLUG: [handle_assign_role_action],
- SLACK_COMMAND_UPDATE_INCIDENT_SLUG: [handle_update_incident_action],
- SLACK_COMMAND_ENGAGE_ONCALL_SLUG: [incident_flows.incident_engage_oncall_flow],
- ConversationButtonActions.invite_user: [add_user_to_conversation],
- }
-
- return action_mappings.get(action, [])
-
-
-def get_action_name_by_action_type(action: dict):
- """Returns the action name based on the type."""
- action_name = ""
- if action["type"] == "dialog_submission":
- action_name = action["callback_id"]
-
- if action["type"] == "block_actions":
- action_name = action["actions"][0]["block_id"]
-
- return action_name
-
-
-def get_incident_id_by_action_type(action: dict, db_session: SessionLocal):
- """Returns the incident id based on the action type."""
- incident_id = -1
- if action["type"] == "dialog_submission":
- channel_id = action["channel"]["id"]
- conversation = get_by_channel_id(db_session=db_session, channel_id=channel_id)
- incident_id = conversation.incident_id
-
- if action["type"] == "block_actions":
- incident_id = action["actions"][0]["value"]
-
- return incident_id
-
-
-def create_ua_string():
- client_name = __name__.split(".")[0]
- client_version = __version__ # Version is returned from _version.py
-
- # Collect the package info, Python version and OS version.
- package_info = {
- "client": "{0}/{1}".format(client_name, client_version),
- "python": "Python/{v.major}.{v.minor}.{v.micro}".format(v=sys.version_info),
- "system": "{0}/{1}".format(platform.system(), platform.release()),
- }
-
- # Concatenate and format the user-agent string to be passed into request headers
- ua_string = []
- for _, val in package_info.items():
- ua_string.append(val)
-
- return " ".join(ua_string)
-
-
-def verify_signature(request_data, timestamp: int, signature: str):
- """Verifies the request signature using the app's signing secret."""
- req = f"v0:{timestamp}:{request_data}".encode("utf-8")
- slack_signing_secret = bytes(str(SLACK_SIGNING_SECRET), "utf-8")
- h = hmac.new(slack_signing_secret, req, hashlib.sha256).hexdigest()
- if not hmac.compare_digest(f"v0={h}", signature):
- raise HTTPException(status_code=403, detail="Invalid request signature")
-
-
-def verify_timestamp(timestamp: int):
- """Verifies that the timestamp does not differ from local time by more than five minutes."""
- if abs(time() - timestamp) > 60 * 5:
- raise HTTPException(status_code=403, detail="Invalid request timestamp")
-
-
-@router.post("/slack/event")
-async def handle_event(
- event: EventEnvelope,
- request: Request,
- response: Response,
- background_tasks: BackgroundTasks,
- x_slack_request_timestamp: int = Header(None),
- x_slack_signature: str = Header(None),
- db_session: Session = Depends(get_db),
-):
- """Handle all incomming Slack events."""
- raw_request_body = bytes.decode(await request.body())
-
- # We verify the timestamp
- verify_timestamp(x_slack_request_timestamp)
-
- # We verify the signature
- verify_signature(raw_request_body, x_slack_request_timestamp, x_slack_signature)
-
- # Echo the URL verification challenge code back to Slack
- if event.challenge:
- return {"challenge": event.challenge}
-
- event_body = event.event
-
- if (
- event_body.type == "message" and event_body.subtype
- ): # We ignore messages that have a subtype
- # Parse the Event payload and emit the event to the event listener
- response.headers["X-Slack-Powered-By"] = create_ua_string()
- return {"ok"}
-
- user_id = event_body.user
-
- # Fetch conversation by channel id
- channel_id = ( # We ensure channel_id always has a value
- event_body.channel_id if event_body.channel_id else event_body.channel
- )
- conversation = get_by_channel_id(db_session=db_session, channel_id=channel_id)
-
- if conversation and dispatch_slack_service.is_user(user_id):
- # We create an async Slack client
- slack_async_client = dispatch_slack_service.create_slack_client(run_async=True)
-
- # We resolve the user's email
- user_email = await dispatch_slack_service.get_user_email_async(slack_async_client, user_id)
-
- # Dispatch event functions to be executed in the background
- for f in event_functions(event):
- background_tasks.add_task(f, user_email, conversation.incident_id)
-
- # We add the user-agent string to the response headers
- response.headers["X-Slack-Powered-By"] = create_ua_string()
- return {"ok"}
-
-
-@router.post("/slack/command")
-async def handle_command(
- request: Request,
- response: Response,
- background_tasks: BackgroundTasks,
- x_slack_request_timestamp: int = Header(None),
- x_slack_signature: str = Header(None),
- db_session: Session = Depends(get_db),
-):
- """Handle all incomming Slack commands."""
- raw_request_body = bytes.decode(await request.body())
- request_body_form = await request.form()
- command = request_body_form._dict
-
- # We verify the timestamp
- verify_timestamp(x_slack_request_timestamp)
-
- # We verify the signature
- verify_signature(raw_request_body, x_slack_request_timestamp, x_slack_signature)
-
- # We add the user-agent string to the response headers
- response.headers["X-Slack-Powered-By"] = create_ua_string()
-
- # Fetch conversation by channel id
- channel_id = command.get("channel_id")
- conversation = get_by_channel_id(db_session=db_session, channel_id=channel_id)
-
- # Dispatch command functions to be executed in the background
- if conversation:
- for f in command_functions(command.get("command")):
- background_tasks.add_task(f, conversation.incident_id, command=command)
-
- return INCIDENT_CONVERSATION_COMMAND_MESSAGE.get(
- command.get("command"), f"Unable to find message. Command: {command.get('command')}"
- )
- else:
- return render_non_incident_conversation_command_error_message(command.get("command"))
-
-
-@router.post("/slack/action")
-async def handle_action(
- request: Request,
- response: Response,
- background_tasks: BackgroundTasks,
- x_slack_request_timestamp: int = Header(None),
- x_slack_signature: str = Header(None),
- db_session: Session = Depends(get_db),
-):
- """Handle all incomming Slack actions."""
-
- raw_request_body = bytes.decode(await request.body())
- request_body_form = await request.form()
- action = json.loads(request_body_form.get("payload"))
-
- # We verify the timestamp
- verify_timestamp(x_slack_request_timestamp)
-
- # We verify the signature
- verify_signature(raw_request_body, x_slack_request_timestamp, x_slack_signature)
-
- # We create an async Slack client
- slack_async_client = dispatch_slack_service.create_slack_client(run_async=True)
-
- # We resolve the user's email
- user_id = action["user"]["id"]
- user_email = await dispatch_slack_service.get_user_email_async(slack_async_client, user_id)
-
- # We resolve the action name based on the type
- action_name = get_action_name_by_action_type(action)
-
- # we resolve the incident id based on the action type
- incident_id = get_incident_id_by_action_type(action, db_session)
-
- # Dispatch action functions to be executed in the background
- for f in action_functions(action_name):
- background_tasks.add_task(f, user_id, user_email, incident_id, action)
-
- # We add the user-agent string to the response headers
- response.headers["X-Slack-Powered-By"] = create_ua_string()
-
- # When there are no exceptions within the dialog submission, your app must respond with 200 OK with an empty body.
- # This will complete the dialog. (https://api.slack.com/dialogs#validation)
- return {}
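The `verify_timestamp`/`verify_signature` helpers removed above implement Slack's v0 request-signing scheme. The digest computation (`h`) is not shown in this hunk, so the sketch below follows Slack's documented basestring construction; the helper names are hypothetical, not part of Dispatch:

```python
import hashlib
import hmac
import time


def compute_slack_signature(signing_secret: str, timestamp: int, body: str) -> str:
    """HMAC-SHA256 over 'v0:<timestamp>:<body>', hex-encoded with the 'v0=' prefix."""
    basestring = f"v0:{timestamp}:{body}".encode("utf-8")
    digest = hmac.new(signing_secret.encode("utf-8"), basestring, hashlib.sha256).hexdigest()
    return f"v0={digest}"


def verify_slack_request(signing_secret: str, timestamp: int, body: str, signature: str) -> bool:
    """Rejects requests older than five minutes, then compares digests in constant time."""
    if abs(time.time() - timestamp) > 60 * 5:
        return False
    expected = compute_slack_signature(signing_secret, timestamp, body)
    return hmac.compare_digest(expected, signature)
```

The constant-time `hmac.compare_digest` matters here: a plain `==` comparison can leak how many leading bytes matched via timing.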
diff --git a/src/dispatch/plugins/dispatch_slack/workflow.py b/src/dispatch/plugins/dispatch_slack/workflow.py
new file mode 100644
index 000000000000..cd13f9a54f0b
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_slack/workflow.py
@@ -0,0 +1,314 @@
+from blockkit import Context, Input, MarkdownText, Modal, PlainTextInput, Section
+from slack_bolt import Ack, BoltContext
+from slack_sdk.web import WebClient
+from sqlalchemy.orm import Session
+
+from dispatch.auth.models import DispatchUser
+from dispatch.config import DISPATCH_UI_URL
+from dispatch.database.core import SessionLocal
+from dispatch.enums import DispatchEnum
+from dispatch.incident import service as incident_service
+from dispatch.messaging.strings import INCIDENT_WORKFLOW_CREATED_NOTIFICATION
+from dispatch.participant import service as participant_service
+from dispatch.plugins.dispatch_slack.bolt import app
+from dispatch.plugins.dispatch_slack.fields import static_select_block
+from dispatch.plugins.dispatch_slack.middleware import (
+ action_context_middleware,
+ command_context_middleware,
+ configuration_middleware,
+ db_middleware,
+ modal_submit_middleware,
+ user_middleware,
+)
+from dispatch.workflow import service as workflow_service
+from dispatch.workflow.flows import send_workflow_notification
+from dispatch.workflow.models import WorkflowInstanceCreate
+
+
+class RunWorkflowBlockIds(DispatchEnum):
+ workflow_select = "run-workflow-select"
+ reason_input = "run-workflow-reason-input"
+ param_input = "run-workflow-param-input"
+
+
+class RunWorkflowActionIds(DispatchEnum):
+ workflow_select = "run-workflow-workflow-select"
+ reason_input = "run-workflow-reason-input"
+ param_input = "run-workflow-param-input"
+
+
+class RunWorkflowActions(DispatchEnum):
+ submit = "run-workflow-submit"
+ workflow_select = "run-workflow-workflow-select"
+
+
+def configure(config):
+ """Maps commands/events to their functions."""
+ middleware = [
+ command_context_middleware,
+ db_middleware,
+ configuration_middleware,
+ ]
+ app.command(config.slack_command_list_workflows, middleware=middleware)(
+ handle_workflow_list_command
+ )
+ app.command(config.slack_command_run_workflow, middleware=middleware)(
+ handle_workflow_run_command
+ )
+
+
+def workflow_select(
+ db_session: SessionLocal,
+ action_id: str = RunWorkflowActionIds.workflow_select,
+ block_id: str = RunWorkflowBlockIds.workflow_select,
+ initial_option: dict = None,
+ label: str = "Workflow",
+ **kwargs,
+):
+ workflows = workflow_service.get_enabled(db_session=db_session)
+
+ return static_select_block(
+ action_id=action_id,
+ block_id=block_id,
+ initial_option=initial_option,
+ label=label,
+ options=[{"text": w.name, "value": w.id} for w in workflows],
+ placeholder="Select Workflow",
+ **kwargs,
+ )
+
+
+def reason_input(
+ action_id: str = RunWorkflowActionIds.reason_input,
+ block_id: str = RunWorkflowBlockIds.reason_input,
+ initial_value: str = None,
+ label: str = "Reason",
+ **kwargs,
+):
+ return Input(
+ block_id=block_id,
+ element=PlainTextInput(
+ action_id=action_id,
+ initial_value=initial_value,
+ multiline=True,
+ placeholder="Short description of why the workflow was run.",
+ ),
+ label=label,
+ **kwargs,
+ )
+
+
+def param_input(
+ action_id: str = RunWorkflowActionIds.param_input,
+ block_id: str = RunWorkflowBlockIds.param_input,
+ initial_options: list = None,
+ label: str = "Workflow Parameters",
+ **kwargs,
+):
+ inputs = []
+ for p in initial_options or []:  # guard against the None default
+ inputs.append(
+ Input(
+ block_id=f"{block_id}-{p['key']}",
+ element=PlainTextInput(
+ placeholder="Parameter Value",
+ action_id=f"{action_id}-{p['key']}",
+ initial_value=p["value"],
+ ),
+ label=p["key"],
+ **kwargs,
+ )
+ )
+ return inputs
+
+
+def handle_workflow_list_command(
+ ack: Ack, body: dict, client: WebClient, context: BoltContext, db_session: Session
+) -> None:
+ """Handles the workflow list command."""
+ ack()
+ incident = incident_service.get(db_session=db_session, incident_id=context["subject"].id)
+ workflows = incident.workflow_instances
+
+ blocks = [Section(text="*Workflows*")]
+
+ if not workflows:
+ blocks.append(Section(text="No workflows running."))
+
+ for w in workflows:
+ artifact_links = ""
+ for a in w.artifacts:
+ artifact_links += f"- <{a.weblink}|{a.name}> \n"
+
+ name_text = (
+ f"*Name:* \n <{w.weblink}|{w.workflow.name}> \n"
+ if w.weblink
+ else f"*Name:* \n {w.workflow.name} \n"
+ )
+ blocks.append(
+ Section(
+ fields=[
+ name_text
+ + f"*Workflow Description:* \n {w.workflow.description} \n"
+ f"*Run Reason:* \n {w.run_reason} \n"
+ f"*Creator:* \n {w.creator.individual.name} \n"
+ f"*Status:* \n {w.status} \n"
+ f"*Artifacts:* \n {artifact_links} \n"
+ ]
+ )
+ )
+
+ modal = Modal(
+ title="Workflows List",
+ blocks=blocks,
+ close="Close",
+ ).build()
+
+ client.views_open(trigger_id=body["trigger_id"], view=modal)
+
+
+def handle_workflow_run_command(
+ ack: Ack,
+ body: dict,
+ client: WebClient,
+ context: BoltContext,
+ db_session: Session,
+) -> None:
+ """Handles the workflow run command."""
+ ack()
+
+ blocks = [
+ Context(elements=[MarkdownText(text="Select a workflow to run.")]),
+ workflow_select(
+ db_session=db_session,
+ dispatch_action=True,
+ ),
+ ]
+
+ modal = Modal(
+ title="Run Workflow",
+ blocks=blocks,
+ submit="Run",
+ close="Close",
+ callback_id=RunWorkflowActions.submit,
+ private_metadata=context["subject"].json(),
+ ).build()
+ client.views_open(trigger_id=body["trigger_id"], view=modal)
+
+
+@app.view(
+ RunWorkflowActions.submit,
+ middleware=[action_context_middleware, db_middleware, user_middleware, modal_submit_middleware],
+)
+def handle_workflow_submission_event(
+ ack: Ack,
+ context: BoltContext,
+ db_session: Session,
+ form_data: dict,
+ user: DispatchUser,
+) -> None:
+ """Handles workflow submission event."""
+ ack()
+
+ incident = incident_service.get(db_session=db_session, incident_id=context["subject"].id)
+ workflow_id = form_data.get(RunWorkflowBlockIds.workflow_select)["value"]
+ workflow = workflow_service.get(db_session=db_session, workflow_id=workflow_id)
+
+ params = {}
+ named_params = []
+ for i in form_data.keys():
+ if i.startswith(RunWorkflowBlockIds.param_input):
+ key = i.split(RunWorkflowBlockIds.param_input + "-")[1]
+ value = form_data[i]
+ params.update({key: value})
+ named_params.append({"key": key, "value": value})
+
+ creator = participant_service.get_by_incident_id_and_email(
+ db_session=db_session, incident_id=incident.id, email=user.email
+ )
+
+ instance = workflow_service.create_instance(
+ db_session=db_session,
+ workflow=workflow,
+ instance_in=WorkflowInstanceCreate(
+ incident=incident,
+ creator=creator,
+ run_reason=form_data[RunWorkflowBlockIds.reason_input],
+ parameters=named_params,
+ ),
+ )
+
+ for p in instance.parameters:
+ if p["value"]:
+ params.update({p["key"]: p["value"]})
+
+ params.update(
+ {
+ "externalRef": f"{DISPATCH_UI_URL}/{instance.incident.project.organization.name}/incidents/{instance.incident.name}?project={instance.incident.project.name}",
+ "workflowInstanceId": instance.id,
+ "incident_name": instance.incident.name,
+ "incident_title": instance.incident.title,
+ "incident_severity": instance.incident.incident_severity.name,
+ "incident_status": instance.incident.status,
+ }
+ )
+
+ workflow.plugin_instance.instance.run(workflow.resource_id, params)
+
+ # TODO we should move off these types of notification functions and create them directly
+ send_workflow_notification(
+ incident.project.id,
+ incident.conversation.channel_id,
+ INCIDENT_WORKFLOW_CREATED_NOTIFICATION,
+ db_session,
+ instance_creator_name=instance.creator.individual.name,
+ workflow_name=instance.workflow.name,
+ workflow_description=instance.workflow.description,
+ )
+
+
+@app.action(
+ RunWorkflowActions.workflow_select, middleware=[action_context_middleware, db_middleware]
+)
+def handle_run_workflow_select_action(
+ ack: Ack,
+ body: dict,
+ db_session: Session,
+ context: BoltContext,
+ client: WebClient,
+) -> None:
+ """Handles workflow select event."""
+ ack()
+ values = body["view"]["state"]["values"]
+ workflow_id = values[RunWorkflowBlockIds.workflow_select][RunWorkflowActionIds.workflow_select][
+ "selected_option"
+ ]["value"]
+
+ selected_workflow = workflow_service.get(db_session=db_session, workflow_id=workflow_id)
+
+ blocks = [
+ Context(elements=[MarkdownText(text="Select a workflow to run.")]),
+ workflow_select(
+ initial_option={"text": selected_workflow.name, "value": selected_workflow.id},
+ db_session=db_session,
+ dispatch_action=True,
+ ),
+ reason_input(),
+ ]
+
+ blocks.extend(
+ param_input(initial_options=selected_workflow.parameters),
+ )
+
+ modal = Modal(
+ title="Run Workflow",
+ blocks=blocks,
+ submit="Run",
+ close="Close",
+ callback_id=RunWorkflowActions.submit,
+ private_metadata=context["subject"].json(),
+ ).build()
+
+ client.views_update(
+ view_id=body["view"]["id"],
+ hash=body["view"]["hash"],
+ trigger_id=body["trigger_id"],
+ view=modal,
+ )
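The `handle_workflow_submission_event` handler above recovers workflow parameters by matching form-data block ids against the `run-workflow-param-input` prefix. That extraction step can be sketched standalone; the `extract_workflow_params` helper is a hypothetical name for illustration:

```python
PARAM_PREFIX = "run-workflow-param-input"  # mirrors RunWorkflowBlockIds.param_input


def extract_workflow_params(form_data: dict) -> tuple[dict, list]:
    """Splits '<prefix>-<key>' block ids into a params dict and named key/value pairs."""
    params = {}
    named_params = []
    for block_id, value in form_data.items():
        if block_id.startswith(PARAM_PREFIX + "-"):
            # Everything after the prefix and separator is the parameter key.
            key = block_id.split(PARAM_PREFIX + "-", 1)[1]
            params[key] = value
            named_params.append({"key": key, "value": value})
    return params, named_params
```

Non-parameter fields (such as the reason input) fall through untouched, which is why the prefix check is done per block id rather than assuming the form only holds parameters.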
diff --git a/src/dispatch/plugins/dispatch_test/conference.py b/src/dispatch/plugins/dispatch_test/conference.py
index 5eac4f300673..a40d41ce6c75 100644
--- a/src/dispatch/plugins/dispatch_test/conference.py
+++ b/src/dispatch/plugins/dispatch_test/conference.py
@@ -2,7 +2,7 @@
class TestConferencePlugin(ConferencePlugin):
- title = "TestConference"
+ title = "Dispatch Test Plugin - Conference"
slug = "test-conference"
def create(self, items, **kwargs):
diff --git a/src/dispatch/plugins/dispatch_test/contact.py b/src/dispatch/plugins/dispatch_test/contact.py
index e1914d315e53..b0ddeb95172d 100644
--- a/src/dispatch/plugins/dispatch_test/contact.py
+++ b/src/dispatch/plugins/dispatch_test/contact.py
@@ -2,7 +2,7 @@
class TestContactPlugin(ContactPlugin):
- title = "Test Contact"
+ title = "Dispatch Test Plugin - Contact"
slug = "test-contact"
def get(self, key, **kwargs):
diff --git a/src/dispatch/plugins/dispatch_test/conversation.py b/src/dispatch/plugins/dispatch_test/conversation.py
index 3cf90b92df3c..f3be3214d00f 100644
--- a/src/dispatch/plugins/dispatch_test/conversation.py
+++ b/src/dispatch/plugins/dispatch_test/conversation.py
@@ -1,9 +1,24 @@
+from datetime import datetime
+from typing import Any
+
+from slack_sdk import WebClient
+
from dispatch.plugins.bases import ConversationPlugin
+from dispatch.plugins.dispatch_slack.events import ChannelActivityEvent
+
+
+class TestWebClient(WebClient):
+ def api_call(self, *args, **kwargs):
+ return {"ok": True, "messages": [], "has_more": False}
class TestConversationPlugin(ConversationPlugin):
- title = "Test Conversation"
+ id = 123
+ title = "Dispatch Test Plugin - Conversation"
slug = "test-conversation"
+ configuration = {"api_bot_token": "123"}
+ type = "conversation"
+ plugin_events = [ChannelActivityEvent]
def create(self, items, **kwargs):
return
@@ -13,3 +28,13 @@ def add(self, items, **kwargs):
def send(self, items, **kwargs):
return
+
+ def fetch_events(self, subject: Any, **kwargs):
+ client = TestWebClient()
+ for plugin_event in self.plugin_events:
+ plugin_event().fetch_activity(client=client, subject=subject)
+ return [
+ (datetime.utcfromtimestamp(1512085950.000216), "0XDECAFBAD"),
+ (datetime.utcfromtimestamp(1512104434.000490), "0XDECAFBAD"),
+ (datetime.utcfromtimestamp(1512104534.000490), "0X8BADF00D"),
+ ]
diff --git a/src/dispatch/plugins/dispatch_test/definition.py b/src/dispatch/plugins/dispatch_test/definition.py
index b73aba66a5b6..a1696e213bd3 100644
--- a/src/dispatch/plugins/dispatch_test/definition.py
+++ b/src/dispatch/plugins/dispatch_test/definition.py
@@ -2,7 +2,7 @@
class TestDefinitionPlugin(DefinitionPlugin):
- title = "Test Definition"
+ title = "Dispatch Test Plugin - Definition"
slug = "test-definition"
def get(self, key, **kwargs):
diff --git a/src/dispatch/plugins/dispatch_test/document.py b/src/dispatch/plugins/dispatch_test/document.py
index e3528c1e8887..ae67bcc1ff33 100644
--- a/src/dispatch/plugins/dispatch_test/document.py
+++ b/src/dispatch/plugins/dispatch_test/document.py
@@ -1,8 +1,62 @@
from dispatch.plugins.bases import DocumentPlugin
+from googleapiclient.discovery import Resource
+from googleapiclient.http import HttpRequestMock, BatchHttpRequest
+from httplib2 import Response
+
+_DOCUMENT_SUCCESS = {
+ "body": {
+ "content": [
+ {
+ "startIndex": 0,
+ "endIndex": 40,
+ "paragraph": {
+ "elements": [
+ {
+ "endIndex": 40,
+ "startIndex": 0,
+ "textRun": {
+ "content": "Incident Conversation Commands Reference",
+ "textStyle": {
+ "link": {"url": "https://www.netflix.com/login"},
+ },
+ },
+ }
+ ],
+ },
+ },
+ ]
+ }
+}
+
+_DOCUMENT_FAIL = {"body": {"content": []}}
+
+
+class TestResource(Resource):
+ """A mock Google API Client"""
+
+ def __init__(self, has_link: bool = True):
+ self.has_link = has_link
+
+ def batchUpdate(self, **kwargs) -> BatchHttpRequest:
+ """Performs one or more update API requests."""
+ return BatchHttpRequest(
+ callback=None,
+ batch_uri=None,
+ )
+
+ def get(self, documentId: str) -> HttpRequestMock:
+ """Returns a mock HTTP request to fetch a document."""
+ document = _DOCUMENT_SUCCESS if self.has_link else _DOCUMENT_FAIL
+
+ return HttpRequestMock(
+ resp=Response({"status": 200}),
+ content="",
+ postproc=lambda _x, _y: document,
+ )
class TestDocumentPlugin(DocumentPlugin):
- title = "Test Document"
+ title = "Dispatch Test Plugin - Document"
slug = "test-document"
def get(self, key, **kwargs):
diff --git a/src/dispatch/plugins/dispatch_test/document_resolver.py b/src/dispatch/plugins/dispatch_test/document_resolver.py
index e8a6314b3a85..6540f4e8ff24 100644
--- a/src/dispatch/plugins/dispatch_test/document_resolver.py
+++ b/src/dispatch/plugins/dispatch_test/document_resolver.py
@@ -2,7 +2,7 @@
class TestDocumentResolverPlugin(DocumentResolverPlugin):
- title = "Test Document Resovler"
+ title = "Dispatch Test Plugin - Document Resolver"
slug = "test-document-resolver"
def get(self, items, **kwargs):
diff --git a/src/dispatch/plugins/dispatch_test/oncall.py b/src/dispatch/plugins/dispatch_test/oncall.py
index 99f4a00d965f..2bbd02454d05 100644
--- a/src/dispatch/plugins/dispatch_test/oncall.py
+++ b/src/dispatch/plugins/dispatch_test/oncall.py
@@ -4,16 +4,24 @@
:copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
:license: Apache, see LICENSE for more details.
"""
+
from dispatch.plugins.bases import OncallPlugin
class TestOncallPlugin(OncallPlugin):
- title = "Test Oncall"
+ title = "Dispatch Test Plugin - Oncall"
slug = "test-oncall"
description = "Oncall plugin for testing purposes"
- def get(self, **kwargs):
+ def get(self, service_id: str, **kwargs):
return "johnsmith@example.com"
- def page(self, **kwargs):
+ def page(
+ self,
+ service_id: str,
+ incident_name: str,
+ incident_title: str,
+ incident_description: str,
+ **kwargs,
+ ):
return
diff --git a/src/dispatch/plugins/dispatch_test/participant.py b/src/dispatch/plugins/dispatch_test/participant.py
index d98b733cabda..25f10eda1483 100644
--- a/src/dispatch/plugins/dispatch_test/participant.py
+++ b/src/dispatch/plugins/dispatch_test/participant.py
@@ -2,7 +2,7 @@
class TestParticipantPlugin(ParticipantPlugin):
- title = "Test Participant"
+ title = "Dispatch Test Plugin - Participant"
slug = "test-participant"
def get(self, items, **kwargs):
diff --git a/src/dispatch/plugins/dispatch_test/participant_group.py b/src/dispatch/plugins/dispatch_test/participant_group.py
index 24246119c684..7084a93fe5b5 100644
--- a/src/dispatch/plugins/dispatch_test/participant_group.py
+++ b/src/dispatch/plugins/dispatch_test/participant_group.py
@@ -2,7 +2,7 @@
class TestParticipantGroupPlugin(ParticipantGroupPlugin):
- title = "Test Participant Group"
+ title = "Dispatch Test Plugin - Participant Group"
slug = "test-participant-group"
def create(self, participants, **kwargs):
diff --git a/src/dispatch/plugins/dispatch_test/storage.py b/src/dispatch/plugins/dispatch_test/storage.py
index 0119d32ab939..670c28a77e6b 100644
--- a/src/dispatch/plugins/dispatch_test/storage.py
+++ b/src/dispatch/plugins/dispatch_test/storage.py
@@ -2,7 +2,7 @@
class TestStoragePlugin(StoragePlugin):
- title = "Test Storage"
+ title = "Dispatch Test Plugin - Storage"
slug = "test-storage"
def get(self, **kwargs):
diff --git a/src/dispatch/plugins/dispatch_test/task.py b/src/dispatch/plugins/dispatch_test/task.py
index de140e5f99c0..3b86783ea302 100644
--- a/src/dispatch/plugins/dispatch_test/task.py
+++ b/src/dispatch/plugins/dispatch_test/task.py
@@ -2,7 +2,7 @@
class TestTaskPlugin(TaskPlugin):
- title = "Test Task"
+ title = "Dispatch Test Plugin - Task"
slug = "test-task"
def get(self, **kwargs):
diff --git a/src/dispatch/plugins/dispatch_test/term.py b/src/dispatch/plugins/dispatch_test/term.py
index 9cbeaded5f01..0bd53a8a2d9f 100644
--- a/src/dispatch/plugins/dispatch_test/term.py
+++ b/src/dispatch/plugins/dispatch_test/term.py
@@ -2,7 +2,7 @@
class TestTermPlugin(TermPlugin):
- title = "Test Term"
+ title = "Dispatch Test Plugin - Term"
slug = "test-term"
def get(self, **kwargs):
diff --git a/src/dispatch/plugins/dispatch_test/ticket.py b/src/dispatch/plugins/dispatch_test/ticket.py
index bf2d3a0e0f0e..afc7d4b92bb6 100644
--- a/src/dispatch/plugins/dispatch_test/ticket.py
+++ b/src/dispatch/plugins/dispatch_test/ticket.py
@@ -2,7 +2,7 @@
class TestTicketPlugin(TicketPlugin):
- title = "Test Ticket"
+ title = "Dispatch Test Plugin - Ticket"
slug = "test-ticket"
def create(self, ticket_id, **kwargs):
diff --git a/src/dispatch/plugins/dispatch_test/workflow.py b/src/dispatch/plugins/dispatch_test/workflow.py
new file mode 100644
index 000000000000..1d0303ddd8ea
--- /dev/null
+++ b/src/dispatch/plugins/dispatch_test/workflow.py
@@ -0,0 +1,12 @@
+from dispatch.plugins.bases import WorkflowPlugin
+
+
+class TestWorkflowPlugin(WorkflowPlugin):
+ title = "Dispatch Test Plugin - Workflow"
+ slug = "test-workflow"
+
+ def get_instance(self, workflow_id: str, instance_id: str, **kwargs):
+ return
+
+ def run(self, workflow_id: str, params: dict, **kwargs):
+ return
diff --git a/src/dispatch/plugins/dispatch_zoom/client.py b/src/dispatch/plugins/dispatch_zoom/client.py
index 311483dc1ef1..503213269a91 100644
--- a/src/dispatch/plugins/dispatch_zoom/client.py
+++ b/src/dispatch/plugins/dispatch_zoom/client.py
@@ -8,8 +8,9 @@
API_BASE_URI = "https://api.zoom.us/v2"
-class ZoomClient():
+class ZoomClient:
"""Simple HTTP Client for Zoom Calls."""
+
def __init__(self, api_key, api_secret):
self.api_key = api_key
self.api_secret = api_secret
@@ -25,27 +26,27 @@ def _get_headers(self):
def get(self, path, params=None):
return requests.get(
- '{}/{}'.format(API_BASE_URI, path),
+ "{}/{}".format(API_BASE_URI, path),
params=params,
headers=self.headers,
- timeout=self.timeout
+ timeout=self.timeout,
)
def post(self, path, data):
return requests.post(
- '{}/{}'.format(API_BASE_URI, path),
+ "{}/{}".format(API_BASE_URI, path),
data=json.dumps(data),
headers=self.headers,
- timeout=self.timeout
+ timeout=self.timeout,
)
def delete(self, path, data=None, params=None):
return requests.delete(
- '{}/{}'.format(API_BASE_URI, path),
+ "{}/{}".format(API_BASE_URI, path),
data=json.dumps(data),
params=params,
headers=self.headers,
- timeout=self.timeout
+ timeout=self.timeout,
)
diff --git a/src/dispatch/plugins/dispatch_zoom/config.py b/src/dispatch/plugins/dispatch_zoom/config.py
index ce54bd52c183..f47acaa92afe 100644
--- a/src/dispatch/plugins/dispatch_zoom/config.py
+++ b/src/dispatch/plugins/dispatch_zoom/config.py
@@ -1,6 +1,16 @@
-from dispatch.config import config, Secret
+from pydantic import Field, SecretStr
+from dispatch.config import BaseConfigurationModel
-ZOOM_API_USER_ID = config("ZOOM_API_USER_ID")
-ZOOM_API_KEY = config("ZOOM_API_KEY")
-ZOOM_API_SECRET = config("ZOOM_API_SECRET", cast=Secret)
+
+class ZoomConfiguration(BaseConfigurationModel):
+ """Zoom configuration description."""
+
+ api_user_id: str = Field(title="Zoom API User Id")
+ api_key: str = Field(title="API Key")
+ api_secret: SecretStr = Field(title="API Secret")
+ default_duration_minutes: int = Field(
+ default=1440, # 1 day
+ title="Default Meeting Duration (Minutes)",
+ description="Default duration in minutes for conference meetings. Defaults to 1440 minutes (1 day).",
+ )
diff --git a/src/dispatch/plugins/dispatch_zoom/plugin.py b/src/dispatch/plugins/dispatch_zoom/plugin.py
index 5581e5bdde27..0cb24192f9cc 100644
--- a/src/dispatch/plugins/dispatch_zoom/plugin.py
+++ b/src/dispatch/plugins/dispatch_zoom/plugin.py
@@ -1,18 +1,19 @@
"""
.. module: dispatch.plugins.dispatch_zoom.plugin
:platform: Unix
- :copyright: (c) 2019 by HashiCorp Inc., see AUTHORS for more
+ :copyright: (c) 2019 by HashiCorp Inc., see AUTHORS for more
:license: Apache, see LICENSE for more details.
.. moduleauthor:: Will Bengtson
"""
+
import logging
import random
-from typing import List
from dispatch.decorators import apply, counter, timer
+from dispatch.plugins import dispatch_zoom as zoom_plugin
from dispatch.plugins.bases import ConferencePlugin
-from .config import ZOOM_API_USER_ID, ZOOM_API_KEY, ZOOM_API_SECRET
+from .config import ZoomConfiguration
from .client import ZoomClient
log = logging.getLogger(__name__)
@@ -32,6 +33,7 @@ def delete_meeting(client, event_id: int):
def create_meeting(
client,
+ user_id: str,
name: str,
description: str = None,
title: str = None,
@@ -43,45 +45,51 @@ def create_meeting(
"agenda": description if description else f"Situation Room for {name}. Please join.",
"duration": duration,
"password": gen_conference_challenge(8),
- "settings": {
- "join_before_host": True
- }
+ "settings": {"join_before_host": True},
}
- return client.post(
- "/users/{}/meetings".format(ZOOM_API_USER_ID),
- data=body
- )
+ return client.post("/users/{}/meetings".format(user_id), data=body)
@apply(timer, exclude=["__init__"])
@apply(counter, exclude=["__init__"])
class ZoomConferencePlugin(ConferencePlugin):
- title = "Zoom - Conference"
+ title = "Zoom Plugin - Conference Management"
slug = "zoom-conference"
description = "Uses Zoom to manage conference meetings."
+ version = zoom_plugin.__version__
+
+ author = "HashiCorp"
+ author_url = "https://github.com/netflix/dispatch.git"
+
+ def __init__(self):
+ self.configuration_schema = ZoomConfiguration
def create(
- self, name: str, description: str = None, title: str = None, participants: List[str] = []
+ self, name: str, description: str = None, title: str = None, participants: list[str] = None
):
"""Create a new event."""
- client = ZoomClient(ZOOM_API_KEY, str(ZOOM_API_SECRET))
+ client = ZoomClient(
+ self.configuration.api_key, self.configuration.api_secret.get_secret_value()
+ )
conference_response = create_meeting(
- client, name, description=description, title=title
+ client,
+ self.configuration.api_user_id,
+ name,
+ description=description,
+ title=title,
+ duration=self.configuration.default_duration_minutes,
)
conference_json = conference_response.json()
return {
"weblink": conference_json.get("join_url", "https://zoom.us"),
- "id": conference_json.get('id', '1'),
- "challenge": conference_json.get('password', '123')
+ "id": conference_json.get("id", "1"),
+ "challenge": conference_json.get("password", "123"),
}
def delete(self, event_id: str):
"""Deletes an existing event."""
- client = ZoomClient(ZOOM_API_KEY, str(ZOOM_API_SECRET))
+ client = ZoomClient(
+ self.configuration.api_key, self.configuration.api_secret.get_secret_value()
+ )
delete_meeting(client, event_id)
def add_participant(self, event_id: str, participant: str):
diff --git a/src/dispatch/plugins/generic_workflow/__init__.py b/src/dispatch/plugins/generic_workflow/__init__.py
new file mode 100644
index 000000000000..ad5cc752c07b
--- /dev/null
+++ b/src/dispatch/plugins/generic_workflow/__init__.py
@@ -0,0 +1 @@
+from ._version import __version__ # noqa
diff --git a/src/dispatch/plugins/generic_workflow/_version.py b/src/dispatch/plugins/generic_workflow/_version.py
new file mode 100644
index 000000000000..3dc1f76bc69e
--- /dev/null
+++ b/src/dispatch/plugins/generic_workflow/_version.py
@@ -0,0 +1 @@
+__version__ = "0.1.0"
diff --git a/src/dispatch/plugins/generic_workflow/plugin.py b/src/dispatch/plugins/generic_workflow/plugin.py
new file mode 100644
index 000000000000..d6627210af13
--- /dev/null
+++ b/src/dispatch/plugins/generic_workflow/plugin.py
@@ -0,0 +1,116 @@
+"""
+.. module: dispatch.plugins.generic_workflow.plugin
+ :platform: Unix
+ :copyright: (c) 2021 by Jørgen Teunis.
+ :license: MIT, see LICENSE for more details.
+ :description:
+
+ The REST API needs to respond with JSON according to the JSON schema mentioned here:
+ https://github.com/Netflix/dispatch/issues/1722#issuecomment-947863678
+
+ For example:
+ {
+ "status": "Completed", # String
+ "artifacts": [{
+ "evergreen": False,
+ "evergreen_owner": None,
+ "evergreen_reminder_interval": 90,
+ "evergreen_last_reminder_at": None,
+ "resource_type": None,
+ "resource_id": None,
+ "weblink": "https://www.example.com",
+ "description": "Description",
+ "name": "Logfile20211020",
+ "created_at": "2021-10-20 20:50:00",
+ "updated_at": "2021-10-20 20:50:00"
+ }],
+ "weblink": "https://www.twitter.com", # String
+ }
+
+"""
+
+import logging
+import requests
+import json
+
+from pydantic import Field, SecretStr, AnyHttpUrl
+from tenacity import TryAgain, retry, stop_after_attempt, wait_exponential
+
+from dispatch.config import BaseConfigurationModel
+from dispatch.decorators import apply, counter, timer
+from dispatch.plugins import generic_workflow as generic_workflow_plugin
+from dispatch.plugins.bases import WorkflowPlugin
+
+
+class GenericWorkflowConfiguration(BaseConfigurationModel):
+ """
+ Generic Workflow configuration
+
+ You can enter a REST API endpoint here that gets called when a workflow needs to either run or return its status.
+ Run results in a POST request with a JSON payload containing workflow_id and params.
+ Getting the status of the workflow is called as a GET request with the following GET query string parameters:
+ workflow_id, workflow_instance_id, incident_id and incident_name.
+ """
+
+ api_url: AnyHttpUrl = Field(
+ title="API URL", description="The API endpoint to GET or POST workflow info from/to."
+ )
+ auth_header: SecretStr = Field(
+ title="Authorization Header",
+ description="For example: Bearer: JWT token, or basic: Basic QWxhZGRpbjpvcGVuIHNlc2FtZQ==",
+ )
+
+
+@apply(counter, exclude=["__init__"])
+@apply(timer, exclude=["__init__"])
+class GenericWorkflowPlugin(WorkflowPlugin):
+ title = "Generic Workflow Plugin - Workflow Management"
+ slug = "generic-workflow"
+ description = "A generic workflow plugin that calls an API endpoint to kick-off a workflow and retrieve the status of a workflow."
+ version = generic_workflow_plugin.__version__
+
+ author = "Jørgen Teunis"
+ author_url = "https://github.com/jtorvald/"
+
+ def __init__(self):
+ WorkflowPlugin.__init__(self)
+ self.configuration_schema = GenericWorkflowConfiguration
+
+ @retry(stop=stop_after_attempt(3), wait=wait_exponential(multiplier=1, min=1, max=10))
+ def get_workflow_instance(
+ self,
+ workflow_id: str,
+ tags: list[str],
+ **kwargs,
+ ):
+ api_url = str(self.configuration.api_url)
+ headers = {
+ "Content-Type": "application/json",
+ "Authorization": self.configuration.auth_header.get_secret_value(),
+ }
+ fields = {
+ "workflow_id": workflow_id,
+ "tags": tags,
+ }
+ resp = requests.get(api_url, params=fields, headers=headers)
+
+ if resp.status_code in [429, 500, 502, 503, 504]:
+ raise TryAgain
+
+ return resp.json()
+
+ @retry(stop=stop_after_attempt(3), wait=wait_exponential(multiplier=1, min=1, max=10))
+ def run(self, workflow_id: str, params: dict, **kwargs):
+ logging.info("Run on generic workflow %s, %s", params, kwargs)
+ api_url = str(self.configuration.api_url)
+ obj = {"workflow_id": workflow_id, "params": params}
+ headers = {
+ "Content-Type": "application/json",
+ "Authorization": self.configuration.auth_header.get_secret_value(),
+ }
+ resp = requests.post(api_url, data=json.dumps(obj), headers=headers)
+
+ if resp.status_code in [429, 500, 502, 503, 504]:
+ raise TryAgain
+
+ return resp.json()
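The plugin retries via tenacity's `@retry` decorator, raising `TryAgain` whenever the endpoint answers with a transient status. The same retry-on-transient-status idea can be sketched without tenacity; the `call_with_retry` helper and `FakeResponse` stub are hypothetical, for illustration only:

```python
RETRYABLE_STATUS_CODES = {429, 500, 502, 503, 504}  # same set the plugin checks


class FakeResponse:
    """Stand-in for requests.Response, carrying only a status code."""

    def __init__(self, status_code: int):
        self.status_code = status_code


def call_with_retry(do_request, max_attempts: int = 3):
    """Re-invokes `do_request` while it returns a transient status, up to max_attempts."""
    response = do_request()
    for _ in range(max_attempts - 1):
        if response.status_code not in RETRYABLE_STATUS_CODES:
            break
        response = do_request()
    return response
```

Unlike this sketch, tenacity also adds exponential backoff between attempts (`wait_exponential` in the decorator above), which keeps a struggling endpoint from being hammered.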
diff --git a/src/dispatch/policy/dsl.py b/src/dispatch/policy/dsl.py
deleted file mode 100644
index baf8a554311c..000000000000
--- a/src/dispatch/policy/dsl.py
+++ /dev/null
@@ -1,131 +0,0 @@
-import operator
-
-from pyparsing import infixNotation, opAssoc, Word, alphanums, Literal
-
-from dispatch.exceptions import InvalidFilterPolicy
-
-
-def contains(a: str, b: list) -> bool:
- """Simple infix 'in' operator"""
- return a in b
-
-
-def build_parser():
- operators = (
- Literal("=")
- | Literal("==")
- | Literal("eq")
- | Literal("<")
- | Literal("lt")
- | Literal(">")
- | Literal("gt")
- | Literal("<=")
- | Literal("le")
- | Literal(">=")
- | Literal("ge")
- | Literal("!=")
- | Literal("ne")
- | Literal("in")
- | Literal("and")
- | Literal("or")
- )
- field = Word(alphanums)
- value = Word(alphanums)
- comparison = field + operators + value
- query = infixNotation(
- comparison,
- [
- ("and", 2, opAssoc.LEFT, NestedExpr),
- ("or", 2, opAssoc.LEFT, NestedExpr),
- ("in", 2, opAssoc.LEFT, NestedExpr),
- ],
- )
-
- comparison.addParseAction(ComparisonExpr)
- return query
-
-
-def operatorOperands(tokenlist: list) -> tuple:
- """Generator to extract operators and operands in pairs."""
- it = iter(tokenlist)
- while 1:
- try:
- yield (next(it), next(it))
- except StopIteration:
- break
-
-
-class FilterPolicy(object):
- binary_operators = {
- "=": operator.eq,
- "==": operator.eq,
- "eq": operator.eq,
- "<": operator.lt,
- "lt": operator.lt,
- ">": operator.gt,
- "gt": operator.gt,
- "<=": operator.le,
- "le": operator.le,
- ">=": operator.ge,
- "ge": operator.ge,
- "!=": operator.ne,
- "ne": operator.ne,
- "in": contains,
- }
-
- multiple_operators = {"or": any, "∨": any, "and": all, "∧": all}
-
- def __init__(self, tree):
- self._eval = self.build_evaluator(tree)
-
- def __call__(self, **kwargs):
- return self._eval(kwargs)
-
- def build_evaluator(self, tree):
- try:
- operator, nodes = list(tree.items())[0]
- except Exception as e:
- raise InvalidFilterPolicy(f"Unable to parse tree: {tree} reason: {e}")
- try:
- op = self.multiple_operators[operator]
- except KeyError:
- try:
- op = self.binary_operators[operator]
- except KeyError:
- raise InvalidFilterPolicy(f"Unknown operator: {operator}")
- assert len(nodes) == 2 # binary operators take 2 values
-
- def _op(values):
- return op(values[nodes[0]], nodes[1])
-
- return _op
- # Iterate over every item in the list of the value linked
- # to the logical operator, and compile it down to its own
- # evaluator.
- elements = [self.build_evaluator(node) for node in nodes]
- return lambda values: op((e(values) for e in elements))
-
-
-# ExampleFilter = FilterPolicy({"or": ({"eq": ("term", "bar")}, {"eq": ("term", "baz")})})
-
-
-class NestedExpr(object):
- def __init__(self, tokens):
- self.value = tokens[0]
-
- def eval(self):
- val1 = self.value[0].eval() # noqa
- for op, val in operatorOperands(self.value[1:]):
- print(op)
- print(val.eval())
- return True
-
-
-class ComparisonExpr:
- def __init__(self, tokens):
- self.tokens = tokens
-
- def __str__(self):
- return str({self.tokens[1]: (self.tokens[0], self.tokens[2])})
-
- __repr__ = __str__
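The deleted `FilterPolicy.build_evaluator` compiled a nested operator tree into a callable: logical operators fan out over sub-evaluators, binary operators compare a field against a literal. Since the DSL is being removed, a minimal self-contained version of that evaluation strategy for reference (operator subset chosen for brevity):

```python
import operator

BINARY = {"eq": operator.eq, "ne": operator.ne, "lt": operator.lt, "gt": operator.gt}
LOGICAL = {"and": all, "or": any}

def build_evaluator(tree):
    """Compile a {"op": (...)} node into a callable taking a values dict."""
    op_name, nodes = next(iter(tree.items()))
    if op_name in LOGICAL:
        combine = LOGICAL[op_name]
        # compile each child node down to its own evaluator
        evaluators = [build_evaluator(node) for node in nodes]
        return lambda values: combine(e(values) for e in evaluators)
    # binary operators take exactly (field, expected_value)
    field, expected = nodes
    compare = BINARY[op_name]
    return lambda values: compare(values[field], expected)

policy = build_evaluator({"or": ({"eq": ("term", "bar")}, {"eq": ("term", "baz")})})
```

This mirrors the `ExampleFilter` commented out in the removed module: the compiled policy is a plain predicate over keyword values.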
diff --git a/src/dispatch/policy/models.py b/src/dispatch/policy/models.py
deleted file mode 100644
index 36accf9f133b..000000000000
--- a/src/dispatch/policy/models.py
+++ /dev/null
@@ -1,42 +0,0 @@
-from typing import List, Optional
-
-from sqlalchemy import Column, Integer, String
-from sqlalchemy_utils import TSVectorType
-
-from dispatch.database import Base
-from dispatch.models import DispatchBase
-
-
-class Policy(Base):
- id = Column(Integer, primary_key=True)
- name = Column(String, unique=True)
- description = Column(String)
- expression = Column(String)
-
- search_vector = Column(
- TSVectorType("name", "description", weights={"name": "A", "description": "B"})
- )
-
-
-# Pydantic models...
-class PolicyBase(DispatchBase):
- expression: str
- name: Optional[str]
- description: Optional[str]
-
-
-class PolicyCreate(PolicyBase):
- pass
-
-
-class PolicyUpdate(PolicyBase):
- id: int
-
-
-class PolicyRead(PolicyBase):
- id: int
-
-
-class PolicyPagination(DispatchBase):
- items: List[PolicyRead]
- total: int
diff --git a/src/dispatch/policy/service.py b/src/dispatch/policy/service.py
deleted file mode 100644
index 9e62f1cd73f0..000000000000
--- a/src/dispatch/policy/service.py
+++ /dev/null
@@ -1,72 +0,0 @@
-from typing import List, Optional
-from fastapi.encoders import jsonable_encoder
-
-from .models import Policy, PolicyCreate, PolicyUpdate
-from .dsl import build_parser
-
-
-def get(*, db_session, policy_id: int) -> Optional[Policy]:
- return db_session.query(Policy).filter(Policy.id == policy_id).first()
-
-
-def get_by_text(*, db_session, text: str) -> Optional[Policy]:
- return db_session.query(Policy).filter(Policy.text == text).first()
-
-
-def get_all(*, db_session):
- return db_session.query(Policy)
-
-
-def create(*, db_session, policy_in: PolicyCreate) -> Policy:
- policy = Policy(**policy_in.dict())
- db_session.add(policy)
- db_session.commit()
- return policy
-
-
-def create_all(*, db_session, policys_in: List[PolicyCreate]) -> List[Policy]:
- policys = [Policy(text=d.text) for d in policys_in]
- db_session.bulk_save_insert(policys)
- db_session.commit()
- db_session.refresh()
- return policys
-
-
-def update(*, db_session, policy: Policy, policy_in: PolicyUpdate) -> Policy:
- policy_data = jsonable_encoder(policy)
- update_data = policy_in.dict(skip_defaults=True)
-
- for field in policy_data:
- if field in update_data:
- setattr(policy, field, update_data[field])
-
- db_session.add(policy)
- db_session.commit()
- return policy
-
-
-def create_or_update(*, db_session, policy_in: PolicyCreate) -> Policy:
- update_data = policy_in.dict(skip_defaults=True, exclude={"terms"})
-
- q = db_session.query(Policy)
- for attr, value in update_data.items():
- q = q.filter(getattr(Policy, attr) == value)
-
- instance = q.first()
-
- if instance:
- return update(db_session=db_session, policy=instance, policy_in=policy_in)
-
- return create(db_session=db_session, policy_in=policy_in)
-
-
-def parse(policy: str) -> dict:
- sample = "footbar eq 123 or bar eq blah"
- query = build_parser()
- return query.parseString(sample, parseAll=True)
-
-
-def delete(*, db_session, policy_id: int):
- policy = db_session.query(Policy).filter(Policy.id == policy_id).first()
- db_session.delete(policy)
- db_session.commit()
diff --git a/src/dispatch/policy/views.py b/src/dispatch/policy/views.py
deleted file mode 100644
index 96a7697739db..000000000000
--- a/src/dispatch/policy/views.py
+++ /dev/null
@@ -1,63 +0,0 @@
-from fastapi import APIRouter, Depends, HTTPException
-from sqlalchemy.orm import Session
-
-from dispatch.database import get_db, paginate
-from dispatch.search.service import search
-
-from .models import Policy, PolicyCreate, PolicyPagination, PolicyRead, PolicyUpdate
-from .service import create, delete, get, get_all, update
-
-router = APIRouter()
-
-
-@router.get("/", response_model=PolicyPagination)
-def get_policies(
- db_session: Session = Depends(get_db), page: int = 0, itemsPerPage: int = 5, q: str = None
-):
- """
- Retrieve policies.
- """
- if q:
- query = search(db_session=db_session, query_str=q, model=Policy)
- else:
- query = get_all(db_session=db_session)
-
- items, total = paginate(query=query, page=page, items_per_page=itemsPerPage)
-
- return {"items": items, "total": total}
-
-
-@router.post("/", response_model=PolicyRead)
-def create_policy(*, db_session: Session = Depends(get_db), policy_in: PolicyCreate):
- """
- Create a new policy.
- """
- # TODO check for similarity
- policy = create(db_session=db_session, policy_in=policy_in)
- return policy
-
-
-@router.put("/{policy_id}", response_model=PolicyRead)
-def update_policy(
- *, db_session: Session = Depends(get_db), policy_id: int, policy_in: PolicyUpdate
-):
- """
- Update a policy.
- """
- policy = get(db_session=db_session, policy_id=policy_id)
- if not policy:
- raise HTTPException(status_code=404, detail="A policy with this id does not exist.")
- policy = update(db_session=db_session, policy=policy, policy_in=policy_in)
- return policy
-
-
-@router.delete("/{policy_id}")
-def delete_individual(*, db_session: Session = Depends(get_db), policy_id: int):
- """
- Delete a policy.
- """
- policy = get(db_session=db_session, policy_id=policy_id)
- if not policy:
- raise HTTPException(status_code=404, detail="A policy with this id does not exist.")
-
- delete(db_session=db_session, policy_id=policy_id)
diff --git a/src/dispatch/project/__init__.py b/src/dispatch/project/__init__.py
new file mode 100644
index 000000000000..e69de29bb2d1
diff --git a/src/dispatch/project/flows.py b/src/dispatch/project/flows.py
new file mode 100644
index 000000000000..6ce1ec9a4ba5
--- /dev/null
+++ b/src/dispatch/project/flows.py
@@ -0,0 +1,156 @@
+from dispatch.case.priority import service as case_priority_service
+from dispatch.case.priority.config import default_case_priorities
+from dispatch.case.priority.models import CasePriorityCreate
+from dispatch.case.severity import service as case_severity_service
+from dispatch.case.severity.config import default_case_severities
+from dispatch.case.severity.models import CaseSeverityCreate
+from dispatch.case.type import service as case_type_service
+from dispatch.case.type.config import default_case_type
+from dispatch.case.type.models import CaseTypeCreate
+from dispatch.case.enums import CostModelType
+from dispatch.case_cost_type import service as case_cost_type_service
+from dispatch.decorators import background_task
+from dispatch.incident.priority import service as incident_priority_service
+from dispatch.incident.priority.config import default_incident_priorities
+from dispatch.incident.priority.models import IncidentPriorityCreate
+from dispatch.incident.severity import service as incident_severity_service
+from dispatch.incident.severity.config import default_incident_severities
+from dispatch.incident.severity.models import IncidentSeverityCreate
+from dispatch.incident.type import service as incident_type_service
+from dispatch.incident.type.config import default_incident_type
+from dispatch.incident.type.models import IncidentTypeCreate
+from dispatch.incident_cost_type import service as incident_cost_type_service
+from dispatch.incident_cost_type.config import default_incident_cost_type
+from dispatch.incident_cost_type.models import IncidentCostTypeCreate
+from dispatch.plugin import service as plugin_service
+from dispatch.plugin.models import PluginInstanceCreate
+
+from .service import get
+
+
+@background_task
+def project_init_flow(*, project_id: int, organization_slug: str, db_session=None):
+ """Initializes a new project with default settings."""
+ project = get(db_session=db_session, project_id=project_id)
+
+ # Add all plugins in disabled mode
+ plugins = plugin_service.get_all(db_session=db_session)
+ for plugin in plugins:
+ plugin_instance_in = PluginInstanceCreate(
+ project=project, plugin=plugin, configuration={}, enabled=False
+ )
+ plugin_service.create_instance(db_session=db_session, plugin_instance_in=plugin_instance_in)
+
+ # Create default incident type
+ incident_type_in = IncidentTypeCreate(
+ name=default_incident_type["name"],
+ description=default_incident_type["description"],
+ visibility=default_incident_type["visibility"],
+ exclude_from_metrics=default_incident_type["exclude_from_metrics"],
+ default=default_incident_type["default"],
+ enabled=default_incident_type["enabled"],
+ project=project,
+ )
+ incident_type = incident_type_service.create(
+ db_session=db_session, incident_type_in=incident_type_in
+ )
+
+ # Create default incident priorities
+ for priority in default_incident_priorities:
+ incident_priority_in = IncidentPriorityCreate(
+ name=priority["name"],
+ description=priority["description"],
+ page_commander=priority["page_commander"],
+ tactical_report_reminder=priority["tactical_report_reminder"],
+ executive_report_reminder=priority["executive_report_reminder"],
+ project=project,
+ default=priority["default"],
+ enabled=priority["enabled"],
+ view_order=priority["view_order"],
+ color=priority["color"],
+ )
+ incident_priority_service.create(
+ db_session=db_session, incident_priority_in=incident_priority_in
+ )
+
+ # Create default incident severities
+ for severity in default_incident_severities:
+ incident_severity_in = IncidentSeverityCreate(
+ name=severity["name"],
+ description=severity["description"],
+ project=project,
+ default=severity["default"],
+ enabled=severity["enabled"],
+ view_order=severity["view_order"],
+ color=severity["color"],
+ )
+ incident_severity_service.create(
+ db_session=db_session, incident_severity_in=incident_severity_in
+ )
+
+ # Create default incident response cost
+ incident_cost_type_in = IncidentCostTypeCreate(
+ name=default_incident_cost_type["name"],
+ description=default_incident_cost_type["description"],
+ category=default_incident_cost_type["category"],
+ details=default_incident_cost_type["details"],
+ default=default_incident_cost_type["default"],
+ editable=default_incident_cost_type["editable"],
+ project=project,
+ )
+ incident_cost_type_service.create(
+ db_session=db_session, incident_cost_type_in=incident_cost_type_in
+ )
+
+ # Create default case response cost
+ case_cost_type_service.get_or_create_response_cost_type(
+ db_session=db_session, project_id=project.id, model_type=CostModelType.classic
+ )
+ case_cost_type_service.get_or_create_response_cost_type(
+ db_session=db_session, project_id=project.id, model_type=CostModelType.new
+ )
+
+ # Create default case type
+ case_type_in = CaseTypeCreate(
+ name=default_case_type["name"],
+ description=default_case_type["description"],
+ visibility=default_case_type["visibility"],
+ exclude_from_metrics=default_case_type["exclude_from_metrics"],
+ default=default_case_type["default"],
+ enabled=default_case_type["enabled"],
+ project=project,
+ )
+ case_type = case_type_service.create(db_session=db_session, case_type_in=case_type_in)
+
+ # Map case type with incident type
+ case_type.incident_type = incident_type
+ db_session.add(case_type)
+ db_session.commit()
+
+ # Create default case priorities
+ for priority in default_case_priorities:
+ case_priority_in = CasePriorityCreate(
+ name=priority["name"],
+ description=priority["description"],
+ page_assignee=priority["page_assignee"],
+ project=project,
+ default=priority["default"],
+ enabled=priority["enabled"],
+ view_order=priority["view_order"],
+ color=priority["color"],
+ disable_delayed_message_warning=priority["disable_delayed_message_warning"],
+ )
+ case_priority_service.create(db_session=db_session, case_priority_in=case_priority_in)
+
+ # Create default case severities
+ for severity in default_case_severities:
+ case_severity_in = CaseSeverityCreate(
+ name=severity["name"],
+ description=severity["description"],
+ project=project,
+ default=severity["default"],
+ enabled=severity["enabled"],
+ view_order=severity["view_order"],
+ color=severity["color"],
+ )
+ case_severity_service.create(db_session=db_session, case_severity_in=case_severity_in)
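`project_init_flow` repeatedly fans a list of default config dicts out into typed create calls. The shape of that loop, with a dataclass standing in for the `*Create` pydantic models (stand-in names, not part of the patch):

```python
from dataclasses import dataclass

@dataclass
class SeveritySpec:
    # illustrative stand-in for the *Create models used above
    name: str
    view_order: int
    default: bool = False
    enabled: bool = True

DEFAULT_SEVERITIES = [
    {"name": "Low", "view_order": 1},
    {"name": "Critical", "view_order": 3, "default": True},
]

def init_defaults(store, defaults, spec_cls):
    """Fan a list of config dicts out into typed records, as project_init_flow does."""
    for entry in defaults:
        store.append(spec_cls(**entry))
    return store

created = init_defaults([], DEFAULT_SEVERITIES, SeveritySpec)
```

Keeping the defaults as plain dicts in config modules and validating them at creation time lets each project start from the same baseline while still going through the normal model validation path.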
diff --git a/src/dispatch/project/models.py b/src/dispatch/project/models.py
new file mode 100644
index 000000000000..4278858d2a46
--- /dev/null
+++ b/src/dispatch/project/models.py
@@ -0,0 +1,146 @@
+from pydantic import ConfigDict, EmailStr, Field
+from slugify import slugify
+from sqlalchemy import Boolean, Column, ForeignKey, Integer, String
+from sqlalchemy.ext.hybrid import hybrid_property
+from sqlalchemy.orm import relationship
+from sqlalchemy.sql import false
+from sqlalchemy_utils import TSVectorType
+
+from dispatch.database.core import Base
+from dispatch.incident.priority.models import (
+ IncidentPriority,
+ IncidentPriorityRead,
+)
+from dispatch.models import DispatchBase, NameStr, Pagination, PrimaryKey
+from dispatch.organization.models import Organization, OrganizationRead
+
+
+class Project(Base):
+ id = Column(Integer, primary_key=True)
+ name = Column(String)
+ display_name = Column(String, nullable=False, server_default="")
+
+ description = Column(String)
+ default = Column(Boolean, default=False)
+ color = Column(String)
+
+ annual_employee_cost = Column(Integer, default=50000)
+ business_year_hours = Column(Integer, default=2080)
+
+ owner_email = Column(String)
+ owner_conversation = Column(String)
+
+ organization_id = Column(Integer, ForeignKey(Organization.id))
+ organization = relationship("Organization")
+
+ dispatch_user_project = relationship(
+ "DispatchUserProject", cascade="all, delete-orphan", overlaps="users"
+ )
+
+ enabled = Column(Boolean, default=True, server_default="t")
+ allow_self_join = Column(Boolean, default=True, server_default="t")
+
+ send_daily_reports = Column(Boolean)
+ send_weekly_reports = Column(Boolean)
+
+ weekly_report_notification_id = Column(Integer, nullable=True)
+
+ select_commander_visibility = Column(Boolean, default=True, server_default="t")
+
+ stable_priority_id = Column(Integer, nullable=True)
+ stable_priority = relationship(
+ IncidentPriority,
+ foreign_keys=[stable_priority_id],
+ primaryjoin="IncidentPriority.id == Project.stable_priority_id",
+ )
+
+ # allows for alternative names for storage folders inside incident/case
+ storage_folder_one = Column(String, nullable=True)
+ storage_folder_two = Column(String, nullable=True)
+ # when true, storage_folder_one is used as the primary storage folder for incidents/cases
+ storage_use_folder_one_as_primary = Column(Boolean, default=False, nullable=True)
+ # when true, folder and incident docs will be created with the title of the incident
+ storage_use_title = Column(Boolean, default=False, server_default=false())
+
+ # allows customized instructions for reporting incidents
+ report_incident_instructions = Column(String, nullable=True)
+ report_incident_title_hint = Column(String, nullable=True)
+ report_incident_description_hint = Column(String, nullable=True)
+
+ # controls whether to suggest security events over incidents
+ suggest_security_event_over_incident = Column(Boolean, default=False, server_default="f")
+
+ snooze_extension_oncall_service_id = Column(Integer, nullable=True)
+ snooze_extension_oncall_service = relationship(
+ "Service",
+ foreign_keys=[snooze_extension_oncall_service_id],
+ primaryjoin="Service.id == Project.snooze_extension_oncall_service_id",
+ )
+
+ @hybrid_property
+ def slug(self):
+ return slugify(str(self.name))
+
+ search_vector = Column(
+ TSVectorType("name", "description", weights={"name": "A", "description": "B"})
+ )
+
+
+class Service(DispatchBase):
+ id: PrimaryKey
+ description: str | None = None
+ external_id: str
+ is_active: bool | None = None
+ name: NameStr
+ type: str | None = None
+
+
+class ProjectBase(DispatchBase):
+ id: PrimaryKey | None = None
+ name: NameStr
+ display_name: str | None = Field("")
+ owner_email: EmailStr | None = None
+ owner_conversation: str | None = None
+ annual_employee_cost: int | None = 50000
+ business_year_hours: int | None = 2080
+ description: str | None = None
+ default: bool = False
+ color: str | None = None
+ send_daily_reports: bool | None = Field(True)
+ send_weekly_reports: bool | None = Field(False)
+ weekly_report_notification_id: int | None = None
+ enabled: bool | None = Field(True)
+ storage_folder_one: str | None = None
+ storage_folder_two: str | None = None
+ storage_use_folder_one_as_primary: bool | None = Field(True)
+ storage_use_title: bool | None = Field(False)
+ allow_self_join: bool | None = Field(True)
+ select_commander_visibility: bool | None = Field(True)
+ report_incident_instructions: str | None = None
+ report_incident_title_hint: str | None = None
+ report_incident_description_hint: str | None = None
+ suggest_security_event_over_incident: bool | None = Field(True)
+ snooze_extension_oncall_service: Service | None = None
+
+
+class ProjectCreate(ProjectBase):
+ organization: OrganizationRead
+
+
+class ProjectUpdate(ProjectBase):
+ send_daily_reports: bool | None = Field(True)
+ send_weekly_reports: bool | None = Field(False)
+ weekly_report_notification_id: int | None = None
+ stable_priority_id: int | None = None
+ snooze_extension_oncall_service_id: int | None = None
+
+
+class ProjectRead(ProjectBase):
+ id: PrimaryKey | None = None
+ stable_priority: IncidentPriorityRead | None = None
+
+ model_config = ConfigDict(from_attributes=True)
+
+
+class ProjectPagination(Pagination):
+ items: list[ProjectRead] = []
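The `slug` hybrid property above delegates to `python-slugify`. A rough stdlib-only approximation of what that produces for simple ASCII names:

```python
import re

def simple_slug(name: str) -> str:
    """Lowercase the name and collapse runs of non-alphanumerics into hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", name.lower()).strip("-")

simple_slug("Security Incident Response")  # "security-incident-response"
```

The real library also transliterates Unicode, which this sketch does not attempt; the point is that the slug is derived, not stored, so renaming a project changes its slug.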
diff --git a/src/dispatch/project/service.py b/src/dispatch/project/service.py
new file mode 100644
index 000000000000..962325423c03
--- /dev/null
+++ b/src/dispatch/project/service.py
@@ -0,0 +1,121 @@
+from pydantic import ValidationError
+from sqlalchemy.orm import Session
+from sqlalchemy.sql.expression import true
+
+
+from .models import Project, ProjectCreate, ProjectRead, ProjectUpdate
+
+
+def get(*, db_session: Session, project_id: int) -> Project | None:
+ """Returns a project based on the given project id."""
+ return db_session.query(Project).filter(Project.id == project_id).first()
+
+
+def get_default(*, db_session: Session) -> Project | None:
+ """Returns the default project."""
+ return db_session.query(Project).filter(Project.default == true()).one_or_none()
+
+
+def get_default_or_raise(*, db_session: Session) -> Project:
+ """Returns the default project or raise a ValidationError if one doesn't exist."""
+ project = get_default(db_session=db_session)
+
+ if not project:
+ raise ValidationError(
+ [
+ {
+ "loc": ("project",),
+ "msg": "No default project defined.",
+ "type": "value_error",
+ }
+ ]
+ )
+ return project
+
+
+def get_by_name(*, db_session: Session, name: str) -> Project | None:
+ """Returns a project based on the given project name."""
+ return db_session.query(Project).filter(Project.name == name).one_or_none()
+
+
+def get_by_name_or_raise(*, db_session: Session, project_in: ProjectRead) -> Project:
+ """Returns the project specified or raises ValidationError."""
+ project = get_by_name(db_session=db_session, name=project_in.name)
+
+ if not project:
+ raise ValidationError(
+ [
+ {
+ "msg": "Project not found.",
+ "name": project_in.name,
+ "loc": "name",
+ }
+ ]
+ )
+
+ return project
+
+
+def get_by_name_or_default(*, db_session, project_in: ProjectRead) -> Project:
+ """Returns a project based on a name or the default if not specified."""
+ if project_in and project_in.name:
+ project = get_by_name(db_session=db_session, name=project_in.name)
+ if project:
+ return project
+ return get_default_or_raise(db_session=db_session)
+
+
+def get_all(*, db_session) -> list[Project | None]:
+ """Returns all projects."""
+ return db_session.query(Project)
+
+
+def create(*, db_session, project_in: ProjectCreate) -> Project:
+ """Creates a project."""
+ from dispatch.organization import service as organization_service
+
+ organization = organization_service.get_by_slug(
+ db_session=db_session, slug=project_in.organization.slug
+ )
+ project = Project(
+ **project_in.dict(exclude={"organization"}),
+ organization_id=organization.id,
+ )
+
+ db_session.add(project)
+ db_session.commit()
+ return project
+
+
+def get_or_create(*, db_session, project_in: ProjectCreate) -> Project:
+ if project_in.id:
+ q = db_session.query(Project).filter(Project.id == project_in.id)
+ else:
+ q = db_session.query(Project).filter_by(**project_in.dict(exclude={"id", "organization"}))
+
+ instance = q.first()
+ if instance:
+ return instance
+
+ return create(db_session=db_session, project_in=project_in)
+
+
+def update(*, db_session, project: Project, project_in: ProjectUpdate) -> Project:
+ """Updates a project."""
+ project_data = project.dict()
+
+ update_data = project_in.dict(exclude_unset=True, exclude={})
+
+ for field in project_data:
+ if field in update_data:
+ setattr(project, field, update_data[field])
+
+ db_session.commit()
+ return project
+
+
+def delete(*, db_session, project_id: int):
+ """Deletes a project."""
+ project = db_session.query(Project).filter(Project.id == project_id).first()
+ db_session.delete(project)
+ db_session.commit()
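`get_or_create` looks up by id when one is supplied and otherwise matches on every remaining field before falling back to `create`. The same logic over plain dicts, without SQLAlchemy (hypothetical helper, not part of the patch):

```python
def get_or_create(rows, item):
    """rows: list of dicts acting as a table; item: dict with an optional 'id'."""
    if item.get("id") is not None:
        matches = [r for r in rows if r["id"] == item["id"]]
    else:
        # no id given: match on every other provided field, like filter_by(**fields)
        probe = {k: v for k, v in item.items() if k != "id"}
        matches = [r for r in rows if all(r.get(k) == v for k, v in probe.items())]
    if matches:
        return matches[0]
    row = dict(item, id=item.get("id") or len(rows) + 1)
    rows.append(row)
    return row

rows = []
first = get_or_create(rows, {"name": "security"})
again = get_or_create(rows, {"name": "security"})
```

One caveat the real implementation shares: matching on *all* non-id fields means any changed field produces a new row rather than an update, which is why `update` exists as a separate path.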
diff --git a/src/dispatch/project/views.py b/src/dispatch/project/views.py
new file mode 100644
index 000000000000..beca03643dac
--- /dev/null
+++ b/src/dispatch/project/views.py
@@ -0,0 +1,124 @@
+from fastapi import APIRouter, BackgroundTasks, Depends, HTTPException, status
+from pydantic import ValidationError
+
+
+from dispatch.auth.permissions import (
+ PermissionsDependency,
+ ProjectCreatePermission,
+ ProjectUpdatePermission,
+)
+
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.models import OrganizationSlug, PrimaryKey
+
+from .flows import project_init_flow
+from .models import (
+ ProjectCreate,
+ ProjectRead,
+ ProjectUpdate,
+ ProjectPagination,
+)
+from .service import create, delete, get, get_by_name, update
+
+
+router = APIRouter()
+
+
+@router.get("", response_model=ProjectPagination)
+def get_projects(common: CommonParameters):
+ """Get all projects."""
+ return search_filter_sort_paginate(model="Project", **common)
+
+
+@router.post(
+ "",
+ response_model=ProjectRead,
+ summary="Create a new project.",
+ dependencies=[Depends(PermissionsDependency([ProjectCreatePermission]))],
+)
+def create_project(
+ db_session: DbSession,
+ organization: OrganizationSlug,
+ project_in: ProjectCreate,
+ background_tasks: BackgroundTasks,
+):
+ """Create a new project."""
+ project = get_by_name(db_session=db_session, name=project_in.name)
+ if project:
+ raise ValidationError(
+ [
+ {
+ "msg": "A project with this name already exists.",
+ "loc": "name",
+ }
+ ]
+ )
+ if project_in.id and get(db_session=db_session, project_id=project_in.id):
+ raise ValidationError(
+ [
+ {
+ "msg": "A project with this id already exists.",
+ "loc": "id",
+ }
+ ]
+ )
+
+ project = create(db_session=db_session, project_in=project_in)
+ background_tasks.add_task(
+ project_init_flow, project_id=project.id, organization_slug=organization
+ )
+ return project
+
+
+@router.get(
+ "/{project_id}",
+ response_model=ProjectRead,
+ summary="Get a project.",
+)
+def get_project(db_session: DbSession, project_id: PrimaryKey):
+ """Get a project."""
+ project = get(db_session=db_session, project_id=project_id)
+ if not project:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A project with this id does not exist."}],
+ )
+ return project
+
+
+@router.put(
+ "/{project_id}",
+ response_model=ProjectRead,
+ dependencies=[Depends(PermissionsDependency([ProjectUpdatePermission]))],
+)
+def update_project(
+ db_session: DbSession,
+ project_id: PrimaryKey,
+ project_in: ProjectUpdate,
+):
+ """Update a project."""
+ project = get(db_session=db_session, project_id=project_id)
+ if not project:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A project with this id does not exist."}],
+ )
+ project = update(db_session=db_session, project=project, project_in=project_in)
+ return project
+
+
+@router.delete(
+ "/{project_id}",
+ response_model=None,
+ dependencies=[Depends(PermissionsDependency([ProjectUpdatePermission]))],
+)
+def delete_project(db_session: DbSession, project_id: PrimaryKey):
+ """Delete a project."""
+ project = get(db_session=db_session, project_id=project_id)
+ if not project:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A project with this id does not exist."}],
+ )
+ delete(db_session=db_session, project_id=project_id)
diff --git a/src/dispatch/rate_limiter.py b/src/dispatch/rate_limiter.py
new file mode 100644
index 000000000000..52054d33310c
--- /dev/null
+++ b/src/dispatch/rate_limiter.py
@@ -0,0 +1,5 @@
+from slowapi import Limiter
+from slowapi.util import get_remote_address
+
+
+limiter = Limiter(key_func=get_remote_address)
diff --git a/src/dispatch/report/__init__.py b/src/dispatch/report/__init__.py
new file mode 100644
index 000000000000..e69de29bb2d1
diff --git a/src/dispatch/report/enums.py b/src/dispatch/report/enums.py
new file mode 100644
index 000000000000..6402257ad652
--- /dev/null
+++ b/src/dispatch/report/enums.py
@@ -0,0 +1,6 @@
+from dispatch.enums import DispatchEnum
+
+
+class ReportTypes(DispatchEnum):
+ tactical_report = "Tactical Report"
+ executive_report = "Executive Report"
diff --git a/src/dispatch/report/flows.py b/src/dispatch/report/flows.py
new file mode 100644
index 000000000000..6b473658d041
--- /dev/null
+++ b/src/dispatch/report/flows.py
@@ -0,0 +1,226 @@
+import logging
+
+from datetime import date
+
+from pydantic import ValidationError
+
+from dispatch.decorators import background_task
+from dispatch.document import service as document_service
+from dispatch.document.models import DocumentCreate
+from dispatch.enums import DocumentResourceTypes
+from dispatch.event import service as event_service
+from dispatch.incident import service as incident_service
+from dispatch.participant import service as participant_service
+from dispatch.plugin import service as plugin_service
+
+from .enums import ReportTypes
+from .messaging import (
+ send_executive_report_to_notifications_group,
+ send_tactical_report_to_conversation,
+ send_tactical_report_to_tactical_group,
+)
+from .models import ReportCreate, TacticalReportCreate, ExecutiveReportCreate
+from .service import create, get_all_by_incident_id_and_type
+
+
+log = logging.getLogger(__name__)
+
+
+@background_task
+def create_tactical_report(
+ user_email: str,
+ incident_id: int,
+ tactical_report_in: TacticalReportCreate,
+ organization_slug: str = None,
+ db_session=None,
+):
+ """Creates and sends a new tactical report to a conversation."""
+ conditions = tactical_report_in.conditions
+ actions = tactical_report_in.actions
+ needs = tactical_report_in.needs
+
+ # we load the incident instance
+ incident = incident_service.get(db_session=db_session, incident_id=incident_id)
+
+ # we create a new tactical report
+ details = {"conditions": conditions, "actions": actions, "needs": needs}
+ tactical_report_in = ReportCreate(details=details, type=ReportTypes.tactical_report)
+ tactical_report = create(db_session=db_session, report_in=tactical_report_in)
+
+ # we load the participant
+ participant = participant_service.get_by_incident_id_and_email(
+ db_session=db_session, incident_id=incident_id, email=user_email
+ )
+
+ # we save the tactical report
+ participant.reports.append(tactical_report)
+ incident.reports.append(tactical_report)
+
+ db_session.add(participant)
+ db_session.add(incident)
+ db_session.commit()
+
+ event_service.log_incident_event(
+ db_session=db_session,
+ source="Incident Participant",
+ description=f"{participant.individual.name} created a new tactical report",
+ details={"conditions": conditions, "actions": actions, "needs": needs},
+ incident_id=incident_id,
+ individual_id=participant.individual.id,
+ owner=participant.individual.name,
+ )
+
+ # we send the tactical report to the conversation
+ send_tactical_report_to_conversation(incident_id, conditions, actions, needs, db_session)
+
+ # we send the tactical report to the tactical group
+ send_tactical_report_to_tactical_group(incident_id, tactical_report, db_session)
+
+ return tactical_report
+
+
+@background_task
+def create_executive_report(
+ user_email: str,
+ incident_id: int,
+ executive_report_in: ExecutiveReportCreate,
+ organization_slug: str = None,
+ db_session=None,
+):
+ """Creates an executive report."""
+ current_date = date.today().strftime("%B %d, %Y")
+
+ current_status = executive_report_in.current_status
+ overview = executive_report_in.overview
+ next_steps = executive_report_in.next_steps
+
+ # we load the incident instance
+ incident = incident_service.get(db_session=db_session, incident_id=incident_id)
+
+ if not incident.incident_type.executive_template_document:
+ raise ValidationError([
+ {
+ "msg": "No executive report template defined.",
+ "loc": "executive_template_document",
+ }
+ ])
+
+ # we fetch all previous executive reports
+ executive_reports = get_all_by_incident_id_and_type(
+ db_session=db_session, incident_id=incident_id, report_type=ReportTypes.executive_report
+ )
+
+ previous_executive_reports = []
+ for executive_report in executive_reports:
+ previous_executive_reports.append(
+ f"{executive_report.document.name} - {executive_report.document.weblink}\n"
+ )
+
+ # we create a new executive report
+ details = {"current_status": current_status, "overview": overview, "next_steps": next_steps}
+ executive_report_in = ReportCreate(
+ details=details,
+ type=ReportTypes.executive_report,
+ )
+ executive_report = create(db_session=db_session, report_in=executive_report_in)
+
+ # we load the participant
+ participant = participant_service.get_by_incident_id_and_email(
+ db_session=db_session, incident_id=incident_id, email=user_email
+ )
+
+ # we save the executive report
+ participant.reports.append(executive_report)
+ incident.reports.append(executive_report)
+
+ db_session.add(participant)
+ db_session.add(incident)
+ db_session.commit()
+
+ event_service.log_incident_event(
+ db_session=db_session,
+ source="Incident Participant",
+ description=f"{participant.individual.name} created a new executive report",
+ details={"current_status": current_status, "overview": overview, "next_steps": next_steps},
+ incident_id=incident_id,
+ individual_id=participant.individual.id,
+ owner=participant.individual.name,
+ )
+
+ # we create a new document for the executive report
+ storage_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="storage"
+ )
+ executive_report_document_name = f"{incident.name} - Executive Report - {current_date}"
+ executive_report_document = storage_plugin.instance.copy_file(
+ folder_id=incident.storage.resource_id,
+ file_id=incident.incident_type.executive_template_document.resource_id,
+ name=executive_report_document_name,
+ )
+
+ executive_report_document.update(
+ {
+ "name": executive_report_document_name,
+ "description": incident.incident_type.executive_template_document.description,
+ "resource_type": DocumentResourceTypes.executive,
+ }
+ )
+
+ storage_plugin.instance.move_file(
+ new_folder_id=incident.storage.resource_id, file_id=executive_report_document["id"]
+ )
+
+ event_service.log_incident_event(
+ db_session=db_session,
+ source=storage_plugin.plugin.title,
+ description="Executive report document added to storage",
+ incident_id=incident.id,
+ )
+
+ document_in = DocumentCreate(
+ name=executive_report_document["name"],
+ description=executive_report_document["description"],
+ resource_id=executive_report_document["id"],
+ resource_type=executive_report_document["resource_type"],
+ project=incident.project,
+ weblink=executive_report_document["weblink"],
+ )
+ executive_report.document = document_service.create(
+ db_session=db_session, document_in=document_in
+ )
+
+ incident.documents.append(executive_report.document)
+
+ db_session.add(executive_report)
+ db_session.add(incident)
+ db_session.commit()
+
+ event_service.log_incident_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description="Executive report document added to incident",
+ incident_id=incident.id,
+ )
+
+ # we update the incident update document
+ document_plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="document"
+ )
+ document_plugin.instance.update(
+ executive_report_document["id"],
+ name=incident.name,
+ title=incident.title,
+ current_date=current_date,
+ current_status=current_status,
+ overview=overview,
+ next_steps=next_steps,
+ previous_reports="\n".join(previous_executive_reports),
+ commander_fullname=incident.commander.individual.name,
+ commander_team=incident.commander.team,
+ commander_weblink=incident.commander.individual.weblink,
+ )
+
+ # we send the executive report to the notifications group
+ send_executive_report_to_notifications_group(incident.id, executive_report, db_session)
+
+ return executive_report
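The flow above copies a template document from storage and then asks the document plugin to fill in its fields (status, overview, next steps, and so on). The substitution step can be sketched with the standard library's `string.Template`; the template text and `render_report` helper are illustrative, not Dispatch's actual templating, which lives in the storage/document plugins:

```python
from string import Template

# Toy executive-report template; real templates are documents in the
# storage provider (e.g. Google Docs) filled by the document plugin.
TEMPLATE = Template(
    "Executive Report for $name ($current_date)\n"
    "Status: $current_status\n"
    "Overview: $overview\n"
    "Next steps: $next_steps\n"
)


def render_report(name, current_date, current_status, overview, next_steps):
    """Substitute report fields into the template text."""
    return TEMPLATE.substitute(
        name=name,
        current_date=current_date,
        current_status=current_status,
        overview=overview,
        next_steps=next_steps,
    )


report = render_report(
    name="dispatch-example-123",
    current_date="April 17, 2021",
    current_status="Stable",
    overview="Credential exposure contained.",
    next_steps="Rotate remaining keys.",
)
print(report)
```

`substitute` raises `KeyError` for any placeholder left unfilled, which is the same failure mode as a report created with a missing field.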
diff --git a/src/dispatch/report/messaging.py b/src/dispatch/report/messaging.py
new file mode 100644
index 000000000000..803df11dbcca
--- /dev/null
+++ b/src/dispatch/report/messaging.py
@@ -0,0 +1,202 @@
+import logging
+
+from datetime import datetime, timedelta
+from sqlalchemy.orm import Session
+
+from dispatch.conversation.enums import ConversationCommands
+from dispatch.database.core import resolve_attr
+from dispatch.incident import service as incident_service
+from dispatch.incident.models import Incident
+from dispatch.messaging.strings import (
+ INCIDENT_EXECUTIVE_REPORT,
+ INCIDENT_REPORT_REMINDER,
+ INCIDENT_REPORT_REMINDER_DELAYED,
+ INCIDENT_TACTICAL_REPORT,
+ MessageType,
+)
+from dispatch.plugin import service as plugin_service
+
+from .enums import ReportTypes
+from .models import Report
+
+log = logging.getLogger(__name__)
+
+
+def get_report_reminder_settings(report_type: ReportTypes):
+ report_reminder_settings_map = {
+ ReportTypes.tactical_report: (
+ ConversationCommands.tactical_report,
+ MessageType.incident_tactical_report,
+ ),
+ ReportTypes.executive_report: (
+ ConversationCommands.executive_report,
+ MessageType.incident_executive_report,
+ ),
+ }
+
+ return report_reminder_settings_map.get(report_type, (None, None))
+
+
+def send_tactical_report_to_conversation(
+ incident_id: int, conditions: str, actions: str, needs: str, db_session: Session
+):
+ """Sends a tactical report to the conversation."""
+ # we load the incident instance
+ incident = incident_service.get(db_session=db_session, incident_id=incident_id)
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="conversation"
+ )
+
+ if not plugin:
+ log.warning("Tactical report not sent, no conversation plugin enabled.")
+ return
+
+ plugin.instance.send(
+ incident.conversation.channel_id,
+ "Incident Tactical Report",
+ INCIDENT_TACTICAL_REPORT,
+ notification_type=MessageType.incident_tactical_report,
+ persist=True,
+ conditions=conditions,
+ actions=actions,
+ needs=needs,
+ )
+
+ log.debug(f"Tactical report sent to conversation {incident.conversation.channel_id}.")
+
+
+def send_tactical_report_to_tactical_group(
+ incident_id: int,
+ tactical_report: Report,
+ db_session: Session,
+):
+ """Sends a tactical report to the tactical group."""
+ # we load the incident instance
+ incident = incident_service.get(db_session=db_session, incident_id=incident_id)
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="email"
+ )
+
+ if not plugin:
+ log.warning("Tactical report not sent. No email plugin enabled.")
+ return
+
+ notification_text = "Tactical Report"
+
+ # The plugin may raise tenacity.RetryError even when the email eventually goes through.
+ try:
+ plugin.instance.send(
+ incident.tactical_group.email,
+ notification_text,
+ INCIDENT_TACTICAL_REPORT,
+ MessageType.incident_tactical_report,
+ name=incident.name,
+ title=incident.title,
+ conditions=tactical_report.details.get("conditions"),
+ actions=tactical_report.details.get("actions"),
+ needs=tactical_report.details.get("needs"),
+ contact_fullname=incident.commander.individual.name,
+ contact_team=incident.commander.team,
+ contact_weblink=incident.commander.individual.weblink,
+ )
+ except Exception as e:
+ log.error(
+ f"Error in sending {notification_text} email to {incident.tactical_group.email}: {e}"
+ )
+ else:
+ log.debug(f"Tactical report sent to tactical group {incident.tactical_group.email}.")
+
+
+def send_executive_report_to_notifications_group(
+ incident_id: int,
+ executive_report: Report,
+ db_session: Session,
+):
+ """Sends an executive report to the notifications group."""
+ # we load the incident instance
+ incident = incident_service.get(db_session=db_session, incident_id=incident_id)
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="email"
+ )
+
+ if not plugin:
+ log.warning("Executive report not sent. No email plugin enabled.")
+ return
+
+ notification_text = "Executive Report"
+ plugin.instance.send(
+ incident.notifications_group.email,
+ notification_text,
+ INCIDENT_EXECUTIVE_REPORT,
+ MessageType.incident_executive_report,
+ name=incident.name,
+ title=incident.title,
+ current_status=executive_report.details.get("current_status"),
+ overview=executive_report.details.get("overview"),
+ next_steps=executive_report.details.get("next_steps"),
+ weblink=executive_report.document.weblink,
+ notifications_group=incident.notifications_group.email,
+ contact_fullname=incident.commander.individual.name,
+ contact_weblink=incident.commander.individual.weblink,
+ )
+
+ log.debug(f"Executive report sent to notifications group {incident.notifications_group.email}.")
+
+
+def send_incident_report_reminder(
+ incident: Incident, report_type: ReportTypes, db_session: Session, reminder=False
+):
+ """Sends a direct message to the incident commander indicating that they should complete a report."""
+ message_text = f"Incident {report_type} Reminder"
+ message_template = INCIDENT_REPORT_REMINDER_DELAYED if reminder else INCIDENT_REPORT_REMINDER
+ command_name, message_type = get_report_reminder_settings(report_type)
+
+ # Null out db attribute if this is a delayed reminder
+ if reminder:
+ if report_type == ReportTypes.tactical_report:
+ incident.delay_tactical_report_reminder = None
+ elif report_type == ReportTypes.executive_report:
+ incident.delay_executive_report_reminder = None
+ db_session.add(incident)
+ db_session.commit()
+
+ # suppress the reminder if a tactical report was filed within the last hour
+ now = datetime.utcnow()
+ if incident.last_tactical_report:
+ last_reported_at = incident.last_tactical_report.created_at
+ if now - last_reported_at < timedelta(hours=1):
+ return
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=incident.project.id, plugin_type="conversation"
+ )
+ if not plugin:
+ log.warning("Incident report reminder not sent, no conversation plugin enabled.")
+ return
+
+ report_command = plugin.instance.get_command_name(command_name)
+ ticket_weblink = resolve_attr(incident, "ticket.weblink")
+
+ items = [
+ {
+ "command": report_command,
+ "name": incident.name,
+ "report_type": report_type,
+ "ticket_weblink": ticket_weblink,
+ "title": incident.title,
+ "incident_id": incident.id,
+ "organization_slug": incident.project.organization.slug,
+ }
+ ]
+
+ plugin.instance.send_direct(
+ incident.commander.individual.email,
+ message_text,
+ message_template,
+ message_type,
+ items=items,
+ )
+ log.debug(f"Incident report reminder sent to {incident.commander.individual.email}.")
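The quiet-period check in `send_incident_report_reminder` (skip the reminder when a report was filed within the last hour) can be isolated as a standalone predicate; `should_send_reminder` is an illustrative name, not a Dispatch function:

```python
from datetime import datetime, timedelta


def should_send_reminder(last_reported_at, now=None, quiet_period=timedelta(hours=1)):
    """Return True unless a report was filed within the quiet period."""
    now = now or datetime.utcnow()
    if last_reported_at is None:
        return True  # no report yet: always remind
    return now - last_reported_at >= quiet_period


noon = datetime(2022, 1, 1, 12, 0)
assert should_send_reminder(None, now=noon)
assert not should_send_reminder(datetime(2022, 1, 1, 11, 30), now=noon)  # 30 min ago
assert should_send_reminder(datetime(2022, 1, 1, 10, 0), now=noon)       # 2 hours ago
```

Injecting `now` keeps the predicate deterministic under test, which the inline `datetime.utcnow()` in the message helper is not.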
diff --git a/src/dispatch/report/models.py b/src/dispatch/report/models.py
new file mode 100644
index 000000000000..0e83b07e2861
--- /dev/null
+++ b/src/dispatch/report/models.py
@@ -0,0 +1,70 @@
+from datetime import datetime
+
+
+from sqlalchemy import Column, DateTime, Integer, String, ForeignKey, event
+from sqlalchemy.orm import relationship
+from sqlalchemy_utils import JSONType, TSVectorType
+
+from dispatch.database.core import Base
+from dispatch.models import DispatchBase, Pagination
+
+from .enums import ReportTypes
+
+
+class Report(Base):
+ id = Column(Integer, primary_key=True)
+ created_at = Column(DateTime, default=datetime.utcnow)
+ details = Column(JSONType, nullable=True)
+ details_raw = Column(String, nullable=True)
+ type = Column(String, nullable=False, server_default=ReportTypes.tactical_report)
+
+ # relationships
+ incident_id = Column(Integer, ForeignKey("incident.id", ondelete="CASCADE", use_alter=True))
+ participant_id = Column(Integer, ForeignKey("participant.id"))
+ document = relationship("Document", uselist=False, backref="report")
+
+ # full text search capabilities
+ search_vector = Column(TSVectorType("details_raw"))
+
+ @staticmethod
+ def _details_raw(mapper, connection, target):
+ target.details_raw = " ".join(target.details.values())
+
+ @classmethod
+ def __declare_last__(cls):
+ event.listen(cls, "before_update", cls._details_raw)
+
+
+# Pydantic models...
+class ReportBase(DispatchBase):
+ details: dict | None = None
+ type: ReportTypes
+
+
+class ReportCreate(ReportBase):
+ pass
+
+
+class ReportUpdate(ReportBase):
+ pass
+
+
+class ReportRead(ReportBase):
+ id: int
+ created_at: datetime | None = None
+
+
+class ReportPagination(Pagination):
+ items: list[ReportRead] = []
+
+
+class TacticalReportCreate(DispatchBase):
+ conditions: str
+ actions: str
+ needs: str
+
+
+class ExecutiveReportCreate(DispatchBase):
+ current_status: str
+ overview: str
+ next_steps: str
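`Report` keeps a flattened `details_raw` copy of the JSON `details` column so the `TSVectorType` search vector has plain text to index, refreshed by a SQLAlchemy `before_update` listener. A minimal self-contained version of that pattern (in-memory SQLite, illustrative `Note` model) looks like this; note that the listener here is also wired to `before_insert`, which `Report` above does not do:

```python
from sqlalchemy import JSON, Column, Integer, String, create_engine, event
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()


class Note(Base):
    __tablename__ = "note"
    id = Column(Integer, primary_key=True)
    details = Column(JSON)
    details_raw = Column(String)  # flattened copy used for text search


def _flatten(mapper, connection, target):
    # Same idea as Report._details_raw: join the JSON values into one string.
    target.details_raw = " ".join(target.details.values())


# Report only registers "before_update"; we also cover inserts here.
event.listen(Note, "before_insert", _flatten)
event.listen(Note, "before_update", _flatten)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
with Session(engine) as session:
    note = Note(details={"conditions": "stable", "actions": "monitor"})
    session.add(note)
    session.commit()
    flattened = note.details_raw  # "stable monitor"
```

Deriving the column in a mapper event keeps the flattening next to the model instead of scattering it across every create/update call site.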
diff --git a/src/dispatch/report/scheduled.py b/src/dispatch/report/scheduled.py
new file mode 100644
index 000000000000..edfc73cb8382
--- /dev/null
+++ b/src/dispatch/report/scheduled.py
@@ -0,0 +1,100 @@
+"""
+.. module: dispatch.report.scheduled
+ :platform: Unix
+ :copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
+ :license: Apache, see LICENSE for more details.
+"""
+
+import logging
+from datetime import datetime, timedelta
+from schedule import every
+
+from sqlalchemy.orm import Session
+from dispatch.decorators import scheduled_project_task, timer
+from dispatch.incident import service as incident_service
+from dispatch.incident.enums import IncidentStatus
+from dispatch.project.models import Project
+from dispatch.scheduler import scheduler
+
+from .messaging import send_incident_report_reminder
+from .models import ReportTypes
+
+
+log = logging.getLogger(__name__)
+
+
+def reminder_set_in_future(reminder: datetime | None) -> bool:
+ """Returns True if this reminder was manually delayed, so the regularly scheduled one should be skipped."""
+ if reminder and reminder - datetime.utcnow() > timedelta(minutes=1):
+ return True
+ return False
+
+
+@scheduler.add(every(1).hours, name="incident-report-reminders")
+@timer
+@scheduled_project_task
+def incident_report_reminders(db_session: Session, project: Project):
+ """Sends report reminders to incident commanders for active incidents."""
+ incidents = incident_service.get_all_by_status(
+ db_session=db_session, project_id=project.id, status=IncidentStatus.active
+ )
+
+ for incident in incidents:
+ for report_type in ReportTypes:
+ try:
+ remind_after = incident.created_at
+ if report_type == ReportTypes.tactical_report:
+ if reminder_set_in_future(incident.delay_tactical_report_reminder):
+ continue
+ notification_hour = incident.incident_priority.tactical_report_reminder
+ if incident.last_tactical_report:
+ remind_after = incident.last_tactical_report.created_at
+ elif report_type == ReportTypes.executive_report:
+ if reminder_set_in_future(incident.delay_executive_report_reminder):
+ continue
+ notification_hour = incident.incident_priority.executive_report_reminder
+ if incident.last_executive_report:
+ remind_after = incident.last_executive_report.created_at
+
+ now = datetime.utcnow() - remind_after
+
+ # we calculate the number of hours and seconds since last report was sent
+ hours, seconds = divmod((now.days * 86400) + now.seconds, 3600)
+
+ q, r = divmod(hours, notification_hour)
+ if q >= 1 and r == 0: # it's time to send the reminder
+ send_incident_report_reminder(incident, report_type, db_session)
+
+ except Exception as e:
+ # we shouldn't fail to send all reminders when one fails
+ log.exception(e)
+
+
+@scheduler.add(every(5).minutes, name="incident-report-delayed-reminders")
+@timer
+@scheduled_project_task
+def incident_report_delayed_reminders(db_session: Session, project: Project):
+ """Sends user-delayed report reminders to incident commanders for active incidents."""
+ incidents = incident_service.get_all_by_status(
+ db_session=db_session, project_id=project.id, status=IncidentStatus.active
+ )
+
+ for incident in incidents:
+ try:
+ if exec_report_time := incident.delay_executive_report_reminder:
+ if datetime.utcnow() - exec_report_time > timedelta(minutes=1):
+ # send exec report reminder now
+ send_incident_report_reminder(
+ incident, ReportTypes.executive_report, db_session, True
+ )
+
+ if tactical_report_time := incident.delay_tactical_report_reminder:
+ if datetime.utcnow() - tactical_report_time > timedelta(minutes=1):
+ # send tactical report reminder now
+ send_incident_report_reminder(
+ incident, ReportTypes.tactical_report, db_session, True
+ )
+
+ except Exception as e:
+ # we shouldn't fail to send all reminders when one fails
+ log.exception(e)
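The cadence arithmetic in `incident_report_reminders` (fire when a whole multiple of the priority's reminder interval has elapsed since the last report) can be reproduced in isolation; `is_reminder_due` is an illustrative name, not part of Dispatch:

```python
from datetime import datetime, timedelta


def is_reminder_due(remind_after, notification_hour, now=None):
    """True when the elapsed whole hours are an exact multiple of the cadence."""
    now = now or datetime.utcnow()
    delta = now - remind_after
    hours, _ = divmod(delta.days * 86400 + delta.seconds, 3600)
    q, r = divmod(hours, notification_hour)
    return q >= 1 and r == 0


start = datetime(2022, 1, 1, 0, 0)
assert is_reminder_due(start, 6, now=start + timedelta(hours=6))      # on the mark
assert not is_reminder_due(start, 6, now=start + timedelta(hours=7))  # fired at hour 6
assert is_reminder_due(start, 6, now=start + timedelta(hours=12))     # next multiple
```

Because the scheduled task runs hourly, `r == 0` is hit at most once per cadence window, so no extra "already sent" state is needed.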
diff --git a/src/dispatch/report/service.py b/src/dispatch/report/service.py
new file mode 100644
index 000000000000..8299ddf0e226
--- /dev/null
+++ b/src/dispatch/report/service.py
@@ -0,0 +1,64 @@
+
+from .enums import ReportTypes
+from .models import Report, ReportCreate, ReportUpdate
+
+
+def get(*, db_session, report_id: int) -> Report | None:
+ """Get a report by id."""
+ return db_session.query(Report).filter(Report.id == report_id).one_or_none()
+
+
+def get_most_recent_by_incident_id_and_type(
+ *, db_session, incident_id: int, report_type: ReportTypes
+) -> Report | None:
+ """Get most recent report by incident id and report type."""
+ return (
+ db_session.query(Report)
+ .filter(Report.incident_id == incident_id)
+ .filter(Report.type == report_type)
+ .order_by(Report.created_at.desc())
+ .first()
+ )
+
+
+def get_all_by_incident_id_and_type(
+ *, db_session, incident_id: int, report_type: ReportTypes
+) -> list[Report]:
+ """Get all reports by incident id and report type."""
+ return (
+ db_session.query(Report)
+ .filter(Report.incident_id == incident_id)
+ .filter(Report.type == report_type)
+ .all()
+ )
+
+
+def get_all(*, db_session) -> list[Report]:
+ """Get all reports."""
+ return db_session.query(Report).all()
+
+
+def create(*, db_session, report_in: ReportCreate) -> Report:
+ """Create a new report."""
+ report = Report(**report_in.dict())
+ db_session.add(report)
+ db_session.commit()
+ return report
+
+
+def update(*, db_session, report: Report, report_in: ReportUpdate) -> Report:
+ """Updates a report."""
+ report_data = report.dict()
+ update_data = report_in.dict(exclude_unset=True)
+
+ for field in report_data:
+ if field in update_data:
+ setattr(report, field, update_data[field])
+
+ db_session.commit()
+ return report
+
+
+def delete(*, db_session, report_id: int):
+ """Deletes a report."""
+ db_session.query(Report).filter(Report.id == report_id).delete()
+ db_session.commit()
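The `update` function above copies onto the ORM object only the fields the caller actually sent (that is what Pydantic's `dict(exclude_unset=True)` guarantees), leaving every other column untouched. A dependency-free sketch of that partial-update loop, with illustrative `FakeReport` and `apply_partial_update` names:

```python
def apply_partial_update(obj, current_fields, update_data):
    """Copy only the fields present in the payload, as in service.update()."""
    for field in current_fields:
        if field in update_data:
            setattr(obj, field, update_data[field])
    return obj


class FakeReport:
    details = {"conditions": "old", "actions": "none"}
    type = "tactical_report"


report = apply_partial_update(
    FakeReport(),
    current_fields=["details", "type"],
    update_data={"details": {"conditions": "new", "actions": "patch"}},  # `type` unset
)
assert report.details == {"conditions": "new", "actions": "patch"}
assert report.type == "tactical_report"  # untouched field keeps its value
```

This is why a PATCH-style caller can send `{"details": ...}` without accidentally resetting `type` to a default.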
diff --git a/src/dispatch/route/flows.py b/src/dispatch/route/flows.py
deleted file mode 100644
index b136b18c36af..000000000000
--- a/src/dispatch/route/flows.py
+++ /dev/null
@@ -1,48 +0,0 @@
-"""
-.. module: dispatch.task.flows
- :platform: Unix
- :copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
- :license: Apache, see LICENSE for more details.
-
-.. moduleauthor:: Kevin Glisson
-"""
-import logging
-from collections import defaultdict
-from datetime import datetime
-
-from dispatch.config import INCIDENT_PLUGIN_EMAIL_SLUG
-from dispatch.messaging import INCIDENT_TASK_REMINDER
-from dispatch.plugins.base import plugins
-
-log = logging.getLogger(__name__)
-
-
-def group_tasks_by_assignee(tasks):
- """Groups tasks by assignee."""
- grouped = defaultdict(lambda: [])
- for task in tasks:
- grouped[task.assignees].append(task)
- return grouped
-
-
-def create_reminder(db_session, assignee, tasks):
- """Contains the logic for incident task reminders."""
- # send email
- email_plugin = plugins.get(INCIDENT_PLUGIN_EMAIL_SLUG)
- message_template = INCIDENT_TASK_REMINDER
-
- notification_type = "incident-task-reminder"
- email_plugin.send(
- assignee, message_template, notification_type, name="Task Reminder", items=tasks
- )
-
- # We currently think DM's might be too agressive
- # send slack
- # convo_plugin = plugins.get(INCIDENT_PLUGIN_CONVERSATION_SLUG)
- # convo_plugin.send_direct(
- # assignee, notification_text, message_template, notification_type, items=tasks
- # )
-
- for task in tasks:
- task.last_reminder_at = datetime.utcnow()
- db_session.commit()
diff --git a/src/dispatch/route/models.py b/src/dispatch/route/models.py
index 7342cf1063a6..cb9eb5cec2ae 100644
--- a/src/dispatch/route/models.py
+++ b/src/dispatch/route/models.py
@@ -1,140 +1,44 @@
-from typing import List, Optional
+from datetime import datetime
-from pydantic import validator
-from sqlalchemy import Boolean, Column, ForeignKey, Integer, PrimaryKeyConstraint, String, Table
+from sqlalchemy import Boolean, Column, ForeignKey, Integer, DateTime, String
from sqlalchemy.orm import relationship
+from sqlalchemy_utils import JSONType
-from dispatch.database import Base
-from dispatch.incident_priority.models import IncidentPriorityRead
-from dispatch.incident_type.models import IncidentTypeRead
+from dispatch.database.core import Base
from dispatch.models import (
DispatchBase,
- IndividualReadNested,
- ServiceReadNested,
- TeamReadNested,
- DocumentRead,
-)
-from dispatch.term.models import TermRead
-
-recommendation_documents = Table(
- "recommendation_documents",
- Base.metadata,
- Column("document_id", Integer, ForeignKey("document.id")),
- Column("recommendation_id", Integer, ForeignKey("recommendation.id")),
- PrimaryKeyConstraint("document_id", "recommendation_id"),
-)
-
-recommendation_terms = Table(
- "recommendation_terms",
- Base.metadata,
- Column("term_id", Integer, ForeignKey("term.id")),
- Column("recommendation_id", Integer, ForeignKey("recommendation.id")),
- PrimaryKeyConstraint("term_id", "recommendation_id"),
-)
-
-recommendation_services = Table(
- "recommendation_services",
- Base.metadata,
- Column("service_id", Integer, ForeignKey("service.id")),
- Column("recommendation_id", Integer, ForeignKey("recommendation.id")),
- PrimaryKeyConstraint("service_id", "recommendation_id"),
-)
-
-recommendation_individual_contacts = Table(
- "recommendation_individual_contacts",
- Base.metadata,
- Column("individual_contact_id", Integer, ForeignKey("individual_contact.id")),
- Column("recommendation_id", Integer, ForeignKey("recommendation.id")),
- PrimaryKeyConstraint("individual_contact_id", "recommendation_id"),
)
+from dispatch.incident.models import IncidentRead
-recommendation_team_contacts = Table(
- "recommendation_team_contacts",
- Base.metadata,
- Column("team_contact_id", Integer, ForeignKey("team_contact.id")),
- Column("recommendation_id", Integer, ForeignKey("recommendation.id")),
- PrimaryKeyConstraint("team_contact_id", "recommendation_id"),
-)
-recommendation_incident_types = Table(
- "recommendation_incident_types",
- Base.metadata,
- Column("incident_type_id", Integer, ForeignKey("incident_type.id")),
- Column("recommendation_id", Integer, ForeignKey("recommendation.id")),
- PrimaryKeyConstraint("incident_type_id", "recommendation_id"),
-)
-
-recommendation_incident_priorities = Table(
- "recommendation_incident_priorities",
- Base.metadata,
- Column("incident_priority_id", Integer, ForeignKey("incident_priority.id")),
- Column("recommendation_id", Integer, ForeignKey("recommendation.id")),
- PrimaryKeyConstraint("incident_priority_id", "recommendation_id"),
-)
-
-
-class RecommendationAccuracy(Base):
+class RecommendationMatch(Base):
id = Column(Integer, primary_key=True)
recommendation_id = Column(Integer, ForeignKey("recommendation.id"))
correct = Column(Boolean)
- resource_id = Column(Integer)
resource_type = Column(String)
+ resource_state = Column(JSONType)
class Recommendation(Base):
id = Column(Integer, primary_key=True)
- text = Column(String)
- matched_terms = relationship("Term", secondary=recommendation_terms, backref="recommendations")
- accuracy = relationship("RecommendationAccuracy")
-
- documents = relationship(
- "Document", secondary=recommendation_documents, backref="recommendations"
- )
- team_contacts = relationship(
- "TeamContact", secondary=recommendation_team_contacts, backref="recommendations"
- )
- individual_contacts = relationship(
- "IndividualContact", secondary=recommendation_individual_contacts, backref="recommendations"
- )
- service_contacts = relationship(
- "Service", secondary=recommendation_services, backref="recommendations"
- )
- incident_types = relationship(
- "IncidentType", secondary=recommendation_incident_types, backref="recommendations"
- )
- incident_priorities = relationship(
- "IncidentPriority", secondary=recommendation_incident_priorities, backref="recommendations"
- )
+ incident_id = Column(Integer, ForeignKey("incident.id"))
+ created_at = Column(DateTime, default=datetime.utcnow)
+ matches = relationship("RecommendationMatch")
# Pydantic models...
-class RecommendationBase(DispatchBase):
- text: Optional[str]
- matched_terms: Optional[List[TermRead]] = []
- service_contacts: Optional[List[ServiceReadNested]] = []
- team_contacts: Optional[List[TeamReadNested]] = []
- documents: Optional[List[DocumentRead]] = []
- individual_contacts: Optional[List[IndividualReadNested]] = []
- incident_priorities: Optional[List[IncidentPriorityRead]] = []
- incident_types: Optional[List[IncidentTypeRead]] = []
+class RecommendationMatchBase(DispatchBase):
+ correct: bool
+ resource_type: str
+ resource_state: dict
-class ContextBase(DispatchBase):
- incident_priorities: Optional[List[IncidentPriorityRead]] = []
- incident_types: Optional[List[IncidentTypeRead]] = []
- terms: Optional[List[TermRead]] = []
+class RecommendationBase(DispatchBase):
+ matches: list[RecommendationMatchBase | None] = []
class RouteBase(DispatchBase):
- text: Optional[str]
- context: Optional[ContextBase]
-
- # NOTE order matters here to validate multiple arguments we must only validate the last one.
- @validator("context")
- def check_text_or_context(cls, v, values): # pylint: disable=no-self-argument
- if "text" not in values and "context" not in values:
- raise ValueError("Either 'text' or 'context' must be passed")
- return v
+ incident: IncidentRead
class RouteRequest(RouteBase):
diff --git a/src/dispatch/route/service.py b/src/dispatch/route/service.py
index 0c2eb82dff69..509a877192e6 100644
--- a/src/dispatch/route/service.py
+++ b/src/dispatch/route/service.py
@@ -1,278 +1,59 @@
+import json
import logging
-from typing import Any, Dict, List
+from typing import Any
-import spacy
-from spacy.matcher import PhraseMatcher
-from sqlalchemy import func
-
-from dispatch.incident_priority import service as incident_priority_service
-from dispatch.incident_priority.models import IncidentPriority
-from dispatch.incident_type import service as incident_type_service
-from dispatch.incident_type.models import IncidentType
-from dispatch.term.models import Term
-
-from .models import Recommendation, RecommendationAccuracy, RouteRequest, ContextBase
+from dispatch.database.core import Base
+from dispatch.route.models import Recommendation, RecommendationMatch
+from dispatch.search_filter import service as search_filter_service
log = logging.getLogger(__name__)
-nlp = spacy.blank("en")
-nlp.vocab.lex_attr_getters = {}
-
-
-# NOTE
-# This is kinda slow so we might cheat and just build this
-# periodically or cache it
-def build_term_vocab(terms: List[Term]):
- """Builds nlp vocabulary."""
- # We need to build four sets of vocabulary
- # such that we can more accurately match
- #
- # - No change
- # - Lower
- # - Upper
- # - Title
- #
- # We may also normalize the document itself at some point
- # but it unclear how this will affect the things like
- # Parts-of-speech (POS) analysis.
- for v in terms:
- texts = [v.text, v.text.lower(), v.text.upper(), v.text.title()]
- for t in texts:
- if t: # guard against `None`
- phrase = nlp.tokenizer(t)
- for w in phrase:
- _ = nlp.tokenizer.vocab[w.text]
- yield phrase
-
-
-def build_phrase_matcher(phrases: List[str]) -> PhraseMatcher:
- """Builds a PhraseMatcher object."""
- matcher = PhraseMatcher(nlp.tokenizer.vocab)
- matcher.add("NFLX", None, *phrases) # TODO customize
- return matcher
-
-
-def extract_terms_from_document(
- document: str, phrases: List[str], matcher: PhraseMatcher
-) -> List[str]:
- """Extracts key terms out of documents."""
- terms = []
- doc = nlp.tokenizer(document)
- for w in doc:
- _ = doc.vocab[
- w.text.lower()
- ] # We normalize our docs so that vocab doesn't take so long to build.
-
- matches = matcher(doc)
- for _, start, end in matches:
- token = doc[start:end].merge()
-
- # We try to filter out common stop words unless
- # we have surrounding context that would suggest they are not stop words.
- if token.is_stop:
- continue
-
- terms.append(token.text)
-
- return terms
-
-
-def get_terms(db_session, text: str) -> List[str]:
- """Get terms from request."""
- all_terms = db_session.query(Term).all()
- phrases = build_term_vocab(all_terms)
- matcher = build_phrase_matcher(phrases)
- extracted_terms = extract_terms_from_document(text, phrases, matcher)
- return extracted_terms
-
-
-# TODO is there a better way to deduplicate across sqlalchemy models?
-def deduplicate_resources(resources: List[dict]) -> Dict:
- """Creates a new dict adding new resources if they are not yet seen."""
- contact_set = {}
- for c in resources:
- key = f"{type(c).__name__}-{c.id}"
- if key not in contact_set.keys():
- contact_set[key] = c
-
- return list(contact_set.values())
-
-
-def resource_union(resources: List[dict], inputs: int) -> Dict:
- """Ensures the an item occurs in the resources list at least n times."""
- resource_set = {}
- for c in resources:
- key = f"{type(c).__name__}-{c.id}"
- if key not in resource_set.keys():
- resource_set[key] = (1, c)
- else:
- count, obj = resource_set[key]
- resource_set[key] = (count + 1, obj)
-
- unions = []
- for key, value in resource_set.items():
- count, obj = value
- if count >= inputs:
- unions.append(obj)
- return unions
-
-
-def get_resources_from_incident_types(db_session, incident_types: List[str]) -> list:
- """Get all resources related to a specific incident type."""
- incident_types = [i.name for i in incident_types]
- incident_type_models = (
- db_session.query(IncidentType).filter(IncidentType.name.in_(incident_types)).all()
- )
-
- resources = []
- for i in incident_type_models:
- resources += i.teams
- resources += i.individuals
- resources += i.services
- resources += i.documents
-
- return resources
-
-
-def get_resources_from_priorities(db_session, incident_priorities: List[str]) -> list:
- """Get all resources related to a specific priority."""
- incident_priorities = [i.name for i in incident_priorities]
- incident_priority_models = (
- db_session.query(IncidentPriority)
- .filter(IncidentPriority.name.in_(incident_priorities))
- .all()
- )
-
- resources = []
- for i in incident_priority_models:
- resources += i.teams
- resources += i.individuals
- resources += i.services
- resources += i.documents
- return resources
-
-
-def get_resources_from_context(db_session, context: ContextBase):
- """Fetch relevent resources based on context only."""
- resources = []
- if context.incident_types:
- resources += get_resources_from_incident_types(
- db_session, incident_types=context.incident_types
- )
-
- if context.incident_priorities:
- resources += get_resources_from_priorities(
- db_session, incident_priorities=context.incident_priorities
- )
-
- inputs = 0
- if context.incident_priorities:
- inputs += 1
-
- if context.incident_types:
- inputs += 1
-
- resources = resource_union(resources, inputs)
-
- if context.terms:
- _, term_resources = get_resources_from_terms(db_session, terms=context.terms)
- resources += term_resources
-
- return resources
-
-
-def get_resources_from_terms(db_session, terms: List[str]):
- """Fetch resources based solely on connected terms with the text."""
- # lookup extracted terms
- matched_terms = (
- db_session.query(Term)
- .filter(func.upper(Term.text).in_([func.upper(t) for t in terms]))
+def get_resource_matches(
+ *, db_session, project_id: int, class_instance: Base, model: Any
+) -> list[RecommendationMatch]:
+ """Fetches all matching model entities for the given class instance."""
+ # get all entities with an associated filter
+ model_cls, model_state = model
+ resources = (
+ db_session.query(model_cls)
+ .filter(model_cls.project_id == project_id)
+ .filter(model_cls.filters.any())
.all()
)
- # find resources associated with those terms
- resources = []
- for t in matched_terms:
- resources += t.teams
- resources += t.individuals
- resources += t.services
- resources += t.documents
-
- return matched_terms, resources
-
-
-# TODO ontacts could be List[Union(...)]
-def create_recommendation(
- *, db_session, text=str, context: ContextBase, matched_terms: List[Term], resources: List[Any]
-):
- """Create recommendation object for accuracy tracking."""
- accuracy = [
- RecommendationAccuracy(resource_id=r.id, resource_type=type(r).__name__) for r in resources
- ]
-
- incident_priorities = [
- incident_priority_service.get_by_name(db_session=db_session, name=n.name)
- for n in context.incident_priorities
- ]
- incident_types = [
- incident_type_service.get_by_name(db_session=db_session, name=n.name)
- for n in context.incident_types
- ]
-
- service_contacts = [x for x in resources if type(x).__name__ == "Service"]
- individual_contacts = [x for x in resources if type(x).__name__ == "IndividualContact"]
- team_contacts = [x for x in resources if type(x).__name__ == "TeamContact"]
- documents = [x for x in resources if type(x).__name__ == "Document"]
-
- log.debug(
- f"Recommendation: Documents: {documents} Individuals: {individual_contacts} Teams: {team_contacts} Services: {service_contacts}"
- )
-
- r = Recommendation(
- accuracy=accuracy,
- service_contacts=service_contacts,
- individual_contacts=individual_contacts,
- team_contacts=team_contacts,
- documents=documents,
- incident_types=incident_types,
- incident_priorities=incident_priorities,
- matched_terms=matched_terms,
- text=text,
- )
-
- db_session.add(r)
- db_session.commit()
- return r
-
-
-def get(*, db_session, route_in: RouteRequest) -> Dict[Any, Any]:
+ matched_resources = []
+ for resource in resources:
+ for f in resource.filters:
+ match = search_filter_service.match(
+ db_session=db_session,
+ subject=f.subject,
+ filter_spec=f.expression,
+ class_instance=class_instance,
+ )
+
+ if match:
+ matched_resources.append(
+ RecommendationMatch(
+ resource_state=json.loads(model_state(**resource.__dict__).json()),
+ resource_type=model_cls.__name__,
+ )
+ )
+ break
+
+ return matched_resources
+
+
+def get(*, db_session, project_id: int, class_instance: Base, models: list[Any]) -> Recommendation:
"""Get routed resources."""
- resources = []
- matched_terms = []
- if route_in.context:
- resources.extend(
- get_resources_from_context(db_session=db_session, context=route_in.context)
+ matches = []
+ for model in models:
+ matches += get_resource_matches(
+ db_session=db_session, project_id=project_id, class_instance=class_instance, model=model
)
- if route_in.text:
- # get terms from text (question, incident description, etc,.)
- text_terms = get_terms(db_session, text=route_in.text)
- resource_matched_terms, term_resources = get_resources_from_terms(
- db_session=db_session, terms=text_terms
- )
- route_in.context.terms.extend(resource_matched_terms)
- resources.extend(term_resources)
-
- resources = deduplicate_resources(resources)
-
- # create a recommandation entry we can use to data mine at a later time
- recommendation = create_recommendation(
- db_session=db_session,
- text=route_in.text,
- context=route_in.context,
- resources=resources,
- matched_terms=matched_terms,
- )
+ recommendation = Recommendation(matches=matches)
+ db_session.add(recommendation)
+ db_session.commit()
return recommendation
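The rewritten `get_resource_matches` above follows a first-match-wins pattern: each resource may carry several filters, and as soon as one filter matches the class instance the resource is recorded and the inner loop breaks. A minimal sketch of that control flow, with a hypothetical `match` predicate standing in for `search_filter_service.match`:

```python
def first_match_wins(resources, instance, match):
    """Record each resource whose first matching filter hits the instance."""
    matched = []
    for resource in resources:
        for f in resource["filters"]:
            if match(f, instance):
                matched.append(resource["name"])
                break  # one matching filter per resource is enough
    return matched

resources = [
    {"name": "oncall-team", "filters": [{"priority": "high"}]},
    {"name": "docs", "filters": [{"priority": "low"}, {"type": "outage"}]},
]
instance = {"priority": "high", "type": "outage"}

def match(f, inst):
    # hypothetical stand-in for search_filter_service.match
    return all(inst.get(k) == v for k, v in f.items())

print(first_match_wins(resources, instance, match))  # → ['oncall-team', 'docs']
```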
diff --git a/src/dispatch/route/views.py b/src/dispatch/route/views.py
deleted file mode 100644
index a628f9fccd2e..000000000000
--- a/src/dispatch/route/views.py
+++ /dev/null
@@ -1,17 +0,0 @@
-from fastapi import APIRouter, Depends
-from sqlalchemy.orm import Session
-
-from dispatch.database import get_db
-
-from .models import RouteRequest, RouteResponse
-from .service import get
-
-router = APIRouter()
-
-
-@router.post("/", response_model=RouteResponse)
-def route(*, db_session: Session = Depends(get_db), route_in: RouteRequest):
- """
- Determine the correct entities to dispatch.
- """
- return {"recommendation": get(db_session=db_session, route_in=route_in)}
diff --git a/src/dispatch/scheduler.py b/src/dispatch/scheduler.py
index c275ce3a3bf9..0fd994a1baa1 100644
--- a/src/dispatch/scheduler.py
+++ b/src/dispatch/scheduler.py
@@ -7,19 +7,26 @@
.. moduleauthor:: Kevin Glisson
.. moduleauthor:: Marc Vilanova
"""
+
import logging
import time
+from multiprocessing.pool import ThreadPool
+
import schedule
log = logging.getLogger(__name__)
# See: https://schedule.readthedocs.io/en/stable/ for documentation on job syntax
-class Scheduler(object):
+class Scheduler:
"""Simple scheduler class that holds all scheduled functions."""
registered_tasks = []
+ running = True
+
+ def __init__(self, num_workers=100):
+ self.pool = ThreadPool(processes=num_workers)
def add(self, job, *args, **kwargs):
"""Adds a task to the scheduler."""
@@ -30,7 +37,9 @@ def decorator(func):
else:
name = kwargs.pop("name")
- self.registered_tasks.append({"name": name, "func": func, "job": job.do(func)})
+ self.registered_tasks.append(
+ {"name": name, "func": func, "job": job.do(self.pool.apply_async, func)}
+ )
return decorator
@@ -40,9 +49,21 @@ def remove(self, task):
def start(self):
"""Runs all scheduled tasks."""
- while True:
+ log.info("Starting scheduler...")
+
+ while self.running:
schedule.run_pending()
time.sleep(1)
+ def stop(self):
+ """Stops the scheduler (letting existing threads finish)."""
+ log.debug("Stopping scheduler...")
+ self.pool.close()
+ self.running = False
+
scheduler = Scheduler()
+
+
+def stop_scheduler(signum, frame):
+ scheduler.stop()
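The scheduler hunk above swaps direct job execution for `self.pool.apply_async`, so a slow task can no longer block the one-second tick loop. A rough, self-contained sketch of that pattern (a simplified stand-in, not the `schedule` library's actual API):

```python
import time
from multiprocessing.pool import ThreadPool

class MiniScheduler:
    """Dispatches jobs to a thread pool so the tick loop never blocks."""

    def __init__(self, num_workers=4):
        self.pool = ThreadPool(processes=num_workers)

    def dispatch(self, func, *args):
        # apply_async returns immediately; a worker thread runs func
        return self.pool.apply_async(func, args)

    def stop(self):
        # close() lets in-flight jobs finish, mirroring Scheduler.stop()
        self.pool.close()
        self.pool.join()

def slow_task(n):
    time.sleep(0.05)  # simulate a slow job
    return n * 2

s = MiniScheduler()
handles = [s.dispatch(slow_task, i) for i in range(3)]  # returns instantly
s.stop()
print([h.get() for h in handles])  # → [0, 2, 4]
```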
diff --git a/src/dispatch/search/fulltext/__init__.py b/src/dispatch/search/fulltext/__init__.py
new file mode 100644
index 000000000000..4619b5d89a37
--- /dev/null
+++ b/src/dispatch/search/fulltext/__init__.py
@@ -0,0 +1,496 @@
+"""
+Originally authored by:
+https://github.com/kvesteri/sqlalchemy-searchable/blob/master/sqlalchemy_searchable
+"""
+import os
+from functools import reduce
+
+from sqlalchemy import event, inspect, func, desc, text, MetaData, Table, Index, orm
+from sqlalchemy.dialects.postgresql.base import RESERVED_WORDS
+from sqlalchemy.engine import Connection
+from sqlalchemy.schema import DDL
+from sqlalchemy_utils import TSVectorType
+from typing import Any
+
+from .vectorizers import Vectorizer
+
+
+vectorizer = Vectorizer()
+
+
+class SearchQueryMixin(object):
+ def search(self, search_query, vector=None, regconfig=None, sort=False):
+ """
+ Search given query with full text search.
+
+ :param search_query: the search query
+ :param vector: search vector to use
+ :param regconfig: postgresql regconfig to be used
+ :param sort: order results by relevance (quality of hit)
+ """
+ return search(self, search_query, vector=vector, regconfig=regconfig, sort=sort)
+
+
+def inspect_search_vectors(entity):
+ return [
+ getattr(entity, key).property.columns[0]
+ for key, column in inspect(entity).columns.items()
+ if isinstance(column.type, TSVectorType)
+ ]
+
+
+def search(query, search_query, vector=None, regconfig=None, sort=False):
+ """
+ Search given query with full text search.
+
+ :param search_query: the search query
+ :param vector: search vector to use
+ :param regconfig: postgresql regconfig to be used
+ :param sort: order results by relevance (quality of hit)
+ """
+ if not search_query.strip():
+ return query
+
+ if vector is None:
+ # Get the entity class from the query in a SQLAlchemy 2.x compatible way
+ try:
+ # For SQLAlchemy 2.x
+ entity = query.column_descriptions[0]['entity']
+ except (AttributeError, IndexError, KeyError):
+ raise ValueError("Could not determine entity class from query. Please provide vector explicitly.") from None
+
+ search_vectors = inspect_search_vectors(entity)
+ vector = search_vectors[0]
+
+ if regconfig is None:
+ regconfig = search_manager.options["regconfig"]
+
+ query = query.filter(vector.op("@@")(func.tsq_parse(regconfig, search_query)))
+ if sort:
+ query = query.order_by(desc(func.ts_rank_cd(vector, func.tsq_parse(search_query))))
+
+ return query.params(term=search_query)
+
+
+def quote_identifier(identifier):
+ """Adds double quotes to given identifier. Since PostgreSQL is the only
+ supported dialect we don't need dialect specific stuff here"""
+ return '"%s"' % identifier
+
+
+class SQLConstruct(object):
+ def __init__(self, tsvector_column, conn=None, indexed_columns=None, options=None):
+ self.table = tsvector_column.table
+ self.tsvector_column = tsvector_column
+ self.conn = conn
+ self.options = self.init_options(options)
+ if indexed_columns:
+ self.indexed_columns = list(indexed_columns)
+ elif hasattr(self.tsvector_column.type, "columns"):
+ self.indexed_columns = list(self.tsvector_column.type.columns)
+ else:
+ self.indexed_columns = None
+ self.params = {}
+
+ def init_options(self, options=None):
+ if not options:
+ options = {}
+ for key, value in SearchManager.default_options.items():
+ try:
+ option = self.tsvector_column.type.options[key]
+ except (KeyError, AttributeError):
+ option = value
+ options.setdefault(key, option)
+ return options
+
+ @property
+ def schema_name(self):
+ return self.table.schema
+
+ @property
+ def table_name(self):
+ if self.table.schema:
+ return '%s."%s"' % (self.table.schema, self.table.name)
+ else:
+ return '"' + self.table.name + '"'
+
+ @property
+ def search_function_name(self):
+ return self.options["search_trigger_function_name"].format(
+ table=self.table.name, column=self.tsvector_column.name
+ )
+
+ @property
+ def search_trigger_name(self):
+ return self.options["search_trigger_name"].format(
+ table=self.table.name, column=self.tsvector_column.name
+ )
+
+ def column_vector(self, column):
+ if column.name in RESERVED_WORDS:
+ column.name = quote_identifier(column.name)
+ value = text("NEW.{column}".format(column=column.name))
+ try:
+ vectorizer_func = vectorizer[column]
+ except KeyError:
+ pass
+ else:
+ value = vectorizer_func(value)
+ value = func.coalesce(value, text("''"))
+ value = func.to_tsvector(self.options["regconfig"], value)
+ if column.name in self.options["weights"]:
+ weight = self.options["weights"][column.name]
+ value = func.setweight(value, weight)
+ return value
+
+ @property
+ def search_vector(self):
+ vectors = (
+ self.column_vector(getattr(self.table.c, column_name))
+ for column_name in self.indexed_columns
+ )
+ concatenated = reduce(lambda x, y: x.op("||")(y), vectors)
+ compiled = concatenated.compile(self.conn)
+ self.params = compiled.params
+ return compiled
+
+
+class CreateSearchFunctionSQL(SQLConstruct):
+ def __str__(self):
+ return (
+ """CREATE OR REPLACE FUNCTION
+ {schema_name}.{search_trigger_function_name}() RETURNS TRIGGER AS $$
+ BEGIN
+ NEW.{search_vector_name} = {ts_vector};
+ RETURN NEW;
+ END
+ $$ LANGUAGE 'plpgsql';
+ """
+ ).format(
+ schema_name=self.schema_name,
+ search_trigger_function_name=self.search_function_name,
+ search_vector_name=self.tsvector_column.name,
+ ts_vector=self.search_vector,
+ )
+
+
+class CreateSearchTriggerSQL(SQLConstruct):
+ @property
+ def search_trigger_function_with_trigger_args(self):
+ if self.options["weights"] or any(
+ getattr(self.table.c, column) in vectorizer for column in self.indexed_columns
+ ):
+ return self.schema_name + "." + self.search_function_name + "()"
+ return "tsvector_update_trigger({arguments})".format(
+ arguments=", ".join(
+ [self.tsvector_column.name, "'%s'" % self.options["regconfig"]]
+ + self.indexed_columns
+ )
+ )
+
+ def __str__(self):
+ return (
+ "CREATE TRIGGER {search_trigger_name}"
+ " BEFORE UPDATE OR INSERT ON {table}"
+ " FOR EACH ROW EXECUTE PROCEDURE"
+ " {procedure_ddl}".format(
+ search_trigger_name=self.search_trigger_name,
+ table=self.table_name,
+ procedure_ddl=(self.search_trigger_function_with_trigger_args),
+ )
+ )
+
+
+class DropSearchFunctionSQL(SQLConstruct):
+ def __str__(self):
+ return "DROP FUNCTION IF EXISTS %s.%s()" % (self.schema_name, self.search_function_name)
+
+
+class DropSearchTriggerSQL(SQLConstruct):
+ def __str__(self):
+ return "DROP TRIGGER IF EXISTS %s ON %s" % (
+ self.search_trigger_name,
+ self.table_name,
+ )
+
+
+class SearchManager:
+ default_options = {
+ "search_trigger_name": "{table}_{column}_trigger",
+ "search_trigger_function_name": "{table}_{column}_update",
+ "regconfig": "pg_catalog.english",
+ "weights": (),
+ }
+
+ def __init__(self, options=None):
+ self.options = self.default_options
+ if options:
+ self.options.update(options)
+ self.processed_columns = []
+ self.classes = set()
+ self.listeners = []
+
+ def option(self, column, name):
+ try:
+ return column.type.options[name]
+ except (AttributeError, KeyError):
+ return self.options[name]
+
+ def search_function_ddl(self, column):
+ def after_create(target, connection, **kw):
+ clause = CreateSearchFunctionSQL(column, conn=connection)
+ connection.exec_driver_sql(str(clause), clause.params)
+
+ return after_create
+
+ def search_trigger_ddl(self, column):
+ """
+ Returns the ddl for creating an automatically updated search trigger.
+
+ :param column: TSVectorType typed SQLAlchemy column object
+ """
+ return DDL(str(CreateSearchTriggerSQL(column)))
+
+ def inspect_columns(self, table):
+ """
+ Inspects all searchable columns for given class.
+
+ :param table: SQLAlchemy Table
+ """
+ return [column for column in table.c if isinstance(column.type, TSVectorType)]
+
+ def append_index(self, cls, column):
+ Index("_".join((column.table.name, column.name, "idx")), column, postgresql_using="gin")
+
+ def process_mapper(self, mapper, cls):
+ columns = self.inspect_columns(mapper.persist_selectable)
+ for column in columns:
+ if column in self.processed_columns:
+ continue
+
+ self.append_index(cls, column)
+
+ self.processed_columns.append(column)
+
+ def add_listener(self, args):
+ self.listeners.append(args)
+ event.listen(*args)
+
+ def remove_listeners(self):
+ for listener in self.listeners:
+ event.remove(*listener)
+ self.listeners = []
+
+ def attach_ddl_listeners(self):
+        # Remove all previously added listeners so that the same listener doesn't
+        # get added twice in situations where class configuration happens in
+        # multiple phases (issue #31).
+ self.remove_listeners()
+
+ for column in self.processed_columns:
+ # This sets up the trigger that keeps the tsvector column up to
+ # date.
+ if column.type.columns:
+ table = column.table
+ if self.option(column, "weights") or vectorizer.contains_tsvector(column):
+ self.add_listener((table, "after_create", self.search_function_ddl(column)))
+ self.add_listener(
+ (table, "after_drop", DDL(str(DropSearchFunctionSQL(column))))
+ )
+ self.add_listener((table, "after_create", self.search_trigger_ddl(column)))
+
+
+search_manager = SearchManager()
+
+
+def sync_trigger(
+ conn: Connection,
+ table: Table,
+ tsvector_column: str,
+ indexed_columns: list[str],
+ metadata: MetaData | None = None,
+ options: dict[str, Any] | None = None,
+) -> None:
+ """
+ Synchronizes search trigger and trigger function for given table and given
+ search index column. Internally this function executes the following SQL
+ queries:
+
+ * Drops search trigger for given table (if it exists)
+ * Drops search function for given table (if it exists)
+ * Creates search function for given table
+ * Creates search trigger for given table
+ * Updates all rows for given search vector by running a column=column
+ update query for given table.
+
+
+ Example::
+
+ from sqlalchemy_searchable import sync_trigger
+
+
+ sync_trigger(
+ conn,
+ 'article',
+ 'search_vector',
+ ['name', 'content']
+ )
+
+
+ This function is especially useful when working with alembic migrations.
+ In the following example we add a content column to article table and then
+ sync the trigger to contain this new column. ::
+
+
+ from alembic import op
+ from sqlalchemy_searchable import sync_trigger
+
+
+ def upgrade():
+ conn = op.get_bind()
+ op.add_column('article', sa.Column('content', sa.Text))
+
+ sync_trigger(conn, 'article', 'search_vector', ['name', 'content'])
+
+ # ... same for downgrade
+
+
+ If you are using vectorizers you need to initialize them in your migration
+ file and pass them to this function. ::
+
+
+ import sqlalchemy as sa
+ from alembic import op
+ from sqlalchemy.dialects.postgresql import HSTORE
+ from sqlalchemy_searchable import sync_trigger, vectorizer
+
+
+ def upgrade():
+ vectorizer.clear()
+
+ conn = op.get_bind()
+ op.add_column('article', sa.Column('name_translations', HSTORE))
+
+ metadata = sa.MetaData(bind=conn)
+ articles = sa.Table('article', metadata, autoload=True)
+
+ @vectorizer(articles.c.name_translations)
+ def hstore_vectorizer(column):
+ return sa.cast(sa.func.avals(column), sa.Text)
+
+ op.add_column('article', sa.Column('content', sa.Text))
+ sync_trigger(
+ conn,
+ 'article',
+ 'search_vector',
+ ['name_translations', 'content'],
+ metadata=metadata
+ )
+
+ # ... same for downgrade
+
+ :param conn: SQLAlchemy Connection object
+ :param table: table to apply search trigger syncing
+ :param tsvector_column:
+ TSVector typed column which is used as the search index column
+ :param indexed_columns:
+ Full text indexed column names as a list
+ :param metadata:
+        Optional SQLAlchemy metadata object that is used for the autoloaded
+        Table. If None is given, a new MetaData object is initialized within
+        this function.
+ :param options: Dictionary of configuration options
+ """
+ if metadata is None:
+ metadata = MetaData()
+ params = {
+ "tsvector_column": getattr(table.c, tsvector_column),
+ "indexed_columns": indexed_columns,
+ "options": options,
+ "conn": conn,
+ }
+ classes = [
+ DropSearchTriggerSQL,
+ DropSearchFunctionSQL,
+ CreateSearchFunctionSQL,
+ CreateSearchTriggerSQL,
+ ]
+ for class_ in classes:
+ sql = class_(**params)
+ conn.exec_driver_sql(str(sql), sql.params)
+ update_sql = table.update().values({indexed_columns[0]: text(indexed_columns[0])})
+ conn.execute(update_sql)
+
+
+def drop_trigger(
+ conn: Connection,
+ table_name: str,
+ tsvector_column: str,
+ metadata: MetaData | None = None,
+ options: dict[str, Any] | None = None,
+) -> None:
+ """
+ * Drops search trigger for given table (if it exists)
+ * Drops search function for given table (if it exists)
+
+
+ Example::
+
+ from alembic import op
+ from sqlalchemy_searchable import drop_trigger
+
+
+ def downgrade():
+ conn = op.get_bind()
+
+ drop_trigger(conn, 'article', 'search_vector')
+ op.drop_index('ix_article_search_vector', table_name='article')
+ op.drop_column('article', 'search_vector')
+
+ :param conn: SQLAlchemy Connection object
+ :param table_name: name of the table to apply search trigger dropping
+ :param tsvector_column:
+ TSVector typed column which is used as the search index column
+ :param metadata:
+        Optional SQLAlchemy metadata object that is used for the autoloaded
+        Table. If None is given, a new MetaData object is initialized within
+        this function.
+ :param options: Dictionary of configuration options
+ """
+ if metadata is None:
+ metadata = MetaData()
+ table = Table(table_name, metadata, autoload=True, autoload_with=conn)
+ params = {
+ "tsvector_column": getattr(table.c, tsvector_column),
+ "options": options,
+ "conn": conn,
+ }
+ classes = [
+ DropSearchTriggerSQL,
+ DropSearchFunctionSQL,
+ ]
+ for class_ in classes:
+ sql = class_(**params)
+ conn.exec_driver_sql(str(sql), sql.params)
+
+
+path = os.path.dirname(os.path.abspath(__file__))
+
+
+with open(os.path.join(path, "expressions.sql")) as file:
+ sql_expressions = DDL(file.read())
+
+
+def make_searchable(metadata, mapper=orm.mapper, manager=search_manager, options=None):
+ if options:
+ manager.options.update(options)
+ event.listen(mapper, "instrument_class", manager.process_mapper)
+ # event.listen(mapper, "after_configured", manager.attach_ddl_listeners)
+ event.listen(metadata, "before_create", sql_expressions)
+
+
+def remove_listeners(metadata, manager=search_manager, mapper=orm.mapper):
+ event.remove(mapper, "instrument_class", manager.process_mapper)
+ # event.remove(mapper, "after_configured", manager.attach_ddl_listeners)
+ manager.remove_listeners()
+ event.remove(metadata, "before_create", sql_expressions)
diff --git a/src/dispatch/common/utils/composite_search.py b/src/dispatch/search/fulltext/composite_search.py
similarity index 94%
rename from src/dispatch/common/utils/composite_search.py
rename to src/dispatch/search/fulltext/composite_search.py
index ef829475d5ee..27b91a30e1fc 100644
--- a/src/dispatch/common/utils/composite_search.py
+++ b/src/dispatch/search/fulltext/composite_search.py
@@ -34,9 +34,11 @@ def map_result(self, search_row, object):
s.search(query=q)
"""
+
+from collections import defaultdict
from sqlalchemy.sql.expression import literal
-from sqlalchemy_searchable import inspect_search_vectors, search
+from . import inspect_search_vectors, search
class CompositeSearch(object):
@@ -101,9 +103,9 @@ def search(self, query, by_type=True):
objects_by_type = self.load_search_objects(objects_by_model)
# mapping all to search result
+ objects = defaultdict(list)
if by_type:
- objects = []
for x in search_result:
if x.type in objects_by_type:
- objects.append(self.map_result(x, objects_by_type[x.type][x.id]))
+ objects[x.type].append(objects_by_type[x.type][x.id])
return objects
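The `CompositeSearch.search` change above replaces a flat result list with a `defaultdict(list)` keyed by entity type, so callers get results already bucketed. The grouping idiom in isolation:

```python
from collections import defaultdict

# search rows as (type, id) pairs, mimicking the shape of search_result rows
rows = [("Incident", 1), ("Document", 2), ("Incident", 3)]

objects = defaultdict(list)
for row_type, row_id in rows:
    objects[row_type].append(row_id)

print(dict(objects))  # → {'Incident': [1, 3], 'Document': [2]}
```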
diff --git a/src/dispatch/search/fulltext/expressions.sql b/src/dispatch/search/fulltext/expressions.sql
new file mode 100644
index 000000000000..c634e0eeb816
--- /dev/null
+++ b/src/dispatch/search/fulltext/expressions.sql
@@ -0,0 +1,222 @@
+DROP TYPE IF EXISTS tsq_state CASCADE;
+
+CREATE TYPE tsq_state AS (
+ search_query text,
+ parentheses_stack int,
+ skip_for int,
+ current_token text,
+ current_index int,
+ current_char text,
+ previous_char text,
+ tokens text[]
+);
+
+CREATE OR REPLACE FUNCTION tsq_append_current_token(state tsq_state)
+RETURNS tsq_state AS $$
+BEGIN
+ IF state.current_token != '' THEN
+ state.tokens := array_append(state.tokens, state.current_token);
+ state.current_token := '';
+ END IF;
+ RETURN state;
+END;
+$$ LANGUAGE plpgsql IMMUTABLE;
+
+
+CREATE OR REPLACE FUNCTION tsq_tokenize_character(state tsq_state)
+RETURNS tsq_state AS $$
+BEGIN
+ IF state.current_char = '(' THEN
+ state.tokens := array_append(state.tokens, '(');
+ state.parentheses_stack := state.parentheses_stack + 1;
+ state := tsq_append_current_token(state);
+ ELSIF state.current_char = ')' THEN
+ IF (state.parentheses_stack > 0 AND state.current_token != '') THEN
+ state := tsq_append_current_token(state);
+ state.tokens := array_append(state.tokens, ')');
+ state.parentheses_stack := state.parentheses_stack - 1;
+ END IF;
+ ELSIF state.current_char = '"' THEN
+ state.skip_for := position('"' IN substring(
+ state.search_query FROM state.current_index + 1
+ ));
+
+ IF state.skip_for > 1 THEN
+ state.tokens = array_append(
+ state.tokens,
+ substring(
+ state.search_query
+ FROM state.current_index FOR state.skip_for + 1
+ )
+ );
+ ELSIF state.skip_for = 0 THEN
+ state.current_token := state.current_token || state.current_char;
+ END IF;
+ ELSIF (
+ state.current_char = '-' AND
+ (state.current_index = 1 OR state.previous_char = ' ')
+ ) THEN
+ state.tokens := array_append(state.tokens, '-');
+ ELSIF state.current_char = ' ' THEN
+ state := tsq_append_current_token(state);
+ ELSE
+ state.current_token = state.current_token || state.current_char;
+ END IF;
+ RETURN state;
+END;
+$$ LANGUAGE plpgsql IMMUTABLE;
+
+
+CREATE OR REPLACE FUNCTION tsq_tokenize(search_query text) RETURNS text[] AS $$
+DECLARE
+ state tsq_state;
+BEGIN
+ SELECT
+ search_query::text AS search_query,
+ 0::int AS parentheses_stack,
+ 0 AS skip_for,
+ ''::text AS current_token,
+ 0 AS current_index,
+ ''::text AS current_char,
+ ''::text AS previous_char,
+ '{}'::text[] AS tokens
+ INTO state;
+
+ state.search_query := lower(trim(
+ regexp_replace(search_query, '""+', '""', 'g')
+ ));
+
+ FOR state.current_index IN (
+ SELECT generate_series(1, length(state.search_query))
+ ) LOOP
+ state.current_char := substring(
+ search_query FROM state.current_index FOR 1
+ );
+
+ IF state.skip_for > 0 THEN
+ state.skip_for := state.skip_for - 1;
+ CONTINUE;
+ END IF;
+
+ state := tsq_tokenize_character(state);
+ state.previous_char := state.current_char;
+ END LOOP;
+ state := tsq_append_current_token(state);
+
+ state.tokens := array_nremove(state.tokens, '(', -state.parentheses_stack);
+
+ RETURN state.tokens;
+END;
+$$ LANGUAGE plpgsql IMMUTABLE;
+
+
+-- Processes an array of text search tokens and returns a tsquery
+CREATE OR REPLACE FUNCTION tsq_process_tokens(config regconfig, tokens text[])
+RETURNS tsquery AS $$
+DECLARE
+ result_query text;
+ previous_value text;
+ value text;
+BEGIN
+ result_query := '';
+
+ FOREACH value IN ARRAY tokens LOOP
+ IF value = '"' THEN
+ CONTINUE;
+ END IF;
+
+ IF value = 'or' THEN
+ value := ' | ';
+ END IF;
+
+ IF left(value, 1) = '"' AND right(value, 1) = '"' THEN
+ value := phraseto_tsquery(config, value);
+ ELSIF value NOT IN ('(', ' | ', ')', '-') THEN
+ value := quote_literal(value) || ':*';
+ END IF;
+
+ IF previous_value = '-' THEN
+ IF value = '(' THEN
+ value := '!' || value;
+ ELSIF value = ' | ' THEN
+ CONTINUE;
+ ELSE
+ value := '!(' || value || ')';
+ END IF;
+ END IF;
+
+ SELECT
+ CASE
+ WHEN result_query = '' THEN value
+ WHEN previous_value = ' | ' AND value = ' | ' THEN result_query
+ WHEN previous_value = ' | ' THEN result_query || ' | ' || value
+ WHEN previous_value IN ('!(', '(') OR value = ')' THEN result_query || value
+ WHEN value != ' | ' THEN result_query || ' & ' || value
+ ELSE result_query
+ END
+ INTO result_query;
+
+ IF result_query = ' | ' THEN
+ result_query := '';
+ END IF;
+
+ previous_value := value;
+ END LOOP;
+
+ RETURN to_tsquery(config, result_query);
+END;
+$$ LANGUAGE plpgsql IMMUTABLE;
+
+
+CREATE OR REPLACE FUNCTION tsq_process_tokens(tokens text[])
+RETURNS tsquery AS $$
+ SELECT tsq_process_tokens(get_current_ts_config(), tokens);
+$$ LANGUAGE SQL IMMUTABLE;
+
+
+CREATE OR REPLACE FUNCTION tsq_parse(config regconfig, search_query text)
+RETURNS tsquery AS $$
+ SELECT tsq_process_tokens(config, tsq_tokenize(search_query));
+$$ LANGUAGE SQL IMMUTABLE;
+
+
+CREATE OR REPLACE FUNCTION tsq_parse(config text, search_query text)
+RETURNS tsquery AS $$
+ SELECT tsq_parse(config::regconfig, search_query);
+$$ LANGUAGE SQL IMMUTABLE;
+
+
+CREATE OR REPLACE FUNCTION tsq_parse(search_query text) RETURNS tsquery AS $$
+ SELECT tsq_parse(get_current_ts_config(), search_query);
+$$ LANGUAGE SQL IMMUTABLE;
+
+
+-- remove first N elements equal to the given value from the array (array
+-- must be one-dimensional)
+--
+-- If negative value is given as the third argument the removal of elements
+-- starts from the last array element.
+CREATE OR REPLACE FUNCTION array_nremove(anyarray, anyelement, int)
+RETURNS ANYARRAY AS $$
+ WITH replaced_positions AS (
+ SELECT UNNEST(
+ CASE
+ WHEN $2 IS NULL THEN
+ '{}'::int[]
+ WHEN $3 > 0 THEN
+ (array_positions($1, $2))[1:$3]
+ WHEN $3 < 0 THEN
+ (array_positions($1, $2))[
+ (cardinality(array_positions($1, $2)) + $3 + 1):
+ ]
+ ELSE
+ '{}'::int[]
+ END
+ ) AS position
+ )
+ SELECT COALESCE((
+ SELECT array_agg(value)
+ FROM unnest($1) WITH ORDINALITY AS t(value, index)
+ WHERE index NOT IN (SELECT position FROM replaced_positions)
+ ), $1[1:0]);
+$$ LANGUAGE SQL IMMUTABLE;
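The `tsq_tokenize` function above walks the query character by character, keeping double-quoted phrases intact and treating a `-` at a word boundary as negation. A loose Python approximation of just the tokenization step (parenthesis balancing and other edge cases omitted):

```python
import re

def tokenize(search_query: str) -> list[str]:
    """Simplified approximation of tsq_tokenize: phrases, negation, words."""
    # quoted phrases | lone '-' at a word boundary | bare words
    pattern = r'"[^"]*"|(?<!\S)-|\S+'
    return re.findall(pattern, search_query.lower().strip())

print(tokenize("star wars"))         # → ['star', 'wars']
print(tokenize('"star wars" -jar'))  # → ['"star wars"', '-', 'jar']
```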
diff --git a/src/dispatch/search/fulltext/vectorizers.py b/src/dispatch/search/fulltext/vectorizers.py
new file mode 100644
index 000000000000..f9ed0399a033
--- /dev/null
+++ b/src/dispatch/search/fulltext/vectorizers.py
@@ -0,0 +1,189 @@
+"""
+Originally authored by:
+https://github.com/kvesteri/sqlalchemy-searchable/blob/master/sqlalchemy_searchable/vectorizers.py
+
+Vectorizers provide means for changing the way how different column types and
+columns are turned into fulltext search vectors.
+
+Type vectorizers
+----------------
+
+By default PostgreSQL only knows how to vectorize string columns. If your model
+contains, for example, an HSTORE column that you would like to fulltext index,
+you need to define a special vectorization rule for it.
+
+The easiest way to add a vectorization rule is by using the vectorizer
+decorator. In the following example we vectorize only the values of all HSTORE
+typed columns our models may have.
+
+::
+
+ import sqlalchemy as sa
+ from sqlalchemy.dialects.postgresql import HSTORE
+ from sqlalchemy_searchable import vectorizer
+
+
+ @vectorizer(HSTORE)
+ def hstore_vectorizer(column):
+ return sa.cast(sa.func.avals(column), sa.Text)
+
+
+The SQLAlchemy clause construct returned by the vectorizer will be used for all
+fulltext indexed columns that are of type HSTORE. Consider the following
+model::
+
+
+ class Article(Base):
+ __tablename__ = 'article'
+
+ id = sa.Column(sa.Integer)
+ name_translations = sa.Column(HSTORE)
+ content_translations = sa.Column(HSTORE)
+
+
+Now SQLAlchemy-Searchable would create the following search trigger for this
+model (with default configuration)
+
+.. code-block:: sql
+
+
+ CREATE FUNCTION
+ textitem_search_vector_update() RETURNS TRIGGER AS $$
+ BEGIN
+ NEW.search_vector = to_tsvector(
+ 'simple',
+ concat(
+ regexp_replace(
+ coalesce(
+ CAST(avals(NEW.name_translations) AS TEXT),
+ ''
+ ),
+ '[-@.]', ' ', 'g'
+ ),
+ ' ',
+ regexp_replace(
+ coalesce(
+ CAST(avals(NEW.content_translations) AS TEXT),
+ ''
+ ),
+ '[-@.]', ' ', 'g'),
+ ' '
+ )
+ );
+ RETURN NEW;
+ END
+ $$ LANGUAGE 'plpgsql';
+
+
+Column vectorizers
+------------------
+
+Sometimes you may want to set a special vectorizer for only a specific column.
+This can be achieved as follows::
+
+
+ class Article(Base):
+ __tablename__ = 'article'
+
+ id = sa.Column(sa.Integer)
+ name_translations = sa.Column(HSTORE)
+
+
+ @vectorizer(Article.name_translations)
+ def name_vectorizer(column):
+ return sa.cast(sa.func.avals(column), sa.Text)
+
+
+.. note::
+
+ Column vectorizers always have precedence over type vectorizers.
+"""
+
+from functools import wraps
+from inspect import isclass
+
+import sqlalchemy as sa
+from sqlalchemy.orm.attributes import InstrumentedAttribute
+from sqlalchemy.sql.type_api import TypeEngine
+
+
+class Vectorizer(object):
+ def __init__(self, type_vectorizers=None, column_vectorizers=None):
+ self.type_vectorizers = {} if type_vectorizers is None else type_vectorizers
+ self.column_vectorizers = {} if column_vectorizers is None else column_vectorizers
+
+ def clear(self):
+ self.type_vectorizers = {}
+ self.column_vectorizers = {}
+
+ def contains_tsvector(self, tsvector_column):
+ if not hasattr(tsvector_column.type, "columns"):
+ return False
+ return any(
+ getattr(tsvector_column.table.c, column) in self
+ for column in tsvector_column.type.columns
+ )
+
+ def __contains__(self, column):
+ try:
+ self[column]
+ return True
+ except KeyError:
+ return False
+
+ def __getitem__(self, column):
+ if column in self.column_vectorizers:
+ return self.column_vectorizers[column]
+ type_class = column.type.__class__
+
+ if type_class in self.type_vectorizers:
+ return self.type_vectorizers[type_class]
+ raise KeyError(column)
+
+ def __call__(self, type_or_column):
+ """
+ Decorator that marks given function as vectorizer for given column or
+ type.
+
+ In the following example we add vectorizer for HSTORE type.
+
+ ::
+
+ from sqlalchemy.dialects.postgresql import HSTORE
+
+
+ @vectorizer(HSTORE)
+ def hstore_vectorizer(column):
+ return sa.func.avals(column)
+
+ """
+
+ def outer(func):
+ @wraps(func)
+ def wrapper(*args, **kwargs):
+ return func(*args, **kwargs)
+
+ if isclass(type_or_column) and issubclass(type_or_column, TypeEngine):
+ self.type_vectorizers[type_or_column] = wrapper
+ elif isinstance(type_or_column, sa.Column):
+ self.column_vectorizers[type_or_column] = wrapper
+ elif isinstance(type_or_column, InstrumentedAttribute):
+ prop = type_or_column.property
+ if not isinstance(prop, sa.orm.ColumnProperty):
+ raise TypeError(
+ "Given InstrumentedAttribute does not wrap "
+ "ColumnProperty. Only instances of ColumnProperty are "
+ "supported for vectorizer."
+ )
+ column = type_or_column.property.columns[0]
+
+ self.column_vectorizers[column] = wrapper
+ else:
+ raise TypeError(
+ "First argument should be either valid SQLAlchemy type, "
+ "Column, ColumnProperty or InstrumentedAttribute object."
+ )
+
+ return wrapper
+
+ return outer
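The note above states that column vectorizers always take precedence over type vectorizers. A tiny registry sketch of that precedence rule, using plain strings and types instead of SQLAlchemy column objects:

```python
class MiniVectorizer:
    """Registry sketch: column-specific rules beat type-wide rules."""

    def __init__(self):
        self.type_vectorizers = {}
        self.column_vectorizers = {}

    def __call__(self, type_or_column):
        def outer(func):
            if isinstance(type_or_column, type):
                self.type_vectorizers[type_or_column] = func
            else:
                self.column_vectorizers[type_or_column] = func
            return func
        return outer

    def lookup(self, column_name, column_type):
        # column vectorizers always have precedence over type vectorizers
        if column_name in self.column_vectorizers:
            return self.column_vectorizers[column_name]
        return self.type_vectorizers[column_type]

vec = MiniVectorizer()

@vec(dict)  # type-wide rule: applies to any dict-typed column
def dict_rule(column):
    return f"type-rule({column})"

@vec("name_translations")  # column rule for one specific column
def name_rule(column):
    return f"column-rule({column})"

print(vec.lookup("name_translations", dict)("x"))     # → column-rule(x)
print(vec.lookup("content_translations", dict)("y"))  # → type-rule(y)
```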
diff --git a/src/dispatch/search/models.py b/src/dispatch/search/models.py
index 64d101e76a78..7cb40670f16a 100644
--- a/src/dispatch/search/models.py
+++ b/src/dispatch/search/models.py
@@ -1,21 +1,49 @@
-from typing import Any, List, Optional
-
+"""Models for search functionality in the Dispatch application."""
+
+from pydantic import ConfigDict, Field
+from typing import ClassVar
+from dispatch.case.models import CaseRead
+from dispatch.data.query.models import QueryRead
+from dispatch.data.source.models import SourceRead
+from dispatch.definition.models import DefinitionRead
+from dispatch.document.models import DocumentRead
+from dispatch.incident.models import IncidentRead
+from dispatch.individual.models import IndividualContactRead
from dispatch.models import DispatchBase
-
+from dispatch.service.models import ServiceRead
+from dispatch.tag.models import TagRead
+from dispatch.task.models import TaskRead
+from dispatch.team.models import TeamContactRead
+from dispatch.term.models import TermRead
# Pydantic models...
class SearchBase(DispatchBase):
- query: Optional[str]
+ """Base model for search queries."""
+ query: str | None = None
class SearchRequest(SearchBase):
- pass
-
-
-class ContentBase(DispatchBase):
- type: str
- content: Any # Union[TermRead, DefinitionRead, IndividualContactRead, TeamContactRead]
-
-
-class SearchResponse(SearchBase):
- results: List[ContentBase]
+ """Model for a search request."""
+
+
+class ContentResponse(DispatchBase):
+ """Model for search content response."""
+ documents: list[DocumentRead] | None = Field(default_factory=list, alias="Document")
+ incidents: list[IncidentRead] | None = Field(default_factory=list, alias="Incident")
+ tasks: list[TaskRead] | None = Field(default_factory=list, alias="Task")
+ tags: list[TagRead] | None = Field(default_factory=list, alias="Tag")
+ terms: list[TermRead] | None = Field(default_factory=list, alias="Term")
+ definitions: list[DefinitionRead] | None = Field(default_factory=list, alias="Definition")
+ sources: list[SourceRead] | None = Field(default_factory=list, alias="Source")
+ queries: list[QueryRead] | None = Field(default_factory=list, alias="Query")
+ teams: list[TeamContactRead] | None = Field(default_factory=list, alias="TeamContact")
+ individuals: list[IndividualContactRead] | None = Field(default_factory=list, alias="IndividualContact")
+ services: list[ServiceRead] | None = Field(default_factory=list, alias="Service")
+ cases: list[CaseRead] | None = Field(default_factory=list, alias="Case")
+ model_config: ClassVar[ConfigDict] = ConfigDict(populate_by_name=True)
+
+
+class SearchResponse(DispatchBase):
+ """Model for a search response."""
+ query: str | None = None
+ results: ContentResponse
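Composite search returns results keyed by model class name ("Incident", "Document", ...), which is why `ContentResponse` declares aliases and `populate_by_name`. A plain-dict sketch of that alias mapping (subset of the fields, hypothetical helper name):

```python
# Maps class-name keys produced by the search layer to the snake_case
# field names exposed by ContentResponse.
ALIASES = {
    "Document": "documents",
    "Incident": "incidents",
    "Task": "tasks",
    "Case": "cases",
}


def remap_results(raw):
    """Rename class-keyed search results to their field-name equivalents."""
    return {ALIASES.get(key, key): value for key, value in raw.items()}
```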
diff --git a/src/dispatch/search/service.py b/src/dispatch/search/service.py
deleted file mode 100644
index d790d60adebf..000000000000
--- a/src/dispatch/search/service.py
+++ /dev/null
@@ -1,19 +0,0 @@
-from typing import List
-
-from sqlalchemy_searchable import search as search_db
-
-from dispatch.common.utils.composite_search import CompositeSearch
-from dispatch.database import Base
-
-
-def composite_search(*, db_session, query_str: str, models: List[Base]):
- """Perform a multi-table search based on the supplied query."""
- s = CompositeSearch(db_session, models)
- q = s.build_query(query_str, sort=True)
- return s.search(query=q)
-
-
-def search(*, db_session, query_str: str, model: Base):
- """Perform a search based on the query."""
- q = db_session.query(model)
- return search_db(q, query_str, sort=True)
diff --git a/src/dispatch/search/utils.py b/src/dispatch/search/utils.py
new file mode 100644
index 000000000000..5247dceb9c40
--- /dev/null
+++ b/src/dispatch/search/utils.py
@@ -0,0 +1,57 @@
+def create_filter_expression(filters: dict, model: str) -> list[dict]:
+    """Python implementation of @/search/utils/createFilterExpression"""
+
+    filter_expression = []
+    for key, value in filters.items():
+        sub_filter = []
+
+        # Check if a time window is specified
+        if isinstance(value, dict) and "start" in value:
+            if value["start"]:
+                sub_filter.append(
+                    {
+                        "and": [
+                            {"model": model, "field": key, "op": ">=", "value": value["start"]},
+                            {"model": model, "field": key, "op": "<=", "value": value["end"]},
+                        ]
+                    }
+                )
+        else:
+            for val in value:
+                if not val:
+                    continue
+                # Check if the filter is being applied to an id
+                if isinstance(val, dict) and "id" in val:
+                    sub_filter.append(
+                        {"model": key.title(), "field": "id", "op": "==", "value": val["id"]}
+                    )
+                # Check if the filter is being applied to a name
+                elif isinstance(val, dict) and "name" in val:
+                    sub_filter.append(
+                        {"model": key.title(), "field": "name", "op": "==", "value": val["name"]}
+                    )
+                # Check if the filter is being applied to a different model
+                elif isinstance(val, dict) and "model" in val:
+                    if val["value"]:
+                        sub_filter.append(
+                            {
+                                "model": val["model"],
+                                "field": val["field"],
+                                "op": "==",
+                                "value": val["value"],
+                            }
+                        )
+                # If no special condition is met, apply the filter to the current model
+                else:
+                    sub_filter.append({"model": model, "field": key, "op": "==", "value": val})
+
+        # Only add the sub_filter to filter_expression if it has any filters in it
+        if sub_filter:
+            # If the key is "visibility", use "and" as the condition
+            if key == "visibility":
+                filter_expression.append({"and": sub_filter})
+            # Use "or" as the condition for all other filters
+            else:
+                filter_expression.append({"or": sub_filter})
+
+    return filter_expression
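To make the emitted shape concrete, here is a condensed, dependency-free restatement of the branches above (id and plain-value cases only, hypothetical function name) with a sample call:

```python
def build_expression(filters, model):
    """Condensed illustration of create_filter_expression's output shape."""
    expression = []
    for key, value in filters.items():
        sub = []
        if isinstance(value, dict) and "start" in value:
            if value["start"]:
                sub.append({"and": [
                    {"model": model, "field": key, "op": ">=", "value": value["start"]},
                    {"model": model, "field": key, "op": "<=", "value": value["end"]},
                ]})
        else:
            for val in value:
                if not val:
                    continue
                if isinstance(val, dict) and "id" in val:
                    # related-model filter keyed by id
                    sub.append({"model": key.title(), "field": "id", "op": "==", "value": val["id"]})
                else:
                    # plain value applied to the current model
                    sub.append({"model": model, "field": key, "op": "==", "value": val})
        if sub:
            expression.append({"and": sub} if key == "visibility" else {"or": sub})
    return expression


expr = build_expression({"tag": [{"id": 7}], "status": ["Active"]}, "Case")
```

The resulting list is the filter-spec format consumed by sqlalchemy-filters' `apply_filters`.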
diff --git a/src/dispatch/search/views.py b/src/dispatch/search/views.py
index 745c23005877..8ff9a73e00c8 100644
--- a/src/dispatch/search/views.py
+++ b/src/dispatch/search/views.py
@@ -1,33 +1,58 @@
-from typing import List
+from fastapi import APIRouter
+from fastapi.params import Query
+from starlette.responses import JSONResponse
-from fastapi import APIRouter, Depends
-from sqlalchemy.orm import Session
+from dispatch.database.core import get_class_by_tablename
+from dispatch.database.service import composite_search
+from dispatch.database.service import CommonParameters
+from dispatch.enums import SearchTypes, UserRoles
+from dispatch.enums import Visibility
-from dispatch.database import get_class_by_tablename, get_db
-from dispatch.enums import SearchTypes
-
-from .models import SearchResponse
-from .service import composite_search
+from .models import SearchResponse
router = APIRouter()
-@router.get("/", response_model=SearchResponse)
+@router.get("", response_class=JSONResponse)
def search(
- *,
- db_session: Session = Depends(get_db),
- skip: int = 0,
- limit: int = 10,
- q: str = None,
- type: List[SearchTypes] = SearchTypes,
+ common: CommonParameters,
+ type: list[SearchTypes] = Query(..., alias="type[]"),
):
- """
- Perform a search.
- """
- if q:
- models = [get_class_by_tablename(t.name) for t in type]
- results = composite_search(db_session=db_session, query_str=q, models=models)
+ """Perform a search."""
+ if common["query_str"]:
+ models = [get_class_by_tablename(t) for t in type]
+ results = composite_search(
+ db_session=common["db_session"],
+ query_str=common["query_str"],
+ models=models,
+ current_user=common["current_user"],
+ )
+ # add a filter for restricted incidents
+ admin_projects = []
+ for p in common["current_user"].projects:
+ if p.role == UserRoles.admin:
+ admin_projects.append(p)
+
+ filtered_incidents = []
+ current_user_email = common["current_user"].email
+        for incident in results.get("Incident", []):
+ participant_emails: list[str] = [
+ participant.individual.email
+ for participant in incident.participants
+ if participant.individual
+ ]
+ if (
+ incident.project in admin_projects
+ or current_user_email in participant_emails
+ or incident.visibility == Visibility.open
+ ):
+ filtered_incidents.append(incident)
+
+ results["Incident"] = filtered_incidents
+
     else:
-        results = []
+        results = {}
- return {"query": q, "results": results}
+ return SearchResponse(**{"query": common["query_str"], "results": results}).dict(by_alias=False)
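The incident filtering added above boils down to a three-way access predicate: project admin, participant, or open visibility. A standalone sketch with plain dicts (hypothetical field names):

```python
def can_view_incident(incident, user_email, admin_projects):
    """True if the user is a project admin, a participant, or the incident is open."""
    return (
        incident["project"] in admin_projects
        or user_email in incident["participant_emails"]
        or incident["visibility"] == "open"
    )


visible = can_view_incident(
    {"project": "sec", "participant_emails": [], "visibility": "restricted"},
    "alice@example.com",
    admin_projects={"sec"},
)
```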
diff --git a/src/dispatch/search_filter/__init__.py b/src/dispatch/search_filter/__init__.py
new file mode 100644
index 000000000000..e69de29bb2d1
diff --git a/src/dispatch/search_filter/models.py b/src/dispatch/search_filter/models.py
new file mode 100644
index 000000000000..5bd0e0864ee5
--- /dev/null
+++ b/src/dispatch/search_filter/models.py
@@ -0,0 +1,96 @@
+from datetime import datetime
+
+from sqlalchemy import Column, ForeignKey, Integer, String, Boolean
+from sqlalchemy.orm import relationship
+from sqlalchemy.sql.schema import UniqueConstraint
+from sqlalchemy.sql.sqltypes import JSON
+from sqlalchemy_utils import TSVectorType
+
+from dispatch.auth.models import DispatchUser, UserRead
+from dispatch.database.core import Base
+from dispatch.enums import DispatchEnum
+from dispatch.models import (
+ DispatchBase,
+ NameStr,
+ PrimaryKey,
+ ProjectMixin,
+ TimeStampMixin,
+ Pagination,
+)
+from dispatch.project.models import ProjectRead
+
+
+class SearchFilterSubject(DispatchEnum):
+ case = "case"
+ incident = "incident"
+
+
+class SearchFilter(Base, ProjectMixin, TimeStampMixin):
+ __table_args__ = (UniqueConstraint("name", "project_id"),)
+
+ id = Column(Integer, primary_key=True)
+ name = Column(String)
+ description = Column(String)
+ expression = Column(JSON, nullable=False, default=[])
+ creator_id = Column(Integer, ForeignKey(DispatchUser.id))
+ creator = relationship("DispatchUser", backref="search_filters")
+ subject = Column(String, default="incident")
+ enabled = Column(Boolean, default=True)
+
+ search_vector = Column(
+ TSVectorType("name", "description", weights={"name": "A", "description": "B"})
+ )
+
+
+# Pydantic models...
+class IndividualContactRead(DispatchBase):
+ id: PrimaryKey | None = None
+ name: str
+ email: str
+
+
+class TeamRead(DispatchBase):
+ id: PrimaryKey | None = None
+ name: str
+
+
+class ServiceRead(DispatchBase):
+ id: PrimaryKey | None = None
+ name: str
+
+
+class NotificationRead(DispatchBase):
+ id: PrimaryKey | None = None
+ name: str
+
+
+class SearchFilterBase(DispatchBase):
+ description: str | None = None
+ enabled: bool | None = None
+ expression: list[dict]
+ name: NameStr
+ subject: SearchFilterSubject = SearchFilterSubject.incident
+
+
+class SearchFilterCreate(SearchFilterBase):
+ project: ProjectRead
+
+
+class SearchFilterUpdate(SearchFilterBase):
+    id: PrimaryKey | None = None
+
+
+class SearchFilterRead(SearchFilterBase):
+ id: PrimaryKey
+ created_at: datetime | None = None
+ updated_at: datetime | None = None
+ project: ProjectRead | None = None
+ creator: UserRead | None = None
+ individuals: list[IndividualContactRead | None] = []
+ notifications: list[NotificationRead | None] = []
+ services: list[ServiceRead | None] = []
+ teams: list[TeamRead | None] = []
+
+
+class SearchFilterPagination(Pagination):
+ items: list[SearchFilterRead]
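The `expression` JSON column (and the matching `expression: list[dict]` Pydantic field) holds a sqlalchemy-filters specification. A hypothetical example of one stored expression, selecting active incidents tagged "phishing":

```python
# Hypothetical stored filter expression; model/field names are illustrative.
expression = [
    {
        "and": [
            {"model": "Incident", "field": "status", "op": "==", "value": "Active"},
            {
                "or": [
                    {"model": "Tag", "field": "name", "op": "==", "value": "phishing"},
                ]
            },
        ]
    }
]
```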
diff --git a/src/dispatch/search_filter/permissions.py b/src/dispatch/search_filter/permissions.py
new file mode 100644
index 000000000000..763fff65b7fa
--- /dev/null
+++ b/src/dispatch/search_filter/permissions.py
@@ -0,0 +1,35 @@
+from starlette.requests import Request
+
+from dispatch.auth.service import get_current_user
+from dispatch.auth.permissions import BasePermission, OrganizationAdminPermission, any_permission
+
+from .service import get
+
+
+class SearchFilterEditDeletePermission(BasePermission):
+ def has_required_permissions(
+ self,
+ request: Request,
+ ) -> bool:
+ """
+ Permissions class that checks if the user is the filter creator
+ or has Dispatch admin or greater permissions.
+ """
+ search_filter = get(
+ db_session=request.state.db, search_filter_id=request.path_params["search_filter_id"]
+ )
+
+ if not search_filter:
+ return False
+
+ current_user = get_current_user(request=request)
+
+ if current_user.email == search_filter.creator.email:
+ return True
+
+ return any_permission(
+ permissions=[
+ OrganizationAdminPermission,
+ ],
+ request=request,
+ )
diff --git a/src/dispatch/search_filter/service.py b/src/dispatch/search_filter/service.py
new file mode 100644
index 000000000000..67ef809671cd
--- /dev/null
+++ b/src/dispatch/search_filter/service.py
@@ -0,0 +1,94 @@
+
+from sqlalchemy_filters import apply_filters
+
+from dispatch.database.core import Base, get_class_by_tablename, get_table_name_by_class_instance
+from dispatch.database.service import apply_filter_specific_joins
+from dispatch.project import service as project_service
+
+from .models import SearchFilter, SearchFilterCreate, SearchFilterUpdate
+
+
+def get(*, db_session, search_filter_id: int) -> SearchFilter | None:
+ """Gets a search filter by id."""
+ return db_session.query(SearchFilter).filter(SearchFilter.id == search_filter_id).first()
+
+
+def get_by_name(*, db_session, project_id: int, name: str) -> SearchFilter | None:
+ """Gets a search filter by name."""
+ return (
+ db_session.query(SearchFilter)
+ .filter(SearchFilter.name == name)
+ .filter(SearchFilter.project_id == project_id)
+ .first()
+ )
+
+
+def match(*, db_session, subject: str, filter_spec: list[dict], class_instance: Base):
+ """Matches a class instance with a given search filter."""
+ table_name = get_table_name_by_class_instance(class_instance)
+
+ # this filter doesn't apply to the current class_instance
+ if table_name != subject:
+ return
+
+ model_cls = get_class_by_tablename(table_name)
+ query = db_session.query(model_cls)
+
+ query = apply_filter_specific_joins(model_cls, filter_spec, query)
+ query = apply_filters(query, filter_spec)
+ return query.filter(model_cls.id == class_instance.id).one_or_none()
+
+
+def get_or_create(*, db_session, search_filter_in, creator=None) -> SearchFilter:
+    """Gets an existing search filter or creates a new one."""
+    if search_filter_in.id:
+        q = db_session.query(SearchFilter).filter(SearchFilter.id == search_filter_in.id)
+    else:
+        q = db_session.query(SearchFilter).filter_by(**search_filter_in.dict(exclude={"id"}))
+
+    instance = q.first()
+    if instance:
+        return instance
+
+    return create(db_session=db_session, creator=creator, search_filter_in=search_filter_in)
+
+
+def get_all(*, db_session):
+ """Gets all search filters."""
+ return db_session.query(SearchFilter)
+
+
+def create(*, db_session, creator, search_filter_in: SearchFilterCreate) -> SearchFilter:
+ """Creates a new search filter."""
+ project = project_service.get_by_name_or_raise(
+ db_session=db_session, project_in=search_filter_in.project
+ )
+ search_filter = SearchFilter(
+ **search_filter_in.dict(exclude={"project"}), project=project, creator=creator
+ )
+ db_session.add(search_filter)
+ db_session.commit()
+ return search_filter
+
+
+def update(
+ *, db_session, search_filter: SearchFilter, search_filter_in: SearchFilterUpdate
+) -> SearchFilter:
+ """Updates a search filter."""
+ search_filter_data = search_filter.dict()
+ update_data = search_filter_in.dict(exclude_unset=True)
+
+ for field in search_filter_data:
+ if field in update_data:
+ setattr(search_filter, field, update_data[field])
+
+ db_session.commit()
+ return search_filter
+
+
+def delete(*, db_session, search_filter_id: int):
+ """Deletes a search filter."""
+ search_filter = (
+ db_session.query(SearchFilter).filter(SearchFilter.id == search_filter_id).first()
+ )
+ db_session.delete(search_filter)
+ db_session.commit()
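The `update()` function above follows the common `exclude_unset` merge pattern: only fields the caller actually provided are copied onto the stored row. A plain-dict sketch of that merge (hypothetical helper name):

```python
def merge_update(current, update):
    """Copy only the fields present in `update` onto a copy of `current`."""
    return {key: update.get(key, value) for key, value in current.items()}


merged = merge_update(
    {"name": "high-sev", "enabled": True, "description": ""},
    {"enabled": False},  # caller only set `enabled`
)
```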
diff --git a/src/dispatch/search_filter/views.py b/src/dispatch/search_filter/views.py
new file mode 100644
index 000000000000..4da5553f147b
--- /dev/null
+++ b/src/dispatch/search_filter/views.py
@@ -0,0 +1,114 @@
+from fastapi import APIRouter, HTTPException, status, Depends
+from pydantic import ValidationError
+
+from sqlalchemy.exc import IntegrityError
+
+from dispatch.auth.permissions import PermissionsDependency
+from dispatch.auth.service import CurrentUser
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.models import PrimaryKey
+
+from .models import (
+ SearchFilterCreate,
+ SearchFilterPagination,
+ SearchFilterRead,
+ SearchFilterUpdate,
+)
+from .permissions import SearchFilterEditDeletePermission
+from .service import create, delete, get, update
+
+
+router = APIRouter()
+
+
+@router.get("", response_model=SearchFilterPagination)
+def get_filters(common: CommonParameters):
+ """Retrieve filters."""
+ return search_filter_sort_paginate(model="SearchFilter", **common)
+
+
+@router.get("/{search_filter_id}", response_model=SearchFilterRead)
+def get_search_filter(
+ db_session: DbSession,
+ search_filter_id: PrimaryKey,
+):
+ """Get a search filter by id."""
+ search_filter = get(db_session=db_session, search_filter_id=search_filter_id)
+ if not search_filter:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A search filter with this id does not exist."}],
+ )
+ return search_filter
+
+
+@router.post("", response_model=SearchFilterRead)
+def create_search_filter(
+ db_session: DbSession,
+ search_filter_in: SearchFilterCreate,
+ current_user: CurrentUser,
+):
+ """Create a new filter."""
+ try:
+ return create(
+ db_session=db_session, creator=current_user, search_filter_in=search_filter_in
+ )
+ except IntegrityError:
+ raise ValidationError(
+ [
+ {
+ "msg": "A search filter with this name already exists.",
+ "loc": "name",
+ }
+ ],
+ ) from None
+
+
+@router.put(
+ "/{search_filter_id}",
+ response_model=SearchFilterRead,
+ dependencies=[Depends(PermissionsDependency([SearchFilterEditDeletePermission]))],
+)
+def update_search_filter(
+ db_session: DbSession,
+ search_filter_id: PrimaryKey,
+ search_filter_in: SearchFilterUpdate,
+):
+ """Update a search filter."""
+ search_filter = get(db_session=db_session, search_filter_id=search_filter_id)
+ if not search_filter:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A search filter with this id does not exist."}],
+ )
+ try:
+ search_filter = update(
+ db_session=db_session, search_filter=search_filter, search_filter_in=search_filter_in
+ )
+ except IntegrityError:
+ raise ValidationError(
+ [
+ {
+ "msg": "A search filter with this name already exists.",
+ "loc": "name",
+ }
+ ],
+ ) from None
+ return search_filter
+
+
+@router.delete(
+ "/{search_filter_id}",
+ response_model=None,
+ dependencies=[Depends(PermissionsDependency([SearchFilterEditDeletePermission]))],
+)
+def delete_filter(db_session: DbSession, search_filter_id: PrimaryKey):
+ """Delete a search filter."""
+ search_filter = get(db_session=db_session, search_filter_id=search_filter_id)
+ if not search_filter:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A search filter with this id does not exist."}],
+ )
+ delete(db_session=db_session, search_filter_id=search_filter_id)
diff --git a/src/dispatch/service/flows.py b/src/dispatch/service/flows.py
new file mode 100644
index 000000000000..29859175c9ca
--- /dev/null
+++ b/src/dispatch/service/flows.py
@@ -0,0 +1,34 @@
+"""
+.. module: dispatch.service.flows
+ :platform: Unix
+ :copyright: (c) 2019 by Netflix Inc., see AUTHORS for more
+ :license: Apache, see LICENSE for more details.
+"""
+
+import logging
+
+from sqlalchemy.orm import Session
+
+from dispatch.plugin import service as plugin_service
+from dispatch.service.models import Service
+
+
+log = logging.getLogger(__name__)
+
+
+def resolve_oncall(service: Service, db_session: Session) -> str:
+ """Uses the active oncall plugin to resolve a given oncall service to its email address."""
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session, project_id=service.project.id, plugin_type="oncall"
+ )
+ if not plugin:
+ log.warning("Oncall not resolved. No oncall plugin enabled.")
+ return
+
+ email_address = plugin.instance.get(service_id=service.external_id)
+
+ if not email_address:
+ log.warning("Email address for oncall service not returned.")
+ return
+
+ return email_address
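`resolve_oncall` degrades to `None` both when no oncall plugin is enabled and when the plugin returns no address. A stub-based sketch of that control flow (fake plugin object, hypothetical ids):

```python
class FakeOncallPlugin:
    """Stand-in for a plugin instance exposing get(service_id=...)."""

    def __init__(self, directory):
        self.directory = directory

    def get(self, service_id):
        return self.directory.get(service_id)


def resolve(plugin, external_id):
    """Mirrors resolve_oncall: missing plugin or empty lookup yields None."""
    if plugin is None:
        return None  # no oncall plugin enabled
    return plugin.get(service_id=external_id) or None


email = resolve(FakeOncallPlugin({"PD123": "oncall@example.com"}), "PD123")
```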
diff --git a/src/dispatch/service/models.py b/src/dispatch/service/models.py
index 3c9bee76f546..827a35efa4e3 100644
--- a/src/dispatch/service/models.py
+++ b/src/dispatch/service/models.py
@@ -1,105 +1,71 @@
from datetime import datetime
-from typing import List, Optional
+from pydantic import Field
+from dispatch.models import EvergreenBase, EvergreenMixin, PrimaryKey
from sqlalchemy import Boolean, Column, ForeignKey, Integer, PrimaryKeyConstraint, String, Table
-from sqlalchemy.orm import backref, relationship
+from sqlalchemy.orm import relationship
+from sqlalchemy.sql.schema import UniqueConstraint
from sqlalchemy_utils import TSVectorType
-from dispatch.database import Base
-from dispatch.incident_priority.models import IncidentPriorityCreate, IncidentPriorityRead
-from dispatch.incident_type.models import IncidentTypeCreate, IncidentTypeRead
-from dispatch.models import DispatchBase, TermReadNested, TimeStampMixin
-from dispatch.term.models import TermCreate
+from dispatch.database.core import Base
+from dispatch.models import TimeStampMixin, ProjectMixin, Pagination
+from dispatch.project.models import ProjectRead
+from dispatch.search_filter.models import SearchFilterRead
-# Association tables for many to many relationships
-assoc_service_incident_priorities = Table(
- "service_incident_priority",
- Base.metadata,
- Column("incident_priority_id", Integer, ForeignKey("incident_priority.id")),
- Column("service_id", Integer, ForeignKey("service.id")),
- PrimaryKeyConstraint("incident_priority_id", "service_id"),
-)
-assoc_service_incident_types = Table(
- "service_incident_type",
+assoc_service_filters = Table(
+ "assoc_service_filters",
Base.metadata,
- Column("incident_type_id", Integer, ForeignKey("incident_type.id")),
- Column("service_id", Integer, ForeignKey("service.id")),
- PrimaryKeyConstraint("incident_type_id", "service_id"),
-)
-
-assoc_service_incidents = Table(
- "service_incident",
- Base.metadata,
- Column("incident_id", Integer, ForeignKey("incident.id")),
- Column("service_id", Integer, ForeignKey("service.id")),
- PrimaryKeyConstraint("incident_id", "service_id"),
-)
-
-assoc_service_terms = Table(
- "service_terms",
- Base.metadata,
- Column("term_id", Integer, ForeignKey("term.id")),
- Column("service_id", Integer, ForeignKey("service.id")),
- PrimaryKeyConstraint("term_id", "service_id"),
+ Column("service_id", Integer, ForeignKey("service.id", ondelete="CASCADE")),
+ Column("search_filter_id", Integer, ForeignKey("search_filter.id", ondelete="CASCADE")),
+ PrimaryKeyConstraint("service_id", "search_filter_id"),
)
# SQLAlchemy models...
-class Service(TimeStampMixin, Base):
+class Service(Base, TimeStampMixin, ProjectMixin, EvergreenMixin):
+ __table_args__ = (UniqueConstraint("external_id", "project_id"),)
id = Column(Integer, primary_key=True)
is_active = Column(Boolean, default=True)
name = Column(String)
type = Column(String, default="pagerduty-oncall")
+ description = Column(String)
external_id = Column(String)
- incidents = relationship("Incident", secondary=assoc_service_incidents, backref="services")
- incident_types = relationship("IncidentType")
- incident_priorities = relationship(
- "IncidentPriority", secondary=assoc_service_incident_priorities, backref="services"
- )
- incident_types = relationship(
- "IncidentType", secondary=assoc_service_incident_types, backref="services"
- )
- terms = relationship(
- "Term", secondary=assoc_service_terms, backref=backref("services", cascade="all")
- )
+ health_metrics = Column(Boolean, default=False)
+ shift_hours_type = Column(Integer, default=24)
+
+ # Relationships
+ filters = relationship("SearchFilter", secondary=assoc_service_filters, backref="services")
- search_vector = Column(TSVectorType("name"))
+ search_vector = Column(TSVectorType("name", regconfig="pg_catalog.simple"))
# Pydantic models...
-class ServiceBase(DispatchBase):
- name: Optional[str] = None
- external_id: Optional[str] = None
- is_active: Optional[bool] = None
- type: Optional[str] = None
+class ServiceBase(EvergreenBase):
+ description: str | None = None
+ external_id: str | None = None
+ health_metrics: bool | None = None
+ is_active: bool | None = None
+ name: str | None = None
+ type: str | None = None
+ shift_hours_type: int | None = Field(24, nullable=True)
class ServiceCreate(ServiceBase):
- terms: Optional[List[TermCreate]] = []
- incident_priorities: Optional[List[IncidentPriorityCreate]] = []
- incident_types: Optional[List[IncidentTypeCreate]] = []
+ filters: list[SearchFilterRead | None] = []
+ project: ProjectRead
class ServiceUpdate(ServiceBase):
- terms: Optional[List[TermCreate]] = []
- incident_priorities: Optional[List[IncidentPriorityCreate]] = []
- incident_types: Optional[List[IncidentTypeCreate]] = []
+ filters: list[SearchFilterRead | None] = []
class ServiceRead(ServiceBase):
- id: int
- incident_priorities: Optional[List[IncidentPriorityRead]] = []
- incident_types: Optional[List[IncidentTypeRead]] = []
- terms: Optional[List[TermReadNested]] = []
- created_at: Optional[datetime] = None
- updated_at: Optional[datetime] = None
-
-
-class ServiceNested(ServiceBase):
- id: int
+ id: PrimaryKey
+ filters: list[SearchFilterRead | None] = []
+ created_at: datetime | None = None
+ updated_at: datetime | None = None
-class ServicePagination(DispatchBase):
- total: int
- items: List[ServiceRead] = []
+class ServicePagination(Pagination):
+ items: list[ServiceRead] = []
diff --git a/src/dispatch/service/service.py b/src/dispatch/service/service.py
index 081a426ccbb8..6352d1f1dd4b 100644
--- a/src/dispatch/service/service.py
+++ b/src/dispatch/service/service.py
@@ -1,45 +1,175 @@
-from typing import Optional
-from fastapi.encoders import jsonable_encoder
+from pydantic import ValidationError
-from dispatch.incident_priority import service as incident_priority_service
-from dispatch.incident_type import service as incident_type_service
-from dispatch.term import service as term_service
+from dispatch.plugin import service as plugin_service
+from dispatch.project import service as project_service
+from dispatch.project.models import ProjectRead
+from dispatch.search_filter import service as search_filter_service
-from .models import Service, ServiceCreate, ServiceUpdate
+from .models import Service, ServiceCreate, ServiceRead, ServiceUpdate
-def get(*, db_session, service_id: int) -> Optional[Service]:
+def get(*, db_session, service_id: int) -> Service | None:
+ """Gets a service by id."""
return db_session.query(Service).filter(Service.id == service_id).first()
-def get_by_external_id(*, db_session, external_id: str) -> Optional[Service]:
+def get_by_external_id(*, db_session, external_id: str) -> Service | None:
+ """Gets a service by external id (e.g. PagerDuty service id)."""
return db_session.query(Service).filter(Service.external_id == external_id).first()
+def get_all_by_external_ids(*, db_session, external_ids: list[str]) -> list[Service | None]:
+ """Gets a service by external id (e.g. PagerDuty service id) and project id."""
+ return db_session.query(Service).filter(Service.external_id.in_(external_ids)).all()
+
+
+def get_by_name(*, db_session, project_id: int, name: str) -> Service | None:
+ """Gets a service by its name."""
+ return (
+ db_session.query(Service)
+ .filter(Service.name == name)
+ .filter(Service.project_id == project_id)
+ .one_or_none()
+ )
+
+
+def get_by_name_or_raise(*, db_session, project_id, service_in: ServiceRead) -> ServiceRead:
+    """Returns the service specified or raises ValidationError."""
+    service = get_by_name(db_session=db_session, project_id=project_id, name=service_in.name)
+
+    if not service:
+        raise ValidationError(
+            [
+                {
+                    "loc": ("service",),
+                    "msg": f"Service not found: {service_in.name}",
+                    "type": "value_error",
+                    "input": service_in.name,
+                }
+            ]
+        )
+
+    return service
+
+
+def get_by_external_id_and_project_id(
+ *, db_session, external_id: str, project_id: int
+) -> Service | None:
+ """Gets a service by external id (e.g. PagerDuty service id) and project id."""
+ return (
+ db_session.query(Service)
+ .filter(Service.project_id == project_id)
+ .filter(Service.external_id == external_id)
+ .first()
+ )
+
+
+def get_by_external_id_and_project_id_or_raise(
+    *, db_session, project_id: int, service_in: ServiceRead
+) -> Service:
+ """Returns the service specified or raises ValidationError."""
+ service = get_by_external_id_and_project_id(
+ db_session=db_session, project_id=project_id, external_id=service_in.external_id
+ )
+
+ if not service:
+        raise ValidationError(
+            [
+                {
+                    "msg": "Service not found.",
+                    "loc": ("external_id",),
+                    "input": service_in.external_id,
+                }
+            ],
+            model=ServiceRead,
+        )
+
+ return service
+
+
+def get_overdue_evergreen_services(*, db_session, project_id: int) -> list[Service | None]:
+ """Returns all services that have not had a recent evergreen notification."""
+ query = (
+ db_session.query(Service)
+ .filter(Service.project_id == project_id)
+ .filter(Service.evergreen == True) # noqa
+ .filter(Service.overdue == True) # noqa
+ )
+ return query.all()
+
+
+def get_by_external_id_and_project_name(
+ *, db_session, external_id: str, project_name: str
+) -> Service | None:
+ """Gets a service by external id (e.g. PagerDuty service id) and project name."""
+ project = project_service.get_by_name_or_raise(
+ db_session=db_session, project_in=ProjectRead(name=project_name)
+ )
+ service = get_by_external_id_and_project_id(
+ db_session=db_session, external_id=external_id, project_id=project.id
+ )
+ return service
+
+
def get_all(*, db_session):
+ """Gets all services."""
return db_session.query(Service)
def get_all_by_status(*, db_session, is_active: bool):
+ """Gets services by status."""
return db_session.query(Service).filter(Service.is_active.is_(is_active))
+def get_all_by_type_and_status(
+ *, db_session, service_type: str, is_active: bool
+) -> list[Service | None]:
+ """Gets services by type and status."""
+ return (
+ db_session.query(Service)
+ .filter(Service.type == service_type)
+ .filter(Service.is_active.is_(is_active))
+ .all()
+ )
+
+
+def get_all_by_project_id_and_status(
+    *, db_session, project_id: int, is_active: bool
+) -> list[Service | None]:
+ """Gets services by project id and status."""
+ return (
+ db_session.query(Service)
+ .filter(Service.project_id == project_id)
+ .filter(Service.is_active.is_(is_active))
+ .order_by(Service.name)
+ )
+
+
+def get_all_by_health_metrics(
+    *, db_session, service_type: str, health_metrics: bool, project_id: int
+) -> list[Service | None]:
+    """Gets all services of the given type with the given health metrics value for a project."""
+    return (
+        db_session.query(Service)
+        .filter(Service.type == service_type)
+        .filter(Service.health_metrics.is_(health_metrics))
+        .filter(Service.project_id == project_id)
+        .all()
+    )
+
+
def create(*, db_session, service_in: ServiceCreate) -> Service:
- terms = [term_service.get_or_create(db_session=db_session, term_in=t) for t in service_in.terms]
- incident_priorities = [
- incident_priority_service.get_by_name(db_session=db_session, name=n.name)
- for n in service_in.incident_priorities
- ]
- incident_types = [
- incident_type_service.get_by_name(db_session=db_session, name=n.name)
- for n in service_in.incident_types
+ """Creates a new service."""
+ project = project_service.get_by_name_or_raise(
+ db_session=db_session, project_in=service_in.project
+ )
+
+ filters = [
+ search_filter_service.get(db_session=db_session, search_filter_id=f.id)
+ for f in service_in.filters
]
+
service = Service(
- **service_in.dict(exclude={"terms", "incident_priorities", "incident_types"}),
- incident_priorities=incident_priorities,
- incident_types=incident_types,
- terms=terms,
+ **service_in.dict(exclude={"filters", "project"}),
+ filters=filters,
+ project=project,
)
db_session.add(service)
db_session.commit()
@@ -47,39 +177,44 @@ def create(*, db_session, service_in: ServiceCreate) -> Service:
def update(*, db_session, service: Service, service_in: ServiceUpdate) -> Service:
- service_data = jsonable_encoder(service)
+ """Updates an existing service."""
+ service_data = service.dict()
- terms = [term_service.get_or_create(db_session=db_session, term_in=t) for t in service_in.terms]
- incident_priorities = [
- incident_priority_service.get_by_name(db_session=db_session, name=n.name)
- for n in service_in.incident_priorities
- ]
- incident_types = [
- incident_type_service.get_by_name(db_session=db_session, name=n.name)
- for n in service_in.incident_types
+ update_data = service_in.dict(exclude_unset=True, exclude={"filters"})
+
+ filters = [
+ search_filter_service.get(db_session=db_session, search_filter_id=f.id)
+ for f in service_in.filters
]
- update_data = service_in.dict(
- skip_defaults=True, exclude={"terms", "incident_priorities", "incident_types"}
- )
+
+ if service_in.is_active: # user wants to enable the service
+ oncall_plugin_instance = plugin_service.get_active_instance_by_slug(
+ db_session=db_session, slug=service_in.type, project_id=service.project.id
+ )
+ if oncall_plugin_instance is None or not oncall_plugin_instance.enabled:
+ raise ValidationError(
+ [
+ {
+ "msg": "Cannot enable service. Its associated plugin is not enabled.",
+ "loc": "type",
+ }
+ ],
+ model=ServiceUpdate,
+ )
for field in service_data:
if field in update_data:
setattr(service, field, update_data[field])
- service.terms = terms
- service.incident_priorities = incident_priorities
- service.incident_types = incident_types
- db_session.add(service)
+ service.filters = filters
+
db_session.commit()
return service
def delete(*, db_session, service_id: int):
+ """Deletes a service."""
service = db_session.query(Service).filter(Service.id == service_id).one()
-
- # TODO clear out other relationships
- # we clear out our associated items
- service.terms = []
db_session.delete(service)
db_session.commit()
return service_id
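The rewritten `update()` above leans on Pydantic's `exclude_unset=True` so that fields the caller omitted never overwrite stored values. A stdlib-only sketch of that copy loop, with Pydantic and SQLAlchemy stripped out and hypothetical field names:

```python
# Sketch of the partial-update loop in update(). `update_data` stands in for
# service_in.dict(exclude_unset=True): it contains only the fields the caller
# explicitly sent, so absent fields keep their stored values.
def apply_partial_update(stored: dict, update_data: dict) -> dict:
    for field in stored:
        if field in update_data:
            stored[field] = update_data[field]
    return stored

stored = {"name": "oncall-primary", "is_active": True, "type": "pagerduty"}
# The caller only sent is_active; name and type survive untouched.
updated = apply_partial_update(stored, {"is_active": False})
```

Keys present in `update_data` but absent from `stored` are ignored, mirroring how `update()` only copies fields that already exist on the model.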
diff --git a/src/dispatch/service/views.py b/src/dispatch/service/views.py
index 42386105a211..e8d7da9aa83f 100644
--- a/src/dispatch/service/views.py
+++ b/src/dispatch/service/views.py
@@ -1,50 +1,42 @@
-from typing import List
+from fastapi import APIRouter, Body, HTTPException, Query, status
+from pydantic import ValidationError
+from sqlalchemy.exc import IntegrityError
-from fastapi import APIRouter, Body, Depends, HTTPException, Query
-from sqlalchemy.orm import Session
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.models import PrimaryKey
-from dispatch.database import get_db, search_filter_sort_paginate
-from dispatch.search.service import search
-
-from .models import Service, ServiceCreate, ServicePagination, ServiceRead, ServiceUpdate
-from .service import create, delete, get, get_all, get_by_external_id, update
+from .models import ServiceCreate, ServicePagination, ServiceRead, ServiceUpdate
+from .service import (
+ create,
+ delete,
+ get,
+ get_all_by_external_ids,
+ get_by_external_id_and_project_name,
+ update,
+)
router = APIRouter()
-@router.get("/", response_model=ServicePagination)
-def get_services(
- db_session: Session = Depends(get_db),
- page: int = 1,
- items_per_page: int = Query(5, alias="itemsPerPage"),
- query_str: str = Query(None, alias="q"),
- sort_by: List[str] = Query(None, alias="sortBy[]"),
- descending: List[bool] = Query(None, alias="descending[]"),
- fields: List[str] = Query(None, alias="field[]"),
- ops: List[str] = Query(None, alias="op[]"),
- values: List[str] = Query(None, alias="value[]"),
+@router.get("", response_model=ServicePagination)
+def get_services(common: CommonParameters):
+ """Retrieves all services."""
+ return search_filter_sort_paginate(model="Service", **common)
+
+
+@router.get("/externalids", response_model=list[ServiceRead])
+def get_services_by_external_ids(
+ db_session: DbSession,
+ ids: list[str] = Query(..., alias="ids[]"),
):
- """
- Retrieve all services.
- """
- return search_filter_sort_paginate(
- db_session=db_session,
- model="Service",
- query_str=query_str,
- page=page,
- items_per_page=items_per_page,
- sort_by=sort_by,
- descending=descending,
- fields=fields,
- values=values,
- ops=ops,
- )
+ """Retrieves all services given list of external ids."""
+ return get_all_by_external_ids(db_session=db_session, external_ids=ids)
-@router.post("/", response_model=ServiceRead)
+@router.post("", response_model=ServiceRead)
def create_service(
- *,
- db_session: Session = Depends(get_db),
+ db_session: DbSession,
service_in: ServiceCreate = Body(
...,
example={
@@ -55,50 +47,81 @@ def create_service(
},
),
):
- """
- Create a new service.
- """
- service = get_by_external_id(db_session=db_session, external_id=service_in.external_id)
+ """Creates a new service."""
+ service = get_by_external_id_and_project_name(
+ db_session=db_session,
+ external_id=service_in.external_id,
+ project_name=service_in.project.name,
+ )
if service:
- raise HTTPException(
- status_code=400,
- detail=f"The service with this identifier ({service_in.external_id}) already exists.",
+ raise ValidationError(
+ [
+ {
+ "msg": "An oncall service with this external id already exists.",
+ "loc": "external_id",
+ }
+ ],
)
service = create(db_session=db_session, service_in=service_in)
return service
@router.put("/{service_id}", response_model=ServiceRead)
-def update_service(
- *, db_session: Session = Depends(get_db), service_id: int, service_in: ServiceUpdate
-):
- """
- Update an existing service.
- """
+def update_service(db_session: DbSession, service_id: PrimaryKey, service_in: ServiceUpdate):
+ """Updates an existing service."""
service = get(db_session=db_session, service_id=service_id)
if not service:
- raise HTTPException(status_code=404, detail="The service with this id does not exist.")
- service = update(db_session=db_session, service=service, service_in=service_in)
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "An oncall service with this id does not exist."}],
+ )
+
+ try:
+ service = update(db_session=db_session, service=service, service_in=service_in)
+ except IntegrityError:
+ raise ValidationError(
+ [
+ {
+ "msg": "An oncall service with this name already exists.",
+ "loc": "name",
+ }
+ ],
+ model=ServiceUpdate,
+ ) from None
+
return service
@router.get("/{service_id}", response_model=ServiceRead)
-def get_service(*, db_session: Session = Depends(get_db), service_id: int):
- """
- Get a single service.
- """
+def get_service(db_session: DbSession, service_id: PrimaryKey):
+ """Gets a service."""
service = get(db_session=db_session, service_id=service_id)
if not service:
- raise HTTPException(status_code=404, detail="The service with this id does not exist.")
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "An oncall service with this id does not exist."}],
+ )
return service
-@router.delete("/{service_id}")
-def delete_service(*, db_session: Session = Depends(get_db), service_id: int):
- """
- Delete a single service.
- """
+@router.delete("/{service_id}", response_model=None)
+def delete_service(db_session: DbSession, service_id: PrimaryKey):
+ """Deletes a service."""
service = get(db_session=db_session, service_id=service_id)
if not service:
- raise HTTPException(status_code=404, detail="The service with this id does not exist.")
- delete(db_session=db_session, service_id=service_id)
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "An oncall service with this id does not exist."}],
+ )
+
+ try:
+ delete(db_session=db_session, service_id=service_id)
+ except IntegrityError:
+ raise HTTPException(
+ status_code=status.HTTP_409_CONFLICT,
+ detail=[
+ {
+ "msg": f"Unable to delete oncall service {service.name} with id {service.id}. Contact your administrator.",
+ "loc": "service_id",
+ }
+ ],
+ ) from None
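The endpoints above standardize on a list-of-dicts error payload rather than a bare string. A minimal sketch of that shape (the `status` wrapper and helper names are illustrative, not part of the API):

```python
# Sketch of the error-detail contract the rewritten endpoints return: a list
# of dicts, each with a human-readable "msg" and an optional "loc" naming the
# offending field. Assumption: clients render msg and use loc to attach the
# error to a form field.
def not_found(resource: str) -> dict:
    # Payload used by get/update/delete when the id is unknown.
    return {
        "status": 404,
        "detail": [{"msg": f"An oncall {resource} with this id does not exist."}],
    }

def conflict(msg: str, loc: str) -> dict:
    # Payload used when a delete or rename collides with existing data.
    return {"status": 409, "detail": [{"msg": msg, "loc": loc}]}
```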
diff --git a/src/dispatch/signal/__init__.py b/src/dispatch/signal/__init__.py
new file mode 100644
index 000000000000..e69de29bb2d1
diff --git a/src/dispatch/signal/enums.py b/src/dispatch/signal/enums.py
new file mode 100644
index 000000000000..823f06fcb8ba
--- /dev/null
+++ b/src/dispatch/signal/enums.py
@@ -0,0 +1,7 @@
+from dispatch.enums import DispatchEnum
+
+
+class SignalEngagementStatus(DispatchEnum):
+ new = "New"
+ approved = "Approved"
+ denied = "Denied"
diff --git a/src/dispatch/signal/exceptions.py b/src/dispatch/signal/exceptions.py
new file mode 100644
index 000000000000..867412acd0d4
--- /dev/null
+++ b/src/dispatch/signal/exceptions.py
@@ -0,0 +1,13 @@
+from dispatch.exceptions import DispatchException
+
+
+class SignalNotIdentifiedException(DispatchException):
+ pass
+
+
+class SignalNotDefinedException(DispatchException):
+ pass
+
+
+class SignalNotEnabledException(DispatchException):
+ pass
diff --git a/src/dispatch/signal/flows.py b/src/dispatch/signal/flows.py
new file mode 100644
index 000000000000..141eec9374e1
--- /dev/null
+++ b/src/dispatch/signal/flows.py
@@ -0,0 +1,455 @@
+import logging
+import time
+from datetime import timedelta
+
+from cachetools import TTLCache
+from email_validator import EmailNotValidError, validate_email
+from sqlalchemy.exc import SQLAlchemyError
+from sqlalchemy.orm import Session
+
+from dispatch.auth import service as user_service
+from dispatch.auth.models import DispatchUser, UserRegister
+from dispatch.case import flows as case_flows
+from dispatch.case import service as case_service
+from dispatch.case.enums import CaseStatus
+from dispatch.case.models import CaseCreate
+from dispatch.database.core import get_organization_session, get_session
+from dispatch.entity import service as entity_service
+from dispatch.entity_type import service as entity_type_service
+from dispatch.entity_type.models import EntityScopeEnum
+from dispatch.enums import Visibility
+from dispatch.exceptions import DispatchException
+from dispatch.individual.models import IndividualContactRead
+from dispatch.messaging.strings import CASE_RESOLUTION_DEFAULT
+from dispatch.organization.service import get_all as get_all_organizations
+from dispatch.participant.models import ParticipantUpdate
+from dispatch.plugin import service as plugin_service
+from dispatch.project.models import Project
+from dispatch.service import flows as service_flows
+from dispatch.signal import flows as signal_flows
+from dispatch.signal import service as signal_service
+from dispatch.signal.enums import SignalEngagementStatus
+from dispatch.signal.models import SignalFilterAction, SignalInstance, SignalInstanceCreate
+from dispatch.workflow import flows as workflow_flows
+
+log = logging.getLogger(__name__)
+
+# Constants for signal processing
+BATCH_SIZE = 50 # Process signals in batches of 50
+LOOP_DELAY = 60 # seconds
+MAX_PROCESSING_TIME = 300 # maximum time in seconds to process signals before refreshing the organization list (5 minutes)
+
+
+def signal_instance_create_flow(
+ signal_instance_id: int,
+ db_session: Session | None = None,
+ current_user: DispatchUser | None = None,
+):
+ """Create flow used by the API."""
+ signal_instance = signal_service.get_signal_instance(
+ db_session=db_session, signal_instance_id=signal_instance_id
+ )
+ if signal_instance is None:
+ log.error("signal_instance is None for id: %s", signal_instance_id)
+ return None
+ # fetch `all` entities that should be associated with all signal definitions
+ entity_types = entity_type_service.get_all(
+ db_session=db_session, scope=EntityScopeEnum.all
+ ).all()
+ entity_types = signal_instance.signal.entity_types + entity_types
+
+ if entity_types:
+ entities = entity_service.find_entities(
+ db_session=db_session,
+ signal_instance=signal_instance,
+ entity_types=entity_types,
+ )
+ signal_instance.entities = entities
+ db_session.commit()
+
+ # we don't need to continue if a filter action took place
+ if signal_service.filter_signal(
+ db_session=db_session,
+ signal_instance=signal_instance,
+ ):
+ # If a case and conversation exists and the signal was deduplicated,
+ # we need to update the corresponding signal message
+ if (
+ signal_instance.case_id
+ and signal_instance.case.conversation
+ and _should_update_signal_message(signal_instance)
+ ):
+ update_signal_message(
+ db_session=db_session,
+ signal_instance=signal_instance,
+ )
+ return signal_instance
+
+ # limited support for canary signals, just store the instance and return
+ if signal_instance.canary:
+ return signal_instance
+
+ if not signal_instance.signal.create_case:
+ return signal_instance
+
+ # set signal instance attributes with priority given to signal instance specification, then signal, then case type.
+ if signal_instance.case_type:
+ case_type = signal_instance.case_type
+ else:
+ case_type = signal_instance.signal.case_type
+
+ if signal_instance.case_priority:
+ case_priority = signal_instance.case_priority
+ else:
+ case_priority = signal_instance.signal.case_priority
+
+ if signal_instance.oncall_service:
+ oncall_service = signal_instance.oncall_service
+ elif signal_instance.signal.oncall_service:
+ oncall_service = signal_instance.signal.oncall_service
+ elif case_type.oncall_service:
+ oncall_service = case_type.oncall_service
+ else:
+ oncall_service = None
+
+ if signal_instance.conversation_target:
+ conversation_target = signal_instance.conversation_target
+ elif signal_instance.signal.conversation_target:
+ conversation_target = signal_instance.signal.conversation_target
+ elif case_type.conversation_target:
+ conversation_target = case_type.conversation_target
+ else:
+ conversation_target = None
+
+ assignee = None
+ if oncall_service:
+ email = service_flows.resolve_oncall(service=oncall_service, db_session=db_session)
+ if email:
+ assignee = ParticipantUpdate(
+ individual=IndividualContactRead(
+ id=1,
+ email=str(email),
+ ),
+ location=None,
+ team=None,
+ department=None,
+ added_reason=None,
+ )
+
+ # create a case if not duplicate or snoozed and case creation is enabled
+ case_severity = (
+ getattr(signal_instance, "case_severity", None)
+ or getattr(signal_instance.signal, "case_severity", None)
+ or getattr(case_type, "case_severity", None)
+ )
+
+ reporter = None
+ if current_user and hasattr(current_user, "email"):
+ reporter = ParticipantUpdate(
+ individual=IndividualContactRead(
+ id=1,
+ email=str(current_user.email),
+ ),
+ location=None,
+ team=None,
+ department=None,
+ added_reason=None,
+ )
+
+ case_in = CaseCreate(
+ title=signal_instance.signal.name,
+ description=signal_instance.signal.description,
+ resolution=CASE_RESOLUTION_DEFAULT,
+ resolution_reason=None,
+ status=CaseStatus.new,
+ visibility=Visibility.open,
+ case_priority=case_priority,
+ case_severity=case_severity,
+ project=signal_instance.project,
+ case_type=case_type,
+ assignee=assignee,
+ dedicated_channel=False,
+ reporter=reporter,
+ )
+ case = case_service.create(db_session=db_session, case_in=case_in, current_user=current_user)
+ signal_instance.case = case
+
+ db_session.commit()
+
+ # Ensure valid types for case_new_create_flow arguments
+ org_slug = None
+ svc_id = None
+ conv_target = conversation_target if isinstance(conversation_target, str) else None
+ case_flows.case_new_create_flow(
+ db_session=db_session,
+ organization_slug=org_slug,
+ service_id=svc_id,
+ conversation_target=conv_target,
+ case_id=case.id,
+ create_all_resources=False,
+ )
+
+ if signal_instance.signal.engagements and signal_instance.entities:
+ signal_flows.engage_signal_identity(
+ db_session=db_session,
+ signal_instance=signal_instance,
+ )
+
+ # run workflows if not duplicate or snoozed
+ if workflows := signal_instance.signal.workflows:
+ for workflow in workflows:
+ workflow_flows.signal_workflow_run_flow(
+ current_user=current_user,
+ db_session=db_session,
+ signal_instance=signal_instance,
+ workflow=workflow,
+ )
+
+ return signal_instance
+
+
+def create_signal_instance(
+ db_session: Session,
+ project: Project,
+ signal_instance_data: dict,
+ current_user: DispatchUser | None = None,
+):
+ """Create flow used by the scheduler."""
+ signal = signal_service.get_by_variant_or_external_id(
+ db_session=db_session,
+ project_id=project.id,
+ external_id=signal_instance_data.get("id"),
+ variant=signal_instance_data.get("variant"),
+ )
+
+ if not signal:
+ raise DispatchException("No signal definition found.")
+
+ if not signal.enabled:
+ raise DispatchException("Signal definition is not enabled.")
+
+ signal_instance_in = SignalInstanceCreate(
+ **signal_instance_data,
+ raw=signal_instance_data,
+ signal=signal,
+ project=signal.project,
+ )
+
+ signal_instance = signal_service.create_instance(
+ db_session=db_session, signal_instance_in=signal_instance_in
+ )
+ return signal_instance
+
+
+def engage_signal_identity(db_session: Session, signal_instance: SignalInstance) -> None:
+ """Engage the signal identity."""
+
+ users_to_engage = []
+ engagements = signal_instance.signal.engagements
+ for engagement in engagements:
+ for entity in signal_instance.entities:
+ if engagement.entity_type_id == entity.entity_type_id:
+ try:
+ validated_email = validate_email(entity.value, check_deliverability=False)
+ except EmailNotValidError as e:
+ log.warning(
+ f"A user subject included in a signal for {signal_instance.signal.name} (id: {signal_instance.signal.id}) contains an invalid email address: {e}. Investigate why this detection included a user subject with an invalid email in the signal."
+ )
+ else:
+ users_to_engage.append(
+ {
+ "user": validated_email.email,
+ "engagement": engagement,
+ }
+ )
+
+ if not users_to_engage:
+ log.warning(
+ f"Engagement configured for signal {signal_instance.signal.name} (id: {signal_instance.signal.id}), but no users found in instance with id {signal_instance.id}."
+ )
+ return
+
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session,
+ project_id=signal_instance.case.project.id,
+ plugin_type="conversation",
+ )
+ if not plugin:
+ log.warning("No conversation plugin is active.")
+ return
+
+ for reachout in users_to_engage:
+ email = reachout.get("user")
+ case_flows.case_add_or_reactivate_participant_flow(
+ db_session=db_session,
+ user_email=email,
+ case_id=signal_instance.case.id,
+ add_to_conversation=True,
+ )
+
+ user = user_service.get_or_create(
+ db_session=db_session,
+ organization=signal_instance.case.project.organization.slug,
+ user_in=UserRegister(email=email),
+ )
+
+ response = plugin.instance.create_engagement_threaded(
+ signal_instance=signal_instance,
+ case=signal_instance.case,
+ conversation_id=signal_instance.case.conversation.channel_id,
+ thread_id=signal_instance.case.conversation.thread_id,
+ user=user,
+ engagement=reachout.get("engagement"),
+ engagement_status=SignalEngagementStatus.new,
+ )
+ signal_instance.engagement_thread_ts = response.get("timestamp")
+ db_session.commit()
+
+
+def update_signal_message(db_session: Session, signal_instance: SignalInstance) -> None:
+ plugin = plugin_service.get_active_instance(
+ db_session=db_session,
+ project_id=signal_instance.case.project.id,
+ plugin_type="conversation",
+ )
+ if not plugin:
+ log.warning("No conversation plugin is active.")
+ return
+
+ plugin.instance.update_signal_message(
+ case_id=signal_instance.case_id,
+ conversation_id=signal_instance.case.conversation.channel_id,
+ db_session=db_session,
+ thread_id=signal_instance.case.signal_thread_ts,
+ )
+
+
+# Cache structure: {case_id: {"created_at": datetime, "filter_action": SignalFilterAction}}
+_last_nonupdated_signal_cache = TTLCache(maxsize=4, ttl=60)
+
+
+def _should_update_signal_message(signal_instance: SignalInstance) -> bool:
+ """Determine if the signal message should be updated based on the filter action and time since the last update."""
+ global _last_nonupdated_signal_cache
+
+ case_id = str(signal_instance.case_id)
+
+ if case_id not in _last_nonupdated_signal_cache:
+ # Store only the necessary data, not the entire object
+ _last_nonupdated_signal_cache[case_id] = {
+ "created_at": signal_instance.created_at,
+ "filter_action": signal_instance.filter_action,
+ }
+ return True
+
+ last_cached_data = _last_nonupdated_signal_cache[case_id]
+ time_since_last_update = signal_instance.created_at - last_cached_data["created_at"]
+
+ if (
+ signal_instance.filter_action == SignalFilterAction.deduplicate
+ and signal_instance.case.signal_thread_ts # noqa
+ and time_since_last_update >= timedelta(seconds=5) # noqa
+ ):
+ # Update the cache with the new data
+ _last_nonupdated_signal_cache[case_id] = {
+ "created_at": signal_instance.created_at,
+ "filter_action": signal_instance.filter_action,
+ }
+ return True
+ else:
+ return False
+
+
+def process_signal_batch(db_session: Session, signal_instance_ids: list[int]) -> None:
+ """Process a batch of signal instances.
+
+ Args:
+ db_session (Session): The database session.
+ signal_instance_ids (list[int]): List of signal instance IDs to process.
+ """
+ for signal_instance_id in signal_instance_ids:
+ try:
+ signal_flows.signal_instance_create_flow(
+ db_session=db_session,
+ signal_instance_id=signal_instance_id,
+ )
+ # Commit after each successful processing to ensure progress is saved
+ db_session.commit()
+ except Exception as e:
+ log.exception(f"Error processing signal instance {signal_instance_id}: {e}")
+ # Ensure transaction is rolled back on error
+ db_session.rollback()
+
+
+def process_organization_signals(organization_slug: str) -> None:
+ """Processes all unprocessed signals for a given organization using batched processing.
+
+ Args:
+ organization_slug (str): The slug of the organization whose signals need to be processed.
+ """
+ try:
+ with get_organization_session(organization_slug) as db_session:
+ # Get unprocessed signal IDs
+ signal_instance_ids = signal_service.get_unprocessed_signal_instance_ids(db_session)
+
+ if not signal_instance_ids:
+ log.debug(f"No unprocessed signals found for organization {organization_slug}")
+ return
+
+ log.info(
+ f"Processing {len(signal_instance_ids)} signals for organization {organization_slug}"
+ )
+
+ # Process signals in batches
+ for i in range(0, len(signal_instance_ids), BATCH_SIZE):
+ batch = signal_instance_ids[i : i + BATCH_SIZE]
+ process_signal_batch(db_session, batch)
+
+ # Log progress for large batches
+ if len(signal_instance_ids) > BATCH_SIZE:
+ log.info(
+ f"Processed {min(i + BATCH_SIZE, len(signal_instance_ids))}/{len(signal_instance_ids)} signals for {organization_slug}"
+ )
+ except SQLAlchemyError as e:
+ log.exception(f"Database error while processing signals for {organization_slug}: {e}")
+ except Exception as e:
+ log.exception(f"Error processing signals for organization {organization_slug}: {e}")
+
+
+def main_processing_loop() -> None:
+ """Main processing loop that iterates through all organizations and processes their signals.
+
+ Uses time-based batching to ensure the organization list is refreshed periodically.
+ """
+ while True:
+ try:
+ # Get organizations in a dedicated session that will be properly closed
+ organizations = []
+ with get_session() as session:
+ organizations = list(get_all_organizations(db_session=session))
+
+ if not organizations:
+ log.warning("No organizations found to process signals for")
+ time.sleep(LOOP_DELAY)
+ continue
+
+ start_time = time.time()
+
+ # Process each organization with its own session
+ for organization in organizations:
+ # Check if we've been processing for too long and should refresh org list
+ if time.time() - start_time > MAX_PROCESSING_TIME:
+ log.info("Processing time limit reached, refreshing organization list")
+ break
+
+ log.info(f"Processing signals for organization {organization.slug}")
+ process_organization_signals(organization.slug)
+
+ except Exception as e:
+ log.exception(f"Error in main signal processing loop: {e}")
+ finally:
+ time.sleep(LOOP_DELAY)
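`_should_update_signal_message` above throttles Slack message updates to at most one per case every five seconds. A stdlib sketch of the same window logic, using a plain dict in place of `cachetools.TTLCache` and omitting the filter-action and thread-timestamp checks:

```python
from datetime import datetime, timedelta

# Stand-in for the TTLCache in _should_update_signal_message: remember when we
# last allowed an update per case and suppress further updates inside the window.
_last_update: dict[str, datetime] = {}

def should_update(case_id: str, created_at: datetime,
                  window: timedelta = timedelta(seconds=5)) -> bool:
    last = _last_update.get(case_id)
    if last is None or created_at - last >= window:
        # First signal for this case, or the window has elapsed: allow the
        # update and reset the clock.
        _last_update[case_id] = created_at
        return True
    return False
```

Unlike this sketch, the real cache also expires entries after 60 seconds, so a quiet case does not pin memory indefinitely.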
diff --git a/src/dispatch/signal/models.py b/src/dispatch/signal/models.py
new file mode 100644
index 000000000000..d583b9574485
--- /dev/null
+++ b/src/dispatch/signal/models.py
@@ -0,0 +1,417 @@
+import uuid
+from datetime import datetime
+from typing import Any
+from pydantic import Field
+from sqlalchemy import (
+ JSON,
+ Boolean,
+ Column,
+ DateTime,
+ ForeignKey,
+ Integer,
+ PrimaryKeyConstraint,
+ String,
+ Table,
+ UniqueConstraint,
+)
+from sqlalchemy.dialects.postgresql import JSONB, UUID
+from sqlalchemy.orm import relationship
+from sqlalchemy_utils import TSVectorType
+
+from dispatch.auth.models import DispatchUser
+from dispatch.case.models import CaseReadMinimal
+from dispatch.case.priority.models import CasePriority, CasePriorityRead
+from dispatch.case.type.models import CaseType, CaseTypeRead
+from dispatch.data.source.models import SourceBase
+from dispatch.database.core import Base
+from dispatch.entity.models import EntityRead
+from dispatch.entity_type.models import EntityType, EntityTypeRead
+from dispatch.enums import DispatchEnum
+from dispatch.event.models import EventRead
+from dispatch.models import (
+ DispatchBase,
+ EvergreenMixin,
+ NameStr,
+ Pagination,
+ PrimaryKey,
+ ProjectMixin,
+ TimeStampMixin,
+)
+from dispatch.project.models import ProjectRead
+from dispatch.service.models import Service, ServiceRead
+from dispatch.tag.models import TagRead
+from dispatch.workflow.models import WorkflowRead
+
+
+class RuleMode(DispatchEnum):
+ active = "Active"
+ monitor = "Monitor"
+ inactive = "Inactive"
+
+
+assoc_signal_tags = Table(
+ "assoc_signal_tags",
+ Base.metadata,
+ Column("signal_id", Integer, ForeignKey("signal.id", ondelete="CASCADE")),
+ Column("tag_id", Integer, ForeignKey("tag.id", ondelete="CASCADE")),
+ PrimaryKeyConstraint("signal_id", "tag_id"),
+)
+
+assoc_signal_engagements = Table(
+ "assoc_signal_engagements",
+ Base.metadata,
+ Column("signal_id", Integer, ForeignKey("signal.id", ondelete="CASCADE")),
+ Column("signal_engagement_id", Integer, ForeignKey("signal_engagement.id", ondelete="CASCADE")),
+ PrimaryKeyConstraint("signal_id", "signal_engagement_id"),
+)
+
+assoc_signal_filters = Table(
+ "assoc_signal_filters",
+ Base.metadata,
+ Column("signal_id", Integer, ForeignKey("signal.id", ondelete="CASCADE")),
+ Column("signal_filter_id", Integer, ForeignKey("signal_filter.id", ondelete="CASCADE")),
+ PrimaryKeyConstraint("signal_id", "signal_filter_id"),
+)
+
+assoc_signal_instance_entities = Table(
+ "assoc_signal_instance_entities",
+ Base.metadata,
+ Column(
+ "signal_instance_id",
+ UUID(as_uuid=True),
+ ForeignKey("signal_instance.id", ondelete="CASCADE"),
+ ),
+ Column("entity_id", Integer, ForeignKey("entity.id", ondelete="CASCADE")),
+ PrimaryKeyConstraint("signal_instance_id", "entity_id"),
+)
+
+assoc_signal_entity_types = Table(
+ "assoc_signal_entity_types",
+ Base.metadata,
+ Column("signal_id", Integer, ForeignKey("signal.id", ondelete="CASCADE")),
+ Column("entity_type_id", Integer, ForeignKey("entity_type.id", ondelete="CASCADE")),
+ PrimaryKeyConstraint("signal_id", "entity_type_id"),
+)
+
+assoc_signal_workflows = Table(
+ "assoc_signal_workflows",
+ Base.metadata,
+ Column("signal_id", Integer, ForeignKey("signal.id", ondelete="CASCADE")),
+ Column("workflow_id", Integer, ForeignKey("workflow.id", ondelete="CASCADE")),
+ PrimaryKeyConstraint("signal_id", "workflow_id"),
+)
+
+
+class SignalFilterMode(DispatchEnum):
+ active = "active"
+ monitor = "monitor"
+ inactive = "inactive"
+ expired = "expired"
+
+
+class SignalFilterAction(DispatchEnum):
+ deduplicate = "deduplicate"
+ snooze = "snooze"
+ none = "none"
+
+
+class Signal(Base, TimeStampMixin, ProjectMixin):
+ """Class that represents a detection and its properties."""
+
+ id = Column(Integer, primary_key=True)
+ name = Column(String)
+ owner = Column(String)
+ description = Column(String)
+ external_url = Column(String)
+ external_id = Column(String)
+ source = relationship("Source", backref="signals")
+ source_id = Column(Integer, ForeignKey("source.id"))
+ variant = Column(String)
+ loopin_signal_identity = Column(Boolean, default=False)
+ enabled = Column(Boolean, default=False)
+ case_type_id = Column(Integer, ForeignKey(CaseType.id))
+ case_type = relationship("CaseType", backref="signals")
+ case_priority_id = Column(Integer, ForeignKey(CasePriority.id))
+ case_priority = relationship("CasePriority", backref="signals")
+ create_case = Column(Boolean, default=True)
+ conversation_target = Column(String)
+ default = Column(Boolean, default=False)
+ lifecycle = Column(String)
+ runbook = Column(String)
+
+ # GenAI specific fields
+ genai_enabled = Column(Boolean, default=True)
+ genai_model = Column(String)
+ genai_system_message = Column(String)
+ genai_prompt = Column(String)
+
+ oncall_service_id = Column(Integer, ForeignKey("service.id"))
+ oncall_service = relationship("Service", foreign_keys=[oncall_service_id])
+ engagements = relationship(
+ "SignalEngagement", secondary=assoc_signal_engagements, backref="signals"
+ )
+ filters = relationship("SignalFilter", secondary=assoc_signal_filters, backref="signals")
+ events = relationship("Event", backref="signal", cascade="all, delete-orphan")
+
+ entity_types = relationship(
+ "EntityType",
+ secondary=assoc_signal_entity_types,
+ backref="signals",
+ )
+ workflows = relationship(
+ "Workflow",
+ secondary=assoc_signal_workflows,
+ backref="signals",
+ )
+ workflow_instances = relationship(
+ "WorkflowInstance", backref="signal", cascade="all, delete-orphan"
+ )
+
+ tags = relationship(
+ "Tag",
+ secondary=assoc_signal_tags,
+ backref="signals",
+ )
+ search_vector = Column(
+ TSVectorType("name", "description", "variant", regconfig="pg_catalog.simple")
+ )
+
+
+class SignalEngagement(Base, ProjectMixin, TimeStampMixin):
+ __table_args__ = (UniqueConstraint("name", "project_id"),)
+ id = Column(Integer, primary_key=True)
+ name = Column(String)
+ description = Column(String)
+ message = Column(String, nullable=True)
+ require_mfa = Column(Boolean, default=False)
+ entity_type_id = Column(Integer, ForeignKey(EntityType.id))
+ entity_type = relationship("EntityType", backref="signal_engagements")
+ creator_id = Column(Integer, ForeignKey(DispatchUser.id))
+ creator = relationship("DispatchUser", backref="signal_engagements")
+
+ search_vector = Column(
+ TSVectorType("name", "description", weights={"name": "A", "description": "B"})
+ )
+
+
+class SignalFilter(Base, ProjectMixin, EvergreenMixin, TimeStampMixin):
+ __table_args__ = (UniqueConstraint("name", "project_id"),)
+ id = Column(Integer, primary_key=True)
+ name = Column(String)
+ description = Column(String)
+ expression = Column(JSON, nullable=False, default=[])
+ mode = Column(String, default=SignalFilterMode.active, nullable=False)
+ action = Column(String, nullable=False)
+ expiration = Column(DateTime, nullable=True)
+ window = Column(Integer, default=(60 * 60)) # seconds of deduplication lookback; defaults to 1 hour
+
+ creator_id = Column(Integer, ForeignKey(DispatchUser.id))
+ creator = relationship("DispatchUser", backref="signal_filters")
+
+ search_vector = Column(
+ TSVectorType("name", "description", weights={"name": "A", "description": "B"})
+ )
+
+
+class SignalInstance(Base, TimeStampMixin, ProjectMixin):
+ """Class that represents a detection alert and its properties."""
+
+ id = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
+ case = relationship("Case", backref="signal_instances")
+ case_id = Column(Integer, ForeignKey("case.id", ondelete="CASCADE"))
+ engagement_thread_ts = Column(String, nullable=True)
+ entities = relationship(
+ "Entity",
+ secondary=assoc_signal_instance_entities,
+ backref="signal_instances",
+ )
+ case_type_id = Column(Integer, ForeignKey(CaseType.id))
+ case_type = relationship("CaseType", backref="signal_instances")
+ case_priority_id = Column(Integer, ForeignKey(CasePriority.id))
+ case_priority = relationship("CasePriority", backref="signal_instances")
+ conversation_target = Column(String)
+ filter_action = Column(String)
+ canary = Column(Boolean, default=False)
+ oncall_service_id = Column(Integer, ForeignKey(Service.id))
+ oncall_service = relationship("Service", backref="signal_instances")
+ raw = Column(JSONB)
+ signal = relationship("Signal", backref="instances")
+ signal_id = Column(Integer, ForeignKey("signal.id"))
+
+ @property
+ def external_id(self) -> str | None:
+ """Get external_id from raw data or use instance ID"""
+ if not self.raw:
+ return str(self.id)
+
+ # Check for common external ID field names in the raw data
+ for field in ["external_id", "externalId", "id"]:
+ if field in self.raw:
+ return str(self.raw[field])
+
+ # Fall back to using the instance ID
+ return str(self.id)
+
+
+# Pydantic models
+class Service(DispatchBase):
+ id: PrimaryKey
+ description: str | None = Field(default=None)
+ external_id: str
+ is_active: bool | None = None
+ name: NameStr
+ type: str | None = Field(default=None)
+
+
+class SignalEngagementBase(DispatchBase):
+ name: NameStr
+ description: str | None = Field(default=None)
+ require_mfa: bool | None = False
+ entity_type: EntityTypeRead | None = None
+ message: str | None = Field(default=None)
+
+
+class SignalFilterBase(DispatchBase):
+ mode: SignalFilterMode | None = SignalFilterMode.active
+ expression: list[dict[str, Any]] | None = Field(default=[])
+ name: NameStr
+ action: SignalFilterAction = SignalFilterAction.snooze
+ description: str | None = Field(default=None)
+ window: int | None = 600
+ expiration: datetime | None = Field(default=None)
+
+
+class SignalFilterUpdate(SignalFilterBase):
+ id: PrimaryKey
+
+
+class SignalEngagementCreate(SignalEngagementBase):
+ project: ProjectRead
+
+
+class SignalEngagementRead(SignalEngagementBase):
+ id: PrimaryKey
+
+
+class SignalEngagementUpdate(SignalEngagementBase):
+ id: PrimaryKey
+
+
+class SignalEngagementPagination(Pagination):
+ items: list[SignalEngagementRead]
+
+
+class SignalFilterCreate(SignalFilterBase):
+ project: ProjectRead
+
+
+class SignalFilterRead(SignalFilterBase):
+ id: PrimaryKey
+ signals: list["SignalBase"] | None = []
+
+
+class SignalFilterPagination(Pagination):
+ items: list[SignalFilterRead]
+
+
+class SignalBase(DispatchBase):
+ case_priority: CasePriorityRead | None = None
+ case_type: CaseTypeRead | None = None
+ conversation_target: str | None = None
+ create_case: bool | None = True
+ created_at: datetime | None = None
+ default: bool | None = False
+ description: str | None = None
+ enabled: bool | None = False
+ external_id: str | None = None
+ external_url: str | None = None
+ name: str
+ oncall_service: Service | None = None
+ owner: str
+ project: ProjectRead
+ source: SourceBase | None = None
+ variant: str | None = None
+ lifecycle: str | None = None
+ runbook: str | None = None
+ genai_enabled: bool | None = True
+ genai_model: str | None = None
+ genai_system_message: str | None = None
+ genai_prompt: str | None = None
+
+
+class SignalCreate(SignalBase):
+ filters: list[SignalFilterRead] | None = []
+ engagements: list[SignalEngagementRead] | None = []
+ entity_types: list[EntityTypeRead] | None = []
+ workflows: list[WorkflowRead] | None = []
+ tags: list[TagRead] | None = []
+
+
+class SignalUpdate(SignalBase):
+ id: PrimaryKey
+ engagements: list[SignalEngagementRead] | None = []
+ filters: list[SignalFilterRead] | None = []
+ entity_types: list[EntityTypeRead] | None = []
+ workflows: list[WorkflowRead] | None = []
+ tags: list[TagRead] | None = []
+
+
+class SignalRead(SignalBase):
+ id: PrimaryKey
+ engagements: list[SignalEngagementRead] | None = []
+ entity_types: list[EntityTypeRead] | None = []
+ filters: list[SignalFilterRead] | None = []
+ workflows: list[WorkflowRead] | None = []
+ tags: list[TagRead] | None = []
+ events: list[EventRead] | None = []
+
+
+class SignalPagination(Pagination):
+ items: list[SignalRead]
+
+
+class AdditionalMetadata(DispatchBase):
+ name: str | None = None
+ value: Any | None = None
+ type: str | None = None
+ important: bool | None = None
+
+
+class SignalStats(DispatchBase):
+ num_signal_instances_alerted: int | None = None
+ num_signal_instances_snoozed: int | None = None
+ num_snoozes_active: int | None = None
+ num_snoozes_expired: int | None = None
+
+
+class SignalInstanceBase(DispatchBase):
+ project: ProjectRead | None = None
+ case: CaseReadMinimal | None = None
+ canary: bool | None = False
+ entities: list[EntityRead] | None = []
+ raw: dict[str, Any]
+ external_id: str | None = None
+ filter_action: SignalFilterAction | None = None
+ created_at: datetime | None = None
+
+
+class SignalInstanceCreate(SignalInstanceBase):
+ signal: SignalRead | None = None
+ case_priority: CasePriorityRead | None = None
+ case_type: CaseTypeRead | None = None
+ conversation_target: str | None = None
+ oncall_service: ServiceRead | None = None
+
+
+class SignalInstanceRead(SignalInstanceBase):
+ id: uuid.UUID
+ signal: SignalRead
+
+
+class SignalInstancePagination(Pagination):
+ items: list[SignalInstanceRead]
+
+# Update forward references
+SignalFilterRead.model_rebuild()
diff --git a/src/dispatch/signal/service.py b/src/dispatch/signal/service.py
new file mode 100644
index 000000000000..2ac5c55c7a9c
--- /dev/null
+++ b/src/dispatch/signal/service.py
@@ -0,0 +1,1066 @@
+import json
+import logging
+import uuid
+from datetime import datetime, timedelta, timezone
+from collections import defaultdict
+from fastapi import HTTPException, status
+from pydantic import ValidationError
+from sqlalchemy import asc, desc, or_, func, and_, select, cast
+from sqlalchemy.orm import Session
+from sqlalchemy.orm.query import Query
+from sqlalchemy.sql.expression import true
+from sqlalchemy.dialects.postgresql import JSONB
+
+from dispatch.auth.models import DispatchUser
+from dispatch.case.models import Case
+from dispatch.case.priority import service as case_priority_service
+from dispatch.case.type import service as case_type_service
+from dispatch.case.type.models import CaseType
+from dispatch.database.service import apply_filter_specific_joins, apply_filters
+from dispatch.entity.models import Entity
+from dispatch.entity_type import service as entity_type_service
+from dispatch.entity_type.models import EntityType
+from dispatch.event import service as event_service
+from dispatch.individual import service as individual_service
+from dispatch.project import service as project_service
+from dispatch.service import service as service_service
+from dispatch.tag import service as tag_service
+from dispatch.workflow import service as workflow_service
+
+from .exceptions import (
+ SignalNotDefinedException,
+ SignalNotIdentifiedException,
+)
+from .models import (
+ assoc_signal_instance_entities,
+ Signal,
+ SignalCreate,
+ SignalEngagement,
+ SignalEngagementCreate,
+ SignalEngagementRead,
+ SignalEngagementUpdate,
+ SignalFilter,
+ SignalFilterAction,
+ SignalFilterCreate,
+ SignalFilterMode,
+ SignalFilterRead,
+ SignalFilterUpdate,
+ SignalInstance,
+ SignalInstanceCreate,
+ SignalStats,
+ SignalUpdate,
+ assoc_signal_entity_types,
+)
+
+log = logging.getLogger(__name__)
+
+
+def get_signal_engagement(
+ *, db_session: Session, signal_engagement_id: int
+) -> SignalEngagement | None:
+ """Gets a signal engagement by id."""
+ return (
+ db_session.query(SignalEngagement)
+ .filter(SignalEngagement.id == signal_engagement_id)
+ .one_or_none()
+ )
+
+
+def get_signal_engagement_by_name(
+ *, db_session, project_id: int, name: str
+) -> SignalEngagement | None:
+ """Gets a signal engagement by its name."""
+ return (
+ db_session.query(SignalEngagement)
+ .filter(SignalEngagement.project_id == project_id)
+ .filter(SignalEngagement.name == name)
+ .first()
+ )
+
+
+def get_signal_engagement_by_name_or_raise(
+ *, db_session: Session, project_id: int, signal_engagement_in: SignalEngagementRead
+) -> SignalEngagement:
+ """Gets a signal engagement by its name or raises an error if not found."""
+ signal_engagement = get_signal_engagement_by_name(
+ db_session=db_session, project_id=project_id, name=signal_engagement_in.name
+ )
+
+ if not signal_engagement:
+ raise ValidationError(
+ [
+ {
+ "msg": "Signal engagement not found.",
+ "loc": "signalEngagement",
+ }
+ ]
+ )
+ return signal_engagement
+
+
+def create_signal_engagement(
+ *, db_session: Session, creator: DispatchUser, signal_engagement_in: SignalEngagementCreate
+) -> SignalEngagement:
+ """Creates a new signal engagement."""
+ project = project_service.get_by_name_or_raise(
+ db_session=db_session, project_in=signal_engagement_in.project
+ )
+
+ entity_type = entity_type_service.get(
+ db_session=db_session, entity_type_id=signal_engagement_in.entity_type.id
+ )
+
+ signal_engagement = SignalEngagement(
+ name=signal_engagement_in.name,
+ description=signal_engagement_in.description,
+ message=signal_engagement_in.message,
+ require_mfa=signal_engagement_in.require_mfa,
+ entity_type=entity_type,
+ creator=creator,
+ project=project,
+ )
+ db_session.add(signal_engagement)
+ db_session.commit()
+ return signal_engagement
+
+
+def update_signal_engagement(
+ *,
+ db_session: Session,
+ signal_engagement: SignalEngagement,
+ signal_engagement_in: SignalEngagementUpdate,
+) -> SignalEngagement:
+ """Updates an existing signal engagement."""
+ signal_engagement_data = signal_engagement.dict()
+ update_data = signal_engagement_in.dict(
+ exclude_unset=True,
+ exclude={},
+ )
+
+ for field in signal_engagement_data:
+ if field in update_data:
+ setattr(signal_engagement, field, update_data[field])
+
+ db_session.add(signal_engagement)
+ db_session.commit()
+ return signal_engagement
+
+
+def get_all_by_entity_type(*, db_session: Session, entity_type_id: int) -> list[SignalInstance]:
+ """Fetches all signal instances associated with a given entity type."""
+ return (
+ db_session.query(SignalInstance)
+ .join(SignalInstance.signal)
+ .join(assoc_signal_entity_types)
+ .join(EntityType)
+ .filter(assoc_signal_entity_types.c.entity_type_id == entity_type_id)
+ .all()
+ )
+
+
+def create_signal_instance(*, db_session: Session, signal_instance_in: SignalInstanceCreate):
+ """Creates a new signal instance."""
+ project = project_service.get_by_name_or_default(
+ db_session=db_session, project_in=signal_instance_in.project
+ )
+
+ if not signal_instance_in.signal:
+ external_id = signal_instance_in.external_id
+
+        # an external id is required to resolve the signal definition
+ if not external_id:
+ msg = "A detection external id must be provided in order to get the signal definition."
+ raise SignalNotIdentifiedException(msg)
+
+ signal_definition = (
+ db_session.query(Signal).filter(Signal.external_id == external_id).one_or_none()
+ )
+
+ if not signal_definition:
+ # we get the default signal definition
+ signal_definition = get_default(
+ db_session=db_session,
+ project_id=project.id,
+ )
+ msg = f"Default signal definition used for signal instance with external id {external_id}"
+            log.warning(msg)
+
+ if not signal_definition:
+ msg = f"No signal definition could be found by external id {external_id}, and no default exists."
+ raise SignalNotDefinedException(msg)
+
+ signal_instance_in.signal = signal_definition
+
+ signal_instance = create_instance(db_session=db_session, signal_instance_in=signal_instance_in)
+ db_session.commit()
+ return signal_instance
+
+
+def create_signal_filter(
+ *, db_session: Session, creator: DispatchUser, signal_filter_in: SignalFilterCreate
+) -> SignalFilter:
+ """Creates a new signal filter."""
+ project = project_service.get_by_name_or_raise(
+ db_session=db_session, project_in=signal_filter_in.project
+ )
+
+ signal_filter = SignalFilter(
+ **signal_filter_in.dict(
+ exclude={
+ "project",
+ }
+ ),
+ creator=creator,
+ project=project,
+ )
+ db_session.add(signal_filter)
+ db_session.commit()
+ return signal_filter
+
+
+def update_signal_filter(
+ *, db_session: Session, signal_filter: SignalFilter, signal_filter_in: SignalFilterUpdate
+) -> SignalFilter:
+ """Updates an existing signal filter."""
+
+ signal_filter_data = signal_filter.dict()
+ update_data = signal_filter_in.dict(
+ exclude_unset=True,
+ exclude={},
+ )
+
+ for field in signal_filter_data:
+ if field in update_data:
+ setattr(signal_filter, field, update_data[field])
+
+ db_session.add(signal_filter)
+ db_session.commit()
+ return signal_filter
+
+
+def delete_signal_filter(*, db_session: Session, signal_filter_id: int) -> int:
+ """Deletes an existing signal filter."""
+ signal_filter = db_session.query(SignalFilter).filter(SignalFilter.id == signal_filter_id).one()
+ db_session.delete(signal_filter)
+ db_session.commit()
+ return signal_filter_id
+
+
+def get_signal_filter_by_name_or_raise(
+ *, db_session: Session, project_id: int, signal_filter_in: SignalFilterRead
+) -> SignalFilter:
+ signal_filter = get_signal_filter_by_name(
+ db_session=db_session, project_id=project_id, name=signal_filter_in.name
+ )
+
+ if not signal_filter:
+ raise ValidationError(
+ [
+ {
+ "msg": "Signal Filter not found.",
+ "loc": "signalFilter",
+ }
+ ]
+ )
+ return signal_filter
+
+
+def get_signal_filter_by_name(*, db_session, project_id: int, name: str) -> SignalFilter | None:
+ """Gets a signal filter by its name."""
+ return (
+ db_session.query(SignalFilter)
+ .filter(SignalFilter.project_id == project_id)
+ .filter(SignalFilter.name == name)
+ .first()
+ )
+
+
+def get_signal_filter(*, db_session: Session, signal_filter_id: int) -> SignalFilter:
+ """Gets a single signal filter."""
+ return db_session.query(SignalFilter).filter(SignalFilter.id == signal_filter_id).one_or_none()
+
+
+def get_signal_instance(
+ *, db_session: Session, signal_instance_id: int | str
+) -> SignalInstance | None:
+ """Gets a signal instance by its UUID."""
+ return (
+ db_session.query(SignalInstance)
+ .filter(SignalInstance.id == signal_instance_id)
+ .one_or_none()
+ )
+
+
+def get(*, db_session: Session, signal_id: str | int) -> Signal | None:
+ """Gets a signal by id."""
+ return db_session.query(Signal).filter(Signal.id == signal_id).one_or_none()
+
+
+def get_default(*, db_session: Session, project_id: int) -> Signal | None:
+ """Gets the default signal definition."""
+ return (
+ db_session.query(Signal)
+ .filter(Signal.project_id == project_id, Signal.default == true())
+ .one_or_none()
+ )
+
+
+def get_by_primary_or_external_id(*, db_session: Session, signal_id: str | int) -> Signal | None:
+ """Gets a signal by id or external_id."""
+ if is_valid_uuid(signal_id):
+ signal = db_session.query(Signal).filter(Signal.external_id == signal_id).one_or_none()
+ else:
+ signal = (
+ db_session.query(Signal)
+ .filter(or_(Signal.id == signal_id, Signal.external_id == signal_id))
+ .one_or_none()
+ )
+ return signal
+
+
+def get_by_variant_or_external_id(
+    *, db_session: Session, project_id: int, external_id: str | None = None, variant: str | None = None
+) -> Signal | None:
+ """Gets a signal by its variant or external id."""
+ if variant:
+ return (
+ db_session.query(Signal)
+ .filter(Signal.project_id == project_id, Signal.variant == variant)
+ .one_or_none()
+ )
+ return (
+ db_session.query(Signal)
+ .filter(Signal.project_id == project_id, Signal.external_id == external_id)
+ .one_or_none()
+ )
+
+
+def get_all_by_conversation_target(
+ *, db_session: Session, project_id: int, conversation_target: str
+) -> list[Signal]:
+ """Gets all signals for a given conversation target (e.g. #conversation-channel)"""
+ return (
+ db_session.query(Signal)
+ .join(CaseType)
+ .filter(
+ CaseType.project_id == project_id,
+ CaseType.conversation_target == conversation_target,
+ Signal.case_type_id == CaseType.id,
+ )
+ .all()
+ )
+
+
+excluded_attributes = {
+ "case_priority",
+ "case_type",
+ "engagements",
+ "entity_types",
+ "filters",
+ "oncall_service",
+ "project",
+ "source",
+ "tags",
+ "workflows",
+}
+
+
+def create(
+ *, db_session: Session, signal_in: SignalCreate, user: DispatchUser | None = None
+) -> Signal:
+ """Creates a new signal."""
+ project = project_service.get_by_name_or_raise(
+ db_session=db_session, project_in=signal_in.project
+ )
+
+ updates = defaultdict(list)
+
+ signal = Signal(
+ **signal_in.dict(exclude=excluded_attributes),
+ project=project,
+ )
+
+ for field in signal_in.dict(exclude=excluded_attributes):
+ attr = getattr(signal, field)
+ if attr and not isinstance(attr, datetime):
+ updates[field] = attr
+
+ tags = []
+ for t in signal_in.tags:
+ tag = tag_service.get_or_create(db_session=db_session, tag_in=t)
+ tags.append(tag)
+ updates["tags"].append(f"{tag.tag_type.name}/{tag.name}")
+ signal.tags = tags
+
+ entity_types = []
+ for e in signal_in.entity_types:
+ entity_type = entity_type_service.get(db_session=db_session, entity_type_id=e.id)
+ entity_types.append(entity_type)
+ updates["entity_types"].append(entity_type.name)
+ signal.entity_types = entity_types
+
+ engagements = []
+ for signal_engagement_in in signal_in.engagements:
+ signal_engagement = get_signal_engagement_by_name(
+ db_session=db_session, project_id=project.id, name=signal_engagement_in.name
+ )
+ engagements.append(signal_engagement)
+ updates["engagements"].append(signal_engagement.name)
+ signal.engagements = engagements
+
+ filters = []
+ for f in signal_in.filters:
+ signal_filter = get_signal_filter_by_name(
+ db_session=db_session, project_id=project.id, name=f.name
+ )
+ filters.append(signal_filter)
+ updates["filters"].append(signal_filter.name)
+ signal.filters = filters
+
+ workflows = []
+ for w in signal_in.workflows:
+ workflow = workflow_service.get_by_name_or_raise(db_session=db_session, workflow_in=w)
+ workflows.append(workflow)
+ updates["workflows"].append(workflow.name)
+ signal.workflows = workflows
+
+ if signal_in.case_priority:
+ case_priority = case_priority_service.get_by_name_or_default(
+ db_session=db_session, project_id=project.id, case_priority_in=signal_in.case_priority
+ )
+ signal.case_priority = case_priority
+ updates["case_priority"] = case_priority.name
+
+ if signal_in.oncall_service:
+ oncall_service = service_service.get(
+ db_session=db_session, service_id=signal_in.oncall_service.id
+ )
+ signal.oncall_service = oncall_service
+ updates["oncall_service"] = oncall_service.name
+
+ if signal_in.case_type:
+ case_type = case_type_service.get_by_name_or_default(
+ db_session=db_session, project_id=project.id, case_type_in=signal_in.case_type
+ )
+ signal.case_type = case_type
+ updates["case_type"] = case_type.name
+
+ db_session.add(signal)
+ db_session.commit()
+
+ if user:
+ individual = individual_service.get_by_email_and_project(
+ db_session=db_session, email=user.email, project_id=signal.project.id
+ )
+ else:
+ individual = None
+
+ event_service.log_signal_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description="Signal created",
+ details=updates,
+ individual_id=individual.id if individual else None,
+ dispatch_user_id=user.id if user else None,
+ signal_id=signal.id,
+ owner=user.email if user else None,
+ pinned=True,
+ )
+ return signal
+
+
+def update(
+ *,
+ db_session: Session,
+ signal: Signal,
+ signal_in: SignalUpdate,
+ user: DispatchUser | None = None,
+ update_filters: bool = False,
+) -> Signal:
+ """Updates a signal."""
+ signal_data = signal.dict()
+ update_data = signal_in.dict(
+ exclude_unset=True,
+ exclude=excluded_attributes,
+ )
+
+ updates = defaultdict(list)
+
+ for field in signal_data:
+ if field in update_data:
+ if signal_data[field] != update_data[field] and not isinstance(
+ signal_data[field], datetime
+ ):
+ updates[field] = f"{signal_data[field]} -> {update_data[field]}"
+ setattr(signal, field, update_data[field])
+
+ if signal_in.tags:
+ tags = []
+ for t in signal_in.tags:
+ tag = tag_service.get_or_create(db_session=db_session, tag_in=t)
+ if tag not in signal.tags:
+ updates["tags-added"].append(f"{tag.tag_type.name}/{tag.name}")
+ tags.append(tag)
+ for t in signal.tags:
+ if t not in tags:
+ updates["tags-removed"].append(f"{t.tag_type.name}/{t.name}")
+ signal.tags = tags
+
+ if signal_in.entity_types:
+ entity_types = []
+ for e in signal_in.entity_types:
+ entity_type = entity_type_service.get(db_session=db_session, entity_type_id=e.id)
+ if entity_type not in signal.entity_types:
+ updates["entity_types-added"].append(entity_type.name)
+ entity_types.append(entity_type)
+ for et in signal.entity_types:
+ if et not in entity_types:
+ updates["entity_types-removed"].append(et.name)
+ signal.entity_types = entity_types
+
+ if signal_in.engagements:
+ engagements = []
+ for signal_engagement_in in signal_in.engagements:
+ signal_engagement = get_signal_engagement_by_name_or_raise(
+ db_session=db_session,
+ project_id=signal.project.id,
+ signal_engagement_in=signal_engagement_in,
+ )
+ if signal_engagement not in signal.engagements:
+ updates["engagements-added"].append(signal_engagement.name)
+ engagements.append(signal_engagement)
+ for se in signal.engagements:
+ if se not in engagements:
+ updates["engagements-removed"].append(se.name)
+ signal.engagements = engagements
+
+    # if update_filters is set, use only the filters from signal_in; otherwise, keep the existing filters and add any new ones
+ filter_set = set() if update_filters else set(signal.filters)
+ for f in signal_in.filters:
+ signal_filter = get_signal_filter_by_name_or_raise(
+ db_session=db_session, project_id=signal.project.id, signal_filter_in=f
+ )
+ if signal_filter not in signal.filters:
+ updates["filters-added"].append(signal_filter.name)
+ filter_set.add(signal_filter)
+ elif update_filters:
+ filter_set.add(signal_filter)
+ for f in signal.filters:
+ if f not in filter_set:
+ updates["filters-removed"].append(f.name)
+ signal.filters = list(filter_set)
+
+ if signal_in.workflows:
+ workflows = []
+ for w in signal_in.workflows:
+ workflow = workflow_service.get_by_name_or_raise(db_session=db_session, workflow_in=w)
+ if workflow not in signal.workflows:
+ updates["workflows-added"].append(workflow.name)
+ workflows.append(workflow)
+ for w in signal.workflows:
+ if w not in workflows:
+ updates["workflows-removed"].append(w.name)
+ signal.workflows = workflows
+
+ if signal_in.oncall_service:
+ oncall_service = service_service.get(
+ db_session=db_session, service_id=signal_in.oncall_service.id
+ )
+ if signal.oncall_service != oncall_service:
+ from_service = signal.oncall_service.name if signal.oncall_service else "None"
+ to_service = oncall_service.name if oncall_service else "None"
+ updates["oncall_service"] = f"{from_service} -> {to_service}"
+ signal.oncall_service = oncall_service
+
+ if signal_in.case_priority:
+ case_priority = case_priority_service.get_by_name_or_default(
+ db_session=db_session,
+ project_id=signal.project.id,
+ case_priority_in=signal_in.case_priority,
+ )
+ if signal.case_priority != case_priority:
+ from_case_priority = signal.case_priority.name if signal.case_priority else "None"
+ to_case_priority = case_priority.name if case_priority else "None"
+ updates["case_priority"] = f"{from_case_priority} -> {to_case_priority}"
+ signal.case_priority = case_priority
+
+ if signal_in.case_type:
+ case_type = case_type_service.get_by_name_or_default(
+ db_session=db_session, project_id=signal.project.id, case_type_in=signal_in.case_type
+ )
+ if signal.case_type != case_type:
+ from_case_type = signal.case_type.name if signal.case_type else "None"
+ to_case_type = case_type.name if case_type else "None"
+ updates["case_type"] = f"{from_case_type} -> {to_case_type}"
+ signal.case_type = case_type
+
+ db_session.commit()
+
+ # only log if something changed
+ if updates:
+ individual = (
+ individual_service.get_by_email_and_project(
+ db_session=db_session, email=user.email, project_id=signal.project.id
+ )
+ if user
+ else None
+ )
+
+ event_service.log_signal_event(
+ db_session=db_session,
+ source="Dispatch Core App",
+ description="Signal updated",
+ details=updates,
+ individual_id=individual.id if individual else None,
+ dispatch_user_id=user.id if user else None,
+ signal_id=signal.id,
+ owner=user.email if user else None,
+ pinned=True,
+ )
+
+ return signal
+
+
+def delete(*, db_session: Session, signal_id: int):
+ """Deletes a signal definition."""
+ signal = db_session.query(Signal).filter(Signal.id == signal_id).one()
+ db_session.delete(signal)
+ db_session.commit()
+ return signal_id
+
+
+def is_valid_uuid(value) -> bool:
+ """
+ Checks if the provided value is a valid UUID.
+
+ Args:
+        value: The value to be checked.
+
+ Returns:
+ bool: True if the value is a valid UUID, False otherwise.
+ """
+ try:
+ uuid.UUID(str(value), version=4)
+ return True
+ except ValueError:
+ return False
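
A standalone copy of `is_valid_uuid` shows its behavior at the edges. Note that `uuid.UUID(str(value), version=4)` normalizes the version bits rather than validating them, so any well-formed UUID string passes, not only version 4:

```python
import uuid


def is_valid_uuid(value) -> bool:
    """Same logic as the service helper: parseable as a UUID string or not."""
    try:
        uuid.UUID(str(value), version=4)
        return True
    except ValueError:
        return False


print(is_valid_uuid(uuid.uuid4()))  # True
print(is_valid_uuid("not-a-uuid"))  # False
print(is_valid_uuid(None))          # False: str(None) == "None" fails to parse
print(is_valid_uuid(uuid.uuid1()))  # True: version bits are normalized, not checked
```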
+
+
+def create_instance(
+ *, db_session: Session, signal_instance_in: SignalInstanceCreate
+) -> SignalInstance:
+ """Creates a new signal instance."""
+ project = project_service.get_by_name_or_raise(
+ db_session=db_session, project_in=signal_instance_in.project
+ )
+
+ signal = get(db_session=db_session, signal_id=signal_instance_in.signal.id)
+
+ # remove non-serializable entities from the raw JSON:
+ signal_instance_in_raw = signal_instance_in.raw.copy()
+ if signal_instance_in.oncall_service:
+ signal_instance_in_raw.pop("oncall_service")
+
+ # we round trip the raw data to json-ify date strings
+ signal_instance = SignalInstance(
+ **signal_instance_in.dict(
+ exclude={
+ "case",
+ "case_priority",
+ "case_type",
+ "entities",
+ "external_id",
+ "oncall_service",
+ "project",
+ "raw",
+ "signal",
+ }
+ ),
+ raw=json.loads(json.dumps(signal_instance_in_raw)),
+ project=project,
+ signal=signal,
+ )
+
+    # if the signal has an existing uuid, we propagate it as our primary key
+ if signal_instance_in.raw:
+ if signal_instance_in.raw.get("id"):
+ signal_instance.id = signal_instance_in.raw["id"]
+
+    if signal_instance.id and not is_valid_uuid(signal_instance.id):
+        name = signal_instance.raw.get("name") or signal_instance.raw.get("variant")
+        msg = (
+            "Invalid signal id format. Expecting UUIDv4 format. "
+            f"Signal id: {signal_instance.id}. Signal name/variant: {name}"
+        )
+        log.error(msg)
+ raise HTTPException(
+ status_code=status.HTTP_400_BAD_REQUEST,
+ detail=[{"msg": msg}],
+ ) from None
+
+ if signal_instance_in.case_priority:
+ case_priority = case_priority_service.get_by_name_or_default(
+ db_session=db_session,
+ project_id=project.id,
+ case_priority_in=signal_instance_in.case_priority,
+ )
+ signal_instance.case_priority = case_priority
+
+ if signal_instance_in.case_type:
+ case_type = case_type_service.get_by_name_or_default(
+ db_session=db_session,
+ project_id=project.id,
+ case_type_in=signal_instance_in.case_type,
+ )
+ signal_instance.case_type = case_type
+
+ if signal_instance_in.oncall_service:
+ oncall_service = service_service.get_by_name(
+ db_session=db_session,
+ project_id=project.id,
+ name=signal_instance_in.oncall_service.name,
+ )
+ signal_instance.oncall_service = oncall_service
+
+ db_session.add(signal_instance)
+ return signal_instance
+
+
+def update_instance(
+ *, db_session: Session, signal_instance_in: SignalInstanceCreate
+) -> SignalInstance:
+ """Updates an existing signal instance."""
+    if not signal_instance_in.raw or not signal_instance_in.raw.get("id"):
+        msg = "The raw signal data must contain an id to update an existing signal instance."
+        raise SignalNotIdentifiedException(msg)
+
+    signal_instance = get_signal_instance(
+        db_session=db_session, signal_instance_id=signal_instance_in.raw["id"]
+    )
+    signal_instance.raw = json.loads(json.dumps(signal_instance_in.raw))
+
+ db_session.commit()
+ return signal_instance
+
+
+def filter_snooze(*, db_session: Session, signal_instance: SignalInstance) -> SignalInstance:
+ """
+ Apply snooze filter actions to the signal instance.
+
+ Args:
+ db_session (Session): Database session.
+ signal_instance (SignalInstance): Signal instance to be filtered.
+
+ Returns:
+ SignalInstance: The filtered signal instance.
+ """
+ for f in signal_instance.signal.filters:
+ if not f.mode:
+ log.warning(f"Signal filter {f.name} has no mode")
+ continue
+
+ if f.mode != SignalFilterMode.active:
+ continue
+
+ if not f.action:
+ log.warning(f"Signal filter {f.name} has no action")
+ continue
+
+ if f.action != SignalFilterAction.snooze:
+ continue
+
+ if not f.expiration:
+ log.warning(f"Signal filter {f.name} has no expiration date")
+ continue
+
+ if f.expiration.replace(tzinfo=timezone.utc) <= datetime.now(timezone.utc):
+ continue
+
+ query = db_session.query(SignalInstance).filter(
+ SignalInstance.signal_id == signal_instance.signal_id
+ )
+ query = apply_filter_specific_joins(SignalInstance, f.expression, query)
+ query = apply_filters(query, f.expression)
+ # an expression is not required for snoozing, if absent we snooze regardless of entity
+ if f.expression:
+ instances = query.filter(SignalInstance.id == signal_instance.id).all()
+
+ if instances:
+ signal_instance.filter_action = SignalFilterAction.snooze
+ break
+ else:
+ signal_instance.filter_action = SignalFilterAction.snooze
+ break
+
+ return signal_instance
+
+
+def filter_dedup(*, db_session: Session, signal_instance: SignalInstance) -> SignalInstance:
+ """
+ Apply deduplication filter actions to the signal instance.
+
+ Args:
+ db_session (Session): Database session.
+ signal_instance (SignalInstance): Signal instance to be filtered.
+
+ Returns:
+ SignalInstance: The filtered signal instance.
+ """
+ # Skip deduplication on canary signals
+ if signal_instance.canary:
+ return signal_instance
+
+ if not signal_instance.signal.filters:
+ default_dedup_window = datetime.now(timezone.utc) - timedelta(hours=1)
+ instance = (
+ db_session.query(SignalInstance)
+ .filter(
+ SignalInstance.signal_id == signal_instance.signal_id,
+ SignalInstance.created_at >= default_dedup_window,
+ SignalInstance.id != signal_instance.id,
+ SignalInstance.case_id.isnot(None), # noqa
+ ~SignalInstance.canary, # Ignore canary signals in deduplication
+ )
+ .with_entities(SignalInstance.case_id)
+ .order_by(desc(SignalInstance.created_at))
+ .first()
+ )
+
+ if instance:
+ signal_instance.case_id = instance.case_id
+ signal_instance.filter_action = SignalFilterAction.deduplicate
+ return signal_instance
+
+ for f in signal_instance.signal.filters:
+ if f.mode != SignalFilterMode.active:
+ continue
+
+ if f.action != SignalFilterAction.deduplicate:
+ continue
+
+ query = db_session.query(SignalInstance).filter(
+ SignalInstance.signal_id == signal_instance.signal_id,
+ ~SignalInstance.canary, # Ignore canary signals in deduplication
+ )
+ # First join entities
+ query = query.join(SignalInstance.entities)
+
+ # Then join entity_type through entities
+ query = query.join(Entity.entity_type)
+
+ # Now apply filters
+ query = apply_filters(query, f.expression)
+
+        # the filter window is stored in seconds (see the SignalFilter model)
+        window = datetime.now(timezone.utc) - timedelta(seconds=f.window)
+        query = query.filter(SignalInstance.created_at >= window)
+        # entities are already joined above, so filter on them directly
+        query = query.filter(Entity.id.in_([e.id for e in signal_instance.entities]))
+ query = query.filter(SignalInstance.id != signal_instance.id)
+
+ # get the earliest instance
+ query = query.order_by(asc(SignalInstance.created_at))
+ instances = query.all()
+
+ if instances:
+ # associate with existing case
+ signal_instance.case_id = instances[0].case_id
+ signal_instance.filter_action = SignalFilterAction.deduplicate
+ break
+
+ return signal_instance
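
The lookback cutoff above is plain datetime arithmetic: an earlier instance of the same signal is a duplicate candidate only if it was created after `now - window`. A sketch with the model's one-hour column default (the window is stored in seconds):

```python
from datetime import datetime, timedelta, timezone

window_seconds = 60 * 60  # default deduplication lookback: 1 hour
cutoff = datetime.now(timezone.utc) - timedelta(seconds=window_seconds)

created_at = datetime.now(timezone.utc) - timedelta(minutes=30)
print(created_at >= cutoff)  # True: within the window, candidate duplicate

created_at = datetime.now(timezone.utc) - timedelta(hours=2)
print(created_at >= cutoff)  # False: outside the window, not deduplicated
```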
+
+
+def filter_signal(*, db_session: Session, signal_instance: SignalInstance) -> bool:
+ """
+ Apply filter actions to the signal instance.
+
+ The function first checks if the signal instance is snoozed. If not snoozed,
+ it checks for a deduplication rule set on the signal instance. If no
+ deduplication rule is set, a default deduplication rule is applied,
+ grouping all signal instances together for a 1-hour window, regardless of
+ the entities in the signal instance.
+
+ Args:
+ db_session (Session): Database session.
+ signal_instance (SignalInstance): Signal instance to be filtered.
+
+ Returns:
+ bool: True if the signal instance is filtered, False otherwise.
+ """
+ filtered = False
+
+ signal_instance = filter_snooze(db_session=db_session, signal_instance=signal_instance)
+
+ # we only dedupe if we haven't been snoozed
+ if not signal_instance.filter_action:
+ signal_instance = filter_dedup(db_session=db_session, signal_instance=signal_instance)
+
+ if not signal_instance.filter_action:
+ signal_instance.filter_action = SignalFilterAction.none
+ else:
+ filtered = True
+
+ db_session.commit()
+ return filtered
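
The precedence in the docstring (snooze wins; dedup is consulted only when the instance was not snoozed; otherwise the action is `none`) can be condensed into a tiny decision function; `resolve_filter_action` is illustrative only and not part of the service:

```python
from enum import Enum


class Action(str, Enum):
    none = "none"
    snooze = "snooze"
    deduplicate = "deduplicate"


def resolve_filter_action(snoozed: bool, duplicate: bool) -> Action:
    """Simplified decision order from filter_signal: snooze takes precedence,
    dedup only applies when the instance was not snoozed."""
    if snoozed:
        return Action.snooze
    if duplicate:
        return Action.deduplicate
    return Action.none


print(resolve_filter_action(snoozed=True, duplicate=True).value)    # snooze
print(resolve_filter_action(snoozed=False, duplicate=True).value)   # deduplicate
print(resolve_filter_action(snoozed=False, duplicate=False).value)  # none
```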
+
+
+def get_unprocessed_signal_instance_ids(session: Session) -> list[int]:
+ """Retrieves IDs of unprocessed signal instances from the database.
+
+ Args:
+ session (Session): The database session.
+
+ Returns:
+ list[int]: A list of signal instance IDs that need processing.
+ """
+ return (
+ session.query(SignalInstance.id)
+        .filter(SignalInstance.filter_action.is_(None))
+        .filter(SignalInstance.case_id.is_(None))
+ .order_by(SignalInstance.created_at.asc())
+ .limit(500)
+ .all()
+ )
+
+
+def get_instances_in_case(db_session: Session, case_id: int) -> Query:
+ """
+ Retrieves signal instances associated with a given case.
+
+ Args:
+ db_session (Session): The database session.
+ case_id (int): The ID of the case.
+
+ Returns:
+ Query: A SQLAlchemy query object for the signal instances associated with the case.
+ """
+ return (
+ db_session.query(SignalInstance, Signal)
+ .join(Signal)
+ .with_entities(SignalInstance.id, Signal)
+ .filter(SignalInstance.case_id == case_id)
+ .order_by(SignalInstance.created_at)
+ )
+
+
+def get_cases_for_signal(db_session: Session, signal_id: int, limit: int = 10) -> Query:
+ """
+ Retrieves cases associated with a given signal.
+
+ Args:
+ db_session (Session): The database session.
+ signal_id (int): The ID of the signal.
+ limit (int, optional): The maximum number of cases to retrieve. Defaults to 10.
+
+ Returns:
+ Query: A SQLAlchemy query object for the cases associated with the signal.
+ """
+ return (
+ db_session.query(Case)
+ .join(SignalInstance)
+ .filter(SignalInstance.signal_id == signal_id)
+ .order_by(desc(Case.created_at))
+ .limit(limit)
+ )
+
+
+def get_cases_for_signal_by_resolution_reason(
+ db_session: Session, signal_id: int, resolution_reason: str, limit: int = 10
+) -> Query:
+ """
+ Retrieves cases associated with a given signal and resolution reason.
+
+ Args:
+ db_session (Session): The database session.
+ signal_id (int): The ID of the signal.
+ resolution_reason (str): The resolution reason to filter cases by.
+ limit (int, optional): The maximum number of cases to retrieve. Defaults to 10.
+
+ Returns:
+ Query: A SQLAlchemy query object for the cases associated with the signal and resolution reason.
+ """
+ return (
+ db_session.query(Case)
+ .join(SignalInstance)
+ .filter(SignalInstance.signal_id == signal_id)
+ .filter(Case.resolution_reason == resolution_reason)
+ .order_by(desc(Case.created_at))
+ .limit(limit)
+ )
+
+
+def get_signal_stats(
+ *,
+ db_session: Session,
+ entity_value: str,
+ entity_type_id: int,
+ signal_id: int | None = None,
+ num_days: int | None = None,
+) -> SignalStats | None:
+ """
+ Gets signal statistics for a given named entity and type.
+
+ If signal_id is provided, only returns stats for that specific signal definition.
+ """
+ entity_subquery = (
+ db_session.query(
+ func.jsonb_build_array(
+ func.jsonb_build_object(
+ "or",
+ func.jsonb_build_array(
+ func.jsonb_build_object(
+ "model", "Entity", "field", "id", "op", "==", "value", Entity.id
+ )
+ ),
+ )
+ )
+ )
+ .filter(and_(Entity.value == entity_value, Entity.entity_type_id == entity_type_id))
+        .scalar_subquery()
+ )
+
+ active_count = func.count().filter(SignalFilter.expiration > func.current_date())
+ expired_count = func.count().filter(SignalFilter.expiration <= func.current_date())
+
+ query = db_session.query(
+ active_count.label("active_count"), expired_count.label("expired_count")
+ ).filter(cast(SignalFilter.expression, JSONB).op("@>")(entity_subquery))
+
+ snooze_result = db_session.execute(query).fetchone()
+
+ # Calculate the date threshold based on num_days
+ date_threshold = datetime.utcnow() - timedelta(days=num_days) if num_days is not None else None
+
+ count_with_snooze = func.count().filter(SignalInstance.filter_action == "snooze")
+ count_without_snooze = func.count().filter(
+ (SignalInstance.filter_action != "snooze") | (SignalInstance.filter_action.is_(None))
+ )
+
+ query = (
+ select(
+ count_with_snooze.label("count_with_snooze"),
+ count_without_snooze.label("count_without_snooze"),
+ )
+ .select_from(
+ assoc_signal_instance_entities.join(
+ Entity, assoc_signal_instance_entities.c.entity_id == Entity.id
+ ).join(
+ SignalInstance,
+ assoc_signal_instance_entities.c.signal_instance_id == SignalInstance.id,
+ )
+ )
+ .where(
+ and_(
+ Entity.value == entity_value,
+ Entity.entity_type_id == entity_type_id,
+ SignalInstance.signal_id == signal_id if signal_id else True,
+ SignalInstance.created_at >= date_threshold if date_threshold else True,
+ )
+ )
+ )
+
+ signal_result = db_session.execute(query).fetchone()
+
+ return SignalStats(
+ num_signal_instances_alerted=signal_result.count_without_snooze,
+ num_signal_instances_snoozed=signal_result.count_with_snooze,
+ num_snoozes_active=snooze_result.active_count,
+ num_snoozes_expired=snooze_result.expired_count,
+ )
diff --git a/src/dispatch/signal/views.py b/src/dispatch/signal/views.py
new file mode 100644
index 000000000000..bb7684fcf6d6
--- /dev/null
+++ b/src/dispatch/signal/views.py
@@ -0,0 +1,464 @@
+import logging
+
+from fastapi import (
+ APIRouter,
+ BackgroundTasks,
+ Depends,
+ HTTPException,
+ Query,
+ Request,
+ Response,
+ status,
+)
+from pydantic import ValidationError
+from sqlalchemy.exc import IntegrityError
+
+from dispatch.auth.permissions import PermissionsDependency, SensitiveProjectActionPermission
+from dispatch.auth.service import CurrentUser
+from dispatch.database.core import DbSession
+from dispatch.database.service import CommonParameters, search_filter_sort_paginate
+from dispatch.models import OrganizationSlug, PrimaryKey
+from dispatch.project import service as project_service
+from dispatch.rate_limiter import limiter
+from dispatch.signal import service as signal_service
+
+from .models import (
+ SignalCreate,
+ SignalEngagementCreate,
+ SignalEngagementPagination,
+ SignalEngagementRead,
+ SignalEngagementUpdate,
+ SignalFilterCreate,
+ SignalFilterPagination,
+ SignalFilterRead,
+ SignalFilterUpdate,
+ SignalInstanceCreate,
+ SignalInstancePagination,
+ SignalInstanceRead,
+ SignalPagination,
+ SignalRead,
+ SignalStats,
+ SignalUpdate,
+)
+from .service import (
+ create,
+ create_signal_engagement,
+ create_signal_filter,
+ delete,
+ delete_signal_filter,
+ get,
+ get_by_primary_or_external_id,
+ get_signal_stats,
+ get_signal_engagement,
+ get_signal_filter,
+ update,
+ update_signal_engagement,
+ update_signal_filter,
+)
+
+router = APIRouter()
+
+log = logging.getLogger(__name__)
+
+
+@router.get("/instances", response_model=SignalInstancePagination)
+def get_signal_instances(common: CommonParameters):
+ """Gets all signal instances."""
+ return search_filter_sort_paginate(model="SignalInstance", **common)
+
+
+@router.post("/instances", response_model=SignalInstanceRead)
+@limiter.limit("1000/minute")
+def create_signal_instance(
+ db_session: DbSession,
+ organization: OrganizationSlug,
+ signal_instance_in: SignalInstanceCreate,
+ background_tasks: BackgroundTasks,
+ request: Request,
+ response: Response,
+):
+ """Creates a new signal instance."""
+ # TODO this should be refactored to use the signal service
+ project = project_service.get_by_name_or_default(
+ db_session=db_session, project_in=signal_instance_in.project
+ )
+
+ if not signal_instance_in.signal:
+ # we try to get the signal definition by external id or variant
+ external_id = signal_instance_in.raw.get("externalId")
+ variant = signal_instance_in.raw.get("variant")
+ signal_definition = signal_service.get_by_variant_or_external_id(
+ db_session=db_session,
+ project_id=project.id,
+ external_id=external_id,
+ variant=variant,
+ )
+
+ if not signal_definition:
+ # we get the default signal definition
+ signal_definition = signal_service.get_default(
+ db_session=db_session,
+ project_id=project.id,
+ )
+ msg = f"Default signal definition used for signal instance with external id {external_id} or variant {variant}."
+            log.warning(msg)
+
+ if not signal_definition:
+ msg = f"No signal definition could be found by external id {external_id} or variant {variant}, and no default exists."
+        log.warning(msg)
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": msg}],
+ ) from None
+
+ signal_instance_in.signal = signal_definition
+
+ try:
+ signal_instance = signal_service.create_instance(
+ db_session=db_session, signal_instance_in=signal_instance_in
+ )
+ db_session.commit()
+ except IntegrityError:
+ msg = f"A signal instance with this id already exists. Id: {signal_instance_in.raw.get('id')}. Variant: {signal_instance_in.raw.get('variant')}"
+        log.warning(msg)
+ raise HTTPException(
+ status_code=status.HTTP_409_CONFLICT,
+ detail=[{"msg": msg}],
+ ) from None
+
+ return signal_instance
+
+
+@router.get("/filters", response_model=SignalFilterPagination)
+def get_signal_filters(common: CommonParameters):
+ """Gets all signal filters."""
+ return search_filter_sort_paginate(model="SignalFilter", **common)
+
+
+@router.get("/engagements", response_model=SignalEngagementPagination)
+def get_signal_engagements(common: CommonParameters):
+ """Gets all signal engagements."""
+ return search_filter_sort_paginate(model="SignalEngagement", **common)
+
+
+@router.get("/engagements/{engagement_id}", response_model=SignalEngagementRead)
+def get_engagement(
+ db_session: DbSession,
+ signal_engagement_id: PrimaryKey,
+):
+ """Gets a signal engagement by its id."""
+ engagement = get_signal_engagement(
+ db_session=db_session, signal_engagement_id=signal_engagement_id
+ )
+ if not engagement:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A signal engagement with this id does not exist."}],
+ )
+ return engagement
+
+
+@router.post("/engagements", response_model=SignalEngagementRead)
+def create_engagement(
+ db_session: DbSession,
+ signal_engagement_in: SignalEngagementCreate,
+ current_user: CurrentUser,
+):
+ """Creates a new signal engagement."""
+ try:
+ return create_signal_engagement(
+ db_session=db_session, creator=current_user, signal_engagement_in=signal_engagement_in
+ )
+ except IntegrityError:
+ raise HTTPException(
+ status_code=status.HTTP_409_CONFLICT,
+ detail=[{"msg": "A signal engagement with this name already exists."}],
+ ) from None
+
+
+@router.put(
+ "/engagements/{signal_engagement_id}",
+ response_model=SignalEngagementRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def update_engagement(
+ db_session: DbSession,
+ signal_engagement_id: PrimaryKey,
+ signal_engagement_in: SignalEngagementUpdate,
+):
+ """Updates an existing signal engagement."""
+ signal_engagement = get_signal_engagement(
+ db_session=db_session, signal_engagement_id=signal_engagement_id
+ )
+ if not signal_engagement:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A signal engagement with this id does not exist."}],
+ )
+
+ try:
+ signal_engagement = update_signal_engagement(
+ db_session=db_session,
+ signal_engagement=signal_engagement,
+ signal_engagement_in=signal_engagement_in,
+ )
+ except IntegrityError:
+ raise HTTPException(
+ status_code=status.HTTP_409_CONFLICT,
+ detail=[{"msg": "A signal engagement with this name already exists."}],
+ ) from None
+
+ return signal_engagement
+
+
+@router.post("/filters", response_model=SignalFilterRead)
+def create_filter(
+ db_session: DbSession,
+ signal_filter_in: SignalFilterCreate,
+ current_user: CurrentUser,
+):
+ """Creates a new signal filter."""
+ try:
+ return create_signal_filter(
+ db_session=db_session, creator=current_user, signal_filter_in=signal_filter_in
+ )
+ except IntegrityError:
+ raise HTTPException(
+ status_code=status.HTTP_409_CONFLICT,
+ detail=[{"msg": "A signal filter with this name already exists."}],
+ ) from None
+
+
+@router.put(
+ "/filters/{signal_filter_id}",
+ response_model=SignalFilterRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def update_filter(
+ db_session: DbSession,
+ signal_filter_id: PrimaryKey,
+ signal_filter_in: SignalFilterUpdate,
+):
+ """Updates an existing signal filter."""
+ signal_filter = get_signal_filter(db_session=db_session, signal_filter_id=signal_filter_id)
+ if not signal_filter:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A signal filter with this id does not exist."}],
+ )
+
+ try:
+ signal_filter = update_signal_filter(
+ db_session=db_session, signal_filter=signal_filter, signal_filter_in=signal_filter_in
+ )
+ except IntegrityError:
+ raise HTTPException(
+ status_code=status.HTTP_409_CONFLICT,
+ detail=[{"msg": "A signal filter with this name already exists."}],
+ ) from None
+
+ return signal_filter
+
+
+@router.delete(
+ "/filters/{signal_filter_id}",
+ response_model=None,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def delete_filter(db_session: DbSession, signal_filter_id: PrimaryKey):
+ """Deletes a signal filter."""
+    signal_filter = get_signal_filter(db_session=db_session, signal_filter_id=signal_filter_id)
+ if not signal_filter:
+ raise HTTPException(
+ status_code=status.HTTP_404_NOT_FOUND,
+ detail=[{"msg": "A signal filter with this id does not exist."}],
+ )
+ delete_signal_filter(db_session=db_session, signal_filter_id=signal_filter_id)
+
+
+@router.get("", response_model=SignalPagination)
+def get_signals(common: CommonParameters):
+ """Gets all signal definitions."""
+ return search_filter_sort_paginate(model="Signal", **common)
+
+
+@router.get("/stats", response_model=SignalStats)
+def return_signal_stats(
+ db_session: DbSession,
+ entity_value: str = Query(..., description="The name of the entity"),
+ entity_type_id: int = Query(..., description="The ID of the entity type"),
+ num_days: int = Query(None, description="The number of days to look back"),
+):
+ """Gets signal statistics given a named entity and entity type id."""
+ signal_data = get_signal_stats(
+ db_session=db_session,
+ entity_value=entity_value,
+ entity_type_id=entity_type_id,
+ num_days=num_days,
+ )
+ return signal_data
+
+
+@router.get("/{signal_id}/stats", response_model=SignalStats)
+def return_single_signal_stats(
+ db_session: DbSession,
+ signal_id: str | PrimaryKey,
+ entity_value: str = Query(..., description="The name of the entity"),
+ entity_type_id: int = Query(..., description="The ID of the entity type"),
+ num_days: int = Query(None, description="The number of days to look back"),
+):
+ """Gets signal statistics for a specific signal given a named entity and entity type id."""
+ signal = get_by_primary_or_external_id(db_session=db_session, signal_id=signal_id)
+ if not signal:
+ raise ValidationError.from_exception_data(
+ "SignalRead",
+ [
+ {
+ "type": "value_error",
+ "loc": ("signal",),
+ "input": signal_id,
+ "ctx": {"error": ValueError("Signal not found.")},
+ }
+ ],
+ )
+
+ signal_data = get_signal_stats(
+ db_session=db_session,
+ entity_value=entity_value,
+ entity_type_id=entity_type_id,
+ signal_id=signal.id,
+ num_days=num_days,
+ )
+ return signal_data
+
+
+@router.get("/{signal_id}", response_model=SignalRead)
+def get_signal(db_session: DbSession, signal_id: str | PrimaryKey):
+ """Gets a signal by its id."""
+ signal = get_by_primary_or_external_id(db_session=db_session, signal_id=signal_id)
+ if not signal:
+ raise ValidationError.from_exception_data(
+ "SignalRead",
+ [
+ {
+ "type": "value_error",
+ "loc": ("signal",),
+ "input": signal_id,
+ "ctx": {"error": ValueError("Signal not found.")},
+ }
+ ],
+ )
+ return signal
+
+
+@router.post("", response_model=SignalRead)
+def create_signal(db_session: DbSession, signal_in: SignalCreate, current_user: CurrentUser):
+ """Creates a new signal."""
+ return create(db_session=db_session, signal_in=signal_in, user=current_user)
+
+
+def _update_signal(
+ db_session: DbSession,
+ signal_id: str | PrimaryKey,
+ signal_in: SignalUpdate,
+ current_user: CurrentUser,
+ update_filters: bool = False,
+):
+ signal = get_by_primary_or_external_id(db_session=db_session, signal_id=signal_id)
+ if not signal:
+ raise ValidationError.from_exception_data(
+ "SignalRead",
+ [
+ {
+ "type": "value_error",
+ "loc": ("signal",),
+ "input": signal_id,
+ "ctx": {"error": ValueError("Signal not found.")},
+ }
+ ],
+ )
+
+ try:
+ signal = update(
+ db_session=db_session,
+ signal=signal,
+ signal_in=signal_in,
+ user=current_user,
+ update_filters=update_filters,
+ )
+ except IntegrityError:
+        raise ValidationError.from_exception_data(
+            "SignalUpdate",
+            [
+                {
+                    "type": "value_error",
+                    "loc": ("name",),
+                    "input": signal_in.name,
+                    "ctx": {"error": ValueError("A signal with this name already exists.")},
+                }
+            ],
+        ) from None
+
+ return signal
+
+
+@router.put(
+ "/{signal_id}",
+ response_model=SignalRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def update_signal(
+ db_session: DbSession,
+ signal_id: str | PrimaryKey,
+ signal_in: SignalUpdate,
+ current_user: CurrentUser,
+):
+ """Updates an existing signal from API, no filters are updated."""
+ return _update_signal(
+ db_session=db_session,
+ signal_id=signal_id,
+ signal_in=signal_in,
+ current_user=current_user,
+ update_filters=False,
+ )
+
+
+@router.put(
+ "/update/{signal_id}",
+ response_model=SignalRead,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def update_signal_with_filters(
+ db_session: DbSession,
+ signal_id: str | PrimaryKey,
+ signal_in: SignalUpdate,
+ current_user: CurrentUser,
+):
+ """Updates an existing signal from the UI, also updates filters."""
+ return _update_signal(
+ db_session=db_session,
+ signal_id=signal_id,
+ signal_in=signal_in,
+ current_user=current_user,
+ update_filters=True,
+ )
+
+
+@router.delete(
+ "/{signal_id}",
+ response_model=None,
+ dependencies=[Depends(PermissionsDependency([SensitiveProjectActionPermission]))],
+)
+def delete_signal(db_session: DbSession, signal_id: str | PrimaryKey):
+ """Deletes a signal."""
+ signal = get_by_primary_or_external_id(db_session=db_session, signal_id=signal_id)
+ if not signal:
+ raise ValidationError.from_exception_data(
+ "SignalRead",
+ [
+ {
+ "type": "value_error",
+ "loc": ("signal",),
+ "input": signal_id,
+ "ctx": {"error": ValueError("Signal not found.")},
+ }
+ ],
+ )
+ delete(db_session=db_session, signal_id=signal.id)
diff --git a/src/dispatch/static/dispatch/.env.example_vue_app b/src/dispatch/static/dispatch/.env.example_vue_app
deleted file mode 100644
index bb70ffc4021e..000000000000
--- a/src/dispatch/static/dispatch/.env.example_vue_app
+++ /dev/null
@@ -1,3 +0,0 @@
-# Authentication
-VUE_APP_DISPATCH_OPEN_ID_CONNECT_URL=""
-VUE_APP_DISPATCH_CLIENT_ID=""
diff --git a/src/dispatch/static/dispatch/.eslintrc.js b/src/dispatch/static/dispatch/.eslintrc.js
new file mode 100644
index 000000000000..70bd54f99fec
--- /dev/null
+++ b/src/dispatch/static/dispatch/.eslintrc.js
@@ -0,0 +1,60 @@
+module.exports = {
+ root: true,
+ plugins: ["eslint-plugin-local-rules", "@typescript-eslint"],
+ extends: [
+ "eslint:recommended",
+ "plugin:prettier/recommended",
+ "plugin:vue/vue3-strongly-recommended",
+ "plugin:vuetify/base",
+ ],
+ parserOptions: {
+ ecmaVersion: 2020,
+ parser: "@typescript-eslint/parser",
+ },
+ env: {
+ browser: true,
+ es2021: true,
+ node: true,
+ },
+ overrides: [
+ {
+ files: ["test/*"],
+ rules: {
+ "no-undef": "off",
+ },
+ },
+ ],
+ rules: {
+ "local-rules/icon-button-variant": "error",
+ // "local-rules/list-item-children": "error",
+ // "local-rules/vee-validate": "error",
+
+ // Conflicts with prettier
+ "vue/max-attributes-per-line": "off",
+ "vue/singleline-html-element-content-newline": "off",
+ "vue/html-self-closing": [
+ "warn",
+ {
+ html: {
+ void: "any",
+ },
+ },
+ ],
+ "vue/html-closing-bracket-newline": "off",
+ "vue/html-indent": "off",
+ "vue/script-indent": "off",
+
+ // Bad defaults
+ "vue/valid-v-slot": [
+ "error",
+ {
+ allowModifiers: true,
+ },
+ ],
+ "vue/multi-word-component-names": "off",
+ "vue/attribute-hyphenation": "off",
+ "vue/require-default-prop": "off",
+ "vue/require-explicit-emits": "off",
+ "vuetify/no-deprecated-components": "off",
+ },
+}
diff --git a/src/dispatch/static/dispatch/.npmrc b/src/dispatch/static/dispatch/.npmrc
new file mode 100644
index 000000000000..ff753f54526a
--- /dev/null
+++ b/src/dispatch/static/dispatch/.npmrc
@@ -0,0 +1,2 @@
+//registry.npmjs.org/:_authToken=${FORMKIT_ENTERPRISE_TOKEN}
+
diff --git a/src/dispatch/static/dispatch/babel.config.js b/src/dispatch/static/dispatch/babel.config.js
deleted file mode 100644
index 3ecebf1a5205..000000000000
--- a/src/dispatch/static/dispatch/babel.config.js
+++ /dev/null
@@ -1,3 +0,0 @@
-module.exports = {
- presets: ["@vue/app"]
-};
diff --git a/src/dispatch/static/dispatch/components.d.ts b/src/dispatch/static/dispatch/components.d.ts
new file mode 100644
index 000000000000..cd9ce5015d8e
--- /dev/null
+++ b/src/dispatch/static/dispatch/components.d.ts
@@ -0,0 +1,135 @@
+// generated by unplugin-vue-components
+// We suggest you to commit this file into source control
+// Read more: https://github.com/vuejs/core/pull/3399
+import '@vue/runtime-core'
+
+export {}
+
+declare module '@vue/runtime-core' {
+ export interface GlobalComponents {
+ AdminLayout: typeof import('./src/components/layouts/AdminLayout.vue')['default']
+ AnimatedNumber: typeof import("./src/components/AnimatedNumber.vue")["default"]
+ AppDrawer: typeof import('./src/components/AppDrawer.vue')['default']
+ AppToolbar: typeof import('./src/components/AppToolbar.vue')['default']
+ AutoComplete: typeof import('./src/components/AutoComplete.vue')['default']
+ Avatar: typeof import("./src/components/Avatar.vue")["default"]
+ BaseCombobox: typeof import('./src/components/BaseCombobox.vue')['default']
+ BasicLayout: typeof import('./src/components/layouts/BasicLayout.vue')['default']
+ ColorPickerInput: typeof import('./src/components/ColorPickerInput.vue')['default']
+ DashboardLayout: typeof import('./src/components/layouts/DashboardLayout.vue')['default']
+ DateChipGroupRelative: typeof import('./src/components/DateChipGroupRelative.vue')['default']
+ DateTimePicker: typeof import('./src/components/DateTimePicker.vue')['default']
+ DateTimePickerMenu: typeof import('./src/components/DateTimePickerMenu.vue')['default']
+ DateWindowInput: typeof import('./src/components/DateWindowInput.vue')['default']
+ DefaultLayout: typeof import('./src/components/layouts/DefaultLayout.vue')['default']
+ DMenu: typeof import('./src/components/DMenu.vue')['default']
+ DTooltip: typeof import('./src/components/DTooltip.vue')['default']
+ GenaiAnalysisDisplay: typeof import('./src/components/GenaiAnalysisDisplay.vue')['default']
+ IconPickerInput: typeof import('./src/components/IconPickerInput.vue')['default']
+ InfoWidget: typeof import('./src/components/InfoWidget.vue')['default']
+ Loading: typeof import('./src/components/Loading.vue')['default']
+ LockButton: typeof import('./src/components/LockButton.vue')['default']
+ MonacoEditor: typeof import('./src/components/MonacoEditor.vue')['default']
+ NotificationSnackbarsWrapper: typeof import('./src/components/NotificationSnackbarsWrapper.vue')['default']
+ PageHeader: typeof import("./src/components/PageHeader.vue")["default"]
+ ParticipantAutoComplete: typeof import("./src/components/ParticipantAutoComplete.vue")["default"]
+ ParticipantSelect: typeof import('./src/components/ParticipantSelect.vue')['default']
+ PreciseDateTimePicker: typeof import('./src/components/PreciseDateTimePicker.vue')['default']
+ ProjectAutoComplete: typeof import("./src/components/ProjectAutoComplete.vue")["default"]
+ Refresh: typeof import('./src/components/Refresh.vue')['default']
+ RichEditor: typeof import('./src/components/RichEditor.vue')['default']
+ RouterLink: typeof import('vue-router')['RouterLink']
+ RouterView: typeof import('vue-router')['RouterView']
+ SavingState: typeof import('./src/components/SavingState.vue')['default']
+ SearchPopover: typeof import('./src/components/SearchPopover.vue')['default']
+ SettingsBreadcrumbs: typeof import('./src/components/SettingsBreadcrumbs.vue')['default']
+ ShepherdStep: typeof import("./src/components/ShepherdStep.vue")["default"]
+ ShpherdStep: typeof import("./src/components/ShpherdStep.vue")["default"]
+ StatWidget: typeof import('./src/components/StatWidget.vue')['default']
+ SubjectLastUpdated: typeof import("./src/components/SubjectLastUpdated.vue")["default"]
+ TimePicker: typeof import("./src/components/TimePicker.vue")["default"]
+ VAlert: typeof import("vuetify/lib")["VAlert"]
+ VApp: typeof import("vuetify/lib")["VApp"]
+ VAppBar: typeof import("vuetify/lib")["VAppBar"]
+ VAutocomplete: typeof import("vuetify/lib")["VAutocomplete"]
+ VAvatar: typeof import("vuetify/lib")["VAvatar"]
+ VBadge: typeof import("vuetify/lib")["VBadge"]
+ VBottomSheet: typeof import("vuetify/lib")["VBottomSheet"]
+ VBreadcrumbs: typeof import("vuetify/lib")["VBreadcrumbs"]
+ VBreadcrumbsItem: typeof import("vuetify/lib")["VBreadcrumbsItem"]
+ VBtn: typeof import("vuetify/lib")["VBtn"]
+ VCard: typeof import("vuetify/lib")["VCard"]
+ VCardActions: typeof import("vuetify/lib")["VCardActions"]
+ VCardSubtitle: typeof import("vuetify/lib")["VCardSubtitle"]
+ VCardText: typeof import("vuetify/lib")["VCardText"]
+ VCardTitle: typeof import("vuetify/lib")["VCardTitle"]
+ VCheckbox: typeof import("vuetify/lib")["VCheckbox"]
+ VChip: typeof import("vuetify/lib")["VChip"]
+ VChipGroup: typeof import("vuetify/lib")["VChipGroup"]
+ VCol: typeof import("vuetify/lib")["VCol"]
+ VColorPicker: typeof import("vuetify/lib")["VColorPicker"]
+ VCombobox: typeof import("vuetify/lib")["VCombobox"]
+ VContainer: typeof import("vuetify/lib")["VContainer"]
+ VDataTable: typeof import("vuetify/lib")["VDataTable"]
+ VDatePicker: typeof import("vuetify/lib")["VDatePicker"]
+ VDialog: typeof import("vuetify/lib")["VDialog"]
+ VDivider: typeof import("vuetify/lib")["VDivider"]
+ VExpandTransition: typeof import("vuetify/lib")["VExpandTransition"]
+ VExpansionPanel: typeof import("vuetify/lib")["VExpansionPanel"]
+ VExpansionPanelContent: typeof import("vuetify/lib")["VExpansionPanelContent"]
+ VExpansionPanelHeader: typeof import("vuetify/lib")["VExpansionPanelHeader"]
+ VExpansionPanels: typeof import("vuetify/lib")["VExpansionPanels"]
+ VFlex: typeof import("vuetify/lib")["VFlex"]
+ VForm: typeof import("vuetify/lib")["VForm"]
+ VHover: typeof import("vuetify/lib")["VHover"]
+ VIcon: typeof import("vuetify/lib")["VIcon"]
+ VItem: typeof import("vuetify/lib")["VItem"]
+ VLayout: typeof import("vuetify/lib")["VLayout"]
+ VLazy: typeof import("vuetify/lib")["VLazy"]
+ VList: typeof import("vuetify/lib")["VList"]
+ VListGroup: typeof import("vuetify/lib")["VListGroup"]
+ VListItem: typeof import("vuetify/lib")["VListItem"]
+ VListItemAction: typeof import("vuetify/lib")["VListItemAction"]
+ VListItemAvatar: typeof import("vuetify/lib")["VListItemAvatar"]
+ VListItemContent: typeof import("vuetify/lib")["VListItemContent"]
+ VListItemGroup: typeof import("vuetify/lib")["VListItemGroup"]
+ VListItemIcon: typeof import("vuetify/lib")["VListItemIcon"]
+ VListItemSubtitle: typeof import("vuetify/lib")["VListItemSubtitle"]
+ VListItemTitle: typeof import("vuetify/lib")["VListItemTitle"]
+ VMain: typeof import("vuetify/lib")["VMain"]
+ VMenu: typeof import("vuetify/lib")["VMenu"]
+ VNavigationDrawer: typeof import("vuetify/lib")["VNavigationDrawer"]
+ VProgressLinear: typeof import("vuetify/lib")["VProgressLinear"]
+ VRadio: typeof import("vuetify/lib")["VRadio"]
+ VRadioGroup: typeof import("vuetify/lib")["VRadioGroup"]
+ VRow: typeof import("vuetify/lib")["VRow"]
+ VSelect: typeof import("vuetify/lib")["VSelect"]
+ VSheet: typeof import("vuetify/lib")["VSheet"]
+ VSimpleCheckbox: typeof import("vuetify/lib")["VSimpleCheckbox"]
+ VSnackbar: typeof import("vuetify/lib")["VSnackbar"]
+ VSpacer: typeof import("vuetify/lib")["VSpacer"]
+ VStepper: typeof import("vuetify/lib")["VStepper"]
+ VStepperContent: typeof import("vuetify/lib")["VStepperContent"]
+ VStepperHeader: typeof import("vuetify/lib")["VStepperHeader"]
+ VStepperItems: typeof import("vuetify/lib")["VStepperItems"]
+ VStepperStep: typeof import("vuetify/lib")["VStepperStep"]
+ VSubheader: typeof import("vuetify/lib")["VSubheader"]
+ VSwitch: typeof import("vuetify/lib")["VSwitch"]
+ VSystemBar: typeof import("vuetify/lib")["VSystemBar"]
+ VTab: typeof import("vuetify/lib")["VTab"]
+ VTabItem: typeof import("vuetify/lib")["VTabItem"]
+ VTabs: typeof import("vuetify/lib")["VTabs"]
+ VTabsItems: typeof import("vuetify/lib")["VTabsItems"]
+ VTextarea: typeof import("vuetify/lib")["VTextarea"]
+ VTextArea: typeof import("vuetify/lib")["VTextArea"]
+ VTextField: typeof import("vuetify/lib")["VTextField"]
+ VTimeline: typeof import("vuetify/lib")["VTimeline"]
+ VTimelineItem: typeof import("vuetify/lib")["VTimelineItem"]
+ VTimePicker: typeof import("vuetify/lib")["VTimePicker"]
+ VToolbarItems: typeof import("vuetify/lib")["VToolbarItems"]
+ VToolbarTitle: typeof import("vuetify/lib")["VToolbarTitle"]
+ VTooltip: typeof import("vuetify/lib")["VTooltip"]
+ VWindow: typeof import("vuetify/lib")["VWindow"]
+ VWindowItem: typeof import("vuetify/lib")["VWindowItem"]
+ }
+}
diff --git a/src/dispatch/static/dispatch/eslint-local-rules.js b/src/dispatch/static/dispatch/eslint-local-rules.js
new file mode 100644
index 000000000000..61ec4a9f4e5e
--- /dev/null
+++ b/src/dispatch/static/dispatch/eslint-local-rules.js
@@ -0,0 +1,653 @@
+module.exports = {
+ "icon-button-variant": {
+ meta: {
+ fixable: "code",
+ },
+ create(context) {
+ const template = context.parserServices.getTemplateBodyTokenStore()
+
+ return context.parserServices.defineTemplateBodyVisitor({
+ VElement(node) {
+ if (node.name !== "v-btn") return
+
+ const attr = node.startTag.attributes.find((attr) => {
+ return attr.type === "VAttribute" && attr.key.name === "icon"
+ })
+ if (
+ attr &&
+ !node.startTag.attributes.some((attr) => {
+ return attr.type === "VAttribute" && attr.key.name === "variant"
+ })
+ ) {
+ context.report({
+ node: attr,
+ message: 'default icon button variant is contained, override to variant="text"?',
+ fix(fixer) {
+ const tokenBefore = template.getTokenBefore(attr)
+ if (attr.loc.start.line === tokenBefore.loc.start.line) {
+ return fixer.insertTextAfter(attr, ' variant="text"')
+ } else if (attr.loc.start.line === tokenBefore.loc.start.line + 1) {
+ const indent = attr.loc.start.column
+ return fixer.insertTextAfter(attr, "\n" + " ".repeat(indent) + 'variant="text"')
+ }
+ },
+ })
+ }
+ },
+ })
+ },
+ },
+ "vee-validate": {
+ meta: {
+ fixable: "code",
+ },
+ create(context) {
+ const template = context.parserServices.getTemplateBodyTokenStore()
+ let observerNode
+ let formNode
+ let observerRefNodes = []
+ const rulesToImport = new Set()
+ const vvImportNodes = []
+ const vvRuleImportNodes = []
+ const validationProviderNodes = []
+ let vueObjectNode
+ let hasSetup = false
+ const methodNodes = []
+
+ return context.parserServices.defineTemplateBodyVisitor(
+ {
+ 'VElement[parent.type!="VElement"]:exit'(node) {
+ if (rulesToImport.size) {
+ if (!vueObjectNode) {
+ return context.report({
+ node: context.sourceCode.ast,
+ message: "unable to locate vue object",
+ })
+ }
+ if (hasSetup) {
+ return context.report({
+ node: vueObjectNode,
+ message: "component already has setup defined",
+ })
+ }
+ }
+
+ if (
+ !(
+ vvImportNodes.length ||
+ vvRuleImportNodes.length ||
+ observerNode ||
+ validationProviderNodes.length ||
+ rulesToImport.size
+ )
+ )
+ return
+
+ return context.report({
+ node,
+ message: "replace validation-observer with v-form",
+ fix(fixer) {
+ const fixes = []
+
+ // remove vee-validate imports
+ vvImportNodes.forEach((node) => {
+ fixes.push(fixer.remove(node))
+ context.getDeclaredVariables(node).forEach((variable) => {
+ variable.references.forEach((reference) => {
+ fixes.push(
+ fixer.removeRange([
+ context.sourceCode.getIndexFromLoc({
+ line: reference.identifier.parent.loc.start.line,
+ column: 0,
+ }) - 1,
+ context.sourceCode.getIndexFromLoc(reference.identifier.parent.loc.end) +
+ 1,
+ ])
+ )
+ })
+ })
+ })
+ vvRuleImportNodes.forEach((node) => {
+ fixes.push(fixer.remove(node))
+ })
+
+ if (observerNode) {
+ const observerRef = observerNode.startTag.attributes.find((attr) => {
+ return attr.type === "VAttribute" && attr.key.name === "ref"
+ })
+ const observerSlot = observerNode.startTag.attributes.find((attr) => {
+ return (
+ attr.type === "VAttribute" &&
+ attr.key.type === "VDirectiveKey" &&
+ attr.key.name.name === "slot"
+ )
+ })
+ const otherAttrs = observerNode.startTag.attributes
+ .filter((attr) => {
+ return attr !== observerRef && attr !== observerSlot
+ })
+ .map((attr) => context.sourceCode.getText(attr))
+
+ let formSubmitHandlerName
+ let formSubmitHandlerNode
+ if (formNode) {
+ const submitListener = formNode.startTag.attributes.find((attr) => {
+ return (
+ attr.directive &&
+ attr.key.name.name === "on" &&
+ attr.key.argument.name === "submit"
+ )
+ })
+
+ if (submitListener) {
+ formSubmitHandlerName = submitListener.value.expression.name
+ if (formSubmitHandlerName) {
+ methodNodes.forEach((node) => {
+ if (node.key.name === formSubmitHandlerName) {
+ formSubmitHandlerNode = node
+ }
+ })
+ if (!formSubmitHandlerNode) {
+ throw new Error("Unable to locate form submit handler")
+ }
+ }
+ } else {
+ throw new Error("No submit listener")
+ }
+ }
+
+ let newStartTag = "
+ context.sourceCode.getText(attr)
+ )
+ if (formAttrs.length) {
+ newStartTag += " " + formAttrs.join(" ")
+ }
+ } else {
+ newStartTag += ` @submit.prevent`
+ }
+ if (observerSlot) {
+ newStartTag += ` v-slot="{ isValid }"`
+ }
+ newStartTag += ">"
+
+ if (formNode) {
+ fixes.push(
+ fixer.replaceTextRange(
+ [observerNode.startTag.range[0], formNode.startTag.range[1]],
+ newStartTag
+ )
+ )
+ } else {
+ fixes.push(fixer.replaceText(observerNode.startTag, newStartTag))
+ }
+
+ if (observerSlot) {
+ observerNode.variables.forEach((variable) => {
+ if (variable.id.name === "invalid") {
+ variable.references.forEach((ref) => {
+ fixes.push(fixer.replaceText(ref.id, "!isValid.value"))
+ })
+ } else if (variable.id.name !== "validated") {
+ throw new Error("unsupported variable")
+ }
+ })
+ }
+
+ if (formNode) {
+ fixes.push(
+ fixer.replaceTextRange(
+ [formNode.endTag.range[0], observerNode.endTag.range[1]],
+ " "
+ )
+ )
+ } else {
+ fixes.push(fixer.replaceText(observerNode.endTag, ""))
+ }
+
+ observerRefNodes.forEach((node) => {
+ fixes.push(fixer.replaceText(node.property, "form"))
+ if (node.parent.property.name === "reset") {
+ fixes.push(fixer.replaceText(node.parent.property, "resetValidation"))
+ } else {
+ // append a TODO at the end of the line that uses the observer ref
+ const eol =
+ context.sourceCode.getIndexFromLoc({
+ line: node.loc.start.line + 1,
+ column: 0,
+ }) - 1
+ fixes.push(
+ fixer.insertTextAfterRange([eol, eol], " // TODO: find vuetify equivalent")
+ )
+ }
+ })
+
+ if (formSubmitHandlerNode) {
+ if (!formSubmitHandlerNode.value.async) {
+ fixes.push(fixer.insertTextBefore(formSubmitHandlerNode.key, "async "))
+ }
+ let paramName = "event"
+ if (!formSubmitHandlerNode.value.params.length) {
+ fixes.push(
+ fixer.replaceTextRange(
+ [
+ formSubmitHandlerNode.value.range[0],
+ formSubmitHandlerNode.value.body.range[0],
+ ],
+ `(${paramName}) `
+ )
+ )
+ } else {
+ paramName = formSubmitHandlerNode.value.params[0].name
+ }
+ const indent = context.sourceCode.lines
+ .at(formSubmitHandlerNode.value.body.body[0].loc.start.line - 1)
+ .match(/^\s*/)[0]
+ fixes.push(
+ fixer.insertTextBefore(
+ formSubmitHandlerNode.value.body.body[0],
+ `if (!(await ${paramName}).valid) return\n\n${indent}`
+ )
+ )
+ }
+ }
+
+ validationProviderNodes.forEach(({ node, child, rules, vid }) => {
+ fixes.push(fixer.removeRange([node.startTag.range[0], child.startTag.range[0]]))
+
+ if (node.variables.length) {
+ const b4 = template.getTokenBefore(
+ node.variables[0].references[0].id.parent.parent
+ )
+ fixes.push(
+ fixer.removeRange([
+ b4.range[1],
+ node.variables[0].references[0].id.parent.parent.range[1],
+ ])
+ )
+ } else {
+ node.children.forEach((child) => {
+ if (child.type === "VElement") {
+ child.startTag.attributes.forEach((attr) => {
+ if (
+ attr.key.type === "VDirectiveKey" &&
+ attr.key.name.name === "slot-scope"
+ ) {
+ fixes.push(fixer.remove(attr))
+ if (child.variables.length) {
+ child.variables.forEach((variable) => {
+ const b4 = template.getTokenBefore(
+ variable.references[0].id.parent.parent
+ )
+ fixes.push(
+ fixer.removeRange([
+ b4.range[1],
+ variable.references[0].id.parent.parent.range[1],
+ ])
+ )
+ })
+ } else {
+ throw new Error("slot-scope without variables")
+ }
+ }
+ })
+ }
+ })
+ }
+
+ const isMultiline = child.startTag.loc.start.line !== child.startTag.loc.end.line
+ const indent = isMultiline
+ ? "\n" +
+ context.sourceCode.lines
+ .at(child.startTag.loc.start.line - 1)
+ .match(/^\s*/)[0] +
+ " ".repeat(2)
+ : " "
+
+ if (vid) {
+ fixes.push(
+ fixer.insertTextAfter(
+ child.startTag.attributes.at(-1),
+ indent +
+ (vid.directive
+ ? `:name=${context.sourceCode.getText(vid.value)}`
+ : `name="${vid.value.value}"`)
+ )
+ )
+ }
+ if (rules) {
+ let rulesArray
+ let rulesString
+ if (rules.directive) {
+ // dynamic rules
+ if (
+ rules.value.expression.type !== "TemplateLiteral" ||
+ rules.value.expression.quasis.length !== 2 ||
+ rules.value.expression.expressions.length !== 1 ||
+ rules.value.expression.expressions[0].type !== "ConditionalExpression" ||
+ rules.value.expression.expressions[0].alternate.type !== "Literal" ||
+ rules.value.expression.expressions[0].alternate.value !== ""
+ ) {
+ throw new Error("Unsupported dynamic rules")
+ }
+ const test = context.sourceCode.getText(
+ rules.value.expression.expressions[0].test
+ )
+ const rulesValue = rules.value.expression.expressions[0].consequent.value
+ rulesArray = rulesValue.split("|")
+ rulesString = `:rules="${test} ? [${rulesArray
+ .map((v) => `rules.${v}`)
+ .join(", ")}] : []"`
+ } else {
+ const rulesValue = rules.value.value
+ rulesArray = rulesValue.split("|")
+ rulesString = `:rules="[${rulesArray.map((v) => `rules.${v}`).join(", ")}]"`
+ }
+ fixes.push(
+ fixer.insertTextAfter(child.startTag.attributes.at(-1), indent + rulesString)
+ )
+ rulesArray.forEach((rule) => rulesToImport.add(rule))
+ }
+
+ fixes.push(
+ fixer.removeRange([
+ child.startTag.selfClosing ? child.startTag.range[1] : child.endTag.range[1],
+ node.endTag.range[1],
+ ])
+ )
+ })
+
+ if (rulesToImport.size) {
+ fixes.push(
+ fixer.insertTextBefore(
+ context.sourceCode.ast.body[0],
+ "import { " + [...rulesToImport].join(", ") + " } from '@/util/form'\n"
+ ),
+ fixer.insertTextAfterRange(
+ [vueObjectNode.range[0] + 1, vueObjectNode.range[0] + 1],
+ `
+ setup() {
+ return {
+ rules: { ${[...rulesToImport].join(", ")} }
+ }
+ },`
+ )
+ )
+ }
+
+ return fixes
+ },
+ })
+ },
+ 'VElement[rawName="ValidationObserver"]'(node) {
+ if (observerNode) {
+ throw new Error("multiple validation-observers")
+ }
+ observerNode = node
+ formNode = node.children.find((child) => {
+ return child.type === "VElement" && child.rawName === "form"
+ })
+ },
+ 'VElement[rawName="ValidationProvider"]'(node) {
+ const children = node.children.filter((child) => {
+ return child.type !== "VText" || child.value.trim()
+ })
+ const child = children[0]
+ if (
+ children.length !== 1 ||
+ child.type !== "VElement" ||
+ !(
+ child.name.startsWith("v-") ||
+ ["participant-select", "incident-select", "assignee-combobox"].includes(child.name)
+ )
+ ) {
+ throw new Error(
+ `validation-provider has unsupported children at ${node.loc.start.line}:${node.loc.start.column}`
+ )
+ }
+ const vid = node.startTag.attributes.find((attr) => {
+ return (
+ attr.type === "VAttribute" &&
+ (attr.key.name === "vid" ||
+ (attr.directive &&
+ attr.key.name.name === "bind" &&
+ attr.key.argument?.name === "vid"))
+ )
+ })
+ const name = node.startTag.attributes.find((attr) => {
+ return (
+ attr.type === "VAttribute" &&
+ (attr.key.name === "name" ||
+ (attr.directive &&
+ attr.key.name.name === "bind" &&
+ attr.key.argument?.name === "name"))
+ )
+ })
+ const rules = node.startTag.attributes.find((attr) => {
+ return (
+ attr.type === "VAttribute" &&
+ (attr.key.name === "rules" ||
+ (attr.directive &&
+ attr.key.name.name === "bind" &&
+ attr.key.argument?.name === "rules"))
+ )
+ })
+ const providerSlot = node.startTag.attributes.find((attr) => {
+ return (
+ attr.type === "VAttribute" &&
+ attr.key.type === "VDirectiveKey" &&
+ attr.key.name.name === "slot"
+ )
+ })
+ const providerOtherAttrs = node.startTag.attributes
+ .filter((attr) => {
+ return (
+ attr !== vid &&
+ attr !== name &&
+ attr !== rules &&
+ attr !== providerSlot &&
+ attr.key?.name !== "immediate"
+ )
+ })
+ .map((attr) => context.sourceCode.getText(attr))
+ const childName = child.startTag.attributes.find((attr) => {
+ return (
+ attr.type === "VAttribute" &&
+ (attr.key.name === "name" ||
+ (attr.directive &&
+ attr.key.name.name === "bind" &&
+ attr.key.argument?.name === "name"))
+ )
+ })
+ const childRules = child.startTag.attributes.find((attr) => {
+ return (
+ attr.type === "VAttribute" &&
+ (attr.key.name === "rules" ||
+ (attr.directive &&
+ attr.key.name.name === "bind" &&
+ attr.key.argument?.name === "rules"))
+ )
+ })
+
+ if (providerOtherAttrs.length) {
+ console.log("additional attributes", providerOtherAttrs)
+ return context.report({
+ node,
+ message: "validation-provider has other attributes",
+ })
+ }
+
+ if (providerSlot) {
+ if (node.variables.length !== 1 || node.variables[0]?.id.name !== "errors") {
+ return context.report({
+ node,
+ message: "validation-provider slot must only expose errors",
+ })
+ }
+ if (
+ node.variables[0].references.length !== 1 ||
+ node.variables[0].references[0].id.parent.parent.type !== "VAttribute" ||
+ node.variables[0].references[0].id.parent.parent.key.argument.name !==
+ "error-messages"
+ ) {
+ return context.report({
+ node: node.variables[0].references[0].id,
+ message: "validation-provider errors must only be used in error-messages",
+ })
+ }
+ }
+
+ if (childName) {
+ return context.report({
+ node: childName,
+ message: "validation-provider child must not have a name",
+ })
+ }
+
+ if (childRules) {
+ return context.report({
+ node: childRules,
+ message: "validation-provider child should not have rules",
+ })
+ }
+
+ validationProviderNodes.push({ node, child, rules, vid: vid || name })
+ },
+ },
+ {
+ 'MemberExpression[object.type="MemberExpression"][object.property.name="$refs"][property.name="observer"]'(
+ node
+ ) {
+ observerRefNodes.push(node)
+ },
+ 'ImportDeclaration[source.value="vee-validate"]'(node) {
+ vvImportNodes.push(node)
+ },
+ 'ImportDeclaration[source.value="vee-validate/dist/rules"]'(node) {
+ vvRuleImportNodes.push(node)
+ },
+ "ExportDefaultDeclaration > ObjectExpression"(node) {
+ vueObjectNode = node
+ },
+ 'ExportDefaultDeclaration > ObjectExpression > Property[key.name="setup"]'() {
+ hasSetup = true
+ },
+ 'ExportDefaultDeclaration > ObjectExpression > Property[key.name="methods"] > ObjectExpression > Property'(
+ node
+ ) {
+ methodNodes.push(node)
+ },
+ }
+ )
+ },
+ },
+ "list-item-children": {
+ meta: {
+ fixable: "code",
+ },
+ create(context) {
+ return context.parserServices.defineTemplateBodyVisitor({
+ 'VElement[name="v-list-item"]'(node) {
+ if (
+ node.startTag.attributes.some((attr) => attr.directive && attr.key.name.name === "slot")
+ ) {
+ return
+ }
+
+ const prepend = []
+ const append = []
+ let inContent = false
+ let reachedContent = false
+ let fixable = true
+ for (const child of node.children) {
+ if (child.type !== "VElement") {
+ if (child.type === "VText" && !child.value.trim()) continue
+ else {
+ fixable = false
+ break
+ }
+ } else if (
+ ["v-list-item-title", "v-list-item-subtitle", "v-list-item-content"].includes(
+ child.name
+ )
+ ) {
+ if (reachedContent && !inContent) {
+ fixable = false
+ break
+ }
+ inContent = true
+ reachedContent = true
+ } else if (
+ ["v-list-item-avatar", "v-list-item-icon", "v-list-item-action"].includes(child.name)
+ ) {
+ inContent = false
+ if (child.startTag.attributes.length && child.name !== "v-list-item-avatar") {
+ fixable = false
+ break
+ }
+ if (reachedContent) {
+ append.push(child)
+ } else {
+ prepend.push(child)
+ }
+ } else {
+ fixable = false
+ break
+ }
+ }
+
+ if (prepend.length || append.length) {
+ context.report({
+ node,
+ message: "move list item children into slots",
+ *fix(fixer) {
+ if (!fixable) return
+
+ function stringifyChildren(children) {
+ return children
+ .flatMap((child) => {
+ if (child.name === "v-list-item-avatar") {
+ return context.sourceCode
+ .getText(child)
+ .replace(/)/, "", " ")
+ }
+ return child.children.map((v) => context.sourceCode.getText(v).trim())
+ })
+ .join("\n")
+ }
+
+ for (const child of prepend) {
+ yield fixer.remove(child)
+ }
+ for (const child of append) {
+ yield fixer.remove(child)
+ }
+
+ if (prepend.length) {
+ yield fixer.insertTextAfter(
+ node.startTag,
+ `<template #prepend>${stringifyChildren(prepend)}</template>`
+ )
+ }
+ if (append.length) {
+ yield fixer.insertTextBefore(
+ node.endTag,
+ `<template #append>${stringifyChildren(append)}</template>`
+ )
+ }
+ },
+ })
+ }
+ },
+ })
+ },
+ },
+}
diff --git a/src/dispatch/static/dispatch/public/index.html b/src/dispatch/static/dispatch/index.html
similarity index 91%
rename from src/dispatch/static/dispatch/public/index.html
rename to src/dispatch/static/dispatch/index.html
index 692f463b03e7..fb16c5195943 100755
--- a/src/dispatch/static/dispatch/public/index.html
+++ b/src/dispatch/static/dispatch/index.html
@@ -6,7 +6,7 @@
Dispatch
@@ -18,5 +18,6 @@
+