---
title: About
---

Featurevisor is an Open Source project that adopts IaC and GitOps principles for managing feature flags, A/B tests, and variables configuration declaratively in a Git repository, which is then consumed in your applications or services using its SDKs, available in different programming languages.

It was created by Fahad Heylaal, who is available as @fahad19 on GitHub and X (previously known as Twitter).

It is free to use and released under the MIT License.

You can contribute to it by submitting issues, feature requests, or pull requests on its GitHub repository: http://github.com/featurevisor/featurevisor

---
title: Custom Parsers
nextjs:
  metadata:
    title: Custom Parsers
    description: Learn how to define your own custom parsers for Featurevisor going beyond just YAML and JSON files.
    openGraph:
      title: Custom Parsers
      description: Learn how to define your own custom parsers for Featurevisor going beyond just YAML and JSON files.
      images:
        - url: /img/og/docs-advanced-custom-parsers.png
---

Featurevisor ships with built-in parsers supporting YAML and JSON files, but you can also take advantage of custom parsers, allowing you to define your features and segments in a language you are most comfortable with. {% .lead %}

## Built-in parsers

### YAML

By default, Featurevisor assumes all your definitions are written in YAML and no extra [configuration](/docs/configuration) is needed in that case:

```js {% path="featurevisor.config.js" highlight="6" %}
module.exports = {
  environments: ['staging', 'production'],
  tags: ['all'],

  // optional if value is "yml"
  parser: 'yml',
}
```

You can find an example project using YAML [here](https://github.com/featurevisor/featurevisor/tree/main/examples/example-yml).

### JSON

If we wish to use JSON files instead of YAML, we can do so by specifying the `parser` option:

```js {% path="featurevisor.config.js" highlight="6" %}
module.exports = {
  environments: ['staging', 'production'],
  tags: ['all'],

  // define the parser to use
  parser: 'json',
}
```

You can find an example project using JSON [here](https://github.com/featurevisor/featurevisor/tree/main/examples/example-json).

## Custom

If you wish to define your features and segments in some other language besides YAML and JSON, you can provide your own custom parser.

A parser in this case is a function that takes file content as input (string) and returns a parsed object.

Let's say we wish to use [TOML](https://toml.io/en/) files for our definitions.

We start by installing the [toml](https://www.npmjs.com/package/toml) package:

```{% title="Command" %}
$ npm install --save-dev toml
```

Now we define a custom parser in our [configuration](/docs/configuration):

```js {% path="featurevisor.config.js" highlight="6-9" %}
module.exports = {
  environments: ['staging', 'production'],
  tags: ['all'],

  // define the parser to use
  parser: {
    extension: 'toml',
    parse: (content) => require('toml').parse(content),
  },
}
```

You can find an example project using TOML [here](https://github.com/featurevisor/featurevisor/tree/main/examples/example-toml).

---
title: Alternatives
nextjs:
  metadata:
    title: Alternatives
    description: Alternative solutions
    openGraph:
      title: Alternatives
      description: Alternative solutions
      images:
        - url: /img/og/docs.png
showEditPageLink: false
---

There are several feature management tools available in the wild, and it is worth understanding how Featurevisor compares to them and if it is the right tool for you. {% .lead %}

The sections below will help you understand their differences better.
## Comparison table

Featurevisor is an open-source, [GitOps](/docs/concepts/gitops)-friendly feature flags management system that works entirely from static [datafiles](/docs/building-datafiles), offering zero-latency evaluations across [environments](/docs/environments).

Here's how it compares to [LaunchDarkly](https://launchdarkly.com/), [Optimizely](https://www.optimizely.com/), and [Unleash](https://getunleash.io/):

| Feature | Featurevisor | LaunchDarkly | Optimizely | Unleash |
| --- | --- | --- | --- | --- |
| Git-based workflow | ✅ | ❌ | ❌ | ❌ |
| UI based workflow | ❌ | ✅ | ✅ | ✅ |
| Change review/approval workflow | ✅ | ✅ | ✅ | ❌ |
| [Feature flags](/docs/features) | ✅ | ✅ | ✅ | ✅ |
| [A/B test experiments](/docs/experiments) | ✅ | ✅ | ✅ | ❌ |
| [Multivariate tests](/docs/experiments) | ✅ | ✅ | ✅ | ❌ |
| [Mutually exclusive experiments](/docs/groups) | ✅ | ✅ | ✅ | ❌ |
| [Variables](/docs/use-cases/remote-configuration) | ✅ | ✅ | ✅ | ✅ |
| [Testing features](/docs/testing) | ✅ | ❌ | ❌ | ❌ |
| [Default/fallback values](/docs/features/#schema) | ✅ | ✅ | ✅ | ✅ |
| [Custom user attributes](/docs/attributes) | ✅ | ✅ | ✅ | ✅ |
| [Targeting rules](/docs/features/#rules) | ✅ | ✅ | ✅ | ✅ |
| [Segment-based targeting](/docs/features/#rules) | ✅ | ✅ | ✅ | ✅ |
| [Segment reusability](/docs/segments) | ✅ | ✅ | ✅ | ✅ |
| [Time-based targeting](/docs/segments/#before) | ✅ | ✅ | ✅ | ✅ |
| [Gradual/percentage rollout](/docs/features/#percentage) | ✅ | ✅ | ✅ | ✅ |
| [Offline evaluation](/docs/sdks/javascript) | ✅ | ✅ | ✅ | ✅ |
| [Dependent features](/docs/use-cases/dependencies) | ✅ | ✅ | ❌ | ✅ |
| [Multiple environments](/docs/environments) | ✅ | ✅ | ✅ | ✅ |
| [Consistent bucketing](/docs/bucketing) | ✅ | ✅ | ✅ | ✅ |
| [Overriding values with conditions](/docs/features/#overriding-variables) | ✅ | ✅ | ✅ | ✅ |
| Built-in analytics | ❌ | ✅ | ✅ | ❌ |
| Multi-armed bandit | ❌ | ❌ | ✅ | ❌ |
| Experiment analysis | ❌ | ✅ | ✅ | ❌ |
| [External analytics tool integration](/docs/tracking/google-analytics) | ✅ | ✅ | ✅ | ✅ |
| Audit logging | ✅ | ✅ | ✅ | ✅ |

## Considering Featurevisor

Featurevisor is a fairly unique tool that offers a Git-based workflow for [feature management](/docs/feature-management) while closely matching the functionality of other established SaaS solutions. It may be a great fit for some teams, while being less convenient for others.

### When to choose Featurevisor

If your team values a code-centric approach with a strong review/approval workflow and wants to integrate feature management practices into your existing development process seamlessly, Featurevisor can be a great open source alternative to commercial feature management SaaS tools.

Because of its Git-based approach, Featurevisor is able to support [testing](/docs/testing) your feature and segment definitions, unlike other tools. This helps catch issues early at the Pull Request level, before your feature configuration changes even reach production, boosting your confidence in your deployments. It is something that other UI-based SaaS tools do not offer.

### Challenges

If you are aiming to get Product, Marketing, and other non-technical folks in your organization into a Git-based workflow, it will pose the additional challenge of onboarding them to learn how to use Git.

Using Featurevisor also means you will be required to set up your own Git repository and CI/CD pipeline, and manage your own infrastructure if you are using your own hosting.
There are examples using [GitHub Actions](/docs/integrations/github-actions), and [Cloudflare Pages](/docs/integrations/cloudflare-pages) for you to learn from, but the responsibility ultimately falls on your engineering team(s). You are recommended to choose wisely depending on your needs and the level of expertise in your organization. --- title: Attributes nextjs: metadata: title: Attributes description: Learn how to create attributes in Featurevisor openGraph: title: Attributes description: Learn how to create attributes in Featurevisor images: - url: /img/og/docs-attributes.png --- Attributes are the building blocks of creating conditions, which can later be used in reusable [segments](/docs/segments/). {% .lead %} ## Create an attribute Let's create an attribute called `country`: ```yml {% path="attributes/country.yml" %} description: Country type: string ``` `type` and `description` are the minimum required properties for an attribute. ## Types These types are supported for attribute values: - `boolean` - `string` - `integer` - `double` - `date` - `array` (of strings) - `object` (flat object) ### boolean When an attribute is of type `boolean`, it can have two possible values: `true` or `false`. For example, if you want to create an attribute for `isPremiumUser`, you can do it like this: ```yml {% path="attributes/isPremiumUser.yml" %} description: Is Premium User type: boolean ``` ### string When an attribute is of type `string`, it can have any string value. For example, if you want to create an attribute for `country`, you can do it like this: ```yml {% path="attributes/country.yml" %} description: Country type: string ``` ### integer When an attribute is of type `integer`, it can have any integer value. For example, if you want to create an attribute for `age`, you can do it like this: ```yml {% path="attributes/age.yml" %} description: Age type: integer ``` ### double When an attribute is of type `double`, it can have any floating point number value. For example, if you want to create an attribute for `rating`, you can do it like this: ```yml {% path="attributes/rating.yml" %} description: Rating type: double ``` ### date When an attribute is of type `date`, it can have any date value in ISO 8601 format. For example, if you want to create an attribute for `signupDate`, you can do it like this: ```yml {% path="attributes/signupDate.yml" %} description: Signup Date type: date ``` ### array When an attribute is of type `array`, it can have an array of string values. For example, if you want to create an attribute for `permissions`, you can do it like this: ```yml {% path="attributes/permissions.yml" %} description: Permissions type: array ``` ### object When an attribute is of type `object`, it can have nested properties. For example, if you want to create an attribute for `user` with some nested properties, you can do it like this: ```yml {% path="attributes/user.yml" %} description: User type: object properties: id: type: string description: User ID country: type: string description: User country ``` When writing conditions for [segments](/docs/segments/), you can use the dot notation to access nested properties. For example, `user.id` or `user.country`. ## Archiving You can archive an attribute by setting `archived: true`: ```yml {% path="attributes/country.yml" %} archived: true type: string description: Country ``` ## Relationship with context [SDKs](/docs/sdks/) evaluate values against the [context](/docs/sdks/javascript/#context) available at runtime. 
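For example, with the JavaScript SDK, the attributes defined above are supplied as context at evaluation time (a minimal sketch; `my_feature` is a hypothetical feature key, and instance set-up is covered in the SDK docs):

```js
import { createInstance } from '@featurevisor/sdk'

// set up the SDK instance as described in the SDK docs
const f = createInstance({
  // ...
})

// attribute values are passed as context when evaluating
const isEnabled = f.isEnabled('my_feature', {
  country: 'nl',
  age: 30,
})
```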
The context is an object where the keys are attribute names and the values are the attribute values. For example, if you have an attribute called `country`, the context will look like this: ```js { "country": "nl" // The Netherlands } ``` If we combine all the above examples, the full context may look like this: ```js { "country": "nl", "isPremiumUser": true, "age": 30, "rating": 4.5, "signupDate": "2025-01-01T00:00:00Z", "permissions": ["read", "write"], "user": { "id": "12345", "country": "nl" } } ``` --- title: Bucketing nextjs: metadata: title: Bucketing description: Learn how Featurevisor bucketing works openGraph: title: Bucketing description: Learn how Featurevisor bucketing works images: - url: /img/og/docs-bucketing.png --- Bucketing is the process of assigning users to a specific cohort of a feature. It is a crucial part of the feature's rollout process, as it determines which users will be exposed to a feature, and which variation (if any) of the feature they will be assigned to consistently. {% .lead %} ## Factors Few factors are involved in the bucketing process: - The [feature](/docs/features) key (name of the feature) - If it has [variations](/docs/features/#variations), then their `weight` values - The [`bucketBy`](/docs/features/#bucketing) property of the feature (usually `userId`) - The rollout [rules](/docs/features/#rules), and their `percentage` values ## Bucketing process When a feature is evaluated in an application using [SDKs](/docs/sdks/), the following steps take place: - Create a bucketing key from: - the `bucketBy` attribute's value as found in [`context`](/docs/sdks/javascript/#context), and - the feature's own key - Generate a hash from the bucketing key, that ranges between 0 to 100 (inclusive) - Iterate through the rollout rules, and check if the segments have matched and the hash is within the range of the rule's `percentage` value - If a match is found, the feature is meant to be exposed to the user - If the feature has variations, then find the variation that the user is assigned to based on the `weight` of the variations and the hash ## Consistent bucketing The bucketing process is consistent, which means Featurevisor [SDKs](/docs/sdks/) will evaluate the same value for the same user and feature key repeatedly, irrespective of the number of different devices or sessions the user has. It is because of maintaining this consistency that we have the need for [state files](/docs/state-files). As long as the feature's rollout rules' `percentage` keeps increasing over time, it is possible to maintain consistent bucketing. The expectation is we will always gradually increase the rollout percentage of a feature, and never decrease it. If the `percentage` of a rollout rule decreases and/or the `weight`s of variations change, then the bucketing process will not be consistent for all users any more. Even though Featurevisor tries its best to maintain consistent bucketing, it is not possible to guarantee it in all cases if the percentage value decreases. --- title: Building datafiles nextjs: metadata: title: Building datafiles description: Build your Featurevisor project datafiles openGraph: title: Building datafiles description: Build your Featurevisor project datafiles images: - url: /img/og/docs.png --- Datafiles are JSON files that are created against combinations of [tags](/docs/tags/) and [environments](/docs/environments/). They are used to evaluate features in your application via Featurevisor [SDKs](/docs/sdks/). 
{% .lead %}

## Usage

Use Featurevisor CLI to build your datafiles:

```{% title="Command" %}
$ npx featurevisor build
```

## Output

The build output can be found in the `datafiles` directory.

If your `featurevisor.config.js` file looks like this:

```js {% path="featurevisor.config.js" %}
module.exports = {
  tags: ['all'],
  environments: ['staging', 'production'],
}
```

Then the contents of your `datafiles` directory will look like this:

```
$ tree datafiles
.
├── production
│   └── featurevisor-tag-all.json
└── staging
    └── featurevisor-tag-all.json

2 directories, 2 files
```

Next to datafiles, the build process will also generate some additional JSON files that we can learn more about in [State files](/docs/state-files).

## Revision

By default, Featurevisor will increment the revision number as found in the `.featurevisor/REVISION` file (learn more in [state files](/docs/state-files)).

### Custom revision

You can optionally customize the `revision` value when building datafiles by passing a `--revision` flag:

```{% title="Command" %}
$ npx featurevisor build --revision 1.2.3
```

### Revision from hash

If instead of an incremented revision, you want to use the hash of the individual datafile content, you can pass `--revision-from-hash`:

```{% title="Command" %}
$ npx featurevisor build --revision-from-hash
```

If the content of a particular datafile has not changed, its revision will remain the same as in the previous build. This is useful for caching purposes.

### No state files

If you wish to build datafiles without making any changes to [state files](/docs/state-files), you can pass the `--no-state-files` flag:

```{% title="Command" %}
$ npx featurevisor build --no-state-files
```

## Printing

You can print the contents of a datafile for a single feature or all the features in an environment, without writing anything to disk, by passing these flags:

```{% title="Command" %}
$ npx featurevisor build \
    --feature=foo \
    --environment=production \
    --json \
    --pretty
```

Or if you wish to print a datafile containing all features for a specific environment:

```{% title="Command" %}
$ npx featurevisor build --environment=production --json --pretty
```

This is useful primarily for debugging and testing purposes.

If you are an SDK developer in other languages besides JavaScript, you may want to use this handy command to get the generated datafile content in JSON format that you can use in your own [test runner](/docs/testing).

## Datafiles directory

By default, datafiles will be generated in the `datafiles` directory, or your custom directory if you have specified it under `datafilesDirectoryPath` in your [`featurevisor.config.js`](/docs/configuration/) file.
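For example, a configuration with a custom datafiles directory might look like this (a sketch; the directory name is illustrative):

```js {% path="featurevisor.config.js" %}
module.exports = {
  environments: ['staging', 'production'],
  tags: ['all'],

  // generate datafiles somewhere other than the default directory
  datafilesDirectoryPath: 'custom-directory',
}
```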
You can optionally override it from CLI:

```{% title="Command" %}
$ npx featurevisor build --datafiles-dir=./custom-directory
```

## Schema version

If you are using older Featurevisor v1 SDKs, you can build datafiles in the v1 schema format by passing the `--schema-version=1` flag:

```bash {% title="Command" %}
# finish regular build first
$ npx featurevisor build

# then build v1-compatible datafiles
$ npx featurevisor build \
    --schema-version=1 \
    --datafiles-dir=./datafiles/v1 \
    --no-state-files
```

---
title: Command Line Interface (CLI) Usage
nextjs:
  metadata:
    title: Command Line Interface (CLI) Usage
    description: Command Line Interface (CLI) Usage of Featurevisor
    openGraph:
      title: Command Line Interface (CLI) Usage
      description: Command Line Interface (CLI) Usage of Featurevisor
      images:
        - url: /img/og/docs-cli.png
---

Beyond just initializing a project and building datafiles, Featurevisor CLI can be used for a few more purposes. {% .lead %}

## Installation

Use `npx` to initialize a project first:

```
$ mkdir my-featurevisor-project && cd my-featurevisor-project
$ npx @featurevisor/cli init
```

If you wish to initialize a specific example as available in the [monorepo](https://github.com/featurevisor/featurevisor/tree/main/examples):

```
$ npx @featurevisor/cli init --example=json
```

Then install the dependencies in the project:

```
$ npm install
```

You can access the Featurevisor CLI from inside the project via:

```
$ npx featurevisor
```

Learn more in [Quick start](/docs/quick-start).

## Linting

Check if the definition files have any syntax or structural errors:

```
$ npx featurevisor lint
```

Learn more in [Linting](/docs/linting).

## Building datafiles

Generate JSON files per environment and tag combination, as defined in the project [configuration](/docs/configuration):

```
$ npx featurevisor build
```

Learn more in [Building datafiles](/docs/building-datafiles).

## Testing

Test your features and segments:

```
$ npx featurevisor test
```

Learn more in [Testing](/docs/testing).

## Generate static site

Build the site:

```
$ npx featurevisor site export
```

Serve the built site (defaults to port 3000):

```
$ npx featurevisor site serve
```

Serve it on a specific port:

```
$ npx featurevisor site serve -p 3000
```

Learn more in [Status site](/docs/status-site).

## Generate code

Generate TypeScript code from feature definitions:

```
$ npx featurevisor generate-code --language typescript --out-dir ./src
```

See output in the `./src` directory.

Learn more in the [code generation](/docs/code-generation) page.

## Find duplicate segments

It is possible to end up with multiple segments having the same conditions in larger projects. This is not a problem per se, but we should be aware of it.

We can find these duplicates early on by running:

```
$ npx featurevisor find-duplicate-segments
```

If we want to know the names of authors who worked on the duplicate segments, we can pass `--authors`:

```
$ npx featurevisor find-duplicate-segments --authors
```

## Find usage

Learn where, or whether, certain segments and attributes are used.

For each of the `find-usage` commands below, you can optionally pass `--authors` to find who worked on the affected entities.
### Segment usage

```
$ npx featurevisor find-usage --segment=my_segment
```

### Attribute usage

```
$ npx featurevisor find-usage --attribute=my_attribute
```

### Unused segments

```
$ npx featurevisor find-usage --unusedSegments
```

### Unused attributes

```
$ npx featurevisor find-usage --unusedAttributes
```

### Feature usage

```
$ npx featurevisor find-usage --feature=my_feature
```

## Benchmarking

You can measure how fast or slow your SDK evaluations are for particular features.

The `--n` option is used to specify the number of iterations to run the benchmark for.

### Feature

To benchmark evaluating whether a feature itself is enabled or disabled via the SDK's `.isEnabled()` method against provided [context](/docs/sdks/javascript/#context):

```
$ npx featurevisor benchmark \
    --environment=production \
    --feature=my_feature \
    --context='{"userId": "123"}' \
    --n=1000
```

### Variation

To benchmark evaluating a feature's variation via the SDK's `.getVariation()` method:

```
$ npx featurevisor benchmark \
    --environment=production \
    --feature=my_feature \
    --variation \
    --context='{"userId": "123"}' \
    --n=1000
```

### Variable

To benchmark evaluating a feature's variable via the SDK's `.getVariable()` method:

```
$ npx featurevisor benchmark \
    --environment=production \
    --feature=my_feature \
    --variable=my_variable_key \
    --context='{"userId": "123"}' \
    --n=1000
```

You can optionally pass `--schema-version=2` if you are using the new schema v2.

## Configuration

To view the project [configuration](/docs/configuration):

```
$ npx featurevisor config
```

Printing configuration as JSON:

```
$ npx featurevisor config --json --pretty
```

## Evaluate

To learn why certain values (like a feature and its variation or variables) are evaluated the way they are against the provided [context](/docs/sdks/javascript/#context):

```
$ npx featurevisor evaluate \
    --environment=production \
    --feature=my_feature \
    --context='{"userId": "123", "country": "nl"}'
```

This will show you the full [evaluation details](/docs/sdks/javascript/#evaluation-details), helping you debug better in case of any confusion.

It is similar to [logging](/docs/sdks/javascript/#logging) in SDKs with the `debug` level, but here we are doing it directly from the CLI in our Featurevisor project, without having to involve our application(s).

If you wish to print the evaluation details in plain JSON, you can pass `--json` at the end:

```
$ npx featurevisor evaluate \
    --environment=production \
    --feature=my_feature \
    --context='{"userId": "123", "country": "nl"}' \
    --json \
    --pretty
```

The `--pretty` flag is optional.

To print further logs in a more verbose way, you can pass `--verbose`:

```
$ npx featurevisor evaluate \
    --environment=production \
    --feature=my_feature \
    --context='{"userId": "123", "country": "nl"}' \
    --verbose
```

You can optionally pass `--schema-version=2` if you are using the new schema v2.
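For reference, these are the same values your applications would get from the SDK at runtime; the CLI command above simply explains how they were arrived at (a minimal sketch, with instance set-up omitted):

```js
import { createInstance } from '@featurevisor/sdk'

// set up the SDK instance as described in the SDK docs
const f = createInstance({
  // ...
})

const context = { userId: '123', country: 'nl' }

// the values being explained by the evaluate command above
const isEnabled = f.isEnabled('my_feature', context)
const variation = f.getVariation('my_feature', context)
```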
## List ### List features To list all features in the project: ``` $ npx featurevisor list --features ``` Advanced search options: | Option | Description | | ------------------------------ | -------------------------------------------------- | | `--archived=` | by [archived](/docs/features/#archiving) status | | `--description=` | by description pattern | | `--disabledIn=` | disabled in an [environment](/docs/environments) | | `--enabledIn=` | enabled in an [environment](/docs/environments) | | `--json` | print as JSON | | `--keyPattern=` | by key pattern | | `--tag=` | by [tag](/docs/tags/) | | `--variable=` | containing specific variable key | | `--variation=` | containing specific variation key | | `--with-tests` | with [test specs](/docs/testing) | | `--with-variables` | with variables | | `--with-variations` | with [variations](/docs/features/#variations) | | `--without-tests` | without any test specs | | `--without-variables` | without any [variables](/docs/features/#variables) | | `--without-variations` | without any variations | ### List segments To list all segments in the project: ``` $ npx featurevisor list --segments ``` Advanced search options: | Option | Description | | ---------------------------- | ----------------------------------------------- | | `--archived=` | by [archived](/docs/segments/#archiving) status | | `--description=` | by description pattern | | `--json` | print as JSON | | `--keyPattern=` | by key pattern | | `--pretty` | pretty JSON | | `--with-tests` | with [test specs](/docs/testing) | | `--without-tests` | without any test specs | ### List attributes To list all attributes in the project: ``` $ npx featurevisor list --attributes ``` Advanced search options: | Option | Description | | ---------------------------- | ------------------------------------------------- | | `--archived=` | by [archived](/docs/attributes/#archiving) status | | `--description=` | by description pattern | | `--json` | print as JSON | | `--keyPattern=` | by key pattern | | `--pretty` | pretty JSON | ### List tests To list all tests specs in the project: ``` $ npx featurevisor list --tests ``` Advanced search options: | Option | Description | | ------------------------------ | ------------------------------------------------- | | `--applyMatrix` | apply matrix for assertions | | `--assertionPattern=` | by assertion's description pattern | | `--json` | print as JSON | | `--keyPattern=` | by key pattern of feature or segment being tested | | `--pretty` | pretty JSON | ## Assess distribution To check if the gradual rollout of a feature and the weight distribution of its variations (if any exists) are going to work as expected in a real world application with real traffic against provided [context](/docs/sdks/javascript/#context), we can imitate that by running: ``` $ npx featurevisor assess-distribution \ --environment=production \ --feature=my_feature \ --context='{"country": "nl"}' \ --populateUuid=userId \ --n=1000 ``` The `--n` option controls the number of iterations to run, and the `--populateUuid` option is used to simulate different users in each iteration in this particular case. Further details about all the options: - `--environment`: the environment name - `--feature`: the feature key - `--context`: the common [context](/docs/sdks/javascript/#context) object in stringified form - `--populateUuid`: attribute key that should be populated with a new UUID, and merged with provided context. 
  - You can pass multiple attributes in your command: `--populateUuid=userId --populateUuid=deviceId`
- `--n`: the number of iterations to run the assessment for
  - The higher the number, the more accurate the distribution will be
- `--verbose`: print the merged context for better debugging

Everything happens locally in memory, without modifying any content anywhere. This command exists only to add to our confidence when questions arise about how effective Featurevisor's traffic distribution is.

## Info

Shows the count of various entities in the project:

```
$ npx featurevisor info
```

## Version

Get the current version number of Featurevisor CLI and its relevant packages:

```
$ npx featurevisor --version
```

Or do:

```
$ npx featurevisor -v
```

---
title: Code Generation
nextjs:
  metadata:
    title: Code Generation
    description: Generate code from your defined features in Featurevisor.
    openGraph:
      title: Code Generation
      description: Generate code from your defined features in Featurevisor.
      images:
        - url: /img/og/docs-code-generation.png
---

For additional compile-time and runtime safety, including autocompletion, you can generate code from your already defined features for an improved developer experience. {% .lead %}

## Why generate code?

This is an optional step that you may wish to adopt in your workflow. If you do, it will help you avoid some common mistakes:

- unintentional spelling mistakes in feature and variable keys
- worrying about the types of your variables
- worrying about passing attributes in wrong types in context

All of this is code-driven, reducing the overall cognitive load of your team.

## Supported languages

Currently only TypeScript is supported. Support for other languages is planned for the future, as the Featurevisor SDK becomes available in more languages.

## Generate code

From the root of your Featurevisor project directory, use the [CLI](/docs/cli) for generating code in a specified directory:

```{% title="Command" %}
$ npx featurevisor generate-code --language typescript --out-dir ./src
```

The generated files can be found in the `./src` directory.

## Publishing the generated code

You are free to use the generated code in any way you want. You can choose to either:

- copy/paste the code in your applications, or
- publish the generated code as a private npm package and use it in multiple applications

This guide assumes we are publishing it as a private npm package named `@yourorg/features`.

The publishing part can be done in the same [deployment](/docs/deployment) process, right after deploying your generated [datafiles](/docs/building-datafiles).

## Consuming the generated code

Initialize the Featurevisor SDK as usual, and make your newly created package aware of the SDK instance:

```js {% path="your-app/index.js" %}
import { createInstance } from '@featurevisor/sdk'
import { setInstance } from '@yourorg/features'

const f = createInstance({
  // ...
})

setInstance(f)
```

Afterwards, you can import your features from the generated package and evaluate their variations and variables.

## Importing features

Each feature as defined in our Featurevisor project is made available as an individual TypeScript namespace.

If our feature was named `foo` (existing as the `features/foo.yml` file), we can import it as follows:

```js
import { FooFeature } from '@yourorg/features'
```

The imported feature will have several methods available depending on how it's defined.
The method for checking if the feature is enabled or not is always available:

```js
FooFeature.isEnabled(context = {})
```

If your feature has any defined variations, then the `getVariation` method would also be available:

```js
FooFeature.getVariation(context = {})
```

If variables are also defined in the feature, they would be available as:

```js
FooFeature.getMyVariableKey(context = {})
```

## Passing context

You can access the full generated `Context` type as follows:

```js
import { Context } from '@yourorg/features'
```

Passing `context` in all the methods is optional.

The generated code is smart enough to know the types of all your individual attributes as defined in your Featurevisor project. Therefore, if you pass an attribute of the wrong type when evaluating variations or variables, you will get a TypeScript error.

## Checking if enabled

Assuming we have a `foo` feature defined already in the `features/foo.yml` file:

```js
import { FooFeature } from '@yourorg/features'

const context = { userId: 'user-123' }
const isFooEnabled = FooFeature.isEnabled(context)
```

## Getting variation

We can use the same imported feature to get its variation:

```js
import { FooFeature } from '@yourorg/features'

const context = { userId: 'user-123' }
const fooVariation = FooFeature.getVariation(context)
```

## Evaluating variable

If our `foo` feature had a `bar` variable defined, we can evaluate it as follows:

```js
import { FooFeature } from '@yourorg/features'

const context = { userId: 'user-123' }
const barValue = FooFeature.getBar(context)
```

The returned value will honour the variable type as defined in the feature's schema originally.

If the variable type is either `object` or `json`, we can use generics to specify the type of the returned value:

```js
interface MyType {
  someKey: string;
}

const barValue = FooFeature.getBar<MyType>(context)
```

## Accessing keys

To access the literal feature key, use the `key` property of the imported feature:

```js
import { FooFeature } from '@yourorg/features'

console.log(FooFeature.key) // "foo"
```

## Suggestions for package publishing

You are advised to publish the generated code as a private npm package, with support for ES Modules (ESM).

When published as ES Modules, it will enable tree-shaking in your applications, thus reducing the bundle size.

---
title: Infrastructure as Code (IaC)
nextjs:
  metadata:
    title: Infrastructure as Code (IaC)
    description: Learn what Infrastructure as Code (IaC) means and how it applies to Featurevisor.
    openGraph:
      title: Infrastructure as Code (IaC)
      description: Learn what Infrastructure as Code (IaC) means and how it applies to Featurevisor.
      images:
        - url: /img/og/docs-concepts-iac.png
---

Infrastructure as Code (IaC) is the practice of managing and provisioning computing resources through human-readable yet machine-parsable files written in a declarative format. {% .lead %}

Instead of manually configuring your infrastructure or using one-off scripts, IaC allows you to apply consistent settings across multiple environments, making the setup reproducible and easily versioned.

## Understanding infrastructure

Before diving further into how Featurevisor adopts IaC principles, it's essential to understand what "**infrastructure**" means in the context of IaC.

It refers to the various components that together form the operational environment where your software runs.
This infrastructure may include: - **Compute resources**: servers and containers - **Networking components**: network configuration, load balancers, firewalls - **Storage elements**: databases and file storage - **Software & applications**: operating systems, dependencies, and applications - **Security mechanisms**: access control and encryption tools - **Management & monitoring**: configuration, monitoring and alerting systems - **Deployment tools**: CI/CD pipelines and automation tools These components, when managed and provisioned as (declarative) code, enhance automation, consistency, and scalability, streamlining the development and deployment processes. ## Key characteristics - **Declarative syntax**: IaC uses a declarative approach, meaning you specify "**what**" you want to achieve, not "**how**" to achieve it. The system interprets the code to bring the environment to the desired state. - **Version control**: All configurations are stored in a version control system like [Git](/docs/concepts/gitops), providing a historical record and enabling rollback capabilities. - **Automation**: IaC is often integrated into a CI/CD (Continuous Integration/Continuous Deployment) pipeline, automating the deployment process and minimizing human error. - **Modularity & reusability**: IaC encourages modular configurations, which can be reused across different projects or environments. ## What does declarative approach mean? Declarative approach means that you specify the desired state of your infrastructure, and the system takes care of the rest. This means you express your desired state from the system as files in (usually) one of these formats, which are human-readable yet machine-parsable: - [YAML](https://en.wikipedia.org/wiki/YAML) (Featurevisor uses this by default) - [JSON](https://en.wikipedia.org/wiki/JSON) - [TOML](https://toml.io/en/) - [HCL](https://github.com/hashicorp/hcl) Refer to our [custom parser guide](/docs/advanced/custom-parsers) to learn how to use other formats. ## How does it apply to Featurevisor? Featurevisor takes the principles of IaC and [GitOps](/docs/concepts/gitops) and applies them to [feature management](/docs/feature-management). Here's how: ### GitOps workflow In Featurevisor, all feature configurations are stored in a Git repository and managed via a GitOps workflow. This ensures that changes are reviewed, approved, versioned, and auditable, just like you would expect with IaC. ### Declarative configuration Featurevisor allows you to define all your feature flags, A/B tests, and other configurations in files written in a language of your choosing (like YAML, JSON, or TOML). This provides a human-readable, yet machine-parsable, way to manage features. Just like IaC, this is a declarative approach. You specify what you want to happen with your features, and Featurevisor takes care of the rest. {% callout type="note" title="Featurevisor's building blocks" %} Each of these are expressed as separate files: - [Attributes](/docs/attributes): building block for conditions - [Segments](/docs/segments): reusable conditions for targeting users - [Features](/docs/features): feature flags and experiments with rollout rules - [Groups](/docs/groups): for mutually exclusive features and experiments {% /callout %} ### Automation with CI/CD Once changes are merged into the main or master branch of your Git repository, Featurevisor automates the propagation of these configurations to your live environment via CI/CD pipelines. 
Since Featurevisor is [cloud native](/docs/concepts/cloud-native-architecture), it can be integrated with any CI/CD tool of your choice. ### Modularity [Feature](/docs/features) configurations in Featurevisor can be modular, meaning you can have separate configurations for different features against different environments. You can also define reusable targeting conditions as [segments](/docs/segments) and apply in different features' rollout rules. This promotes reusability and makes it easier to manage complex systems. ## Other examples These are various popular open source projects that adopt IaC principles: - [Terraform](https://www.terraform.io/): Infrastructure provisioning - [Kubernetes](https://kubernetes.io/): Container orchestration - [Docker](https://www.docker.com/): Containerization - [AWS CloudFormation](https://aws.amazon.com/cloudformation/): Infrastructure provisioning - [Azure Resource Manager](https://azure.microsoft.com/en-us/features/resource-manager/): Infrastructure provisioning - [Google Cloud Deployment Manager](https://cloud.google.com/deployment-manager): Infrastructure provisioning ## Conclusion Featurevisor successfully extends the principles of Infrastructure as Code to the realm of feature management. By doing so, it offers a robust, version-controlled, and automated approach to manage your application's features. Whether you're a developer, a system admin, or a product manager, understanding the IaC principles behind Featurevisor can help you manage features more effectively and efficiently. --- title: Cloud Native Architecture nextjs: metadata: title: Cloud Native Architecture description: Learn what being Cloud Native means and how it applies to Featurevisor openGraph: title: Cloud Native Architecture description: Learn what being Cloud Native means and how it applies to Featurevisor images: - url: /img/og/docs-concepts-cloud-native.png --- Featurevisor is agnostic of what CI/CD tool you use for your deployment, and also is not opinionated on how you store the generated datafiles making it a cloud native solution that you can customize as per your needs. {% .lead %} Before we dive into how Featurevisor leverages cloud native principles, let's first clarify what "**cloud native**" means. ## What is Cloud Native? Cloud native refers to a way of building and running applications that fully leverage the advantages of cloud computing. Instead of traditional methods, where software might be designed to run on a specific set of servers, cloud native applications are built to be flexible, scalable, and resilient. They are designed to run in a cloud environment where resources can be easily added or removed as needed. ## Key Benefits of Cloud Native - **Scalability**: Easily adjust to handle more users or data. - **Flexibility**: Run your services wherever you want — be it AWS, Google Cloud, Azure, or a private cloud. - **Resilience**: Built-in fault tolerance means less downtime. - **Speed**: Deploy updates and new features much more quickly. ## Cloud Native in Featurevisor Featurevisor is an open source project designed for managing feature flags, experiments, and remote configuration for your applications. It's built with a cloud native architecture, allowing it to seamlessly integrate with various cloud services. In its simplest form, all it requires is a CI/CD pipeline and a CDN for hosting generated static JSON files. ## How does it work? 
- **Declarative configuration**: Featurevisor allows users to define feature flags and their configuration in a straightforward, [declarative](/docs/features) manner. This means you specify what you want the system to do, not how to do it. - **GitOps Workflow**: In true cloud native fashion, Featurevisor adopts a [GitOps](/docs/concepts/gitops) approach. All changes are made via Pull Requests, reviewed, and then automatically deployed by a CI/CD (Continuous Integration/Continuous Deployment) pipeline. - **Unopinionated infrastructure**: Featurevisor doesn’t tie you down to a specific cloud provider or service. Whether you're using GitHub, GitLab, Jenkins, or any CDN for static file hosting, Featurevisor is designed to work smoothly. - **Static JSON datafiles**: The configuration files, known as [datafiles](/docs/building-datafiles), are static JSON files. Every time a Pull Request is merged to the master branch, a new build is triggered, producing these datafiles. They are then [deployed](/docs/deployment) to a CDN (Content Delivery Network), ensuring fast and reliable delivery. ## What Does This Mean for You? - **Flexibility**: You're not locked into a specific tech stack. Choose the Git provider, CI/CD tool, and CDN that best suit your needs. - **Transparency**: Every change goes through a Pull Request, making it easier to track who made what change and why. - **Speed**: Thanks to the CI/CD pipeline, changes can be deployed quickly and automatically. ## Conclusion Cloud native is more than just a buzzword; it’s a powerful approach to building robust, scalable, and flexible software. Featurevisor embodies these principles by offering an open-source, cloud native solution for managing feature flags. Whether you're a developer, either focused on product engineering or DevOps, or a product manager, Featurevisor provides a streamlined, efficient way to manage your software's features while reaping the benefits of cloud native architecture. --- title: Concepts nextjs: metadata: title: Concepts description: Understand the core concepts behind Featurevisor openGraph: title: Concepts description: Understand the core concepts behind Featurevisor images: - url: /img/og/docs.png --- Understanding these core concepts will help you understand the design decisions behind Featurevisor and how to use it effectively: - [Feature management](/docs/feature-management) - [Cloud native architecture](/docs/concepts/cloud-native-architecture) - [Infrastructure as code](/docs/concepts/infrastructure-as-code) - [GitOps](/docs/concepts/gitops) --- title: What is GitOps? nextjs: metadata: title: What is GitOps? description: Learn what GitOps mean and how it applies to Featurevisor openGraph: title: What is GitOps? description: Learn what GitOps mean and how it applies to Featurevisor images: - url: /img/og/docs-concepts-gitops.png --- GitOps in its simplest form means operations via Git. This guide will explain further what it is and how it applies to Featurevisor projects. {% .lead %} ## Git If you have ever worked on a software project, you are probably familiar with [Git](https://git-scm.com/) - a tool for tracking changes in source code. There are several Git hosting providers like [GitHub](https://github.com) and [GitLab](https://gitlab.com) which you might have used already. But what if we told you that Git could be used for much more than just that? Enter GitOps, a term that's gaining traction in the software engineering community. 
## GitOps GitOps is like treating your infrastructure and operational tasks as if they were code. It builds on top of the principles of [Infrastructure as Code (IaC)](/docs/concepts/infrastructure-as-code). Imagine that you have a recipe book (your Git repository) that contains all the instructions (code) for making a dish (your software). Normally, chefs (developers) would update the recipe book and then cook the dish manually. But what if the kitchen could read the book and update the dish automatically every time the recipe changes? That's essentially what GitOps does for software deployment and operations. It automates the process of applying changes to your infrastructure based on changes to a Git repository. This makes it easier to manage, track, and roll back changes, all while using familiar tools like Git. ## Why should you care? - **Declarative**: You define the desired state of your infrastructure in a Git repository in a highly readable format - **Transparency**: All changes are tracked in Git, so you can easily see who made what change and when. - **Speed**: Automation means faster deployments, which means you get features and fixes out to users more quickly. - **Collaboration**: Because everything is stored in Git, team members can easily collaborate on changes through Pull Requests, which go via strict reviews and approval process. - **Consistency**: Automation ensures that the steps are repeated exactly each time, reducing human error. ## How does it affect Featurevisor? Featurevisor is an open-source project that falls perfectly in line with the GitOps model. It [manages your features](/docs/feature-management) which can be either on/off switches, variations for A/B testing, or even variables as remote configuration for specific functionalities in your software including their rollout rules. These [features](/docs/features) are declared as individual files in a Git repository. ## How does it work? - **Declare feature flags**: Developers declare feature flags as [files](/docs/features) and store them in a Git repository. - **Review & approve**: Any changes to feature flags must be submitted as Pull Requests in Git, allowing team members to review and approve changes. - **Automate with CI/CD**: Featurevisor is tightly integrated with your preferred Continuous Integration/Continuous Deployment (CI/CD) pipeline. When a Pull Request is merged, the pipeline automatically deploys the new configurations. You can find out more info about setting up custom deployment [here](/docs/deployment). Once deployed, all the applications that use Featurevisor [SDKs](/docs/sdks) will automatically fetch the latest feature flags and apply them to their respective environments. ## Does it limit non-technical users? As engineers, we might love the idea of managing all sorts of configuration in a Git repository because it fits our regular workflow without having to learn something new. But what about non-technical users like your Product Managers in a team? If it comes to read-only operations, Featurevisor comes with a [status site generator](/docs/site) so that the current status of all your feature flags, their targeting conditions, and rollout rules can be easily viewed by anyone in your team and organization via a nice and usable website. 
With Git hosting providers becoming more usable over time allowing changes to be made directly from your browser (like with [GitHub](https://github.com)), one does not have to be technically too advanced to find the YAML files in a Git repository to read and understand them. They can also send changes of their desired feature flags by updating or creating new files straight from the browser. But this does come with an additional learning curve. The list below can help get up to speed with the basics of Git and GitHub: ## Learning resources Assuming you are using GitHub, you can refer to these resources to learn how to send changes to your Git repository directly from your browser: - [About branches](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/proposing-changes-to-your-work-with-pull-requests/about-branches) - [Creating a branch](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/proposing-changes-to-your-work-with-pull-requests/creating-and-deleting-branches-within-your-repository) - [Editing files](https://docs.github.com/en/repositories/working-with-files/managing-files/editing-files) - [Creating a Pull Request](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/proposing-changes-to-your-work-with-pull-requests/creating-a-pull-request) - [Requesting a review](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/proposing-changes-to-your-work-with-pull-requests/requesting-a-pull-request-review) - [Comment on a Pull Request](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/reviewing-changes-in-pull-requests/commenting-on-a-pull-request) - [Merging a Pull Request](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/incorporating-changes-from-a-pull-request/merging-a-pull-request) ## Conclusion GitOps is a modern approach to software deployment and operations that leverages the power of Git. Featurevisor, with its GitOps model, provides a highly collaborative, transparent, and efficient way to manage feature flags in a software project. Whether you are a developer, an operations engineer, or a product manager, the GitOps methodology offers benefits that make your workflow smoother, more transparent, and more efficient. --- title: Configuration nextjs: metadata: title: Configuration description: Configure your Featurevisor project openGraph: title: Configuration description: Configure your Featurevisor project images: - url: /img/og/docs.png --- Every Featurevisor project expects a `featurevisor.config.js` file at the root of the project, next to `package.json`. {% .lead %} ## `featurevisor.config.js` Minimum configuration: ```js {% path="featurevisor.config.js" %} module.exports = { tags: [ 'web', 'mobile', ], environments: [ 'staging', 'production' ], } ``` As your [tags](/docs/tags) and [environments](/docs/environments) grow, you can keep adding them to your configuration file. ## Params ### `tags` An array of [tags](/docs/tags) that can be used in your [features](/docs/features/). Tags are used for building smaller [datafiles](/docs/building-datafiles) containing only the features that you need for your application(s). ```js {% path="featurevisor.config.js" %} module.exports = { tags: [ 'web', 'ios', 'android', 'all' ], } ``` ### `environments` An array of [environments](/docs/environments) that can be used in your [features](/docs/features/). 
By default, Featurevisor will use `staging` and `production` as environments: ```js {% path="featurevisor.config.js" %} module.exports = { environments: [ 'staging', 'production' ], } ``` If your project does not need any environments, you can also disable it: ```js {% path="featurevisor.config.js" %} module.exports = { environments: false, } ``` Read more in [Environments](/docs/environments) page. ### `attributesDirectoryPath` Path to the directory containing your [attributes](/docs/attributes/). Defaults to `/attributes`. ### `segmentsDirectoryPath` Path to the directory containing your [segments](/docs/segments/). Defaults to `/segments`. ### `featuresDirectoryPath` Path to the directory containing your [features](/docs/features/). Defaults to `/features`. ### `groupsDirectoryPath` Path to the directory containing your [groups](/docs/groups/). Defaults to `/groups`. ### `testsDirectoryPath` Path to the directory containing your [tests](/docs/testing/). Defaults to `/tests`. ### `datafilesDirectoryPath` Path to the directory for your generated [datafiles](/docs/building-datafiles/). Defaults to `/dist`. ### `datafileNamePattern` Defaults to `featurevisor-%s.json`. ### `revisionFileName` Defaults to `REVISION`. Name of the file that will be used to store the [revision](/docs/building-datafiles/) number of your project. ### `stateDirectoryPath` Path to the directory containing your state. Defaults to `/.featurevisor`. Read more in [State files](/docs/state-files). ### `defaultBucketBy` Default value for the `bucketBy` property in your project. Defaults to `userId`. ### `prettyState` Set to `true` or `false` to enable or disable pretty-printing of state files. Defaults to `true`. ### `prettyDatafile` Set to `true` or `false` to enable or disable pretty-printing of datafiles. Defaults to `false`. ### `stringify` By default, Featurevisor will stringify conditions and segments in generated datafiles so that they are parsed only when needed by the SDKs. This optimization technique works well when datafiles are too large in client-side devices (think browsers) and you are only dealing with one user in the runtime. This kind of optimization though can bring opposite results if you are using the SDKs in server-side (think Node.js) serving many different users. To disable this stringification, you can set it to `false`. ### `parser` By default, Featurevisor expects YAML for all definitions. You can change this to JSON by setting `parser: "json"`. See [custom parsers](/docs/advanced/custom-parsers) for more information. ### `enforceCatchAllRule` When set to `true`, linting will make sure all features have a catch-all rule with `segment: "*"` as the last rule in all environments. ### `maxVariableStringLength` Maximum length of a string variable in features. Defaults to no limit. ### `maxVariableArrayStringifiedLength` Maximum length of a stringified array variable in features. Defaults to no limit. ### `maxVariableObjectStringifiedLength` Maximum length of a stringified object variable in features. Defaults to no limit. ### `maxVariableJSONStringifiedLength` Maximum length of a JSON stringified variable in features. Defaults to no limit. 
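Putting several of the optional params above together, a more customized configuration could look like this (a sketch; the values are illustrative, not recommendations):

```js {% path="featurevisor.config.js" %}
module.exports = {
  tags: ['web', 'mobile'],
  environments: ['staging', 'production'],

  // project-wide defaults
  defaultBucketBy: 'userId',
  parser: 'yml',

  // datafile output
  prettyDatafile: false,
  stringify: true,

  // stricter linting
  enforceCatchAllRule: true,
  maxVariableStringLength: 1000,
}
```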
---
title: Contributing
nextjs:
  metadata:
    title: Contributing
    description: Learn how to contribute to Featurevisor
    openGraph:
      title: Contributing
      description: Learn how to contribute to Featurevisor
      images:
        - url: /img/og/docs.png
---

## Code of Conduct

We have adopted the [Contributor Covenant](https://www.contributor-covenant.org/) as our [Code of Conduct](https://github.com/featurevisor/featurevisor/blob/main/CODE_OF_CONDUCT.md), and we expect project participants to adhere to it.

## Branch organization

You can send your pull requests against the `main` branch.

## Bugs

We use [GitHub Issues](https://github.com/featurevisor/featurevisor/issues) for bug reporting. Before reporting any new ones, please check if it has been reported already.

## License

By contributing to Featurevisor, you agree that your contributions will be licensed under its [MIT license](https://github.com/featurevisor/featurevisor/blob/main/LICENSE).

## Development workflow

Prerequisites:

- [Node.js](https://nodejs.org/en/) v20+
- [npm](https://www.npmjs.com/) v8+
- [Git](https://git-scm.com/) v2+

### Commands

Clone the [repository](https://github.com/featurevisor/featurevisor).

Run:

```{% title="Command" %}
$ npm ci
$ npm run bootstrap
```

Make your desired changes to any packages or documentation.

To test everything:

```{% title="Command" %}
$ npm run build
$ npm test
```

Apply project-wide code styles:

```{% title="Command" %}
$ npm run lint
```

If you need to override code style and linter configurations for individual packages, you can do so by adding or changing specific rules in package-level configuration files. For example:

```js {% path="packages/sdk/prettier.config.js" %}
const rootConfig = require('../../prettier.config')

/** @type {import('prettier').Config} */
const config = {
  ...rootConfig,
  singleQuote: true,
}

module.exports = config
```

```js {% path="packages/sdk/.eslintrc.js" %}
const rootConfig = require('../../.eslintrc.js')

/** @type {import("eslint").Linter.Config} */
const config = {
  ...rootConfig,
  env: {
    node: true,
  },
  rules: {
    ...rootConfig.rules,
    '@typescript-eslint/no-explicit-any': 'off',
  },
}

module.exports = config
```

### Testing

You are advised to test your changes locally before submitting a pull request.

The core team uses the `example-1` project as a playground for testing new features and changes.

```{% title="Command" %}
$ cd examples/example-1

$ npm run lint
$ npm run build
$ npm test
$ npm start
```

### Pull Requests

Send Pull Requests against the `main` branch.

---
title: Datasource & Adapters
nextjs:
  metadata:
    title: Datasource & Adapters
    description: Go beyond file system as source of your configuration with Featurevisor
    openGraph:
      title: Datasource & Adapters
      description: Go beyond file system as source of your configuration with Featurevisor
      images:
        - url: /img/og/docs-datasource.png
---

By default, Featurevisor [CLI](/docs/cli) uses the file system for reading and writing data in your project, given it's a Git repository after all. But the [configuration](/docs/configuration) API allows you to switch to any source via adapters. {% .lead %}

## Accessing datasource

It's unlikely that you will make use of the Datasource API yourself directly, unless you are a [plugin](/docs/plugins) developer.

The `datasource` object allows you to read and write data from/to the Featurevisor project, so that you don't have to deal with the file system (or any other custom source of your project data) directly.
You can refer to the full [datasource API](https://github.com/featurevisor/featurevisor/blob/main/packages/core/src/datasource/datasource.ts) for more details. ## Datasource methods Once you have access to the `datasource` object, you can use the following methods from its instance: ### Revision See [state files](/docs/state-files) for more details. ```js const revision = await datasource.readRevision() await datasource.writeRevision(revision + 1) ``` ### Features See [features](/docs/features) for more details. ```js const features = await datasource.listFeatures() const fooFeatureExists = await datasource.featureExists('foo') const fooFeature = await datasource.readFeature('foo') await datasource.writeFeature('foo', { ...fooFeature, ...newData }) await datasource.deleteFeature('foo') ``` ### Segments See [segments](/docs/segments) for more details. ```js const segments = await datasource.listSegments() const fooSegmentExists = await datasource.segmentExists('foo') const fooSegment = await datasource.readSegment('foo') await datasource.writeSegment('foo', { ...fooSegment, ...newData }) await datasource.deleteSegment('foo') ``` ### Attributes See [attributes](/docs/attributes) for more details. ```js const attributes = await datasource.listAttributes() const fooAttributeExists = await datasource.attributeExists('foo') const fooAttribute = await datasource.readAttribute('foo') await datasource.writeAttribute('foo', { ...fooAttribute, ...newData }) await datasource.deleteAttribute('foo') ``` ### Groups See [groups](/docs/groups) for more details. ```js const groups = await datasource.listGroups() const fooGroupExists = await datasource.groupExists('foo') const fooGroup = await datasource.readGroup('foo') await datasource.writeGroup('foo', { ...fooGroup, ...newData }) await datasource.deleteGroup('foo') ``` ### Tests See [testing](/docs/testing) for more details. ```js const tests = await datasource.listTests() const fooTest = await datasource.readTest('foo') await datasource.writeTest('foo', { ...fooTest, ...newData }) await datasource.deleteTest('foo') ``` ### State See [state files](/docs/state-files) for more details. ```js const existingState = await datasource.readState(environment) datasource.writeState(environment, { ...existingState, ...newState }) ``` ### History To get history of changes made to a specific entity: ```js const fooChanges = await datasource.listHistoryEntries('feature', 'foo') ``` The first argument for entity type can be one of: - `feature` - `segment` - `attribute` - `group` - `test` ## Adapters Because a Featurevisor project is a Git repository by default, Featurevisor CLI ships with a default adapter that reads and writes data from/to the file system which is called `FilesystemAdapter`. You don't have to configure this adapter explicitly anywhere, unless you are writing a custom one. ### Writing a custom adapter You can write your own custom datasource adapter as follows: ```ts {% path="adapters/custom-adapter.ts" %} import { Adapter } from '@featurevisor/core' export class CustomAdapter extends Adapter { // ...implement the methods here } ``` Refer to the implementation of [`FilesystemAdapter`](https://github.com/featurevisor/featurevisor/blob/main/packages/core/src/datasource/filesystemAdapter.ts) to understand more. 
### Using a custom adapter You can swap out the default file system adapter with your custom adapter via you [configuration](/docs/configuration) file as found in `featurevisor.config.js`: ```js {% path="featurevisor.config.js" %} const { CustomAdapter } = require('./adapters/custom-adapter') module.exports = { environments: ['staging', 'production'], tags: ['web', 'mobile'], adapter: CustomAdapter, } ``` --- title: Deployment nextjs: metadata: title: Deployment description: Deploy your Featurevisor project datafiles openGraph: title: Deployment description: Deploy your Featurevisor project datafiles images: - url: /img/og/docs.png --- Once you have built your datafiles, you can deploy them to your CDN or any other static file hosting service. {% .lead %} We recommend that you set up a CI/CD pipeline to automate the build and deployment process, instead of doing it from your local machine. ## Steps involved When a new Pull Request (branch) is merged, the CI/CD pipeline should cover: ### Linting Lint all the attributes, segments, and feature definitions: ``` $ npx featurevisor lint ``` ### Testing Test all the features and segments: ``` $ npx featurevisor test ``` ### Build the datafiles ``` $ npx featurevisor build ``` ### Commit the state files ``` $ git add .featurevisor/* $ git commit -m "[skip ci] Revision $(cat .featurevisor/REVISION)" ``` Only the [state files](/docs/state-files) should be committed as available under `.featurevisor` directory. The generated datafiles in `datafiles` directory are ignored from Git. ### Upload to your CDN This step is specific to your CDN provider or custom infrastructure. You can use the `datafiles` directory as the root directory for your CDN. ### Push commits back to upstream We want to make sure the next Pull Request merge will be on top of the latest version and state files. ``` $ git push origin main ``` If any of the steps above fail, the CI/CD pipeline should stop and notify the team. ## Fully functional example You can refer to the CI/CD guide for [GitHub Actions](/docs/integrations/github-actions) and either [Cloudflare Pages](/docs/integrations/cloudflare-pages) or [GitHub Pages](/docs/integrations/github-pages) if you want a fully functional real-world example of setting up a Featurevisor project. Repo is available here: [https://github.com/featurevisor/featurevisor-example-cloudflare](https://github.com/featurevisor/featurevisor-example-cloudflare). --- title: Environments nextjs: metadata: title: Environments description: Customize your Featurevisor project with multiple environments openGraph: title: Environments description: Customize your Featurevisor project with multiple environments images: - url: /img/og/docs-environments.png --- Featurevisor is highly configurable and allows you to have any number of custom environments (like development, staging, and production). You can also choose to have no environments at all. {% .lead %} ## Custom environments It is recommended that you have at least `staging` and `production` environments in your project. You can add more environments as needed: ```js {% path="featurevisor.config.js" highlight="7-12" %} module.exports = { tags: [ 'web', 'mobile' ], environments: [ 'staging', 'production', // add more environments here... 
], } ``` Above configuration will help you define your features against each environment as follows: ```yml {% path="features/my_feature.yml" highlight="9,14" %} description: My feature tags: - web bucketBy: userId # rules per each environment rules: staging: - key: everyone segments: '*' percentage: 100 production: - key: everyone segments: '*' percentage: 0 ``` And the [datafiles](/docs/building-datafiles) will be built per each environment: ```{% highlight="3,6" %} $ tree datafiles . ├── staging/ │ ├── featurevisor-tag-web.json │ └── featurevisor-tag-mobile.json ├── production/ │ ├── featurevisor-tag-web.json │ └── featurevisor-tag-mobile.json ``` ## No environments You can also choose to have no environments at all: ```js {% path="featurevisor.config.js" highlight="6" %} module.exports = { tags: [ 'web', 'mobile' ], environments: false, } ``` This will allow you to define your rollout rules directly: ```yml {% path="features/my_feature.yml" %} description: My feature tags: - web bucketBy: userId # rules without needing environment specific keys rules: - key: everyone segments: '*' percentage: 100 ``` The [datafiles](/docs/building-datafiles) will be built without any environment: ``` $ tree datafiles . ├── featurevisor-tag-web.json ├── featurevisor-tag-mobile.json ``` --- title: Examples nextjs: metadata: title: Examples description: Example projects with Featurevisor openGraph: title: Examples description: Example projects with Featurevisor images: - url: /img/og/docs.png --- ## Example projects You can find example projects on [GitHub](https://github.com/featurevisor/featurevisor) in the [examples](https://github.com/featurevisor/featurevisor/tree/main/examples) directory. ### Default project By default, initializing command will locally check out the [`example-yml`](https://github.com/featurevisor/featurevisor/tree/main/examples/example-yml) project. As the name suggests, it uses YAML files. ``` $ mkdir my-project && cd my-project $ npx @featurevisor/cli init ``` ### Specific example You can also initialize a specific example project by passing the `--example` flag. For JSON: ``` $ npx @featurevisor/cli init --example=json ``` For TOML: ``` $ npx @featurevisor/cli init --example=toml ``` ## SDK examples For SDK integration examples in applications, you can find them in our [GitHub organization](https://github.com/featurevisor) with repos prefixed with [`featurevisor-examples-`](https://github.com/orgs/featurevisor/repositories?q=featurevisor-example&type=all&language=&sort=). --- title: Frequently Asked Questions nextjs: metadata: title: Frequently Asked Questions description: Frequently asked questions about Featurevisor openGraph: title: Frequently Asked Questions description: Frequently asked questions about Featurevisor images: - url: /img/og/docs.png --- ## Is Featurevisor free? It is an Open Source project, released under the MIT license. You can use it for free. ## Why is it free? Where's the catch? There's none really. The tool's author is a developer, and he wanted to see how far he can stretch the limits of using declarative files in a Git repository for [feature management](/docs/feature-management) needs. It worked for him, and now it might work for you as well. ## Is there any UI for managing features? Featurevisor is a tool aimed at developers, and its entire workflow is based on working with a [Git repository](/docs/concepts/gitops). It is built on top of [Infrastructure as Code (IaC)](/docs/concepts/infrastructure-as-code) principles. 
Therefore there is no UI involved when it comes to changing anything. You will be editing files (like in [YAML or JSON](/docs/advanced/custom-parsers)) in your repository, and committing them to Git. There is a [status site](/docs/site) generator though, which is a static site generated from the repository content you can host internally for your organization in read-only mode. ## Should I use Featurevisor? It depends. If you are a team or organization that is open to managing all their feature flags, experiments, or any remote configuration via a Git repository based workflow, then yes, Featurevisor can fit in very nicely. But an organization can be more than just its engineering team(s), and many other stakeholders might be involved in the process of managing features and experiments, including people who may not know how to use Git. If they need to be involved in the process, then it's best you look for another tool. ## Will Featurevisor ever target non-technical people? The path to v1.0 has been fully focused on creating a solution that's fully Git based. So it excluded non-technical people from the process as a result, aiming to serve developers only. There's no plan to change that in the near future. But, never say never. An Open Source project can always keep evolving. ## Can I switch from another SaaS to Featurevisor? Yes, you can. But Featurevisor does not provide any migration tool to import your features and experiments from another service (yet). ## Can I switch to another tool or SaaS later? Given you own everything in your own repository, you can switch to any other tool or third-party SaaS at any time later. There's no lock-in. You will also be owning the responsibility of migration yourself in that case. ## Do you accept donations? No, but thanks if you were thinking about it. The author of Featurevisor is doing this for fun and to learn new things by spending his evenings and weekends building this project. He's not looking to make money out of it. ## Can I contribute to Featurevisor? Yes! Please see our contribution guidelines [here](/docs/contributing). --- title: Feature Management nextjs: metadata: title: Feature Management description: Learn what feature management is all about and how to use it to roll out new features safely. openGraph: title: Feature Management description: Learn what feature management is all about and how to use it to roll out new features safely. images: - url: /img/og/docs-feature-management.png --- In software development, a "**feature**" is a distinct unit of functionality that fulfills a particular requirement or solves a specific problem. Features are the building blocks of any software, be it a simple button in a mobile app or a complex e-commerce service. {% .lead %} Managing features effectively is critical for the success of any software project, and that's where the practice of feature management comes into play. It is in the end a practice rather than a tool, and this guide will help you understand what it is all about and how Featurevisor can help you with it. ## What is Feature Management? Feature Management is the practice of controlling the visibility and behavior of different features within a software application. It involves a set of techniques and tools that allow you to: - **Toggle features**: Turn features on or off without changing the application code. - **Rollout control**: Gradually release new features to a subset of users. 
- **A/B & multivariate testing**: [Experiment](/docs/use-cases/experiments) with different variations of a feature to see which performs better. - **Remote configuration**: Change the [behavior](/docs/use-cases/remote-configuration) or appearance of features without deploying new application code. ## Benefits - **Faster time-to-market**: Feature management enables you to deploy code as soon as it's written and tested, even if the feature isn't fully complete. This results in quicker releases and a faster time-to-market. - **Reduced risk**: By using feature flags, you can release new features in a controlled manner, making it easier to roll back in case of errors or issues. This reduces the risk associated with each release. - **Increased flexibility**: Feature management allows for more dynamic and flexible software releases. You can toggle features on or off, perform A/B tests, or roll out features to specific user segments without requiring code changes. - **Improved user experience**: With feature management, you can personalize features for specific user segments, improving user satisfaction and potentially increasing conversion rates. - **Streamlined testing**: Feature flags enable more efficient testing strategies like canary releases and A/B testing, making it easier to gather user feedback and make data-driven decisions. - **Better collaboration**: Feature management tools centralize all your feature configuration in one place to collaborate and manage features more effectively. - **Phased rollouts**: Feature management allows you to gradually release new features, collecting data and feedback at each stage to ensure that the rollout is as smooth as possible. - **Simplified debugging**: When an issue arises, it's easier to pinpoint the cause when you can control the configuration of your features in one place, including seeing history of its recent changes. ## Terms There are various terms that are commonly used in the context of feature management. Let's take a look at some of them: ### Feature flags A feature flag, also known as a feature toggle, is a software development technique that's basically an implementation of "if" condition in your code allowing you to enable or disable functionality in your application, ideally without requiring any further deployments. This is achieved by [decoupling](/docs/use-cases/decouple-releases-from-deployments) your feature releases from application deployments. Feature flags can be referred to by many other different names as well: - feature toggle - feature switch - feature rollout - feature release - feature launch Usually feature flags are evaluated at runtime, meaning that the application will check the value of the flag at the time of execution and behave accordingly. And the evaluated value of the flag can be different for different users, depending on the conditions you set. In Featurevisor, feature flags are expressed as [features](/docs/features), which can be a simple boolean flag, or an A/B test, or a multivariate test with scoped variables depending on your use case. ### A/B tests A/B testing is an [experimentation](/docs/use-cases/experiments) technique used to compare two or more variations of a feature to see which one performs better against your conversion goals. Your conversion goals can be anything from increasing the number of sign-ups to improving the click-through rate of a button. It is a great way to validate your assumptions and make data-driven decisions. 
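In Featurevisor terms, evaluating a flag and its A/B test variation at runtime looks roughly like this. It is a minimal sketch using the JavaScript SDK; the feature key `checkout_redesign` and the context attributes are hypothetical, and a real application would also load a datafile into the instance first:

```js
import { createInstance } from '@featurevisor/sdk'

// create an SDK instance (in a real app, a datafile is loaded into it)
const f = createInstance({})

// attributes describing the current user (the context)
const context = { userId: 'user-123', country: 'nl' }

// feature flag: is the feature enabled for this user?
const isEnabled = f.isEnabled('checkout_redesign', context)

// A/B test: which variation is this user bucketed into?
const variation = f.getVariation('checkout_redesign', context)

if (isEnabled && variation === 'treatment') {
  // render the new checkout experience
}
```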
In Featurevisor, everything is expressed as a [feature](/docs/features) including basic on/off feature flags and also A/B tests with multiple [variations](/docs/features/#variations). You can learn more about how both are evaluated by the SDKs at application runtime [here](/docs/sdks). ### Canary release Canary release is a technique used to gradually roll out new features to a subset of your users. It is a great way to test new features in production with real users and gather feedback before rolling them out to all your traffic. Learn more about how it works with Featurevisor in the [Progressive Delivery](/docs/use-cases/progressive-delivery) guide. ### Dark launches The practice of releasing new features that are hidden from users, allowing developers to test functionalities in a production environment without exposing them to the general public. You can read further about its use cases in: - [Testing in production](/docs/use-cases/testing-in-production) - [Trunk-based development](/docs/use-cases/trunk-based-development) ### Rollout The process of making a particular feature or change available to users. This can be done [incrementally](/docs/use-cases/progressive-delivery) or all at once, depending on the strategy. You can see how the rollout rules are defined in Featurevisor [here](/docs/features/#rules). ### Targeting rules Targeting rules are the conditions that determine whether a particular feature is exposed to a particular user or not. They can be based on a variety of factors, such as: - User attributes (e.g. age, location, subscription plan, etc.) - User behavior (e.g. number of sessions, number of purchases, etc.) - Device attributes (e.g. browser, screen size, OS, etc.) - And more... In Featurevisor, targeting conditions are expressed as [segments](/docs/segments), which contain a set of conditions that must be met for a user to be included in the segment. Those conditions are defined against [attributes](/docs/attributes), and evaluated against the provided [context](/docs/sdks/javascript/#context). ### Bucketing The practice of categorizing users into different cohorts, or "buckets", often to test multiple variations of an A/B test or to handle a regular feature flag's incremental rollout. Read more about how Featurevisor handles bucketing [here](/docs/bucketing), and how it is expressed in features [here](/docs/features/#bucketing). ## Challenges with traditional feature management Traditionally, feature management has been done by hardcoding conditional statements directly in the application code, which leads to several challenges: - **Code complexity**: Using conditional statements to control features can make the codebase messy. - **Deployment risks**: Rolling out features without a controlled environment can lead to unexpected issues. - **Lack of flexibility**: Once a feature is deployed, it's generally difficult to modify or roll it back. - **Collaboration gaps**: Development, QA, and product management often lack a unified tool to control and monitor features. ## Enter Featurevisor Featurevisor is open-source software specifically designed to tackle the challenges of feature management. Here's how: ### GitOps principles Featurevisor adopts a [GitOps](/docs/concepts/gitops) workflow, making it easier to manage, review, and approve feature changes through Pull Requests. This brings in accountability and ensures only vetted changes go live.
### Transparency and auditability Because everything lives in Git, you also get the benefits of version control: you can easily roll back to a previous version if needed, and you have a full history of all changes for auditing purposes. ### Independent configuration deployment Featurevisor allows you to deploy configurations independently of the main application. These configurations, known as "[**datafiles**](/docs/building-datafiles)", contain all the settings related to your feature flags, A/B tests, and variables. This helps [decouple](/docs/use-cases/decouple-releases-from-deployments) releases from application deployments. ### Instant updates Featurevisor [SDKs](/docs/sdks) ensure the latest configuration is fetched by your applications, meaning you can toggle features on or off instantly without waiting for a new application deployment. ### Cloud Native and unopinionated Whether you are using AWS, Google Cloud, Azure, or any other cloud service, Featurevisor's [cloud native architecture](/docs/concepts/cloud-native-architecture) seamlessly integrates with your existing tech stack. It has no preference for Git hosting, CI/CD tools, or CDN, offering you unparalleled flexibility. ## Conclusion Feature management is crucial in modern software engineering for deploying software in a safer, faster, and more controlled manner. Featurevisor takes it a notch higher by incorporating best practices like GitOps and offering instant, flexible configuration updates. By adopting Featurevisor, you are not just choosing a tool; you are opting for a more efficient and effective approach to managing your software's features and everything to do with their releases. --- title: Features nextjs: metadata: title: Features description: Learn how to create feature flags in Featurevisor openGraph: title: Features description: Learn how to create feature flags in Featurevisor images: - url: /img/og/docs-features.png --- Features are the building blocks for creating traditional boolean feature flags and more advanced multivariate experiments, along with variables. {% .lead %} ## Evaluations The goal of creating a feature is to be able to evaluate its values in your application with the provided [SDKs](/docs/sdks). The evaluated values can be any of: - **Flag** (`boolean`): its own on/off status - **Variation** (`string`): a string value if you have A/B tests running - **Variables**: a set of key/value pairs (if any) You can learn more about how the evaluation logic works for each [here](#evaluation-flow). ## Create a Feature Let's say we have built a new sidebar in our application's UI, and we wish to roll it out gradually to our users. We can do that by creating a new feature called `sidebar`: ```yml {% path="features/sidebar.yml" %} description: Sidebar tags: - all bucketBy: userId rules: production: - key: everyone segments: '*' # `*` means everyone percentage: 100 # rolled out 100% ``` This is the smallest possible definition of a feature in Featurevisor. Quite a few things are happening there. We will go through each of the properties from the snippet above and more in the following sections. ## Description This is for describing what the feature is about, and is meant to be used by the team members who are working on the feature. ```yml {% path="features/sidebar.yml" %} description: Some human readable description of this particular feature # ... ``` ## Tags Tags are used to group features together. This helps your application load only the features that are relevant to the application itself.
Very useful when you have multiple applications targeting different platforms (like Web, iOS, Android) in your organization and you are managing all the features from the same Featurevisor project. Array of tags are defined in the `tags` property: ```yml {% path="features/sidebar.yml" %} # ... tags: - all - web - ios ``` Read more about how tags are relevant in [building datafiles](/docs/building-datafiles) per [tag](/docs/tags/). ## Bucketing The `bucketBy` property is used to determine how the feature will be bucketed. Meaning, how the values of a feature are assigned to a user as they get rolled out gradually. ### Single attribute If the user's ID is always known when the particular feature is evaluated, it makes sense to use that as the `bucketBy` value. ```yml {% path="features/sidebar.yml" %} # ... bucketBy: userId ``` Given we used `userId` attribute as the `bucketBy` value, it means no matter which application or device the user is using, as long as the `userId` attribute's value is the same in [context](/docs/sdks/javascript/#context), the same value(s) of the feature will be consistently evaluated for that particular user. ### Anonymous users If the user is anonymous, you can consider using `deviceId` or any other unique identifier that is available in the context instead. ```yml {% path="features/sidebar.yml" %} # ... bucketBy: deviceId ``` ### Combining attributes If you want to bucket users against multiple attributes together, you can do as follows: ```yml {% path="features/sidebar.yml" %} # ... bucketBy: - organizationId - userId ``` ### Alternative attribute If you want to bucket users against first available attribute only, you can do as follows: ```yml {% path="features/sidebar.yml" %} # ... bucketBy: or: - userId - deviceId ``` You can read more about bucketing concept [here](/docs/bucketing). ## Rules A feature can have multiple rollout rules for each environment. ### By environment The environment keys are based on your project configuration. Read more in [Configuration](/docs/configuration) and [Environments](/docs/environments/). ```yml {% path="features/sidebar.yml" highlight="4,9" %} # ... rules: staging: - key: everyone segments: '*' percentage: 100 production: - key: everyone segments: '*' percentage: 100 ``` ### Rule key Each rule must have a unique `key` value among sibling rules within that environment. This is needed to maintain [consistent bucketing](/docs/bucketing/) as we increase our rollout percentage over time, ensuring the same user gets the same feature value every time they are evaluated against the same context. ```yml {% path="features/sidebar.yml" highlight="5,9" %} # ... rules: production: - key: nl segments: netherlands percentage: 50 - key: everyone segments: '*' # everyone percentage: 100 ``` The first rule matched always wins when features are evaluated by the [SDKs](/docs/sdks/) against provided [context](/docs/sdks/javascript/#context). ### Segments Targeting your audience is one of the most important things when rolling out features. You can do that with reusable [segments](/docs/segments/). #### Targeting everyone If we wish to roll out a feature to everyone, we can use the `*` asterisk in `segments` property inside a rule: ```yml {% path="features/sidebar.yml" highlight="6" %} # ... rules: production: - key: everyone segments: '*' percentage: 100 ``` #### Specific segment If we wish to roll out a feature to a specific segment: ```yml {% path="features/sidebar.yml" highlight="6" %} # ... 
rules: production: - key: de segments: germany # referencing segments/germany.yml percentage: 100 # any value between 0 and 100 (inclusive) ``` #### Complex We can combine `and`, `or`, and `not` operators to create complex segments: ##### With `and` operator: ```yml {% path="features/sidebar.yml" highlight="7-10" %} # ... rules: production: - key: de+iphone # any unique string is fine here # targeting: iphone users in germany segments: and: - germany - iphoneUsers percentage: 100 ``` ##### With `or` operator: ```yml {% path="features/sidebar.yml" highlight="7-10" %} # ... rules: production: - key: nl-or-de # targeting: users from either The Netherlands or Germany segments: or: - netherlands - germany percentage: 100 ``` ##### With `not` operator: ```yml {% path="features/sidebar.yml" highlight="7-9" %} # ... rules: production: - key: not-de # targeting: users from everywhere except Germany segments: not: - germany percentage: 100 ``` ##### Combining multiple operators Combining `and`, `or`, and `not` operators: ```yml {% path="features/sidebar.yml" highlight="10-18" %} # ... rules: production: - key: '1' # targeting: # - adult users with iPhone, and # - from either The Netherlands or Germany, and # - not newsletterSubscribers segments: - and: - iphoneUsers - adultUsers - or: - netherlands - germany - not: - newsletterSubscribers percentage: 100 ``` ##### Nested operators You can also nest `and`, `or`, and `not` operators: ```yml {% path="features/sidebar.yml" highlight="6-12" %} # ... rules: production: - key: '1' segments: - and: - iphoneUsers - adultUsers - or: - netherlands - germany percentage: 100 ``` ### Percentage The `percentage` property is used to determine what percentage of users matching the segments of the particular rule will see this feature as enabled: ```yml {% path="features/sidebar.yml" highlight="7" %} # ... rules: production: - key: everyone segments: '*' percentage: 100 ``` You can choose a number between `0` and `100` (inclusive), with up to 2 decimal places. ### Rule description You can also describe each rule with an optional `description` property. This is useful for documentation purposes: ```yml {% path="features/sidebar.yml" highlight="6" %} # ... rules: production: - key: everyone description: Rollout to everyone in production segments: '*' percentage: 100 ``` ## Variations A feature can have multiple variations if you wish to run A/B test [experiments](/docs/use-cases/experiments/). ### Weights Each variation must have a unique string value with their own weights (out of 100): ```yml {% path="features/sidebar.yml" %} # ... variations: - value: control weight: 50 - value: treatment weight: 50 ``` The sum of all variations' weights must be 100. You can have up to 2 decimal places for each weight. {% callout type="note" title="Control variation" %} In the world of experimentation, the default variation is usually called the `control` variation, and the second variation is called `treatment`. But you are free to name them however you want, and create as many variations as you want. {% /callout %} You can read more about experimentation [here](/docs/use-cases/experiments). ### Disabled variation value If the feature is evaluated as disabled, then its variation will evaluate as `null` by default. You can change this behaviour by setting `disabledVariationValue` property in the feature: ```yml {% path="features/sidebar.yml" highlight="3" %} # ... 
disabledVariationValue: control variations: - value: control weight: 50 - value: treatment weight: 50 ``` ### Overriding variation value You can also override the variation of a feature for a specific rule: ```yml {% path="features/sidebar.yml" highlight="8" %} # ... rules: production: - key: nl segments: netherlands percentage: 100 variation: control ``` This is useful when you know the desired variation you want to stick to in a specific rule's segments, while continuing testing other variations in other rules. ### Overriding variation weights The weights of variations as defined in `variations` is honoured by all rules by default. There might be cases where you want to override the weights of variations just for a specific rule. You can do that by defining `variationWeights` property in the rule: ```yml {% path="features/sidebar.yml" highlight="8-10" %} # ... rules: production: - key: everyone segments: '*' percentage: 100 variationWeights: control: 70 treatment: 30 ``` ### Variation description You can also describe each variation with an optional `description` property for documentation purposes: ```yml {% path="features/sidebar.yml" highlight="5,9" %} # ... variations: - value: control description: Default experiment for all users weight: 50 - value: treatment description: The new sidebar design that we are testing weight: 50 ``` ## Variables Variables are really powerful, and they allow you to use Featurevisor as your application's runtime configuration management tool. ### Schema Before assigning variable values, we must define the schema of our variables in the feature: ```yml {% path="features/sidebar.yml" %} # ... variablesSchema: bgColor: type: string defaultValue: red ``` Like variations, variables can also have a `description` property for documentation purposes. You can read more about using Featurevisor for remote configuration needs [here](/docs/use-cases/remote-configuration). ### Default when disabled If the feature itself is evaluated as disabled, its variables will evaluate as `null` by default. If you want to serve the variable's default value when the feature is disabled, you can set `useDefaultWhenDisabled: true`: ```yml {% path="features/sidebar.yml" highlight="7" %} # ... variablesSchema: bgColor: type: string defaultValue: red useDefaultWhenDisabled: true ``` ### Disabled variable value If you want a specific variable value to be served instead of the default one when the feature itself is disabled, you can set `disabledValue` property: ```yml {% path="features/sidebar.yml" highlight="7" %} # ... variablesSchema: bgColor: type: string defaultValue: red disabledValue: purple ``` ### Variable description You can also describe each variable with an optional `description` property for documentation purposes: ```yml {% path="features/sidebar.yml" highlight="6" %} # ... variablesSchema: bgColor: type: string description: Background colour of the sidebar defaultValue: red ``` ### Supported types These types of variables are allowed: - `string` - `boolean` - `integer` - `double` - `array` (of strings) - `object` (flat objects only) - `json` (any valid JSON in stringified form) #### `string` ```yml # ... variablesSchema: bgColor: type: string defaultValue: red ``` #### `boolean` ```yml # ... variablesSchema: showSidebar: type: boolean defaultValue: false ``` #### `integer` ```yml # ... variablesSchema: position: type: integer defaultValue: 1 ``` #### `double` ```yml # ... variablesSchema: amount: type: double defaultValue: 9.99 ``` #### `array` ```yml # ... 
variablesSchema: acceptedCards: type: array defaultValue: - visa - mastercard ``` #### `object` ```yml # ... variablesSchema: hero: type: object defaultValue: title: Welcome subtitle: Welcome to our website ``` #### `json` ```yml # ... variablesSchema: hero: type: json defaultValue: '{"title": "Welcome", "subtitle": "Welcome to our website"}' ``` ### Overriding variables #### From rules You can override variable values for specific rules: ```yml {% path="features/sidebar.yml" highlight="8-9" %} # ... rules: production: - key: nl segments: netherlands percentage: 100 variables: bgColor: orange ``` #### From variations We can assign values to the variables inside variations: ```yml {% path="features/sidebar.yml" highlight="9-10" %} # ... variations: - value: control weight: 50 - value: treatment weight: 50 variables: bgColor: blue ``` If users are bucketed in the `treatment` variation, they will get the `bgColor` variable value of `blue`. Otherwise they will fall back to the default value of `red` as defined in the variables schema. #### Further overriding from variation ```yml {% path="features/sidebar.yml" highlight="9-13" %} # ... variations: # ... - value: treatment weight: 100 variableOverrides: # bgColor should be orange if `netherlands` segment is matched bgColor: - segments: netherlands value: orange # for everyone else in `treatment` variation, it should be blue variables: bgColor: blue ``` If you want to embed overriding conditions directly within variations: ```yml {% path="features/sidebar.yml" highlight="10-13" %} # ... variations: # ... - value: treatment weight: 100 variableOverrides: bgColor: - conditions: - attribute: country operator: equals value: nl value: orange variables: bgColor: blue ``` ## Required A feature can be dependent on one or more other features. This is useful when you want to make sure that a feature is only allowed to continue its evaluation if the other required features are also evaluated as enabled first. For example, let's say we have a new feature under development for redesigning the checkout flow of an e-commerce application. We can call it `checkoutRedesign`. And we have another feature called `checkoutPromo` which is responsible for showing a promo code input field in the new redesigned checkout flow. We can call it `checkoutPromo`. ### Required as enabled Given the `checkoutPromo` feature is dependent on the `checkoutRedesign` feature, we can express that in YAML as follows: ```yml {% path="features/checkoutPromo.yml" highlight="7-8" %} description: Checkout promo tags: - all bucketBy: userId required: - checkoutRedesign # ... ``` This will make sure that `checkoutPromo` feature can continue its evaluation by the SDKs if `checkoutRedesign` feature is enabled against the same context first. ### Required with variation It is possible to have multiple features defined as required for a feature. Furthermore, you can also require the feature(s) to be evaluated as a specific variation: ```yml {% path="features/checkoutPromo.yml" highlight="7-13" %} description: Checkout promo tags: - all bucketBy: userId required: # checking only if checkoutRedesign is enabled - checkoutRedesign # require the feature to be evaluated with a specific variation - key: someOtherFeature variation: treatment # ... ``` If both the required features are evaluated as desired, the dependent feature `checkoutPromo` will then continue with its own evaluation. You can read more about managing feature dependencies [here](/docs/use-cases/dependencies). 
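At runtime, this dependency means the dependent feature can never evaluate as enabled unless all of its required features evaluate as enabled for the same context first. A small sketch, assuming `f` is a Featurevisor SDK instance:

```js
const context = { userId: 'user-123' }

// if the required feature evaluates as disabled for this context...
f.isEnabled('checkoutRedesign', context) // false

// ...then the dependent feature also evaluates as disabled,
// regardless of its own rollout rules
f.isEnabled('checkoutPromo', context) // false
```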
## Force You can force a feature to be enabled or disabled against custom conditions. This is very useful when you wish to test something out quickly just for yourself in a specific environment, without affecting any other users. ### With conditions ```yml {% path="features/sidebar.yml" %} # ... force: production: - conditions: - attribute: userId operator: equals value: '123' # enable or disable it enabled: true # forced variation variation: treatment # variables can also be forced variables: bgColor: purple ``` ### With segments Instead of `conditions` above, you can also use `segments` for forcing the enabled status, variations, and variables. ```yml {% path="features/sidebar.yml" highlight="5" %} # ... force: production: - segments: QATeam # enable or disable it enabled: true # forced variation variation: treatment # variables can also be forced variables: bgColor: purple ``` You can see our use case covering this functionality in the [testing in production](/docs/use-cases/testing-in-production) guide. Unlike rules, forced evaluations do not require the `key` and `percentage` properties. ## Deprecating You can deprecate a feature by setting `deprecated: true`: ```yml {% path="features/sidebar.yml" %} deprecated: true # ... ``` Deprecating a feature will still include the feature in generated [datafiles](/docs/building-datafiles/) and [SDKs](/docs/sdks/) will still be able to evaluate the feature, but evaluating it will show a warning in the logs. Similarly, variables can also be deprecated: ```yml {% path="features/sidebar.yml" highlight="7" %} # ... variablesSchema: bgColor: type: string defaultValue: red deprecated: true # mark as deprecated ``` This is done to help notify developers to stop using the affected feature or its variable without breaking the application. ## Archiving You can archive a feature by setting `archived: true`: ```yml archived: true # ... ``` Doing so will exclude the feature from generated datafiles, and SDKs will not be able to evaluate the feature. ## Expose In some cases, you may not want to expose a certain feature's configuration in specific environments when generating the [datafiles](/docs/building-datafiles). Exposure here means the inclusion of the feature's configuration in the generated datafile, irrespective of whether the feature is later evaluated as enabled or disabled. This is different than: - **Archiving**: because you only want to control the exposure of the feature's configuration in a specific environment, not all environments - **Deprecating**: because deprecating a feature will still expose the configuration in all environments - **0% rollout**: because this will evaluate the feature as disabled as intended, but still expose the configuration in the datafiles, which we do not want To achieve that, we can use the `expose` property: ```yml {% path="features/sidebar.yml" highlight="8" %} # ... # this optional property tells Featurevisor # to not include this feature config # when generating datafiles # for this specific environment expose: production: false ``` This technique is useful if you wish to test things out in a specific environment (like staging) without affecting the rest of the environments (like production). You can take things a bit further if you wish to expose the feature only for certain tags in an environment: ```yml {% path="features/sidebar.yml" highlight="13-17" %} # ...
# imagine you already had these tags tags: - web - ios - android # this optional property tells Featurevisor # to include this feature config # when generating datafiles # for only these specific tags expose: production: - web - ios # skipping `android` here ``` Ideally we never wish to keep `expose` property in our definitions, and it is only meant to serve our short term needs especially when we might be migrating from another feature management tool to Featurevisor. ## Evaluation flow Understand how each type of evaluation works in Featurevisor: ### Flag Flag here means the boolean value of the feature itself, meaning whether it is enabled or disabled. These are the sequential steps to evaluate a feature's flag value via its [SDKs](/docs/sdks/): 1. If [sticky](/docs/sdks/javascript/#sticky) feature is available in SDK, use the boolean value from there 1. If feature key does not exist in [datafile](/docs/building-datafiles/), the feature is evaluated as disabled 1. If there are any [`required`](#required) (dependency) features, and they do not satisfy the conditions, the feature is evaluated as disabled 1. If there are any [forced rules](#force), use the first one that matches the [context](/docs/sdks/javascript/#context) and use its `enabled` value 1. Find the first [rule](#rules) that matches against the [context](/docs/sdks/javascript/#context) and use its `enabled` value, otherwise use [bucketing](/docs/bucketing/) logic to determine if the feature is enabled or not 1. If no rules matched, the feature is evaluated as disabled ### Variation 1. If [sticky](/docs/sdks/javascript/#sticky) variation value is available in SDK, use the value from there 1. If the feature's flag value is `false`: 1. The variation is evaluated as `null` by default 1. If [`disabledVariationValue`](#disabled-variation-value) is set, the variation is evaluated as that value 1. If the feature's flag value is `true`: 1. If there are any [forced rules](#force), use the first one that matches the [context](/docs/sdks/javascript/#context), and use its `variation` value 1. If there are any [rules](#rules) that match against the [context](/docs/sdks/javascript/#context), use the rule's `variation` value, otherwise use [bucketing](/docs/bucketing/) logic to determine the variation value ### Variable 1. If [sticky](/docs/sdks/javascript/#sticky) variable value is available in SDK, use the value from there 1. If the feature's flag value is `false`: 1. The variable is evaluated as `null` by default 1. If [`useDefaultWhenDisabled`](#default-when-disabled) is set, the variable is evaluated as its [`defaultValue`](#schema) 1. If [`disabledValue`](#disabled-variable-value) is set, the variable is evaluated as that specific value 1. If the feature's flag value is `true`: 1. If there are any [forced rules](#force), use the first one that matches the [context](/docs/sdks/javascript/#context), and use its `variables` value (if any) 1. If there are any [rules](#rules) that match against the [context](/docs/sdks/javascript/#context), use the rule's `variables` value (if any) 1. If feature has [variations](#variations): 1. Evaluate the variation value first 1. If that specific variation has any variable overrides, use that value 1. 
Fall back to the variable's [`defaultValue`](#schema) --- title: Express.js nextjs: metadata: title: Express.js description: Learn how to integrate Featurevisor in Express.js applications for evaluating feature flags openGraph: title: Express.js description: Learn how to integrate Featurevisor in Express.js applications for evaluating feature flags images: - url: /img/og/docs-frameworks-express.png --- Set up a Featurevisor SDK instance in an Express.js application using a custom middleware, including TypeScript integration for evaluating feature flags. {% .lead %} ## Hello World application Before going into the Featurevisor integration, let's create a simple Hello World [Express.js](https://expressjs.com/) application. We start by installing the package: ``` $ npm install --save express ``` Then we create a file `index.js` with the following content: ```js // index.js const express = require('express') const PORT = 3000 const app = express() app.get('/', (req, res) => { res.send('Hello World!') }) app.listen(PORT, () => { console.log(`Example app listening on port ${PORT}`) }) ``` We can start the server with this command: ``` $ node index.js Example app listening on port 3000 ``` ## Featurevisor integration We install the Featurevisor SDK now: ``` $ npm install --save @featurevisor/sdk ``` We can now create an instance of the SDK and use it in our application: ```js // index.js const express = require('express') const { createInstance } = require('@featurevisor/sdk') const PORT = 3000 const DATAFILE_URL = 'https://cdn.yoursite.com/datafile.json' const app = express() const f = createInstance({}) app.get('/', (req, res) => { const featureKey = 'myFeature' const context = { userId: 'user-123' } const isEnabled = f.isEnabled(featureKey, context) if (isEnabled) { res.send('Hello World!') } else { res.send('Not enabled yet!') } }) fetch(DATAFILE_URL) .then((response) => response.json()) .then((datafile) => { f.setDatafile(datafile) // we start the server only after the datafile is loaded app.listen(PORT, () => { console.log(`Example app listening on port ${PORT}`) }) }) ``` ## Middleware It is very unlikely that we will have all our routes defined in the same `index.js` file, making it difficult for us to use the same Featurevisor SDK instance in all of them. To solve this problem, we can create a custom middleware that attaches the Featurevisor SDK instance to the `req` object, so that we can use the same instance in all our routes throughout the lifecycle of this application. ```js // index.js // ... const f = createInstance({}) app.use((req, res, next) => { req.f = f next() }) // ... ``` Now from anywhere in our application (either in `index.js` or some other module), we can access the Featurevisor SDK instance via `req.f`: ```js app.get('/my-route', (req, res) => { const { f } = req const featureKey = 'myFeature' const context = { userId: 'user-123' } const isEnabled = f.isEnabled(featureKey, context) if (isEnabled) { res.send('Hello World!') } else { res.send('Not enabled yet!') } }) ``` ## TypeScript usage If you are using TypeScript, you can extend the `Request` interface to add the `f` property for the Featurevisor SDK instance.
Create a new `custom.d.ts` file and make sure to add it in `tsconfig.json`'s `files` section: ```ts import { FeaturevisorInstance } from '@featurevisor/sdk' /* because this file contains a top-level import, it is treated as a module, so the Express namespace must be augmented inside `declare global` */ declare global { namespace Express { export interface Request { f: FeaturevisorInstance } } } ``` ## Refreshing datafile Because a server instance is meant to run for a long time, we might want to refresh the datafile periodically so that the latest datafile is always used without needing to restart the server. See more documentation in the [JavaScript SDK page](/docs/sdks/javascript/#interval-based-update). ## Child instances If you are in need of request-specific context isolation, you may want to look into spawning child instances from the primary Featurevisor SDK instance [here](/docs/sdks/javascript/#child-instance). ## Working repository You can find a fully functional example of this integration on GitHub: [https://github.com/featurevisor/featurevisor-example-expressjs](https://github.com/featurevisor/featurevisor-example-expressjs). --- title: Nuxt nextjs: metadata: title: Nuxt description: Learn how to integrate Featurevisor in Nuxt applications for evaluating feature flags openGraph: title: Nuxt description: Learn how to integrate Featurevisor in Nuxt applications for evaluating feature flags images: - url: /img/og/docs-frameworks-nuxt.png --- {% callout type="warning" title="Featurevisor v1" %} This guide is written keeping Featurevisor v1 in mind. It will be updated to be v2-compatible soon. {% /callout %} Set up Featurevisor SDK in Nuxt applications for evaluating feature flags in Vue.js components. {% .lead %} ## Creating a Nuxt application If you don't have a [Nuxt](https://nuxt.com/) application yet, you can create one using the following command: ``` $ npx nuxi@latest init my-app ``` ## Installing SDK Install the Featurevisor SDK using npm: ``` $ npm install --save @featurevisor/sdk ``` It is recommended to be familiar with the SDK API before reading this guide further. You can find the full API documentation [here](/docs/sdks). ## Setting up Featurevisor SDK We would like to be able to set up the Featurevisor SDK instance once and reuse the same instance everywhere. To achieve this, we will create a new module `featurevisor.ts` in the root of the project: ```ts // ./featurevisor.ts import { createInstance, FeaturevisorInstance } from '@featurevisor/sdk' const DATAFILE_URL = 'https://cdn.yoursite.com/datafile.json' let instance: FeaturevisorInstance export async function getInstance() { if (instance) { return instance } const f = createInstance({ datafileUrl: DATAFILE_URL, }) instance = await f.onReady() return instance } ``` To understand how the datafiles are generated and deployed, please refer to these guides: - [Building datafiles](/docs/building-datafiles) - [Deployment](/docs/deployment) Now that we have the SDK instance in place, we can use it anywhere in our application. ## Accessing SDK in Vue.js components From any Vue.js component file, we can import the `getInstance` function and use it to access the SDK instance. For example, a minimal sketch:

```html
<!-- components/MyComponent.vue: an illustrative sketch; adjust the import path to where featurevisor.ts lives -->
<script setup>
import { getInstance } from '../featurevisor'

const f = await getInstance()

const featureKey = 'my_feature'
const context = { userId: '123', country: 'nl' }

const isEnabled = f.isEnabled(featureKey, context)
</script>

<template>
  <div v-if="isEnabled">Feature is enabled!</div>
  <div v-else>Feature is not enabled.</div>
</template>
```

With just a few lines of code, we can now evaluate feature flags in our Vue.js components. ## Regular client-side usage If you are using Vue.js components in a regular client-side rendered application, you can refer to our separate [Vue.js SDK](/docs/vue) for Featurevisor. ## Bucketing guidelines If you are using Featurevisor for gradual rollouts or A/B testing, you should make sure that the [bucketing](/docs/bucketing) is consistent when rendering your components.
Usually bucketing is done by passing the User's ID when the user is already known, or a randomly generated UUID for the device if the user has not logged in yet. When evaluating using the SDK instance, we would be passing these values as `context` object: ```ts const context = { userId: '123', deviceId: '', } const isEnabled = f.isEnabled(featureKey, context) ``` Since the evaluation of features are done in the server, you should make sure that the User's ID is passed to the server as well. If that's not an option, you are recommended to use a single value consistently. See documentation about `bucketBy` property in feature definitions for further explanation [here](/docs/features/#bucketing). ## Working repository You can find a fully functional example of this integration on GitHub: [https://github.com/featurevisor/featurevisor-example-nuxt](https://github.com/featurevisor/featurevisor-example-nuxt). --- title: Next.js nextjs: metadata: title: Next.js description: Learn how to integrate Featurevisor in Next.js applications for evaluating feature flags openGraph: title: Next.js description: Learn how to integrate Featurevisor in Next.js applications for evaluating feature flags images: - url: /img/og/docs-frameworks-nextjs.png --- Set up Featurevisor SDK in an existing Next.js application for evaluating feature flags covering both Pages Router and App Router. {% .lead %} ## Flags SDK Vercel has created [Flags SDK](https://flags-sdk.dev/frameworks/next), which works very well with [Next.js](https://nextjs.org/) applications. We don't necessarily need this additional layer to use Featurevisor, but this guide will show you how you can use them together giving you the flexibility to migrate from other feature management tools to Featurevisor and vice versa with ease. 
## Installation In your existing Next.js application: ```{% title="Command" %} $ npm install --save flags @featurevisor/sdk ``` ## Naming conventions Because Featurevisor allows evaluating 3 different types of values against individual features, we need to establish a naming convention for the keys used in the Flags SDK: | Evaluation type | Value type | Key Format | Example | | --------------- | ---------- | ---------------------------- | ------------------------ | | Flag | boolean | `<featureKey>` | `my_feature` | | Variation | string | `<featureKey>:variation` | `my_feature:variation` | | Variable | mixed | `<featureKey>:<variableKey>` | `my_feature:my_variable` | ## Set up Featurevisor We will start by creating a new Featurevisor adapter using the Flags SDK in the `src/featurevisor.ts` file: ```ts {% path="src/featurevisor.ts" %} import type { Adapter } from 'flags' import { createInstance, FeaturevisorInstance } from '@featurevisor/sdk' export interface FeaturevisorAdapterOptions { datafileUrl: string refreshInterval?: number f?: FeaturevisorInstance } export interface FeaturevisorEntitiesType { userId?: string // ...add more properties (attributes) for your context here } export function createFeaturevisorAdapter(options: FeaturevisorAdapterOptions) { const f = options.f || createInstance({}) let initialFetchCompleted = false // datafile fetcher function fetchAndSetDatafile() { console.log('[Featurevisor] Fetching datafile from:', options.datafileUrl) const result = fetch(options.datafileUrl) .then((response) => response.json()) .then((datafile) => { f.setDatafile(datafile) initialFetchCompleted = true }) .catch((error) => console.error('[Featurevisor] Error fetching datafile:', error) ) return result } // datafile refresher (periodic update) if (options.refreshInterval) { setInterval(async () => { await fetchAndSetDatafile() }, options.refreshInterval) } // adapter return function featurevisorAdapter< ValueType, EntitiesType extends FeaturevisorEntitiesType >(): Adapter<ValueType, EntitiesType> { return { async decide({ key, entities, headers, cookies }): Promise<ValueType> { // ensure the datafile is fetched before making decisions if (!initialFetchCompleted) { await fetchAndSetDatafile() } const context = { userId: entities?.userId, } // mapping passed key to Featurevisor SDK methods: // // - '<featureKey>' => f.isEnabled(key, context) // - '<featureKey>:variation' => f.getVariation(key, context) // - '<featureKey>:<variableKey>' => f.getVariable(key, variableKey, context) const [featureKey, variableKey] = key.split(':') if (variableKey) { if (variableKey === 'variation') { // variation return f.getVariation(featureKey, context) as ValueType } else { // variable return f.getVariable(featureKey, variableKey, context) as ValueType } } // flag return f.isEnabled(featureKey, context) as ValueType }, } } } ``` ## Flags SDK integration Now we can create a new `src/flags.ts` file for our individual features and their evaluations: ```ts {% path="src/flags.ts" %} import { flag } from 'flags/next' import { createFeaturevisorAdapter } from './featurevisor' // set up adapter const featurevisorAdapter = createFeaturevisorAdapter({ // replace with your Featurevisor project datafile URL datafileUrl: 'https://cdn.yoursite.com/datafile.json', // if you want to periodically refresh the datafile refreshInterval: 5 * 60 * 1000, // every 5 minutes }) // feature specific flags export const myFeatureFlag = flag({ // '<featureKey>' as the feature key alone to get its flag (boolean) status key: 'my_feature', adapter: featurevisorAdapter(), }) export const myFeatureVariation = flag({ // '<featureKey>:variation' is to get the variation (string) of the feature key:
'my_feature:variation', adapter: featurevisorAdapter(), }) export const myFeatureVariable = flag({ // '<featureKey>:<variableKey>' is to get the variable value of the feature key: 'my_feature:variableKeyHere', adapter: featurevisorAdapter(), }) ``` ## App Router If you're using the App Router, you can call the flag function from a page, component, or middleware to evaluate the flag: ```ts {% path="src/app/page.tsx" %} import { myFeatureFlag } from '../flags' export default async function Page() { const myFeature = await myFeatureFlag() return <div>
<div>{myFeature ? 'Flag is on' : 'Flag is off'}</div>
)
} ``` ## Pages Router If you're using the Pages Router, you can call the flag function inside `getServerSideProps` and pass the values to the page as props: ```ts {% path="src/pages/index.tsx" %} import type { InferGetServerSidePropsType, GetServerSideProps } from 'next' import { myFeatureFlag } from '../flags' export const getServerSideProps = (async ({ req }) => { const myFeature = await myFeatureFlag(req) return { props: { myFeature } } }) satisfies GetServerSideProps<{ myFeature: boolean }> export default function Page({ myFeature }: InferGetServerSidePropsType<typeof getServerSideProps>) { return (
<div>{myFeature ? 'Flag is on' : 'Flag is off'}</div>
)
} ``` ## Working repository You can find a fully functional example of this integration on GitHub: [https://github.com/featurevisor/featurevisor-example-nextjs](https://github.com/featurevisor/featurevisor-example-nextjs).
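For completeness, here is a minimal sketch (the page path is hypothetical) of evaluating the variation and variable flags defined in `src/flags.ts` above from another App Router page, following the same pattern as the boolean example:

```ts {% path="src/app/experiment/page.tsx" %}
import { myFeatureVariation, myFeatureVariable } from '../../flags'

export default async function ExperimentPage() {
  // evaluates 'my_feature:variation' via the Featurevisor adapter
  const variation = await myFeatureVariation()

  // evaluates 'my_feature:variableKeyHere' via the Featurevisor adapter
  const variableValue = await myFeatureVariable()

  return (
    <div>
      Variation: {String(variation)}, variable: {String(variableValue)}
    </div>
  )
}
```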
--- title: Astro nextjs: metadata: title: Astro description: Learn how to integrate Featurevisor in Astro applications for evaluating feature flags openGraph: title: Astro description: Learn how to integrate Featurevisor in Astro applications for evaluating feature flags images: - url: /img/og/docs-frameworks-astro.png --- {% callout type="warning" title="Featurevisor v1" %} This guide is written keeping Featurevisor v1 in mind. It will be updated to be v2-compatible soon. {% /callout %} Set up the Featurevisor SDK in Astro applications for evaluating feature flags in your pages and components. {% .lead %} ## What is Astro? [Astro](https://astro.build/) is a website build tool with a server-first API design for the modern web. It's UI library agnostic, allowing you to choose React, Preact, Svelte, Vue, or just plain HTML (with JSX flavour) as your rendering layer. ## Creating an Astro project Use `npm` to scaffold a new project: ``` $ npm create astro@latest ``` ## Installing Featurevisor SDK Install the Featurevisor SDK using npm: ``` $ npm install --save @featurevisor/sdk ``` It is recommended to be familiar with the SDK API before reading this guide further. You can find the full API documentation [here](/docs/sdks). ## Setting up Featurevisor SDK We would like to be able to set up the Featurevisor SDK instance once and reuse the same instance everywhere. To achieve this, we will create a new module: ```js // src/featurevisor.mjs import { createInstance } from '@featurevisor/sdk' const DATAFILE_URL = 'https://cdn.yoursite.com/datafile.json' let instance export async function getInstance() { if (instance) { return instance } const f = createInstance({ datafileUrl: DATAFILE_URL, }) instance = await f.onReady() return instance } ``` Now that we have the SDK instance in place, we can use it anywhere in our application. {% callout type="note" title="Featurevisor's build & deployment" %} To understand how the datafiles are generated and deployed, please refer to these guides: - [Building datafiles](/docs/building-datafiles) - [Deployment](/docs/deployment) {% /callout %} We created a very simple instance of the SDK, but we can also configure it further for fetching the latest datafile without restarting our server: - periodically (see [refreshing datafile](/docs/sdks/javascript/#refreshing-datafile)) - as soon as changes happen (see [websockets guide](/docs/integrations/partykit)) ## Accessing SDK in components From any component file, we can import the `getInstance` function and use it to access the SDK instance: ```js // src/pages/index.astro --- import { getInstance } from '../featurevisor.mjs'; const f = await getInstance(); const featureKey = "my_feature"; const context = { userId: "123", country: "nl" }; const isEnabled = f.isEnabled(featureKey, context); ---

<p>Feature {featureKey} is {isEnabled ? 'enabled' : 'disabled' }.</p>

``` With just a few lines of code, we can now evaluate feature flags in our Astro components. ## Regular client-side usage If your use case is not server-side rendering or build-time evaluation, you can use the SDK instance directly in your client-side code: - [JavaScript SDK](/docs/sdks) - [React SDK](/docs/react) - [Vue.js SDK](/docs/vue) ## Bucketing guidelines If you are using Featurevisor for gradual rollouts or A/B testing, you should make sure that the [bucketing](/docs/bucketing) is consistent when rendering your components. Usually bucketing is done by passing the User's ID when the user is already known, or a randomly generated UUID for the device if the user has not logged in yet. When evaluating using the SDK instance, we would be passing these values in the `context` object: ```ts const context = { userId: '123', deviceId: '<generated-uuid>', } const isEnabled = f.isEnabled(featureKey, context) ``` If the evaluation of features is done on the server, you should make sure that the User's ID is passed to the server as well. If that's not an option, it is recommended to use a single value consistently. See the documentation about the `bucketBy` property in feature definitions for further explanation [here](/docs/features/#bucketing). ## Working repository You can find a fully functional example of this integration on GitHub: [https://github.com/featurevisor/featurevisor-example-astro](https://github.com/featurevisor/featurevisor-example-astro).
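As a closing note on the bucketing guidelines above, here is a minimal client-side sketch (the module name, storage key, and helper name are all illustrative) for generating and persisting a `deviceId` that can then be passed in the evaluation context for anonymous users:

```js
// src/deviceId.mjs (illustrative; browser-only, since it relies on localStorage)
const STORAGE_KEY = 'deviceId'

export function getDeviceId() {
  // reuse a previously generated device ID if one exists
  let deviceId = localStorage.getItem(STORAGE_KEY)

  if (!deviceId) {
    // generate and persist a random UUID for this device
    deviceId = crypto.randomUUID()
    localStorage.setItem(STORAGE_KEY, deviceId)
  }

  return deviceId
}
```

The returned value can then be used as `const context = { deviceId: getDeviceId() }` when evaluating features, keeping bucketing consistent across sessions on the same device.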
--- title: Fastify nextjs: metadata: title: Fastify description: Learn how to integrate Featurevisor in Fastify applications for evaluating feature flags openGraph: title: Fastify description: Learn how to integrate Featurevisor in Fastify applications for evaluating feature flags images: - url: /img/og/docs-frameworks-fastify.png --- {% callout type="warning" title="Featurevisor v1" %} This guide is written keeping Featurevisor v1 in mind. It will be updated to be v2-compatible soon. {% /callout %} Set up the Featurevisor SDK instance in a Fastify application using a custom decorator, including TypeScript integration for evaluating feature flags. {% .lead %} ## Hello World application Before going into Featurevisor integration, let's create a simple Hello World [Fastify](https://www.fastify.io/) application. We start by installing the package: ``` $ npm install --save fastify ``` ```js // index.js const fastify = require('fastify')({ logger: true }) const PORT = 3000 fastify.get('/', async (request, reply) => { return 'Hello World!' }) fastify.listen({ port: PORT }, () => { console.log(`Example app listening on port ${PORT}`) }) ``` We can start the server with this command: ``` $ node index.js Example app listening on port 3000 ``` ## Featurevisor integration We install the Featurevisor SDK first: ``` $ npm install --save @featurevisor/sdk ``` We can now create an instance of the SDK and use it in our application: ```js // Require the fastify framework and instantiate it const fastify = require('fastify')({ logger: true, }) // Featurevisor SDK const { createInstance } = require('@featurevisor/sdk') const DATAFILE_URL = 'https://featurevisor-example-cloudflare.pages.dev/production/datafile-tag-all.json' // replace with your own CDN URL const REFRESH_INTERVAL = 60 * 5 // every 5 minutes const f = createInstance({ datafileUrl: DATAFILE_URL, // optionally refresh the datafile every 5 minutes, // without having to restart the server refreshInterval: REFRESH_INTERVAL, }) // Declare a route fastify.get('/', async (request, reply) => { const featureKey = 'my_feature' const context = { userId: '123', country: 'nl' } const isEnabled = f.isEnabled(featureKey, context) if (isEnabled) { reply.send('Hello World!') } else { reply.send('Not enabled yet!') } }) // Run the server! const start = async () => { fastify.listen({ port: 3000 }, function (err, address) { if (err) { fastify.log.error(err) process.exit(1) } fastify.log.info(`server listening on ${fastify.server.address().port}`) }) } start() ``` ## Decorator It is very unlikely that we will have all our routes defined in the same `index.js` file, making it difficult for us to use the same Featurevisor SDK instance in all of them. To solve this problem, we can create a custom decorator that will attach the Featurevisor SDK instance to the request object, so that we can use the same instance in all our routes throughout the lifecycle of this application. ```js // index.js // ... fastify.decorateRequest('f', f) // ...
``` Now from anywhere in our application (either in `index.js` or some other module), we can access the Featurevisor SDK instance via `request.f`: ```js fastify.get('/my-route', async (request, reply) => { const featureKey = 'my_feature' const context = { userId: '123', country: 'nl' } const isEnabled = request.f.isEnabled(featureKey, context) if (isEnabled) { reply.send('Hello World!') } else { reply.send('Not enabled yet!') } }) ``` ## TypeScript usage If you are using TypeScript, you can extend the `FastifyRequest` interface to add the `f` property for Featurevisor SDK's instance. Create a new `custom.d.ts` file and make sure to add it in `tsconfig.json`'s `files` section: ```ts import { FeaturevisorInstance } from '@featurevisor/sdk' import { FastifyRequest } from 'fastify' declare module 'fastify' { interface FastifyRequest { f: FeaturevisorInstance } } ``` ## Working repository You can find a fully functional example of this integration on GitHub: [https://github.com/featurevisor/featurevisor-example-fastify](https://github.com/featurevisor/featurevisor-example-fastify). --- title: Glossary nextjs: metadata: title: Glossary description: Glossary of terms that you will come across when adopting feature management principles with Featurevisor openGraph: title: Glossary description: Glossary of terms that you will come across when adopting feature management principles with Featurevisor images: - url: /img/og/docs-glossary.png --- Featurevisor is a tool that helps you adopt feature management practices in your organization and applications. This also introduces a lot of new terms, and this glossary is aimed at helping you understand quickly what they are all about. {% .lead %} ## A/B Test An A/B test is a method of comparing two (or more) variations of a feature to determine which one performs better. It involves dividing users into two (or more) groups and showing each group a different variation of the feature. The feature could be the colour or placement of a call-to-action button on a landing page, for example. The performance of each variation is then measured based on user behavior, such as click-through rates or conversion rates, to determine which variation is more effective. Learn more in the [Experiments](/docs/use-cases/experiments) guide. ## Activation When a particular user is exposed to an [experiment](#experiment), applications are required to activate that experiment for that user by tracking it accordingly. The payload for tracking that activation event usually includes: - The experiment key - The evaluated variation - The user's ID This then helps measure the performance of the experiment itself later against your [conversion goals](#conversion-goal). See how SDKs help with tracking activations [here](/docs/sdks/javascript/#activation). ## Application An application could either be your: - Web application - iOS app - Android app - Backend service - Command line tool - ...or anything else that you build Featurevisor provides [SDKs](/docs/sdks/) in a few different programming languages to help you evaluate your features. ## Approval Because Featurevisor adopts [GitOps](/docs/concepts/gitops) principles, all changes are made via [Pull Requests](#pull-request). When you send Pull Requests, you often require approvals from your peers before [merging](#merge) them (applying the changes).
## Archive Featurevisor [entities](#entity) can be archived at your convenience: - [Archiving an attribute](/docs/attributes/#archiving) - [Archiving a segment](/docs/segments/#archiving) - [Archiving a feature](/docs/features/#archiving) Archiving here means that we still keep our entity definitions in place, but no longer serve their configuration to our applications via generated [datafiles](#datafile). ## Array Arrays are a data structure that allows you to define a list of items. Featurevisor supports [variables](#variable), where `array` is one of the supported types. You can learn more about using arrays as variables [here](/docs/features/#array). ## Assertion Featurevisor allows [testing](#testing) your features and segments, to make sure they are working as per your expectations before applying any changes anywhere. A single test can consist of one or more assertions. Each assertion is basically a test scenario for your feature or segment against [contexts](#context). Learn more about testing [here](/docs/testing). ## Attribute An attribute is one of the core building blocks of Featurevisor. You can consider them to be like field names which are used in conditions inside [segments](#segment). Learn more about attributes [here](/docs/attributes). ## Boolean A boolean type is a data type that represents two possible values: `true` and `false`. You may consider it to be like an on/off switch. In Featurevisor, it can apply to a few different areas: - If the feature itself is [enabled or disabled](/docs/sdks/javascript/#checking-if-enabled) - Boolean type for [variables](/docs/features/#boolean) ## Branch A [Git](#git) branch is a pointer to a specific sequence of [commits](#commit) in a [repository](#repository). It represents an independent line of development in a project, allowing you to isolate changes for specific features or tasks. The default branch in Git is usually called `master` or `main`. You can create, switch to, [merge](#merge), and delete branches using various Git commands. This branching mechanism facilitates collaboration, experimentation, and non-linear development workflows. ## Browser Browsers help you access web applications. If you are using Chrome, Firefox, or Safari, then you are using a browser already. ## Bucketing Bucketing is the process by which we make sure a particular user is consistently seeing a feature as enabled/disabled or a specific variation (if an [A/B test](#a-b-test) experiment) across all sessions and devices in your application(s). Learn more about bucketing here: - [Bucketing concept](/docs/bucketing) - [Configuring bucketing in features](/docs/features/#bucketing) ## Build Featurevisor has a build step, where it generates [datafiles](#datafile) (JSON files) for your project against your [environments](#environment) and [tags](#tag). Learn more about building datafiles [here](/docs/building-datafiles). ## CDN A CDN, or Content Delivery Network, is a system of distributed servers that deliver web content to a user based on their geographic location, the origin of the webpage, and the content delivery server itself. It's designed to reduce latency and increase the speed of web content delivery by serving content from the server closest to the user. CDNs are commonly used for serving static resources of a website like images, CSS, and JavaScript files. Featurevisor expects that its generated [datafiles](#datafile) are [served](#deployment) via a CDN.
Examples of CDNs include: - AWS CloudFront - Cloudflare - Fastly ## CI/CD CI stands for Continuous Integration, which is a software development practice that involves frequently merging code changes into a shared repository. It ensures that each code change is tested and integrated with the existing codebase as early as possible. CD stands for Continuous Deployment, which is an extension of Continuous Integration. It automates the process of deploying software changes to production environments after passing the necessary tests and quality checks. It enables teams to deliver software updates quickly and frequently to end-users. ## CLI CLI stands for Command Line Interface. It is a text-based interface that allows users to interact with a computer program or operating system by typing commands into a [terminal](#terminal) or command prompt. Learn more [here](/docs/cli). ## Commit A [Git](#git) commit is a command that saves changes to a local [repository](#repository). It's like a snapshot of your work that you can revert to or compare with other versions later. Each commit has a unique ID (a hash), a message describing the changes, and information about the author. Commits form a linear or branched history, allowing you to track progress and understand the evolution of a project. ## Condition Conditions are the building blocks of your segments. Learn more in [segments](/docs/segments). ## Configuration Featurevisor manages its project configuration via the `featurevisor.config.js` file. Learn more about project configuration [here](/docs/configuration). ## Context When Featurevisor SDKs [evaluate](#evaluation) a feature or its variation and variables, they expect to receive some additional information about the user. Based on this information, SDKs then return the desired value for that specific user. This additional info could be: - the User's ID - their location - browser name - ...etc We call this information `context`. You can learn more about how it's used in SDKs [here](/docs/sdks/javascript/#context). ## Conversion goal A conversion goal is a specific action or event that you want your users to take or achieve when interacting with your application(s). It represents a desired outcome or objective, such as making a purchase, signing up for a newsletter, or completing a form. Conversion goals are used to measure the success and effectiveness of your features or experiments by tracking the percentage of users who successfully complete the desired action. ## Datafile Featurevisor generates datafiles ([JSON](#json) files) which are then consumed by [SDKs](#sdk) in your application(s). Learn more here about: - [Building datafiles](/docs/building-datafiles) - [Consuming datafiles](/docs/sdks/javascript/#initialization) ## Deployment Once your datafiles are generated, the idea is to upload them to a [CDN](#cdn) (or a custom server) so that your applications can access and [fetch](#fetch) these files. You can learn more about deployment strategies [here](/docs/deployment). ## Deprecate It can be very easy to keep creating new features, but not always a nice experience when you wish to get rid of them. The challenge mostly comes from having to figure out which [applications](#application) are already using those features (if any). This is where deprecation comes in handy. Deprecating a feature in Featurevisor means we are choosing to remove any usage of it soon, but we are not deleting it yet, so that existing applications are not impacted negatively.
Any SDK that evaluates deprecated features will emit warnings so the developers are notified about it and are given enough time to remove the usage of those features from their applications. Learn more about it [here](/docs/features/#deprecating). ## Declarative Featurevisor takes a declarative approach to defining all our configuration in the form of [attributes](#attribute), [segments](#segment), and [features](#feature). Being declarative means that we only declare (tell the system) what outcome we desire, and then Featurevisor takes care of everything else for us. These concepts can help you understand more about it: - [Infrastructure as Code (IaC)](/docs/concepts/infrastructure-as-code) - [GitOps](/docs/concepts/gitops) ## Description All Featurevisor [entities](#entity) expect descriptions, to help document what they are intended to be used for. ## Directory A directory, also known as a folder, is a container used to organize [files](#file) and other directories within a file system. ## Double Featurevisor allows `double` as a data type for [variables](#variable), which allows you to have numbers with decimal places. Learn more about its usage [here](/docs/features/#double). ## Entity All the building blocks of Featurevisor are called entities, which are: - [Attribute](/docs/attributes) - [Segment](/docs/segments) - [Feature](/docs/features) - [Group](/docs/groups) ## Environment In software engineering, an environment refers to the specific configuration and settings in which a software application or system operates. If you are an application developer, it may mean: - **staging environment**: where you test your application out before deploying to production - **production environment**: where your real users access your application ## Evaluation Evaluation is the process by which Featurevisor SDKs compute a feature's own: - enabled/disabled status - its [variations](#variation), and - [variables](#variable) against a given [context](#context). Learn more on the [SDKs](/docs/sdks/) page. ## Event Featurevisor [SDKs](#sdk) emit several different kinds of events for [applications](#application) to hook into depending on their use cases. Learn more here: - [Readiness](/docs/sdks/javascript/#asynchronous) - [Activation](/docs/sdks/javascript/#activation) - [Events](/docs/sdks/javascript/#events) - [Logging](/docs/sdks/javascript/#logging) ## Experiment Experiments can be of two types in Featurevisor: - [A/B test](#a-b-test) - [Multivariate tests](#multivariate-test) Even though they are experiments in traditional terms, in Featurevisor everything is defined as a feature (either it's a simple feature with enabled/disabled status, or one with also [variations](#variation)). Learn more about them in the [Experiments](/docs/use-cases/experiments) guide. ## Expose Exposing a feature in a specific [environment](#environment) is an advanced technique which is very handy when you are migrating away from another feature management tool to Featurevisor. There might be cases where you want Featurevisor to be used in the [staging](#staging) environment for specific features first, while still using the previous existing feature management tool in [production](#production). Learn more about how exposing works on a per-environment basis [here](/docs/features/#expose).
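As an illustrative sketch only (assuming the v1 feature definition format), exposure could be controlled per environment inside a feature file along these lines:

```yml {% path="features/my_feature.yml" %}
# ...
environments:
  staging:
    rules:
      - key: everyone
        segments: '*'
        percentage: 100
  production:
    # keep serving this feature from the previous tool in production for now,
    # so do not expose it in production datafiles yet
    expose: false
    rules:
      - key: everyone
        segments: '*'
        percentage: 100
```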
## Feature Either you want a: - simple feature flag with enabled/disabled status, or - one with [variations](#variation) to be treated as an [experiment](#experiment), or - one with [variables](#variable) for more complex configuration ...everything is defined as a feature in Featurevisor. A feature is the core building block, and you can learn more about its anatomy [here](/docs/features). ## Fetch Fetching is the process by which Featurevisor [SDKs](#sdk) pull in the latest datafiles over the network, which pack all the configuration for evaluating your features in your application against provided contexts. ## File A file is a container in a computer system for storing information, often in the form of text, images, audio, or software. Featurevisor [entities](#entity) are all stored as files in their own [directories](#directory). ## Flag These terms are often used interchangeably: - [Feature](#feature) - Feature flag - Flag ## Force While you may set up all your [rules](#rule) inside a feature targeting various [segments](#segment), you may want to override them all for certain users, especially when you wish to test things out yourself before affecting your real users in [production](#production). The guide on testing in production explains the use case well [here](/docs/use-cases/testing-in-production). The `force` API docs can be found [here](/docs/features/#force). ## Git Git is a distributed version control system that allows multiple people to work on a project at the same time without overwriting each other's changes. It tracks changes to files in a [repository](#repository) (think of a project) so you can see what was changed, who changed it, and why. Git is widely used in software development for source code management. With Featurevisor, all feature configurations are stored in a Git repository that we call our Featurevisor project. ## GitHub GitHub is a web-based hosting service for version control using Git. It provides a platform for collaboration, allowing developers to contribute to projects, fork repositories, submit [pull requests](#pull-request), and manage versioned files. ## GitHub Actions GitHub Actions is a CI/CD (Continuous Integration/Continuous Deployment) service provided by [GitHub](#github). It allows developers to automate, customize, and execute their software development workflows right in their GitHub repository. Actions can be used to build, test, and [deploy](#deployment) applications, manage issues, publish packages, and much more. Workflows are defined using [YAML](#yaml) syntax and can be triggered by various GitHub events such as push, [pull requests](#pull-request), or issue creation. ## GitOps In its simplest form, GitOps means doing operations using [Git](#git). Read our [GitOps](/docs/concepts/gitops) guide for more info. ## Group When you wish to run mutually exclusive [experiments](#experiment), meaning the same user should not be exposed to more than one of the overlapping experiments, groups come in handy. Learn more about groups [here](/docs/groups). ## Hash Featurevisor [SDKs](#sdk) rely on an in-memory hashing algorithm to make sure the same user is consistently bucketed into the same feature and variation based on rollout rules, across all devices and sessions. Learn more about it in our guide for [bucketing](/docs/bucketing). ## Instance Refers to the instance of the [SDK](#sdk), once initialized with your desired configuration parameters. Learn more about SDK usage [here](/docs/sdks/javascript).
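For example, a minimal sketch of creating and reusing an SDK instance (the datafile URL is illustrative) could look like this:

```js
import { createInstance } from '@featurevisor/sdk'

// create the instance once with your desired configuration parameters
const f = createInstance({
  datafileUrl: 'https://cdn.yoursite.com/datafile.json',
})

// wait until the datafile has been fetched before evaluating
// (top-level await works in ES modules; otherwise wrap in an async function)
await f.onReady()

const isEnabled = f.isEnabled('my_feature', { userId: '123' })
```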
## Integer If you wish to define [variables](#variable) which are whole numbers (without decimal places), then the `integer` type is what you need. Learn more about the integer type's usage in variables [here](/docs/features/#integer). ## Interval Featurevisor [SDKs](#sdk) allow refreshing datafiles, and one of the techniques can be interval-based, meaning it can keep refreshing the [datafile](#datafile) every X number of seconds. You can learn more about refreshing in SDKs [here](/docs/sdks/javascript/#refreshing). ## JavaScript JavaScript is a high-level, interpreted programming language primarily used for enhancing web pages to provide a more interactive user experience. It's one of the core technologies of the web, alongside HTML and CSS. The Featurevisor CLI is written in JavaScript, targeting the Node.js runtime. But the JavaScript SDK is universal, meaning it works in both browsers and in Node.js. ## JSON JSON (JavaScript Object Notation) is a lightweight data-interchange format that is easy for humans to read and write and easy for machines to parse and generate. It's often used to transmit data between a server and an application. From Featurevisor's perspective, it builds [datafiles](#datafile) (which are JSON files). These datafiles are then [fetched](#fetch) and consumed by applications using Featurevisor [SDKs](#sdk) in various different programming languages. ## Key Keys are the names of your Featurevisor entities like [attributes](#attribute), [segments](#segment), and [features](#feature). The names are based on their file names without their extensions (like `.yml`, or `.json`). When [variables](/docs/features/#variables) are defined inside a feature, they are also given unique keys within their features. As for [rules](/docs/features/#rules) within [environments](#environment), they also have their own keys. ## Level Refers to the different levels of [logging](#logging), like: - `log` - `info` - `warn` - `error` ## Lint Entities are expressed declaratively in Featurevisor, primarily as [YAML](#yaml) or [JSON](#json) files. To make sure they do not contain any human-introduced mistakes, linting takes care of finding those issues early (if any) before proceeding with [building datafiles](/docs/building-datafiles). Learn more about linting [here](/docs/linting). ## Logging Featurevisor [SDKs](#sdk) allow logging various different levels of messages which can be used for tracking, analyzing performance, troubleshooting, understanding how [evaluations](#evaluation) are done and more. Learn more about logging [here](/docs/sdks/javascript/#logging). ## Merge When you send a [Pull Request](#pull-request) making changes to the configuration of your desired Featurevisor entities, you then proceed with merging it to finally apply the changes so they impact your applications. ## Multivariate Test Traditionally with A/B tests, you test a single [variation](#variation) change. With multivariate tests, multiple [variables](#variable) are modified for testing a hypothesis. Learn about it in more depth in our [Experiments](/docs/use-cases/experiments) guide. ## Mutually Exclusive When you do not wish to expose a single user to more than one experiment at a time from your list of predefined experiments, Featurevisor allows you to achieve that via [Groups](/docs/groups). ## Name The word **name** may often be used interchangeably with **key**. Like your [feature](#feature) name or key. ## Native apps Usually means applications which are built specifically for a platform, like iOS and Android apps.
## Node.js JavaScript, the programming language, was originally introduced to work in browsers only. Node.js, an open-source, cross-platform JavaScript runtime environment, allows executing code written in this language outside of a web browser as well. This includes server-side and [command line applications](#cli). The Featurevisor CLI is a Node.js package, allowing you to use it via the command line in a Terminal, while the Featurevisor JavaScript SDK works in both browsers and Node.js services. ## npm npm is a package manager for Node.js. Like millions of other open source packages, Featurevisor is also distributed via npm. As application builders and users of Featurevisor, we download Featurevisor [packages](#package) via npm. ## Object Featurevisor allows defining flat objects (key/value pairs of data) as [variables](#variable). Learn more about its usage [here](/docs/features/#object). ## Operator When we define [segments](#segment), they will contain various [conditions](#condition), and each of those conditions will use operators like `equals`, `notEquals`, and more against your desired values. See the full list of supported operators [here](/docs/segments/#operators). ## Override Variations and variables inside features can be overridden in a few different ways, allowing more flexibility as our features grow more complex. Learn more about them here: - [Overriding variation](/docs/features/#overriding-variation) - [Overriding variables](/docs/features/#overriding-variables) ## Package Featurevisor as a tool is a set of various different packages, which are distributed via package managers targeting different programming languages. The [CLI](#cli) is written in [JavaScript](#javascript) targeting the [Node.js](#nodejs) runtime, which is distributed via [npm](#npm). ## Parser By default, Featurevisor expects all [entities](#entity) to be defined as [YAML](#yaml) files, with native support for [JSON](#json) as well. Next to that, the project configuration API allows you to bring your own custom parser for other formats like TOML, XML, etc. Learn more about them here: - [Custom parsers](/docs/advanced/custom-parsers) - [Project configuration](/docs/configuration) ## Percentage Not all releases of your features should be big bang releases affecting all your users. You may want to roll them out gradually, starting with 5%, then 10%, then 20%, and all the way up to 100% as you gain more confidence. These percentages are defined in your rollout [rules](/docs/features/#rules) inside features. Learn more about gradual rollout and progressive delivery [here](/docs/use-cases/progressive-delivery). ## Pretty When machines deal with JSON files, they usually are in their most compressed form to save disk space and network bandwidth. When compressed, they also affect readability for humans negatively. For debugging (investigating) purposes, there's an option in Featurevisor's project configuration to allow for [pretty datafiles](/docs/configuration/#pretty-datafile) and [pretty state files](/docs/configuration/#pretty-state) improving readability for humans. ## Production Refers to the production [environment](#environment), where your application(s) are deployed, affecting your real users. ## Project Your Featurevisor project, which is usually a single independent Git [repository](#repository). See the quick start guide on how to create a new project [here](/docs/quick-start).
## Pull Request A pull request is a feature in version control systems like [Git](#git), and platforms like [GitHub](#github), that allows developers to propose changes to a codebase. It's a request to "pull" your changes into the main project. Pull requests show content differences, facilitate discussions, code [review](#review), and can be integrated with other testing and [CI/CD](#ci-cd) tools before the changes are [merged](#merge) into the main branch. ## Push In the context of [Git](#git), "push" is a command used to upload local [repository](#repository) content to a remote repository. After [committing](#commit) changes locally, you "push" them to the remote repository to share your changes with others and synchronize your local repository with the remote. It's an essential command for collaborative work in Git. ## QA Many organizations have Quality Assurance (QA) teams, making sure everything is working as expected before the changes in your applications are exposed to your real users, minimizing any potential risks. ## Ready When Featurevisor SDKs are used [asynchronously](/docs/sdks/javascript/#initialization), we are delegating the responsibility of [fetching](#fetch) the [datafile](#datafile) to the SDK itself. The fetching of the datafile is an asynchronous task over the network. To know when the Featurevisor SDK has successfully fetched the datafile, we rely on its readiness events. Learn more about their usage here: - [onReady option](/docs/sdks/javascript/#asynchronous) - [isReady method](/docs/sdks/javascript/#listening-to-events) - [onReady method](/docs/sdks/javascript/#listening-to-events) ## Refresh Featurevisor SDKs allow refreshing [datafile](#datafile) content without having to restart or reload your whole [application](#application). Learn more about refreshing [here](/docs/sdks/javascript/#refreshing). ## Repository A repository, often abbreviated as "repo", is a storage location for software packages. It contains all files and directories associated with a project. From Featurevisor's perspective, a single project is usually its own independent [Git](#git) repository consisting of files which define various entities like [attributes](#attribute), [segments](#segment), and [features](#feature). ## Review When we send [Pull Requests](#pull-request), the idea is to get them reviewed by our peers, who may either [approve](#approval), reject, or request more changes. ## Revision Each successful [build](#build) of Featurevisor [datafiles](#datafile) produces a new revision number. This is in integer format (a whole number with no decimal places), and is incremented by 1 from the previous build's revision. This revision number is present in all generated datafiles, so applications will know which revision they are using with provided SDKs. Learn more about it: - [Building datafiles](/docs/building-datafiles) - [State files](/docs/state-files) ## Required Featurevisor supports defining certain features as dependencies for a particular feature. Learn more about it here: - [Managing feature dependencies](/docs/use-cases/dependencies) - [Defining dependencies in features](/docs/features/#required) ## Rule Rollout rules are defined per [environment](#environment) for each individual [feature](#feature). Learn more about rules [here](/docs/features/#rules). ## Schema When [variables](#variable) are defined inside a feature, they must also provide their own schema to let Featurevisor know more about them. Learn more about the variables schema [here](/docs/features/#variables).
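For illustration, assuming the v1 feature definition format, a variables schema inside a feature file could be sketched as:

```yml {% path="features/my_feature.yml" %}
# ...
variablesSchema:
  - key: bgColor
    type: string
    defaultValue: 'red'
  - key: showBanner
    type: boolean
    defaultValue: false
```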
## Schema version Generated [datafiles](#datafile) follow a schema version. Featurevisor started with schema version 1, and a new schema version 2 is in the works. See how you can build datafiles against a more optimized schema version [here](/docs/building-datafiles/#schema-v2). ## SDK SDK stands for Software Development Kit. Featurevisor comes with [SDKs](/docs/sdks) in a few different programming languages. Their purpose is to [fetch](#fetch) [datafiles](#datafile) and [evaluate](#evaluation) values in your applications as you need them. ## Segment Segments are groups of [conditions](#condition), which are then used in the rollout [rules](#rule) of your features. Learn more about segments: - [Defining segments](/docs/segments) - [Using segments in feature rules](/docs/features/#segments) ## Site Featurevisor is also capable of generating a static read-only website based on all [entity](#entity) definitions as found in your project [repository](#repository). This generated site, once hosted, serves as a dashboard for your team and organization for understanding the latest state of configuration everywhere. Learn more about it [here](/docs/site). ## Slot When defining mutually exclusive [experiments](#experiment) (think [features](#feature) in Featurevisor) in a [group](#group), each feature is put in a slot so they never overlap. Learn more about the usage of slots in groups [here](/docs/groups). ## Staging Similar to [production](#production), staging is a common [environment](#environment) in software engineering culture. This environment is meant for testing things out internally within the team, before deploying the changes to production where your real users are. ## State files Because there's no real database involved in Featurevisor, the tool uses [Git](#git) for storing and keeping track of all traffic allocation information against rollout rules in features, and incremental revision numbers. Learn more about them and their usage here: - [State files](/docs/state-files) - [Deployment](/docs/deployment) ## Status A status check for a [commit](#commit) is a process in version control systems like [Git](#git), and platforms like [GitHub](#github), that verifies the commit against certain criteria before it's merged into the main [branch](#branch). This can include running automated tests, checking for code style adherence, verifying the commit message format, and more. Status checks help maintain code quality and prevent introducing errors into the main codebase. They are often integrated into the [pull request](#pull-request) process. ## Sticky Applications may often decide to make certain [evaluations](#evaluation) sticky (not taking the fetched [datafile](#datafile) into account) for specific users against certain [features](#feature) for the lifecycle of the [SDK](#sdk). You can read more about how Featurevisor SDKs allow that [here](/docs/sdks/javascript/#stickiness). ## String String is a data type that's used for text. It is one among many other data types supported in [variables](#variable). Learn more about it [here](/docs/features/#string). ## Tag Your organization may have a single Featurevisor [project](#project), in a single Git repository, containing several hundred or thousands of [features](#feature). But there might be 10 or more separate consumers (applications), and they each may want to fetch [datafiles](#datafile) containing only a subset of all the features found in your project. This is where tagging comes in handy.
Each feature can be tagged accordingly, and the datafile generator will take care of building tag-specific files for your applications to consume. Each application is then able to fetch only the features it needs, and nothing more. Learn more about it here: - [Building datafiles](/docs/building-datafiles) - [Tags in features](/docs/features/#tags) ## Terminal A Terminal, also known as a command line or console, is a text-based interface used to interact with an operating system. Users can input commands to perform operations, navigate the file system, and run scripts or applications. If you are using macOS, then this is your Terminal app. ## Testing Unlike most other feature management tools, Featurevisor allows you to test your [segments](#segment) and [features](#feature) against your expectations. This is similar to "unit testing" in regular programming. Just like how features and segments are defined declaratively, you can also define your tests for them declaratively, and Featurevisor will test everything for you. If some tests are failing, it means something is wrong somewhere and we can fix it early before applying our changes. Learn more about it [here](/docs/testing). ## Tracking Experiments are of no use if we are not tracking anything. Featurevisor SDKs emit [activation](#activation) events, which can be handled and then tracked accordingly using your favourite analytics service. Here are some guides regarding tracking activation events: - [Activating features](/docs/sdks/javascript/#activation) - [Tracking with Google Analytics / Tag Manager](/docs/tracking/google-analytics) ## Usage Sometimes we create [segments](#segment) and [attributes](#attribute), and we can't always tell quickly where they are actively being used. The Featurevisor CLI brings in some goodies to help find their usage, and also find entities which are not used anywhere, so you can take action to clean them up. Learn more about it [here](/docs/cli/#find-usage). ## Value A value can be anything. This paragraph that you are reading can itself be a value. In terms of Featurevisor, a value can be the [evaluated](#evaluation) output of a: - feature's own enabled/disabled status - feature's variation - feature's variable See how SDKs evaluate values [here](/docs/sdks/). ## Variable Variables are key/value pairs of data that can be defined inside individual features. Learn more about their usage here: - [Defining variables in features](/docs/features/#variables) - [Evaluating variables using SDKs](/docs/sdks/javascript/#getting-variables) - [Remote configuration](/docs/use-cases/remote-configuration) ## Variation Defining variations in features allows you to turn them into [A/B](#a-b-test) or [Multivariate test](#multivariate-test) experiments. Learn more about their usage here: - [Defining variations in features](/docs/features/#variations) - [Evaluating variations using SDKs](/docs/sdks/javascript/#getting-variations) ## Weight When we define [variations](#variation) in our features, we also have to provide weights for each variation for splitting their traffic distribution. If you have only two variations, it could be a 50-50 split or a 20-80 split, depending on your needs. It's totally up to you how you define them. It's important to understand that weights of variations are different from rule [percentages](#percentage), because rules affect the entire [feature](#feature). Learn more about variations [here](/docs/features/#variations). ## YAML YAML stands for "YAML Ain't Markup Language".
It's a human-friendly data serialization standard and is often used for configuration files and in applications where data is being stored or transmitted. YAML is designed to be readable and easily editable by humans, and it allows complex data structures to be expressed in a natural and minimal syntax. --- title: Groups nextjs: metadata: title: Groups description: Learn how to create groups in Featurevisor openGraph: title: Groups description: Learn how to create groups in Featurevisor images: - url: /img/og/docs-groups.png --- Groups enable you to run mutually exclusive experiments. {% .lead %} ## Mutually exclusive experiments Let's say you have two experiments defined as `firstFeature` and `secondFeature`. You want to run them both together in production, but you do not want to expose both of them together to any single user. That's what "mutually exclusive" means here. They will never overlap for the same user. You can take advantage of groups in Featurevisor to achieve this exclusion. ## Create a group We can create groups by creating a new file in the `groups` directory: ```yml {% path="groups/myGroup.yml" %} description: My exclusion group slots: - feature: firstFeature # referring features/firstFeature.yml percentage: 50 - feature: secondFeature # referring features/secondFeature.yml percentage: 50 ``` The name of the group file is not used anywhere, so you can name it however you want. ## Slots A group consists of multiple slots. Each slot in a group defines a feature (by its key) and a percentage value (from 0 to 100). All the percentage values of slots in a group must add up to 100. In the example above, we have two slots, each with a percentage value of 50. This means that any user once bucketed can only fall into one of the slots, and not both. The first 50% of the users will be exposed to `firstFeature`, and the other 50% will be exposed to `secondFeature`. ## Impact on affected features The slot's percentage determines the maximum percentage value you can use in your feature's rollout rules. In the example above, the maximum percentage value you can use in `firstFeature` and `secondFeature` is 50. Here's how `firstFeature` would look in that case: ```yml {% path="features/firstFeature.yml" %} # ... rules: - key: everyone segments: '*' percentage: 50 # can be any value between 0 and 50 ``` ## Bucketing Whenever a feature is evaluated, it goes through a bucketing process where the user is assigned a number between 0 and 100. If the bucketed number falls into the range of any slot, it means that the user is exposed to that particular feature only in the group. Read more in [Bucketing](/docs/bucketing). ## Limitations - A feature can only belong to a maximum of one group at a time - A feature cannot repeat in the same group's slots (this will be supported in a future version) - A required feature and its dependent feature cannot coexist in the same group ## Guides To maintain consistent bucketing, you are advised to: - Keep mutual exclusions in mind before creating the feature and the group. - Create them together, and then add the rollout rules to the feature. - Plan the percentage distribution in your slots early when creating the group, and do not change those values afterwards. - Start with a bigger slot for your feature, even if you do not want to use the full percentage value for your feature's rules. You can slowly increase it at the feature's rule level later.
- You can set `feature: false` if you want to remove a feature from a group's slot without any replacement feature. Rely on [linting](/docs/linting) to catch any mistakes. ## Archiving Groups cannot be archived. If you don't need them any more, you can delete their YAML files. --- title: GitHub Pages nextjs: metadata: title: GitHub Pages description: Learn how to upload Featurevisor datafiles to GitHub Pages openGraph: title: GitHub Pages description: Learn how to upload Featurevisor datafiles to GitHub Pages images: - url: /img/og/docs-integrations-github-pages.png --- Set up continuous integration and deployment of your Featurevisor project with GitHub Actions and GitHub Pages. {% .lead %} See more about the GitHub Actions set up in the previous guide [here](/docs/integrations/github-actions). ## Creating a new project This guide assumes you have created a new Featurevisor project using the CLI: ```{% title="Command" %} $ mkdir my-featurevisor-project && cd my-featurevisor-project $ npx @featurevisor/cli init $ npm install ``` ## GitHub Pages We are going to be uploading to and serving our datafiles from [GitHub Pages](https://pages.github.com/). GitHub Pages is a product that allows you to host your static sites and apps on GitHub's global network. Given that a Featurevisor project generates static datafiles (JSON files), it is a great fit for our use case. ## Workflows We will be covering two workflows for our set up with GitHub Actions. ### Checks This workflow will be triggered on every push to the repository targeting any non-master or non-main branches. This will help identify any issues with your Pull Requests early before you merge them into your main branch. ```yml {% path=".github/workflows/checks.yml" %} name: Checks on: push: branches-ignore: - main - master jobs: checks: name: Checks runs-on: ubuntu-latest timeout-minutes: 10 steps: - uses: actions/checkout@v4 - uses: actions/setup-node@v4 with: node-version: 20 - name: Install dependencies run: npm ci - name: Lint run: npx featurevisor lint - name: Test specs run: npx featurevisor test - name: Build run: npx featurevisor build ``` ### Publish This workflow is intended to be run on every push to your main (or master) branch, and is supposed to handle publishing your generated datafiles to GitHub Pages: ```yml {% path=".github/workflows/publish.yml" %} name: Publish on: push: branches: - main - master # Sets permissions of the GITHUB_TOKEN to allow deployment to GitHub Pages permissions: contents: write pages: write id-token: write # Allow only one concurrent deployment, skipping runs queued between the run in-progress and latest queued. # However, do NOT cancel in-progress runs as we want to allow these production deployments to complete. concurrency: group: 'pages' cancel-in-progress: false jobs: publish: name: Publish environment: name: github-pages url: ${{ steps.deployment.outputs.page_url }} runs-on: ubuntu-latest steps: - name: Checkout uses: actions/checkout@v4 - name: Setup Node.js uses: actions/setup-node@v4 with: node-version: 20 - name: Install dependencies run: npm ci - name: Lint run: npx featurevisor lint - name: Test specs run: npx featurevisor test - name: Build run: npx featurevisor build - name: Create index.html run: echo "It works."
> datafiles/index.html - name: Setup Pages uses: actions/configure-pages@v4 - name: Upload artifact uses: actions/upload-pages-artifact@v3 with: path: 'datafiles' - name: Deploy to GitHub Pages id: deployment uses: actions/deploy-pages@v4 - name: Git configs run: | git config user.name "${{ github.actor }}" git config user.email "${{ github.actor }}@users.noreply.github.com" - name: Push back to origin run: | git add .featurevisor/* git commit -m "[skip ci] Revision $(cat .featurevisor/REVISION)" git push ``` After generating new [datafiles](/docs/building-datafiles/) and uploading them, the workflow will also take care of pushing the Featurevisor [state files](/docs/state-files) back to the repository, so that future builds will be built on top of the latest state. Once uploaded, your datafiles will be accessible as: `https://<username>.github.io/<repository>/<environment>/featurevisor-tag-<tag>.json`. You may want to take it a step further by [setting up custom domains (or subdomains)](https://docs.github.com/articles/using-a-custom-domain-with-github-pages/) for your GitHub Pages project. Otherwise, you are good to go. Learn how to consume datafiles from URLs directly using [SDKs](/docs/sdks). ## Full example You can find a fully functional repository based on this guide here: [https://github.com/meirroth/featurevisor-example-github](https://github.com/meirroth/featurevisor-example-github). ## Sequential builds In case you are worried about simultaneous builds triggered by multiple Pull Requests merged in quick succession, you can learn about mitigating any unintended issues [here](/docs/integrations/github-actions/#sequential-builds). --- title: Cloudflare Pages nextjs: metadata: title: Cloudflare Pages description: Learn how to upload Featurevisor datafiles to Cloudflare Pages openGraph: title: Cloudflare Pages description: Learn how to upload Featurevisor datafiles to Cloudflare Pages images: - url: /img/og/docs-integrations-cloudflare-pages.png --- Set up continuous integration and deployment of your Featurevisor project with GitHub Actions and Cloudflare Pages. {% .lead %} See more about the GitHub Actions set up in the previous guide [here](/docs/integrations/github-actions). ## Creating a new project This guide assumes you have created a new Featurevisor project using the CLI: ```{% title="Command" %} $ mkdir my-featurevisor-project && cd my-featurevisor-project $ npx @featurevisor/cli init $ npm install ``` ## Cloudflare Pages We are going to be uploading to and serving our datafiles from [Cloudflare Pages](https://pages.cloudflare.com/). Cloudflare Pages is a product that allows you to host your static sites and apps on Cloudflare's global network. Given that a Featurevisor project generates static datafiles (JSON files), it is a great fit for our use case. Make sure you already have a Cloudflare Pages project set up, and then use it in the publish workflow later. {% callout type="note" title="Note about Cloudflare Pages automatic deployments" %} Cloudflare Pages is set to auto-deploy your site on every push. This could interfere with our GitHub publish action. To prevent this, you can turn off auto deployment by following the steps in this [Cloudflare documentation](https://developers.cloudflare.com/pages/configuration/branch-build-controls/).
{% /callout %} ## Secrets Follow the guide [here](https://developers.cloudflare.com/pages/how-to/use-direct-upload-with-continuous-integration/), and set up these two secrets in your GitHub repository's `Settings > Secrets and variables > Actions` section: - `CLOUDFLARE_ACCOUNT_ID` - `CLOUDFLARE_API_TOKEN` ## Repository settings Make sure you have `Read and write permissions` enabled in your GitHub repository's `Settings > Actions > General > Workflow permissions` section. ## Workflows We will be covering two workflows for our set up with GitHub Actions. ### Checks This workflow will be triggered on every push to the repository targeting any non-master or non-main branches. This will help identify any issues with your Pull Requests early before you merge them into your main branch. ```yml {% path=".github/workflows/checks.yml" %} name: Checks on: push: branches-ignore: - main - master jobs: checks: name: Checks runs-on: ubuntu-latest timeout-minutes: 10 steps: - uses: actions/checkout@v4 - uses: actions/setup-node@v4 with: node-version: 20 - name: Install dependencies run: npm ci - name: Lint run: npx featurevisor lint - name: Test specs run: npx featurevisor test - name: Build run: npx featurevisor build ``` ### Publish This workflow is intended to be run on every push to your main (or master) branch, and is supposed to handle uploading of your generated datafiles to Cloudflare Pages: ```yml {% path=".github/workflows/publish.yml" %} name: Publish on: push: branches: - main - master jobs: publish: name: Publish runs-on: ubuntu-latest timeout-minutes: 10 steps: - uses: actions/checkout@v4 - uses: actions/setup-node@v4 with: node-version: 20 - name: Install dependencies run: npm ci - name: Lint run: npx featurevisor lint - name: Test specs run: npx featurevisor test - name: Build run: npx featurevisor build - name: Upload to Cloudflare Pages run: | echo "It works." > datafiles/index.html npx wrangler pages deploy datafiles --project-name="YOUR_CLOUDFLARE_PAGES_PROJECT_NAME" env: CLOUDFLARE_ACCOUNT_ID: ${{ secrets.CLOUDFLARE_ACCOUNT_ID }} CLOUDFLARE_API_TOKEN: ${{ secrets.CLOUDFLARE_API_TOKEN }} - name: Git configs run: | git config user.name "${{ github.actor }}" git config user.email "${{ github.actor }}@users.noreply.github.com" - name: Push back to origin run: | git add .featurevisor/* git commit -m "[skip ci] Revision $(cat .featurevisor/REVISION)" git push ``` After generating new [datafiles](/docs/building-datafiles/) and uploading them, the workflow will also take care of pushing the Featurevisor [state files](/docs/state-files) back to the repository, so that future builds will be built on top of the latest state. Once uploaded, your datafiles will be accessible as: `https://<project-name>.pages.dev/<environment>/featurevisor-tag-<tag>.json`. You may want to take it a step further by setting up custom domains (or subdomains) for your Cloudflare Pages project. Otherwise, you are good to go. Learn how to consume datafiles from URLs directly using [SDKs](/docs/sdks). ## Full example You can find a fully functional repository based on this guide here: [https://github.com/featurevisor/featurevisor-example-cloudflare](https://github.com/featurevisor/featurevisor-example-cloudflare). ## Sequential builds In case you are worried about simultaneous builds triggered by multiple Pull Requests merged in quick succession, you can learn about mitigating any unintended issues [here](/docs/integrations/github-actions/#sequential-builds).
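For reference, consuming the uploaded datafile from an application could look like the following minimal sketch (assuming the v1 JavaScript SDK; the project name, environment, and tag in the URL are illustrative, following the pattern described above):

```js
import { createInstance } from '@featurevisor/sdk'

// illustrative URL following the Cloudflare Pages pattern described above
const DATAFILE_URL =
  'https://my-featurevisor-project.pages.dev/production/featurevisor-tag-all.json'

const f = createInstance({
  datafileUrl: DATAFILE_URL,
})

// wait until the datafile has been fetched before evaluating
await f.onReady()

console.log(f.isEnabled('my_feature', { userId: '123' }))
```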
--- title: WebSocket & PartyKit nextjs: metadata: title: WebSocket & PartyKit description: Learn how to integrate Featurevisor with PartyKit via WebSocket for realtime updates. openGraph: title: WebSocket & PartyKit description: Learn how to integrate Featurevisor with PartyKit via WebSocket for realtime updates. images: - url: /img/og/docs-integrations-partykit.png --- Fetch the latest Featurevisor [datafiles](/docs/building-datafiles/) in already running applications as soon as there are new changes, by listening to messages via [WebSocket](https://developer.mozilla.org/en-US/docs/Web/API/WebSocket) powered by [PartyKit](https://partykit.io). {% .lead %} ## Benefits of realtime updates Having realtime datafile updates in our application(s) can be beneficial in many ways: - **Immediate adaptation**: Development and product teams can adjust features and see them impact users immediately without needing the users to restart/reload their apps. - **Optimized performance**: By pushing updates in realtime, we avoid the overhead of periodic checks or polling, leading to faster application responses and reduced server strain. - **Proactive issue mitigation**: If a newly released feature is causing issues, it can be turned off instantly, minimizing the impact on users and potentially saving the organization from negative publicity or user churn. - **Increased confidence**: Knowing that features can be quickly adjusted or rolled back in realtime gives teams more confidence to experiment, test, and release. ## WebSocket [WebSocket](https://developer.mozilla.org/en-US/docs/Web/API/WebSocket) is a communication protocol that provides full-duplex communication channels over a single TCP connection. By full-duplex, it means both the client and server can send messages to each other independently at the same time by keeping the connection alive. Since WebSocket is a great way to keep a connection alive between the client and server, we can use this protocol to listen to events from a server that can tell us to trigger a new refresh in our SDK instance, as soon as there have been any new updates in our Featurevisor project (the Git repository). ## What is PartyKit? [PartyKit](https://partykit.io) is an open source deployment platform for AI agents, multiplayer and local-first apps, games, and websites. It can help us create a new realtime service that we can send messages to from our CI/CD pipeline whenever there are new changes in our Featurevisor project, and then listen to those messages in our application(s) using the WebSocket API to trigger a new refresh in our SDK instance. ## The whole flow in steps - We have a Featurevisor project in a Git repository - We have a CI/CD pipeline that [builds our datafiles](/docs/building-datafiles/) and [deploys](/docs/deployment/) them to a CDN - The CI/CD pipeline will send a message to our PartyKit server whenever there are new changes in our Featurevisor project - Our PartyKit server will receive the message and broadcast it to all connected apps - Our application(s) will listen to the message and fetch the latest datafile from the CDN Let's start implementing this flow step by step.
## Create a new PartyKit server We can initialize a new npm project and install PartyKit: ```{% title="Command" %} $ npm install --save partykit@beta ``` Then we create a new `server.js` file and add the following code: ```js {% path="your-service/server.js" %} // replace with your own secret, and inject via environment variable const PARTY_SECRET_VALUE = 'party-secret' const PARTY_SECRET_HEADER = 'x-partykit-secret' // the event type we will be broadcasting to all connected apps const REFRESH_TYPE = 'refresh' export default { // handle incoming request coming from our CI/CD pipeline async onRequest(request, room) { if (request.method === 'POST') { const body = await request.json() const secretInHeader = request.headers.get(PARTY_SECRET_HEADER) // once we identify the request is coming from our own CI/CD pipeline, // we broadcast a message to all connected apps if (secretInHeader === PARTY_SECRET_VALUE) { room.broadcast( JSON.stringify({ type: REFRESH_TYPE, }), ) return new Response( `Message sent to ${room.connections.size} connected participants`, ) } } return new Response(`Nothing to see here.`) }, } ``` To test locally: ```{% title="Command" %} $ npx partykit dev server.js ``` Now that we have our server ready, we can deploy it: ```{% title="Command" %} $ npx partykit deploy server.js --name my-party ``` ## Send a message from CI/CD pipeline We can build on top of one of our existing guides on how to set up a CI/CD pipeline and deploy our generated datafiles using [GitHub Actions](/docs/integrations/github-actions) & [Cloudflare Pages](/docs/integrations/cloudflare-pages). This guide uses GitHub Actions, but you are free to choose any other tool of your preference. We can add a new step in our workflow that will send a message to our PartyKit server whenever there are new changes: ```yml {% path=".github/workflows/publish.yml" %} # ... jobs: publish: name: Publish runs-on: ubuntu-latest timeout-minutes: 10 steps: # ... # add this new step after uploading the generated datafiles - name: Send message to PartyKit run: | curl -X POST \ -H "Content-Type: application/json" \ -H "X-PartyKit-Secret: " \ -d '{"type": "refresh"}' \ https://..partykit.dev/party/featurevisor ``` The `X-PartyKit-Secret` header is there so that our server only accepts messages from our own CI/CD pipeline and not from anyone else. You are free to take any other approach to better manage your security. ## Listening to messages in our application Now that we have a PartyKit server that can receive messages from our CI/CD pipeline and broadcast them to all connected applications, we (as one of those applications) can listen to those messages and trigger a new refresh of our SDK instance: ```js {% path="your-app/index.js" %} import { createInstance } from '@featurevisor/sdk' const DATAFILE_URL = 'https://cdn.yoursite.com/datafile.json' const WEBSOCKET_URL = 'wss://..partykit.dev/party/featurevisor' function fetchDatafile() { return fetch(DATAFILE_URL) .then((response) => response.json()) } const initialDatafileContent = await fetchDatafile() const f = createInstance({ datafile: initialDatafileContent, }) const socket = new WebSocket(WEBSOCKET_URL) socket.onmessage = async (event) => { const message = JSON.parse(event.data) if (message.type === 'refresh') { const newDatafileContent = await fetchDatafile() f.setDatafile(newDatafileContent) } } ``` We just did that without even needing any new library, because the WebSocket API is natively supported in modern browsers.
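One practical note: WebSocket connections can drop (network changes, server redeploys), so in a real application you may want to reconnect automatically. A minimal sketch, reusing `WEBSOCKET_URL`, `fetchDatafile()`, and the SDK instance `f` from the snippet above:

```js {% path="your-app/index.js" %}
// `WEBSOCKET_URL`, `fetchDatafile()`, and `f` are assumed from the snippet above
function connect() {
  const socket = new WebSocket(WEBSOCKET_URL)

  socket.onmessage = async (event) => {
    const message = JSON.parse(event.data)

    if (message.type === 'refresh') {
      const newDatafileContent = await fetchDatafile()
      f.setDatafile(newDatafileContent)
    }
  }

  // reconnect with a small delay whenever the connection closes
  socket.onclose = () => {
    setTimeout(connect, 5000)
  }
}

connect()
```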
{% callout type="note" title="WebSocket support in Node.js" %} If you are using Node.js, you can consider using the [ws](https://github.com/websockets/ws) package. {% /callout %} With just a few lines of code, we made our application listen to messages from our PartyKit server and trigger a new refresh of our SDK instance as soon as there are new changes in our Featurevisor project, making every feature update a realtime update. --- title: GitHub Actions (GHA) nextjs: metadata: title: GitHub Actions (GHA) description: Learn how to set up CI/CD workflows with GitHub Actions for Featurevisor openGraph: title: GitHub Actions (GHA) description: Learn how to set up CI/CD workflows with GitHub Actions for Featurevisor images: - url: /img/og/docs-integrations-github-actions.png --- Set up continuous integration and deployment of your Featurevisor project with GitHub Actions. {% .lead %} Find more info about GitHub Actions [here](https://github.com/features/actions). ## Creating a new project This guide assumes you have created a new Featurevisor project using the CLI: ```{% title="Command" %} $ mkdir my-featurevisor-project && cd my-featurevisor-project $ npx @featurevisor/cli init $ npm install ``` ## Repository settings Make sure you have `Read and write permissions` enabled in your GitHub repository's `Settings > Actions > General > Workflow permissions` section. ## Workflows We will be covering two workflows for our setup with GitHub Actions. ### Checks This workflow will be triggered on every push to the repository targeting any non-master or non-main branches. This will help identify any issues with your Pull Requests early, before you merge them into your main branch. ```yml {% path=".github/workflows/checks.yml" %} name: Checks on: push: branches-ignore: - main - master jobs: checks: name: Checks runs-on: ubuntu-latest timeout-minutes: 10 steps: - uses: actions/checkout@v4 - uses: actions/setup-node@v4 with: node-version: 20 - name: Install dependencies run: npm ci - name: Lint run: npx featurevisor lint - name: Test specs run: npx featurevisor test - name: Build run: npx featurevisor build ``` ### Publish This workflow is intended to be run on every push to your main (or master) branch, and is supposed to handle uploading of your generated datafiles as well: ```yml {% path=".github/workflows/publish.yml" %} name: Publish on: push: branches: - main - master jobs: ci: name: Publish runs-on: ubuntu-latest timeout-minutes: 10 steps: - uses: actions/checkout@v4 - uses: actions/setup-node@v4 with: node-version: 20 - name: Install dependencies run: npm ci - name: Lint run: npx featurevisor lint - name: Test specs run: npx featurevisor test - name: Build run: npx featurevisor build - name: Upload datafiles run: echo "Uploading..." # Update "datafiles" directory content based on your CDN set up - name: Git configs run: | git config user.name "${{ github.actor }}" git config user.email "${{ github.actor }}@users.noreply.github.com" - name: Push back to origin run: | git add .featurevisor/* git commit -m "[skip ci] Revision $(cat .featurevisor/REVISION)" git push ``` After generating new [datafiles](/docs/building-datafiles/) and uploading them, the workflow will also take care of pushing the Featurevisor [state files](/docs/state-files) back to the repository, so that future builds will be built on top of the latest state. If you want an example of an actual uploading step, see the [Cloudflare Pages](/docs/integrations/cloudflare-pages/) integration guide.
## Sequential builds It is possible you might want to run the publish workflow sequentially for every merged Pull Request, in case multiple Pull Requests are merged in quick succession. ### Queue You can consider using the [softprops/turnstyle](https://github.com/softprops/turnstyle) GitHub Action to run the publish workflow of all your merged Pull Requests sequentially. ### Branch protection rules In addition, you can make it stricter by requiring all Pull Request authors to have their branches up to date with the latest main branch before merging: - [Managing suggestions to update pull request branches](https://docs.github.com/en/repositories/configuring-branches-and-merges-in-your-repository/configuring-pull-request-merges/managing-suggestions-to-update-pull-request-branches) - [Create branch protection rule](https://docs.github.com/en/repositories/configuring-branches-and-merges-in-your-repository/managing-protected-branches/managing-a-branch-protection-rule#creating-a-branch-protection-rule) (see #7) - Require branches to be up to date before merging --- title: Linting nextjs: metadata: title: Linting description: Lint your Featurevisor definition files openGraph: title: Linting description: Lint your Featurevisor definition files images: - url: /img/og/docs.png --- Featurevisor provides a CLI command to lint your definition files, making sure they are all valid and won't cause any issues when you build your datafiles. {% .lead %} ## Usage Run: ```{% title="Command" %} $ npx featurevisor lint ``` And it will show you the errors in your definition files, if any. If any errors are found, it will terminate with a non-zero exit code. ## CLI options ### `keyPattern` You can also filter keys using regex patterns: ```{% title="Command" %} $ npx featurevisor lint --keyPattern="myKeyHere" ``` ### `entityType` If you want to filter it down further by entity type: ```{% title="Command" %} $ npx featurevisor lint --keyPattern="myKeyHere" --entityType="feature" ``` Possible values for `--entityType`: - `attribute` - `segment` - `feature` - `group` - `test` ## NPM scripts If you are using npm scripts for linting your Featurevisor project like this: ```js {% path="package.json" %} { "scripts": { "lint": "featurevisor lint" } } ``` You can then pass your options in the CLI after `--`: ``` $ npm run lint -- --keyPattern="myKeyHere" ``` --- title: Linting YAMLs nextjs: metadata: title: Linting YAMLs description: Lint your Featurevisor YAML files openGraph: title: Linting YAMLs description: Lint your Featurevisor YAML files images: - url: /img/og/docs.png --- This page has moved [here](/docs/linting). --- title: llms.txt nextjs: metadata: title: llms.txt description: Access Featurevisor documentation as llms.txt file for your AI tools openGraph: title: llms.txt description: Access Featurevisor documentation as llms.txt file for your AI tools images: - url: /img/og/docs-llms.png --- Access Featurevisor documentation as an [`llms.txt`](https://featurevisor.com/llms.txt) file for your AI tools.
## Links - [https://featurevisor.com/llms.txt](https://featurevisor.com/llms.txt) --- title: Migrating from v1 to v2 showInlineTOC: true nextjs: metadata: title: Migrating from v1 to v2 description: Guide for migrating from Featurevisor v1 to v2 openGraph: title: Migrating from v1 to v2 description: Guide for migrating from Featurevisor v1 to v2 images: - url: /img/og/docs-migrations-v2.png --- Detailed guide for migrating existing Featurevisor projects (using Featurevisor CLI) and applications (using Featurevisor SDKs) to latest v2.0. --- ## Defining attributes ### Attribute as an object {% label="New" labelType="success" %} Attribute values in context can now also be flat objects. You can still continue to use other existing attribute types without any changes. This change is only if you wish to define attributes as objects. #### Defining attribute {% row %} {% column %} ```yml {% title="Before" path="attributes/userId.yml" %} description: My userId attribute type: string ``` ```yml {% title="Before" path="attributes/userCountry.yml" %} description: My userCountry attribute type: string ``` {% /column %} {% column %} ```yml {% title="After" path="attributes/user.yml" highlight="3,6,9" %} description: My user attribute description type: object properties: id: type: string description: The user ID country: type: string description: The country of the user ``` {% /column %} {% /row %} #### Passing attribute in context When evaluating values in your application with SDKs, you can pass the value as an object: {% row %} {% column %} ```js {% title="Before" path="your-app/index.js" highlight="4-5" %} const f; // Featurevisor SDK instance const context = { userId: '123', userCountry: 'nl', browser: 'chrome', } const isFeatureEnabled = f.isEnabled( 'myFeature', context ) ``` {% /column %} {% column %} ```js {% title="After" path="your-app/index.js" highlight="4-7" %} const f; // Featurevisor SDK instance const context = { user: { id: '123', country: 'nl', }, browser: 'chrome', } const isFeatureEnabled = f.isEnabled( 'myFeature', context ) ``` {% /column %} {% /row %} #### Dot separated path You can make use of dot-separated paths to specify nested attributes. For example, inside features: {% row %} {% column %} ```yml {% title="Before" path="features/myFeature.yml" highlight="3" %} # ... bucketBy: userId ``` {% /column %} {% column %} ```yml {% title="After" path="features/myFeature.yml" highlight="3" %} # ... bucketBy: user.id ``` {% /column %} {% /row %} And also in conditions: {% row %} {% column %} ```yml {% title="Before" path="segments/netherlands.yml" highlight="4" %} description: Netherlands segment conditions: - attribute: userCountry operator: equals value: nl ``` {% /column %} {% column %} ```yml {% title="After" path="segments/netherlands.yml" highlight="4" %} description: Netherlands segment conditions: - attribute: user.country operator: equals value: nl ``` {% /column %} {% /row %} Learn more in [Attributes](/docs/attributes) page. ## Defining segments ### Conditions targeting everyone {% label="New" labelType="success" %} We can now use asterisks (`*`) in conditions (either directly in segments or in features) to match any condition: ```yml {% path="segments/mySegment.yml" highlight="3" %} description: My segment description conditions: '*' ``` This is very handy when you wish to start with an empty segment, then later add conditions to it. 
### Operator: exists {% label="New" labelType="success" %} Checks if the attribute exists in the context: {% row %} {% column %} ```yml {% path="segments/mySegment.yml" highlight="4" %} description: My segment description conditions: - attribute: browser operator: exists ``` {% /column %} {% column %} ```js {% path="your-app/index.js" highlight="5" %} const f; // Featurevisor SDK instance const context = { userId: '123', browser: 'chrome', // exists } const isFeatureEnabled = f.isEnabled( 'myFeature', context ) ``` {% /column %} {% /row %} ### Operator: notExists {% label="New" labelType="success" %} Checks if the attribute does not exist in the context: {% row %} {% column %} ```yml {% path="segments/mySegment.yml" highlight="4" %} description: My segment description conditions: - attribute: browser operator: notExists ``` {% /column %} {% column %} ```js {% path="your-app/index.js" highlight="5" %} const f; // Featurevisor SDK instance const context = { userId: '123', // `browser` does not exist } const isFeatureEnabled = f.isEnabled( 'myFeature', context ) ``` {% /column %} {% /row %} ### Operator: includes {% label="New" labelType="success" %} Checks if a certain value is included in the attribute's array (of strings) value. {% row %} {% column %} ```yml {% path="segments/mySegment.yml" highlight="4" %} description: My segment description conditions: - attribute: permissions operator: includes value: write ``` {% /column %} {% column %} ```js {% path="your-app/index.js" highlight="7" %} const f; // Featurevisor SDK instance const context = { userId: '123', permissions: [ 'read', 'write', // included 'delete', ], } const isFeatureEnabled = f.isEnabled( 'myFeature', context ) ``` {% /column %} {% /row %} ### Operator: notIncludes {% label="New" labelType="success" %} Checks if a certain value is not included in the attribute's array (of strings) value. 
{% row %} {% column %} ```yml {% path="segments/mySegment.yml" highlight="4" %} description: My segment description conditions: - attribute: permissions operator: notIncludes value: write ``` {% /column %} {% column %} ```js {% path="your-app/index.js" highlight="7" %} const f; // Featurevisor SDK instance const context = { userId: '123', permissions: [ 'read', // 'write' is not included 'delete', ], } const isFeatureEnabled = f.isEnabled( 'myFeature', context ) ``` {% /column %} {% /row %} ### Operator: matches {% label="New" labelType="success" %} Checks if the attribute's value matches a regular expression: {% row %} {% column %} ```yml {% path="segments/mySegment.yml" highlight="4" %} description: My segment description conditions: - attribute: userAgent operator: matches value: '(Chrome|Firefox)\/([6-9]\d|\d{3,})' # optional regex flags regexFlags: i ``` {% /column %} {% column %} ```js {% path="your-app/index.js" highlight="5" %} const f; // Featurevisor SDK instance const context = { userId: '123', userAgent: window.navigator.userAgent, } const isFeatureEnabled = f.isEnabled( 'myFeature', context ) ``` {% /column %} {% /row %} ### Operator: notMatches {% label="New" labelType="success" %} Checks if the attribute's value does not match a regular expression: {% row %} {% column %} ```yml {% path="segments/mySegment.yml" highlight="4" %} description: My segment description conditions: - attribute: userAgent operator: notMatches value: '(Chrome|Firefox)\/([6-9]\d|\d{3,})' # optional regex flags regexFlags: i ``` {% /column %} {% column %} ```js {% path="your-app/index.js" highlight="5" %} const f; // Featurevisor SDK instance const context = { userId: '123', userAgent: window.navigator.userAgent, } const isFeatureEnabled = f.isEnabled( 'myFeature', context ) ``` {% /column %} {% /row %} Learn more in [Segments](/docs/segments) page. ## Defining features ### Defining variable schema {% label="Breaking" labelType="error" %} {% row %} {% column %} ```yml {% title="Before" path="features/myFeature.yml" highlight="4" %} # ... variablesSchema: - key: myVariableKey type: string defaultValue: 'default value' ``` {% /column %} {% column %} ```yml {% title="After" path="features/myFeature.yml" highlight="4" %} # ... variablesSchema: myVariableKey: type: string defaultValue: 'default value' ``` {% /column %} {% /row %} Learn more in [Variables](/docs/features/#variables) section. ### When feature is disabled, use default variable value {% label="New" labelType="success" %} When a feature itself is evaluated as disabled, its variable values by default always get evaluated as empty (`undefined` in v1, and `null` in v2). Now, you can choose on a per variable basis whether to serve the default value if the feature is disabled or default to `null`. {% row %} {% column %} ```yml {% title="Before" path="features/myFeature.yml" highlight="" %} # ... variablesSchema: - key: myVariableKey type: string defaultValue: default value ``` {% /column %} {% column %} ```yml {% title="After" path="features/myFeature.yml" highlight="10" %} # ... variablesSchema: myVariableKey: type: string defaultValue: default value # optionally serve default value # when feature is disabled useDefaultWhenDisabled: true ``` {% /column %} {% /row %} Learn more in [Variables](/docs/features/#variables) section. 
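To make the difference concrete on the SDK side, here is a minimal sketch (the feature and variable keys are illustrative) of what evaluating such a variable returns when the feature itself is disabled:

```js {% path="your-app/index.js" %}
const f; // Featurevisor SDK instance

const context = { userId: '123' }

// assuming `myFeature` evaluates as disabled for this context
const value = f.getVariable('myFeature', 'myVariableKey', context)

// without `useDefaultWhenDisabled`: value === null
// with `useDefaultWhenDisabled: true`: value === 'default value'
```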
### When feature is disabled, serve different variable value {% label="New" labelType="success" %} Instead of serving the default value, if you want a different value to be served for your variable when the feature itself is disabled, you can do this: {% row %} {% column %} ```yml {% title="Before" path="features/myFeature.yml" highlight="" %} # ... variablesSchema: - key: myVariableKey type: string defaultValue: default value ``` {% /column %} {% column %} ```yml {% title="After" path="features/myFeature.yml" highlight="10" %} # ... variablesSchema: myVariableKey: type: string defaultValue: default value # optionally serve different value # when feature is disabled disabledValue: different value for disabled feature ``` {% /column %} {% /row %} Learn more in [Variables](/docs/features/#variables) section. ### When feature is disabled, serve a specific variation value {% label="New" labelType="success" %} If the feature itself is evaluated as disabled, then its variation value will be evaluated as `null` by default. If you wish to serve a specific variation value in those cases, you can do this: {% row %} {% column %} ```yml {% title="Before" path="features/myFeature.yml" highlight="" %} # ... variations: - value: control weight: 50 - value: treatment weight: 50 ``` {% /column %} {% column %} ```yml {% title="After" path="features/myFeature.yml" highlight="10" %} # ... variations: - value: control weight: 50 - value: treatment weight: 50 disabledVariationValue: control ``` {% /column %} {% /row %} Learn more in [Variations](/docs/features/#variations) section. ### Variable overrides from variations {% label="Breaking" labelType="error" %} {% row %} {% column %} ```yml {% title="Before" path="features/myFeature.yml" highlight="9-15" %} # ... variations: - value: control weight: 50 - value: treatment weight: 50 # had to be used together variables: - key: bgColor value: blue overrides: - segments: netherlands value: orange ``` {% /column %} {% column %} ```yml {% title="After" path="features/myFeature.yml" highlight="9-15" %} # ... variations: - value: control weight: 50 - value: treatment weight: 50 # can be overridden independently variables: bgColor: blue variableOverrides: bgColor: - segments: netherlands value: orange ``` {% /column %} {% /row %} Learn more in [Variables](/docs/features/#variables) section. ### Defining rules {% label="Breaking" labelType="error" %} Rules have moved to the top level of the feature definition, and the `environments` property is no longer used. This has resulted in less nesting and more clarity in defining rules. {% row %} {% column %} ```yml {% title="Before" path="features/myFeature.yml" highlight="3-5" %} # ... environments: production: rules: - key: everyone segments: '*' percentage: 100 ``` {% /column %} {% column %} ```yml {% title="After" path="features/myFeature.yml" highlight="4-5" %} # ... rules: production: - key: everyone segments: '*' percentage: 100 ``` {% /column %} {% /row %} Learn more in [Rules](/docs/features/#rules) section. ### Defining forced overrides {% label="Breaking" labelType="error" %} Similar to rules above, force entries have moved to the top level of the feature definition as well. {% row %} {% column %} ```yml {% title="Before" path="features/myFeature.yml" highlight="3-5" %} # ... environments: production: force: - segments: qa enabled: true ``` {% /column %} {% column %} ```yml {% title="After" path="features/myFeature.yml" highlight="4-5" %} # ...
force: production: - segments: qa enabled: true ``` {% /column %} {% /row %} Learn more in [Force](/docs/features/#force) section. ### Exposing feature in datafile {% label="Breaking" labelType="error" %} The `expose` property had a very rare use case, that controlled the inclusion of a feature in generated datafiles targeting a specific environment and/or tag. {% row %} {% column %} ```yml {% title="Before" path="features/myFeature.yml" highlight="3-5" %} # ... environments: production: expose: false ``` {% /column %} {% column %} ```yml {% title="After" path="features/myFeature.yml" highlight="4-5" %} # ... expose: production: false ``` {% /column %} {% /row %} Learn more in [Expose](/docs/features/#expose) section. ### Variation weight overrides {% label="New" labelType="success" %} If you are running experiments, you can now override the weights of your variations on a per rule basis: ```yml {% path="features/myFeature.yml" highlight="4-9,17-20" %} # ... variations: # common weights for all rules - value: control weight: 50 - value: treatment weight: 50 rules: production: - key: netherlands segments: netherlands percentage: 100 # override the weights here for this rule alone variationWeights: control: 10 treatment: 90 - key: everyone segments: '*' percentage: 100 ``` Learn more in [Variations](/docs/features/#variations) section. --- ## Project configuration ### outputDirectoryPath {% label="Breaking" labelType="error" %} Default output directory path has been changed from `dist` to `datafiles`. This is to better reflect the contents of the directory. {% row %} {% column %} ```js {% title="Before" path="featurevisor.config.js" highlight="3" %} module.exports = { // defaulted to this directory outputDirectoryPath: 'dist', } ``` {% /column %} {% column %} ```js {% title="After" path="featurevisor.config.js" highlight="3" %} module.exports = { // defaults to this directory datafilesDirectoryPath: 'datafiles', } ``` {% /column %} {% /row %} ### datafileNamePattern {% label="New" labelType="success" %} Previously defaulted to `datafile-%s.json`, it has been changed to `featurevisor-%s.json`. {% row %} {% column %} ```js {% title="Before" path="featurevisor.config.js" highlight="2" %} module.exports = { // no option available to customize it } ``` {% /column %} {% column %} ```js {% title="After" path="featurevisor.config.js" highlight="2" %} module.exports = { datafileNamePattern: 'featurevisor-%s.json', } ``` {% /column %} {% /row %} Learn more in [Configuration](/docs/configuration/) page. --- ## CLI usage ### Upgrade to latest CLI {% label="New" labelType="success" %} In your Featurevisor project repository: ```text {% title="Command" %} $ npm install --save @featurevisor/cli@2 ``` ### Building v1 datafiles {% label="New" labelType="success" %} It is understandable you may have applications that still consume v1 datafiles using v1 compatible SDKs. To keep supporting both v1 and v2 from the same project in a backwards compatible way, you can build new v2 datafiles as usual: ```{% title="Command" %} $ npx featurevisor build ``` and on top of that, also build v1 datafiles: ```{% title="Command" %} $ npx featurevisor build \ --schema-version=1 \ --no-state-files \ --datafiles-dir=datafiles/v1 ``` ### Using hash as datafile revision {% label="New" labelType="success" %} By default, every time you build datafiles, a new revision is generated which is an incremental number. 
```{% title="Command" %} $ npx featurevisor build ``` You may often have changes like updating a feature's description, which do not require a new revision number. To avoid that, you can pass the `--revisionFromHash` option to the CLI: ```{% title="Command" %} $ npx featurevisor build --revisionFromHash ``` If individual datafile contents do not change since the last build, the revision will not change either. This makes it easy to implement caching when serving datafiles from a CDN. ### Datafile naming convention {% label="Breaking" labelType="error" %} The naming convention of built datafiles has been changed from `datafile-tag-.json` to `featurevisor-tag-.json` to help distinguish between Featurevisor datafiles and other datafiles that may be used in your project: {% row %} {% column %} ```{% title="Before" highlight="1,4,6" %} $ tree dist . ├── production │ └── datafile-tag-all.json └── staging └── datafile-tag-all.json 2 directories, 2 files ``` {% /column %} {% column %} ```{% title="After" highlight="1,4,6" %} $ tree datafiles . ├── production │ └── featurevisor-tag-all.json └── staging └── featurevisor-tag-all.json 2 directories, 2 files ``` {% /column %} {% /row %} If you wish to maintain the old naming convention, you can update your project configuration: ```js {% path="featurevisor.config.js" highlight="4-5" %} module.exports = { // ... datafilesDirectoryPath: 'dist', datafileNamePattern: 'datafile-%s.json', } ``` --- ## JavaScript SDK usage ### Upgrade to latest SDK {% label="New" labelType="success" %} In your application repository: ```text {% title="Command" %} $ npm install --save @featurevisor/sdk@2 ``` ### Fetching datafile {% label="Breaking" labelType="error" %} This option has been removed from the SDK. You are now required to take care of fetching the datafile yourself and passing it to the SDK: {% row %} {% column %} ```js {% title="Before" path="your-app/index.js" highlight="6-10" %} import { createInstance } from '@featurevisor/sdk' const DATAFILE_URL = '...' const f = createInstance({ datafileUrl: DATAFILE_URL, onReady: () => { console.log('SDK is ready') }, }) ``` {% /column %} {% column %} ```js {% title="After" path="your-app/index.js" highlight="9" %} import { createInstance } from '@featurevisor/sdk' const DATAFILE_URL = '...' const datafileContent = await fetch(DATAFILE_URL) .then((res) => res.json()) const f = createInstance({ datafile: datafileContent, }) ``` `onReady` callback is no longer needed, as the SDK is ready immediately after you pass the datafile. {% /column %} {% /row %} ### Refreshing datafile {% label="Breaking" labelType="error" %} This option has been removed from the SDK. You are now required to take care of fetching the datafile yourself and then setting it on the existing SDK instance: {% row %} {% column %} ```js {% title="Before" path="your-app/index.js" highlight="6-16,20,23-24" %} import { createInstance } from '@featurevisor/sdk' const DATAFILE_URL = '...' const f = createInstance({ datafileUrl: DATAFILE_URL, refreshInterval: 60, // every 60 seconds onRefresh: () => { console.log('Datafile refreshed') }, onUpdate: () => { console.log('New datafile revision detected') }, }) // manually refresh f.refresh() // stop/start refreshing f.stopRefreshing() f.startRefreshing() ``` {% /column %} {% column %} ```js {% title="After" path="your-app/index.js" highlight="12-19,26" %} import { createInstance } from '@featurevisor/sdk' const DATAFILE_URL = '...'
const datafileContent = await fetch(DATAFILE_URL) .then((res) => res.json()) const f = createInstance({ datafile: datafileContent, }) const unsubscribe = f.on("datafile_set", ({ revision, // new revision previousRevision, revisionChanged, // true if revision has changed features, // list of all affected feature keys }) => { console.log('Datafile set') }); // custom interval setInterval(async function () { const datafileContent = await fetch(DATAFILE_URL) .then((res) => res.json()) f.setDatafile(datafileContent) }, 60 * 1000); ``` `refreshInterval`, `onRefresh` and `onUpdate` options and `refresh` method are no longer supported. {% /column %} {% /row %} ### Getting variation {% label="Soft breaking" labelType="warning" %} When evaluating the variation of a feature that is disabled, the SDK used to return `undefined` in v1. This was challenging to handle in non-JavaScript SDKs, since there is no concept of `undefined` as a type there. Therefore, it has been changed to return `null` in v2. {% row %} {% column %} ```js {% title="Before" path="your-app/index.js" highlight="5" %} const f; // Featurevisor SDK instance const context = { userId: '123' } // could be either `string` or `undefined` const variation = f.getVariation( 'myFeature', context ) ``` {% /column %} {% column %} ```js {% title="After" path="your-app/index.js" highlight="5" %} const f; // Featurevisor SDK instance const context = { userId: '123' } // now either `string` or `null` const variation = f.getVariation( 'myFeature', context ) ``` {% /column %} {% /row %} ### Getting variable {% label="Soft breaking" labelType="warning" %} Similar to getting variation above, when evaluating a variable of a feature that is disabled, the SDK will now return `null` instead of `undefined`. {% row %} {% column %} ```js {% title="Before" path="your-app/index.js" highlight="5" %} const f; // Featurevisor SDK instance const context = { userId: '123' } // could be either value or `undefined` const variableValue = f.getVariable( 'myFeature', 'myVariableKey', context ) ``` {% /column %} {% column %} ```js {% title="After" path="your-app/index.js" highlight="5" %} const f; // Featurevisor SDK instance const context = { userId: '123' } // now either value or `null` const variableValue = f.getVariable( 'myFeature', 'myVariableKey', context ) ``` {% /column %} {% /row %} This is applicable for the type-specific SDK methods for variables as well: - `getVariableString` - `getVariableBoolean` - `getVariableInteger` - `getVariableDouble` - `getVariableArray` - `getVariableObject` - `getVariableJSON` ### Activation {% label="Breaking" labelType="error" %} Experiment activations are not handled by the SDK anymore. {% row %} {% column %} ```js {% title="Before" path="your-app/index.js" highlight="6-18,22" %} import { createInstance } from '@featurevisor/sdk' const f = createInstance({ // ...
onActivate: function ({ featureKey, variationValue, fullContext, captureContext, }) { // send to your analytics service here track('activation', { experiment: featureKey, variation: variationValue, userId: fullContext.userId, }) }, }) const context = { userId: '123' } f.activate('featureKey', context) ``` {% /column %} {% column %} ```js {% title="After" path="your-app/index.js" highlight="6-13" %} import { createInstance } from '@featurevisor/sdk' const f; // Featurevisor SDK instance const context = { userId: '123' } const variation = f.getVariation('myFeature', context); // send to your analytics service here track('activation', { experiment: 'myFeature', variation: variation, userId: context.userId, }) ``` `activate` method and `onActivate` option are no longer supported. You can also make use of the new [Hooks API](#hooks). {% /column %} {% /row %} ### Sticky features {% label="Breaking" labelType="error" %} {% row %} {% column %} ```js {% title="Before" path="your-app/index.js" highlight="15,19" %} import { createInstance } from '@featurevisor/sdk' const stickyFeatures = { myFeatureKey: { enabled: true, variation: 'control', variables: { myVariableKey: 'myVariableValue', }, }, } // when creating instance const f = createInstance({ stickyFeatures: stickyFeatures, }) // replacing sticky features later f.setStickyFeatures(stickyFeatures) ``` {% /column %} {% column %} ```js {% title="After" path="your-app/index.js" highlight="15,19" %} import { createInstance } from '@featurevisor/sdk' const stickyFeatures = { myFeatureKey: { enabled: true, variation: 'control', variables: { myVariableKey: 'myVariableValue', }, }, } // when creating instance const f = createInstance({ sticky: stickyFeatures, }) // replacing sticky features later f.setSticky(stickyFeatures, true) ``` Unless `true` is passed as the second argument, the sticky features will be merged with the existing ones. {% /column %} {% /row %} ### Initial features {% label="Breaking" labelType="error" %} Initial features used to be handy for setting some early values before the SDK fetched the datafile and got ready. But since the datafile fetching responsibility is now on you, initial features are no longer needed. {% row %} {% column %} ```js {% title="Before" path="your-app/index.js" highlight="15" %} import { createInstance } from '@featurevisor/sdk' const initialFeatures = { myFeatureKey: { enabled: true, variation: 'control', variables: { myVariableKey: 'myVariableValue', }, }, } // when creating instance const f = createInstance({ initialFeatures: initialFeatures, }) ``` {% /column %} {% column %} ```js {% title="After" path="your-app/index.js" highlight="15,19,22-27" %} import { createInstance } from '@featurevisor/sdk' const initialFeatures = { myFeatureKey: { enabled: true, variation: 'control', variables: { myVariableKey: 'myVariableValue', }, }, } // you can pass them as sticky instead const f = createInstance({ sticky: initialFeatures, }) // fetch and set datafile after f.setDatafile(datafileContent) // remove sticky features after f.setSticky( {}, // replacing with empty object true ) ``` {% /column %} {% /row %} ### Setting context {% label="New" labelType="success" %} {% row %} {% column %} ```js {% title="Before" path="your-app/index.js" highlight="11" %} import { createInstance } from '@featurevisor/sdk' const f = createInstance({ // ...
}) const isFeatureEnabled = f.isEnabled( 'myFeature', // pass context directly only { userId: '123' }, ) ``` {% /column %} {% column %} ```js {% title="After" path="your-app/index.js" highlight="7,11-13,16-22,33" %} import { createInstance } from '@featurevisor/sdk' const f = createInstance({ // ... // optional initial context context: { browser: 'chrome' }, }) // set more context later (append) f.setContext({ userId: '123', }) // replace currently set context entirely f.setContext( { userId: '123', browser: 'firefox', }, true, // replace ) // already set context will be used automatically const isFeatureEnabled = f.isEnabled('myFeature') // you can still pass context directly // for overriding specific attributes const isFeatureEnabledWithOverride = f.isEnabled( 'myFeature', // still allows passing context directly { browser: 'edge' }, ) ``` {% /column %} {% /row %} ### Logging {% label="Breaking" labelType="error" %} Instead of passing all log [levels](/docs/sdks/javascript/#levels) individually, you can now pass a single level to the SDK. The set level will cover all the levels below it, so you can pass `debug` to cover all the levels together. #### Creating logger instance {% label="Breaking" labelType="error" %} {% row %} {% column %} ```js {% title="Before" path="your-app/index.js" highlight="8-13" %} import { createInstance, createLogger } from '@featurevisor/sdk' const f = createInstance({ logger: createLogger({ levels: [ 'debug', 'info', 'warn', 'error', ], }) }) ``` {% /column %} {% column %} ```js {% title="After" path="your-app/index.js" highlight="8" %} import { createInstance, createLogger } from '@featurevisor/sdk' const f = createInstance({ logger: createLogger({ level: 'debug', }) }) ``` Setting `debug` will now cover all the levels together, instead of having to pass them all individually. {% /column %} {% /row %} #### Passing log level when creating SDK instance {% label="New" labelType="success" %} Alternatively, you can also pass the log level directly when creating the SDK instance: ```js {% path="your-app/index.js" highlight="4" %} import { createInstance } from '@featurevisor/sdk' const f = createInstance({ logLevel: 'debug', }) ``` #### Setting log level after creating SDK instance {% label="Breaking" labelType="error" %} You can also change the log level after creating the SDK instance: {% row %} {% column %} ```js {% title="Before" path="your-app/index.js" highlight="" %} f.setLogLevels([ 'error', 'warn', 'info', 'debug', ]) ``` {% /column %} {% column %} ```js {% title="After" path="your-app/index.js" highlight="" %} f.setLogLevel('debug') ``` {% /column %} {% /row %} Read more in [Logging](/docs/sdks/javascript/#logging) section. ### Hooks {% label="New" labelType="success" %} Hooks are a set of new APIs allowing you to intercept the evaluation process and customize it. A hook can be defined as follows: ```ts {% title="Defining a hook" path="your-app/index.ts" %} import { Hook } from "@featurevisor/sdk" const myCustomHook: Hook = { // only required property name: 'my-custom-hook', // rest of the properties below are all optional per hook // before evaluation before: function (options) { const { type, // `feature` | `variation` | `variable` featureKey, variableKey, // if type is `variable` context } = options; // update context before evaluation options.context = { ...options.context, someAdditionalAttribute: 'value', } return options }, // after evaluation after: function (evaluation, options) { const { reason // `error` | `feature_not_found` | `variable_not_found` | ...
} = evaluation if (reason === "error") { // log error return } }, // configure bucket key bucketKey: function (options) { const { featureKey, context, bucketBy, bucketKey, // default bucket key } = options; // return custom bucket key return bucketKey }, // configure bucket value (between 0 and 100,000) bucketValue: function (options) { const { featureKey, context, bucketKey, bucketValue, // default bucket value } = options; // return custom bucket value return bucketValue }, } ``` You can register the hook when creating SDK instance: ```js {% title="When creating instance" path="your-app/index.js" highlight="6" %} import { createInstance } from '@featurevisor/sdk' const f = createInstance({ // ... hooks: [myCustomHook], }) ``` You can also register the hook after creating the SDK instance: ```js {% title="After creating instance" path="your-app/index.js" highlight="3,6" %} const f; // Featurevisor SDK instance const removeHook = f.addHook(myCustomHook) // remove the hook later removeHook() ``` ### Intercepting context {% label="Breaking" labelType="error" %} {% row %} {% column %} ```js {% title="Before" path="your-app/index.js" highlight="6-12" %} import { createInstance } from '@featurevisor/sdk' const f = createInstance({ // ... interceptContext: function (context) { // modify context before evaluation return { ...context, someAdditionalAttribute: 'value', } }, }) ``` {% /column %} {% column %} ```js {% title="After" path="your-app/index.js" highlight="6-19" %} import { createInstance } from '@featurevisor/sdk' const f = createInstance({ // ... hooks: [ { name: 'intercept-context', before: function (options) { // modify context before evaluation options.context = { ...options.context, someAdditionalAttribute: 'value', } return options }, }, ], }) ``` {% /column %} {% /row %} ### Events {% label="Breaking" labelType="error" %} All the known events from v1 SDK have been removed in v2 SDK: - Readiness: see [fetching datafile](#fetching-datafile) - `onReady` option and method - `ready` event - Refreshing: see [refreshing datafile](#refreshing-datafile) - `refresh` event and method - `startRefreshing` method - `stopRefreshing` method - `onRefresh` option - `update` event - `onUpdate` option - Activation: see [activation](#activation) - `activate` event and method - `onActivate` option A new set of events has been introduced which are more generic. Because of these changes, reactivity is vastly improved allowing you to listen to the changes of specific features and react to them in a highly efficient way without having to reload or restart your application. 
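For example, a minimal sketch (the feature key is illustrative) of reacting only when a specific feature is affected by a newly set datafile, using the event payload described below:

```js {% path="your-app/index.js" %}
const f; // Featurevisor SDK instance

// re-evaluate only when `myFeature` is among the affected feature keys
f.on('datafile_set', ({ features }) => {
  if (features.includes('myFeature')) {
    const isEnabled = f.isEnabled('myFeature', { userId: '123' })
    console.log('myFeature is now', isEnabled ? 'enabled' : 'disabled')
  }
})
```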
#### datafile_set {% label="New" labelType="success" %} Will trigger when a datafile is set on the SDK instance: ```js {% path="your-app/index.js" highlight="3-10" %} const f; // Featurevisor SDK instance const unsubscribe = f.on("datafile_set", ({ revision, // new revision previousRevision, revisionChanged, // true if revision has changed features, // list of all affected feature keys }) => { console.log('Datafile set') }) unsubscribe(); ``` #### context_set {% label="New" labelType="success" %} Will trigger when context is set on the SDK instance: ```js {% path="your-app/index.js" highlight="3-7" %} const f; // Featurevisor SDK instance const unsubscribe = f.on("context_set", ({ replaced, // true if context was replaced context, // the new context }) => { console.log('Context set') }) unsubscribe(); ``` #### sticky_set {% label="New" labelType="success" %} Will trigger when sticky features are set on the SDK instance: ```js {% path="your-app/index.js" highlight="3-7" %} const f; // Featurevisor SDK instance const unsubscribe = f.on("sticky_set", ({ replaced, // true if sticky features got replaced features, // list of all affected feature keys }) => { console.log('Sticky features set') }) unsubscribe(); ``` ### Child instance {% label="New" labelType="success" %} It's one thing to deal with the same SDK instance when you are building a client-side application (think web or mobile app) where only one user is accessing the application. But when you are building a server-side application (think a REST API) serving many different users simultaneously, you may want to have different SDK instances with user- or request-specific context. Child instances make it very easy to achieve that now: ```js {% title="Primary instance" %} import { createInstance } from '@featurevisor/sdk' const f = createInstance({ datafile: datafileContent, }) // set common context for all f.setContext({ apiVersion: '5.0.0', }) ``` Afterwards, you can spawn child instances from it: ```js {% title="Child instance" highlight="3,9" %} // creating a child instance with its own context // (will get merged with parent context if available before evaluations) const childF = f.spawn({ userId: '234', country: 'nl', }) // evaluate via spawned child instance const isFeatureEnabled = childF.isEnabled('myFeature') ``` Similar to the primary instance, you can also set context and sticky features in child instances: ```js {% title="Child instance: setting context" highlight="2,7" %} // override child context later if needed childF.setContext({ country: 'de', }) // when evaluating, you can still pass additional context const isFeatureEnabled = childF.isEnabled('myFeature', { browser: 'firefox', }) ``` The same methods as on the primary instance are available on child instances: - `isEnabled` - `getVariation` - `getVariable` - `getVariableBoolean` - `getVariableString` - `getVariableInteger` - `getVariableDouble` - `getVariableArray` - `getVariableObject` - `getVariableJSON` - `getAllEvaluations` - `setContext` - `setSticky` - `on` ### Get all evaluations {% label="New" labelType="success" %} You can get evaluation results of all your features currently loaded via the datafile in the SDK instance: ```js {% path="your-app/index.js" %} const f; // Featurevisor SDK instance const allEvaluations = f.getAllEvaluations(context = {}) console.log(allEvaluations) // { // myFeature: { // enabled: true, // variation: "control", // variables: { // myVariableKey: "myVariableValue", // }, // }, // // anotherFeature: { // enabled: true, // variation: "treatment",
// } // } ``` This can be very useful when you want to serialize all evaluations, and hand them off from the backend to the frontend, for example. ### Configuring bucket key {% label="Breaking" labelType="error" %} {% row %} {% column %} ```js {% title="Before" path="your-app/index.js" highlight="4-14" %} import { createInstance } from '@featurevisor/sdk' const f = createInstance({ configureBucketKey: function (options) { const { featureKey, context, // default bucket key bucketKey, } = options return bucketKey }, }) ``` {% /column %} {% column %} ```js {% title="After" path="your-app/index.js" highlight="4-20" %} import { createInstance } from '@featurevisor/sdk' const f = createInstance({ hooks: [ { name: 'my-custom-hook', bucketKey: function (options) { const { featureKey, context, bucketBy, // default bucket key bucketKey, } = options return bucketKey }, }, ], }) ``` {% /column %} {% /row %} ### Configuring bucket value {% label="Breaking" labelType="error" %} {% row %} {% column %} ```js {% title="Before" path="your-app/index.js" highlight="4-14" %} import { createInstance } from '@featurevisor/sdk' const f = createInstance({ configureBucketValue: function (options) { const { featureKey, context, // default bucket value bucketValue, } = options return bucketValue }, }) ``` {% /column %} {% column %} ```js {% title="After" path="your-app/index.js" highlight="4-20" %} import { createInstance } from '@featurevisor/sdk' const f = createInstance({ hooks: [ { name: 'my-custom-hook', bucketValue: function (options) { const { featureKey, context, bucketKey, // default bucket value bucketValue, } = options return bucketValue }, }, ], }) ``` {% /column %} {% /row %} Learn more in [JavaScript SDK](/docs/sdks/javascript/) page. ## React SDK usage All the hooks are now reactive. Meaning your components will automatically re-render when: - a new datafile is set - context is set or updated - sticky features are set or updated Learn more in [React SDK](/docs/react/) page.
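As an illustration of that reactivity, here is a minimal sketch assuming the hook names documented on the React SDK page (such as `useFlag`):

```js {% path="your-app/Banner.jsx" %}
import React from 'react'
import { useFlag } from '@featurevisor/react'

export function Banner() {
  // re-renders automatically when a new datafile, context,
  // or sticky features are set on the SDK instance
  const isBannerEnabled = useFlag('myFeature', { userId: '123' })

  return isBannerEnabled ? <div>Banner</div> : null
}
```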
--- ## Testing features ### sticky {% label="New" labelType="success" %} Test specs of features can now also include sticky features, similar to SDK's API: ```yml {% path="tests/features/myFeature.spec.yml" highlight="9-14" %} feature: myFeature assertions: - description: My feature is enabled environment: production at: 100 context: country: nl sticky: myFeatureKey: enabled: true variation: control variables: myVariableKey: myVariableValue expectedToBeEnabled: true ``` ### expectedEvaluations {% label="New" labelType="success" %} You can go deep with testing feature evaluations, including their evaluation reasons for example: ```yml {% path="tests/features/myFeature.spec.yml" highlight="10-20" %} feature: myFeature assertions: - description: My feature is enabled environment: production at: 100 context: country: nl expectedToBeEnabled: true expectedEvaluations: flag: enabled: true reason: rule # see available rules in Evaluation type from SDK variation: variationValue: control reason: rule variables: myVariableKey: value: myVariableValue reason: rule ``` ### children {% label="New" labelType="success" %} Based on the new [child instance](#child-instance) API in SDK, you can also imitate testing against them via test specs: ```yml {% path="tests/features/myFeature.spec.yml" highlight="10-19" %} feature: myFeature assertions: - description: My feature is enabled environment: production at: 100 context: apiVersion: 5.0.0 children: - context: userId: '123' country: nl expectedToBeEnabled: true - context: userId: '456' country: de expectedToBeEnabled: false ``` Learn more in [Testing](/docs/testing/) page. --- title: Migration guides nextjs: metadata: title: Migration guides description: Learn how to migrate to latest Featurevisor version openGraph: title: Migration guides description: Learn how to migrate to latest Featurevisor version images: - url: /img/og/docs.png --- Guides for migrating to the latest version of Featurevisor. --- ## Guides - [From v1 to v2](/docs/migrations/v2) --- title: Namespaces nextjs: metadata: title: Namespaces description: Organize your features and segments under namespaces in a hierarchical way. openGraph: title: Namespaces description: Organize your features and segments under namespaces in a hierarchical way. images: - url: /img/og/docs-namespaces.png --- Featurevisor allows namespacing features and segments to tackle the challenges of scaling large projects. {% .lead %} Namespaces help teams organize their features and segments in a structured and hierarchical way, making it easier to manage and maintain them as their project grows larger and more complex over time. ## Features Creating a namespace for a [feature](/docs/features/) is as simple as putting the feature under a directory, and the directory name then becomes the namespace. 
If there is a team working on the checkout flow, for example, they can create a namespace called `checkout` and put all their features related to the checkout flow in that namespace: ``` features/ ├── checkout/ │ ├── feature1.yml │ ├── feature2.yml │ └── feature3.yml └── globalFeature.yml ``` ### Evaluating features When evaluating a feature, you can refer to the feature by its namespace and key in the format `namespace/featureKey`: ```js {% path="your-app/index.js" %} const f; // Featurevisor SDK instance f.isEnabled("checkout/feature1"); f.isEnabled("globalFeature"); ``` ### Testing features When [testing features](/docs/testing/#testing-features), you can refer to the feature by its namespace and key in the format `namespace/featureKey`: ```yml {% path="tests/checkout/feature1.spec.yml" %} feature: checkout/feature1 # ... ``` The file name and location for test specs do not matter, as long as they exist inside the `tests` directory. ## Segments Very similar to features, you can namespace [segments](/docs/segments/) by putting them in a directory: ``` segments/ ├── countries/ │ ├── germany.yml │ └── netherlands.yml └── globalSegment.yml ``` ### Referencing segments When defining the [rules inside features](/docs/features/#rules), you can refer to the segment by its namespace and key in the format `namespace/segmentKey`: ```yml {% path="features/myFeature.yml" %} # ... rules: production: - key: '1' segments: 'countries/netherlands' percentage: 100 ``` ### Testing segments When [testing segments](/docs/testing/#testing-segments), you can refer to the segment by its namespace and key in the format `namespace/segmentKey`: ```yml {% path="tests/countries/netherlands.spec.yml" %} segment: countries/netherlands # ... ``` ## Comparison Namespaces are no replacement for [tags](/docs/tags/) or [environments](/docs/environments/), but they can be used in conjunction with them to create a more structured and organized project. - **Tags**: for [tagging features](/docs/tags/) resulting in targeted and smaller [datafiles](/docs/building-datafiles/) that your applications consume via [SDKs](/docs/sdks/) - **Environments**: for creating different [environments](/docs/environments/), like `production` and `staging`, and then using them in your feature [rules](/docs/features/#rules) to control the rollout of features - **Namespaces**: for organizing features and segments in a hierarchical way --- title: Documentation nextjs: metadata: title: Documentation description: Documentation for Featurevisor openGraph: title: Documentation description: Documentation for Featurevisor images: - url: /img/og/docs.png --- Please use the sidebar on the left to navigate through the documentation. --- title: Plugins API nextjs: metadata: title: Plugins API description: Extend Featurevisor CLI with additional tooling using the plugins API. openGraph: title: Plugins API description: Extend Featurevisor CLI with additional tooling using the plugins API. images: - url: /img/og/docs-plugins.png --- While the Featurevisor [CLI](/docs/cli) is packed with various core functionalities, it also has a plugins API allowing you to extend it with further tooling as per your needs. {% .lead %} ## CLI The entire CLI is built on top of the plugins API. This means that all the core functionalities are implemented as plugins internally. You can create your own plugins either locally at the individual project level, or even share them with others in the form of reusable npm packages.
## Installing plugins Additional plugins can be installed from [npm](https://www.npmjs.com/) directly. ```{% title="Command" %} $ cd my-featurevisor-project $ npm install --save featurevisor-plugin-example ``` Plugins can also be created locally without needing any additional npm package or publishing to a central registry. ## Registering plugins You can register plugins via [configuration](/docs/configuration) file found at `featurevisor.config.js`: ```js {% path="featurevisor.config.js" %} module.exports = { environments: ['staging', 'production'], tags: ['web', 'mobile'], // register plugins here plugins: [ require('featurevisor-plugin-example'), // require("./plugins/my-local-plugin"), ], } ``` ## Running a plugin Once registered, you can run the plugin via the CLI: ```{% title="Command" %} $ npx featurevisor example Hello world! ``` ## Creating a plugin A plugin is a simple JavaScript module that exports an object following below structure: ```js {% path="plugins/example.js" %} module.exports = { // this will be made available as "example" command: // // $ npx featurevisor example // command: 'example', // handle the command handler: async function ({ rootDirectoryPath, projectConfig, parsed, datasource, }) { console.log('Hello world!') if (somethingFailed) { return false // this will exit the CLI with an error } }, // self-documenting examples examples: [ { command: 'example', description: 'run the example command', }, { command: 'example --foo=bar', description: 'run the example command with additional options', }, ], } ``` ## Using TypeScript For type-safety, you can make use of the `Plugin` type: ```ts {% path="plugins/example.ts" %} import { Plugin } from '@featurevisor/core' const examplePlugin: Plugin = { command: 'example', handler: async function ({ rootDirectoryPath, projectConfig, parsed, datasource, }) { // handle the command here... }, examples: [ // examples here... ], } export default examplePlugin ``` ## Advice for reusable plugins Above example shows how to create a simple plugin. However, if you are creating a plugin that you wish to share with others, it's recommended to make it configurable when registering them. Instead of exporting the plugin object directly from a module, we can export a function that returns the plugin object: ```js // npm package: featurevisor-plugin-example module.exports = function configureExamplePlugin(options) { // use `options` here as needed // return the plugin object return { command: 'example', handler: async function ({ rootDirectoryPath, projectConfig, parsed, datasource, }) { // ... }, examples: [ // ... ], } } ``` When registering the plugin, the configuration options can be passed based on project specific needs: ```js {% path="featurevisor.config.js" %} module.exports = { environments: ['staging', 'production'], tags: ['web', 'mobile'], plugins: [ require('featurevisor-plugin-example')({ // custom options here... someProperty: 'some value', }), ], } ``` ## Handler options ### rootDirectoryPath This is the root directory path of the Featurevisor project where the CLI was executed from. ### projectConfig This is the fully processed configuration object as found in `featurevisor.config.js` file in the root of your Featurevisor project. For full details of what this object contains, refer to the [configuration](/docs/configuration) documentation. ### parsed This object will contain the parsed command line arguments. 
For example, if the command was: ``` $ npx featurevisor example --foo=bar ``` Then `parsed` object will be: ```js { foo: 'bar' } ``` It uses [yargs](https://www.npmjs.com/package/yargs) internally for parsing the command line arguments. ### datasource Datasource allows reading/writing data from/to the Featurevisor project, so that you don't have to deal with the file system directly. Read further in [datasource](/docs/datasource) documentation. Here's a quick summary of reading and writing various types of entities using the datasource API: #### Revision See [state files](/docs/state-files) for more details. ```js const revision = await datasource.readRevision() await datasource.writeRevision(revision + 1) ``` #### Features See [features](/docs/features) for more details. ```js const features = await datasource.listFeatures() const fooFeatureExists = await datasource.featureExists('foo') const fooFeature = await datasource.readFeature('foo') await datasource.writeFeature('foo', { ...fooFeature, ...newData }) await datasource.deleteFeature('foo') ``` #### Segments See [segments](/docs/segments) for more details. ```js const segments = await datasource.listSegments() const fooSegmentExists = await datasource.segmentExists('foo') const fooSegment = await datasource.readSegment('foo') await datasource.writeSegment('foo', { ...fooSegment, ...newData }) await datasource.deleteSegment('foo') ``` #### Attributes See [attributes](/docs/attributes) for more details. ```js const attributes = await datasource.listAttributes() const fooAttributeExists = await datasource.attributeExists('foo') const fooAttribute = await datasource.readAttribute('foo') await datasource.writeAttribute('foo', { ...fooAttribute, ...newData }) await datasource.deleteAttribute('foo') ``` See more in [datasource](/docs/datasource) documentation. --- title: Projects nextjs: metadata: title: Projects description: Learn how to create and manage Featurevisor projects openGraph: title: Projects description: Learn how to create and manage Featurevisor projects images: - url: /img/og/docs-projects.png --- A Featurevisor project is intended to be used as a single standalone Git repository, separate from your application codebase. {% .lead %} ## Creating a project The easiest way is to use the Featurevisor CLI using `npx` (Node.js). Create a new project directory first: ```{% title="Command" %} $ mkdir my-project $ cd my-project ``` And inside the newly created directory, initialize a Featurevisor project: ```{% title="Command" %} $ npx @featurevisor/cli init ``` ## Installation Afterwards, install the dependencies: ```{% title="Command" %} $ npm install ``` ## Platform agnostic usage While Featurevisor project itself depends on Node.js, your applications do not need to. The idea is that a Featurevisor project will generate [datafiles](/docs/building-datafiles/) (static JSON files), which will later be consumed by applications using [SDKs](/docs/sdks/) in different programming languages which do not need to have any ties to Node.js in any way. ## Directory structure ```{% title="Command" %} $ tree . . 
├── attributes/ │   ├── country.yml │   ├── deviceId.yml │   └── userId.yml ├── datafiles/ (generated later) │   ├── production/ │   │   └── featurevisor-tag-all.json │   └── staging/ │   └── featurevisor-tag-all.json ├── features │   └── showCookieBanner.yml ├── featurevisor.config.js ├── package.json ├── segments │   └── netherlands.yml └── tests ├── features │   └── showCookieBanner.spec.yml └── segments └── netherlands.spec.yml ``` ### Project configuration - `featurevisor.config.js`: contains your project configuration. Learn more in [Configuration](/docs/configuration/) page. ### Building blocks These are the directories where you will be defining all the building blocks for managing your features: - `attributes/`: contains all your [attribute](/docs/attributes/) definitions - `segments/`: contains all your reusable [segments](/docs/segments/), which work as targeting conditions - `features/`: contains all your [feature](/docs/features/) definitions - `tests/`: contains all your [test specs](/docs/testing/) against your features and segments ### Output - `datafiles/`: contains all your generated [datafiles](/docs/building-datafiles/), which are meant to be consumed by [SDKs](/docs/sdks/javascript/) in your applications ## Git repository While it is intended that a Featurevisor project should be hosted in a standalone Git repository, it is not a strict requirement. ```{% title="Command" %} $ git init $ git add . $ git commit -m "Initial commit" ``` You can still use the CLI to manage your project without a Git repository, or as part of your larger application codebase (think a monorepo setup). However, it is highly recommended to use a standalone Git repository to keep track of your changes and collaborate with others. Keeping it separate from your application codebase allows you to [decouple](/docs/use-cases/decouple-releases-from-deployments/) your feature changes from your application code deployments. --- title: Quick start nextjs: metadata: title: Quick start description: Quick start guide for Featurevisor openGraph: title: Quick start description: Quick start guide for Featurevisor images: - url: /img/og/docs-quick-start.png --- ## Prerequisites - [Node.js](https://nodejs.org/en/) >= 20.0.0 ## Initialize your project Run the following command to initialize your project: ```{% title="Command" %} $ mkdir my-featurevisor-project && cd my-featurevisor-project $ npx @featurevisor/cli init ``` This is meant to be a completely separate repository from your application code. Learn more in [Projects](/docs/projects) page. ## Installation Once your project has been initialized, install all the dependencies: ```{% title="Command" %} $ npm install ``` ## Configure your project Featurevisor configuration is stored in `featurevisor.config.js` file, with minimum configuration looking like this: ```js {% path="featurevisor.config.js" %} module.exports = { tags: [ 'all', ], environments: [ 'staging', 'production' ], }; ``` Learn more in [Configuration](/docs/configuration). By default, Featurevisor defines [attributes](/docs/attributes), [segments](/docs/segments), and [features](/docs/features) as YAML files. If you want JSON, TOML, or any other format, see [custom parsers](/docs/advanced/custom-parsers) guide. ## Create an attribute Attributes are the building blocks of creating conditions. We will start by creating an attribute called `userId`: ```yml {% path="attributes/userId.yml" %} type: string description: User ID ``` Learn more in [Attributes](/docs/attributes). 
## Create a segment Segments are reusable conditions that can be applied as rules in your features to target specific users or groups of users. Let's create a new attribute called `country` first: ```yml {% path="attributes/country.yml" %} type: string description: Country ``` Now, let's create a segment called `germany`: ```yml {% path="segments/germany.yml" %} description: Users from Germany conditions: - attribute: country operator: equals value: de ``` Learn more in [Segments](/docs/segments). ## Create a feature We have come to the most interesting part now. We can create a new `showBanner` feature, that controls showing a banner on our website: ```yml {% path="features/showBanner.yml" %} description: Show banner tags: - all # this makes sure the same User ID consistently gets the same experience bucketBy: userId rules: staging: # in staging, we want to show the banner to everyone - key: everyone segments: '*' percentage: 100 production: # in production, we want to test the feature in Germany first, and # it will be enabled for 100% of the traffic - key: de segments: germany percentage: 100 - key: everyone segments: '*' # everyone percentage: 0 # disabled for everyone else ``` Learn more in [Features](/docs/features). ## Linting We can lint the content of all our files to make sure they are all valid: ```{% title="Command" %} $ npx featurevisor lint ``` Learn more in [Linting](/docs/linting). ## Build datafiles Datafiles are static JSON files that we expect our client-side applications to consume using the Featurevisor [SDKs](/docs/sdks/). Now that we have all the definitions in place, we can build the project: ```{% title="Command" %} $ npx featurevisor build ``` This will generate datafiles in the `datafiles` directory for each of your [tags](/docs/tags/) against each [environment](/docs/environments/) as defined in your [`featurevisor.config.js`](/docs/configuration/) file. With our example, we will have the following datafiles generated: ``` datafiles/ ├── staging/ │ └── featurevisor-tag-all.json └── production/ └── featurevisor-tag-all.json ``` Learn more in [Building datafiles](/docs/building-datafiles). ## Deploy datafiles This is the part where you deploy the datafiles to your CDN or any other static file hosting service. Once done, the URLs of the datafiles may look like `https://cdn.yoursite.com/production/featurevisor-tag-all.json`. Learn more in [Deployment](/docs/deployment). ## Consume datafiles using the SDK Now that we have the datafiles deployed, we can consume them using the Featurevisor [SDK](/docs/sdks/). ### Install SDK In your application, install the SDK first: ```{% title="Command" %} $ npm install --save @featurevisor/sdk ``` Featurevisor JavaScript SDK is compatible with both Node.js and browser environments. 
### Initialize SDK

You can initialize the SDK as follows:

```js {% path="your-app/index.js" %}
import { createInstance } from '@featurevisor/sdk'

const datafileUrl = 'https://cdn.yoursite.com/production/featurevisor-tag-all.json'
const datafileContent = await fetch(datafileUrl).then((res) => res.json())

const f = createInstance({
  datafile: datafileContent,
})
```

### Set context

Let the SDK know against what context the values should be evaluated:

```js
const context = {
  userId: '123',
  country: 'de',
}

f.setContext(context)
```

### Evaluate values

Once the SDK is initialized, you can evaluate your features:

```js
// flag status: true or false
const isBannerEnabled = f.isEnabled('showBanner')
```

Featurevisor SDK will take care of evaluating the right value(s) for you synchronously against the provided `userId` and `country` attributes in the context.

### Variables & Variations

The example above makes use of the feature's boolean flag status only, but features may also contain [variables](/docs/features/#variables) and [variations](/docs/features/#variations), which can be evaluated with the SDK instance:

```js
// variation: `control`, `treatment`, or more
const bannerVariation = f.getVariation('showBanner', context)

// variables
const variableKey = 'myVariableKey'
const myVariable = f.getVariable('showBanner', variableKey, context)
```

Find more examples of SDK usage [here](/docs/sdks/javascript/).

---
title: React SDK
nextjs:
  metadata:
    title: React SDK
    description: Learn how to use Featurevisor SDK with React for evaluating feature flags
    openGraph:
      title: React SDK
      description: Learn how to use Featurevisor SDK with React for evaluating feature flags
      images:
        - url: /img/og/docs-react.png
---

Featurevisor comes with an additional package for React.js, for ease of integration in your React.js application for evaluating feature flags. {% .lead %}

## Installation

Install with npm:

```
$ npm install --save @featurevisor/react
```

## Setting up the provider

Use the `FeaturevisorProvider` component to set up the SDK instance in your React application:

```jsx
import React from 'react'
import ReactDOM from 'react-dom'
import { createInstance } from '@featurevisor/sdk'
import { FeaturevisorProvider } from '@featurevisor/react'

const DATAFILE_URL = '...'

const datafileContent = await fetch(DATAFILE_URL).then((response) => response.json())

const f = createInstance({
  datafile: datafileContent,
})

f.setContext({
  userId: '123',
})

ReactDOM.render(
  <FeaturevisorProvider instance={f}>
    <App />
  </FeaturevisorProvider>,
  document.getElementById('root'),
)
```

## Hooks

The package comes with several hooks to use in your components:

### useFlag

Check if a feature is enabled or not:

```jsx
import React from 'react'
import { useFlag } from '@featurevisor/react'

function MyComponent(props) {
  const featureKey = 'myFeatureKey'
  const isEnabled = useFlag(featureKey)

  if (isEnabled) {
    return <div>Feature is enabled</div>
  }

  return <div>Feature is disabled</div>
}
```

### useVariation

Get a feature's evaluated variation:

```jsx
import React from 'react';
import { useVariation } from '@featurevisor/react';

function MyComponent(props) {
  const featureKey = 'myFeatureKey';
  const variation = useVariation(featureKey);

  if (variation === 'b') {
    return <div>B variation</div>;
  }

  if (variation === 'c') {
    return <div>C variation</div>;
  }

  // default
  return <div>Default experience</div>;
}
```

### useVariable

Get a feature's evaluated variable value:

```jsx
import React from 'react';
import { useVariable } from '@featurevisor/react';

function MyComponent(props) {
  const featureKey = 'myFeatureKey';
  const variableKey = 'color';
  const colorValue = useVariable(featureKey, variableKey);

  return <div>Color: {colorValue}</div>;
}
```

### useSdk

In case you need to access the underlying Featurevisor SDK instance:

```jsx
import React from 'react'
import { useSdk } from '@featurevisor/react'

function MyComponent(props) {
  const f = useSdk()

  return <div>...</div>
}
```

## Passing additional context

All the evaluation hooks accept an optional argument for passing additional component-level context:

```js
const context = {
  // ... additional context here in component
}

useFlag(featureKey, context)
useVariation(featureKey, context)
useVariable(featureKey, variableKey, context)
```

## Reactivity

All the evaluation hooks are reactive. This means that your components will automatically re-render when:

- a newer [datafile is set](/docs/sdks/javascript/#setting-datafile)
- [context is set or updated](/docs/sdks/javascript/#context)
- [sticky features are set or updated](/docs/sdks/javascript/#sticky)

The re-rendering logic is smart enough to compare the previously known value with the newly evaluated value, and will only re-render the component if the value has changed.

If you do not want any reactivity, you are better off using the Featurevisor SDK instance directly in your component.

## Optimization

Given the nature of components in React, they can re-render many times. You are advised to minimize the number of calls to the Featurevisor SDK in your components by using memoization techniques.

## Example repository

You can find a fully functional example of a React application using Featurevisor SDK here: [https://github.com/featurevisor/featurevisor-example-react](https://github.com/featurevisor/featurevisor-example-react).
---
title: React Native SDK
nextjs:
  metadata:
    title: React Native SDK
    description: Learn how to use Featurevisor SDK with React Native for evaluating feature flags when building iOS and Android apps
    openGraph:
      title: React Native SDK
      description: Learn how to use Featurevisor SDK with React Native for evaluating feature flags when building iOS and Android apps
      images:
        - url: /img/og/docs-react.png
---

Featurevisor SDK can be used with React Native for evaluating feature flags when building iOS and Android applications. {% .lead %}

## Installation

You can use the same Featurevisor React SDK in your React Native app. Just install it as a regular dependency:

```
$ npm install --save @featurevisor/react
```

See `@featurevisor/react` API docs [here](/docs/react).

## Example usage

```js
// ./MyComponent.js
import { Text } from 'react-native'
import { useFlag } from '@featurevisor/react'

export default function MyComponent() {
  const featureKey = 'my_feature'
  const context = {
    // ...additional context
  }

  const isEnabled = useFlag(featureKey, context)

  return <Text>Feature is {isEnabled ? 'enabled' : 'disabled'}</Text>
}
```

## Polyfills

The only extra polyfill you might need is for the `TextEncoder` API. You can consider using the [`fastestsmallesttextencoderdecoder`](https://www.npmjs.com/package/fastestsmallesttextencoderdecoder) package for that.

## Example repository

You can find a fully functioning example app built with React Native and Featurevisor SDK here: [https://github.com/featurevisor/featurevisor-example-react-native](https://github.com/featurevisor/featurevisor-example-react-native).

---
title: Roadmap
nextjs:
  metadata:
    title: Roadmap
    description: Future plans and roadmap for Featurevisor
    openGraph:
      title: Roadmap
      description: Future plans and roadmap for Featurevisor
      images:
        - url: /img/og/docs.png
showEditPageLink: false
---

The project is continuously evolving, but with a very clear vision and focus on stability and reliability. {% .lead %}

Feature requests on [GitHub](https://github.com/featurevisor/featurevisor/issues) are always welcome, as long as they align with the project's [vision](/blog/v1-release).

## Future plans

- Expand SDK support targeting more backend-oriented programming languages:
  - Python
  - Elixir
  - Rust
  - Kotlin Multiplatform

Submit any new requests on [GitHub](https://github.com/featurevisor/featurevisor/issues).

## Past milestones

For the full changelog, visit [GitHub](https://github.com/featurevisor/featurevisor/blob/main/CHANGELOG.md).
Notable changes:

### 2025

- August 23rd, 2025: [Ruby SDK released](/docs/sdks/ruby)
- August 14th, 2025: [Go SDK released](/docs/sdks/go)
- August 5th, 2025: [Java SDK released](/docs/sdks/java)
- July 29th, 2025: [PHP SDK released](/docs/sdks/php)
- July 14th, 2025: [v2.0 stable released](/blog/v2-release) 🎉
- March 9th, 2025: [Namespaces](/docs/namespaces)
- March 8th, 2025: [Allow deprecating variables](/docs/features/#variables)
- February 26th, 2025: [Make environments optional](/docs/environments)
- January 21st, 2025: [Optional v2 datafile schema](/docs/building-datafiles)

### 2024

- August 7th, 2024: [Plugins API introduced](/docs/plugins)
- July 7th, 2024: [Swift SDK released](/docs/sdks/swift)
- May 9th, 2024: [Test runner made ~100x faster](/docs/testing)
- April 16th, 2024: [Assess traffic distribution via CLI](/docs/cli/#assess-distribution)
- April 7th, 2024: [Evaluate features via CLI](/docs/cli/#evaluate)
- April 2nd, 2024: [Benchmarking evaluations via CLI](/docs/cli/#benchmarking)
- March 10th, 2024: [Revisioning info moved to state files](/docs/state-files)
- February 26th, 2024: [Test runner made 100x faster](/docs/testing/#fast)
- February 11th, 2024: [Linter overhauled using Zod](/docs/linting)
- January 16th, 2024: [Matrix for test specs](/docs/testing/#matrix)

### 2023

- December 13th, 2023: [v1.0 stable released](/blog/v1-release) 🎉
- September 24th, 2023: [Custom parsers API](/docs/advanced/custom-parsers)
- July 20th, 2023: [Dependent features](/docs/use-cases/dependencies)
- July 8th, 2023: [TypeScript code generation](/docs/code-generation)
- July 6th, 2023: [Vue.js SDK released](/docs/vue)
- July 2nd, 2023: [Both Features and Segments made testable](/docs/testing)
- April 30th, 2023: [Mutually exclusive experiments](/docs/groups)
- April 21st, 2023: [React](/docs/react) & [React Native](/docs/react-native) SDKs released
- April 9th, 2023: [Status site generator released](/docs/site)
- March 5th, 2023: [v0.1 alpha released](/blog/introducing-featurevisor) with [JavaScript SDK](/docs/sdks)

---
title: Node.js SDK
nextjs:
  metadata:
    title: Node.js SDK
    description: Learn how to use Featurevisor SDK in Node.js
    openGraph:
      title: Node.js SDK
      description: Learn how to use Featurevisor SDK in Node.js
      images:
        - url: /img/og/docs-sdks-nodejs.png
---

You can use the same Featurevisor [JavaScript SDK](/docs/sdks/javascript) in Node.js as well. {% .lead %}

## Installation

Install with npm:

```
$ npm install --save @featurevisor/sdk
```

## API

Please find the full API docs on the [JavaScript SDK](/docs/sdks/javascript) page.

## Consuming the SDK

### Require

If you are not dealing with ES Modules in Node.js, you can use `require()` as usual:

```js
const { createInstance } = require('@featurevisor/sdk')
```

### Import

If you want to take advantage of ES Modules, you can import the SDK directly:

```js
import { createInstance } from '@featurevisor/sdk'
```

## Example repository

You can refer to this repository that shows a fully working Node.js application using Featurevisor SDK: [https://github.com/featurevisor/featurevisor-example-nodejs](https://github.com/featurevisor/featurevisor-example-nodejs).

---
title: Go SDK
nextjs:
  metadata:
    title: Go SDK
    description: Learn how to use Featurevisor Go SDK
    openGraph:
      title: Go SDK
      description: Learn how to use Featurevisor Go SDK
      images:
        - url: /img/og/docs-sdks-go.png
showEditPageLink: false
---

Featurevisor's Go SDK is designed to work seamlessly with your existing Go applications.
{% .lead %}

## Installation

In your Go application, install the SDK using Go modules:

```bash
go get github.com/featurevisor/featurevisor-go
```

## Initialization

The SDK can be initialized by passing [datafile](https://featurevisor.com/docs/building-datafiles/) content directly:

```go
package main

import (
	"io"
	"net/http"

	"github.com/featurevisor/featurevisor-go"
)

func main() {
	datafileURL := "https://cdn.yoursite.com/datafile.json"

	resp, err := http.Get(datafileURL)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	datafileBytes, err := io.ReadAll(resp.Body)
	if err != nil {
		panic(err)
	}

	var datafileContent featurevisor.DatafileContent
	if err := datafileContent.FromJSON(string(datafileBytes)); err != nil {
		panic(err)
	}

	f := featurevisor.CreateInstance(featurevisor.Options{
		Datafile: datafileContent,
	})

	// use `f` for evaluations from here on
	_ = f
}
```

## Evaluation types

We can evaluate 3 types of values against a particular [feature](https://featurevisor.com/docs/features/):

- [**Flag**](#check-if-enabled) (`bool`): whether the feature is enabled or not
- [**Variation**](#getting-variation) (`string`): the variation of the feature (if any)
- [**Variables**](#getting-variables): variable values of the feature (if any)

These evaluations are run against the provided context.

## Context

Contexts are [attribute](https://featurevisor.com/docs/attributes) values that we pass to the SDK for evaluating [features](https://featurevisor.com/docs/features) against.

Think of the conditions that you define in your [segments](https://featurevisor.com/docs/segments/), which are used in your feature's [rules](https://featurevisor.com/docs/features/#rules).

They are plain maps:

```go
context := featurevisor.Context{
	"userId":  "123",
	"country": "nl",

	// ...other attributes
}
```

Context can be passed to the SDK instance in various different ways, depending on your needs:

### Setting initial context

You can set context at the time of initialization:

```go
import (
	"github.com/featurevisor/featurevisor-go"
)

f := featurevisor.CreateInstance(featurevisor.Options{
	Context: featurevisor.Context{
		"deviceId": "123",
		"country":  "nl",
	},
})
```

This is useful for values that don't change too frequently and are available at the time of application startup.

### Setting after initialization

You can also set more context after the SDK has been initialized:

```go
f.SetContext(featurevisor.Context{
	"userId": "234",
})
```

This will merge the new context with the existing one (if already set).

### Replacing existing context

If you wish to fully replace the existing context, you can pass `true` as the second argument:

```go
f.SetContext(featurevisor.Context{
	"deviceId": "123",
	"userId":   "234",
	"country":  "nl",
	"browser":  "chrome",
}, true) // replace existing context
```

### Manually passing context

You can optionally pass additional context manually for each and every evaluation separately, without needing to set it on the SDK instance and affecting all evaluations:

```go
context := featurevisor.Context{
	"userId":  "123",
	"country": "nl",
}

isEnabled := f.IsEnabled("my_feature", context)
variation := f.GetVariation("my_feature", context)
variableValue := f.GetVariable("my_feature", "my_variable", context)
```

When manually passing context, it will be merged with the existing context set on the SDK instance before evaluating the specific value.

Further details for each evaluation type are described below.
## Check if enabled

Once the SDK is initialized, you can check if a feature is enabled or not:

```go
featureKey := "my_feature"

isEnabled := f.IsEnabled(featureKey)

if isEnabled {
	// do something
}
```

You can also pass additional context per evaluation:

```go
isEnabled := f.IsEnabled(featureKey, featurevisor.Context{
	// ...additional context
})
```

## Getting variation

If your feature has any [variations](https://featurevisor.com/docs/features/#variations) defined, you can evaluate them as follows:

```go
featureKey := "my_feature"

variation := f.GetVariation(featureKey)

if variation != nil && *variation == "treatment" {
	// do something for treatment variation
} else {
	// handle default/control variation
}
```

Additional context per evaluation can also be passed:

```go
variation := f.GetVariation(featureKey, featurevisor.Context{
	// ...additional context
})
```

## Getting variables

Your features may also include [variables](https://featurevisor.com/docs/features/#variables), which can be evaluated as follows:

```go
variableKey := "bgColor"

bgColorValue := f.GetVariable("my_feature", variableKey)
```

Additional context per evaluation can also be passed:

```go
bgColorValue := f.GetVariable("my_feature", variableKey, featurevisor.Context{
	// ...additional context
})
```

### Type specific methods

Next to the generic `GetVariable()` method, there are also type specific methods available for convenience:

```go
f.GetVariableBoolean(featureKey, variableKey, context)
f.GetVariableString(featureKey, variableKey, context)
f.GetVariableInteger(featureKey, variableKey, context)
f.GetVariableDouble(featureKey, variableKey, context)
f.GetVariableArray(featureKey, variableKey, context)
f.GetVariableObject(featureKey, variableKey, context)
f.GetVariableJSON(featureKey, variableKey, context)
```

## Getting all evaluations

You can get evaluations of all features available in the SDK instance:

```go
allEvaluations := f.GetAllEvaluations(featurevisor.Context{})

fmt.Printf("%+v\n", allEvaluations)
// {
//   myFeature: {
//     enabled: true,
//     variation: "control",
//     variables: {
//       myVariableKey: "myVariableValue",
//     },
//   },
//
//   anotherFeature: {
//     enabled: true,
//     variation: "treatment",
//   }
// }
```

This is especially handy when you want to pass all evaluations from a backend application to the frontend.

## Sticky

For the lifecycle of the SDK instance in your application, you can set some features with sticky values, meaning that they will not be evaluated against the fetched [datafile](https://featurevisor.com/docs/building-datafiles/):

### Initialize with sticky

```go
import (
	"github.com/featurevisor/featurevisor-go"
)

f := featurevisor.CreateInstance(featurevisor.Options{
	Sticky: &featurevisor.StickyFeatures{
		"myFeatureKey": featurevisor.StickyFeature{
			Enabled: true,

			// optional
			Variation: &featurevisor.VariationValue{Value: "treatment"},
			Variables: map[string]interface{}{
				"myVariableKey": "myVariableValue",
			},
		},
		"anotherFeatureKey": featurevisor.StickyFeature{
			Enabled: false,
		},
	},
})
```

Once initialized with sticky features, the SDK will look for values there first before evaluating the targeting conditions and going through the bucketing process.
### Set sticky afterwards

You can also set sticky features after the SDK is initialized:

```go
f.SetSticky(featurevisor.StickyFeatures{
	"myFeatureKey": featurevisor.StickyFeature{
		Enabled:   true,
		Variation: &featurevisor.VariationValue{Value: "treatment"},
		Variables: map[string]interface{}{
			"myVariableKey": "myVariableValue",
		},
	},
	"anotherFeatureKey": featurevisor.StickyFeature{
		Enabled: false,
	},
}, true) // replace existing sticky features (false by default)
```

## Setting datafile

You may also initialize the SDK without passing `datafile`, and set it later on:

```go
f.SetDatafile(datafileContent)
```

### Updating datafile

You can set the datafile as many times as you want in your application, which will result in emitting a [`datafile_set`](#datafile_set) event that you can listen to and react to accordingly.

The triggers for setting the datafile again can be:

- periodic updates based on an interval (like every 5 minutes), or
- reacting to:
  - a specific event in your application (like a user action), or
  - an event served via websocket or server-sent events (SSE)

### Interval-based update

Here's an example of using interval-based update:

```go
import (
	"io"
	"net/http"
	"time"

	"github.com/featurevisor/featurevisor-go"
)

func updateDatafile(f *featurevisor.Featurevisor, datafileURL string) {
	ticker := time.NewTicker(5 * time.Minute)
	defer ticker.Stop()

	for range ticker.C {
		resp, err := http.Get(datafileURL)
		if err != nil {
			continue
		}

		datafileBytes, err := io.ReadAll(resp.Body)
		resp.Body.Close() // close explicitly instead of defer, since defers would pile up inside the loop
		if err != nil {
			continue
		}

		var datafileContent featurevisor.DatafileContent
		if err := datafileContent.FromJSON(string(datafileBytes)); err != nil {
			continue
		}

		f.SetDatafile(datafileContent)
	}
}

// Start the update goroutine
go updateDatafile(f, datafileURL)
```

## Logging

By default, Featurevisor SDKs will print out logs to the console for `info` level and above.

### Levels

These are all the available log levels:

- `error`
- `warn`
- `info`
- `debug`

### Customizing levels

If you choose `debug` level to make the logs more verbose, you can set it at the time of SDK initialization. Setting `debug` level will print out all logs, including `info`, `warn`, and `error` levels.

```go
import (
	"github.com/featurevisor/featurevisor-go"
)

logLevel := featurevisor.LogLevelDebug

f := featurevisor.CreateInstance(featurevisor.Options{
	LogLevel: &logLevel,
})
```

You can also set the log level from the SDK instance afterwards:

```go
f.SetLogLevel(featurevisor.LogLevelDebug)
```

### Handler

You can also pass your own log handler, if you do not wish to print the logs to the console:

```go
import (
	"github.com/featurevisor/featurevisor-go"
)

logLevel := featurevisor.LogLevelInfo

logger := featurevisor.NewLogger(featurevisor.CreateLoggerOptions{
	Level: &logLevel,
	Handler: func(level featurevisor.LogLevel, message string, details interface{}) {
		// do something with the log
	},
})

f := featurevisor.CreateInstance(featurevisor.Options{
	Logger: logger,
})
```

Further log levels like `info` and `debug` will help you understand how the feature variations and variables are evaluated in the runtime against the given context.

## Events

Featurevisor SDK implements a simple event emitter that allows you to listen to events that happen in the runtime.
You can listen to these events that can occur at various stages in your application: ### `datafile_set` ```go unsubscribe := f.On(featurevisor.EventNameDatafileSet, func(event featurevisor.Event) { revision := event.Revision // new revision previousRevision := event.PreviousRevision revisionChanged := event.RevisionChanged // true if revision has changed // list of feature keys that have new updates, // and you should re-evaluate them features := event.Features // handle here }) // stop listening to the event unsubscribe() ``` The `features` array will contain keys of features that have either been: - added, or - updated, or - removed compared to the previous datafile content that existed in the SDK instance. ### `context_set` ```go unsubscribe := f.On(featurevisor.EventNameContextSet, func(event featurevisor.Event) { replaced := event.Replaced // true if context was replaced context := event.Context // the new context fmt.Println("Context set") }) ``` ### `sticky_set` ```go unsubscribe := f.On(featurevisor.EventNameStickySet, func(event featurevisor.Event) { replaced := event.Replaced // true if sticky features got replaced features := event.Features // list of all affected feature keys fmt.Println("Sticky features set") }) ``` ## Evaluation details Besides logging with debug level enabled, you can also get more details about how the feature variations and variables are evaluated in the runtime against given context: ```go // flag evaluation := f.EvaluateFlag(featureKey, context) // variation evaluation := f.EvaluateVariation(featureKey, context) // variable evaluation := f.EvaluateVariable(featureKey, variableKey, context) ``` The returned object will always contain the following properties: - `FeatureKey`: the feature key - `Reason`: the reason how the value was evaluated And optionally these properties depending on whether you are evaluating a feature variation or a variable: - `BucketValue`: the bucket value between 0 and 100,000 - `RuleKey`: the rule key - `Error`: the error object - `Enabled`: if feature itself is enabled or not - `Variation`: the variation object - `VariationValue`: the variation value - `VariableKey`: the variable key - `VariableValue`: the variable value - `VariableSchema`: the variable schema ## Hooks Hooks allow you to intercept the evaluation process and customize it further as per your needs. 
### Defining a hook A hook is a simple struct with a unique required `Name` and optional functions: ```go import ( "github.com/featurevisor/featurevisor-go" ) myCustomHook := &featurevisor.Hook{ // only required property Name: "my-custom-hook", // rest of the properties below are all optional per hook // before evaluation Before: func(options featurevisor.EvaluateOptions) featurevisor.EvaluateOptions { // update context before evaluation if options.Context == nil { options.Context = featurevisor.Context{} } options.Context["someAdditionalAttribute"] = "value" return options }, // after evaluation After: func(evaluation featurevisor.Evaluation, options featurevisor.EvaluateOptions) { if evaluation.Reason == "error" { // log error return } }, // configure bucket key BucketKey: func(options featurevisor.EvaluateOptions) string { // return custom bucket key return options.BucketKey }, // configure bucket value (between 0 and 100,000) BucketValue: func(options featurevisor.EvaluateOptions) int { // return custom bucket value return options.BucketValue }, } ``` ### Registering hooks You can register hooks at the time of SDK initialization: ```go import ( "github.com/featurevisor/featurevisor-go" ) f := featurevisor.CreateInstance(featurevisor.Options{ Hooks: []*featurevisor.Hook{ myCustomHook, }, }) ``` Or after initialization: ```go f.AddHook(myCustomHook) ``` ## Child instance When dealing with purely client-side applications, it is understandable that there is only one user involved, like in browser or mobile applications. But when using Featurevisor SDK in server-side applications, where a single server instance can handle multiple user requests simultaneously, it is important to isolate the context for each request. That's where child instances come in handy: ```go childF := f.Spawn(featurevisor.Context{ // user or request specific context "userId": "123", }) ``` Now you can pass the child instance where your individual request is being handled, and you can continue to evaluate features targeting that specific user alone: ```go isEnabled := childF.IsEnabled("my_feature") variation := childF.GetVariation("my_feature") variableValue := childF.GetVariable("my_feature", "my_variable") ``` Similar to parent SDK, child instances also support several additional methods: - `SetContext` - `SetSticky` - `IsEnabled` - `GetVariation` - `GetVariable` - `GetVariableBoolean` - `GetVariableString` - `GetVariableInteger` - `GetVariableDouble` - `GetVariableArray` - `GetVariableObject` - `GetVariableJSON` - `GetAllEvaluations` - `On` - `Close` ## Close Both primary and child instances support a `.Close()` method, that removes forgotten event listeners (via `On` method) and cleans up any potential memory leaks. ```go f.Close() ``` ## CLI usage This package also provides a CLI tool for running your Featurevisor [project](https://featurevisor.com/docs/projects/)'s test specs and benchmarking against this Go SDK: ### Test Learn more about testing [here](https://featurevisor.com/docs/testing/). ```bash go run cmd/main.go test --projectDirectoryPath="/absolute/path/to/your/featurevisor/project" ``` Additional options that are available: ```bash go run cmd/main.go test \ --projectDirectoryPath="/absolute/path/to/your/featurevisor/project" \ --quiet|verbose \ --onlyFailures \ --keyPattern="myFeatureKey" \ --assertionPattern="#1" ``` ### Benchmark Learn more about benchmarking [here](https://featurevisor.com/docs/cmd/#benchmarking). 
```bash
go run cmd/main.go benchmark \
  --projectDirectoryPath="/absolute/path/to/your/featurevisor/project" \
  --environment="production" \
  --feature="myFeatureKey" \
  --context='{"country": "nl"}' \
  --n=1000
```

### Assess distribution

Learn more about assessing distribution [here](https://featurevisor.com/docs/cmd/#assess-distribution).

```bash
go run cmd/main.go assess-distribution \
  --projectDirectoryPath="/absolute/path/to/your/featurevisor/project" \
  --environment=production \
  --feature=foo \
  --variation \
  --context='{"country": "nl"}' \
  --populateUuid=userId \
  --populateUuid=deviceId \
  --n=1000
```

## GitHub repositories

- See SDK repository here: [featurevisor/featurevisor-go](https://github.com/featurevisor/featurevisor-go)
- See example application repository here: [featurevisor/featurevisor-example-go](https://github.com/featurevisor/featurevisor-example-go)

---
title: Java SDK
nextjs:
  metadata:
    title: Java SDK
    description: Learn how to use Featurevisor Java SDK
    openGraph:
      title: Java SDK
      description: Learn how to use Featurevisor Java SDK
      images:
        - url: /img/og/docs-sdks-java.png
showEditPageLink: false
---

Featurevisor's Java SDK is a library for evaluating feature flags, experiment variations, and variables in your Java applications. {% .lead %}

## Installation

In your Java application, update `pom.xml` to add the following:

### Repository

For resolving the package from GitHub Packages (public package):

```xml
<repositories>
  <repository>
    <id>github</id>
    <name>GitHub Packages</name>
    <url>https://maven.pkg.github.com/featurevisor/featurevisor-java</url>
  </repository>
</repositories>
```

### Dependency

Add Featurevisor Java SDK as a dependency with your desired version:

```xml
<dependency>
  <groupId>com.featurevisor</groupId>
  <artifactId>featurevisor-java</artifactId>
  <version>0.0.6</version>
</dependency>
```

Find the latest version here: [https://github.com/featurevisor/featurevisor-java/packages](https://github.com/featurevisor/featurevisor-java/packages)

### Authentication

To authenticate with GitHub Packages, add the following to your `~/.m2/settings.xml` file:

```xml
<servers>
  <server>
    <id>github</id>
    <username>YOUR_GITHUB_USERNAME</username>
    <password>YOUR_GITHUB_TOKEN</password>
  </server>
</servers>
```

You can generate a new GitHub token with the `read:packages` scope here: [https://github.com/settings/tokens](https://github.com/settings/tokens)

See example application here: [https://github.com/featurevisor/featurevisor-example-java](https://github.com/featurevisor/featurevisor-example-java)

## Initialization

The SDK can be initialized by passing [datafile](https://featurevisor.com/docs/building-datafiles/) content directly:

```java
import com.featurevisor.sdk.Featurevisor;

// Load datafile content
String datafileUrl = "https://cdn.yoursite.com/datafile.json";
String datafileContent = "..."; // load your datafile content

// Create SDK instance
Featurevisor f = Featurevisor.createInstance(datafileContent);
```

or by constructing a `Featurevisor.Options` object:

```java
Featurevisor f = Featurevisor.createInstance(new Featurevisor.Options()
    .datafile(datafileContent)
);
```

We will learn about several different options in the next sections.

## Evaluation types

We can evaluate 3 types of values against a particular [feature](https://featurevisor.com/docs/features/):

- [**Flag**](#check-if-enabled) (`boolean`): whether the feature is enabled or not
- [**Variation**](#getting-variation) (`Object`): the variation of the feature (if any)
- [**Variables**](#getting-variables): variable values of the feature (if any)

These evaluations are run against the provided context.

## Context

Contexts are [attribute](https://featurevisor.com/docs/attributes) values that we pass to the SDK for evaluating [features](https://featurevisor.com/docs/features) against.
Think of the conditions that you define in your [segments](https://featurevisor.com/docs/segments/), which are used in your feature's [rules](https://featurevisor.com/docs/features/#rules).

They are plain maps:

```java
Map context = new HashMap<>();
context.put("userId", "123");
context.put("country", "nl");
// ...other attributes
```

Context can be passed to the SDK instance in various different ways, depending on your needs:

### Setting initial context

You can set context at the time of initialization:

```java
Map initialContext = new HashMap<>();
initialContext.put("deviceId", "123");
initialContext.put("country", "nl");

Featurevisor f = Featurevisor.createInstance(new Featurevisor.Options()
    .datafile(datafileContent)
    .context(initialContext));
```

This is useful for values that don't change too frequently and are available at the time of application startup.

### Setting after initialization

You can also set more context after the SDK has been initialized:

```java
Map additionalContext = new HashMap<>();
additionalContext.put("userId", "123");
additionalContext.put("country", "nl");

f.setContext(additionalContext);
```

This will merge the new context with the existing one (if already set).

### Replacing existing context

If you wish to fully replace the existing context, you can pass `true` as the second argument:

```java
Map newContext = new HashMap<>();
newContext.put("deviceId", "123");
newContext.put("userId", "234");
newContext.put("country", "nl");
newContext.put("browser", "chrome");

f.setContext(newContext, true); // replace existing context
```

### Manually passing context

You can optionally pass additional context manually for each and every evaluation separately, without needing to set it on the SDK instance and affecting all evaluations:

```java
Map context = new HashMap<>();
context.put("userId", "123");
context.put("country", "nl");

boolean isEnabled = f.isEnabled("my_feature", context);
String variation = f.getVariation("my_feature", context);
String variableValue = f.getVariableString("my_feature", "my_variable", context);
```

When manually passing context, it will be merged with the existing context set on the SDK instance before evaluating the specific value.

Further details for each evaluation type are described below.
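Before moving on to those, here is a minimal sketch of how the merging behaves; the `promoBanner` feature key and the attribute values are hypothetical:

```java
import java.util.HashMap;
import java.util.Map;

// instance-level context, set once (e.g. at application startup)
Map<String, Object> baseContext = new HashMap<>();
baseContext.put("country", "nl");
f.setContext(baseContext);

// per-evaluation context, merged on top of the instance-level context,
// so this call is effectively evaluated against {country: "nl", userId: "123"}
Map<String, Object> requestContext = new HashMap<>();
requestContext.put("userId", "123");

boolean isEnabled = f.isEnabled("promoBanner", requestContext);
```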
## Check if enabled

Once the SDK is initialized, you can check if a feature is enabled or not:

```java
String featureKey = "my_feature";

boolean isEnabled = f.isEnabled(featureKey);

if (isEnabled) {
    // do something
}
```

You can also pass additional context per evaluation:

```java
Map additionalContext = new HashMap<>();
// ...additional context

boolean isEnabled = f.isEnabled(featureKey, additionalContext);
```

## Getting variation

If your feature has any [variations](https://featurevisor.com/docs/features/#variations) defined, you can evaluate them as follows:

```java
String featureKey = "my_feature";

String variation = f.getVariation(featureKey);

if ("treatment".equals(variation)) {
    // do something for treatment variation
} else {
    // handle default/control variation
}
```

Additional context per evaluation can also be passed:

```java
String variation = f.getVariation(featureKey, additionalContext);
```

## Getting variables

Your features may also include [variables](https://featurevisor.com/docs/features/#variables), which can be evaluated as follows:

```java
String variableKey = "bgColor";

String bgColorValue = f.getVariableString(featureKey, variableKey);
```

Additional context per evaluation can also be passed:

```java
String bgColorValue = f.getVariableString(featureKey, variableKey, additionalContext);
```

### Type specific methods

Next to the generic `getVariable()` method, there are also type specific methods available for convenience:

```java
f.getVariableBoolean(featureKey, variableKey, context);
f.getVariableString(featureKey, variableKey, context);
f.getVariableInteger(featureKey, variableKey, context);
f.getVariableDouble(featureKey, variableKey, context);
f.getVariableArray(featureKey, variableKey, context);
f.getVariableObject(featureKey, variableKey, context);
f.getVariableJSON(featureKey, variableKey, context);
```

## Getting all evaluations

You can get evaluations of all features available in the SDK instance:

```java
import com.featurevisor.types.EvaluatedFeatures;
import com.featurevisor.types.EvaluatedFeature;

EvaluatedFeatures allEvaluations = f.getAllEvaluations(context);

// Access the evaluations map
Map evaluations = allEvaluations.getValue();

System.out.println(evaluations);
// {
//   "myFeature": {
//     "enabled": true,
//     "variation": "control",
//     "variables": {
//       "myVariableKey": "myVariableValue"
//     }
//   },
//
//   "anotherFeature": {
//     "enabled": true,
//     "variation": "treatment"
//   }
// }
```

This is especially handy when you want to pass all evaluations from a backend application to the frontend.
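As a rough sketch of that backend-to-frontend handoff, you could serialize the evaluations map to JSON and return it from an HTTP endpoint. The use of Jackson's `ObjectMapper` below is only an illustrative assumption and is not part of the Featurevisor SDK:

```java
import java.util.Map;

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;

// evaluate everything once on the backend...
Map evaluations = f.getAllEvaluations(context).getValue();

// ...and serialize it for the frontend (e.g. as an HTTP response body),
// assuming the evaluated feature objects serialize cleanly with Jackson
ObjectMapper mapper = new ObjectMapper();

try {
    String payload = mapper.writeValueAsString(evaluations);
    // return `payload` from your endpoint
} catch (JsonProcessingException e) {
    // handle serialization failure
}
```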
## Sticky For the lifecycle of the SDK instance in your application, you can set some features with sticky values, meaning that they will not be evaluated against the fetched [datafile](https://featurevisor.com/docs/building-datafiles/): ### Initialize with sticky ```java Map stickyFeatures = new HashMap<>(); Map myFeatureSticky = new HashMap<>(); myFeatureSticky.put("enabled", true); myFeatureSticky.put("variation", "treatment"); Map myVariables = new HashMap<>(); myVariables.put("myVariableKey", "myVariableValue"); myFeatureSticky.put("variables", myVariables); stickyFeatures.put("myFeatureKey", myFeatureSticky); Map anotherFeatureSticky = new HashMap<>(); anotherFeatureSticky.put("enabled", false); stickyFeatures.put("anotherFeatureKey", anotherFeatureSticky); Featurevisor f = Featurevisor.createInstance(new Featurevisor.Options() .datafile(datafile) .sticky(stickyFeatures)); ``` Once initialized with sticky features, the SDK will look for values there first before evaluating the targeting conditions and going through the bucketing process. ### Set sticky afterwards You can also set sticky features after the SDK is initialized: ```java Map stickyFeatures = new HashMap<>(); // ... build sticky features map f.setSticky(stickyFeatures, true); // replace existing sticky features ``` ## Setting datafile You may also initialize the SDK without passing `datafile`, and set it later on: ```java f.setDatafile(datafileContent); ``` ### Updating datafile You can set the datafile as many times as you want in your application, which will result in emitting a [`datafile_set`](#datafile_set) event that you can listen and react to accordingly. The triggers for setting the datafile again can be: - periodic updates based on an interval (like every 5 minutes), or - reacting to: - a specific event in your application (like a user action), or - an event served via websocket or server-sent events (SSE) ### Interval-based update Here's an example of using interval-based update: ```java // Using ScheduledExecutorService for periodic updates ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(1); scheduler.scheduleAtFixedRate(() -> { // Fetch new datafile content String newDatafileContent = // ... fetch from your CDN DatafileContent newDatafile = DatafileContent.fromJson(newDatafileContent); // Update the SDK f.setDatafile(newDatafile); }, 0, 5, TimeUnit.MINUTES); ``` ## Logging By default, Featurevisor SDKs will print out logs to the console for `info` level and above. ### Levels These are all the available log levels: - `error` - `warn` - `info` - `debug` ### Customizing levels If you choose `debug` level to make the logs more verbose, you can set it at the time of SDK initialization. Setting `debug` level will print out all logs, including `info`, `warn`, and `error` levels. 
```java import com.featurevisor.sdk.Logger; Featurevisor f = Featurevisor.createInstance(new Featurevisor.Options() .datafile(datafile) .logLevel(Logger.LogLevel.DEBUG)); ``` You can also set log level from SDK instance afterwards: ```java f.setLogLevel(Logger.LogLevel.DEBUG); ``` ### Handler You can also pass your own log handler, if you do not wish to print the logs to the console: ```java // Create a custom logger with a custom handler Logger customLogger = Logger.createLogger(new Logger.CreateLoggerOptions() .level(Logger.LogLevel.INFO) .handler((level, message, details) -> { // do something with the log System.out.println("[" + level + "] " + message); })); Featurevisor f = Featurevisor.createInstance(new Featurevisor.Options() .datafile(datafile) .logger(customLogger)); ``` Alternatively, you can create a custom logger directly: ```java Logger customLogger = new Logger(Logger.LogLevel.INFO, (level, message, details) -> { // do something with the log System.out.println("[" + level + "] " + message); }); Featurevisor f = Featurevisor.createInstance(new Featurevisor.Options() .datafile(datafile) .logger(customLogger)); ``` Further log levels like `info` and `debug` will help you understand how the feature variations and variables are evaluated in the runtime against given context. ## Events Featurevisor SDK implements a simple event emitter that allows you to listen to events that happen in the runtime. You can listen to these events that can occur at various stages in your application: ### `datafile_set` ```java Runnable unsubscribe = f.on("datafile_set", (event) -> { String revision = (String) event.get("revision"); // new revision String previousRevision = (String) event.get("previousRevision"); Boolean revisionChanged = (Boolean) event.get("revisionChanged"); // true if revision has changed // list of feature keys that have new updates, // and you should re-evaluate them @SuppressWarnings("unchecked") List features = (List) event.get("features"); // handle here }); // stop listening to the event unsubscribe.run(); ``` The `features` array will contain keys of features that have either been: - added, or - updated, or - removed compared to the previous datafile content that existed in the SDK instance. 
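If you want to act on those updates, one approach is to re-evaluate just the affected features inside the listener. The sketch below assumes your application keeps its own context map (`currentContext` here is hypothetical):

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// whatever context your application maintains (example values only)
Map<String, Object> currentContext = new HashMap<>();
currentContext.put("userId", "123");

Runnable unsubscribe = f.on("datafile_set", (event) -> {
    @SuppressWarnings("unchecked")
    List<String> updatedFeatures = (List<String>) event.get("features");

    // re-evaluate only the features that changed in the new datafile
    for (String featureKey : updatedFeatures) {
        boolean isEnabled = f.isEnabled(featureKey, currentContext);

        // update caches, UI state, etc. based on the fresh evaluation
    }
});
```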
### `context_set` ```java Runnable unsubscribe = f.on("context_set", (event) -> { Boolean replaced = (Boolean) event.get("replaced"); // true if context was replaced @SuppressWarnings("unchecked") Map context = (Map) event.get("context"); // the new context System.out.println("Context set"); }); ``` ### `sticky_set` ```java Runnable unsubscribe = f.on("sticky_set", (event) -> { Boolean replaced = (Boolean) event.get("replaced"); // true if sticky features got replaced @SuppressWarnings("unchecked") List features = (List) event.get("features"); // list of all affected feature keys System.out.println("Sticky features set"); }); ``` ## Evaluation details Besides logging with debug level enabled, you can also get more details about how the feature variations and variables are evaluated in the runtime against given context: ```java // flag Map evaluation = f.evaluateFlag(featureKey, context); // variation Map evaluation = f.evaluateVariation(featureKey, context); // variable Map evaluation = f.evaluateVariable(featureKey, variableKey, context); ``` The returned object will always contain the following properties: - `featureKey`: the feature key - `reason`: the reason how the value was evaluated And optionally these properties depending on whether you are evaluating a feature variation or a variable: - `bucketValue`: the bucket value between 0 and 100,000 - `ruleKey`: the rule key - `error`: the error object - `enabled`: if feature itself is enabled or not - `variation`: the variation object - `variationValue`: the variation value - `variableKey`: the variable key - `variableValue`: the variable value - `variableSchema`: the variable schema ## Hooks Hooks allow you to intercept the evaluation process and customize it further as per your needs. ### Defining a hook A hook is a simple object with a unique required `name` and optional functions: ```java Map myCustomHook = new HashMap<>(); myCustomHook.put("name", "my-custom-hook"); // before evaluation myCustomHook.put("before", (options) -> { String type = (String) options.get("type"); // "feature" | "variation" | "variable" String featureKey = (String) options.get("featureKey"); String variableKey = (String) options.get("variableKey"); // if type is "variable" @SuppressWarnings("unchecked") Map context = (Map) options.get("context"); // update context before evaluation context.put("someAdditionalAttribute", "value"); options.put("context", context); return options; }); // after evaluation myCustomHook.put("after", (evaluation, options) -> { String reason = (String) evaluation.get("reason"); // "error" | "feature_not_found" | "variable_not_found" | ... 
if ("error".equals(reason)) { // log error return; } }); // configure bucket key myCustomHook.put("bucketKey", (options) -> { String featureKey = (String) options.get("featureKey"); @SuppressWarnings("unchecked") Map context = (Map) options.get("context"); String bucketBy = (String) options.get("bucketBy"); String bucketKey = (String) options.get("bucketKey"); // default bucket key // return custom bucket key return bucketKey; }); // configure bucket value (between 0 and 100,000) myCustomHook.put("bucketValue", (options) -> { String featureKey = (String) options.get("featureKey"); @SuppressWarnings("unchecked") Map context = (Map) options.get("context"); String bucketKey = (String) options.get("bucketKey"); Integer bucketValue = (Integer) options.get("bucketValue"); // default bucket value // return custom bucket value return bucketValue; }); ``` ### Registering hooks You can register hooks at the time of SDK initialization: ```java List> hooks = new ArrayList<>(); hooks.add(myCustomHook); Featurevisor f = Featurevisor.createInstance(new Featurevisor.Options() .datafile(datafile) .hooks(hooks)); ``` Or after initialization: ```java Runnable removeHook = f.addHook(myCustomHook); // removeHook.run(); ``` ## Child instance When dealing with purely client-side applications, it is understandable that there is only one user involved, like in browser or mobile applications. But when using Featurevisor SDK in server-side applications, where a single server instance can handle multiple user requests simultaneously, it is important to isolate the context for each request. That's where child instances come in handy: ```java Map childContext = new HashMap<>(); childContext.put("userId", "123"); ChildInstance childF = f.spawn(childContext); ``` Now you can pass the child instance where your individual request is being handled, and you can continue to evaluate features targeting that specific user alone: ```java boolean isEnabled = childF.isEnabled("my_feature"); String variation = childF.getVariation("my_feature"); String variableValue = childF.getVariableString("my_feature", "my_variable"); ``` Similar to parent SDK, child instances also support several additional methods: - `setContext` - `setSticky` - `isEnabled` - `getVariation` - `getVariable` - `getVariableBoolean` - `getVariableString` - `getVariableInteger` - `getVariableDouble` - `getVariableArray` - `getVariableObject` - `getVariableJSON` - `getAllEvaluations` - `on` - `close` ## Close Both primary and child instances support a `.close()` method, that removes forgotten event listeners (via `on` method) and cleans up any potential memory leaks. ```java f.close(); ``` ## CLI usage This package also provides a CLI tool for running your Featurevisor project's test specs and benchmarking against this Java SDK: ### Test Learn more about testing [here](https://featurevisor.com/docs/testing/). ```bash $ mvn exec:java -Dexec.mainClass="com.featurevisor.cli.CLI" -Dexec.args="test --projectDirectoryPath=/absolute/path/to/your/featurevisor/project" ``` Additional options that are available: ```bash $ mvn exec:java -Dexec.mainClass="com.featurevisor.cli.CLI" -Dexec.args="test --projectDirectoryPath=/absolute/path/to/your/featurevisor/project --quiet --onlyFailures --keyPattern=myFeatureKey --assertionPattern=#1" ``` ### Benchmark Learn more about benchmarking [here](https://featurevisor.com/docs/cli/#benchmarking). 
```bash
$ mvn exec:java -Dexec.mainClass="com.featurevisor.cli.CLI" -Dexec.args="benchmark --projectDirectoryPath=/absolute/path/to/your/featurevisor/project --environment=production --feature=myFeatureKey --context='{\"country\": \"nl\"}' --n=1000"
```

### Assess distribution

Learn more about assessing distribution [here](https://featurevisor.com/docs/cli/#assess-distribution).

```bash
$ mvn exec:java -Dexec.mainClass="com.featurevisor.cli.CLI" -Dexec.args="assess-distribution --projectDirectoryPath=/absolute/path/to/your/featurevisor/project --environment=production --feature=foo --variation --context='{\"country\": \"nl\"}' --populateUuid=userId --populateUuid=deviceId --n=1000"
```

---
title: PHP SDK
nextjs:
  metadata:
    title: PHP SDK
    description: Learn how to use Featurevisor PHP SDK
    openGraph:
      title: PHP SDK
      description: Learn how to use Featurevisor PHP SDK
      images:
        - url: /img/og/docs-sdks-php.png
showEditPageLink: false
---

Featurevisor's PHP SDK is designed to work seamlessly with your existing PHP applications, both with and without established frameworks like Laravel, CakePHP, or Symfony. {% .lead %}

## Installation

In your PHP application, install the SDK using [Composer](https://getcomposer.org/):

```
$ composer require featurevisor/featurevisor-php
```

## Initialization

The SDK can be initialized by passing [datafile](https://featurevisor.com/docs/building-datafiles/) content directly:

```php
<?php

use Featurevisor\Featurevisor;

// $datafileContent holds your datafile content (e.g. fetched from your CDN)
$f = Featurevisor::createInstance([
    "datafile" => $datafileContent
]);
```

## Evaluation types

We can evaluate 3 types of values against a particular [feature](https://featurevisor.com/docs/features/):

- [**Flag**](#check-if-enabled) (`boolean`): whether the feature is enabled or not
- [**Variation**](#getting-variation) (`string`): the variation of the feature (if any)
- [**Variables**](#getting-variables): variable values of the feature (if any)

These evaluations are run against the provided context.

## Context

Contexts are [attribute](https://featurevisor.com/docs/attributes) values that we pass to the SDK for evaluating [features](https://featurevisor.com/docs/features) against.

Think of the conditions that you define in your [segments](https://featurevisor.com/docs/segments/), which are used in your feature's [rules](https://featurevisor.com/docs/features/#rules).

They are plain associative arrays:

```php
$context = [
    "userId" => "123",
    "country" => "nl",

    // ...other attributes
];
```

Context can be passed to the SDK instance in various different ways, depending on your needs:

### Setting initial context

You can set context at the time of initialization:

```php
use Featurevisor\Featurevisor;

$f = Featurevisor::createInstance([
    "context" => [
        "deviceId" => "123",
        "country" => "nl",
    ],
]);
```

This is useful for values that don't change too frequently and are available at the time of application startup.

### Setting after initialization

You can also set more context after the SDK has been initialized:

```php
$f->setContext([
    "userId" => "123",
    "country" => "nl",
]);
```

This will merge the new context with the existing one (if already set).
### Replacing existing context

If you wish to fully replace the existing context, you can pass `true` as the second argument:

```php
$f->setContext(
    [
        "deviceId" => "123",
        "userId" => "234",
        "country" => "nl",
        "browser" => "chrome",
    ],
    true // replace existing context
);
```

### Manually passing context

You can optionally pass additional context manually for each and every evaluation separately, without needing to set it on the SDK instance and affecting all evaluations:

```php
$context = [
    "userId" => "123",
    "country" => "nl",
];

$isEnabled = $f->isEnabled('my_feature', $context);
$variation = $f->getVariation('my_feature', $context);
$variableValue = $f->getVariable('my_feature', 'my_variable', $context);
```

When manually passing context, it will be merged with the existing context set on the SDK instance before evaluating the specific value.

Further details for each evaluation type are described below.

## Check if enabled

Once the SDK is initialized, you can check if a feature is enabled or not:

```php
$featureKey = 'my_feature';

$isEnabled = $f->isEnabled($featureKey);

if ($isEnabled) {
    // do something
}
```

You can also pass additional context per evaluation:

```php
$isEnabled = $f->isEnabled($featureKey, [
    // ...additional context
]);
```

## Getting variation

If your feature has any [variations](https://featurevisor.com/docs/features/#variations) defined, you can evaluate them as follows:

```php
$featureKey = 'my_feature';

$variation = $f->getVariation($featureKey);

if ($variation === "treatment") {
    // do something for treatment variation
} else {
    // handle default/control variation
}
```

Additional context per evaluation can also be passed:

```php
$variation = $f->getVariation($featureKey, [
    // ...additional context
]);
```

## Getting variables

Your features may also include [variables](https://featurevisor.com/docs/features/#variables), which can be evaluated as follows:

```php
$variableKey = 'bgColor';

$bgColorValue = $f->getVariable($featureKey, $variableKey);
```

Additional context per evaluation can also be passed:

```php
$bgColorValue = $f->getVariable($featureKey, $variableKey, [
    // ...additional context
]);
```

### Type specific methods

Next to the generic `getVariable()` method, there are also type specific methods available for convenience:

```php
$f->getVariableBoolean($featureKey, $variableKey, $context = []);
$f->getVariableString($featureKey, $variableKey, $context = []);
$f->getVariableInteger($featureKey, $variableKey, $context = []);
$f->getVariableDouble($featureKey, $variableKey, $context = []);
$f->getVariableArray($featureKey, $variableKey, $context = []);
$f->getVariableObject($featureKey, $variableKey, $context = []);
$f->getVariableJSON($featureKey, $variableKey, $context = []);
```

## Getting all evaluations

You can get evaluations of all features available in the SDK instance:

```php
$allEvaluations = $f->getAllEvaluations($context = []);

print_r($allEvaluations);
// [
//   myFeature: [
//     enabled: true,
//     variation: "control",
//     variables: [
//       myVariableKey: "myVariableValue",
//     ],
//   ],
//
//   anotherFeature: [
//     enabled: true,
//     variation: "treatment",
//   ]
// ]
```

This is especially handy when you want to pass all evaluations from a backend application to the frontend.
## Sticky For the lifecycle of the SDK instance in your application, you can set some features with sticky values, meaning that they will not be evaluated against the fetched [datafile](https://featurevisor.com/docs/building-datafiles/): ### Initialize with sticky ```php use Featurevisor\Featurevisor; $f = Featurevisor::createInstance([ "sticky" => [ "myFeatureKey" => [ "enabled" => true, // optional "variation" => 'treatment', "variables" => [ "myVariableKey" => 'myVariableValue', ], ], "anotherFeatureKey" => [ "enabled" => false, ], ], ]); ``` Once initialized with sticky features, the SDK will look for values there first before evaluating the targeting conditions and going through the bucketing process. ### Set sticky afterwards You can also set sticky features after the SDK is initialized: ```php $f->setSticky( [ "myFeatureKey" => [ "enabled" => true, "variation" => 'treatment', "variables" => [ "myVariableKey" => 'myVariableValue', ], ], "anotherFeatureKey" => [ "enabled" => false, ], ], // replace existing sticky features (false by default) true ); ``` ## Setting datafile You may also initialize the SDK without passing `datafile`, and set it later on: ```php $f->setDatafile($datafileContent); ``` ### Updating datafile You can set the datafile as many times as you want in your application, which will result in emitting a [`datafile_set`](#datafile_set) event that you can listen to and react to accordingly. The triggers for setting the datafile again can be: - periodic updates based on an interval (like every 5 minutes), or - reacting to: - a specific event in your application (like a user action), or - an event served via websocket or server-sent events (SSE) ### Interval-based update Here's an example of using interval-based update: @TODO ## Logging By default, Featurevisor SDKs will print out logs to the console for `info` level and above. The Featurevisor PHP SDK by default uses a simple implementation of the [PSR-3 standard](https://www.php-fig.org/psr/psr-3/). You can also choose from many mature PSR-3 implementations, e.g. [Monolog](https://github.com/Seldaek/monolog). ### Customizing levels If you choose `debug` level to make the logs more verbose, you can set it at the time of SDK initialization. Setting `debug` level will print out all logs, including `info`, `warning`, and `error` levels. ```php use Featurevisor\Featurevisor; use Featurevisor\Logger; $f = Featurevisor::createInstance([ "logger" => Logger::create([ "level" => "debug", ]), ]); ``` Alternatively, you can also set `logLevel` directly: ```php $f = Featurevisor::createInstance([ "logLevel" => "debug", ]); ``` You can also set log level from SDK instance afterwards: ```php $f->setLogLevel("debug"); ``` ### Handler You can also pass your own log handler, if you do not wish to print the logs to the console: ```php use Featurevisor\Featurevisor; use Featurevisor\Logger; $f = Featurevisor::createInstance([ "logger" => Logger::create([ "level" => "info", "handler" => function ($level, $message, $details) { // do something with the log }, ]), ]); ``` Further log levels like `info` and `debug` will help you understand how the feature variations and variables are evaluated in the runtime against given context. ## Events Featurevisor SDK implements a simple event emitter that allows you to listen to events that happen in the runtime.
You can listen to these events that can occur at various stages in your application: @TODO: verify these events ### `datafile_set` ```php $unsubscribe = $f->on('datafile_set', function ($event) { $revision = $event['revision']; // new revision $previousRevision = $event['previousRevision']; $revisionChanged = $event['revisionChanged']; // true if revision has changed // list of feature keys that have new updates, // and you should re-evaluate them $features = $event['features']; // handle here }); // stop listening to the event $unsubscribe(); ``` The `features` array will contain keys of features that have either been: - added, or - updated, or - removed compared to the previous datafile content that existed in the SDK instance. ### `context_set` ```php $unsubscribe = $f->on('context_set', function ($event) { $replaced = $event['replaced']; // true if context was replaced $context = $event['context']; // the new context echo "Context set"; }); ``` ### `sticky_set` ```php $unsubscribe = $f->on('sticky_set', function ($event) { $replaced = $event['replaced']; // true if sticky features got replaced $features = $event['features']; // list of all affected feature keys echo "Sticky features set"; }); ``` ## Evaluation details Besides logging with debug level enabled, you can also get more details about how the feature variations and variables are evaluated in the runtime against given context: ```php // flag $evaluation = $f->evaluateFlag($featureKey, $context = []); // variation $evaluation = $f->evaluateVariation($featureKey, $context = []); // variable $evaluation = $f->evaluateVariable($featureKey, $variableKey, $context = []); ``` The returned object will always contain the following properties: - `featureKey`: the feature key - `reason`: the reason how the value was evaluated And optionally these properties depending on whether you are evaluating a feature variation or a variable: - `bucketValue`: the bucket value between 0 and 100,000 - `ruleKey`: the rule key - `error`: the error object - `enabled`: if feature itself is enabled or not - `variation`: the variation object - `variationValue`: the variation value - `variableKey`: the variable key - `variableValue`: the variable value - `variableSchema`: the variable schema ## Hooks Hooks allow you to intercept the evaluation process and customize it further as per your needs. ### Defining a hook A hook is a simple associative array with a unique required `name` and optional functions: ```php $myCustomHook = [ // only required property 'name' => 'my-custom-hook', // rest of the properties below are all optional per hook // before evaluation 'before' => function ($options) { $type = $options['type']; // `feature` | `variation` | `variable` $featureKey = $options['featureKey']; $variableKey = $options['variableKey']; // if type is `variable` $context = $options['context']; // update context before evaluation $options['context'] = array_merge($options['context'], [ 'someAdditionalAttribute' => 'value', ]); return $options; }, // after evaluation 'after' => function ($evaluation, $options) { $reason = $evaluation['reason']; // `error` | `feature_not_found` | `variable_not_found` | ...
if ($reason === "error") { // log error return; } }, // configure bucket key 'bucketKey' => function ($options) { $featureKey = $options['featureKey']; $context = $options['context']; $bucketBy = $options['bucketBy']; $bucketKey = $options['bucketKey']; // default bucket key // return custom bucket key return $bucketKey; }, // configure bucket value (between 0 and 100,000) 'bucketValue' => function ($options) { $featureKey = $options['featureKey']; $context = $options['context']; $bucketKey = $options['bucketKey']; $bucketValue = $options['bucketValue']; // default bucket value // return custom bucket value return $bucketValue; }, ]; ``` ### Registering hooks You can register hooks at the time of SDK initialization: ```php use Featurevisor\Featurevisor; $f = Featurevisor::createInstance([ 'hooks' => [ $myCustomHook ], ]); ``` Or after initialization: ```php $removeHook = $f->addHook($myCustomHook); // $removeHook() ``` ## Child instance When dealing with purely client-side applications, it is understandable that there is only one user involved, like in browser or mobile applications. But when using Featurevisor SDK in server-side applications, where a single server instance can handle multiple user requests simultaneously, it is important to isolate the context for each request. That's where child instances come in handy: ```php $childF = $f->spawn([ // user or request specific context 'userId' => '123', ]); ``` Now you can pass the child instance where your individual request is being handled, and you can continue to evaluate features targeting that specific user alone: ```php $isEnabled = $childF->isEnabled('my_feature'); $variation = $childF->getVariation('my_feature'); $variableValue = $childF->getVariable('my_feature', 'my_variable'); ``` Similar to parent SDK, child instances also support several additional methods: - `setContext` - `setSticky` - `isEnabled` - `getVariation` - `getVariable` - `getVariableBoolean` - `getVariableString` - `getVariableInteger` - `getVariableDouble` - `getVariableArray` - `getVariableObject` - `getVariableJSON` - `getAllEvaluations` - `on` - `close` ## Close Both primary and child instances support a `.close()` method, that removes forgotten event listeners (via `on` method) and cleans up any potential memory leaks. ```php $f->close(); ``` ## CLI usage This package also provides a CLI tool for running your Featurevisor project's test specs and benchmarking against this PHP SDK: ### Test Learn more about testing [here](https://featurevisor.com/docs/testing/). ``` $ vendor/bin/featurevisor test --projectDirectoryPath="/absolute/path/to/your/featurevisor/project" ``` Additional options that are available: ``` $ vendor/bin/featurevisor test \ --projectDirectoryPath="/absolute/path/to/your/featurevisor/project" \ --quiet|verbose \ --onlyFailures \ --keyPattern="myFeatureKey" \ --assertionPattern="#1" ``` ### Benchmark Learn more about benchmarking [here](https://featurevisor.com/docs/cli/#benchmarking). ``` $ vendor/bin/featurevisor benchmark \ --projectDirectoryPath="/absolute/path/to/your/featurevisor/project" \ --environment="production" \ --feature="myFeatureKey" \ --context='{"country": "nl"}' \ --n=1000 ``` ### Assess distribution Learn more about assessing distribution [here](https://featurevisor.com/docs/cli/#assess-distribution). 
``` $ vendor/bin/featurevisor assess-distribution \ --projectDirectoryPath="/absolute/path/to/your/featurevisor/project" \ --environment=production \ --feature=foo \ --variation \ --context='{"country": "nl"}' \ --populateUuid=userId \ --populateUuid=deviceId \ --n=1000 ``` ## GitHub repositories - See SDK repository here: [featurevisor/featurevisor-php](https://github.com/featurevisor/featurevisor-php) - See example application repository here: [featurevisor/featurevisor-php-example](https://github.com/featurevisor/featurevisor-php-example) --- title: Browser SDK nextjs: metadata: title: Browser SDK description: Learn how to use Featurevisor SDK in browser environments openGraph: title: Browser SDK description: Learn how to use Featurevisor SDK in browser environments images: - url: /img/og/docs-sdks-browser.png --- You can use the same Featurevisor [JavaScript SDK](/docs/sdks/javascript) in browser environments as well. {% .lead %} ## Installation Install with npm: ```{% title="Command" %} $ npm install --save @featurevisor/sdk ``` ## API Please find the full API docs in [JavaScript SDK](/docs/sdks/javascript) page. ## Polyfills ### TextEncoder Featurevisor SDK uses `TextEncoder` API for encoding strings. if you need to support very old browsers, you can consider using [`fastestsmallesttextencoderdecoder`](https://www.npmjs.com/package/fastestsmallesttextencoderdecoder). You can install it with npm: ```{% title="Command" %} $ npm install --save fastestsmallesttextencoderdecoder ``` And then import or `require()` it in your code: ```js {% path="your-app/index.js" %} import 'fastestsmallesttextencoderdecoder' ``` --- title: Swift SDK nextjs: metadata: title: Swift SDK description: Learn how to use Featurevisor Swift SDK openGraph: title: Swift SDK description: Learn how to use Featurevisor Swift SDK images: - url: /img/og/docs-sdks-swift.png --- {% callout title="v1 datafiles only supported" type="warning" %} This SDK does not support latest Featurevisor v2 datafiles yet. Learn more about building datafiles supporting older SDKS [here](/docs/building-datafiles/#schema-version). {% /callout %} Featurevisor Swift SDK can be used in Apple devices targeting several operating systems including: iOS, iPadOS, macOS, tvOS, and watchOS. {% .lead %} If you don't find what you are looking for or the provided details are insufficient in this page, please check out [Swift SDK](https://github.com/featurevisor/featurevisor-swift) repository on GitHub. If something is still not clear, please raise an [issue](https://github.com/featurevisor/featurevisor-swift/issues). ## Installation Swift Package Manager executable requires compilation before running it. ``` $ cd path/to/featurevisor-swift-sdk $ swift build -c release $ (cd .build/release && cp -f featurevisor /usr/local/bin/featurevisor-swift) ``` ## Initialization The SDK can be initialized in two different ways depending on your needs. ### Synchronous You can fetch the datafile content on your own and just pass it via options. ```swift import FeaturevisorSDK let datafileContent: DatafileContent = ... var options: InstanceOptions = .default options.datafile = datafileContent let f = try createInstance(options: options) ``` ### Asynchronous If you want to delegate the responsibility of fetching the datafile to the SDK. 
```swift import FeaturevisorSDK var options: InstanceOptions = .default options.datafileUrl = "https://cdn.yoursite.com/production/datafile-tag-all.json" let f = try createInstance(options: options) ``` If you need to take further control over how the datafile is fetched, you can pass a custom `handleDatafileFetch` function: ```swift public typealias DatafileFetchHandler = (_ datafileUrl: String) -> Result<DatafileContent, Error> import FeaturevisorSDK var options: InstanceOptions = .default options.handleDatafileFetch = { datafileUrl in // you need to return Result<DatafileContent, Error> here } let f = try createInstance(options: options) ``` ## Context Contexts are [attribute](/docs/attributes) values that we pass to SDK for evaluating [features](/docs/features). They are objects where keys are the attribute keys, and values are the attribute values. ```swift public enum AttributeValue { case string(String) case integer(Int) case double(Double) case boolean(Bool) case date(Date) } ``` ```swift import FeaturevisorSDK let context = [ "myAttributeKey": .string("myStringAttributeValue"), "anotherAttributeKey": .double(0.999), ] ``` ## Checking if enabled Once the SDK is initialized, you can check if a feature is enabled or not: ```swift import FeaturevisorSDK let featureKey = "my_feature"; let context = [ "userId": .string("123"), "country": .string("nl") ] let isEnabled = f.isEnabled(featureKey: featureKey, context: context) ``` ## Getting variations If your feature has any [variations](/docs/features/#variations) defined, you can evaluate them as follows: ```swift import FeaturevisorSDK let featureKey = "my_feature"; let context = [ "userId": .string("123") ] let variation = f.getVariation(featureKey: featureKey, context: context) ``` ## Getting variables Your features may also include [variables](/docs/features/#variables): ```swift import FeaturevisorSDK let featureKey = "my_feature"; let variableKey = "color" let context = [ "userId": .string("123") ] let variable: VariableValue? = f.getVariable(featureKey: featureKey, variableKey: variableKey, context: context) ``` ## Type specific methods Next to the generic `getVariable()` method, there are also type specific methods: ### `boolean` ```swift let booleanVariable: Bool? = f.getVariableBoolean(featureKey: FeatureKey, variableKey: VariableKey, context: Context) ``` ### `string` ```swift let stringVariable: String? = f.getVariableString(featureKey: FeatureKey, variableKey: VariableKey, context: Context) ``` ### `integer` ```swift let integerVariable: Int? = f.getVariableInteger(featureKey: FeatureKey, variableKey: VariableKey, context: Context) ``` ### `double` ```swift let doubleVariable: Double? = f.getVariableDouble(featureKey: FeatureKey, variableKey: VariableKey, context: Context) ``` ### `array` ```swift let arrayVariable: [String]? = f.getVariableArray(featureKey: FeatureKey, variableKey: VariableKey, context: Context) ``` ### `object` ```swift let objectVariable: MyDecodableObject? = f.getVariableObject(featureKey: FeatureKey, variableKey: VariableKey, context: Context) ``` ### `json` ```swift let jsonVariable: MyJSONDecodableObject? = f.getVariableJSON(featureKey: FeatureKey, variableKey: VariableKey, context: Context) ``` ## Activation Activation is useful when you want to track what features and their variations are exposed to your users. It works the same as the `f.getVariation()` method, but it will also bubble an event up that you can listen to. ```swift import FeaturevisorSDK var options: InstanceOptions = .default options.datafileUrl = "https://cdn.yoursite.com/production/datafile-tag-all.json" options.onActivation = { ...
} let f = try createInstance(options: options) let featureKey = "my_feature"; let context = [ "userId": .string("123"), ] f.activate(featureKey: featureKey, context: context) ``` From the `onActivation` handler, you can send the activation event to your analytics service. ## Initial features You may want to initialize your SDK with a set of features before SDK has successfully fetched the datafile (if using `datafileUrl` option). This helps in cases when you fail to fetch the datafile, but you still wish your SDK instance to continue serving a set of sensible default values. And as soon as the datafile is fetched successfully, the SDK will start serving values from there. ```swift import FeaturevisorSDK var options: InstanceOptions = .default options.datafileUrl = "https://cdn.yoursite.com/production/datafile-tag-all.json" options.initialFeatures = [ "my_feature": .init(enabled: true, variation: "treatment", variables: ["myVariableKey": .string("myVariableValue")]), "another_feature": .init(enabled: true, variation: nil, variables: nil) ] let f = try createInstance(options: options) ``` ## Stickiness Featurevisor relies on consistent bucketing making sure the same user always sees the same variation in a deterministic way. You can learn more about it in [Bucketing](/docs/bucketing) section. But there are times when your targeting conditions (segments) can change and this may lead to some users being re-bucketed into a different variation. This is where stickiness becomes important. If you have already identified your user in your application, and know what features should be exposed to them in what variations, you can initialize the SDK with a set of sticky features: ```swift import FeaturevisorSDK var options: InstanceOptions = .default options.datafileUrl = "https://cdn.yoursite.com/production/datafile-tag-all.json" options.stickyFeatures = [ "my_feature": .init(enabled: true, variation: "treatment", variables: ["myVariableKey": .string("myVariableValue")]), "another_feature": .init(enabled: true, variation: nil, variables: nil) ] let f = try createInstance(options: options) ``` Once initialized with sticky features, the SDK will look for values there first before evaluating the targeting conditions and going through the bucketing process. You can also set sticky features after the SDK is initialized: ```swift f.setStickyFeatures(stickyFeatures: [ "my_feature": .init(enabled: true, variation: "treatment", variables: ["myVariableKey": .string("myVariableValue")]) ]) ``` This will be handy when you want to: - update sticky features in the SDK without re-initializing it (or restarting the app), and - handle evaluation of features for multiple users from the same instance of the SDK (e.g. in a server dealing with incoming requests from multiple users) ## Logging By default, Featurevisor will log logs in console output window for warn and error levels. ### Levels ```swift import FeaturevisorSDK let logger = createLogger(levels: [.error, .warn, .info, .debug]) ``` ### Handler You can also pass your own log handler, if you do not wish to print the logs to the console: ```swift import FeaturevisorSDK let logger = createLogger( levels: [.error, .warn, .info, .debug], handle: { level, message, details in ... 
}) var options = InstanceOptions.default options.logger = logger let f = try createInstance(options: options) ``` ## Intercepting context You can intercept context before they are used for evaluation: ```swift import FeaturevisorSDK let defaultContext = [ "country": "nl" ] var options: InstanceOptions = .default options.interceptContext = { context in context.merging(defaultContext) { (current, _) in current } } let f = try createInstance(options: options) ``` This is useful when you wish to add a default set of attributes as context for all your evaluations, giving you the convenience of not having to pass them in every time. ## Refreshing datafile Refreshing the datafile is convenient when you want to update the datafile in runtime, for example when you want to update the feature variations and variables config without having to restart your application. It is only possible to refresh datafile in Featurevisor if you are using the `datafileUrl` option when creating your SDK instance. ### Manual refresh ```swift import FeaturevisorSDK var options = InstanceOptions.default options.datafileUrl = "https://cdn.yoursite.com/production/datafile-tag-all.json" let f = try createInstance(options: options) f.refresh() ``` ### Refresh by interval If you want to refresh your datafile every X number of seconds, you can pass the `refreshInterval` option when creating your SDK instance: ```swift import FeaturevisorSDK var options: InstanceOptions = .default options.datafileUrl = "https://cdn.yoursite.com/production/datafile-tag-all.json" options.refreshInterval = 30 // 30 seconds let f = try createInstance(options: options) ``` You can stop the interval by calling: ```swift f.stopRefreshing() ``` If you want to resume refreshing: ```swift f.startRefreshing() ``` ### Listening for updates Every successful refresh will trigger the `onRefresh()` option: ```swift import FeaturevisorSDK var options: InstanceOptions = .default options.datafileUrl = "https://cdn.yoursite.com/production/datafile-tag-all.json" options.onRefresh = { ... } let f = try createInstance(options: options) ``` Not every refresh is going to be of a new datafile version. If you want to know if datafile content has changed in any particular refresh, you can listen to `onUpdate` option: ```swift import FeaturevisorSDK var options: InstanceOptions = .default options.datafileUrl = "https://cdn.yoursite.com/production/datafile-tag-all.json" options.onUpdate = { ... } let f = try createInstance(options: options) ``` ## Events Featurevisor SDK implements a simple event emitter that allows you to listen to events that happen in the runtime. ### Listening to events You can listen to these events that can occur at various stages in your application: #### `ready` When the SDK is ready to be used if used in an asynchronous way involving `datafileUrl` option: ```swift sdk.on?(.ready, { _ in // sdk is ready to be used }) ``` The `ready` event is fired maximum once. You can also synchronously check if the SDK is ready: ```swift if (f.isReady()) { // sdk is ready to be used } ``` #### `activation` When a feature is activated: ```swift sdk.on?(.activation, { _ in }) ``` #### `refresh` When the datafile is refreshed: ```swift sdk.on?(.refresh, { _ in // datafile has been refreshed successfully }) ``` This will only occur if you are using `refreshInterval` option. 
#### `update` When the datafile is refreshed, and new datafile content is different from the previous one: ```swift sdk.on?(.update, { _ in // datafile has been refreshed, and // new datafile content is different from the previous one }) ``` This will only occur if you are using `refreshInterval` option. ### Stop listening You can stop listening to specific events by assigning nil to `off` or by calling `removeListener()`: ```swift f.off = nil f.removeListener?(.update, { _ in }) ``` ### Remove all listeners If you wish to remove all listeners of any specific event type: ```swift f.removeAllListeners?(.update) f.removeAllListeners?(.ready) ``` ## Evaluation details Besides logging with debug level enabled, you can also get more details about how the feature variations and variables are evaluated in the runtime against given context: ```swift // flag let evaluation = f.evaluateFlag(featureKey: featureKey, context: context) // variation let evaluation = f.evaluateVariation(featureKey: featureKey, context: context) // variable let evaluation = f.evaluateVariable(featureKey: featureKey, variableKey: variableKey, context: context) ``` The returned object will always contain the following properties: - `featureKey`: the feature key - `reason`: the reason how the value was evaluated And optionally these properties depending on whether you are evaluating a feature variation or a variable: - `bucketValue`: the bucket value between 0 and 100,000 - `ruleKey`: the rule key - `error`: the error object - `enabled`: if feature itself is enabled or not - `variation`: the variation object - `variationValue`: the variation value - `variableKey`: the variable key - `variableValue`: the variable value - `variableSchema`: the variable schema --- title: SDKs nextjs: metadata: title: SDKs description: Learn how to use Featurevisor SDKs openGraph: title: SDKs description: Learn how to use Featurevisor SDKs images: - url: /img/og/docs.png --- SDKs are meant to be used in your own applications, where you want to evaluate features in the application runtime. {% .lead %} Supported languages and runtimes: - [JavaScript](/docs/sdks/javascript) - [Node.js](/docs/sdks/nodejs) - [Browser](/docs/sdks/browser) - [PHP](/docs/sdks/php) - [Java](/docs/sdks/java) - [Roku](/docs/sdks/roku) - [Swift](/docs/sdks/swift) With other languages to follow very soon: - Kotlin --- title: JavaScript SDK nextjs: metadata: title: JavaScript SDK description: Learn how to use Featurevisor JavaScript SDK openGraph: title: JavaScript SDK description: Learn how to use Featurevisor JavaScript SDK images: - url: /img/og/docs-sdks-javascript.png --- Featurevisor's JavaScript SDK is universal, meaning it works in both [Node.js](/docs/sdks/nodejs) and [browser](/docs/sdks/browser) environments.
{% .lead %} ## Installation Install with npm in your application: ```{% title="Command" %} $ npm install --save @featurevisor/sdk ``` ## Initialization The SDK can be initialized by passing [datafile](/docs/building-datafiles/) content directly: ```js {% path="your-app/index.js" highlight="1,8-10" %} import { createInstance } from '@featurevisor/sdk' const datafileUrl = 'https://cdn.yoursite.com/datafile.json' const datafileContent = await fetch(datafileUrl) .then((res) => res.json()) const f = createInstance({ datafile: datafileContent, }) ``` ## Evaluation types We can evaluate 3 types of values against a particular feature: - [**Flag**](#check-if-enabled) (`boolean`): whether the feature is enabled or not - [**Variation**](#getting-variation) (`string`): the variation of the feature (if any) - [**Variables**](#getting-variables): variable values of the feature (if any) These evaluations are run against the provided context. ## Context Contexts are [attribute](/docs/attributes) values that we pass to SDK for evaluating [features](/docs/features) against. Think of the conditions that you define in your [segments](/docs/segments/), which are used in your feature's [rules](/docs/features/#rules). They are plain objects: ```js const context = { userId: '123', country: 'nl', // ...other attributes } ``` Context can be passed to SDK instance in various different ways, depending on your needs: ### Setting initial context You can set context at the time of initialization: ```js {% path="your-app/index.js" highlight="4-7" %} import { createInstance } from '@featurevisor/sdk' const f = createInstance({ context: { deviceId: '123', country: 'nl', }, }) ``` This is useful for values that don't change too frequently and available at the time of application startup. ### Setting after initialization You can also set more context after the SDK has been initialized: ```js f.setContext({ userId: '234', }) ``` This will merge the new context with the existing one (if already set). ### Replacing existing context If you wish to fully replace the existing context, you can pass `true` in second argument: ```js {% highlight="8" %} f.setContext( { deviceId: '123', userId: '234', country: 'nl', browser: 'chrome', }, true, // replace existing context ) ``` ### Manually passing context You can optionally pass additional context manually for each and every evaluation separately, without needing to set it to the SDK instance affecting all evaluations: ```js const context = { userId: '123', country: 'nl', } const isEnabled = f.isEnabled('my_feature', context) const variation = f.getVariation('my_feature', context) const variableValue = f.getVariable('my_feature', 'my_variable', context) ``` When manually passing context, it will merge with existing context set to the SDK instance before evaluating the specific value. Further details for each evaluation types are described below. 
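Before moving on, here is a small sketch of how manually passed context combines with the instance-level context (the attribute values are only examples):

```js
const f = createInstance({
  datafile: datafileContent,
  context: { country: 'nl' },
})

// this evaluation sees both `country` (from the instance context)
// and `userId` (passed manually for this call only)
const isEnabled = f.isEnabled('my_feature', { userId: '123' })
```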
## Check if enabled Once the SDK is initialized, you can check if a feature is enabled or not: ```js const featureKey = 'my_feature' const isEnabled = f.isEnabled(featureKey) if (isEnabled) { // do something } ``` You can also pass additional context per evaluation: ```js const isEnabled = f.isEnabled(featureKey, { // ...additional context }) ``` ## Getting variation If your feature has any [variations](/docs/features/#variations) defined, you can evaluate them as follows: ```js const featureKey = 'my_feature' const variation = f.getVariation(featureKey) if (variation === "treatment") { // do something for treatment variation } else { // handle default/control variation } ``` Additional context per evaluation can also be passed: ```js const variation = f.getVariation(featureKey, { // ...additional context }) ``` ## Getting variables Your features may also include [variables](/docs/features/#variables), which can be evaluated as follows: ```js const variableKey = 'bgColor' const bgColorValue = f.getVariable(featureKey, variableKey) ``` Additional context per evaluation can also be passed: ```js const bgColorValue = f.getVariable(featureKey, variableKey, { // ...additional context }) ``` ### Type specific methods Next to generic `getVariable()` methods, there are also type specific methods available for convenience: ```ts f.getVariableBoolean(featureKey, variableKey, context = {}) f.getVariableString(featureKey, variableKey, context = {}) f.getVariableInteger(featureKey, variableKey, context = {}) f.getVariableDouble(featureKey, variableKey, context = {}) f.getVariableArray(featureKey, variableKey, context = {}) f.getVariableObject(featureKey, variableKey, context = {}) f.getVariableJSON(featureKey, variableKey, context = {}) ``` ## Getting all evaluations You can get evaluations of all features available in the SDK instance: ```js const allEvaluations = f.getAllEvaluations(context = {}) console.log(allEvaluations) // { // myFeature: { // enabled: true, // variation: "control", // variables: { // myVariableKey: "myVariableValue", // }, // }, // // anotherFeature: { // enabled: true, // variation: "treatment", // } // } ``` This is handy especially when you want to pass all evaluations from a backend application to the frontend. ## Sticky For the lifecycle of the SDK instance in your application, you can set some features with sticky values, meaning that they will not be evaluated against the fetched [datafile](/docs/building-datafiles/): ### Initialize with sticky ```js import { createInstance } from '@featurevisor/sdk' const f = createInstance({ sticky: { myFeatureKey: { enabled: true, // optional variation: 'treatment', variables: { myVariableKey: 'myVariableValue', }, }, anotherFeatureKey: { enabled: false, }, }, }) ``` Once initialized with sticky features, the SDK will look for values there first before evaluating the targeting conditions and going through the bucketing process. 
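Continuing the example above, evaluations of `myFeatureKey` would then be served from the sticky values rather than from the datafile:

```js
// these values come from the sticky configuration above,
// regardless of what the datafile's targeting rules say
f.isEnabled('myFeatureKey') // true
f.getVariation('myFeatureKey') // 'treatment'
f.getVariable('myFeatureKey', 'myVariableKey') // 'myVariableValue'
```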
### Set sticky afterwards You can also set sticky features after the SDK is initialized: ```js f.setSticky( { myFeatureKey: { enabled: true, variation: 'treatment', variables: { myVariableKey: 'myVariableValue', }, }, anotherFeatureKey: { enabled: false, }, }, // replace existing sticky features (false by default) true ) ``` ## Setting datafile You may also initialize the SDK without passing `datafile`, and set it later on: ```js f.setDatafile(datafileContent) ``` ### Updating datafile You can set the datafile as many times as you want in your application, which will result in emitting a [`datafile_set`](#datafile-set) event that you can listen to and react to accordingly. The triggers for setting the datafile again can be: - periodic updates based on an interval (like every 5 minutes), or - reacting to: - a specific event in your application (like a user action), or - an event served via websocket or server-sent events (SSE) ### Interval-based update Here's an example of using interval-based update: ```js {% highlight="7" %} const interval = 5 * 60 * 1000 // 5 minutes setInterval(function () { fetch(datafileUrl) .then((res) => res.json()) .then((datafileContent) => { f.setDatafile(datafileContent) }) }, interval) ``` ## Logging By default, Featurevisor SDKs will print out logs to the console for `info` level and above. ### Levels These are all the available log levels: - `error` - `warn` - `info` - `debug` ### Customizing levels If you choose `debug` level to make the logs more verbose, you can set it at the time of SDK initialization. Setting `debug` level will print out all logs, including `info`, `warn`, and `error` levels. ```js import { createInstance, createLogger } from '@featurevisor/sdk' const f = createInstance({ logger: createLogger({ level: 'debug', }), }) ``` Alternatively, you can also set `logLevel` directly: ```js const f = createInstance({ logLevel: 'debug', }) ``` You can also set log level from SDK instance afterwards: ```js f.setLogLevel('debug') ``` ### Handler You can also pass your own log handler, if you do not wish to print the logs to the console: ```js const f = createInstance({ logger: createLogger({ level: 'info', handler: function (level, message, details) { // do something with the log }, }), }) ``` Further log levels like `info` and `debug` will help you understand how the feature variations and variables are evaluated in the runtime against given context. ## Events Featurevisor SDK implements a simple event emitter that allows you to listen to events that happen in the runtime. You can listen to these events that can occur at various stages in your application: ### `datafile_set` ```js const unsubscribe = f.on('datafile_set', function ({ revision, // new revision previousRevision, revisionChanged, // true if revision has changed // list of feature keys that have new updates, // and you should re-evaluate them features, }) { // handle here }) // stop listening to the event unsubscribe() ``` The `features` array will contain keys of features that have either been: - added, or - updated, or - removed compared to the previous datafile content that existed in the SDK instance.
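For instance, you could use that `features` array to re-evaluate only the features that actually changed. What you then do with the fresh values (updating your own state, re-rendering, etc.) is up to your application:

```js
f.on('datafile_set', function ({ features, revisionChanged }) {
  if (!revisionChanged) {
    // nothing changed in this update
    return
  }

  for (const featureKey of features) {
    // re-evaluate and propagate the fresh value to your own state or store
    const isEnabled = f.isEnabled(featureKey)
    console.log(featureKey, 'is now', isEnabled)
  }
})
```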
### `context_set` ```js const unsubscribe = f.on("context_set", ({ replaced, // true if context was replaced context, // the new context }) => { console.log('Context set') }) ``` ### `sticky_set` ```js const unsubscribe = f.on("sticky_set", ({ replaced, // true if sticky features got replaced features, // list of all affected feature keys }) => { console.log('Sticky features set') }) ``` ## Evaluation details Besides logging with debug level enabled, you can also get more details about how the feature variations and variables are evaluated in the runtime against given context: ```js // flag const evaluation = f.evaluateFlag(featureKey, context = {}) // variation const evaluation = f.evaluateVariation(featureKey, context = {}) // variable const evaluation = f.evaluateVariable(featureKey, variableKey, context = {}) ``` The returned object will always contain the following properties: - `featureKey`: the feature key - `reason`: the reason how the value was evaluated And optionally these properties depending on whether you are evaluating a feature variation or a variable: - `bucketValue`: the bucket value between 0 and 100,000 - `ruleKey`: the rule key - `error`: the error object - `enabled`: if feature itself is enabled or not - `variation`: the variation object - `variationValue`: the variation value - `variableKey`: the variable key - `variableValue`: the variable value - `variableSchema`: the variable schema ## Hooks Hooks allow you to intercept the evaluation process and customize it further as per your needs. ### Defining a hook A hook is a simple object with a unique required `name` and optional functions: ```ts import { Hook } from "@featurevisor/sdk" const myCustomHook: Hook = { // only required property name: 'my-custom-hook', // rest of the properties below are all optional per hook // before evaluation before: function (options) { const { type, // `feature` | `variation` | `variable` featureKey, variableKey, // if type is `variable` context } = options; // update context before evaluation options.context = { ...options.context, someAdditionalAttribute: 'value', } return options }, // after evaluation after: function (evaluation, options) { const { reason // `error` | `feature_not_found` | `variable_not_found` | ... } = evaluation if (reason === "error") { // log error return } }, // configure bucket key bucketKey: function (options) { const { featureKey, context, bucketBy, bucketKey, // default bucket key } = options; // return custom bucket key return bucketKey }, // configure bucket value (between 0 and 100,000) bucketValue: function (options) { const { featureKey, context, bucketKey, bucketValue, // default bucket value } = options; // return custom bucket value return bucketValue }, } ``` ### Registering hooks You can register hooks at the time of SDK initialization: ```js import { createInstance } from '@featurevisor/sdk' const f = createInstance({ hooks: [ myCustomHook ], }) ``` Or after initialization: ```js const removeHook = f.addHook(myCustomHook); // removeHook() ``` ## Child instance When dealing with purely client-side applications, it is understandable that there is only one user involved, like in browser or mobile applications. But when using Featurevisor SDK in server-side applications, where a single server instance can handle multiple user requests simultaneously, it is important to isolate the context for each request.
That's where child instances come in handy: ```js const childF = f.spawn({ // user or request specific context userId: '123', }) ``` Now you can pass the child instance where your individual request is being handled, and you can continue to evaluate features targeting that specific user alone: ```js const isEnabled = childF.isEnabled('my_feature') const variation = childF.getVariation('my_feature') const variableValue = childF.getVariable('my_feature', 'my_variable') ``` Similar to parent SDK, child instances also support several additional methods: - `setContext` - `setSticky` - `isEnabled` - `getVariation` - `getVariable` - `getVariableBoolean` - `getVariableString` - `getVariableInteger` - `getVariableDouble` - `getVariableArray` - `getVariableObject` - `getVariableJSON` - `getAllEvaluations` - `on` - `close` ## Close Both primary and child instances support a `.close()` method, that removes forgotten event listeners (via `on` method) and cleans up any potential memory leaks. ```js f.close() ``` --- title: Ruby SDK nextjs: metadata: title: Ruby SDK description: Learn how to use Featurevisor Ruby SDK openGraph: title: Ruby SDK description: Learn how to use Featurevisor Ruby SDK images: - url: /img/og/docs-sdks-ruby.png showEditPageLink: false --- Featurevisor's Ruby SDK is designed to work seamlessly with your existing Ruby applications, both without or with established frameworks like Rails, Sinatra, or Padrino. {% .lead %} ## Installation Add this line to your application's Gemfile: ```ruby gem 'featurevisor' ``` And then execute: ```bash $ bundle install ``` Or install it yourself as: ```bash $ gem install featurevisor ``` ## Initialization The SDK can be initialized by passing [datafile](https://featurevisor.com/docs/building-datafiles/) content directly: ```ruby require 'featurevisor' require 'net/http' require 'json' # Fetch datafile from URL datafile_url = 'https://cdn.yoursite.com/datafile.json' response = Net::HTTP.get_response(URI(datafile_url)) # Parse JSON with symbolized keys (required) datafile_content = JSON.parse(response.body, symbolize_names: true) # Create SDK instance f = Featurevisor.create_instance( datafile: datafile_content ) ``` **Important**: When parsing JSON datafiles, you must use `symbolize_names: true` to ensure proper key handling by the SDK. Alternatively, you can pass a JSON string directly and the SDK will parse it automatically: ```ruby # Option 1: Parse JSON yourself (recommended) datafile_content = JSON.parse(json_string, symbolize_names: true) f = Featurevisor.create_instance(datafile: datafile_content) # Option 2: Pass JSON string directly (automatic parsing) f = Featurevisor.create_instance(datafile: json_string) ``` ## Evaluation types We can evaluate 3 types of values against a particular [feature](https://featurevisor.com/docs/features/): - [**Flag**](#check-if-enabled) (`boolean`): whether the feature is enabled or not - [**Variation**](#getting-variation) (`string`): the variation of the feature (if any) - [**Variables**](#getting-variables): variable values of the feature (if any) These evaluations are run against the provided context. ## Context Contexts are [attribute](https://featurevisor.com/docs/attributes/) values that we pass to SDK for evaluating [features](https://featurevisor.com/docs/features/) against. Think of the conditions that you define in your [segments](https://featurevisor.com/docs/segments/), which are used in your feature's [rules](https://featurevisor.com/docs/features/#rules). 
They are plain hashes: ```ruby context = { userId: '123', country: 'nl', # ...other attributes } ``` Context can be passed to SDK instance in various different ways, depending on your needs: ### Setting initial context You can set context at the time of initialization: ```ruby require 'featurevisor' f = Featurevisor.create_instance( context: { deviceId: '123', country: 'nl' } ) ``` This is useful for values that don't change too frequently and available at the time of application startup. ### Setting after initialization You can also set more context after the SDK has been initialized: ```ruby f.set_context({ userId: '234' }) ``` This will merge the new context with the existing one (if already set). ### Replacing existing context If you wish to fully replace the existing context, you can pass `true` in second argument: ```ruby f.set_context({ deviceId: '123', userId: '234', country: 'nl', browser: 'chrome' }, true) # replace existing context ``` ### Manually passing context You can optionally pass additional context manually for each and every evaluation separately, without needing to set it to the SDK instance affecting all evaluations: ```ruby context = { userId: '123', country: 'nl' } is_enabled = f.is_enabled('my_feature', context) variation = f.get_variation('my_feature', context) variable_value = f.get_variable('my_feature', 'my_variable', context) ``` When manually passing context, it will merge with existing context set to the SDK instance before evaluating the specific value. Further details for each evaluation types are described below. ## Check if enabled Once the SDK is initialized, you can check if a feature is enabled or not: ```ruby feature_key = 'my_feature' is_enabled = f.is_enabled(feature_key) if is_enabled # do something end ``` You can also pass additional context per evaluation: ```ruby is_enabled = f.is_enabled(feature_key, { # ...additional context }) ``` ## Getting variation If your feature has any [variations](https://featurevisor.com/docs/features/#variations) defined, you can evaluate them as follows: ```ruby feature_key = 'my_feature' variation = f.get_variation(feature_key) if variation == 'treatment' # do something for treatment variation else # handle default/control variation end ``` Additional context per evaluation can also be passed: ```ruby variation = f.get_variation(feature_key, { # ...additional context }) ``` ## Getting variables Your features may also include [variables](https://featurevisor.com/docs/features/#variables), which can be evaluated as follows: ```ruby variable_key = 'bgColor' bg_color_value = f.get_variable('my_feature', variable_key) ``` Additional context per evaluation can also be passed: ```ruby bg_color_value = f.get_variable('my_feature', variable_key, { # ...additional context }) ``` ### Type specific methods Next to generic `get_variable()` methods, there are also type specific methods available for convenience: ```ruby f.get_variable_boolean(feature_key, variable_key, context = {}) f.get_variable_string(feature_key, variable_key, context = {}) f.get_variable_integer(feature_key, variable_key, context = {}) f.get_variable_double(feature_key, variable_key, context = {}) f.get_variable_array(feature_key, variable_key, context = {}) f.get_variable_object(feature_key, variable_key, context = {}) f.get_variable_json(feature_key, variable_key, context = {}) ``` ## Getting all evaluations You can get evaluations of all features available in the SDK instance: ```ruby all_evaluations = f.get_all_evaluations({}) puts all_evaluations # { 
# myFeature: { # enabled: true, # variation: "control", # variables: { # myVariableKey: "myVariableValue", # }, # }, # # anotherFeature: { # enabled: true, # variation: "treatment", # } # } ``` This is handy especially when you want to pass all evaluations from a backend application to the frontend. ## Sticky For the lifecycle of the SDK instance in your application, you can set some features with sticky values, meaning that they will not be evaluated against the fetched [datafile](https://featurevisor.com/docs/building-datafiles/): ### Initialize with sticky ```ruby require 'featurevisor' f = Featurevisor.create_instance( sticky: { myFeatureKey: { enabled: true, # optional variation: 'treatment', variables: { myVariableKey: 'myVariableValue' } }, anotherFeatureKey: { enabled: false } } ) ``` Once initialized with sticky features, the SDK will look for values there first before evaluating the targeting conditions and going through the bucketing process. ### Set sticky afterwards You can also set sticky features after the SDK is initialized: ```ruby f.set_sticky({ myFeatureKey: { enabled: true, variation: 'treatment', variables: { myVariableKey: 'myVariableValue' } }, anotherFeatureKey: { enabled: false } }, true) # replace existing sticky features (false by default) ``` ## Setting datafile You may also initialize the SDK without passing `datafile`, and set it later on: ```ruby # Parse with symbolized keys before setting datafile_content = JSON.parse(json_string, symbolize_names: true) f.set_datafile(datafile_content) # Or pass JSON string directly for automatic parsing f.set_datafile(json_string) ``` **Important**: When calling `set_datafile()`, ensure JSON is parsed with `symbolize_names: true` if you're parsing it yourself. ### Updating datafile You can set the datafile as many times as you want in your application, which will result in emitting a [`datafile_set`](#datafile_set) event that you can listen and react to accordingly. The triggers for setting the datafile again can be: - periodic updates based on an interval (like every 5 minutes), or - reacting to: - a specific event in your application (like a user action), or - an event served via websocket or server-sent events (SSE) ### Interval-based update Here's an example of using interval-based update: ```ruby require 'net/http' require 'json' def update_datafile(f, datafile_url) loop do sleep(5 * 60) # 5 minutes begin response = Net::HTTP.get_response(URI(datafile_url)) datafile_content = JSON.parse(response.body) f.set_datafile(datafile_content) rescue => e # handle error puts "Failed to update datafile: #{e.message}" end end end # Start the update thread Thread.new { update_datafile(f, datafile_url) } ``` ## Logging By default, Featurevisor SDKs will print out logs to the console for `info` level and above. ### Levels These are all the available log levels: - `error` - `warn` - `info` - `debug` ### Customizing levels If you choose `debug` level to make the logs more verbose, you can set it at the time of SDK initialization. Setting `debug` level will print out all logs, including `info`, `warn`, and `error` levels. 
```ruby require 'featurevisor' f = Featurevisor.create_instance( logger: Featurevisor.create_logger(level: 'debug') ) ``` Alternatively, you can also set `log_level` directly: ```ruby f = Featurevisor.create_instance( log_level: 'debug' ) ``` You can also set log level from SDK instance afterwards: ```ruby f.set_log_level('debug') ``` ### Handler You can also pass your own log handler, if you do not wish to print the logs to the console: ```ruby require 'featurevisor' f = Featurevisor.create_instance( logger: Featurevisor.create_logger( level: 'info', handler: ->(level, message, details) { # do something with the log } ) ) ``` Further log levels like `info` and `debug` will help you understand how the feature variations and variables are evaluated in the runtime against given context. ## Events Featurevisor SDK implements a simple event emitter that allows you to listen to events that happen in the runtime. You can listen to these events that can occur at various stages in your application: ### `datafile_set` ```ruby unsubscribe = f.on('datafile_set') do |event| revision = event[:revision] # new revision previous_revision = event[:previous_revision] revision_changed = event[:revision_changed] # true if revision has changed # list of feature keys that have new updates, # and you should re-evaluate them features = event[:features] # handle here end # stop listening to the event unsubscribe.call ``` The `features` array will contain keys of features that have either been: - added, or - updated, or - removed compared to the previous datafile content that existed in the SDK instance. ### `context_set` ```ruby unsubscribe = f.on('context_set') do |event| replaced = event[:replaced] # true if context was replaced context = event[:context] # the new context puts 'Context set' end ``` ### `sticky_set` ```ruby unsubscribe = f.on('sticky_set') do |event| replaced = event[:replaced] # true if sticky features got replaced features = event[:features] # list of all affected feature keys puts 'Sticky features set' end ``` ## Evaluation details Besides logging with debug level enabled, you can also get more details about how the feature variations and variables are evaluated in the runtime against given context: ```ruby # flag evaluation = f.evaluate_flag(feature_key, context = {}) # variation evaluation = f.evaluate_variation(feature_key, context = {}) # variable evaluation = f.evaluate_variable(feature_key, variable_key, context = {}) ``` The returned object will always contain the following properties: - `feature_key`: the feature key - `reason`: the reason how the value was evaluated And optionally these properties depending on whether you are evaluating a feature variation or a variable: - `bucket_value`: the bucket value between 0 and 100,000 - `rule_key`: the rule key - `error`: the error object - `enabled`: if feature itself is enabled or not - `variation`: the variation object - `variation_value`: the variation value - `variable_key`: the variable key - `variable_value`: the variable value - `variable_schema`: the variable schema ## Hooks Hooks allow you to intercept the evaluation process and customize it further as per your needs. 
### Defining a hook A hook is a simple hash with a unique required `name` and optional functions: ```ruby require 'featurevisor' my_custom_hook = { # only required property name: 'my-custom-hook', # rest of the properties below are all optional per hook # before evaluation before: ->(options) { # update context before evaluation options[:context] = options[:context].merge({ someAdditionalAttribute: 'value' }) options }, # after evaluation after: ->(evaluation, options) { reason = evaluation[:reason] if reason == 'error' # log error return end }, # configure bucket key bucket_key: ->(options) { # return custom bucket key options[:bucket_key] }, # configure bucket value (between 0 and 100,000) bucket_value: ->(options) { # return custom bucket value options[:bucket_value] } } ``` ### Registering hooks You can register hooks at the time of SDK initialization: ```ruby require 'featurevisor' f = Featurevisor.create_instance( hooks: [my_custom_hook] ) ``` Or after initialization: ```ruby f.add_hook(my_custom_hook) ``` ## Child instance When dealing with purely client-side applications, it is understandable that there is only one user involved, like in browser or mobile applications. But when using Featurevisor SDK in server-side applications, where a single server instance can handle multiple user requests simultaneously, it is important to isolate the context for each request. That's where child instances come in handy: ```ruby child_f = f.spawn({ # user or request specific context userId: '123' }) ``` Now you can pass the child instance where your individual request is being handled, and you can continue to evaluate features targeting that specific user alone: ```ruby is_enabled = child_f.is_enabled('my_feature') variation = child_f.get_variation('my_feature') variable_value = child_f.get_variable('my_feature', 'my_variable') ``` Similar to parent SDK, child instances also support several additional methods: - `set_context` - `set_sticky` - `is_enabled` - `get_variation` - `get_variable` - `get_variable_boolean` - `get_variable_string` - `get_variable_integer` - `get_variable_double` - `get_variable_array` - `get_variable_object` - `get_variable_json` - `get_all_evaluations` - `on` - `close` ## Close Both primary and child instances support a `.close()` method, that removes forgotten event listeners (via `on` method) and cleans up any potential memory leaks. ```ruby f.close() ``` ## CLI usage This package also provides a CLI tool for running your Featurevisor [project](https://featurevisor.com/docs/projects/)'s test specs and benchmarking against this Ruby SDK. - Global installation: you can access it as `featurevisor` - Local installation: you can access it as `bundle exec featurevisor` - From this repository: you can access it as `bin/featurevisor` ### Test Learn more about testing [here](https://featurevisor.com/docs/testing/). ```bash $ bundle exec featurevisor test --projectDirectoryPath="/absolute/path/to/your/featurevisor/project" ``` Additional options that are available: ```bash $ bundle exec featurevisor test \ --projectDirectoryPath="/absolute/path/to/your/featurevisor/project" \ --quiet|verbose \ --onlyFailures \ --keyPattern="myFeatureKey" \ --assertionPattern="#1" ``` ### Benchmark Learn more about benchmarking [here](https://featurevisor.com/docs/cmd/#benchmarking). 
```bash $ bundle exec featurevisor benchmark \ --projectDirectoryPath="/absolute/path/to/your/featurevisor/project" \ --environment="production" \ --feature="myFeatureKey" \ --context='{"country": "nl"}' \ --n=1000 ``` ### Assess distribution Learn more about assessing distribution [here](https://featurevisor.com/docs/cmd/#assess-distribution). ```bash $ bundle exec featurevisor assess-distribution \ --projectDirectoryPath="/absolute/path/to/your/featurevisor/project" \ --environment=production \ --feature=foo \ --variation \ --context='{"country": "nl"}' \ --populateUuid=userId \ --populateUuid=deviceId \ --n=1000 ``` ## GitHub repositories - See SDK repository here: [featurevisor/featurevisor-ruby](https://github.com/featurevisor/featurevisor-ruby) - See example application repository here: [featurevisor/featurevisor-example-ruby](https://github.com/featurevisor/featurevisor-example-ruby) --- title: Segments nextjs: metadata: title: Segments description: Learn how to create segments in Featurevisor openGraph: title: Segments description: Learn how to create segments in Featurevisor images: - url: /img/og/docs-segments.png --- Segments are made up of conditions against various [attributes](/docs/attributes/). They are the groups of users that you can target in your [features](/docs/features/) via rules. {% .lead %} ## Create a segment Let's assume we already have a `country` attribute. Now we wish to create a segment that targets users from The Netherlands. We can do that by creating a segment: ```yml {% path="segments/netherlands.yml" highlight="5" %} description: Users from The Netherlands conditions: - attribute: country operator: equals value: nl ``` The segment will match when the [context](/docs/sdks/javascript/#context) in the SDK contains an attribute `country` with value `nl`. ## Conditions ### Targeting everyone You can target everyone via a segment by using asterisk `*` as the value in `conditions`: ```yml {% path="segments/everyone.yml" highlight="2" %} description: Everyone conditions: '*' ``` ### Attribute The `attribute` is the name of the [attribute](/docs/attributes/) you want to check against in the [context](/docs/sdks/javascript/#context). #### Nested path If you are using an attribute that is of type `object`, you can make use of dot separated paths to access nested properties, like `myAttribute.nestedProperty`. ### Operator There are numerous operators that can be used to compare the attribute value against the given `value`. Find all supported [operators](#operators) in the section below. ### Value The `value` property is the value you want the operator to compare against. The type of the value depends on the attribute being used.
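Putting `attribute`, `operator`, and `value` together, a single segment can combine multiple conditions (all of which need to match), including one that uses a nested path. The attribute names below are only illustrative:

```yml {% path="segments/dutchPremiumUsers.yml" %}
description: Users from The Netherlands on a premium plan
conditions:
  - attribute: country
    operator: equals
    value: nl
  - attribute: subscription.plan
    operator: equals
    value: premium
```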
## Operators These operators are supported as conditions: | Operator | Type of attribute | Description | | --------------------------- | ------------------- | ----------------------------- | | `exists` | | Attribute exists in context | | `notExists` | | Attribute does not exist | | `equals` | any | Equals to | | `notEquals` | any | Not equals to | | `greaterThan` | `integer`, `double` | Greater than | | `greaterThanOrEquals` | `integer`, `double` | Greater than or equal to | | `lessThan` | `integer`, `double` | Less than | | `lessThanOrEquals` | `integer`, `double` | Less than or equal to | | `contains` | `string` | Contains string | | `notContains` | `string` | Does not contain string | | `startsWith` | `string` | Starts with string | | `endsWith` | `string` | Ends with string | | `in` | `string` | In array of strings | | `notIn` | `string` | Not in array of strings | | `before` | `string`, `date` | Date comparison | | `after` | `string`, `date` | Date comparison | | `matches` | `string` | Matches regex pattern | | `notMatches` | `string` | Does not match regex pattern | | `semverEquals` | `string` | Semver equals to | | `semverNotEquals` | `string` | Semver not equals to | | `semverGreaterThan` | `string` | Semver greater than | | `semverGreaterThanOrEquals` | `string` | Semver greater than or equals | | `semverLessThan` | `string` | Semver less than | | `semverLessThanOrEquals` | `string` | Semver less than or equals | | `includes` | `array` | Array includes value | | `notIncludes` | `array` | Array does not include value | Examples of each operator below: ### `equals` ```yml {% highlight="4" %} # ... conditions: - attribute: country operator: equals value: us ``` ### `notEquals` ```yml {% highlight="4" %} # ... conditions: - attribute: country operator: notEquals value: us ``` ### `greaterThan` ```yml {% highlight="4" %} # ... conditions: - attribute: age operator: greaterThan value: 21 ``` ### `greaterThanOrEquals` ```yml {% highlight="4" %} # ... conditions: - attribute: age operator: greaterThanOrEquals value: 18 ``` ### `lessThan` ```yml {% highlight="4" %} # ... conditions: - attribute: age operator: lessThan value: 65 ``` ### `lessThanOrEquals` ```yml {% highlight="4" %} # ... conditions: - attribute: age operator: lessThanOrEquals value: 64 ``` ### `contains` ```yml {% highlight="4" %} # ... conditions: - attribute: name operator: contains value: John ``` ### `notContains` ```yml {% highlight="4" %} # ... conditions: - attribute: name operator: notContains value: Smith ``` ### `startsWith` ```yml {% highlight="4" %} # ... conditions: - attribute: name operator: startsWith value: John ``` ### `endsWith` ```yml {% highlight="4" %} # ... conditions: - attribute: name operator: endsWith value: Smith ``` ### `in` ```yml {% highlight="4" %} # ... conditions: - attribute: country operator: in value: - be - nl - lu ``` ### `notIn` ```yml {% highlight="4" %} # ... conditions: - attribute: country operator: notIn value: - fr - gb - de ``` ### `before` ```yml {% highlight="4" %} # ... conditions: - attribute: date operator: before value: 2023-12-25T00:00:00Z ``` ### `after` ```yml {% highlight="4" %} # ... conditions: - attribute: date operator: after value: 2023-12-25T00:00:00Z ``` ### `semverEquals` ```yml {% highlight="4" %} # ... conditions: - attribute: version operator: semverEquals value: 1.0.0 ``` ### `semverNotEquals` ```yml {% highlight="4" %} # ... conditions: - attribute: version operator: semverNotEquals value: 1.0.0 ``` ### `semverGreaterThan` ```yml {% highlight="4" %} # ... 
conditions: - attribute: version operator: semverGreaterThan value: 1.0.0 ``` ### `semverGreaterThanOrEquals` ```yml {% highlight="4" %} # ... conditions: - attribute: version operator: semverGreaterThanOrEquals value: 1.0.0 ``` ### `semverLessThan` ```yml {% highlight="4" %} # ... conditions: - attribute: version operator: semverLessThan value: 1.0.0 ``` ### `semverLessThanOrEquals` ```yml {% highlight="4" %} # ... conditions: - attribute: version operator: semverLessThanOrEquals value: 1.0.0 ``` ### `exists` ```yml {% highlight="4" %} # ... conditions: - attribute: country operator: exists ``` ### `notExists` ```yml {% highlight="4" %} # ... conditions: - attribute: country operator: notExists ``` ### `includes` ```yml {% highlight="4" %} # ... conditions: - attribute: permissions operator: includes value: write ``` ### `notIncludes` ```yml {% highlight="4" %} # ... conditions: - attribute: permissions operator: notIncludes value: write ``` ### `matches` ```yml {% highlight="4" %} # ... conditions: - attribute: email operator: matches value: ^[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}$ # optional regex flags regexFlags: i # case-insensitive ``` ### `notMatches` ```yml {% highlight="4" %} # ... conditions: - attribute: email operator: notMatches value: ^[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}$ # optional regex flags regexFlags: i # case-insensitive ``` ## Advanced conditions Conditions can also be combined using `and`, `or`, and `not` operators. ### `and` When using `and`, it means all the direct children conditions under it must match against provided [context](/docs/sdks/javascript/#context). ```yml {% highlight="3" %} # ... conditions: and: - attribute: country operator: equals value: us - attribute: device operator: equals value: iPhone ``` By default if `and` is not specified directly under `conditions`, it is implied. ### `or` When using `or`, it means at least one of the direct children conditions under it must match against provided [context](/docs/sdks/javascript/#context). ```yml {% highlight="3" %} # ... conditions: or: - attribute: country operator: equals value: us - attribute: country operator: equals value: ca ``` ### `not` ```yml {% highlight="3" %} # ... conditions: not: - attribute: country operator: equals value: us ``` ### Complex `and` and `or` can be combined to create complex conditions: ```yml {% highlight="3,8" %} # ... conditions: - and: - attribute: device operator: equals value: iPhone - or: - attribute: country operator: equals value: us - attribute: country operator: equals value: ca ``` You can also nest `and`, `or`, and `not` operators: ```yml {% highlight="3,4" %} # ... conditions: - not: - or: - attribute: country operator: equals value: us - attribute: country operator: equals value: ca ``` ## Archiving You can archive a segment by setting `archived: true`: ```yml {% path="segments/netherlands.yml" highlight="1" %} archived: true description: Users from The Netherlands conditions: - attribute: country operator: equals value: nl ``` --- title: Status site generator nextjs: metadata: title: Status site generator description: Learn how to generate status website using Featurevisor openGraph: title: Status site generator description: Learn how to generate status website using Featurevisor images: - url: /img/og/docs.png --- To get a quick overview of all the feature flags, segments, and attributes (and their history of changes) in your Featurevisor project, you can generate a static website for your team or organization. 
{% .lead %}

## Why generate a site?

As your project grows, it becomes harder to keep track of all the feature flags, segments, and attributes, especially if you want to quickly know the current status in any specific environment.

The status site generator helps you keep track of all the changes in your project via a nice and usable static website, which you can regenerate every time there's a change in your project repository.

This also helps communicate the current state of things to your wider organization, especially to those who aren't developers.

## Pre-requisites

It is expected that you already have a Featurevisor project with some feature flags, and you have already initialized your git repository with at least one commit.

The git repo also needs to have an `origin` remote set up in order for the edit links to work in the generated website.

## Generate a status site

Use Featurevisor CLI:

```{% title="Command" %}
$ npx featurevisor site export
```

The generated static site will be available in the `out` directory.

## Serve the site locally

Run:

```{% title="Command" %}
$ npx featurevisor site serve
```

## Screenshots

The screenshots here may differ from the latest version of the site generator.

### Features list

[![Features list](/img/site-screenshot-features.png)](/img/site-screenshot-features.png)

### Feature details

[![Feature details](/img/site-screenshot-feature-view.png)](/img/site-screenshot-feature-view.png)

### History

[![History](/img/site-screenshot-history.png)](/img/site-screenshot-history.png)

## Advanced search

The generated website supports advanced search beyond just searching by the name of your features, segments, or attributes.

Examples:

- `my keyword`: plain search
- `tag:my-tag`: search features by tag
- `in:production`: search features by environment
- `archived:true` or `archived:false`
- `capture:true` or `capture:false`: for filtering attributes
- `with:variations` or `without:variations`: for filtering features with/without variations
- `variation:variation-value`: for filtering features by variation value
- `with:variables` or `without:variables`: for filtering features with/without variables
- `variable:variable-key`: for filtering features by variable key

## Read-only mode

It is important to note that the generated site is a static one, and therefore it is read-only. To make any changes to your features, segments, or attributes, you will have to make those changes in your Git repository.

---
title: State files
nextjs:
  metadata:
    title: State files
    description: Learn about Featurevisor state files
    openGraph:
      title: State files
      description: Learn about Featurevisor state files
      images:
        - url: /img/og/docs.png
---

When you build your datafiles, Featurevisor generates some additional JSON files that are used to keep track of the most recently deployed build of your project. These files are called state files. {% .lead %}

## Location

They are located in the `.featurevisor` directory. You never have to deal with them directly.

## What do they contain?

### Traffic allocation

Traffic allocation information is important to keep track of so that the next build can maintain [consistent bucketing](/docs/bucketing) for your users against individual features.

### Revision

Next to the JSON files for traffic allocation, Featurevisor also creates and maintains a `REVISION` file containing an integer value that is incremented on every successful build. This revision number will be present in your [generated datafiles](/docs/building-datafiles).
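If you want to verify which build your application has actually loaded, you can read the revision straight from the fetched datafile content. Here is a minimal sketch, assuming a hypothetical datafile URL and that the revision is exposed as a top-level `revision` field in the datafile JSON:

```js
// hypothetical datafile URL for illustration
const datafileUrl = 'https://cdn.yoursite.com/production/featurevisor-tag-all.json'

const datafileContent = await fetch(datafileUrl).then((res) => res.json())

// the revision maintained via the REVISION state file ends up in the datafile
console.log('Datafile revision:', datafileContent.revision)
```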
## Committing state files It is recommended that you keep your state files in the Git repository, but not manually commit them yourself when sending Pull Requests. We will learn more about it in [Deployment](/docs/deployment). --- title: Tagging features nextjs: metadata: title: Tagging features description: Tag your features to load them in your application via smaller datafiles openGraph: title: Tagging features description: Tag your features to load them in your application via smaller datafiles images: - url: /img/og/docs-tags.png --- Tagging your features helps build smaller datafiles, so that your applications get to load only the minimum required features in the runtime. {% .lead %} ## Configuration Every Featurevisor project needs to define a set of tags in the [configuration](/docs/configuration) file: ```js {% path="featurevisor.config.js" highlight="2-7" %} module.exports = { tags: [ 'web', 'mobile', // add more tags here... ], environments: [ 'staging', 'production', ], } ``` ## Defining features Above configuration enables you to define your features against one or more tags as follows: ```yml {% path="features/my_feature.yml" highlight="2-3" %} description: My feature tags: - web # ... ``` Learn more about [defining features](/docs/features). ## Building datafiles When [building datafiles](/docs/building-datafiles), Featurevisor will create separate datafiles for each tag: ``` $ tree datafiles . ├── staging │   ├── featurevisor-tag-web.json │   └── featurevisor-tag-mobile.json └── production ├── featurevisor-tag-web.json └── featurevisor-tag-mobile.json ``` ## Consuming datafile Now from your application, you can choose which datafile to load: ```js {% path="your-app/index.js" highlight="3" %} import { createInstance } from '@featurevisor/sdk' const datafileUrl = 'https://cdn.yoursite.com/production/featurevisor-tag-web.json' const datafileContent = await fetch(datafileUrl).then((res) => res.json()) const f = createInstance({ datafile: datafileContent, }) ``` Learn more about [SDKs](/docs/sdks). --- title: Testing nextjs: metadata: title: Testing description: Learn how to test your features and segments in Featurevisor with declarative specs openGraph: title: Testing description: Learn how to test your features and segments in Featurevisor with declarative specs images: - url: /img/og/docs-testing.png --- Features and segments can grow into complex configuration very fast, and it's important that you have the confidence they are working as expected. {% .lead %} We can write test specs in the same expressive way as we defined our features and segments to test them in great detail. 
## Testing features

Assuming we already have a `foo` feature in `features/foo.yml`:

```yml {% path="features/foo.yml" %}
description: Foo feature
tags:
  - all
bucketBy: userId
variablesSchema:
  someKey:
    type: string
    defaultValue: someValue
variations:
  - value: control
    weight: 50
  - value: treatment
    weight: 50
rules:
  production:
    - key: everyone
      segments: '*'
      percentage: 100
```

We can create a new test spec for it in the `tests` directory:

```yml {% path="tests/features/foo.spec.yml" %}
feature: foo # your feature key
assertions:
  # asserting evaluated variation
  # against bucketed value and context
  - description: Testing variation at 40% in NL
    environment: production
    at: 40
    context:
      country: nl
    expectedToBeEnabled: true

    # if testing variations
    expectedVariation: control

  # asserting evaluated variables
  - description: Testing variables at 90% in NL
    environment: production
    at: 90
    context:
      country: nl
    expectedToBeEnabled: true

    # if testing variables
    expectedVariables:
      someKey: someValue
```

The `at` property is the bucketed value (in percentage form ranging from 0 to 100) that assertions will be run against. Read more in [Bucketing](/docs/bucketing).

If your project has no [environments](/docs/environments), you can omit the `environment` property in your assertions.

File names of test specs are not important, but we recommend using the same name as the feature key.

## Testing segments

Similar to features, we can write test specs to test our segments as well.

Assuming we already have a `netherlands` segment:

```yml {% path="segments/netherlands.yml" %}
description: The Netherlands
conditions:
  - attribute: country
    operator: equals
    value: nl
```

We can create a new test spec in the `tests` directory:

```yml {% path="tests/segments/netherlands.spec.yml" %}
segment: netherlands # your segment key
assertions:
  - description: Testing segment in NL
    context:
      country: nl
    expectedToMatch: true

  - description: Testing segment in DE
    context:
      country: de
    expectedToMatch: false
```

## Matrix

To make things more convenient when testing against a lot of different combinations of values, you can optionally make use of the `matrix` property in your assertions.

For example, in a feature test spec:

```yml {% path="tests/features/foo.spec.yml" %}
feature: foo
assertions:
  # define a matrix
  - matrix:
      at: [40, 60]
      environment: [production]
      country: [nl, de, us]
      plan: [free, premium]

    # make use of the matrix values everywhere
    description: At ${{ at }}%, in ${{ country }} against ${{ plan }}
    environment: ${{ environment }}
    at: ${{ at }}
    context:
      country: ${{ country }}
      plan: ${{ plan }}

    # match expectations as usual
    expectedToBeEnabled: true
```

This will then run the assertion against all combinations of the values in the matrix.

{% callout type="note" title="Note about variables" %}
The example above uses variables in the format `${{ variableName }}`, and there are quite a few of them.

Just because a lot of variables are used in the above example, it doesn't mean you have to do the same. You can mix static values for some properties and use variables for others as it fits your requirements.
{% /callout %}

You can do the same for segment test specs as well:

```yml {% path="tests/segments/netherlands.spec.yml" %}
segment: netherlands # your segment key
assertions:
  - matrix:
      country: [nl]
      city: [amsterdam, rotterdam]

    description: Testing in ${{ city }}, ${{ country }}
    context:
      country: ${{ country }}
      city: ${{ city }}
    expectedToMatch: true
```

This helps us cover more scenarios while writing less code in our specs.
## Running tests

Use the Featurevisor CLI to run your tests:

```{% title="Command" %}
$ npx featurevisor test
```

If any of your assertions fail in any test specs, it will terminate with a non-zero exit code.

## CLI options

### `entityType`

If you want to run tests for a specific type of entity, like `feature` or `segment`:

```{% title="Command" %}
$ npx featurevisor test --entityType=feature
$ npx featurevisor test --entityType=segment
```

### `keyPattern`

You can also filter tests by feature or segment keys using regex patterns:

```{% title="Command" %}
$ npx featurevisor test --keyPattern="myKeyHere"
```

### `assertionPattern`

If you are writing assertion descriptions, then you can filter them further using regex patterns:

```{% title="Command" %}
$ npx featurevisor test \
  --keyPattern="myKeyHere" \
  --assertionPattern="text..."
```

### `verbose`

For debugging purposes, you can enable verbose mode to see more details of your assertion evaluations:

```{% title="Command" %}
$ npx featurevisor test --verbose
```

### `quiet`

You can disable all log output coming from the SDK (including errors and warnings):

```{% title="Command" %}
$ npx featurevisor test --quiet
```

### `showDatafile`

For more advanced debugging, you can print the datafile content used by the test runner:

```{% title="Command" %}
$ npx featurevisor test --showDatafile
```

Printing datafile content for each and every tested feature can be very verbose, so we recommend using this option together with `--keyPattern` to filter tests.

### `onlyFailures`

If you are only interested in seeing the test specs that fail:

```{% title="Command" %}
$ npx featurevisor test --onlyFailures
```

## NPM scripts

If you are using npm scripts for testing your Featurevisor project like this:

```js {% path="package.json" %}
{
  "scripts": {
    "test": "featurevisor test"
  }
}
```

You can then pass your options in the CLI after `--`:

```{% title="Command" %}
$ npm test -- --keyPattern="myKeyHere"
```

---
title: Testing features
nextjs:
  metadata:
    title: Testing features
    description: Learn how to test your features in Featurevisor with YAML specs
    openGraph:
      title: Testing features
      description: Learn how to test your features in Featurevisor with YAML specs
      images:
        - url: /img/og/docs.png
---

This page has moved to the [Testing](/docs/testing) page.

---
title: Google Analytics (GA)
nextjs:
  metadata:
    title: Google Analytics (GA)
    description: Learn how to integrate Featurevisor SDK with Google Tag Manager & Google Analytics
    openGraph:
      title: Google Analytics (GA)
      description: Learn how to integrate Featurevisor SDK with Google Tag Manager & Google Analytics
      images:
        - url: /img/og/docs-tracking-google-analytics.png
---

This guide details how to track and analyse your experiment data in Google Analytics by tracking Featurevisor SDK's activation events via Google Tag Manager (GTM). {% .lead %}

## What is Google Analytics (GA)?

Google Analytics (GA) is a web service that offers analytics solutions which allow businesses to report and analyse the behaviour of their traffic, both on web and native mobile applications.

## What is Google Tag Manager (GTM)?

Google Tag Manager (GTM) is a free tag management solution that allows users to add and edit snippets of code (tags) that collect, shape and send data to your Google Analytics instance.

GTM offers a user-friendly UI, empowering non-technical professionals to deploy and update snippets of code autonomously, while allowing them to benefit from version control principles.
## Create a new tag in GTM To create a new tag, you will first need to select one of predefined templates supported in GTM. For our use case, select the GA4 tag template, which sends data directly to your Google Analytics property, where experiment results can be analysed. In the tag, set the Event Name to `featurevisor_activation` and then proceed to add all the parameters and user properties that you want to pass along with the event. Finally, add the trigger that will fire your tag. Select the Custom Event trigger and set it to match the name of the event that is pushed to the dataLayer (see next section for how-to guide). {% callout type="note" title="Naming convention" %} It is considered best practice to set event names in GA4 using the underscored format, while it is preferable to send events to the `dataLayer` using the camel-cased format. {% /callout %} ## Push activation event with metadata The SDK integration snippet below provides a guide on how to push the `featurevisorActivation` event to the dataLayer created by Google Tag Manager. ```js import { createInstance } from '@featurevisor/sdk' const f = createInstance({ hooks: [ { name: 'googleAnalytics', after: function (evaluation) { const { reason, type, featureKey, variationValue } = evaluation; // error found if (reason === 'error') { return } // not a variation evaluation if (type !== "variation") { return } const feature = f.getFeature(featureKey); // feature has no variations if (!feature || !feature.variations) { return; } // track const { userId } = f.getContext(); window.dataLayer.push({ event: 'featurevisorActivation', featureKey: featureKey, variationValue, userId, }) } } ], }) ``` ## Store metadata in custom dimensions Event Parameters and User Properties (GA4) are custom defined dimensions in which the value of a user-defined variable is stored. These assets are generally used to attach contextual information that enriches the events that are sent to your GA4 instance. To be able to filter your data and create cohorts for each of your experiments, make sure to pass the **unique key** of the feature and its variation as a Custom Dimension in your GA4 tags. To achieve this, make sure to first register a new Custom Dimension on your interface and name it `featureKey`. You can decide whether to make that Custom Dimension an Event Parameter or a User Property based on the principles of your custom analytics implementation. Once done, add the parameter to all of your GA4 tags in Google Tag Manager to attach the information to the event. ## Analyse results Now that the activation event is stored into your GA4 instance and is enriched by the feature information, you are ready to run analysis to establish the impact of the different variations of a feature. Filtering of events and users can be achieved both directly on the GA interface as well as directly using the raw data exported to Google BigQuery, which is the recommended method for those who want to achieve a deeper level of analysis and reporting flexibility. --- title: Progressive Delivery nextjs: metadata: title: Progressive Delivery description: Learn how to deliver features progressively to your users using Featurevisor. openGraph: title: Progressive Delivery description: Learn how to deliver features progressively to your users using Featurevisor. 
images: - url: /img/og/docs-use-cases-progressive-delivery.png --- Progressive Delivery is an advanced software release strategy that extends upon Continuous Delivery to involve techniques like feature flags, A/B testing, canary releases, and dark launches. {% .lead %} It allows you to roll out new features gradually to a subset of your users rather than a big bang release to everyone. This enables safer deployments, better user experience, and a more controlled approach to releasing software. ## Benefits - **Risk mitigation**: By rolling out features to a small group of users initially, you can detect issues early without impacting your entire user base. - **User experience**: You can gather user feedback during the early stages of the release, making necessary adjustments before the full-scale rollout. - **Performance monitoring**: Gives you room for monitoring of how new updates impact system performance and user behavior before you decide for wider rollout. - **Rapid iteration**: Faster feedback loops allow for quick iterations, letting you adapt to user needs swiftly. - **Compliance & security**: Easier to manage features in compliance with regulatory requirements (think of countries with strict data privacy laws like GDPR). {% callout type="note" title="Featurevisor's building blocks" %} It's important to learn the building blocks of Featurevisor to better understand rest of this guide: - [Attributes](/docs/attributes): building block for conditions - [Segments](/docs/segments): reusable conditions for targeting users - [Features](/docs/features): feature flags and experiments with rollout rules {% /callout %} ## Defining gradual rollouts Featurevisor allows you to define your features' rollout [rules](/docs/features/#rules) declaratively including their targeting conditions, rollout percentage, and more. A very simple feature flag can be defined as follows: ```yml {% path="features/myFeature.yml" %} description: My feature's description here tags: - all bucketBy: userId rules: production: - key: everyone segments: '*' # everyone percentage: 10 ``` In above example, we are rolling out the feature to only 10% of all our users in production environment. We could have narrowed it down even further by using [segments](/docs/segments) to target only a specific group of users. ```yml {% path="features/myFeature.yml" %} description: My feature's description here tags: - all bucketBy: userId rules: production: - key: everyone segments: germany percentage: 10 ``` Here we are defining that the feature should be exposed to 10% of users in Germany only, and not targeting 10% of all users worldwide like we did in previous example. ## Defining experiments We could also achieve something similar using [A/B tests](/docs/use-cases/experiments), where we can define multiple variations of a feature and assign a percentage to each variation. A/B testing would be more appropriate if we wish to measure two or more different variations of our feature, rather than a simple on/off toggle. 
```yml {% path="features/myOtherFeature.yml" %} description: My other feature's description here tags: - all bucketBy: userId variations: # default behaviour - value: control weight: 50 # new behaviours to test - value: treatment weight: 25 - value: anotherTreatment weight: 25 everyone: production: - key: everyone segments: '*' # everyone percentage: 100 ``` In above example, we are rolling out the feature to 100% of all our users in production environment, but 50% of them will see the default behaviour (`control`), and the other 50% will see two new different behaviour (`treatment` and `anotherTreatment`). The weight distribution of the variations can be tweaked to your liking, and you can add as many variations as you want. ## Conclusion Featurevisor provides the mechanism you need to implement a Progressive Delivery strategy effectively, both with simple feature flags and also with A/B tests. Its blend of [feature management](/docs/feature-management), instant updates, and [GitOps workflow](/docs/concepts/gitops) makes it an approachable tool for engineering teams looking to improve their release processes, reduce risk, and accelerate time-to-market. --- title: Testing in production nextjs: metadata: title: Testing in production description: Learn how to coordinate testing in production with Featurevisor openGraph: title: Testing in production description: Learn how to coordinate testing in production with Featurevisor images: - url: /img/og/docs-use-cases-testing-in-production.png --- As your application grows more complex with critical features, you want to have more confidence that everything works as expected in production environment where your real users are, before any new features are exposed to them. Featurevisor can help coordinating testing in production here. {% .lead %} ## Why test in production? Testing in production is important because it is the only way to know for sure that your application behaves as expected in the real world, where your real users are. As much as we have a staging environment that mimics the production environment, it is still not the real thing. It is also the only way to know for sure that your application can handle the real traffic. For this guide here though, we are focusing primarily on the application behavior. ## Who performs the testing? It depends how your team and/or organization is structured. - **Manual**: For small teams, it can be the same team that develops the features. For larger organizations, there can be a dedicated QA (Quality Assurance) team that takes care of manually testing the flows in production. - **Automated**: It can also be automated, where a suite of integration and regression tests are run against the production environment. You may refer to them as end-to-end (e2e) tests often. ## Your application Imagine you own an e-commerce application, where you offer your users the ability to buy products, and leave reviews on products. One of the teams in your organization is working on a new feature that allows users to add products to their wishlists, and it is controlled by a feature flag called `wishlist`. ## Attributes Before continuing further with feature flags, let's have our Featurevisor attributes defined for our application. ### `userId` This attribute will be used to identify the logged in user in the runtime. 
```yml {% path="attributes/userId.yml" %} description: User ID type: string ``` ### `deviceId` This attribute will be used to identify the device in the runtime, and is going to be the only ID that we can use for targeting anonymous users (those who haven't logged in yet). ```yml {% path="attributes/deviceId.yml" %} description: Device ID type: string ``` ## Feature We wish to control the `wishlist` feature with a feature flag, so that we can enable it for a subset of our users in the runtime. ```yml {% path="features/wishlist.yml" %} description: Wishlist feature for products tags: - all # because this is only exposed to logged in users bucketBy: userId rules: production: - key: everyone segments: '*' # everyone percentage: 0 # disabled for everyone now ``` We have only one rule so far, which has the feature flag disabled for everyone for now. We will begin increasing the `percentage` value after we have done some testing in production. ## Letting QA team access the feature Even though the feature itself is disabled in production for everyone, we still wish our QA team to be able to access it so they can perform their testing and let the feature owning team know about the results before they proceed to roll it out for everyone later. Given the `wishlist` feature is bucketed against `userId` attribute, we need to know the User IDs of all QA team members first. Once that's in hand, those User IDs can be embedded directly targeting an environment of the feature flag. ```yml {% path="features/wishlist.yml" %} # ... # we use the force key to force enable the feature # only if certain conditions match force: production: - conditions: - attribute: userId operator: in values: # the User IDs of QA team members - 'user-id-1' - 'user-id-2' - 'user-id-3' - 'user-id-4' - 'user-id-5' enabled: true ``` After you have [built](/docs/building-datafiles) and [deployed](/docs/deployment) your datafiles, the QA team members can access production version of your application with the `wishlist` feature enabled for them, while your regular users still see the feature disabled. {% callout type="note" title="User IDs for automated tests" %} If you are running automated tests in production without involving any QA team, then you can pass the User IDs of the test accounts that you want to run your tests against. {% /callout %} ## Making things more maintainable with segments The above approach works, but it can be a bit cumbersome to maintain as the number of QA team members (or your predefined test user accounts) grow and also when their responsibilities for testing grow beyond just one feature at a time. We can make things more maintainable by creating a new `qa` [segment](/docs/segments): ```yml {% path="segments/qa.yml" %} description: QA team members conditions: - attribute: userId operator: in values: # the User IDs of QA team members - 'user-id-1' - 'user-id-2' - 'user-id-3' - 'user-id-4' - 'user-id-5' ``` And then use it in the `wishlist` feature flag: ```yml {% path="features/wishlist.yml" %} # ... force: production: - segments: qa enabled: true # enabled for QA team members ``` From now on, every time we wish to test a new feature in production, we can just add the `qa` segment to it, and the QA team members will be able to access it. We could have also named our segment `testAccounts` instead of `qa` and include the User IDs of the test accounts instead of the QA team members. The meaning of this segment still stays the same. 
## Evaluating the features with SDKs

Now that the QA team members can access the feature, we need to make sure that the feature is evaluated correctly in the runtime.

Initialize the SDK first:

```js {% path="your-app/index.js" %}
import { createInstance } from '@featurevisor/sdk'

const DATAFILE_URL = 'https://cdn.yoursite.com/datafile.json'
const datafileContent = await fetch(DATAFILE_URL)
  .then((res) => res.json())

const f = createInstance({
  datafile: datafileContent,
})
```

Evaluate with the right set of attributes as context:

```js
const featureKey = 'wishlist'
const context = {
  userId: 'user-id-1',
  deviceId: 'device-id-1',
}

const isWishlistEnabled = f.isEnabled(featureKey, context)

if (isWishlistEnabled) {
  // render the wishlist feature
}
```

## Anonymous users

Not all features are only exposed to logged in users. Some features are exposed to anonymous users as well.

Think of a new feature that is only available in the landing page of your application, and you want to expose it to all users, regardless of whether they are logged in or not.

For those cases, we can rely on the `deviceId` value, which can be generated (a UUID, for example) and persisted on the client side. For browsers, this value can be generated and stored in localStorage, for example.

Once we know those values, all we have to do is update the `qa` segment:

```yml {% path="segments/qa.yml" %}
description: QA team members
conditions:
  or:
    # the User IDs of QA team members
    - attribute: userId
      operator: in
      values:
        - 'user-id-1'
        - 'user-id-2'
        - 'user-id-3'
        - 'user-id-4'
        - 'user-id-5'

    # the Device IDs of QA team members
    - attribute: deviceId
      operator: in
      values:
        - 'device-id-1'
        - 'device-id-2'
        - 'device-id-3'
        - 'device-id-4'
        - 'device-id-5'
```

We turned our original condition in the segment into an `or` condition, and added the `deviceId` condition to it. This way, whenever any of the conditions match, we will consider the user (whether logged in or not) a QA team member.

## Gradual rollout

Once the QA team has verified that the feature works as expected:

- the `qa` segment can be removed from the feature's rule, and
- we can begin rolling it out to a small percentage of our real users in production.

```yml {% path="features/wishlist.yml" %}
# ...

rules:
  production:
    - key: everyone
      segments: '*'
      percentage: 5 # 5% of the traffic
```

As we gain more confidence, we can increase the `percentage` value gradually all the way up to `100`.

## Conclusion

We have just learned how to coordinate testing in production in our organization with Featurevisor, where we can expose features to a known subset of all users who can provide us early feedback (either manually or in an automated way), and how to evaluate those features with SDKs reliably.

All done while maintaining a single source of truth for managing the QA segment, and without having to deploy any code changes to our application.

Featurevisor is smart enough not to include a segment (like `qa`) in generated datafiles if it is not actively used by any of the features belonging to those datafiles, so we don't have to worry about the datafile size growing unnecessarily.
--- title: Establishing feature ownership nextjs: metadata: title: Establishing feature ownership description: Learn how to handle feature ownership with Featurevisor openGraph: title: Establishing feature ownership description: Learn how to handle feature ownership with Featurevisor images: - url: /img/og/docs-use-cases-establishing-ownership.png --- With the adoption of Featurevisor, many development teams can face the challenge of managing individual feature ownership. Since all configuration is managed as files in a Git repository, it's essential to ensure that the right teams or individuals are notified and have the authority to approve changes to specific feature flags. {% .lead %} Without proper management, unintended changes could be introduced, leading to potential issues in the production environment, misaligned business goals, or security vulnerabilities. ## Code owners GitHub has a feature called [CODEOWNERS](https://docs.github.com/en/github/creating-cloning-and-archiving-repositories/about-code-owners) that allows you to define individuals or teams that are responsible for code in a repository. This feature can be used to ensure that the right people are notified and have the authority to approve changes to specific files in the repository. You can also find similar functionality in other Git hosting providers, including: - [GitLab](https://docs.gitlab.com/ee/user/project/codeowners/) - [Bitbucket](https://support.atlassian.com/bitbucket-cloud/docs/use-code-owners-to-define-owners-for-files-and-directories/) This guide assumes we are using GitHub. ## Benefits - **Clear ownership**: By specifying who owns which file, there's clear accountability for each feature. This ensures that only the responsible teams or individuals can approve changes, making it easier to track decisions back to specific entities. - **Automated review process**: GitHub automatically requests reviews from the appropriate code owners when a pull request changes any files they own. This streamlines the review process and ensures that no changes go unnoticed. - **Enhanced security**: If any malicious or unintended changes are introduced to a feature flag, they can't be merged without the consent of the owner. This layer of security is especially crucial for critical features or those that might impact user data or system stability. - **Reduced friction**: Teams don't need to manually tag or notify stakeholders when they make changes to feature flags. GitHub's `CODEOWNERS` file automatically takes care of it, reducing the overhead and potential for error. - **Documentation and transparency**: The `CODEOWNERS` file serves as a transparent documentation of ownership, which can be beneficial for new team members, auditors, or other stakeholders to understand the responsibility matrix of the project. ## Defining rules Create a new `CODEOWNERS` file in `./.github` directory first. 
### Single owner

If a particular feature is owned by a single team or individual, you can specify it as follows:

```{% path="./.github/CODEOWNERS" %}
features/my_feature.yml @my-team
```

You can also use wildcards (`*`) to specify multiple files following a pattern:

```{% path="./.github/CODEOWNERS" %}
features/payment_*.yml @payments-team
```

### Multiple owners

If a feature is owned by multiple teams or individuals:

```{% path="./.github/CODEOWNERS" %}
features/my_feature.yml @my-team @another-team
```

{% callout type="note" title="Same for segments and attributes" %}
Even though the examples above only mention setting up rules for features, you can do the same for [segments](/docs/segments) and [attributes](/docs/attributes) as well, since everything is expressed as files in the Git repository.
{% /callout %}

## Note about branch protection

To ensure that the code owner's review is mandatory, you can set up branch protection rules:

- Go to your repository settings
- Find the "Branches" section, and
- Set up a branch protection rule for your main branch
- Ensure that the "Require review from Code Owners" option is checked

## How does it differ from tagging?

Featurevisor supports [tagging features](/docs/features/#tags) when defining them. This results in smaller [generated datafiles](/docs/building-datafiles), so that applications can load and consume only the relevant configuration with the provided SDKs.

While tagging is useful for filtering, it doesn't provide any security or ownership benefits. It's also not possible to tag attributes or segments. That's where `CODEOWNERS` can help.

---
title: Experiments
nextjs:
  metadata:
    title: Experiments
    description: Learn how to experiment with A/B Tests and Multivariate Tests using Featurevisor.
    openGraph:
      title: Experiments
      description: Learn how to experiment with A/B Tests and Multivariate Tests using Featurevisor.
      images:
        - url: /img/og/docs-use-cases-experiments.png
---

Running experiments like A/B Testing and Multivariate Testing is a powerful technique in product development for continuous learning and iterating based on feedback. Featurevisor can help manage those experiments with a strong governance model in your organization. {% .lead %}

## What is an A/B Test?

An **A/B test**, also known as **split testing** or **bucket testing**, is a controlled experiment used to compare two or more variations of a specific feature to determine which one performs better. It is commonly used in fields like web development, marketing, user experience design, and product management.

It is common practice to call the default/existing behaviour the `control` variation, and the new/experimental behaviour the `treatment` variation.

## Why run A/B Tests?

The primary goal of an A/B test is to measure the impact of the variations on predefined metrics or key performance indicators (KPIs). These metrics can include conversion rates, click-through rates, engagement metrics, revenue, or any other measurable outcome relevant to the experiment.

By comparing the performance of the different variants, statistical analysis is used to determine if one variant outperforms the others with statistical significance. This helps decision-makers understand which variant is likely to have a better impact on the chosen metrics.

## Process of running an A/B Test

A/B testing follows a structured process that typically involves the following steps:
1. **Research and identify**: Find a customer or business problem and turn it into a testable hypothesis by determining the specific element, such as a webpage, design element, pricing model, or user interface component, that will be subjected to variation.
2. **Power analysis**: Determine if there's enough traffic or users to run the experiment and achieve statistical significance.
3. **Create variations**: Develop multiple versions of the element, ensuring they are distinct and have measurable differences.
4. **Split traffic or users**: Randomly assign users or traffic into separate groups, with each group exposed to a different variant.
5. **Run the experiment**: Implement the variants and collect data on the predefined metrics for each group over a specified period.
6. **Analyze the results**: Use statistical analysis to compare the performance of the variants and determine if any differences are statistically significant.
7. **Make informed decisions**: Based on the analysis, evaluate which variation performs better and whether it should be implemented as the new default or further optimized.

## What about Multivariate Testing?

A multivariate test is an experimentation technique that allows you to simultaneously test multiple variations of multiple elements or factors within a single experiment.

Unlike A/B testing, which focuses on comparing two or more variants of a single element, multivariate testing involves testing combinations of elements and their variations to understand their collective impact on user behavior or key metrics.

## Difference between A/B Tests and Multivariate Tests

A/B tests with 3 or more variations are often referred to as A/B/n tests. We are considering both as A/B tests in this guide.

| | A/B Tests | Multivariate Tests |
| --- | --- | --- |
| Purpose | Compare two or more variants of a single element | Simultaneously test multiple elements and variations |
| Variants | Two or more variants (Control and Treatment) | Multiple variants for each element being tested |
| Scope | Focuses on one element at a time | Tests combinations of elements and their variations |
| Complexity | Relatively simpler to set up and analyze | More complex to set up and analyze |
| Statistical Significance | Typically requires fewer samples to achieve significance | Requires larger sample sizes to achieve significance |
| Insights | Provides insights into the impact of individual changes | Provides insights into the interaction between changes |
| Test Duration | Generally shorter duration | Often requires longer duration to obtain reliable results |
| Examples | Ideal for testing isolated changes like UI tweaks, copy variations | Useful for testing multi-factor changes like page redesigns, interaction between multiple elements |

## Our application

For this guide, let's say our application consists of a landing page containing these elements:

- **Hero section**: The main section of the landing page, which includes:
  - headline
  - subheading, and
  - call-to-action (CTA) button

We now want to run both A/B Tests and Multivariate Tests using Featurevisor.
{% callout type="note" title="Understanding the building blocks" %} Before going further in this guide, you are recommended to learn about the building blocks of Featurevisor to understand the concepts used in this guide: - [Attributes](/docs/attributes): building block for conditions - [Segments](/docs/segments): conditions for targeting users - [Features](/docs/features): feature flags and variables with rollout rules - [SDKs](/docs/sdks): how to consume datafiles in your applications The [quick start](/docs/quick-start) can be very handy as a summary. {% /callout %} ## A/B Test on CTA button Let's say we want to run an A/B Test on the CTA button in the Hero section of your landing page. The two variations for a simple A/B test experiment would be: - **control**: The original CTA button with the text "Sign up" - **treatment**: The new CTA button with the text "Get started" We can express that in Featurevisor as follows: ```yml {% path="features/ctaButton.yml" %} description: CTA button tags: - all bucketBy: deviceId variations: - value: control description: Original CTA button weight: 50 - value: treatment description: New CTA button that we want to test weight: 50 rules: production: - key: everyone segments: '*' # everyone percentage: 100 # 100% of the traffic ``` We just set up our first A/B test experiment that is: - rolled out to 100% of our traffic to everyone - with a 50/50 split between the `control` and `treatment` variations - to be bucketed against `deviceId` attribute (since we don't have the user logged in yet) {% callout type="note" title="Importance of bucketing" %} Featurevisor relies on bucketing to make sure the same user or anonymous visitor always sees the same variation no matter how many times they go through the flow in your application. This is important to make sure the user experience is consistent across devices (if user's ID is known) and sessions. You can read further about bucketing in these pages: - [Bucketing concept](/docs/bucketing/) - [Defining bucketing in features](/docs/features/#bucketing) {% /callout %} The `deviceId` attribute can be an unique UUID generated and persisted at client-side level where SDK evaluates the features. If we wanted to a more targeted rollout, we could have used [segments](/docs/segments/) to target specific users or groups of users: ```yml {% path="features/ctaButton.yml" %} # ... rules: production: - key: nl segments: - netherlands - iphoneUsers percentage: 100 # enabled for iPhone users in NL only - key: everyone segments: '*' percentage: 0 # disabled for everyone else ``` You can read further how segments are defined in a feature's rollout rules [here](/docs/features/#segments). ## Evaluating feature with SDKs Now that we have defined our feature, we can use Featurevisor SDKs to evaluate the CTA button variation in the runtime, assuming we have already [built](/docs/building-datafiles/) and [deployed](/docs/deployment/) the datafiles to our CDN. 
For Node.js and browser environments, install the JavaScript SDK:

```{% title="Command" %}
$ npm install --save @featurevisor/sdk
```

Then, initialize the SDK in your application:

```js {% path="your-app/index.js" %}
import { createInstance } from '@featurevisor/sdk'

const DATAFILE_URL = 'https://cdn.yoursite.com/datafile.json'
const datafileContent = await fetch(DATAFILE_URL)
  .then((res) => res.json())

const f = createInstance({
  datafile: datafileContent,
})
```

Now we can evaluate the `ctaButton` feature wherever we need to render the CTA button:

```js
const featureKey = 'ctaButton'
const context = {
  deviceId: 'device-123',
  country: 'nl',
  deviceType: 'iphone',
}

const ctaButtonVariation = f.getVariation(featureKey, context)

if (ctaButtonVariation === 'treatment') {
  // render the new CTA button
  return 'Get started'
} else {
  // render the original CTA button
  return 'Sign up'
}
```

Here we see only two variation cases, but we could have had more than two variations in our A/B test experiment.

## Multivariate Test on Hero element

Let's say we want to run a Multivariate Test on the Hero section of your landing page.

Previously we only ran an A/B test on the CTA button's text, but now we want to run a Multivariate Test on the Hero section affecting some or all of its elements.

We can map our requirements in the table below:

| Variation | Headline | CTA button text |
| --- | --- | --- |
| control | Welcome | Sign up |
| treatment1 | Welcome | Get started |
| treatment2 | Hello there | Sign up |
| treatment3 | Hello there | Get started |

Instead of creating a separate feature per element, we can create a single feature for the Hero section and define multiple variables for each element.

The relationship can be visualized as:

- one **feature**
- having multiple **variations**
- each variation having its own set of **variable values**

```yml {% path="features/hero.yml" %}
description: Hero section
tags:
  - all
bucketBy: deviceId

# define a schema of all variables
# scoped under `hero` feature first
variablesSchema:
  headline:
    type: string
    defaultValue: Welcome
  ctaButtonText:
    type: string
    defaultValue: Sign up

variations:
  - value: control
    weight: 25

  - value: treatment1
    weight: 25
    variables:
      # we only need to define variables inside variations,
      # if the values are different than the default values
      ctaButtonText: Get started

  - value: treatment2
    weight: 25
    variables:
      headline: Hello there

  - value: treatment3
    weight: 25
    variables:
      headline: Hello there
      ctaButtonText: Get started

rules:
  production:
    - key: everyone
      segments: '*'
      percentage: 100
```

We just set up our first Multivariate test experiment that is:

- rolled out to 100% of our traffic to everyone
- with an even 25% split among all its variations
- with each variation having different values for the variables

## Evaluating variables

In your application, you can access the variables of the `hero` feature as follows:

```js {% path="your-app/index.js" %}
const featureKey = 'hero'
const context = { deviceId: 'device-123' }

const headline = f.getVariable(featureKey, 'headline', context)
const ctaButtonText = f.getVariable(featureKey, 'ctaButtonText', context)
```

Use the values inside your hero element (component) when you render it.

## Tracking activations

We have seen how to create features defining both simple A/B tests and more complex multivariate tests using variables in Featurevisor, and how to evaluate them in the runtime in our applications when we need those values.
But we also need to track the performance of our experiments to understand which variation is doing better than others. This is where [hooks API](/docs/sdks/javascript/#hooks) come in handy. Featurevisor SDK provides a way to register hooks that can be used to intercept the evaluation process and perform custom actions: ```js {% path="your-app/index.js" %} import { createInstance } from '@featurevisor/sdk' const f = createInstance({ datafile: '...', hooks: [ { name: 'trackActivationsHook', // this hook will be called after each variation evaluation after: function (evaluation) { const { reason, type, featureKey, variationValue } = evaluation; // error found if (reason === 'error') { return } // not a variation evaluation if (type !== "variation") { return } const feature = f.getFeature(featureKey); // feature has no variations if (!feature || !feature.variations) { return; } // track const { userId } = f.getContext(); const trackPayload = { event: 'featurevisorActivation', featureKey, variationValue, userId, } // send the trackPayload to your analytics service } } ], }) ``` As an example, you can refer to the guide of [Google Tag Manager](/docs/tracking/google-tag-manager) for tracking purposes. {% callout type="note" title="Featurevisor is not an analytics platform" %} It is important to understand that Featurevisor is not an analytics platform. It is a feature management tool that helps you manage your features and experiments with a Git-based workflow, and helps evaluate your features in your application with its SDKs. {% /callout %} ## Mutually exclusive experiments Often times when we are running multiple experiments together, we want to make sure that they are mutually exclusive. This means that a user should not be bucketed into multiple experiments at the same time. In more plain words, the same user should not be exposed to multiple experiments together, and only one experiment at a time avoiding any overlap. One example: if User X is exposed to feature `hero` which is running our multivariate test, then the same User X should not be exposed to feature `wishlist` which is running some other A/B test in the checkout flow of the application. For those cases, you are recommended to see the [Groups](/docs/groups/) functionality of Featurevisor, which will help you achieve exactly that without your applications needing to do any extra code changes at all. ## Further reading You are highly recommended to read and understand the building blocks of Featurevisor which will help you make the most out of this tool: - [Attributes](/docs/attributes/): building block for conditions - [Segments](/docs/segments/): conditions for targeting users - [Features](/docs/features/): feature flags and variables with rollout rules - [Groups](/docs/groups/): mutually exclusive features - [SDKs](/docs/sdks/): how to consume datafiles in your applications ## Conclusion We learned how to use Featurevisor for: - Creating both simple A/B tests and more complex Multivariate tests - evaluate them in the runtime in our applications - track the performance of our experiments - activate the features when we are sure that the user has been exposed to them - make multiple experiments mutually exclusive if we need to Featurevisor can be a powerful tool in your experimentation toolkit, and can help you run experiments with a strong governance model in your organization given every change goes through a Pull Request in your Git repository and nothing gets merged without reviews and approvals. 
---
title: Managing Feature Dependencies
nextjs:
  metadata:
    title: Managing Feature Dependencies
    description: Learn how to manage feature dependencies using Featurevisor
    openGraph:
      title: Managing Feature Dependencies
      description: Learn how to manage feature dependencies using Featurevisor
      images:
        - url: /img/og/docs-use-cases-dependencies.png
---

Imagine you're setting up a chain of dominoes. Each domino is set to fall only if the one before it does. In much the same way, Featurevisor introduces the concept of dependent feature flags, where one feature's availability can depend on another. {% .lead %}

This guide will walk you through how this powerful functionality can be a game-changer for any type of application.

## When to use dependent feature flags?

- **Sequential rollouts**: When we want to roll out features in a specific order.
- **Feature combinations**: When one feature enhances or relies on another.
- **Complex A/B tests**: When we want to test multiple interrelated features together.
- **Conditional access**: When certain features should only be accessible under specific conditions.

## Example scenario

Let's say we have an e-commerce website and we want to introduce two new features:

1. **One-Click checkout**: Allows users to complete their purchase in a single click.
2. **Express shipping**: Offers faster delivery options for an extra fee.

## The dependency

For some business reason, we decide that the "**Express shipping**" feature should only be available if the "**One-Click checkout**" feature is enabled in the first place for the user.

In other words, "**Express shipping**" requires "**One-Click checkout**".

{% callout type="note" title="Learn the building blocks" %}
Before proceeding further, you are advised to learn the building blocks of Featurevisor to understand how features are defined declaratively:

- [Attributes](/docs/attributes)
- [Segments](/docs/segments)
- [Features](/docs/features)
{% /callout %}

## Defining our features

Assuming we have already set up our "**One-Click checkout**" feature as below:

```yml {% path="features/oneClickCheckout.yml" %}
description: One click checkout
bucketBy: userId
tags:
  - checkout
rules:
  production:
    - key: everyone
      segments: '*' # Everyone
      percentage: 100
```

We can then require the `oneClickCheckout` feature when we define our "**Express shipping**" feature:

```yml {% path="features/expressShipping.yml" %}
description: Express shipping
bucketBy: userId
tags:
  - checkout

# define dependencies here
required:
  - oneClickCheckout

rules:
  production:
    - key: everyone
      segments: '*' # Everyone
      percentage: 100
```

With just two lines above, we declared our dependency between the two features without writing any complex code.

{% callout type="note" title="Learn more about requiring features" %}
Further guides on how to define dependencies between features:

- [Defining required features](/docs/features/#required)
{% /callout %}

## Using the SDK

When evaluating the `expressShipping` feature, Featurevisor SDK will automatically check if the `oneClickCheckout` feature is enabled for the user first internally. If not, the `expressShipping` feature will be disabled.

```js
const featureKey = 'expressShipping'
const context = { userId: 'user-123' }

const isExpressShippingEnabled = f.isEnabled(featureKey, context)
```

## Adding a twist: A/B testing

The above example was very simple, since we were only checking if features were enabled or not. But what if we require a feature to be enabled only when a specific variation of another feature is evaluated?
Let's say we want to test two variations in our `oneClickCheckout` feature: ```yml {% path="features/oneClickCheckout.yml" %} description: One click checkout bucketBy: userId tags: - checkout # new variations introduced here variations: - value: control weight: 50 - value: treatment weight: 50 rules: production: - key: everyone segments: '*' # Everyone percentage: 100 ``` We have a 50-50 split between two variations `control` and `treatment`. Now, we want to enable the `expressShipping` feature only if the `treatment` variation of `oneClickCheckout` is evaluated for the user. {% callout type="note" title="Mutually exclusive experiments" %} It's important to understand that we are not talking about mutually exclusive experiments here. To learn about that, please refer to the following guides: - [Groups](/docs/groups) - [Experiments](/docs/use-cases/experiments) {% /callout %} We can express that as a requirement in our `expressShipping` feature: ```yml {% path="features/expressShipping.yml" %} description: Express shipping bucketBy: userId tags: - checkout # define dependencies here required: - key: oneClickCheckout variation: treatment rules: production: - key: everyone segments: '*' # Everyone percentage: 100 ``` Now whenever we evaluate the `expressShipping` feature, Featurevisor SDK will automatically check if the `oneClickCheckout` feature is enabled for the user and if the user has been bucketed into its `treatment` variation first. {% callout type="note" title="Learn more about experimentation" %} We have several guides helping you understand how experiments work in Featurevisor using A/B tests and multivariate tests: - [Experiments](/docs/use-cases/experiments) - [Defining variations](/docs/features/#variations) - [Defining bucketing rule](/docs/features/#bucketing) - [Bucketing concept](/docs/bucketing/) {% /callout %} ## Benefits of using Featurevisor - **Simplicity**: No need for complicated hardcoded logic to manage feature dependencies. Declare it once and you're done. - **Flexibility**: Easily run complex A/B tests involving multiple dependent features. - **Control**: Decide not just whether a feature is on or off, but also under what conditions it should be available. - **Reduced risk**: By controlling feature dependencies, you can ensure that users experience features in a coherent and logical manner, reducing the risk of errors or confusion. - **Collaboration**: Featurevisor's [GitOps](/docs/concepts/gitops) workflow makes it easy for cross-functional teams to collaborate on feature changes in one single place avoiding any issues arising from lack of awareness and visibility. ## Circular dependencies Featurevisor's [linting](/docs/linting) step makes sure that you don't introduce any circular dependencies between your features. This will happen if you try to require a feature that requires the current feature. In our case that would be `expressShipping` requiring `oneClickCheckout` which requires `expressShipping`. Even if you try to do that, Featurevisor will throw an error and prevent you from building the [datafiles](/docs/building-datafiles) saving you from a lot of headache. ## Conclusion Managing feature flags in any type of application becomes incredibly straightforward with Featurevisor. Its ability to handle dependent feature flags through a simple, declarative approach offers a powerful tool for businesses to roll out and test new features in a controlled, logical manner. 
So next time you're setting up those dominoes, remember: Featurevisor ensures they fall exactly the way you want them to.

---
title: User entitlements
nextjs:
  metadata:
    title: User entitlements
    description: Learn how to manage user entitlements using Featurevisor
    openGraph:
      title: User entitlements
      description: Learn how to manage user entitlements using Featurevisor
      images:
        - url: /img/og/docs-use-cases-entitlements.png
---

As your application grows in its number of features, you may end up offering your services via different plans to your users, where each plan comes with its own set of entitlements (activities that users are allowed to perform, aka permissions). {% .lead %}

## Your application

Imagine you own a social media application, where you offer your users the ability to create posts, like posts, and comment on posts.

Users can sign up for free and start liking and commenting on others' posts. But to be able to create new posts themselves, they have to buy a premium plan.

## Mapping entitlements against plans

We can map out the entitlements of your users (what they can do) against the different plans you intend to offer as follows:

| Entitlement      | Free Plan | Premium Plan (paid) |
| ---------------- | --------- | ------------------- |
| Like Posts       | ✅        | ✅                  |
| Comment on Posts | ✅        | ✅                  |
| Create Posts     | ❌        | ✅                  |

## User Profile service

For the sake of this guide, let's assume you already have a User Profile service that allows your application to know at runtime what plan the currently logged in user is on.

The response of that User Profile service can look like this:

```js
// GET /profile
{
  "id": "",
  "name": "Erlich Bachman",
  "plan": "premium", // or `free`
  "country": "us"
}
```

## Attributes

Let's start by defining our Featurevisor attributes for your application. We will use them at various stages throughout this guide.

### `userId`

This attribute will be used to identify the user at runtime. The `id` field from the User Profile service response will be used for this purpose.

```yml {% path="attributes/userId.yml" %}
description: User ID
type: string
```

### `country`

This attribute will be used to identify the country of the user at runtime. The `country` field from the User Profile service response will be used for this purpose.

```yml {% path="attributes/country.yml" %}
description: Country codes in lowercase like us, nl, de, etc.
type: string
```

## Feature

We will create a new feature called `plan` that will be used to control the entitlements of your users against the different plans.

```yml {% path="features/plan.yml" %}
description: Plans and their entitlements against known User
tags:
  - all

bucketBy: userId

# we define a variable called `entitlements`,
# that will be an array of strings
variablesSchema:
  entitlements:
    type: array
    defaultValue:
      - likePosts
      - commentOnPosts

# we aren't running an experiment here,
# and will rely on sticky features for users,
# therefore the weight distribution of variations is not relevant
variations:
  - value: free
    weight: 100

  - value: premium
    weight: 0
    variables:
      entitlements:
        - likePosts
        - commentOnPosts
        - createPosts # extra entitlement for premium users only

# this is a core application config,
# and is recommended to be rolled out to 100% of the traffic
rules:
  production:
    - key: everyone
      segments: '*'
      percentage: 100
```

## Evaluating entitlements with SDKs

Now that we have defined our feature, we can use Featurevisor SDKs to evaluate the entitlements of your users at runtime.
First, initialize the SDK:

```js {% path="your-app/index.js" %}
import { createInstance } from '@featurevisor/sdk'

const DATAFILE_URL = 'https://cdn.yoursite.com/datafile.json'

const datafileContent = await fetch(DATAFILE_URL)
  .then((res) => res.json())

const f = createInstance({
  datafile: datafileContent,
})
```

Fetch your User's ID and plan info from your User Profile service and make it available:

```js
const userProfile = await fetch('https://api.yoursite.com/profile')
  .then((res) => res.json())
```

Set sticky features in the SDK for the known user:

```js
// we want our known user to always be bucketed
// into the same plan (variation) as the User Profile service suggests
f.setSticky({
  plan: {
    enabled: true,
    variation: userProfile.plan,
  },
})
```

Get the available entitlements for the known user:

```js
const featureKey = 'plan'
const variableKey = 'entitlements'
const context = {
  userId: userProfile.id,
  country: userProfile.country,
}

const entitlements = f.getVariable(featureKey, variableKey, context)
```

The `entitlements` variable will contain an array of all the entitlements the user should have against their current plan.

```js
const canCreatePosts = entitlements.includes('createPosts')
const canLikePosts = entitlements.includes('likePosts')
const canCommentOnPosts = entitlements.includes('commentOnPosts')
```

## Managing entitlements in one place

As your entitlements and number of plans grow, you can use Featurevisor to manage them all declaratively in one place. Your custom User Profile service only needs to be aware of the user's plan, and nothing more unless you have any custom user-specific overrides.

Since the Featurevisor JavaScript SDK is universal and works in both Node.js and browser environments, you can use it to evaluate your users' entitlements in your backend as well as in your frontend.

{% callout type="note" title="Always verify in backend" %}
Please note that entitlement checks in the frontend are never a substitute for backend checks. You should always check entitlements in your backend before performing any action.
{% /callout %}

## User overrides

In specific circumstances, you may want to override the entitlements of a user irrespective of what plan they are on.

For example, a user may be on a free plan, but you want to give them access to create posts for free for a limited time.

We can expect our User Profile service to optionally provide the override information, given this is about a specific individual user:

```js
// GET /profile
{
  "id": "",
  "plan": "free",
  "country": "us",

  // optional field for overrides
  "overrideEntitlements": [
    "likePosts",
    "commentOnPosts",
    "createPosts"
  ]
}
```

We can then use the `overrideEntitlements` field from the User Profile and set it as a sticky feature in the Featurevisor SDK:

```js
f.setSticky({
  plan: {
    enabled: true,
    variation: userProfile.plan,
    variables: userProfile.overrideEntitlements
      ? // user overrides
        { entitlements: userProfile.overrideEntitlements }
      : // otherwise leave empty
        {},
  },
})
```

You can now continue evaluating entitlements as before using the SDK, and the user will have the overridden entitlements.

## Conditional entitlements

It is possible that you may want to offer a specific entitlement to your users based on their location. We aren't talking about running experiments targeting one specific country here, but rather an entitlement that can only ever be available in one single country.
For the sake of this guide, let's assume your social media app can legally allow your users to upload videos in the US only in `premium` plan, and nowhere else. We can declare that config in our feature's definition as follows: ```yml {% path="features/plan.yml" %} # ... variations: - value: free weight: 100 - value: premium weight: 0 variables: entitlements: - likePosts - commentOnPosts - createPosts variableOverrides: entitlements: - conditions: - attribute: country operator: equals value: us value: - likePosts - commentOnPosts - createPosts - uploadVideos # for US users only # ... ``` The entitlements array may look repetitive here, but you can also take an approach of breaking down your entitlements into multiple variables instead of one as you see fit. ## Separate variables per entitlement If you do not wish to have a single variable for all entitlements, you can break them down into multiple variables as follows: ```yml {% path="features/plan.yml" %} description: Plans and their entitlements against known User tags: - all bucketBy: userId variablesSchema: canLikePosts: type: boolean defaultValue: true canCommentOnPosts: type: boolean defaultValue: true canCreatePosts: type: boolean defaultValue: false canUploadVideos: type: boolean defaultValue: false variations: - value: free weight: 100 - value: premium weight: 0 variables: canCreatePosts: true canUploadVideos: false variableOverrides: canUploadVideos: - conditions: - attribute: country operator: equals value: us value: true rules: production: - key: everyone segments: '*' percentage: 100 ``` This will then require you to evaluate each entitlement separately in your application code using Featurevisor SDKs: ```js const canCreatePosts = f.getVariable('plan', 'canCreatePosts', context) if (canCreatePosts) { // show create post button } ``` ## Conclusion When your application and its architecture grows big, and you have multiple teams working and shipping in a distributed fashion, it can become hard to manage entitlements in one place. Having them declared in one place as a single source of truth can help you manage them better, and also help you avoid any accidental entitlements leaks. --- title: Decouple feature releases from application deployments nextjs: metadata: title: Decouple feature releases from application deployments description: Learn how to decouple your feature releases from your application deployments using Featurevisor. openGraph: title: Decouple feature releases from application deployments description: Learn how to decouple your feature releases from your application deployments using Featurevisor. images: - url: /img/og/docs-use-cases-decouple-releases-from-deployments.png --- In today's fast-paced software development world, agility is key. Traditional deployment methods, where every new feature or change requires an application update, can slow down a development team and introduce potential risks. This is where the importance of decoupling comes in. {% .lead %} Decoupling your application deployments from your feature flags, A/B tests, and remote configurations can bring significant advantages to your development workflow, and Featurevisor is designed to help you achieve just that. ## Benefits - **Faster time-to-market**: Decoupling allows you to release features as soon as they are developed and tested, without having to wait for the next scheduled application deployment. 
- **Reduced risk**: By separating feature releases from application deployments, you can enable or disable features without impacting the entire application, reducing the risk of introducing bugs or other issues. - **Increased flexibility**: Decoupling enables you to target features to specific user segments or roll them out progressively, giving you greater control over how new functionalities are introduced. - **Simplified rollbacks**: If a feature introduces unexpected issues, you can easily roll it back without having to revert the entire application deployment. - **Resource optimization**: Your development and operations teams can focus on their respective tasks more efficiently, as they are not tied up coordinating large-scale deployments for every new feature. - **Better user experience**: You can introduce new features or changes progressively, gathering user feedback and making adjustments in real-time, without forcing users to update/refresh the apps, thus improving the overall user experience. ## How Featurevisor helps achieve decoupling ### Independent Git repository With Featurevisor, your feature configurations are stored in a separate Git repository. This repository acts as your Featurevisor project, completely independent of your application's codebase. ### GitOps workflow Featurevisor adopts a [GitOps workflow](/docs/concepts/gitops), ensuring that all changes to your feature flags or configurations go through Pull Requests. This makes the process of adding or modifying features transparent and auditable in one single place for your entire organization, irrespective of how many different applications or services you have. ### Datafile and CDN When Pull Requests are merged into your Featurevisor project, it automatically [builds datafiles](/docs/building-datafiles) (static JSON configuration files) and [deploys](/docs/deployment) them to your preferred CDN. Your applications can then consume these datafiles at runtime using provided [SDKs](/docs/sdks). ### Real-time changes Because your applications consume datafiles from the CDN, any changes to your feature configurations are propagated in real-time. This means you can toggle features on or off instantly without having to redeploy your applications. ### Cloud-Native and language agnostic Featurevisor is [cloud-native](/docs/concepts/cloud-native-architecture) and can be used with any programming language (assuming their SDK exists already), giving you the flexibility to integrate it into any part of your stack, be it frontend or backend. ## Conclusion Decoupling application deployments from feature releases offers numerous advantages in terms of risk reduction, speed, and efficiency. Featurevisor provides a robust set of tools to help you achieve this decoupling, making it easier to manage your features in a more agile, responsive manner. By adopting Featurevisor, you are not just adding another tool to your stack; you are adopting a smarter way to manage features and deliver value to your users. --- title: Trunk-based Development nextjs: metadata: title: Trunk-based Development description: Learn how to achieve trunk-based development using Featurevisor openGraph: title: Trunk-based Development description: Learn how to achieve trunk-based development using Featurevisor images: - url: /img/og/docs-use-cases-trunk-based-development.png --- Trunk Based Development is a software development approach where all developers work in short-lived branches or directly in the trunk, which is the main codebase. 
{% .lead %} The key principle is to integrate code changes frequently, ideally several times a day, to enable quicker detection and resolution of conflicts or bugs. ## What do we mean by trunk? By "**trunk**", we mean the main branch of your repository. If you are using Git, it is usually `main` or `master` branch. ## Benefits - **Faster feedback loop**: Frequent merges mean quicker identification of code issues, allowing for immediate action. - **Reduced merge conflicts**: Short-lived branches minimize the divergence from the trunk, reducing the likelihood of complicated merge conflicts. - **Simplified debugging**: With smaller, more frequent merges, isolating the changes that introduced a bug becomes easier. - **Accelerated release cycles**: The approach aligns well with Continuous Integration/Continuous Deployment (CI/CD) practices, facilitating quicker releases. - **Improved collaboration**: Frequent integrations mean that developers are naturally more aligned with each other's changes, promoting a more collaborative environment. ## Potential disadvantages - **Code instability**: Frequent merges can introduce instability into the main codebase if not adequately tested. - **Overhead**: The need for frequent integrations and testing can be taxing on development and operations teams. - **Learning curve**: Developers accustomed to long-lived branches may find it challenging to adapt to the quick pace and frequent merging resulting from adopting Trunk-Based Development. ## How Featurevisor helps ### Incremental changes Featurevisor's feature management capabilities in the form of feature flags, a/b tests, and rollout rules enable you to merge code into the trunk even if it's not fully complete. The feature can be hidden behind a flag until it's ready for production. ### Collaboration Featurevisor adopts [GitOps workflow](/docs/concepts/gitops) for managing features. This ensures that changes are reviewed and version-controlled, aligning perfectly with the principles of Trunk Based Development. {% callout type="note" title="Separation of repositories" %} It is very important to understand that a Featurevisor project with all its feature configurations is a separate repository from your main application codebase. This enables you to manage and release your features independently of your application code deployments. {% /callout %} ### Quick iterations Featurevisor propagates feature flag changes instantly, allowing you to toggle features on or off without redeploying your application. This way feature releases are decoupled from your actual application code deployments, enabling you to iterate quickly. ### A/B testing for feedback Featurevisor’s built-in support for [A/B testing](/docs/use-cases/experiments) allows you to validate the impact of new features quickly, which aligns with quick feedback loops resulting from Trunk Based Development. ### Targeted releases Using Featurevisor’s audience targeting capabilities, you can roll out features to [segmented audiences](/docs/segments), thus reducing the risk associated with frequent code merges into the trunk. You can also see [Progressive Delivery](/docs/use-cases/progressive-delivery) for more details around gradual rollouts. ## Conclusion Trunk Based Development offers a range of benefits, especially for organizations looking to accelerate their development cycles and improve collaboration. Featurevisor complements this approach by providing the tools needed for safe, incremental changes, quick iterations, and effective feature management. 
While Trunk Based Development has its challenges, the robust capabilities of Featurevisor can significantly aid in mitigating the risks. --- title: Deprecating feature flags safely nextjs: metadata: title: Deprecating feature flags safely description: Learn how to deprecate features and experiments safely with Featurevisor openGraph: title: Deprecating feature flags safely description: Learn how to deprecate features and experiments safely with Featurevisor images: - url: /img/og/docs-use-cases-deprecation.png --- Deprecating feature flags is an essential part of feature flag lifecycle management. As our application and organization grow, some features become permanent while others may be discarded. {% .lead %} Proper management of these features and a/b test experiments prevents clutter, reduces technical debt, and maintains the efficiency and performance of our application's codebase. ## What is deprecation? In software development, deprecation refers to the practice of marking certain features, functionalities, or elements in a codebase as obsolete, outdated, or no longer recommended for use. This is often done when a feature or functionality is: - replaced by a newer or more efficient version, or - when it's decided that the feature is no longer necessary ## Deprecating feature flags Similar to any regular functionalities in software, feature flags can also be deprecated. When a feature flag is deprecated, it means that the feature flag is no longer needed and should be removed from the codebase. ## Grace period When deprecating a feature flag, it's a good practice to provide a grace period for developers to remove the usage of that feature flag from the codebase. This makes sure we are not unintentionally breaking the application for users who are still using the deprecated feature. The grace period can be a few days, weeks, or months, depending on the complexity of the feature and the number of places it's used in the codebase. Ultimately it's an organization level decision. ## How does it work in Featurevisor? Given Featurevisor allows us to manage both feature flags and [experiments](/docs/use-cases/experiments) as [features](/docs/features) declaratively, it makes it very convenient for us to mark some of them as [deprecated](/docs/features/#deprecating). We can do that right from the same file where we define our feature: ```yml {% path="features/my_feature.yml" %} description: My feature description... tags: - web - ios - android # we set `deprecated` to `true` deprecated: true # ... ``` Notice the usage of `deprecated: true` above. That's all we need to do to mark a feature as deprecated, and Featurevisor will take care of the rest. {% callout type="note" title="Deprecating attributes and segments" %} Unlike features, [attributes](/docs/attributes) and [segments](/docs/segments) cannot be deprecated. Featurevisor's [datafile builder](/docs/building-datafiles) is smart enough to not include attributes and segments that are not used in your desired features. That's why we don't need to worry about deprecating them, and can archive or delete them directly if needed. {% /callout %} ## Warnings in applications When a feature is marked as deprecated, Featurevisor [SDKs](/docs/sdks/) will automatically show a warning in the applications that are evaluating the deprecated feature. This lets developers know immediately that the feature is deprecated and should be removed from the codebase. 
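For instance, here is a minimal sketch of what this looks like from the application's side, assuming `datafileContent` has already been fetched and `my_feature` is a hypothetical feature that has been marked as deprecated:

```js
import { createInstance } from '@featurevisor/sdk'

const f = createInstance({
  datafile: datafileContent, // built from definitions where `my_feature` has `deprecated: true`
})

// evaluating a deprecated feature still works as before...
const enabled = f.isEnabled('my_feature', { userId: 'user-123' })

// ...but the SDK will additionally log a warning about `my_feature`
// being deprecated, nudging teams to remove its usage
```

The evaluation result itself does not change; only the warning is added on top.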
Optionally, we can take over the warning messages via the [logger API](/docs/sdks/javascript/#logging) of SDKs if we wish to customize the warning messages further or track them via our preferred logging and monitoring system. ## Deleting deprecated features Once the development teams have removed the deprecated feature flags from the codebase, and no warnings are visible anywhere in the applications, we can safely [archive](/docs/features/#archiving) or delete the deprecated feature from Featurevisor. ## Conclusion Creating new functionalities and having them managed via several feature flags and experiments is a common practice in modern software development. However, it's equally important to manage these features' lifecycle properly. We do not wish to find ourselves in a situation where we only keep creating new features and never take the time to clean them up. This affects both the performance and maintainability of our applications. With Featurevisor, we can embrace its highly declarative way of managing features and experiments effectively. Including deprecating them when they are no longer needed, providing a grace period for developers to remove them, and finally deleting them from the system safely. --- title: Microfrontends Architecture nextjs: metadata: title: Microfrontends Architecture description: Learn how to manage your features in a microfrontends architecture. openGraph: title: Microfrontends Architecture description: Learn how to manage your features in a microfrontends architecture. images: - url: /img/og/docs-use-cases-microfrontends.png --- Microfrontends are a way of architecting your frontend application as a composition of loosely coupled features. Each feature is owned by a separate team, and can be developed and deployed independently. {% .lead %} In this guide, we will learn how Featurevisor and microfrontends architecture can complement each other, enabling your organization to ship faster with more confidence. ## Benefits Going deep into microfrontends architecture is not the goal of this guide. But we will briefly mention some of its key benefits: - **Development**: Each team can develop their features independently allowing work to be done in parallel - **Tech stack**: Each microfrontend can be developed using the technology that best suits the team - **Deployment**: Each microfrontend can be deployed independently at their own pace - **Ownership**: Each team can own their own feature and be solely responsible for it - **Rolling back**: If a particular microfrontend breaks, it can be rolled back without affecting the rest of the application See this talk by [Luca Mezzalira](https://twitter.com/lucamezzalira) for further explanation [here](https://www.youtube.com/watch?v=BuRB3djraeM). 
## Challenges

With all the freedom and autonomy, a microfrontends architecture also comes with its own set of challenges, especially when it comes to [feature management](/docs/feature-management):

- **Consistency**: It can be hard to maintain consistency of features across all microfrontends
- **Overlaps**: Some features may overlap across multiple microfrontends
- **Anonymous vs authenticated users**: It is possible that some microfrontends are only accessible to either anonymous or authenticated users, or even both, making [consistent bucketing](/docs/bucketing) tricky
- **Reviews and approvals**: It can be hard to keep track of and coordinate all the features and their changes across all microfrontends

The rest of this guide will help us understand how Featurevisor can help mitigate these concerns in your team and organization.

## Frameworks

There are several frameworks and tools that can help achieve this architecture, such as:

- [Module federation](https://webpack.js.org/concepts/module-federation/) (webpack)
- [single-spa](https://single-spa.js.org/)
- [Piral](https://piral.io/)

## Your application

Imagine you have an e-commerce application, where you offer your users the ability to:

- browse products
- sign up and in
- add to cart and buy products, and
- manage their account

Next, we can approach it as a microfrontends architecture.

## Mapping activities against microfrontends

We can map all these activities to their own microfrontend as follows:

| Microfrontend | Activities      | Path        | Access              |
| ------------- | --------------- | ----------- | ------------------- |
| `products`    | Browse products | `/products` | Everyone            |
| `signup`      | Sign up         | `/signup`   | Anonymous users     |
| `signin`      | Sign in         | `/signin`   | Anonymous users     |
| `checkout`    | Buy products    | `/checkout` | Everyone            |
| `account`     | Manage account  | `/account`  | Authenticated users |

Each microfrontend will be accessible via its own URL path, like `yoursite.com/products`, `yoursite.com/signup`, etc.

{% callout type="note" title="Advanced microfrontends" %}
We are using a very simple example here. But in a real-world application, each microfrontend can be much more advanced, where a microfrontend can be:

- taking over a single route, or
- taking over a group of routes, or
- rendering one or more components on a page that is already owned by another microfrontend
{% /callout %}

## Configuring tags

Your entire application can contain several feature flags. But given it is a microfrontends architecture, we want to make sure that each microfrontend only loads the features it needs.

Instead of creating a separate Featurevisor project for each of your microfrontends, we can have a **single project** that contains all the features and their configurations, and then build **separate datafiles** for each microfrontend.
To achieve that, we need to let our Featurevisor [configuration](/docs/configuration) know which tags we want to build our datafiles for: ```js {% path="./featurevisor.config.js" %} module.exports = { environments: [ 'staging', 'production', ], tags: [ 'products', 'signup', 'signin', 'checkout', 'account', ], } ``` Once this configuration is in place, we can [build](/docs/building-datafiles) our datafiles: ```{% title="Command" %} $ npx featurevisor build ``` And it will output the following datafiles in the `datafiles` directory: ``` $ tree datafiles datafiles ├── production │   ├── featurevisor-tag-products.json │   ├── featurevisor-tag-signup.json │   ├── featurevisor-tag-signin.json │   ├── featurevisor-tag-checkout.json │   └── featurevisor-tag-account.json └── staging │   ├── featurevisor-tag-products.json │   ├── featurevisor-tag-signup.json │   ├── featurevisor-tag-signin.json │   ├── featurevisor-tag-checkout.json │   └── featurevisor-tag-account.json 2 directories, 10 files ``` ## Attributes We will set up some Featurevisor [attributes](/docs/attributes) first, that we will use in various stages of this guide later. ### `userId` ```yml {% path="attributes/userId.yml" %} description: Unique identifier of the logged in User type: string ``` ### `deviceId` This ID can be generated at the client-side level when user first visits your app, and stored in a cookie or localStorage for e.g. if in a browser environment. ```yml {% path="attributes/deviceId.yml" %} description: Unique identifier of the device type: string ``` ## Feature For this example, we will create a new [feature](/docs/features) flag that's responsible for toggling a marketing banner that's shown at the bottom of a page. We can call it `showMarketingBanner`. ```yml {% path="features/showMarketingBanner.yml" %} description: Shows marketing banner at the bottom of the page tags: - # we will discuss this in next section below bucketBy: deviceId rules: staging: - key: everyone segments: '*' percentage: 100 production: - key: everyone segments: '*' percentage: 100 ``` ## Tagging features When we create or update any feature in your Featurevisor project, we can tag it with the microfrontend it belongs to: ```yml {% path="features/showMarketingBanner.yml" %} description: Shows marketing banner at the bottom of the page tags: - products # tagging with `products` microfrontend # ... ``` It is possible some features may overlap across microfrontends. For example, we may want to show a marketing banner to users in both the `products` and `checkout` microfrontends. In that case, we can tag the feature with both of them together: ```yml {% path="features/showMarketingBanner.yml" %} description: Shows marketing banner at the bottom of the page tags: - products - checkout # ... ``` ## Anonymous vs Authenticated Featurevisor relies on the feature's `bucketBy` property to determine how to [bucket](/docs/bucketing) the user. It's an easy decision to make when choosing the `bucketBy` value when the microfrontend is only accessible to either anonymous or authenticated users alone, and not both. | Microfrontend | Access | `bucketBy` | | ------------- | --------------- | ---------- | | `products` | Everyone | ? | | `signup` | Anonymous users | `deviceId` | | `signin` | Anonymous users | `deviceId` | | `checkout` | Everyone | ? | | `account` | Authenticated | `userId` | But what shall we do for microfrontends that are accessible to both anonymous and authenticated users? Like `products` and `checkout` in our example. 
{% callout type="note" title="Applies to both microfrontends and monoliths" %} This challenge applies to any application that deals with both anonymous and authenticated users, whether it's a microfrontends architecture or a monolithic application. {% /callout %} ### Design decision It's a design decision that we need to take here when defining our features. We can either choose to: - always use `deviceId` attribute for `bucketBy`, irrespective of whether the user is anonymous or authenticated, or - we can choose to use `deviceId` for anonymous users and `userId` for authenticated users. The drawback of using `deviceId` at all times for both anonymous and authenticated users is that it will result in inconsistent variations for logged in users across multiple sessions or devices. If we wish to get the maximum benefit of Featurevisor's consistent bucketing making sure the same logged in user sees the same variation across all devices and sessions, we can use: - `userId` for authenticated users, and - `deviceId` for anonymous users ### Bucket by first available attribute We can express that in our feature as follows: ```yml {% path="features/showMarketingBanner.yml" %} description: Shows marketing banner at the bottom of the page tags: - products - checkout bucketBy: # if `userId` is available then it will be used, # otherwise it will fall back to `deviceId` or: - userId - deviceId # ... ``` This will make sure when `userId` attribute is passed for evaluation to the SDK, it will be used for bucketing. Otherwise, it will fall back to `deviceId`. ## Review and approval workflow Given Featurevisor is a centralized feature management solution, it can help you manage your features in a microfrontends architecture in a single place conveniently. In our case, it is a single Git repository that contains all the features and their configurations, which will go through a review and approval workflow by all relevant teams before they can be deployed in the form of generated [datafiles](/docs/building-datafiles). ## Consuming datafiles in your microfrontend Once you have [built](/docs/building-datafiles) and [deployed](/docs/deployment) your datafiles, you can consume them using Featurevisor SDKs in your microfrontends: ```js {% path="products-microfrontend/index.js" %} // in `products` microfrontend import { createInstance } from '@featurevisor/sdk' const DATAFILE_URL = 'https://cdn.yoursite.com/production/featurevisor-tag-products.json' const datafileContent = await fetch(DATAFILE_URL) .then(res => res.json()) const f = createInstance({ datafile: datafileContent, }) ``` Evaluate your features: ```js const featureKey = 'showMarketingBanner' const context = { deviceId: '...', // if available userId: '...', } const showMarketingBanner = f.isEnabled(featureKey, context) if (showMarketingBanner) { // render marketing banner } ``` While the snippets above suggest the usage of Featurevisor SDK in a single `products` microfrontend, it does not differ in any way if you were to use it in a monolithic application. ## Conclusion We have seen how we can use Featurevisor to manage all our feature configurations in a microfrontends architecture in a single place declaratively, even if those features overlap and are used in multiple microfrontends together. We have also seen how to handle tricky situations like anonymous vs authenticated users, and how to make sure logged in users are bucketed in a way so they see the same variation across all devices and sessions consistently maintaining a solid user experience. 
The freedom and flexibility that a microfrontends architecture brings is great, but it also comes with its own set of challenges. Featurevisor can help you manage your features in a microfrontends architecture, bringing all parties together with a strong review and approval workflow, and making sure your features stay consistent across all your microfrontends for your users.

---
title: Remote configuration
nextjs:
  metadata:
    title: Remote configuration
    description: Learn how to manage remote configuration using Featurevisor
    openGraph:
      title: Remote configuration
      description: Learn how to manage remote configuration using Featurevisor
      images:
        - url: /img/og/docs-use-cases-remote-configuration.png
---

Remote configuration refers to the ability to modify the behavior or settings of an application while it is running, without the need for restarting, redeploying, or making any code changes. {% .lead %}

This approach allows developers and system administrators to adjust various aspects of an application's behavior, such as feature availability or other configuration parameters, without requiring downtime or disrupting the application's ongoing operation.

## Benefits

Separating runtime configuration from your application using Featurevisor brings a lot of benefits:

- **Flexibility & agility**: Allows for greater flexibility in adapting the application's behavior to changing business requirements, user preferences, or market conditions.
- **Reduced time & effort**: Configuration changes can be made independently without the need for application deployments, reducing time and effort.
- **Personalized experience**: Allows for targeted configuration of features or settings for specific users or groups of users, such as beta testers, internal users, or paying customers.
- **Reduced risk & downtime**: It eliminates the need for deploying new application code or taking the application offline, reducing the potential for introducing bugs or causing downtime during configuration updates.
- **Auditing & versioning**: Facilitates tracking and auditing changes made to configuration settings. It enables versioning of configuration parameters, ensuring that previous configurations can be reverted if needed and providing a clear history of configuration changes for troubleshooting or compliance purposes.
- **Multi-environment**: Allows for different configurations to be applied in different environments, such as development, testing, staging, and production.

## Our application

Let's assume we have an e-commerce web application, where we wish to parameterize several aspects of it as configuration. The application allows its users to:

- browse products
- sign up and in
- add products to cart
- pay for the products (checkout)
- manage their account

## Configuration parameters

We can identify the following configuration parameters that we wish to manage during the checkout flow:

- list of payment methods (like credit card, PayPal, etc.)
- list of credit cards (like Visa, Mastercard, AMEX, etc.)
- list of shipping methods (like standard, express, etc.)
- allow discount code (or gift card) {% callout type="note" title="Understanding the building blocks" %} Before going further in this guide, you are recommended to learn about the building blocks of Featurevisor to understand the concepts used in this guide: - [Attributes](/docs/attributes): building block for conditions - [Segments](/docs/segments): conditions for targeting users - [Features](/docs/features): feature flags and variables with rollout rules - [SDKs](/docs/sdks): how to consume datafiles in your applications The [quick start](/docs/quick-start) can be very handy as a summary. {% /callout %} ## Defining our feature We can start by creating a new feature called `checkout`: ```yml {% path="features/checkout.yml" %} description: Checkout flow configuration tags: - checkout bucketBy: userId # rolled out as `true` to 100% of our traffic rules: production: - key: everyone segments: '*' # everyone percentage: 100 ``` ## Variables Featurevisor supports [variables](/docs/features/#variables) that can be used to define configuration parameters. We can extend our feature to include variable schema starting with `paymentMethods`: ```yml {% path="features/checkout.yml" %} description: Checkout flow configuration tags: - checkout bucketBy: userId # we add variable schema for all our parameters here, # starting with `paymentMethods` for now variablesSchema: paymentMethods: type: array defaultValue: - creditCard - paypal - applePay - googlePay rules: production: - key: everyone segments: '*' # everyone percentage: 100 ``` By default, we are saying that all our users will be able to use all the payment methods. ## Evaluating variables using SDK Once we have [built](/docs/building-datafiles) and [deployed](/docs/deployment) our datafiles, we can start consuming them in our application using Featurevisor [SDKs](/docs/sdks). We initialize the SDK first: ```js {% path="your-app/index.js" %} import { createInstance } from '@featurevisor/sdk' const DATAFILE_URL = 'https://cdn.yoursite.com/datafile.json' const datafileContent = await fetch(DATAFILE_URL) .then((res) => res.json()) const f = createInstance({ datafile: datafileContent, }) ``` Now we can evaluate the variable in our application: ```js const featureKey = 'checkout' const variableKey = 'paymentMethods' const context = { userId: 'user-123', country: 'nl' } const paymentMethods = f.getVariable(featureKey, variableKey, context) console.log(paymentMethods) // [ // "creditCard", // "paypal", // "applePay", // "googlePay" // ] ``` With this evaluated ordered array of payment methods in the runtime, we can now render the list of payment methods in our checkout flow of the application without having to hardcode this list anywhere. ## Overriding variables by rules From above example, we can see that all our users will be able to use all the payment methods. But what if we want to restrict some payment methods to some users based in certain countries? We can do that by overriding the variables via our rollout [rules](/docs/features/#rules). Assuming we already have a [segment](/docs/segments) created for targeting users in the Netherlands: ```yml {% path="segments/netherlands.yml" %} description: Target users in the Netherlands conditions: - attribute: country operator: equals value: nl ``` We can now use this segment in our rollout rules and override the `paymentMethods` variable: ```yml {% path="features/checkout.yml" %} # ... 
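# note: rules are evaluated from top to bottom and the first matching rule
# applies, which is why the more specific `nl` rule below is placed
# above the catch-all `everyone` rule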
rules: production: - key: nl segments: netherlands percentage: 100 variables: paymentMethods: - paypal - ideal - key: everyone segments: '*' percentage: 100 ``` Now when we evaluate our features, we will get different results for users in the Netherlands: ```js // Users in the Netherlands const context = { userId: 'user-234', country: 'nl' } const paymentMethods = f.getVariable(featureKey, variableKey, context) console.log(paymentMethods) // [ // "paypal", // "ideal" // ] ``` While rest of the world will still get the same result as before (that is the default value as defined in the variable schema): ```js // Users in the US const context = { userId: 'user-123', country: 'us' } const paymentMethods = f.getVariable(featureKey, variableKey, context) console.log(paymentMethods) // [ // "creditCard", // "paypal", // "applePay", // "googlePay" // ] ``` ## Other ways of overriding variables Depending on your needs, it is also possible to override variables: - at each [variation level](/docs/features/#overriding-variables) acting as an experiment, and also - at environment level by [forcing it](/docs/features/#force) You can see other use cases here detailing these approaches: - [A/B & Multi-variate testing](/docs/use-cases/experiments) - [Managing user entitlements](/docs/use-cases/entitlements) - [Testing in production](/docs/use-cases/testing-in-production) ## How do applications get latest configuration? There are two ways this can happen: - You can configure your SDK to keep refreshing the datafile at a certain **interval** (like every 30 seconds), or - When deploying your Featurevisor datafiles, you can broadcast a notification to all your applications to refresh their datafiles **manually** enabling over the air updates You can refer to the SDK guide here for [refreshing datafile](/docs/sdks/javascript/#refreshing-datafile). ## Full feature example Based on our original requirements, we can express the `checkout` feature with all its variables as follows: ```yml {% path="features/checkout.yml" %} description: Checkout flow configuration tags: - checkout bucketBy: userId variablesSchema: paymentMethods: type: array defaultValue: - creditCard - paypal - applePay - googlePay creditCards: type: array defaultValue: - visa - mastercard - amex shippingMethods: type: array defaultValue: - standard - express allowDiscountCode: type: boolean defaultValue: false rules: production: - key: nl segments: netherlands percentage: 100 variables: paymentMethods: - paypal - ideal allowDiscountCode: true - key: everyone segments: '*' percentage: 100 ``` Based on our requirements, we can keep overriding these variables against different rules as needed. Learn more about [variables](/docs/features/#variables), its supported [types](/docs/features/#variable-types), and how to [override](/docs/features/#overriding-variables) them in [features](/docs/features) documentation. ## Conclusion We learned about: - various benefits of separating runtime configuration from our application - how to break down different configuration parameters of our application into variables - having them defined in a feature declaratively using Featurevisor - overriding them using rollout rules based on our needs Overall, Featurevisor can help manage your application's runtime configuration in a highly readable and maintainable way for your team and your organization, with a strong review and approval process in one single place for everyone in the form of a Git repository. 
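To tie the datafile refresh discussion above back to code, here is a minimal sketch of the interval-based approach, reusing the datafile URL from the earlier snippets and assuming the SDK instance exposes a `setDatafile()` method as covered in the refreshing datafile guide:

```js
import { createInstance } from '@featurevisor/sdk'

const DATAFILE_URL = 'https://cdn.yoursite.com/datafile.json'

const f = createInstance({
  datafile: await fetch(DATAFILE_URL).then((res) => res.json()),
})

// re-fetch the datafile periodically (every 30 seconds here),
// so newly merged configuration changes are picked up over the air
setInterval(async () => {
  const freshDatafile = await fetch(DATAFILE_URL).then((res) => res.json())

  // assumption: the instance exposes `setDatafile()` for swapping in
  // the latest datafile without re-creating the instance
  f.setDatafile(freshDatafile)
}, 30 * 1000)
```

The broadcast-based approach works the same way, except the re-fetch is triggered by your own notification mechanism instead of a timer.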
---
title: Vue.js SDK
nextjs:
  metadata:
    title: Vue.js SDK
    description: Learn how to use Featurevisor SDK with Vue.js for evaluating feature flags
    openGraph:
      title: Vue.js SDK
      description: Learn how to use Featurevisor SDK with Vue.js for evaluating feature flags
      images:
        - url: /img/og/docs-vue.png
---

{% callout type="warning" title="Featurevisor v1" %}
This guide is written keeping Featurevisor v1 in mind. It will be updated to be v2-compatible soon.
{% /callout %}

Featurevisor comes with an additional package for Vue.js, for ease of integration in your Vue.js application when evaluating feature flags. {% .lead %}

## Installation

Install with npm:

```
$ npm install --save @featurevisor/vue
```

## Setting up the application

Use the `setupApp` function to set up the SDK instance in your Vue application:

```js
import { createApp } from 'vue'
import { createInstance } from '@featurevisor/sdk'
import { setupApp } from '@featurevisor/vue'

const f = createInstance({
  datafileUrl: 'https://cdn.yoursite.com/datafile.json',
})

const app = createApp({
  /* root component options */
})

setupApp(app, f)
```

This will set up the SDK instance in your Vue application, and make it available in all components later.

## Functions

The package comes with several functions to use in your components:

### useStatus

Know if the SDK is ready to be used.

### useFlag

Check if a feature is enabled or not.

### useVariation

Get a feature's evaluated variation.

### useVariable

Get a feature's evaluated variable value.

### activateFeature

Same as `useVariation`, but it will also bubble an activation event up to the SDK for tracking purposes.

This should ideally be called only once per feature, and only when we know the feature has been exposed to the user.

### useSdk

Get the SDK instance.

## Optimization

Given the nature of components in Vue.js, they can re-render many times. You are advised to minimize the number of calls to the Featurevisor SDK in your components by using memoization techniques.

## Example repository

You can find a fully functional example of a Vue.js application using Featurevisor SDK here: [https://github.com/featurevisor/featurevisor-example-vue](https://github.com/featurevisor/featurevisor-example-vue).

{% callout type="note" title="Help wanted with tests" %}
We are looking for help with writing tests for this package. If you are interested, please take a look [here](https://github.com/featurevisor/featurevisor/tree/main/packages/vue).
{% /callout %}

---
title: Example project - example-1
---

# example-1

## attributes

```yml {% path="attributes/age.yml" %}
description: User's age
type: integer
```

```yml {% path="attributes/browser.yml" %}
description: browser
type: object

properties:
  name:
    type: string
  version:
    type: semver
```

```yml {% path="attributes/continent.yml" %}
description: continent name
type: string
```

```yml {% path="attributes/country.yml" %}
archived: false
description: country code in lower case (two lettered)
type: string
```

```yml {% path="attributes/date.yml" %}
archived: false
description: current date passed as ISO 8601 string or Date object
type: string
```

```yml {% path="attributes/device.yml" %}
archived: false
description: device type
type: string
```

```yml {% path="attributes/deviceId.yml" %}
description: Device ID
type: string
```

```yml {% path="attributes/loggedIn.yml" %}
archived: false
description: is the user already logged in?
type: boolean ``` ```yml {% path="attributes/phone.yml" %} description: phone number type: string ``` ```yml {% path="attributes/userId.yml" %} archived: false description: User ID type: string ``` ```yml {% path="attributes/version.yml" %} description: Version number of the app type: string # as semver ``` ## segments ```yml {% path="segments/adult.yml" %} description: Adult users who are 18 years or older conditions: - attribute: age operator: greaterThanOrEquals value: 18 ``` ```yml {% path="segments/blackFridayWeekend.yml" %} archived: false description: black friday weekend conditions: and: - attribute: date operator: after value: 2023-11-24T00:00:00Z - attribute: date operator: before value: 2023-11-27T00:00:00Z ``` ```yml {% path="segments/chrome.yml" %} description: Chrome browser conditions: - attribute: browser.name operator: matches value: "chrome|chromium" regexFlags: i ``` ```yml {% path="segments/countries/germany.yml" %} archived: false description: users from Germany conditions: and: - attribute: country operator: equals value: de ``` ```yml {% path="segments/countries/netherlands.yml" %} archived: false description: The Netherlands conditions: - attribute: country operator: equals value: nl ``` ```yml {% path="segments/countries/switzerland.yml" %} archived: false description: users from Switzerland conditions: and: - attribute: country operator: equals value: ch ``` ```yml {% path="segments/desktop.yml" %} description: desktop users conditions: - attribute: device operator: equals value: desktop ``` ```yml {% path="segments/eu.yml" %} description: EU conditions: - attribute: continent operator: equals value: europe - attribute: country operator: notIn value: - gb ``` ```yml {% path="segments/everyone.yml" %} description: Everyone conditions: "*" ``` ```yml {% path="segments/firefox.yml" %} description: Firefox browser conditions: - attribute: browser.name operator: equals value: firefox ``` ```yml {% path="segments/mobile.yml" %} archived: false description: mobile users conditions: and: - attribute: device operator: equals value: mobile - attribute: phone operator: notExists ``` ```yml {% path="segments/notChromeV1.yml" %} description: Not Chrome v1 browser conditions: - not: - attribute: browser.name operator: equals value: chrome - attribute: browser.version operator: equals value: "1.0" ``` ```yml {% path="segments/qa.yml" %} description: for testing force API in features conditions: - attribute: userId operator: in value: - "user-1" - "user-2" ``` ```yml {% path="segments/unknownDevice.yml" %} archived: false description: users with unknown device conditions: - attribute: device operator: equals value: null ``` ```yml {% path="segments/version_5.5.yml" %} description: Version equals to 5.5 conditions: - or: - attribute: version operator: equals value: 5.5 - attribute: version operator: equals value: "5.5" ``` ```yml {% path="segments/version_gt5.yml" %} description: Version greater than 5 conditions: - attribute: version operator: semverGreaterThan value: 5.0.0 ``` ## features ```yml {% path="features/allowSignup.yml" %} description: Allow signup tags: - all bucketBy: deviceId variablesSchema: allowRegularSignUp: type: boolean defaultValue: true allowGoogleSignUp: type: boolean defaultValue: false allowGitHubSignUp: type: boolean defaultValue: false variations: - value: control weight: 50 - value: treatment weight: 50 variables: allowGoogleSignUp: true allowGitHubSignUp: true rules: staging: - key: "1" segments: "*" percentage: 100 production: - key: nl segments: - 
countries/netherlands percentage: 100 variation: treatment - key: ch segments: - countries/switzerland percentage: 100 variationWeights: control: 10 treatment: 90 - key: everyone segments: everyone percentage: 100 ``` ```yml {% path="features/bar.yml" %} description: Example with object variable type tags: - all bucketBy: userId variablesSchema: color: type: string defaultValue: "red" hero: type: object defaultValue: title: Hero Title subtitle: Hero Subtitle alignment: center variations: - value: control weight: 33 - value: b weight: 33 variables: hero: title: Hero Title for B subtitle: Hero Subtitle for B alignment: center for B variableOverrides: hero: - segments: or: - countries/germany - countries/switzerland value: title: Hero Title for B in DE or CH subtitle: Hero Subtitle for B in DE of CH alignment: center for B in DE or CH - value: c weight: 34 rules: staging: - key: "1" segments: "*" percentage: 50 production: - key: "1" segments: "*" percentage: 50 ``` ```yml {% path="features/baz.yml" %} description: Classic on/off switch tags: - all bucketBy: or: - userId - device rules: staging: - key: "1" segments: "*" percentage: 100 production: - key: "1" segments: "*" percentage: 80 expose: production: true ``` ```yml {% path="features/cache.yml" %} description: for testing child instances tags: - all bucketBy: deviceId rules: staging: - key: "everyone" segments: "*" percentage: 100 production: - key: "netherlands" segments: "countries/netherlands" percentage: 0 - key: "everyone" segments: "*" percentage: 100 ``` ```yml {% path="features/checkout/page.yml" %} description: Testing variables without having any variations tags: - all bucketBy: userId variablesSchema: showPayments: type: boolean defaultValue: false showShipping: type: boolean defaultValue: false paymentMethods: type: array defaultValue: - visa - mastercard rules: staging: - key: "1" segments: "*" percentage: 100 production: - key: "1" segments: countries/netherlands percentage: 100 variables: paymentMethods: - ideal - paypal - key: "2" segments: countries/germany percentage: 100 variables: paymentMethods: - sofort - paypal - key: "ch" segments: countries/switzerland percentage: 0 - key: "3" segments: "*" percentage: 100 variables: showPayments: true showShipping: true paymentMethods: - visa - mastercard - paypal ``` ```yml {% path="features/discount.yml" %} description: Enable discount in checkout flow tags: - all - checkout bucketBy: userId required: - sidebar rules: staging: - key: "1" segments: "*" percentage: 100 production: - key: "2" description: "Black Friday Weekend rule here" segments: - blackFridayWeekend percentage: 100 - key: "1" segments: "*" percentage: 0 ``` ```yml {% path="features/foo.yml" %} archived: false description: blah tags: - all - sign-in - sign-up bucketBy: userId variablesSchema: bar: type: string defaultValue: "" baz: type: string defaultValue: "" qux: type: boolean defaultValue: false description: This is a boolean variable variations: - value: control weight: 50 - value: treatment weight: 50 variables: bar: bar_here baz: baz_here variableOverrides: bar: - segments: or: - countries/germany - countries/switzerland value: bar for DE or CH baz: - segments: countries/netherlands value: baz for NL force: staging: - conditions: - attribute: userId operator: equals value: "test-force-id" variation: treatment production: - conditions: and: - attribute: userId operator: equals value: "123" - attribute: device operator: equals value: "mobile" variation: treatment variables: bar: yoooooo rules: staging: - 
key: "1" segments: "*" percentage: 100 production: - key: "1" segments: and: - mobile - or: - countries/germany - countries/switzerland percentage: 80 variables: qux: true - key: "2" segments: "*" percentage: 50 ``` ```yml {% path="features/footer.yml" %} description: Checks `not` operator in segments tags: - all bucketBy: userId rules: staging: - key: "1" segments: - not: - version_5.5 percentage: 100 - key: "2" segments: "*" percentage: 0 production: - key: "1" segments: "*" percentage: 80 ``` ```yml {% path="features/hidden.yml" %} description: Classic on/off switch, that's not exposed in production tags: - all bucketBy: userId rules: staging: - key: "1" segments: "*" percentage: 100 production: - key: "1" segments: "*" percentage: 80 expose: production: false ``` ```yml {% path="features/newRedesign.yml" %} description: Test forced with a variation, and variable overrides in it, without any active rolled out rule tags: - all bucketBy: userId variablesSchema: foo: type: string defaultValue: "default foo" bar: type: string defaultValue: "default bar" deprecated: true variations: - value: control weight: 50 - value: treatment weight: 50 variables: foo: foo for treatment bar: bar for treatment variableOverrides: foo: - segments: or: - countries/germany - countries/switzerland value: foo for treatment in DE or CH bar: - segments: countries/netherlands value: bar for treatment in NL force: staging: - conditions: - attribute: userId operator: equals value: "test-force-id" enabled: true variation: treatment rules: staging: - key: "1" segments: qa percentage: 0 # # intentionally commented to prove it fails before fixing SDK # - key: "1" # segments: "*" # percentage: 0 production: - key: "1" segments: "*" percentage: 100 ``` ```yml {% path="features/pricing.yml" %} description: Testing two variations with first one having weight of 0 tags: - checkout bucketBy: userId disabledVariationValue: control variations: - value: control weight: 0 - value: treatment weight: 100 rules: staging: - key: "1" segments: "*" percentage: 100 production: - key: "1" segments: countries/germany percentage: 100 - key: "2" segments: "*" percentage: 0 ``` ```yml {% path="features/qux.yml" %} description: Variations with weights having decimal places tags: - all bucketBy: userId variablesSchema: fooConfig: type: json defaultValue: '{"foo": "bar"}' variations: - value: control weight: 33.34 - value: b weight: 33.33 variables: fooConfig: '{"foo": "bar b"}' - value: c weight: 33.33 rules: staging: - key: "1" segments: "*" percentage: 50 production: - key: "1" segments: - countries/germany percentage: 50 variation: b - key: "2" segments: "*" percentage: 50 ``` ```yml {% path="features/redesign.yml" %} description: Enable new design tags: - all bucketBy: userId rules: staging: - key: "1" segments: "*" percentage: 100 production: - key: "1" segments: countries/netherlands percentage: 100 - key: "2" segments: "*" percentage: 0 ``` ```yml {% path="features/showBanner.yml" %} description: for testing expose property with tags tags: - all - checkout bucketBy: userId force: staging: - segments: qa enabled: true - conditions: - attribute: userId operator: equals value: user-3 enabled: true rules: staging: - key: "1" segments: "*" percentage: 0 production: - key: "1" segments: "*" percentage: 0 expose: staging: # this means, this feature: # - will exist in datafiles/staging/featurevisor-tag-checkout.json only # - not exist in datafiles/staging/featurevisor-tag-all.json - checkout # See README for additional test scripts for verifying 
the datafile generation ``` ```yml {% path="features/showHeader.yml" %} description: For testing wrong semver parsing tags: - all bucketBy: - userId rules: staging: - key: "1" segments: "*" percentage: 100 production: - key: "desktop" segments: - version_gt5 - desktop percentage: 100 - key: "mobile" segments: - mobile percentage: 100 - key: "all" segments: "*" percentage: 0 ``` ```yml {% path="features/showNotification.yml" %} description: Classic on/off switch tags: - all bucketBy: userId rules: staging: - key: "1" segments: "*" percentage: 100 production: - key: "1" segments: "*" percentage: 0 ``` ```yml {% path="features/showPopup.yml" %} description: for testing force API in features tags: - all bucketBy: userId force: staging: - segments: qa enabled: true - conditions: - attribute: userId operator: equals value: user-3 enabled: true rules: staging: - key: "1" segments: "*" percentage: 0 production: - key: "1" segments: "*" percentage: 0 expose: production: false ``` ```yml {% path="features/sidebar.yml" %} description: Show sidebar or not tags: - all bucketBy: userId variablesSchema: position: type: string description: position of the sidebar defaultValue: left color: type: string defaultValue: red sections: type: array defaultValue: [] title: type: string defaultValue: "Sidebar Title" title2: type: string defaultValue: "Sidebar Title 2" title3: type: string defaultValue: "Sidebar Title 3" title4: type: string defaultValue: "Sidebar Title 4" title5: type: string defaultValue: "Sidebar Title 5" title6: type: string defaultValue: "Sidebar Title 6" title7: type: string defaultValue: "Sidebar Title 7" variations: - value: control weight: 10 - value: treatment weight: 90 variables: position: right color: red sections: ["home", "about", "contact"] variableOverrides: color: - segments: - countries/germany value: yellow - segments: - countries/switzerland value: white sections: - segments: - countries/germany value: ["home", "about", "contact", "imprint"] - segments: - countries/netherlands value: ["home", "about", "contact", "bitterballen"] rules: staging: - key: "1" segments: "*" percentage: 100 production: - key: "1" segments: "*" percentage: 100 variables: title: Sidebar Title for production expose: staging: - all ``` ```yml {% path="features/testDisabled.yml" %} description: For testing variables when feature itself is disabled tags: - all bucketBy: userId variablesSchema: foo: type: string defaultValue: foo value bar: type: string defaultValue: bar value useDefaultWhenDisabled: true baz: type: string defaultValue: baz value disabledValue: baz value when feature is disabled variations: - value: control weight: 50 - value: treatment weight: 50 rules: staging: - key: "1" segments: "*" percentage: 100 production: - key: "1" segments: "*" percentage: 0 ``` ## tests ```yml {% path="tests/features/allowSignup.spec.yml" %} feature: allowSignup assertions: ## # DE # - matrix: at: [40] description: "DE at ${{ at }}% should have control variation" at: ${{ at }} environment: production context: country: de expectedToBeEnabled: true expectedVariation: control expectedVariables: allowRegularSignUp: true allowGoogleSignUp: false allowGitHubSignUp: false expectedEvaluations: flag: reason: rule ruleKey: everyone bucketValue: 40000 variation: reason: allocated ruleKey: everyone variables: allowRegularSignUp: reason: variable_default allowGoogleSignUp: reason: variable_default - matrix: at: [60] description: "DE at ${{ at }}% should have treatment variation" at: ${{ at }} environment: production context: 
country: de expectedToBeEnabled: true expectedVariation: treatment expectedVariables: allowRegularSignUp: true allowGoogleSignUp: true allowGitHubSignUp: true ## # NL # - matrix: at: [40, 60] description: "NL at ${{ at }}% should have treatment variation" at: ${{ at }} environment: production context: country: nl expectedToBeEnabled: true expectedVariation: treatment expectedVariables: allowRegularSignUp: true allowGoogleSignUp: true allowGitHubSignUp: true ## # CH # - matrix: at: [5, 8] description: "CH at ${{ at }}% should have control variation" at: ${{ at }} environment: production context: country: ch expectedToBeEnabled: true expectedVariation: control expectedVariables: allowRegularSignUp: true allowGoogleSignUp: false allowGitHubSignUp: false - matrix: at: [15, 20, 40, 50, 60, 80, 100] description: "CH at ${{ at }}% should have treatment variation" at: ${{ at }} environment: production context: country: ch expectedToBeEnabled: true expectedVariation: treatment expectedVariables: allowRegularSignUp: true allowGoogleSignUp: true allowGitHubSignUp: true ``` ```yml {% path="tests/features/bar.spec.yml" %} feature: bar assertions: - at: 15 # 30 * 0.5 environment: staging context: country: us expectedToBeEnabled: true expectedVariation: control expectedVariables: color: red hero: title: Hero Title subtitle: Hero Subtitle alignment: center - at: 20 # 40 * 0.5 environment: staging context: country: us expectedToBeEnabled: true expectedVariation: b expectedVariables: color: red hero: title: Hero Title for B subtitle: Hero Subtitle for B alignment: center for B - at: 20 # 40 * 0.5 environment: staging context: country: de expectedToBeEnabled: true expectedVariation: b expectedVariables: color: red hero: title: Hero Title for B in DE or CH subtitle: Hero Subtitle for B in DE of CH alignment: center for B in DE or CH ``` ```yml {% path="tests/features/baz.spec.yml" %} feature: baz assertions: - at: 10 description: "At 10%, the feature should be enabled" environment: production context: country: nl expectedToBeEnabled: true - at: 70 description: "At 70%, the feature should be enabled" environment: production context: country: nl expectedToBeEnabled: true - at: 90 description: "At 90%, the feature should be disabled" environment: production context: country: nl expectedToBeEnabled: false - matrix: environment: [production] at: [85, 90] country: [nl] city: [amsterdam, utrecht] at: ${{ at }} description: "At ${{ at }} in country ${{ country }} in city ${{ city }}, the feature should be disabled" environment: ${{ environment }} context: country: ${{ country }} city: ${{ city }} expectedToBeEnabled: false ``` ```yml {% path="tests/features/cache.spec.yml" %} feature: cache assertions: - at: 10 environment: production context: {} expectedToBeEnabled: true children: - context: country: "nl" expectedToBeEnabled: false - at: 10 environment: production context: country: de expectedToBeEnabled: true - at: 10 environment: production context: country: nl expectedToBeEnabled: false ``` ```yml {% path="tests/features/checkout/page.spec.yml" %} feature: checkout/page assertions: ## # Everyone # - at: 60 environment: production context: country: us expectedToBeEnabled: true expectedVariables: showPayments: true showShipping: true paymentMethods: - visa - mastercard - paypal expectedEvaluations: flag: ruleKey: "3" variables: showPayments: ruleKey: "3" ## # NL # - at: 60 environment: production context: country: nl expectedToBeEnabled: true expectedVariables: showPayments: false showShipping: false 
paymentMethods: - ideal - paypal expectedEvaluations: flag: ruleKey: "1" ## # CH # - at: 80 environment: production context: country: ch expectedToBeEnabled: false expectedEvaluations: flag: ruleKey: ch ## # DE # - at: 80 environment: production context: country: de expectedToBeEnabled: true expectedVariables: showPayments: false showShipping: false paymentMethods: - sofort - paypal ``` ```yml {% path="tests/features/discount.spec.yml" %} feature: discount assertions: - at: 10 description: "At 10%, the feature should be disabled on 1st January 2023" environment: production context: date: 2023-01-01T00:00:00Z expectedToBeEnabled: false - at: 70 description: "At 70%, the feature should be disabled on 25th December 2023" environment: production context: date: 2023-12-25T00:00:00Z expectedToBeEnabled: false - at: 90 description: "At 90%, the feature should be enabled on 25th November 2023" environment: production context: date: 2023-11-25T00:00:00Z expectedToBeEnabled: true ``` ```yml {% path="tests/features/foo.spec.yml" %} feature: foo assertions: - at: 40 environment: staging context: country: de expectedToBeEnabled: true expectedVariation: control - at: 60 environment: staging context: country: ch expectedToBeEnabled: true expectedVariation: treatment - at: 60 environment: staging context: country: us expectedToBeEnabled: true expectedVariation: treatment - at: 60 environment: staging context: country: us expectedToBeEnabled: true expectedVariation: treatment expectedVariables: bar: bar_here baz: baz_here - at: 80 environment: staging context: country: de expectedToBeEnabled: true expectedVariation: treatment expectedVariables: bar: bar for DE or CH baz: baz_here - at: 20 environment: staging context: country: de userId: test-force-id expectedToBeEnabled: true expectedVariation: treatment expectedVariables: bar: bar for DE or CH baz: baz_here - at: 70 environment: staging context: country: nl expectedToBeEnabled: true expectedVariation: treatment expectedVariables: bar: bar_here baz: baz for NL ## # Testing `at` with last value of range # - at: 100 environment: staging context: country: nl expectedToBeEnabled: true - at: 49.999 environment: staging context: country: nl expectedToBeEnabled: true expectedVariation: control - at: 50 environment: staging context: country: nl expectedToBeEnabled: true expectedVariation: control - at: 50.001 environment: staging context: country: nl expectedToBeEnabled: true expectedVariation: treatment - at: 99.999 environment: staging context: country: nl expectedToBeEnabled: true expectedVariation: treatment - at: 100 environment: staging context: country: nl expectedToBeEnabled: true expectedVariation: treatment ``` ```yml {% path="tests/features/footer.spec.yml" %} feature: footer assertions: - at: 40 environment: staging context: version: 4 expectedToBeEnabled: true - at: 40 environment: staging context: version: 5.4 expectedToBeEnabled: true - at: 40 environment: staging context: version: "5" expectedToBeEnabled: true - at: 40 environment: staging context: version: 5.5 expectedToBeEnabled: false - at: 40 environment: staging context: version: "5.5" expectedToBeEnabled: false ``` ```yml {% path="tests/features/newRedesign.spec.yml" %} feature: newRedesign assertions: ## # production # - at: 40 environment: production context: country: us expectedToBeEnabled: true expectedVariation: control expectedVariables: foo: default foo bar: default bar - at: 60 environment: production context: country: us expectedToBeEnabled: true expectedVariation: treatment 
expectedVariables: foo: foo for treatment bar: bar for treatment - at: 60 environment: production context: country: de expectedToBeEnabled: true expectedVariation: treatment expectedVariables: foo: foo for treatment in DE or CH bar: bar for treatment ## # staging # - at: 40 environment: staging context: userId: someone-else country: us expectedToBeEnabled: false expectedVariables: foo: null - at: 40 environment: staging context: userId: test-force-id country: us expectedToBeEnabled: true expectedVariation: treatment expectedVariables: foo: foo for treatment bar: bar for treatment - at: 40 environment: staging context: userId: test-force-id country: de expectedToBeEnabled: true expectedVariation: treatment expectedVariables: foo: foo for treatment in DE or CH bar: bar for treatment - at: 40 environment: staging context: userId: test-force-id country: nl expectedToBeEnabled: true expectedVariation: treatment expectedVariables: foo: foo for treatment bar: bar for treatment in NL ``` ```yml {% path="tests/features/pricing.spec.yml" %} feature: pricing assertions: - at: 5 environment: staging context: country: nl expectedToBeEnabled: true expectedVariation: treatment - at: 10 environment: production context: country: nl expectedToBeEnabled: false expectedVariation: control - at: 20 environment: production context: country: de expectedToBeEnabled: true expectedVariation: treatment ``` ```yml {% path="tests/features/qux.spec.yml" %} feature: qux assertions: - at: 66.5 # (33.33 * 0.5) + 50 environment: production context: country: nl expectedToBeEnabled: true expectedVariation: control expectedVariables: fooConfig: foo: bar - at: 66.665 # (33.33 * 0.5) + 50 environment: production context: country: nl expectedToBeEnabled: true expectedVariation: control - at: 66.67 # (33.34 * 0.5) + 50 environment: production context: country: nl expectedToBeEnabled: true expectedVariation: control - at: 66.675 # (33.35 * 0.5) + 50 environment: production context: country: nl expectedToBeEnabled: true expectedVariation: b expectedVariables: fooConfig: '{ "foo": "bar b" }' # stringified - at: 67 # (42 * 0.5) + 50 environment: production context: country: nl expectedToBeEnabled: true expectedVariation: b - at: 83 # (66 * 0.5) + 50 environment: production context: country: nl expectedToBeEnabled: true expectedVariation: b - at: 83.5 # (67 * 0.5) + 50 environment: production context: country: nl expectedToBeEnabled: true expectedVariation: c - at: 95 # (90 * 0.5) + 50 environment: production context: country: de expectedToBeEnabled: true expectedVariation: b expectedVariables: fooConfig: '{ "foo": "bar b" }' # stringified - at: 55 # (10 * 0.5) + 50 environment: production context: country: de expectedToBeEnabled: true expectedVariation: b expectedVariables: fooConfig: '{ "foo": "bar b" }' # stringified ``` ```yml {% path="tests/features/redesign.spec.yml" %} feature: redesign assertions: - at: 40 environment: production context: country: nl expectedToBeEnabled: true - at: 40 environment: production context: country: de expectedToBeEnabled: false - at: 40 environment: production sticky: redesign: enabled: true context: country: de expectedToBeEnabled: true ``` ```yml {% path="tests/features/showHeader.spec.yml" %} feature: showHeader assertions: - description: "should be disabled for desktop users below v5" at: 80 environment: production context: device: desktop version: 1.2.3 expectedToBeEnabled: false - description: "should be enabled for desktop users above v5" at: 80 environment: production context: device: desktop 
version: 5.5.0 expectedToBeEnabled: true - description: "should be enabled for mobile users when passing valid semver" at: 80 environment: production context: device: mobile version: 1.2.3 expectedToBeEnabled: true - description: "should be enabled for mobile users when passing invalid semver" at: 80 environment: production context: device: mobile version: 7.0.A101.99gbm.lg expectedToBeEnabled: true - description: "should be disabled for tablets" at: 80 environment: production context: device: tablet expectedToBeEnabled: false ``` ```yml {% path="tests/features/showNotification.spec.yml" %} feature: showNotification assertions: - matrix: at: [0, 50, 100] at: ${{ at }} description: "At ${{ at }}% in staging, the feature should be enabled" environment: staging context: country: nl expectedToBeEnabled: true - matrix: at: [0, 50, 100] at: ${{ at }} description: "At ${{ at }}% in production, the feature should be disabled" environment: production context: country: nl expectedToBeEnabled: false ``` ```yml {% path="tests/features/showPopup.spec.yml" %} feature: showPopup assertions: - at: 40 environment: staging context: userId: "user-1" expectedToBeEnabled: true - at: 40 environment: staging context: userId: "user-2" expectedToBeEnabled: true - at: 40 environment: staging context: userId: "user-3" expectedToBeEnabled: true - at: 40 environment: staging context: userId: "user-6" expectedToBeEnabled: false ``` ```yml {% path="tests/features/sidebar.spec.yml" %} feature: sidebar assertions: - at: 5 environment: production context: country: nl expectedToBeEnabled: true expectedVariation: control - at: 90 environment: production context: country: nl expectedToBeEnabled: true expectedVariation: treatment - at: 90 environment: production context: country: nl expectedToBeEnabled: true expectedVariation: treatment expectedVariables: position: right color: red - at: 90 environment: production context: country: de expectedToBeEnabled: true expectedVariation: treatment expectedVariables: position: right color: yellow title: Sidebar Title for production - at: 90 environment: production context: country: us expectedToBeEnabled: true expectedVariation: treatment expectedVariables: sections: ["home", "about", "contact"] - at: 90 environment: production context: country: de expectedToBeEnabled: true expectedVariation: treatment expectedVariables: sections: ["home", "about", "contact", "imprint"] - at: 70 environment: production context: country: nl userId: "123" expectedToBeEnabled: true expectedVariation: treatment expectedVariables: sections: ["home", "about", "contact", "bitterballen"] ``` ```yml {% path="tests/features/testDisabled.spec.yml" %} feature: testDisabled assertions: ## # Staging enabled # - at: 10 environment: staging expectedToBeEnabled: true expectedVariation: control expectedVariables: foo: foo value bar: bar value baz: baz value ## # Production disabled # - at: 10 environment: production expectedToBeEnabled: false expectedVariation: null expectedVariables: foo: null bar: bar value # useDefaultWhenDisabled baz: baz value when feature is disabled # disabledValue - at: 10 environment: production defaultVariationValue: treatment defaultVariableValues: # this is being tested here foo: "default foo value" expectedToBeEnabled: false expectedVariation: treatment expectedVariables: foo: default foo value # because it was set in defaultVariableValues ``` ```yml {% path="tests/segments/countries/germany.spec.yml" %} segment: countries/germany assertions: - context: country: de expectedToMatch: true - 
context: country: de someOtherAttribute: someOtherValue expectedToMatch: true - context: country: notDe expectedToMatch: false - matrix: country: [nl] city: [amsterdam, utrecht] description: Testing in country ${{ country }} in city ${{ city }} context: country: ${{ country }} city: ${{ city }} expectedToMatch: false ``` ```yml {% path="tests/segments/eu.spec.yml" %} segment: eu assertions: - description: continent is europe, with no country passed should not match context: continent: europe expectedToMatch: false - description: continent is asia, so not matching early context: continent: "asia" expectedToMatch: false - description: continent is europe, country is nl, should match context: continent: europe country: nl expectedToMatch: true - description: continent is europe, country is gb, which is known to be not in EU, therefore should not match context: continent: europe country: gb expectedToMatch: false # passing unexpected values in `country` - context: continent: europe country: a: a b: b expectedToMatch: false - context: continent: europe country: [a, b, c] expectedToMatch: false - context: continent: europe country: 100 expectedToMatch: true - context: continent: europe country: null expectedToMatch: true ``` ```yml {% path="tests/segments/everyone.spec.yml" %} segment: everyone assertions: - context: country: nl expectedToMatch: true ``` ```yml {% path="tests/segments/firefox.spec.yml" %} segment: firefox assertions: ## # Match # - context: browser: name: firefox version: 100.0 expectedToMatch: true ## # Not match # - context: browser: name: chrome version: 100.0 expectedToMatch: false - context: browser: type: chrome expectedToMatch: false - context: browser: firefox expectedToMatch: false ``` ```yml {% path="tests/segments/notChromeV1.spec.yml" %} segment: notChromeV1 assertions: ## # Match # - context: browser: name: chrome expectedToMatch: true - context: browser: name: chrome version: "2.0" expectedToMatch: true - context: browser: name: firefox version: "1.0" expectedToMatch: true ## # Not match # - context: browser: name: chrome version: "1.0" expectedToMatch: false ``` ```yml {% path="tests/segments/unknownDevice.spec.yml" %} segment: unknownDevice assertions: - context: device: iphone expectedToMatch: false - context: device: "" expectedToMatch: false - context: device: null expectedToMatch: true ``` --- title: Example project - example-json --- # example-json ## attributes ```json {% path="attributes/country.json" %} { "description": "country code in lower case (two lettered)", "type": "string" } ``` ```json {% path="attributes/deviceId.json" %} { "description": "Device ID", "type": "string" } ``` ```json {% path="attributes/userId.json" %} { "description": "User ID", "type": "string" } ``` ## segments ```json {% path="segments/netherlands.json" %} { "description": "The Netherlands", "conditions": [ { "attribute": "country", "operator": "equals", "value": "nl" } ] } ``` ## features ```json {% path="features/showCookieBanner.json" %} { "description": "Show cookie banner to users from the Netherlands", "tags": ["all"], "bucketBy": "userId", "rules": { "staging": [ { "key": "everyone", "segments": "*", "percentage": 100 } ], "production": [ { "key": "nl", "segments": ["netherlands"], "percentage": 100 }, { "key": "everyone", "segments": "*", "percentage": 0 } ] } } ``` ## tests ```json {% path="tests/netherlands.segment.json" %} { "segment": "netherlands", "assertions": [ { "context": { "country": "nl" }, "expectedToMatch": true }, { "context": { "country": "de", 
"someOtherAttribute": "someOtherValue" }, "expectedToMatch": false }, { "context": { "country": "notNl" }, "expectedToMatch": false } ] } ``` ```json {% path="tests/showCookieBanner.feature.json" %} { "feature": "showCookieBanner", "assertions": [ { "at": 10, "description": "At 10%, the feature should be enabled for NL", "environment": "production", "context": { "country": "nl" }, "expectedToBeEnabled": true }, { "at": 70, "description": "At 70%, the feature should be enabled for NL", "environment": "production", "context": { "country": "nl" }, "expectedToBeEnabled": true }, { "at": 90, "description": "At 90%, the feature should be disabled for US", "environment": "production", "context": { "country": "us" }, "expectedToBeEnabled": false }, { "at": 90, "description": "At 90%, the feature should be disabled for Canada", "environment": "production", "context": { "country": "ca" }, "expectedToBeEnabled": false } ] } ``` --- title: Example project - example-toml --- # example-toml ## attributes ```toml {% path="attributes/country.toml" %} description = "country code in lower case (two lettered)" type = "string" ``` ```toml {% path="attributes/deviceId.toml" %} description = "Device ID" type = "string" ``` ```toml {% path="attributes/userId.toml" %} description = "User ID" type = "string" ``` ## segments ```toml {% path="segments/netherlands.toml" %} description = "The Netherlands" [[conditions]] attribute = "country" operator = "equals" value = "nl" ``` ## features ```toml {% path="features/showCookieBanner.toml" %} description = "Show cookie banner to users from the Netherlands" tags = [ "all" ] bucketBy = "userId" [[rules.staging]] key = "everyone" segments = "*" percentage = 100 [[rules.production]] key = "nl" segments = [ "netherlands" ] percentage = 100 [[rules.production]] key = "everyone" segments = "*" percentage = 0 ``` ## tests ```toml {% path="tests/netherlands.segment.toml" %} segment = "netherlands" [[assertions]] expectedToMatch = true [assertions.context] country = "nl" [[assertions]] expectedToMatch = false [assertions.context] country = "de" someOtherAttribute = "someOtherValue" [[assertions]] expectedToMatch = false [assertions.context] country = "notNl" ``` ```toml {% path="tests/showCookieBanner.feature.toml" %} feature = "showCookieBanner" [[assertions]] at = 10 description = "At 10%, the feature should be enabled for NL" environment = "production" expectedToBeEnabled = true [assertions.context] country = "nl" [[assertions]] at = 70 description = "At 70%, the feature should be enabled for NL" environment = "production" expectedToBeEnabled = true [assertions.context] country = "nl" [[assertions]] at = 90 description = "At 90%, the feature should be disabled for US" environment = "production" expectedToBeEnabled = false [assertions.context] country = "us" [[assertions]] at = 90 description = "At 90%, the feature should be disabled for Canada" environment = "production" expectedToBeEnabled = false [assertions.context] country = "ca" ``` --- title: Example project - example-yml-no-envs --- # example-yml-no-envs ## attributes ```yml {% path="attributes/country.yml" %} description: country code in lower case (two lettered) type: string ``` ```yml {% path="attributes/deviceId.yml" %} description: Device ID type: string ``` ```yml {% path="attributes/userId.yml" %} description: User ID type: string ``` ## segments ```yml {% path="segments/netherlands.yml" %} description: The Netherlands conditions: - attribute: country operator: equals value: nl ``` ## features ```yml {% 
path="features/showCookieBanner.yml" %} description: Show cookie banner to users from the Netherlands tags: - all bucketBy: userId force: - conditions: - attribute: userId operator: equals value: "123" enabled: true rules: - key: nl segments: - netherlands percentage: 100 - key: everyone segments: "*" # everyone else percentage: 0 ``` ## tests ```yml {% path="tests/netherlands.segment.yml" %} segment: netherlands assertions: - context: country: nl expectedToMatch: true - context: country: de someOtherAttribute: someOtherValue expectedToMatch: false - context: country: notNl expectedToMatch: false ``` ```yml {% path="tests/showCookieBanner.feature.yml" %} feature: showCookieBanner assertions: - at: 10 description: "At 10%, the feature should be enabled for NL" context: country: nl expectedToBeEnabled: true - at: 70 description: "At 70%, the feature should be enabled for NL" context: country: nl expectedToBeEnabled: true - at: 90 description: "At 90%, the feature should be disabled for US" context: country: us expectedToBeEnabled: false - at: 90 description: "At 90%, the feature should be disabled for Canada" context: country: ca expectedToBeEnabled: false ``` --- title: Example project - example-yml --- # example-yml ## attributes ```yml {% path="attributes/country.yml" %} description: country code in lower case (two lettered) type: string ``` ```yml {% path="attributes/deviceId.yml" %} description: Device ID type: string ``` ```yml {% path="attributes/userId.yml" %} description: User ID type: string ``` ## segments ```yml {% path="segments/netherlands.yml" %} description: The Netherlands conditions: - attribute: country operator: equals value: nl ``` ## features ```yml {% path="features/showCookieBanner.yml" %} description: Show cookie banner to users from the Netherlands tags: - all bucketBy: userId rules: staging: - key: everyone segments: "*" # enabled for everyone in staging only percentage: 100 production: - key: nl segments: - netherlands # enabled in prod for NL only percentage: 100 - key: everyone segments: "*" # everyone else percentage: 0 ``` ## tests ```yml {% path="tests/features/showCookieBanner.spec.yml" %} feature: showCookieBanner assertions: - at: 10 description: "At 10%, the feature should be enabled for NL" environment: production context: country: nl expectedToBeEnabled: true - at: 70 description: "At 70%, the feature should be enabled for NL" environment: production context: country: nl expectedToBeEnabled: true - at: 90 description: "At 90%, the feature should be disabled for US" environment: production context: country: us expectedToBeEnabled: false - at: 90 description: "At 90%, the feature should be disabled for Canada" environment: production context: country: ca expectedToBeEnabled: false ``` ```yml {% path="tests/segments/netherlands.spec.yml" %} segment: netherlands assertions: - context: country: nl expectedToMatch: true - context: country: de someOtherAttribute: someOtherValue expectedToMatch: false - context: country: notNl expectedToMatch: false ```