Experience Optimizer
Parameter Name | Parameter Value | Update Policy |
---|---|---|
group | true | default |
group.format | grouped | default |
group.ngroups | true | default |
group.field | style_id_s | default |
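The grouping parameters above are standard Solr result-grouping parameters. As a minimal sketch (the query and any base URL are illustrative, not from this document), they can be assembled into a query string like this:

```python
from urllib.parse import urlencode

# Solr result-grouping parameters from the table above.
params = {
    "q": "*:*",                  # illustrative query; any query works
    "group": "true",             # enable result grouping
    "group.format": "grouped",   # return grouped output rather than a flat list
    "group.ngroups": "true",     # include the total number of groups
    "group.field": "style_id_s", # field whose values define the groups
}
query_string = urlencode(params)
```

Appending this query string to a Solr request URL returns one group of documents per distinct `style_id_s` value.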
Add a Query Pipeline Stage
Parameter Name | Parameter Value |
---|---|
expand | true |
enableElevation | true |
group | false |
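As a sketch of what the stage contributes (the stage and pipeline names are not shown here), the table above amounts to the following key/value pairs being added to the outgoing query:

```python
# Parameters added by the query pipeline stage described above.
stage_params = [
    ("expand", "true"),           # expand collapsed result groups
    ("enableElevation", "true"),  # honor elevated (pinned) documents
    ("group", "false"),           # disable result grouping for this request
]
as_dict = dict(stage_params)
```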
Use Experience Optimizer query rewrites
Head/Tail rewrites
For example, results for the tail query benefits enrollment could be improved by using the search term benefits enrollment+year:"2021" (in this case, making use of the year field in the data). Most Head/Tail rewrites are created automatically via machine learning. However, custom rewrites can also be created manually using the following parameters.
Parameter | Description | Example Value |
---|---|---|
Tail Query | The tail query itself. | benefits enrollment |
Improved Query | The query that replaces the tail query phrase. | benefits enrollment+year:"2021" |
Tags | Optional metadata tags used to identify and organize rewrites. | enrollmentpacket |
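A Head/Tail rewrite behaves like a lookup: the tail query is replaced by its configured improved query before the search runs. A minimal sketch, assuming the parameters above (the dictionary and function names are illustrative):

```python
# Configured Head/Tail rewrites: tail query -> improved query.
tail_rewrites = {
    "benefits enrollment": 'benefits enrollment+year:"2021"',
}

def apply_tail_rewrite(query: str) -> str:
    """Return the improved query if one is configured, else the original."""
    return tail_rewrites.get(query, query)
```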
Misspelling corrections
If users frequently misspell questionnaire as questionare, you can set up a query rewrite to correct it automatically.
Parameter | Description | Example Value |
---|---|---|
Misspelled Term | The misspelled phrase itself. | questionare |
Corrected Term | The term that replaces the misspelled term. | questionnaire |
Action | The action to perform. | expand |
Confidence | Confidence score from the phrase job. A confidence level of 1 represents 100% confidence. For rules created automatically via machine learning, the confidence level reflects the output of the model. | 1 |
Tags | Optional metadata tags used to identify and organize rewrites. | enrollmentpacket |
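One plausible reading of the expand action is that the query is broadened to match either spelling rather than silently replacing the user's input. A sketch under that assumption (the rule fields mirror the table above; the function is illustrative):

```python
# A misspelling rule, mirroring the parameters in the table above.
rule = {
    "misspelled": "questionare",
    "corrected": "questionnaire",
    "action": "expand",
}

def apply_spelling_rule(query: str, rule: dict) -> str:
    """Correct or expand a misspelled term in the query."""
    if rule["misspelled"] not in query.split():
        return query
    if rule["action"] == "expand":
        # Keep both spellings so documents matching either are returned.
        return query.replace(
            rule["misspelled"],
            f'({rule["misspelled"]} OR {rule["corrected"]})',
        )
    # Otherwise replace the misspelling outright.
    return query.replace(rule["misspelled"], rule["corrected"])
```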
Phrase detection
Without phrase detection, a search for background check would return results for both background and check. With phrase detection, the search correctly boosts results for "background check".
Parameter | Description | Example Value |
---|---|---|
Surface Form | The phrase itself. | background check |
Word Count | The number of words in the phrase. | 2 |
Confidence | Confidence score from the phrase job. A confidence level of 1 represents 100% confidence. For rules created automatically via machine learning, the confidence level reflects the output of the model. | 1 |
Tags | Optional metadata tags used to identify and organize rewrites. | enrollmentpacket |
Phrase Count | The number of times this phrase was found in the source. This value is set automatically via machine learning and does not need to be set manually. | 5 |
Boost Factor | The factor used to boost this phrase in matching queries. | 2.0 |
Slop Factor | Phrase slop: the maximum distance between the terms of the query that still counts as a phrase match. | 10 |
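In Lucene-style query syntax, slop and boost map directly onto the `~` and `^` operators. A sketch of how the example values above could be rendered as a boosted phrase clause (the helper function is illustrative):

```python
def phrase_clause(surface_form: str, slop: int, boost: float) -> str:
    """Build a Lucene-style phrase clause: slop (~) allows the terms to sit
    up to N positions apart; boost (^) raises matching documents in ranking."""
    return f'"{surface_form}"~{slop}^{boost}'

clause = phrase_clause("background check", slop=10, boost=2.0)
```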
Synonym detection
For example, questionnaire could have the synonyms application and survey.
Parameter | Description | Example Value |
---|---|---|
Surface Form | The term that has synonyms. | questionnaire |
Direction | With a one-way search, the original search term is replaced by its synonyms; in the preceding example, questionnaire would be replaced by application and survey. With a symmetric search, the query is expanded to include both the original term and the synonyms, resulting in a greater number of potential hits; in the example, the query would include questionnaire, application, and survey. | symmetric |
Synonym Mappings | Synonyms for the surface form. | application, survey |
Confidence | Confidence score from the phrase job. A confidence level of 1 represents 100% confidence. For rules created automatically via machine learning, the confidence level reflects the output of the model. | 1 |
Tags | Optional metadata tags used to identify and organize rewrites. | enrollmentpacket |
Count | The number of times this term occurred in the signal data when it was discovered. This value is optional when a rewrite is defined manually. | 5 |
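The Direction parameter above can be sketched as follows: one-way replaces the surface form with its synonyms, while symmetric keeps the original term alongside them (the mapping and function names are illustrative):

```python
# Synonym mappings, mirroring the Surface Form / Synonym Mappings parameters.
synonyms = {"questionnaire": ["application", "survey"]}

def expand_term(term: str, direction: str = "symmetric") -> list:
    """Expand a query term according to the rewrite direction."""
    mapped = synonyms.get(term, [])
    if not mapped:
        return [term]
    if direction == "oneway":
        return mapped          # original term is replaced by its synonyms
    return [term] + mapped     # symmetric: original term is kept as well
```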
Remove words
For example, Fusion can rewrite the query case study examples to remove examples and display results for case study.
Parameter | Description | Example Value |
---|---|---|
Phrase to remove | The words to remove from the trigger phrase. | examples |
Trigger phrases | The query that prompts removal of the phrase. The trigger phrase is not necessarily a complete query; if the query contains the trigger phrase, Fusion removes the phrase in the Phrase to Remove field. | case study examples |
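A "remove words" rewrite can be sketched as a substring check followed by a removal. The rule fields mirror the table above; the function name is illustrative:

```python
# A remove-words rule: trigger phrase and the words to strip from it.
rule = {"trigger": "case study examples", "remove": "examples"}

def apply_remove_rule(query: str, rule: dict) -> str:
    """If the query contains the trigger phrase, remove the configured words."""
    if rule["trigger"] in query:
        # Remove the words and normalize any leftover whitespace.
        return " ".join(query.replace(rule["remove"], "").split())
    return query
```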
Optimizing Zero Results