DavidHerel/semantics-preserving-encoder

About

Semantics Preserving Encoder is a simple, fully supervised sentence embedding technique for textual adversarial attacks.

Setup

This package requires Python 3.6 or newer. To install Semantics Preserving Encoder, run:

pip install spe-encoder

Usage

The package is easy to use and can be integrated into any Python project as follows:

from spe import spe

input_sentences = [
    "The quick brown fox jumps over the lazy dog.",
    "I am a sentence for which I would like to get its embedding"]

output_vectors = spe(input_sentences)

Possible modifications

You can use the default classifiers described in the Semantics Preserving Encoder paper, or extend or replace them with your own classifiers by placing them in the "spe_classifiers" folder, which is created automatically. The script detects these changes on its own (see the sketch below). Note: currently, only fastText classifiers are supported.
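
For example, a custom fastText classifier could be trained and saved into this folder roughly as follows. This is a minimal sketch, not part of the package; the training-file name, label format, and classifier file name below are illustrative assumptions.

import os
import fasttext  # pip install fasttext

# fastText expects one example per line, prefixed with "__label__<class>".
train_file = "my_sentiment_data.txt"  # hypothetical training data

# Train a supervised fastText classifier on the labelled data.
model = fasttext.train_supervised(input=train_file)

# Save the model into the folder that spe scans for classifiers.
os.makedirs("spe_classifiers", exist_ok=True)
model.save_model(os.path.join("spe_classifiers", "my_classifier.bin"))

After the file is in place, the next call to spe should pick up the added classifier automatically.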

You can also set the dimension of the output vectors through an optional second parameter of the 'spe' method; otherwise, the default value of 10 is used.

my_vector_dimensions = 5
output_vectors = spe(input_sentences, my_vector_dimensions)

Citation

Please cite the ECAI paper if you use SemanticsPreservingEncoder in your work:

@article{herel2022preserving,
  title={Preserving Semantics in Textual Adversarial Attacks},
  author={Herel, David and Cisneros, Hugo and Mikolov, Tomas},
  journal={arXiv preprint arXiv:2211.04205},
  year={2022}
}

License

SemanticsPreservingEncoder is MIT licensed. See the LICENSE file for details.
