feat: Add multi-layer, bidirectional RNN with tanh activation #28804
Open
muzakkirhussain011 wants to merge 6 commits into ivy-llc:main from muzakkirhussain011:patch-4
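For context on the feature named in the PR title, a multi-layer, bidirectional Elman RNN with tanh activation can be sketched in a few lines. This is a minimal NumPy illustration under assumed conventions (weights stored per layer and per direction, each as `(W_ih, W_hh, b)`), not the PR's actual implementation; all names are hypothetical.

```python
import numpy as np

def birnn_tanh(x, weights, num_layers):
    """Run a multi-layer bidirectional RNN with tanh activation.

    x: array of shape (seq_len, input_size).
    weights: weights[layer][direction] = (W_ih, W_hh, b), where
    direction 0 is forward and 1 is backward. Hypothetical layout,
    for illustration only.
    """
    seq_len = x.shape[0]
    layer_input = x
    for layer in range(num_layers):
        outputs = []
        for direction in (0, 1):
            W_ih, W_hh, b = weights[layer][direction]
            h = np.zeros(W_hh.shape[0])
            hs = []
            steps = range(seq_len) if direction == 0 else range(seq_len - 1, -1, -1)
            for t in steps:
                # classic Elman update with tanh nonlinearity
                h = np.tanh(layer_input[t] @ W_ih.T + h @ W_hh.T + b)
                hs.append(h)
            if direction == 1:
                hs.reverse()  # realign backward outputs with time order
            outputs.append(np.stack(hs))
        # forward and backward states are concatenated per time step,
        # so the next layer sees 2 * hidden_size features
        layer_input = np.concatenate(outputs, axis=1)
    return layer_input

# toy shapes: input_size=3, hidden_size=4, num_layers=2
# (layer 1 therefore takes 2 * 4 = 8 input features)
rng = np.random.default_rng(0)
def _params(in_sz, hid):
    return (0.1 * rng.normal(size=(hid, in_sz)),
            0.1 * rng.normal(size=(hid, hid)),
            np.zeros(hid))
weights = [[_params(3, 4), _params(3, 4)], [_params(8, 4), _params(8, 4)]]
out = birnn_tanh(rng.normal(size=(5, 3)), weights, num_layers=2)
print(out.shape)  # (5, 8): seq_len x 2*hidden_size
```

The doubling of the feature dimension between layers is the key detail a bidirectional stack has to get right.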
Commits on Sep 4, 2023
- Commit 32e73bf:

  `func_wrapper.py` is a Python module designed to streamline the integration of Hugging Face Transformers into your natural language processing (NLP) projects. It provides a set of input and output conversion wrappers to simplify passing data between your custom functions and Transformers' data structures.

  Input conversion wrappers:
  - `inputs_to_transformers_tensors`: converts input data (text, tensors, etc.) into Transformers-compatible data structures. It is particularly useful when your custom functions expect diverse input types.

  Output conversion wrappers:
  - `outputs_to_pytorch_tensors`: after your custom function returns data, this wrapper ensures that the output is converted into PyTorch tensors or other appropriate formats.

  Usage:
  1. Import `func_wrapper.py` into your project.
  2. Initialize a Hugging Face Transformers model and tokenizer.
  3. Wrap your custom function with `to_transformers_tensors_and_back`. The wrapped function can now accept and return Transformers-compatible data.

  Here's a simple example of how to use `func_wrapper.py`:

  ```python
  import torch
  from transformers import BertForSequenceClassification, BertTokenizer
  from ivy.functional.frontends.transformers.func_wrapper import to_transformers_tensors_and_back

  # Initialize the model and tokenizer
  model_name = "bert-base-uncased"
  model = BertForSequenceClassification.from_pretrained(model_name)
  tokenizer = BertTokenizer.from_pretrained(model_name)

  # Wrap your custom function using the conversion wrappers
  # (your_function is a placeholder for your own callable)
  wrapped_function = to_transformers_tensors_and_back(your_function, model, tokenizer)

  # Prepare sample input data
  sample_input_text = "This is a sample input text."
  sample_input_tensor = torch.rand((3, 3))

  # Call your wrapped function with the sample input data
  output = wrapped_function(sample_input_text, sample_input_tensor)

  # The output is automatically converted to PyTorch tensors
  print(output)
  ```

  Please note that `func_wrapper.py` is still in development; further enhancements and refinements are expected. Feedback and contributions to improve its functionality are welcome.
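The conversion-wrapper pattern described in the commit message can be sketched roughly as follows. This is a hypothetical, simplified stand-in, not Ivy's actual `func_wrapper.py`; the toy tokenizer and toy function are made up for illustration.

```python
import torch

def to_tensors_and_back(fn, tokenizer):
    """Sketch of a conversion wrapper: tokenize string inputs on the
    way in, coerce everything to torch tensors, and return a tensor.
    Hypothetical API, for illustration only."""
    def wrapped(*args):
        converted = []
        for arg in args:
            if isinstance(arg, str):
                # strings become token-id tensors via the tokenizer
                converted.append(torch.tensor(tokenizer(arg)))
            else:
                converted.append(torch.as_tensor(arg))
        out = fn(*converted)
        return torch.as_tensor(out)
    return wrapped

# toy stand-ins for a real tokenizer and a user-supplied function
toy_tokenizer = lambda s: [ord(c) % 97 for c in s]
sum_all = lambda *ts: sum(t.float().sum() for t in ts)

wrapped = to_tensors_and_back(sum_all, toy_tokenizer)
print(float(wrapped("ab", [[1, 2], [3, 4]])))  # 11.0
```

The point of the pattern is that the inner function only ever sees tensors, so callers can pass raw text or nested lists without manual conversion.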
Commits on Aug 16, 2024
- Commit 485ded2
- Commit cf1ae37
- Commit 5d130b5
Commits on Aug 17, 2024
- Commit e9f4c37
- Commit 281d24d