Pioneering the Future of Large Language Models
We invite highly motivated professionals to join our team at European Tech Recruit, where we are pushing the boundaries of quantum AI and machine learning. You will contribute to the development of cutting-edge language models and help drive innovation in this rapidly evolving field.
Job Responsibilities:
As a core member of our team, you will:
* Develop and implement novel techniques for compressing large language models using quantum-inspired technologies.
* Collaborate with cross-functional teams to integrate these models into our products, maintaining clear and effective communication throughout.
* Conduct thorough evaluations and benchmarks of model performance, identifying areas for improvement and fine-tuning LLMs for enhanced accuracy, robustness, and efficiency.
* Apply your expertise to assess the strengths and weaknesses of models, propose enhancements, and develop novel solutions to improve performance and efficiency.
* Collaborate with the team to maintain comprehensive documentation of LLM development processes, experiments, and results.
* Participate in code reviews and provide constructive feedback to team members.
Requirements:
* Master's or Ph.D. in Artificial Intelligence, Computer Science, Data Science, or related fields.
* 3+ years of hands-on experience with deep learning models and neural networks, preferably working with Large Language Models and Transformer architectures or computer vision models.
* 1+ years of hands-on experience with LLMs and Transformer models, including excellent command of libraries such as HuggingFace Transformers, Accelerate, and Datasets.
* Solid mathematical foundations and expertise in deep learning algorithms and neural networks, covering both training and inference.
* Excellent problem-solving, debugging, performance analysis, test design, and documentation skills.
* Strong understanding of GPU architectures.
* Excellent programming skills in Python and experience with relevant libraries (PyTorch, HuggingFace, etc.).
* Experience with cloud platforms (ideally AWS), containerization technologies (Docker), and deploying AI solutions in a cloud environment.
* Excellent written and verbal communication skills, with the ability to work collaboratively in a fast-paced team environment and communicate complex ideas effectively.
* Previous research publications in deep learning are a plus.
Spanish Fluency Required
Keywords: Large Language Models / LLM / Machine Learning / AI / Quantum Computing / GPU Architecture / GPGPU / GPU Farms / Multi-GPU / AWS / Kubernetes Clusters / DeepSpeed / SLURM / RAY / Transformer Models / Fine-tuning / Mistral / Llama