We're excited to announce that our paper, “Dynamic Label Injection for Imbalanced Industrial Defect Segmentation,” has been accepted to the VISION workshop at #ECCV2024 , hosted by @Apple ! 🎉
Check out our preprint here: arxiv.org/abs/2408.10031 #Apple #industrial #AI
More of a graphic designer than a scientist 😵💫.
Check out designs like the girl in the center sporting the famous image by #stylegan3 and the image by #dalle3 on the upper left... can you guess what's wrong?
More designs in the pipeline! If you have any papers in mind, drop a DM!
[...]
Hello deeple4rners! Excited to share T-Scirt with you all - a collection of t-shirts, mugs, bags, and more inspired by the deep learning world. Dive into famous plots from the #papers we read daily. This idea sparked after completing my PhD; feeling more
[...]
w0ah big news!
Glad that our paper "MIND: Multi-Task Incremental Network Distillation" has been accepted in the main conference track at #AAAI24 , one of the main conferences in the world on artificial intelligence.
arxiv.org/abs/2312.02916
The runner-up paper award came with a $500 prize from the MATRIX project.
"Towards Exemplar-Free Continual Learning in Vision Transformers: an Account of Attention, Functional and Weight Regularization"
I was sleeping, but my co-author Saurav Jha updated me ASAP.
I'm happy to say that we won the runner-up paper award at the CLVISION workshop of #CVPR2022
We are among the first to investigate continual learning in transformers!
link to the paper arxiv.org/abs/2203.13167
Uh oh, it seems that these days are great... I published another paper!! In the #CVPR2022 T4V workshop.
Mmmm and this one is a 𝕊𝕠𝕝𝕠 paper, meaning single authored :D
"Simpler is Better: off-the-shelf Continual Learning through Pretrained Backbones"
arxiv.org/abs/2205.01586
Woah another paper out!
"Smaller Is Better: An Analysis of Instance Quantity/Quality Trade-off in Rehearsal-based Continual Learning"
Accepted at #IJCNN 2022 :)
Here's a preprint :D
arxiv.org/abs/2105.14106
Our paper:
"Towards Exemplar-Free Continual Learning in Vision Transformers: an Account of Attention, Functional and Weight Regularization"
has been accepted to the 2022 CVPR Workshop on Continual Learning!
Here's a preprint!
arxiv.org/abs/2203.13167