

An AR+ML prototype that allows cutting elements from your surroundings and pasting them into image editing software.

Although only Photoshop is handled at the moment, it may support different outputs in the future.

⚠️ This is a research prototype and not a consumer / Photoshop user tool.

Update 2020.05.11: If you're looking for an easy-to-use app based on this research, head over to the released consumer app.

## Modules

This prototype runs as 3 independent modules:

- **The mobile app**
  - Check out the /app folder for instructions on how to deploy the app to your mobile.
- **The local server**
  - The interface between the mobile app and Photoshop.
  - It finds the position pointed at on screen by the camera using screenpoint (see the sketch after this list).
  - Check out the /server folder for instructions on configuring the local server.
- **The object detection / background removal service**
  - For now, the salience detection and background removal are delegated to an external service (a hedged request sketch follows below).
  - It would be a lot simpler to use something like DeepLab directly within the mobile app, but that hasn't been implemented in this repo yet.
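Here is a minimal sketch of that matching step, assuming screenpoint exposes a `project(view, screen)` helper that returns the projected x/y screen coordinates (as shown in its own README); the image file names are placeholders for a camera frame and a screenshot of the monitor.

```python
import cv2
import screenpoint

# Grayscale camera frame (what the phone sees) and a screenshot of the screen.
# File names are placeholders for this sketch.
view = cv2.imread("view.png", cv2.IMREAD_GRAYSCALE)
screen = cv2.imread("screen.png", cv2.IMREAD_GRAYSCALE)

# screenpoint matches SIFT features between the two images and returns the
# screen coordinates that the center of the camera view points at.
x, y = screenpoint.project(view, screen)
print(f"Pointing at screen position ({x}, {y})")
```

This is also why the Photoshop document needs some visual texture: with a blank background, SIFT has too few features to match reliably (see step 1 below).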

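For the third module, from the local server's point of view the background removal boils down to one HTTP round trip to the deployed BASNet service. Below is a hedged sketch of such a request; the URL, route, and form field name are assumptions for illustration, not the documented BASNet-HTTP API — check the wrapper's README for the real ones.

```python
import requests

# Placeholder for your deployed BASNet HTTP service URL (see step 2 below).
BASNET_URL = "http://localhost:8080/"

# Send the camera capture and retrieve the salience mask / cutout.
# The "data" field name and the "/" route are assumptions for this sketch.
with open("cut_object.jpg", "rb") as f:
    response = requests.post(BASNET_URL, files={"data": f}, timeout=30)
response.raise_for_status()

# The service is expected to return the mask / cutout as a PNG.
with open("mask.png", "wb") as out:
    out.write(response.content)
```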
## Usage

### 1 - Configure Photoshop

- Go to "Preferences > Plug-ins", enable "Remote Connection" and set a friendly password that you'll need later.
- Make sure that your PS document settings match those in server/src/ps.py, otherwise only an empty layer will be pasted.
- Also make sure that your document has some sort of background. If the background is just blank, SIFT will probably not have enough features to do a correct match.

### 2 - Set up the external salience object detection service

#### Option 1: Set up your own model service (requires a CUDA GPU)

As mentioned above, for the time being, you must deploy the BASNet model (Qin et al., CVPR 2019) as an external HTTP service using this BASNet-HTTP wrapper (requires a CUDA GPU).

- You will need the deployed service URL to configure the local server.
- Make sure to configure a different port if you're running BASNet on the same computer as the local server.

#### Option 2: Use a community-provided endpoint

A public endpoint has been provided by members of the community. This is useful if you don't have your own CUDA GPU or do not want to go through the process of running the service on your own.

Use this endpoint by launching the local server with `--basnet_service_ip`.

### 3 - Configure and run the local server

Follow the instructions in /server to set up & run the local server (a hedged sketch of how the BASNet flag might be wired up follows at the end of this section).

### 4 - Configure and run the mobile app

Follow the instructions in /app to set up & deploy the mobile app.
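To tie steps 2 and 3 together, here is a hedged sketch of how a local server could accept the BASNet endpoint on its command line. Only `--basnet_service_ip` is mentioned in this README; the `--port` flag and the default values are hypothetical, so refer to /server for the actual flags.

```python
import argparse

# Hedged sketch of wiring the BASNet endpoint into a local server's CLI.
parser = argparse.ArgumentParser(description="AR cut & paste local server (sketch)")
parser.add_argument("--basnet_service_ip", required=True,
                    help="URL of the deployed BASNet HTTP service (step 2)")
parser.add_argument("--port", type=int, default=8080,
                    help="Port for this local server (hypothetical flag)")
args = parser.parse_args()

print(f"Forwarding salience requests to {args.basnet_service_ip}, listening on :{args.port}")
```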
