Channel: Adobe Community : Popular Discussions - Premiere Pro SDK
Viewing all 53010 articles

Alpha channel in Exporters


Hello,

I'm having difficulties with the alpha channel in an exporter. Symptom: no alpha channel is given to my exporter by Premiere Pro when using a source with an alpha channel. I created a sample test video that has rows of (red pixel, green pixel, blue pixel, alpha pixel). While examining the bits in memory, I always see 0xff for the alpha component (i.e., the red pixel looks like 0xff 0x00 0x00 0xff, the green pixel like 0x00 0xff 0x00 0xff, the blue pixel like 0x00 0x00 0xff 0xff, and the alpha pixel like 0x00 0x00 0x00 0xff), which looks to me like no alpha channel is given in the frame. I request frames in the BGRA_4444_8u pixel format. Correct me if I'm mistaken, but shouldn't that pixel format give me an alpha component if the source has alpha?

I thought, "OK, maybe the sequence or project settings have a setting I'm not enabling to preserve alpha," but as I found out, Premiere Pro preserves alpha unless the user explicitly tells it not to. The SDK Exporter sample project description mentions support for uncompressed 8-bit RGB with or without alpha. Does that mean that Premiere Pro should give frames with an alpha channel if the source had it? I looked at how the sample exporter requests frames from Premiere Pro, and the only color-format-specific info is the custom 'sdk' color format, unless I overlooked something.

 

Thank you,

Petro


Render Quality (w.r.t. Exporter's video src)


Zac,

 

Can you explain the backend consequences of the different render qualities?

 

/**
** Render qualities
*/
typedef enum
{
    kPrRenderQuality_Max = 4,
    kPrRenderQuality_High = 3,
    kPrRenderQuality_Medium = 2,
    kPrRenderQuality_Low = 1,
    kPrRenderQuality_Draft = 0,
    kPrRenderQuality_ForceEnumSize = kPrForceEnumToIntValue
} PrRenderQuality;

 

And how orthogonal is it to the requested PixelFormat?

 

For example, one would assume that Draft mode should work as fast as possible by using the YUVA_8u PixelFormat, whereas Max mode would work in YUVA_32f, and thus the YUVA_32f PixelFormat should be put at the top of the PixelFormat list supported by the Exporter.

Always asking for YUVA_32f is a waste of processing time, so it shouldn't be at the top of the supported PixelFormat list unless you really want max bit depth.
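To make the suggested policy concrete, here is a small self-contained sketch. The enum values are copied from the header quoted above; the pixel-format names and the selection policy itself are my assumptions for illustration, not the shipping exporter's behavior:

```cpp
#include <vector>

// Render qualities, as declared in the SDK header quoted above.
typedef enum
{
    kPrRenderQuality_Max    = 4,
    kPrRenderQuality_High   = 3,
    kPrRenderQuality_Medium = 2,
    kPrRenderQuality_Low    = 1,
    kPrRenderQuality_Draft  = 0,
} PrRenderQuality;

// Stand-ins for the SDK's pixel-format constants (names assumed).
enum PixelFormat { YUVA_8u, YUVA_32f };

// One plausible policy: advertise the 32-bit float format first only
// when the render quality justifies the extra processing cost.
std::vector<PixelFormat> SupportedFormats(PrRenderQuality quality)
{
    if (quality >= kPrRenderQuality_High)
        return { YUVA_32f, YUVA_8u };  // prefer max bit depth
    return { YUVA_8u };                // fast path for draft/preview
}
```

Since the host walks the advertised list in order, a quality-dependent ordering like this is one way to avoid paying for 32f everywhere while still getting it when it matters.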

 

As an example - the shipped "h.264" exporter has a "Render at Max Depth" button. What RenderQuality does it set and what PixelFormat does it put in the list - and in which order - when enabled and disabled?

 

thanks,

 

Rallymax.

Is there an event I can use to tell if a render in Media Encoder has been stopped, similar to onEncoderJobComplete for example?


When a user stops or pauses a render in Media Encoder, I would like to know so I can handle it in the panel.

 

Currently it returns an event of type "progress" with progress at 1.0.

Multi-track audio plugin: VST or Premiere SDK?


Thanks to Bruce and the other posters in this forum. I'm new to Premiere plugins, but there is a lot of helpful information here.

 

The Setup

I'm writing a plugin that is basically an adaptive mixer: it takes inputs from two separate audio tracks and performs frequency-domain manipulations on them. I'm trying to figure out whether I can write a VST plugin that interacts with two audio tracks in Premiere. The process would be:

  1. Select a region of time
  2. Apply the plugin to a target track over that region
  3. In the configuration UI select other track(s) as input to the effect
  4. The effect runs, using the audio samples from the target track and input tracks, to generate an output
  5. This output becomes the target track over that time region

 

Ideally the time window could be shifted, which would maintain all parameters but just shift the region that's being processed.

 

Questions
  1. Is this possible with Premiere's implementation of VST? Can a VST plugin take more than one track as input in Premiere?
  2. Regardless of the answer to #1, would you advise creating this plugin using (1) the VST interface, (2) the Premiere SDK, or (3) the After Effects SDK?

 

Thanks for any insight; between reading the VST standards, the Premiere SDK docs, and the After Effects SDK docs, I'm feeling a little lost.

-Bryan

Render Sequence In to Out


I am trying to use the following in a script.

I can't find anything about it. Is it not possible?

TypeScript Enabled CEP Development


Modern Tools and ExtendScript

Earlier this week, Bruce Bullis integrated a pull request into the CEP Samples repository that included a new Sample called TypeScript. TypeScript is a language that "transpiles" into JavaScript and JavaScript-based languages (e.g. ExtendScript). It can also be configured in certain environments (e.g. Visual Studio Code) to provide normal JavaScript (and, it turns out, ExtendScript) development with helpful type checking and IntelliSense/autocompletion support. Here's an example:

TSExample.gif

Open the PProPanel-vscode folder in Visual Studio Code (or other TypeScript-aware IDEs?), open up the Premiere.jsx file, and simply begin typing. The environment is already set up to provide rich IntelliSense support.

 

Powered By TypeScript Declaration Files and JSDoc

The environment has to get information about the types from somewhere, right? The type information is identified by the TypeScript system in the following ways:

 

  • In the Panel JavaScript Environment:
  • In the App ExtendScript Environment:
    • Uses the TypeScript-provided ES5 type information (ES3 is coming - when available, it will be switched to use that!).
    • Uses custom Type Declaration files (included within the sample here) to help make the IDE aware of what types are available, as well as documentation!
    • Sources JSDoc comments for type information.

 

Beyond this, TypeScript is capable of type inference (write var x = 5; and the IDE will infer that x is a number). For more on how TypeScript does all of this, see this document.

 

Inline ExtendScript and Premiere Pro API Documentation

The declaration files included in the sample are currently incomplete, but bbb_999 has indicated interest in helping to fill in the blanks. These files are an improvement over the current documentation in that they can be [more] easily read on the web (don't need to be downloaded and opened in a browser) and also power in-line documentation and suggestions (as in the above gif)!

 

Writing in TypeScript

As configured, the sample does not actually assume that you will be writing in TypeScript and transpiling to JavaScript, but that you are simply writing JavaScript/ExtendScript. If you wish to use this feature, you will have to configure your IDE to do so (Visual Studio Code, Webstorm, Sublime Text, etc.).

 

Writing in NEW JavaScript

It should also be noted that TypeScript can transpile from new JavaScript to old JavaScript. This may not seem all that interesting, except that you could use the latest ES6 features to write code for both your panel JavaScript and your app ExtendScript. These would be transpiled into platform-compatible versions: target: "es5" for the panel and target: "es3" for ExtendScript!

 

As with TypeScript, this use-case would require setting up the TypeScript compiler.
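For reference, a minimal sketch of the kind of tsconfig.json such a setup would involve (the file layout and paths here are assumptions for illustration, not what the sample ships with):

```json
{
  "compilerOptions": {
    "target": "es3",
    "strict": false,
    "outDir": "./jsx"
  },
  "include": ["src/**/*.ts"]
}
```

"target": "es3" matches the ExtendScript case described above; a panel build would use "es5" instead.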

 

Debugging In Visual Studio Code

This sample also contains configuration settings to allow debugging of the panel (HTML) environment directly in Visual Studio Code, rather than through a Chrome browser. See:

vscode-debugging.png

 

Questions? Comments? Ask/post away!

[Adobe Premiere Pro] Installed Extensions not Showing in Adobe Premiere



 

Hi all,

 

Complete Adobe-development noob here -- so apologies in advance for any confusion on my part.

 

I have a requirement for work to update an extension that someone else (who is no longer available) created for Adobe Premiere Pro.

 

To the best of my knowledge, it was hosted on Adobe Exchange and ceased to work due to newly released Premiere Pro versions. I am tasked with updating the extension to make it compatible with the newest Premiere Pro release (and hopefully all versions, if possible). The extension is essentially just an iFrame redirect to our web application, housed in an Adobe Premiere Pro panel. I plan to update the versions and submit a "Patch" on Adobe Exchange with the new ZXP package.

 

Issue:

  1. I cannot see my "installed" extension in Adobe Premiere Pro when testing my ZXP changes -- the extension menu is grayed out / disabled.

 

Attempted Fix Process (learning to repackage the original ZXP file has been an epic journey):

  • I stripped the original ZXP package of the "mimetype" file and "META-INF" directory after learning those are generated after signing the package.

          ZXP Package Files.png

  • I edited CSXS/manifest.xml to provide (what I thought was) the minimum version supported (which I hoped would then just be compatible with all newer versions)

               Manifest XML File - Updated.PNG

  • I signed the folder with ZXPSignCmd.exe -- this creates the new, signed ZXP file

 

  • I install the new ZXP package with ExManCmd.exe
    • Command: ExManCmd.exe /install com.MyExtension.zxp

 

  • I receive a successful extension installation result:
    • When installing the extension with ExManCmd.exe, the "com.MyExtension" extension folder is being placed in the following path:

      • C:\Program Files (x86)\Common Files\Adobe\CEP\extensions

          Successful CMD Line Install of Extension.PNG

 

  • Unfortunately, I still do not see the installed extension (or any extensions) in Adobe Premiere Pro version: 11.1.2. The Extensions menu item is grayed out.

          Adobe Premiere Pro - No Extension Showing.png

 

Question:

  1. Can anyone tell me what I'm missing or what I'm doing wrong? Thank you in advance!
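Not a confirmed diagnosis, but two things commonly behind a grayed-out Extensions menu are worth checking. First, the Host version range in CSXS/manifest.xml must cover the Premiere Pro version you are running; a hypothetical excerpt:

```xml
<!-- Hypothetical excerpt of CSXS/manifest.xml: the version range must
     cover the Premiere Pro build being tested against (11.1.2). -->
<ExecutionEnvironment>
  <HostList>
    <Host Name="PPRO" Version="[11.0,99.9]" />
  </HostList>
</ExecutionEnvironment>
```

Second, if the re-signed package's signature is not accepted, CEP silently refuses to load the panel; while testing, setting the PlayerDebugMode registry value to "1" under HKEY_CURRENT_USER\Software\Adobe\CSXS.7 (CSXS.7 being my assumption for the CEP version Premiere 11.x uses) makes CEP load unsigned panels.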

Steps to smart render in custom plugins


Hi,

We've developed custom importer and exporter plugins to support a video file format not directly supported by Premiere CC 2017-2018. The format uses separate AVI and WAV files for each asset.

 

We want to support smart render to avoid unnecessary re-compression and speed up render.

What are the things we must be aware of to support smart rendering?
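At its core, smart rendering is a per-segment pass-through decision: only segments whose source already matches the output settings exactly, with nothing applied to them, can be copied through without re-encoding. A purely illustrative sketch of that check (the struct and field names are mine, not the SDK's; the actual mechanism goes through the exporter and importer suites):

```cpp
#include <string>

// Hypothetical description of one timeline segment, for illustration only.
struct SegmentInfo {
    std::string codec;       // e.g. the AVI stream's codec fourcc
    int         width, height;
    double      frameRate;
    bool        hasEffects;  // any effect or transition forces a re-render
};

// A segment can be passed through (smart rendered) only when its source
// matches the export settings exactly and nothing was applied to it.
bool CanPassThrough(const SegmentInfo& src, const SegmentInfo& out)
{
    return !src.hasEffects
        && src.codec     == out.codec
        && src.width     == out.width
        && src.height    == out.height
        && src.frameRate == out.frameRate;
}
```

Everything that fails this kind of check falls back to the normal decode/re-encode path, which is why mixed timelines still render correctly.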

 

Also, we would like to create an "Editing mode" to allow video preview files to be generated in our custom format.

What are the steps to create a new Editing mode?

Best regards,

Daniel Trullén


Render Sequence via Panel is slower?


Hi, I created a panel that renders the active sequence with Adobe Media Encoder.

 

We have been testing it for a week now, and video editors say that encoding via the panel is slower than when they do the same task manually (encode the sequence in Media Encoder with exactly the same preset and target path).

 

I told them that this is not possible, because both tasks do exactly the same thing. Is that correct?

Are there any settings / configuration options for app.encoder that may impact render performance?

 

Thanks

How to export an EDL from a timeline using ExtendScript

Open additional window when opening multiple projects from extendScript.


I'm running app.openDocument(premiere_project_path) to open a project. That is working well. But if I already have a project open, it will close that project and open the new one in its place. Is there a way to open projects in a new window if another Premiere instance is already running?

background is not transparent on transition.


Hi, All.

I am creating a transition plugin from SDK_CrossDissolve.

By the way, I have a strange issue.

I set all pixels of the output to {0,0,0,0} in the plugin code, but the background is not transparent and shows the layer image.

How can I set transparent background?

//code

...
PF_LayerDef* dest = output;
char* destData = (char*)dest->data;
...
for (int y = 0; y < output->height; y++)
{
    for (int x = 0; x < output->width; x++)
    {
        ((float*)destData)[(y * dest->rowbytes) / 4 + x * 4 + 0] = 0;
        ((float*)destData)[(y * dest->rowbytes) / 4 + x * 4 + 1] = 0;
        ((float*)destData)[(y * dest->rowbytes) / 4 + x * 4 + 2] = 0;
        ((float*)destData)[(y * dest->rowbytes) / 4 + x * 4 + 3] = 0;
    }
}
.....

 


Output buffer is cropped with small image on gpu acceleration.

$
0
0

Hi, everyone!

I hope you are doing well.

I am developing a gpu accelerated plugin from SDK_ProcAmp.

By the way I see a strange issue now.

In software mode the output buffer's size (width and height) is the entire render area, but in GPU mode (CUDA, OpenCL, Metal) the output buffer (here, outFrame) is not the entire render area; it is the layer's size.

How can I make it the entire render area rather than the layer's size? (There is no problem with a text layer or a big image; the problem occurs only with a small image.)

Looking forward to hearing back.

Regards,

Igor.

Why is rendering layer different between GPU mode and Software mode during preview?


Hi, everyone.

I have strange problem on GPU mode.

I am creating a gpu accelerated plugin But I can not complete the plugin because of the problem.

 

PROBLEM:

Please take a close look at the following screenshots. They show the layers during preview (while playback is running) in GPU mode and in software mode.

As you can see, there are black pixels on the edges that have opacity in GPU mode (a big difference from software mode).

 

I am not sure how the Premiere core works in GPU mode vs. software mode.

Would you let me know in detail? Can I get a layer in GPU mode that is the same as in software mode?

 

Regards,

Igor.

Is there any event gets invoked in Panel when AME Render is Stopped?


Hi All,


Is there any event available to bind on the app.encoder object for when an AME render is stopped from the AME UI? "onEncoderJobError" doesn't work.

 

Premiere Pro version: 11.0.1

Extension Type: Panel

 

 

Thanks & Regards,
Meet Tank


How to call render when specific event happens?


Hi, All.

I am developing a plugin by Premiere pro and After Effects SDK.

By the way I have a question.

Currently, the render function is called only when parameters are changed, when rendering is in progress, etc.


Problem:

I added a thread to the plugin, and it runs every 10 seconds.

I want to trigger the render function by emitting a certain event from that thread, even though rendering is not in progress.

Is that possible?

 

Regards,

Anthonie.

Plug-In not showing up in premiere pro


Hello, I'm from accevolution (www.accevolution.net), and we've got the problem that our plugins are not showing up in Premiere Pro on about 30% of our customers' machines. Our workaround at the moment is to tell customers to install the most recent graphics card driver, launch Premiere Pro while holding down the Shift key, and if that doesn't help, to set the ignore flag to "0" in the registry here: HKEY_CURRENT_USER\Software\Adobe\Premiere Pro\11.0\PluginCache.64\en_US.

 

Any suggestions why they work perfectly for 70% and don't show up for 30%? Of the 30% that have trouble, 99% are Windows users, but we also have a few Mac users with the problem.

 

It would be great if somebody could give me a hint.

 

Best regards,

 

Wolfram

Javascript API for premiere pro


Where can I find the JavaScript API documentation for Premiere Pro? I want to change Lumetri Color parameters like Exposure, Contrast, Highlights, Shadows, Whites, Blacks, Saturation, etc. programmatically.

 

Thanks,

 

Nehru

Can you build premiere pro plugins using the After Effects SDK?


I come from the background of using Motion and Final Cut Pro. I am one of the few programmers who uses the FxPlug API, which is used to make Motion and Final Cut Pro plugins. I am expanding into creating plugins for Premiere Pro. I am confused by After Effects and Premiere Pro having two different SDKs. After Effects seems to have some structure to it, with a parameter setup and methods for rendering, while Premiere Pro seems to have very little structure. I noticed that I can open .plugins created with the After Effects SDK inside both After Effects and Premiere Pro, while I have had trouble getting any of the Premiere Pro .bundles to work correctly at all. What is the difference in making a .bundle vs. a .plugin when .plugins seem to work in both?

 

I am on a Mac using Xcode, and any time I try to debug any of the example projects that came with the Premiere Pro SDK I get this error: "This file path does not exist on disk at this location.  /Applications/Adobe Premiere Pro CC 2017/Adobe Premiere Pro CC 2017.app/Contents/MacOS/Adobe Premiere Pro CC 2017 - NSDocumentRevisonsDebugMode YES".

 

I can't even get one of the example projects off the ground for debugging and testing purposes. What are the benefits of using one SDK over the other?

 

My last question relates to effects and project media files. Do you need to insert media into the project every time, or can I have media built into an effect, kind of like a Generator in Final Cut Pro? In Final Cut Pro you can have media baked into an effect, and the media is not needed in the project or the timeline. It seems like effects only offer presets that are just a difference in parameters but do not have any media built into them. Is it possible to have media inside of an effect with either the Premiere Pro SDK or the After Effects SDK?

How to identify MXF OP-Atom files in a Premiere project


Hi All,

 

Is there any way we can go through a large project and find the MXF OP-Atom files in that project?

Is there any way to get the codec/wrapper details of the files in a Premiere project?

We are trying to achieve this using our HTML5 CEP-based panel.

 

Thanks and Regards,

Anoop NR
