- Here's a high-level guide to help you understand the steps involved.
In Tech Talk · September 19, 2023

Here's a high-level guide to help you understand the steps involved. Please note that this is a general overview, and the exact steps might vary based on your specific environment and requirements.

1. Prerequisites:
• Ensure you have the required hardware and software prerequisites for AutoSys R11.x.x or R12.
• Obtain the installation media or download links for AutoSys, EEM, AE, and WCC.
• Ensure you have the necessary licenses for the software.
• Back up any existing AutoSys configurations if you're upgrading or migrating.

2. Install EEM (Embedded Entitlements Manager). EEM provides centralized user authentication and authorization.
1. Launch the EEM installer.
2. Follow the on-screen instructions, selecting the appropriate options for your environment.
3. Once installed, access the EEM management console to configure users, groups, and policies.

3. Install AutoSys AE (AutoSys Edition):
1. Launch the AutoSys AE installer.
2. During installation, you'll be prompted to specify the EEM server details. Provide the details of the EEM server you set up in the previous step.
3. Configure the AutoSys database. Depending on your setup, this could be an Oracle or SQL Server database.
4. Complete the installation by following the on-screen prompts.

4. Install WCC (Workload Control Center):
1. Launch the WCC installer.
2. During the installation, provide the details of the AutoSys AE server and the EEM server.
3. Complete the installation process.

5. Post-Installation Configuration:
1. EEM configuration: access the EEM console and define policies for AutoSys. This includes user roles, permissions, and other security settings.
2. AutoSys AE configuration:
• Configure job definitions, calendars, and other scheduling parameters.
• Set up agents on the target machines where jobs will run.
• Configure communication between the AutoSys server and agents.
3. WCC configuration:
• Access the WCC console and connect it to the AutoSys AE server.
• Configure dashboards, views, and other monitoring settings.

6. Testing:
• Once everything is set up, it's crucial to test the entire setup.
• Create test jobs in AutoSys and monitor them through WCC.
• Test user permissions and roles through EEM to ensure security policies are correctly applied.

7. Documentation:
• Document the entire installation and configuration process. This will be helpful for troubleshooting, future upgrades, or migrations.

8. Backup:
• Regularly back up your AutoSys configurations, job definitions, and database to prevent data loss.
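For the testing step, a minimal smoke-test job can be defined in JIL. The job name, machine, and owner below are placeholders to adapt to your environment:

```
/* hypothetical smoke-test job; adjust machine and owner for your site */
insert_job: smoke_test_echo   job_type: CMD
command: echo "AutoSys smoke test"
machine: your_agent_host
owner: autosys_user
```

Load it with `jil < smoke_test.jil`, force-start it with `sendevent -E FORCE_STARTJOB -J smoke_test_echo`, and confirm in WCC that the job runs to SUCCESS.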
- I want to add my broker API in Power BI for a live feed, please help
In Tech Talk · August 16, 2023

Adding a Broker API in Power BI. Here's a general outline of the steps you can follow:
1. Get API access: ensure you have access to your broker's API and the necessary authentication credentials (API key, secret, token, etc.).
2. Open Power BI: launch Power BI on your computer.
3. Choose a data source: click on "Get Data" in the Home tab.
4. Select Web or JSON: depending on your broker's API, select either the "Web" option or the "JSON" option as your data source.
5. Enter the API endpoint: enter the endpoint URL provided by your broker in the "URL" field. This URL is where you'll be making API requests to fetch the live feed data.
6. Configure authentication: depending on the authentication method your broker uses (e.g., API key, OAuth), you may need to configure authentication settings. If it's API-key based, you can often include the key in the URL itself or provide it in headers; if it's OAuth, you might need to enter the appropriate credentials.
7. Transform data (optional): Power BI might load the data as-is or allow you to transform it. You can use the Power Query Editor to clean, filter, or modify the data before using it in your reports.
8. Load data: after configuring authentication and optionally transforming the data, click the "Load" button to load the live feed data into Power BI.
9. Create visualizations: once the data is loaded, you can create charts, tables, and graphs using the "Visualizations" pane.
10. Set a refresh schedule: if you want the data to update automatically, set a refresh schedule in Power BI. Depending on your version of Power BI, there may be limits on how often you can refresh data in the free version.
11. Save and publish: after creating your visualizations and setting up refresh, save your Power BI file and publish it to the Power BI Service if you want to share it with others.
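Under the hood, the Web connector simply issues an HTTP request and flattens the JSON response. A minimal Python sketch of the same flow, for understanding what Power BI is doing; the endpoint URL, header name, and JSON shape are hypothetical placeholders for your broker's actual API:

```python
import json
import urllib.request

API_URL = "https://api.example-broker.com/v1/quotes?symbol=INFY"  # hypothetical endpoint
API_KEY = "your-api-key-here"  # placeholder credential

def build_request(url, api_key):
    # Many brokers accept the key as a bearer-token header;
    # others embed it in the URL as a query parameter.
    return urllib.request.Request(url, headers={"Authorization": "Bearer " + api_key})

def parse_quotes(payload):
    # Flatten the JSON the way Power Query's "Convert to Table" /
    # "Expand Record" steps would, keeping symbol and last price.
    data = json.loads(payload)
    return [(q["symbol"], q["last"]) for q in data["quotes"]]

# Offline demonstration with a sample payload (no network call is made):
sample = '{"quotes": [{"symbol": "INFY", "last": 1431.5}]}'
rows = parse_quotes(sample)
```

In Power BI itself, these pieces correspond to the Web connector's URL field, the header settings in the connection dialog, and the Power Query transformation steps.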
- 1. Artificial Intelligence (AI) Revolution 🤖
In Tech Talk · July 19, 2023

AI is at the forefront of innovation, transforming industries and revolutionizing the way we interact with machines. From AI-driven virtual assistants to smart homes and autonomous vehicles, the potential of AI is limitless. Let's share our thoughts on how AI impacts our daily lives and its ethical implications.
- C Macro with a function errors out: expression cannot be used as a function
In Ask Questions · September 29, 2023

The issue you're encountering is due to the fact that macros are expanded by the preprocessor before the actual compilation takes place. When you call a function inside a macro, the call is compiled wherever the macro is expanded, so the function's prototype must be visible at that point; otherwise you get the error you're seeing. In this case you're using the strrchr function, which is declared in the string.h header, so include that header before using the macro. Note also that FILENAME_ONLY(__FILE__) expands to a run-time expression, not a string literal, so it cannot be joined to "[" and "]" by string-literal concatenation; print the brackets via the format string instead. (Identifiers beginning with a double underscore, such as __MYFILE__, are reserved for the implementation, so it's best to avoid them.) Here's the corrected code:

#include <stdio.h>
#include <string.h>

/* Helper macro: strip the directory part of __FILE__ at run time */
#define FILENAME_ONLY(file) (strrchr(file, '/') ? strrchr(file, '/') + 1 : file)

/* Use the helper macro to get this file's base name */
#define MYFILENAME FILENAME_ONLY(__FILE__)

int main(void) {
    /* MYFILENAME is a run-time expression, so the brackets go
       in the format string rather than into the macro */
    printf("[%s]\n", MYFILENAME);
    return 0;
}

By including string.h at the top, you ensure that the strrchr prototype is available when the macro is expanded. This should resolve the error you're seeing.
- I want to ask for a project recommendation related to data science/ML/AI/Cloud for a portfolio
In Ask Questions · October 19, 2023

Data Science projects
1. Customer Segmentation:
Objective: use clustering algorithms to segment customers based on their purchasing behavior.
Data: customer purchase history, demographic information, etc.
Tools: Python, pandas, scikit-learn, matplotlib, seaborn.
2. Time Series Forecasting for Stock Prices:
Objective: predict future stock prices using time series analysis.
Data: historical stock price data.
Tools: Python, pandas, NumPy, scikit-learn, matplotlib, seaborn.

Machine Learning projects
1. Sentiment Analysis for Product Reviews:
Objective: analyze customer reviews to determine the sentiment towards a product.
Data: customer reviews from websites like Amazon, Yelp, etc.
Tools: Python, pandas, scikit-learn, NLTK, matplotlib, seaborn.
2. Image Recognition with Neural Networks:
Objective: build a neural network to recognize images.
Data: image datasets like CIFAR-10, MNIST, etc.
Tools: Python, TensorFlow, Keras, OpenCV.

Artificial Intelligence projects
1. Chatbot Development:
Objective: develop a chatbot that can handle specific tasks or answer questions.
Data: conversational data, domain-specific data.
Tools: Python, Rasa, TensorFlow, Dialogflow.
2. Game AI Development:
Objective: develop an AI that can play and possibly master a specific game.
Data: game data, player data.
Tools: Python, Unity, TensorFlow, PyTorch.

Cloud projects
1. Serverless Image Processing Pipeline:
Objective: create a serverless pipeline to process images uploaded to a cloud storage bucket.
Data: images.
Tools: AWS Lambda, Google Cloud Functions, Azure Functions, Python, OpenCV.
2. Cloud-based Machine Learning Model Deployment:
Objective: deploy a machine learning model in the cloud and expose it as a REST API.
Data: any data relevant to the machine learning model.
Tools: AWS SageMaker, Google AI Platform, Azure Machine Learning, Docker, Python, Flask.
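To make the first idea concrete, here is a minimal sketch of customer segmentation using a tiny hand-rolled k-means on toy data (stdlib only; a real portfolio project would use scikit-learn's KMeans on actual purchase-history features, and the customer numbers below are made up):

```python
import random

def kmeans(points, k, iters=20, seed=0):
    # Pick k initial centers, then alternate assignment and update steps.
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each point to its nearest center (squared distance).
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        for c, pts in enumerate(clusters):
            if pts:
                # Recompute each center as the mean of its cluster.
                centers[c] = tuple(sum(col) / len(pts) for col in zip(*pts))
    return centers, clusters

# Toy features: (annual spend, purchase count) for six hypothetical customers
customers = [(120, 2), (130, 3), (900, 30), (950, 28), (115, 1), (880, 25)]
centers, clusters = kmeans(customers, k=2)
```

With these well-separated toy points, the low-spend and high-spend customers settle into separate clusters, which is the core of the segmentation story you would tell in a portfolio write-up.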
- One quick question please: do we need to install any gateway to connect a PostgreSQL database to Azure?
In Ask Questions · October 28, 2023

Thank you for the support. I will go through the steps that were provided.
- Can you please tell what is the difference between the Paginated report visual in the Power BI visual pane and Paginated reports built through ..
In Ask Questions · October 19, 2023

Paginated Report Visual in the Power BI Visual Pane
• Ease of use: easy to integrate within a Power BI report, providing a seamless experience.
• Purpose: primarily used to add paginated report-like visuals to your interactive Power BI reports.
• Interactivity: works within the Power BI ecosystem, allowing you to have interactive dashboards along with paginated visuals.
• Data source: relies on the same dataset as the Power BI report.
• Design flexibility: limited compared to Report Builder; you're restricted to the visual's capabilities within the Power BI interface.
• Sharing and deployment: shares the same sharing and deployment options as Power BI reports.

Paginated Reports Built through Report Builder
• Ease of use: requires a separate tool (Report Builder) for creating reports, which are then uploaded to the Power BI service.
• Purpose: designed for scenarios where you need traditional, pixel-perfect, paginated reports that can be printed or shared as PDFs.
• Interactivity: limited compared to Power BI reports; mainly used for static, printable reports.
• Data source: can connect to a variety of data sources, including some not available in Power BI.
• Design flexibility: offers more design flexibility, with detailed control over the layout, formatting, and grouping of data.
• Sharing and deployment: can be shared and deployed through the Power BI service, but also through other means like email subscriptions.
- Are you getting Microsoft Mashup Engine error, OData version 3 and 4 error, error 404 not found
In Tech Talk · July 20, 2023

Yes, that's correct. To connect to files stored in SharePoint using the "SharePoint Folder" connector in Power BI, you need to provide the correct URL of the SharePoint site that hosts the document library containing the files you want to access. Here's how you can do it:
1. Go to the SharePoint document library: in your SharePoint site, navigate to the document library that contains the files you want to connect to in Power BI.
2. Copy the URL: copy the URL from the address bar of your web browser. It should look something like this: https://your-sharepoint-site.sharepoint.com/sites/YourSiteName/Shared%20Documents/YourDocumentLibrary/
3. Go to Power BI Desktop: open Power BI Desktop.
4. Get data from SharePoint Folder: go to the "Home" tab, click "Get Data," and choose "SharePoint Folder" from the list of data sources.
5. Enter the SharePoint URL: in the "From SharePoint Folder" dialog, paste the URL into the "Folder Path" field and click "OK." (If the connection fails, try the site root URL, e.g. https://your-sharepoint-site.sharepoint.com/sites/YourSiteName, since the connector expects the site URL rather than a library sub-path.)
6. Connect to files and folders: Power BI will connect to the SharePoint site and display the files and folders it contains in the Navigator window.
7. Choose files to load: in the Navigator, select the files you want to load into Power BI by checking the corresponding checkboxes. You can also choose to load all files in the document library.
8. Click "Load": once you've selected the files you want, click "Load" to load the data into your Power BI report.
- I am trying to install FullStory to record sessions on SF. The script is uploaded as a static resource into SF
In Ask Questions · September 29, 2023

import { LightningElement } from 'lwc';
import fsScript from "@salesforce/resourceUrl/local-fullstory";
import { loadScript } from "lightning/platformResourceLoader";

export default class FullStoryTest extends LightningElement {
    connectedCallback() {
        console.log("in connect " + fsScript);
        loadScript(this, fsScript)
            .then(() => {
                console.log("Loaded");
                window.FS = FS; // attach FS to the window object
                console.log("FS obj " + FS);
            })
            .catch((err) => console.log(err));
    }
}

By attaching FS to the window object, you're making it globally accessible. After the component loads, you should be able to access the FS object from the browser console using window.FS or simply FS.
- I need some help on a dynamic measure header display based on slicer selections.
In Ask Questions · October 19, 2023

Each measure below returns a slicer-specific title when exactly one subsidiary is selected, and a generic total otherwise:

Sales Header =
IF (
    HASONEVALUE ( Subsidiary[Name] ),
    "Sales for " & VALUES ( Subsidiary[Name] ),
    "Total Sales"
)

Cost Header =
IF (
    HASONEVALUE ( Subsidiary[Name] ),
    "Cost for " & VALUES ( Subsidiary[Name] ),
    "Total Cost"
)

GP Header =
IF (
    HASONEVALUE ( Subsidiary[Name] ),
    "GP for " & VALUES ( Subsidiary[Name] ),
    "Total GP"
)
- Hi we are working on live connection which is connected through SSAS, ..
In Ask Questions · October 26, 2023

When you're working with SQL Server Analysis Services (SSAS) and want to extract metadata such as column names, you can use the following methods:

1. Using SQL Server Management Studio (SSMS):
1. Connect to your SSAS server instance in SSMS.
2. In the Object Explorer, expand the server, and then expand the "Databases" folder.
3. Find and expand the database that contains your cube.
4. Expand the "Cubes" folder and then expand the cube that you're interested in.
5. You should see a "Dimensions" folder. Expand this to see a list of dimensions (tables).
6. Expanding a dimension will show you the attributes (columns) of that dimension.

2. Using XMLA queries:
1. Connect to your SSAS server instance in SSMS.
2. Open a new XMLA query window.
3. Execute a Discover request against the MDSCHEMA_DIMENSIONS schema rowset (replace YourDatabaseName and YourCubeName with your own):

<Discover xmlns="urn:schemas-microsoft-com:xml-analysis">
  <RequestType>MDSCHEMA_DIMENSIONS</RequestType>
  <Restrictions>
    <RestrictionList>
      <CATALOG_NAME>YourDatabaseName</CATALOG_NAME>
      <CUBE_NAME>YourCubeName</CUBE_NAME>
    </RestrictionList>
  </Restrictions>
  <Properties>
    <PropertyList>
      <Catalog>YourDatabaseName</Catalog>
    </PropertyList>
  </Properties>
</Discover>

This returns one row per dimension in the cube. To list the attribute (column) names within each dimension, run the same Discover with the MDSCHEMA_HIERARCHIES rowset as the RequestType; the names appear in the HIERARCHY_NAME column of the results.
- One quick question please: do we need to install any gateway to connect a PostgreSQL database to Azure?
In Ask Questions · October 27, 2023

To connect a PostgreSQL database to Microsoft Azure services using a managed identity, you typically don't need to install a gateway. However, you need to configure both your Azure service and the PostgreSQL database to allow the connection. Here's a general approach you might consider:

1. Configure the Azure service:
• In the Azure portal, navigate to the service you want to connect to your PostgreSQL database.
• Go to the "Identity" panel and turn on the system-assigned managed identity.
• Assign the necessary roles to the managed identity to allow it to interact with the database.
2. Configure the PostgreSQL database:
• Ensure that the PostgreSQL database allows connections from Azure.
• Add the necessary firewall rules to allow traffic from Azure.
• Create the necessary database users and roles.
• Grant the necessary permissions to those database users and roles.
3. Connect from the Azure service:
• Use the managed identity's credentials to connect from your Azure service to the PostgreSQL database.
4. Test the connection:
• Test the connection to ensure that everything is configured correctly.
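A sketch of how the connection is ultimately made: the managed identity's Azure AD access token is passed as the password in an ordinary PostgreSQL connection string. Token acquisition is stubbed out below; in real code it would come from azure.identity's DefaultAzureCredential, and the server, database, and user names are hypothetical placeholders:

```python
def build_conninfo(host, dbname, user, token):
    # libpq keyword/value format; psycopg2.connect() accepts this string directly.
    # Azure Database for PostgreSQL requires SSL, hence sslmode=require.
    return ("host={} dbname={} user={} password={} sslmode=require"
            .format(host, dbname, user, token))

# In a real Azure service, the token would be fetched with something like:
#   DefaultAzureCredential().get_token("https://ossrdbms-aad.database.windows.net/.default")
token = "<access-token-from-managed-identity>"  # placeholder, not filled in here

conninfo = build_conninfo(
    "myserver.postgres.database.azure.com",  # hypothetical server name
    "mydb",                                  # hypothetical database
    "my_managed_identity",                   # AAD principal mapped to a DB role
    token,
)
```

Note that the access token is short-lived, so a long-running service should fetch a fresh token before each reconnect rather than caching one indefinitely.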