• RELEVANCY SCORE 3.59

    DB:3.59:I Am Trying To Find The Duplicate Files In All My Drives,The Answer Is To Technical, Please Help? kx




I am using XP and trying to find the duplicate files on all my drives; the answer is too technical, please help.

    DB:3.59:I Am Trying To Find The Duplicate Files In All My Drives,The Answer Is To Technical, Please Help? kx

In that case, you should never read any content twice, and never read a file if nothing could match it. Here is how you do it:
1. Collect a list of all files, and divide them into groups by size.
2. Discard all groups containing only one file.
3. In each group, compare files block by block, one block at a time, and remember where you are. The block size should match the block size used by the OS.
4. Subdivide the files in a group based on the content of the block you just read.
5. Go back to step 2 and repeat until all blocks in each file group have been compared.
6. Report the result.

Note: you should reuse a set of buffers in step 3; don't waste memory and time allocating new ones. After step 1, you know the size of the biggest group, so you can usually allocate that many buffers up front (unless it is too big).
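The steps above can be sketched in Python. This is a simplified illustration: it reopens files for each block instead of keeping handles open and reusing buffers, which a real implementation should do per the note above.

```python
import os
from collections import defaultdict

BLOCK = 4096  # block size; ideally matches the filesystem block size

def find_duplicates(paths):
    """Return lists of paths whose contents are byte-for-byte identical."""
    # Step 1: group candidate files by size.
    groups = defaultdict(list)
    for p in paths:
        groups[os.path.getsize(p)].append(p)
    # Steps 2-5: refine each group one block at a time.
    pending = [(g, 0) for g in groups.values() if len(g) > 1]
    result = []
    while pending:
        group, offset = pending.pop()
        refined = defaultdict(list)
        for p in group:
            with open(p, "rb") as f:  # reopening keeps the sketch simple
                f.seek(offset)
                refined[f.read(BLOCK)].append(p)
        for block, sub in refined.items():
            if len(sub) < 2:
                continue                           # step 2: drop singletons
            if block:                              # more content to compare
                pending.append((sub, offset + BLOCK))
            else:                                  # EOF: contents matched fully
                result.append(sub)
    return result
```

Because all files in a group have the same size, they reach end-of-file at the same offset, so a group that survives to an empty read is a set of true duplicates.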
     
     

  • RELEVANCY SCORE 3.43

    DB:3.43:Duplicate Records In Apd sd




Hello Team, I am trying to load data into a Direct Update DSO using an APD. The job fails with the error "Duplicate data when writing data to the DSO". I have checked the intermediate results, but there are no duplicate records there. Can you tell me if there is any other way to find the duplicate records?

    DB:3.43:Duplicate Records In Apd sd

Go to your Query Designer; in the query properties you will find row and column suppression, so turn it off. Alternatively, you can do the same from the report output via the context menu, under query properties.

Thanks
Abhishek Shanbhogue

  • RELEVANCY SCORE 3.39

    DB:3.39:Conditional Formatting 9z




I am trying to find an easier way to highlight duplicate data across two columns without having to create a formula for each row.
Example worksheet (top row is the column headings, all following rows are the data; I have numbered each row):

    A  B
1   x  a
2   k  y
3   a  x

My goal: highlight the cell in column A if its data is not duplicated in column B.
Thank you.

    DB:3.39:Conditional Formatting 9z

Revise it to add the new condition within the AND, viz. use this as the CF formula:

=AND(A1<>"",COUNTIF($B:$B,A1)=0,C1=9020)

  • RELEVANCY SCORE 3.37

    DB:3.37:How To I Locate And Delete Duplicate Documents And Files 9a


I am trying to clean up my MacBook and want to find and delete duplicate documents and files. How do I do this?

    DB:3.37:How To I Locate And Delete Duplicate Documents And Files 9a

Finding duplicate files is something a lot of people do. I do not understand why Apple hasn't built it into Smart Folders; it would have taken a good coder only an hour of work! Pathetic of Apple, really pathetic! A prime example of crippleware!

  • RELEVANCY SCORE 3.32

    DB:3.32:Trying To Delete Duplicate Photos In My Pictures And Every Time You Send One To Recycle Bin It Makes A Duplicate Copy. k3


Trying to delete duplicate photos in My Pictures, and every time you send one to the Recycle Bin it makes a duplicate copy. How do I delete without the computer making all these copies? Please help. Thanks.
Original title: deleting duplicate photos

    DB:3.32:Trying To Delete Duplicate Photos In My Pictures And Every Time You Send One To Recycle Bin It Makes A Duplicate Copy. k3

Do the unwanted copies have filenames with the prefix "copy of"?

Example: Copy of IMG_1455.JPG

If yes, maybe the following tutorial I wrote for XP will offer some ideas.

Here's how to reproduce the unwanted copying: hold your Ctrl key and select several files. Place your pointer on the selected group (still holding the Ctrl key) and left click / drag the group even just the slightest bit... now release the mouse button. Oooops... more copies.

Maybe the following info will help: it happens occasionally to most anyone who is trying to multiple-select by holding the Ctrl or Shift key while left clicking to select the image files. The copies are created when you fail to completely release the mouse button before you move the pointer to another file. If a group of selected files is dragged... even the slightest bit... releasing the mouse button will produce copies of all the previously highlighted files.

It's somewhat easier if you have your Folder Options set to Single Click... this way you can select/deselect a file with just a mouse-over and you don't have to click. For multiples you still have to hold Ctrl or Shift. No click... no drag... no copy... maybe that's your solution. To adjust for single click, open a folder and go to... Tools / Folder Options / General tab... tick... "Single Click To Open An Item" / Apply / OK.

If you wish to delete all the files in a folder you could try the following: go to... Edit / Select All... or press Ctrl+A. With all files selected, go to... File / Delete... or press your Delete key... or right click the group and choose "Delete".

How to search for and delete unwanted "copies" of your image files: open the folder that the images are saved in and left click the "Search" button on your toolbar. (If you are viewing the files from within an editing program and do not know where the folder is, right click one of the image files and from the menu choose... Properties. The path to the folder will be on the "General" tab at "Location".) In the search pane, select... "All files and folders". In the "All or part of the file name" field, enter...

copy of

Now left click the "Search" button. When the search is complete, go to... Edit / Select All... then go to... File / Delete. You will see a dialog box asking you if you are sure you wish to delete the files... click Yes. Now left click the "Back" button on your toolbar. The remaining files should be your originals. If you deleted the wrong files, recover them from your Recycle Bin now.

Also see the following article:

Fixing Annoyances: Stop Windows from Copying Files Accidentally When Ctrl-Click Selecting
http://www.howtogeek.com/howto/windows-vista/fixing-annoyances-stop-windows-from-copying-files-accidentally-when-ctrl-click-selecting/

  • RELEVANCY SCORE 3.24

    DB:3.24:How To Find Duplicate Jobs dx



    Hi Everyone,

I found a lot of duplicate jobs in our production test systems and just want to clean up the environment. Can anyone please tell me whether there is any way to find all duplicate jobs in Control-M? Any help much appreciated.

    Thanks

    Saddam


    DB:3.24:How To Find Duplicate Jobs dx


Hi Saddam,

You can save your currently loaded draft as XML (via the utility you can also do this for your complete repository).

- This only works for the repository, not for the active jobfile.

If you prefer flat data, then query the EM repository directly. If you just need the current version, join the views def_tables, def_job, def_setvar and, if you need it, additionally the tables for the on/do statements, shouts and links (I don't have the names in my memory, but you can find a data model of the repository here in the documents section).

- You can run similar queries on the CTM server if you need the info for the AJF. This is what Rolf already suggested.

  • RELEVANCY SCORE 3.24

    DB:3.24:How To Avoid Duplicate Entries From A View 79


Hi All, in one of my created views I am getting some duplicate values. How can I avoid these duplicate entries? There is no selection condition, and the join condition is feasible for this criteria. Is there any way to prevent the display of these repeated (duplicate) values? Regards, Amba.

    DB:3.24:How To Avoid Duplicate Entries From A View 79

Thanks for the reply. I have used the MANDT field while joining. The issue is that one set of values from table T1 is passed to table T2, and with that join condition T2 contains two sets of values in which one field changes. But we are not considering that field in the view; that's why it repeats. I think all of you got a clear idea.

  • RELEVANCY SCORE 3.23

    DB:3.23:How To Find A Duplicate Records In A Particular Column d7


    Hello All,
    I have a query like this
select account_id
      ,Taxid
      ,firstname
      ,lastname
      ,creditscore
from account
left join borrower on account.accountid = borrower.app_id

I am trying to find whether there are any duplicates in the Taxid field. How do I find this?
Please assist.
Thanks
    Thanks

    DB:3.23:How To Find A Duplicate Records In A Particular Column d7

Please try this:

;with cte as
(
    select account_id
          ,Taxid
          ,firstname
          ,lastname
          ,creditscore
          ,Row_Number() over (partition by Taxid order by account_id) as rn
    from account
    left join borrower on account.accountid = borrower.app_id
)
select account_id
      ,Taxid
      ,firstname
      ,lastname
      ,creditscore
from cte
where rn > 1
ESHANI

  • RELEVANCY SCORE 3.21

    DB:3.21:Flatfile m3


Hi, I have a flat file and am trying to remove the duplicate values: distinct values should be sent to one destination table and duplicate values to another destination table.
I tried the Aggregate transformation, but I couldn't send the duplicate values to the other destination.
Please let me know how we can achieve this.

    DB:3.21:Flatfile m3

With the idea from Todd's post, and maybe some changes, you can get your answer:
http://toddmcdermid.blogspot.com/2009/01/eliminating-duplicate-primary-keys-in.html

Sincerely, SH
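Outside SSIS, the routing logic itself is simple. Here is a hedged Python sketch (the key column name is hypothetical) that sends the first occurrence of each key to one output and any repeats to another:

```python
def split_distinct(rows, key):
    """Route the first occurrence of each key to `distinct`,
    and any later occurrences to `duplicates`."""
    seen = set()
    distinct, duplicates = [], []
    for row in rows:
        k = row[key]
        if k in seen:
            duplicates.append(row)   # repeat: second destination
        else:
            seen.add(k)
            distinct.append(row)     # first occurrence: main destination
    return distinct, duplicates
```

In SSIS terms this corresponds to numbering rows per key (for example with a Script Component) and using a Conditional Split on that number, which is the general shape of the approach linked above.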

  • RELEVANCY SCORE 3.21

    DB:3.21:Duplicate Books s7


I am trying to edit duplicate books (created with the File > Duplicate Book command). I encounter lots of bizarre behavior in the copy but not in the original. The two most annoying are that I can't delete a photo from the book (using the Images > Remove from Album command) and the Show Unplaced Images button doesn't work. (Although, if it is active, strange things happen when I click on images in the browser.)

The result is that I find "Duplicate Book" to be essentially useless. Can anybody help me?

    DB:3.21:Duplicate Books s7

I have been able to avoid the inability to use "Images > Remove from Album": it does not happen if I avoid using "File > Duplicate Book". The method I use to duplicate the book (without using the command) is to export the enclosing project and then reimport it. I can then remove images from the book in the reimported project. Along the way, duplicate masters are created, but I can tolerate that. Maybe this could be avoided if I did not choose "Consolidate Masters"; I don't know.

  • RELEVANCY SCORE 3.20

    DB:3.20:Duplicate Shape In Rotation f9



Hello fellow illustrators. I am pretty new at Illustrator and could use some help if anyone has any to offer. What I am trying to do is duplicate a shape around a circle, and then duplicate that circle inwards. In other words, I want the shape I have made on top of all the colored shapes I have made, so that they are all small piggies.

    --rasta

  • RELEVANCY SCORE 3.19

    DB:3.19:Duplicate Ringtone 9j


Hi! I am trying to install custom ringtones on my 3G and I find I have the same ringtone twice. How do I get the duplicate off? Also, I want to know whether I can only load four ringtones; I do not see a limit, however that seems to be the most I can get.

Also, is there any way to delete the Apple ringtones on the phone?
By the way, I use Windows XP.

purptiger

    DB:3.19:Duplicate Ringtone 9j


  • RELEVANCY SCORE 3.19

    DB:3.19:Compile Error In Vc ++ 2005 8c


I am trying to compile a program in VS 2005 which I converted from VS 2003. I am getting the following error:

Error 3 fatal error CVT1100: duplicate resource. type:MANIFEST, name:1, language:0x0409 CVTRES

I believe it is complaining about some duplicate resource, but I couldn't find which resource is duplicated. Any help will be appreciated.

Thanks
Jijo

  • RELEVANCY SCORE 3.19

    DB:3.19:Duplication Detection Help k3


    Hello,
I have the newly updated CRM 2013 and am trying to run a duplicate detection report. I'd like to run the report to compare first and last name duplicates. I've been all over the internet and forums and cannot find the correct way to complete this.

    DB:3.19:Duplication Detection Help k3

Settings > Data Management > Duplicate Detection Jobs.
Create a new job for the entity you wish to check.
Let the job run, then re-open it and click View Duplicates.

Jason Lattimer

  • RELEVANCY SCORE 3.18

    DB:3.18:Error While Creating Entity Using Root Object? zp



    hi all,

I am trying to create an entity using the root object "CondSet", but it doesn't allow me to create it: it gives a "Creation failed" error during creation.

Also, how do I find the component set of a particular root entity?

Thanks in advance.

Regards,

Srikanth Ponnam

    Moderation: Duplicate Post

    Message was edited by: Leon Limson

    DB:3.18:Error While Creating Entity Using Root Object? zp


    Hello,

    Please check my reply in http://scn.sap.com/thread/3333956.

    Best regards,

    Thomas Wagner

  • RELEVANCY SCORE 3.18

    DB:3.18:Conditional Formating To Find Duplicate Cell Values ka


I am trying to use the Excel 2002 "Conditional Formatting" feature to find duplicate cell values. In Excel 2007 it is available, but not in Excel 2002.
Is there a simple formula that can look down a column of 20-30 values and find any duplicates? I'm using this for a Probability/Statistics class assignment on the Birthday Problem. Thanks.

    DB:3.18:Conditional Formating To Find Duplicate Cell Values ka

    Refer to the link below which describes a similar issue

    http://www.microsoft.com/office/community/en-us/default.mspx?lang=cr=guid=sloc=en-usdg=microsoft.public.excel.worksheet.functionsp=1tid=69C4F4D7-4FD7-4C22-8FD6-D81A3BD4FD56mid=69C4F4D7-4FD7-4C22-8FD6-D81A3BD4FD56

    For more information you can also check the link below:

    http://office.microsoft.com/en-us/excel/HA010346261033.aspx

    Hope it helps.

    Sachin Shetty
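In case the linked pages go stale: one common approach (my assumption here, not necessarily what those links describe) is a COUNTIF-based conditional formatting rule. With the values in A1:A30, select the range and use a formula condition such as:

=COUNTIF($A$1:$A$30,A1)>1

The condition is TRUE for any value that appears more than once in the range, so those cells get highlighted; for the Birthday Problem assignment, that marks every shared birthday.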

  • RELEVANCY SCORE 3.17

    DB:3.17:Find Duplicate Members In More Than Two Active Directory Groups 8x


Hi,
I am trying to find duplicate members across more than two Active Directory groups. I am new to PowerShell and have only found a script that gives me the duplicate members in two groups. Any help would be appreciated.
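The thread went unanswered. Setting the AD plumbing aside, the core logic is to count group memberships per user and report anyone appearing in two or more groups. A sketch in Python over exported member lists (group and member names are hypothetical):

```python
from collections import defaultdict

def duplicate_members(groups):
    """Given {group_name: [member, ...]}, return {member: [group, ...]}
    for every member that belongs to two or more groups."""
    membership = defaultdict(list)
    for group, members in groups.items():
        for m in set(members):          # ignore repeats within one group
            membership[m].append(group)
    return {m: sorted(g) for m, g in membership.items() if len(g) >= 2}
```

In PowerShell the same shape works: collect `Get-ADGroupMember` output per group into a hashtable keyed by SamAccountName, then emit the entries whose group list has a count of 2 or more.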

  • RELEVANCY SCORE 3.16

    DB:3.16:Session Failed While Loading From Flat File To Table k9



    Hi,

I have a flat file with duplicate records. I am loading each product into target 1 and all the remaining duplicate products into target 2.

But my session fails and I am unable to find the answer. I am attaching the mapping and session-log screenshots; kindly help me.

    DB:3.16:Session Failed While Loading From Flat File To Table k9


Hi Venkat,

As Rakesh mentioned, you don't need to assign the current value to a variable port; you can use field1 directly. A variable port is only needed to hold the previous value.

Coming to the error: I see the file name "flat 1.rtf". Informatica won't accept a space in the file name; just remove the space and the file will be recognized.

Regards

Shirish

  • RELEVANCY SCORE 3.16

    DB:3.16:Duplicate Document Objects Are Adding In Uimap zz


Hi All,
Every time I try to add an object to the UIMap, a duplicate document object is added to the UIMap as well.
How can I avoid or overcome this kind of behavior?

Thanks,
Chandra

    DB:3.16:Duplicate Document Objects Are Adding In Uimap zz

Hi,
John Louros used the CodePlex UIMap tool (http://uimaptoolbox.codeplex.com/) to resolve a similar issue and clean up duplicated objects; please see:
http://stackoverflow.com/questions/9909022/is-it-possible-to-merge-ui-controls-on-ms-coded-ui-tests
Maybe it can also help you.
Thanks,

  • RELEVANCY SCORE 3.14

    DB:3.14:Duplicate Messages Being Sent Windows Live On Windows 7 f8


When I send an e-mail with an attachment, the person receiving it gets multiple copies, anywhere from 2 to 100. I have changed virus programs three times trying to find the problem; McAfee, Norton 2011, and Windows Security Essentials have all said the computer is clean.

    DB:3.14:Duplicate Messages Being Sent Windows Live On Windows 7 f8

    Thank you for visiting the Microsoft Answers Community site.
    The issue you posted is related to Windows Live Mail and would be better suited in the Windows Live Solution Center.
    Please visit the link below to find a community that will offer the support you request:

    http://windowslivehelp.com/forums.aspx?productid=15

    Cody C
    Microsoft Answers Support Engineer

  • RELEVANCY SCORE 3.14

    DB:3.14:Script Newbie: Messing With Layers kz


I am trying to teach myself very basic JS scripting of Photoshop. My first try is the following few lines: in comes an image with random layer order, and I want the script to find the layer named "Background", move it to the top of the stack, make a duplicate, move the duplicate to the top, rename it, and apply USM using an action.

var docRef = app.activeDocument;
var numberOfLayers = docRef.layers.length;

// Walk the layers and move "Background" to the top of the stack.
for (var i = 0; i < numberOfLayers; i++)
{
    var layerRef = docRef.layers[i];

    if (layerRef.name == 'Background')
    {
        layerRef.move(docRef, ElementPlacement.PLACEATBEGINNING);
    }
}
app.doAction("usm", "test");

The premature script finds the Background layer and moves it to the top of the stack, but I can't figure out how to make a duplicate, rename the duplicate, move it to the top, and set it as the active layer (so I can apply the USM via the action). Any ideas?

    Regards,
    Rasmus

    DB:3.14:Script Newbie: Messing With Layers kz


  • RELEVANCY SCORE 3.13

    DB:3.13:Remove Duplicate Values mc



I have duplicate values in my table and I am trying to remove them. I have used a DISTINCT load on both the table and the generic load table.

  • RELEVANCY SCORE 3.13

    DB:3.13:Find The Duplicate Hour In Daylight Savings m7


Hi,

I'm trying to find the duplicated hour that falls on Nov 4, 2007, from 1:00 AM to 1:59 AM, for daylight saving time. Is there any way to tell, using SQL or PL/SQL code, which occurrence of that hour is the earlier one and which is the later one? Currently I'm using a function to get the missing hour (in spring) and the duplicate hour, but within the duplicate hour I want to know which occurrence is previous and which is next.

Please help me; thanks in advance for your help.

Thanks & Regards,
Ramana.

    DB:3.13:Find The Duplicate Hour In Daylight Savings m7

    Thanks a lot scallian appreciate your help.

    Ramana.
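The accepted fix isn't recorded in the thread, but the underlying idea is that the two passes through 1:00-1:59 AM differ in their UTC offset. A sketch using Python's standard zoneinfo module (the America/New_York zone is an assumption; the post never names one), with the fold attribute selecting the first or second occurrence:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

tz = ZoneInfo("America/New_York")  # assumed zone; DST ended 2007-11-04 02:00

# fold=0 is the first pass through 01:30 (daylight time, UTC-4);
# fold=1 is the second pass an hour later (standard time, UTC-5).
first = datetime(2007, 11, 4, 1, 30, fold=0, tzinfo=tz)
second = datetime(2007, 11, 4, 1, 30, fold=1, tzinfo=tz)

print(first.utcoffset())   # UTC-4
print(second.utcoffset())  # UTC-5
```

In Oracle, the analogous disambiguation is the offset component of a TIMESTAMP WITH TIME ZONE: the earlier occurrence carries the daylight offset, the later one the standard offset.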

  • RELEVANCY SCORE 3.13

    DB:3.13:Show Duplicate Table Values cj



    Hi.

I'm trying to show only the duplicate values from a table in my database. What is the best way to realize this in my QV application?

I only want to show the duplicate email addresses from the table and exclude the rest.

    Regards.

    DB:3.13:Show Duplicate Table Values cj


That's correct, Amar. I experimented with that also.

Thanks.

  • RELEVANCY SCORE 3.13

    DB:3.13:How To Find And Delete Duplicate Photos ks


I have combined photos from various sources (friends, family, etc.) and have found that many photos are the same. I have been trying to find a way to find and delete all the duplicates, but am unable to do so.

    Anyone have any ideas please?

    Many thanks

    Roger

    DB:3.13:How To Find And Delete Duplicate Photos ks

Hi Jim & Terence,

Many thanks for your input. I will take a look at both apps and choose the best.

    Cheers

    Roger

  • RELEVANCY SCORE 3.13

    DB:3.13:Fetching Duplicate Record Entries 79



    Hi all,

I have some data in a DB table, and I am trying to upload some records into the DB table from an Excel sheet. I must find the duplicate records, if any, from the Excel sheet, based on the primary key alone, and send those records in a mail to the user.

Please, if any of you could provide me with the logic to go ahead.

Thanks in advance,

Winnie.

    Please,if ny of you people could provide me with a logic to go ahead.

    Thanks in advance,

    Winnie.

    DB:3.13:Fetching Duplicate Record Entries 79


Hi,

Check this logic:

Upload the data to itab1, then:

SORT itab1 BY primary_key.

itab2[] = itab1[].

DELETE ADJACENT DUPLICATES FROM itab2 COMPARING primary_key.

LOOP AT itab1 INTO wa.

  READ TABLE itab2 WITH KEY primary_key = wa-primary_key BINARY SEARCH TRANSPORTING NO FIELDS.

  IF sy-subrc IS INITIAL.

    " First occurrence: update the database table, then remove the entry
    " so that later occurrences of the same key are not found again.
    DELETE itab2 INDEX sy-tabix.

  ELSE.

    " Duplicate: append the record to an internal table.
    " After the loop, send that internal table as a mail.

  ENDIF.

ENDLOOP.

Thanks,

Vinod.

  • RELEVANCY SCORE 3.12

    DB:3.12:How To Find Duplicates In Iphoto? 78


    I know very little about iPhoto, but am trying to help a friend (iBook G4, iPhoto 6) who has a lot of duplicate photos in her iPhoto library. Is there a way to search for all the duplicates, so they can be removed?

    DB:3.12:How To Find Duplicates In Iphoto? 78

    Welcome to the Apple Discussions.

iPhoto 6 had a problem comparing duplicates when importing from a different volume. This was fixed in later versions. As iPhoto 6 hasn't been developed since 2006, I wouldn't hold out much hope for an update.

    Regards

    TD

  • RELEVANCY SCORE 3.11

    DB:3.11:Duplicate Itemcode Alert On Ar Invoice 73



    Hi All,

    I am looking for an alert that will find duplicate ItemCodes on a single invoice.

    Once the invoice is posted it must alert a user that the specific invoice contains duplicate ItemCodes.

    Any help will be appreciated.

    Regards,

    Quinn

    DB:3.11:Duplicate Itemcode Alert On Ar Invoice 73


    Hi Gordon,

    Thank you for your reply.

    I have tested the query you have given me and it works 100%

    Thanks.
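Gordon's working query isn't preserved in the thread. As a hedged, language-neutral sketch of the check itself (the row shape is hypothetical; in SAP Business One the pairs would come from the invoice document lines), here is the logic in Python over (doc_num, item_code) pairs:

```python
from collections import Counter

def invoices_with_duplicate_items(lines):
    """lines: iterable of (doc_num, item_code) pairs.
    Return {doc_num: [item_code, ...]} for invoices that list
    the same item code on more than one line."""
    counts = Counter(lines)
    result = {}
    for (doc, item), n in counts.items():
        if n > 1:                       # same item repeated on one invoice
            result.setdefault(doc, []).append(item)
    return result
```

In SQL terms this is a GROUP BY on document number and item code with HAVING COUNT(*) > 1, which is the usual shape for such an alert query.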

  • RELEVANCY SCORE 3.10

    DB:3.10:Remove Duplicate Datastore Mount From Vc ck



    Hi All,

I am trying to combine two pieces of logic here:

1. Report duplicate mount points, which we can do with the following:

get-datastore * | where { $_.Name -like "*(*)" }

2. Report the hosts which have these datastores in a cluster/VC, in the format:

Hostname DSName FreeSpaceMB CapacityMB

    Any pointers?

    Thx,

    A

    DB:3.10:Remove Duplicate Datastore Mount From Vc ck


Hello Matt,

Thanks for the reply. The script errors out, though:

    Select-Object : A positional parameter cannot be found that accepts argument 'n="VMHostNames"; e={Get-View -Property Name -Id ($_.ExtensionData.Host | %{$_.Key}) | %{$_.Name}}'.At line:1 char:49+ Get-Datastore | ?{$_.Name -like "*(*)"} | Select Name, `@{n="VMHostNames"; e={Get-View -Property Name -Id ($_.ExtensionData.Host | %{$_.Key}) | %{$_.Name}}}`, FreeSpaceMB, CapacityMB + CategoryInfo : InvalidArgument: (:) [Select-Object], ParameterBindingException + FullyQualifiedErrorId : PositionalParameterNotFound,Microsoft.PowerShell.Commands.SelectObjectCommand

    I see there is a opening "`" here but no closing "`".

    Get-Datastore | ?{$_.Name -like "*(*)"} | Select Name, `@{n="VMHostNames"; e={Get-View -Property Name -Id ($_.ExtensionData.Host | %{$_.Key}) | %{$_.Name}}}, FreeSpaceMB, CapacityMB

    Regards,

    A

  • RELEVANCY SCORE 3.10

    DB:3.10:Duplicate Ip Addresses j1



Hello everyone, I have noticed duplicate IP addresses showing up in the 'IP address' field of managed clients. I know how this is happening (VPN clients) but am interested in how to build a query that identifies them.

There is a native query for 'duplicate system name', but I cannot find a query that would show me only the devices with duplicate IP addresses. I can create a query that shows all IPs, but paging through them looking for dups is a waste of time.

Bottom line: does anyone know of a way to query for duplicate IPs?

    thanks

    DB:3.10:Duplicate Ip Addresses j1


Duplicate IPs I haven't dealt with yet; I have seen duplicate objects in the container from VPN stuff. We added the OUI (MAC address prefix) of our VPN adapters into ePO to ignore those MAC addresses and hopefully keep ePO a little cleaner.

https://kc.mcafee.com/corporate/index?page=contentid=KB60141actp=searchsearch id=1267212908144

The KB article is about ePO 3.6, but we added it to our 4.5 and it helped.
  • RELEVANCY SCORE 3.09

    DB:3.09:Query To Find Duplicate Record fs


I need to find all duplicate documents in a folder. A document is a duplicate if there are two documents with the same name.

Regards,
V

    DB:3.09:Query To Find Duplicate Record fs

Hi,

The following query will find exactly 2 duplicate records with a particular date; if you want more than two, change the HAVING count to 3, 4, 5, and so on:

select r_object_id, object_name, r_full_content_size, r_creation_date
from temp_custom_type
where r_creation_date = DATE('01/06/2007')
  and object_name in (
      select object_name from temp_custom_type
      where folder('/Temp', descend)
        and r_creation_date = DATE('01/06/2007')
      group by object_name, r_full_content_size
      having count(r_full_content_size) = 2)
  and r_full_content_size in (
      select r_full_content_size from temp_custom_type
      where folder('/Temp', descend)
        and r_creation_date = DATE('01/06/2007')
      group by object_name, r_full_content_size
      having count(r_full_content_size) = 2);

Thanks & Regards,
Senthilkumar

  • RELEVANCY SCORE 3.08

    DB:3.08:How To Replace Duplicate Cells Information With A Blank Cell c3


I am trying to find a way, after exporting a file into Excel, to blank out duplicate cells, e.g.:

Item Code    Material Description    Location
123456       paper                   Stockton
123456       paper                   Fresno

I want the duplicate cells (the repeated Item Code and Material Description in the second data row) to show up as blank cells.
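No reply is recorded in the thread, but one common approach (an assumption on my part, not from the thread) is a helper formula that blanks a value when it matches the cell above, assuming the rows are sorted so duplicates are adjacent and the data starts in row 2:

=IF(A2=A1,"",A2)

Fill the formula down a helper column for each affected column, then copy the helper column and paste it back over the original as values.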

  • RELEVANCY SCORE 3.08

    DB:3.08:Duplicate Photos In Multiple Folders In Elements 11? 91



How can I find/identify all duplicate photos in multiple folders using Elements 11? The goal, of course, is to remove the duplicates of lesser quality.

    DB:3.08:Duplicate Photos In Multiple Folders In Elements 11? 91


I use a separate utility, DupeFile Finder, that works very well for such operations. It has never failed me yet. I cannot recall if it was freeware or shareware, but if I paid, it was not much, and it has worked fine for me.

For still images, I often use the commercial ThumbsPlus and sort by Similarity. I have version 8, which did not do cross-folder comparisons (like DupeFile Finder does), so one would need those stills in the same folder to determine dupes. I do not know if the new version has improved on that.

    Good luck,

    Hunt

  • RELEVANCY SCORE 3.07

    DB:3.07:Duplicate Mac Address af



    Hi,

I have a switch where I can find the exact same MAC address on 3 different ports (don't ask why, it is a long story!). I am trying to prove to the customer that this is an issue. What would the switch do with a packet destined for that MAC address?

    Thanks


    DB:3.07:Duplicate Mac Address af


The MAC address learned on 3 different ports must have been learned on ports belonging to three different VLANs, or else the MAC addresses are multicast. If they are unicast MAC addresses, there is no other way; the switch will log a message that the same MAC address is being learned on two different ports.

  • RELEVANCY SCORE 3.06

    DB:3.06:Duplicate Htmldocument? f9


I am trying to print an HTMLDocument from a JEditorPane.

I would like to duplicate the document, so that I can change some style sheet attributes to change how the document is printed without affecting the HTMLDocument that is being displayed on screen.

However, I can't seem to find any constructor like new HTMLDocument(htmlDocument).
Does anyone know how this can be done?

    Thanks,
    reg

    DB:3.06:Duplicate Htmldocument? f9

    http://www.fawcette.com/javapro/2002_12/online/print_kgauthier_12_10_02/

    This DocumentRenderer class might help you...

  • RELEVANCY SCORE 3.05

    DB:3.05:How To Avoid Duplicate Ci Entry? ck



    Hi

    I am using Footprints version 12

I am trying to find out how I can prevent users from entering duplicate CIs.

I tried to find some validation in the "Field" properties, but was not able to see anything related.

    Any help would be greatly appreciated

    Thanks

    DB:3.05:How To Avoid Duplicate Ci Entry? ck


You cannot.

Assuming the way to avoid a duplicate would be via a unique asset tag or serial number (or combinations thereof), you will need to vote on this idea:

https://communities.bmc.com/ideas/6882

If you want to think outside the box, you might be able to edit the column in the database to make it unique, but I don't know how FP would behave with that.

  • RELEVANCY SCORE 3.04

    DB:3.04:How To Find Duplicate Files pf


    Original Title: duplicate files

I have external hard drives and zip drives. I know that there is duplicate stuff in some places, and I've probably even added some stuff in one place and not another. So I am trying to reorganize and cut the clutter. I need software, somewhat like TuneUp for music, that finds and locates duplicate files, folders, and software.

    DB:3.04:How To Find Duplicate Files pf

Lots of programs search for duplicates. Here's one I use:
http://www.piriform.com/ccleaner
http://i43.tinypic.com/33dkw7p.jpg (picture of the settings)
http://i41.tinypic.com/2luhpx.jpg (picture of scan results)

  • RELEVANCY SCORE 3.04

    DB:3.04:Fuzzy Rouping k8


Hi all, I have been trying for a while now to clean some data that contains duplicate records using fuzzy grouping. I can get as far as identifying the duplicates with fuzzy grouping, but how do I get the data out so I can insert the non-duplicate rows into a dimension table?

I am also stuck on how to handle the data that isn't duplicated in that table; or is this done in the same step? Please help, deadlines are creeping in on me.

Thanks for your time.
     
    Thanx for your time.
     
     
     

    DB:3.04:Fuzzy Rouping k8

    Thanks jwelch. I looked at it and it's starting to look like something. After the fuzzy grouping I did a conditional split with 3 conditions: 1) a reference table (with similarity score 1, or should I say key in == key out); 2) a duplicate table with similarity between 0.9 and 0.6; and 3) a unique table with similarity lower than 0.59. Next I full-joined 1 and 3 using a merge join. The results seem good, but there are a few doubles here and there; I think you are right, now it's just a question of fine-tuning the conditions.
     
    I have one more (for now) small question: what do the leading and trailing attributes of the Numerals column do? Changing them gave me totally different results.

  • RELEVANCY SCORE 3.04

    DB:3.04:Duplicate Files 73


    Hi

    I am trying to sort through my iTunes library and delete all the duplicate files, but the menu option
    "View > Show Duplicates" is missing.

    How can I get this back?

    Windows XP Pro

    DB:3.04:Duplicate Files 73

    Hi

    I don't have any plug-ins at the moment

  • RELEVANCY SCORE 3.03

    DB:3.03:Flash Plug-In f3


    Hello All,

    I am trying to find out how to duplicate the controls on this
    page (
    http://www.picturesinmotion.com/onlinetributes/demo_main.swf
    ) . Notice that when you cursor over the slide show, it displays
    the controls. How is this done? Is there a plug-in that I must buy?
    If so, does anyone know where to find it?

    Regards,

    Mac

    DB:3.03:Flash Plug-In f3

    hi,
    in Flash 8 Professional you can find a built-in component called
    FLVPlayer, and it has a couple of built-in skins. you can choose from
    them or even create a custom skin for the component.

    adam

  • RELEVANCY SCORE 3.03

    DB:3.03:Ht4528 I Am Trying To Download The Padi App On My Phone I Need A Duplicate Card fx


    I am trying to get the PADI app on my phone and I cannot find it; it says no matches

    DB:3.03:Ht4528 I Am Trying To Download The Padi App On My Phone I Need A Duplicate Card fx


  • RELEVANCY SCORE 3.03

    DB:3.03:Proc Means Question 9j



    Hello,

    I am trying to use PROC MEANS to find the sum of the sales amount for each dept. The problem is that each dept could be listed more than once with different sales amounts. I want to count all of the 'duplicate' depts because that's how the data come in.

    But the report is showing a sum for each 'duplicate' dept. I would like to see one combined sum for each duplicate dept. Can I do this with PROC MEANS?

    PROC MEANS MAXDEC=2 SUM NOPRINT; BY DEPT; VAR SALEAMT;

    OUTPUT OUT=C SUM = SALESUM;

    DATA D; SET C;

    FILE OUT;

    PUT @1 DEPT

    @11 SALESUM PD7.2;

    Help Please ... Thank you

    DB:3.03:Proc Means Question 9j


    sbb, thanks for your reply. yes, that was 'count'... sorry

    I meant 'include' all duplicate depts so that each dept would show up only once, with one total, in the report
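    A common fix in SAS is to sort by DEPT before the BY statement (BY-group processing assumes sorted input), or to use CLASS DEPT instead of BY, so PROC MEANS emits one summed row per department. The aggregation being asked for can be sketched in Python; the (dept, amount) row layout here is an assumption for illustration:

```python
from collections import defaultdict

def sum_by_dept(rows):
    """Combine repeated departments into a single total,
    no matter how many times each dept appears in the input."""
    totals = defaultdict(float)
    for dept, amount in rows:
        totals[dept] += amount
    return dict(totals)
```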

  • RELEVANCY SCORE 3.03

    DB:3.03:Avoid Duplicate Strings 3z


    I am trying to avoid duplicate strings in a set of values. Instead of collecting the strings in a string array and then looping through to find out whether each string already exists in the list, I am looking for another possibility. I am aware that ArrayList allows duplicate strings. Is there any other possibility which doesn't allow duplicate strings or entries? Thanks.

    DB:3.03:Avoid Duplicate Strings 3z

    Aakash wrote:
    Hi kaj,

    you are right that the contains method compares the values of the items stored, and it may be a little slow, but it would be beneficial if the program is too large to modify to use a Set.

    Regards

    Up front, maybe, but it would still be detrimental in the long run. Maybe (if it's really, really bad) you save 2 hours of coding time, but over the life of the program add 30 hours of operation time. Never do anything based solely on how long it will take to implement, unless maybe the difference is in weeks and you only have days (for example).
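    The Set suggestion boils down to this; a short Python sketch of the same idea, where the set gives constant-time membership checks and a separate list preserves first-occurrence order:

```python
def dedupe(strings):
    """Keep the first occurrence of each string; the set makes the
    membership test O(1) instead of rescanning the whole list."""
    seen = set()
    result = []
    for s in strings:
        if s not in seen:
            seen.add(s)
            result.append(s)
    return result
```

    In Java the equivalent is a LinkedHashSet, which rejects duplicates while keeping insertion order.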

  • RELEVANCY SCORE 3.03

    DB:3.03:Duplicate Record Issue cm



    hi all

    pl review following code

    SORT i_pri BY part_nbr mfgng_duns_nbr.

    LOOP AT i_pri INTO i_pri5.

    IF sy-subrc = 0.

    IF i_pri5-part_nbr = wa_pri_dup-part_nbr AND

    i_pri5-mfgng_duns_nbr = wa_pri_dup-mfgng_duns_nbr.

    i_pri5-status = 'D'.

    i_pri5-error_text = 'Duplicate Record'.

    modify i_pri FROM i_pri5.

    ENDIF.

    wa_pri_dup = i_pri5.

    ENDIF.

    ENDLOOP.

    in the above code I am trying to find duplicate records.

    the code is marking all duplicate records with status 'D', including the first duplicate record. what I want is: if there are 3 duplicate records, then the first record should not have status 'D' while the others should have status 'D'.

    record1 status ' '

    record2 status 'D'

    record3 status 'D'.

    is my requirement

    is there any solution for this problem?

    please help

    thanx

    rocky

    DB:3.03:Duplicate Record Issue cm


    how are you telling that there are duplicates? (based on some fields; you can treat them as keys)

    sort the table with those keys and modify the records.

    using AT NEW you can modify only the first record.

    loop at itab.

    at new keyfield.

    itab-status = ''.
    modify itab transporting status.

    endat.

    endloop.
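    The approach in this reply (sort by the key fields, then leave only the first record of each key group unflagged) can be sketched in Python; the field names mirror the ABAP example but are otherwise placeholders:

```python
def flag_duplicates(records, key_fields=("part_nbr", "duns_nbr")):
    """Sort by the key fields, then mark every record whose key
    matches the previous one with status 'D'; the first record of
    each group keeps a blank status."""
    records = sorted(records, key=lambda r: tuple(r[f] for f in key_fields))
    prev_key = None
    for rec in records:
        current_key = tuple(rec[f] for f in key_fields)
        rec["status"] = "D" if current_key == prev_key else ""
        prev_key = current_key
    return records
```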

  • RELEVANCY SCORE 3.03

    DB:3.03:Duplicate Product Keys When I Am Trying To Reinstall Money Plus c9


    I can't activate Money Plus on my new XP laptop because it says the product key is a duplicate. My old laptop is 10 years old and is going in the dump, but not until I activate Money on my new one. How do I resolve the duplicate keys in registration?

    DB:3.03:Duplicate Product Keys When I Am Trying To Reinstall Money Plus c9


  • RELEVANCY SCORE 3.02

    DB:3.02:How Can I Find And Remove Duplicate Photos In Iphoto? j1


    What is the best way to find and remove duplicate photos in iPhoto?

    DB:3.02:How Can I Find And Remove Duplicate Photos In Iphoto? j1

    Are you seeing these duplicates in iPhoto or via the Finder? If it's in the iPhoto window, then you can use one of these applications to identify and remove duplicate photos from an iPhoto Library:
    iPhoto Library Manager - $29.95
    Duplicate Cleaner for iPhoto - free
    Duplicate Annihilator - $7.95 - the only app able to detect duplicate thumbnail files or faces files when one library has been imported into another with iPhoto 8 and earlier.
    PhotoSweeper - $9.95 - this app can search by comparing the images' bitmaps or histograms, thus finding duplicates with different file names and dates.
    I also prefer iPLM as it is more than just a dup finder. It's the most versatile iPhoto utility available. OT

  • RELEVANCY SCORE 3.02

    DB:3.02:Auto Deleting Duplicate Files sf



    I am trying to delete duplicate files using the Duplicate Finder feature. I am running Windows 7 x64. The program runs and finds thousands of duplicates, but I cannot get it to automatically delete all of the duplicates except the newest by using the Keep Newest option on the first screen. The one-action-for-all option doesn't seem to work, or I have not configured something that needs to be configured. What do I need to do?


    Attached File(s): Kepp_Newest.jpg (177.03K)

    DB:3.02:Auto Deleting Duplicate Files sf


    hi, you can use Duplicate Files Deleter software for your problem. I was having the same problem. I hope it solves your problem.
  • RELEVANCY SCORE 3.02

    DB:3.02:What About Duplicate Indexes ? x1


    Hi All,

    I got a ticket regarding duplicate indexes, but I am new to this concept.

    Could you please help with the doubts below on duplicate indexes:

    1) What are duplicate indexes, and what is the use of duplicate indexes?

    2) Why would we create duplicate indexes, and can they decrease the performance of SQL Server?

    Thanks

    RAM

    DB:3.02:What About Duplicate Indexes ? x1

    thanks guys that was really useful stuff..Please mark as helpful and propose as answer if you find this as correct!!! Thanks, Rakesh.

  • RELEVANCY SCORE 3.02

    DB:3.02:Ciscoworks Critical Alert From Dfm zk



    Can anybody help me understand the below error message, which I am getting from CiscoWorks? The email is also coming from DFM as a critical alert (please find the email below).

    06$Partition=0]PartitionName=)MODE=2;Alert ID=00000RX}Event CODE=1014;1001;1001;1001;1001;1001;;|Status=Active^Severity=Critical^Managed Object=xxx.xxx.250.253^Managed Object Type=Interfaces and Modules^CUSTID=-^CUSTREV=*^Description=xxx.xxx.250.253: Cisco Configuration Management Trap:InformAlarm; xxx.xxx.150.252 [xxx.xxx.250.253]:Duplicate; xxx.xxx.180.252 [xxx.xxx.250.253]:Duplicate; xxx.xxx.250.253:Duplicate; xxx.xxx.1.252 [xxx.xxx.250.253]:Duplicate; xxx.xxx.111.252:Duplicate;

    ================================

    ALERT ID = 00000RX

    TIME = Wed 07-Jun-2006 06:12:41 AST

    STATUS = Active

    SEVERITY = Critical

    MANAGED OBJECT = xxx.xxx.250.253

    MANAGED OBJECT TYPE = Interfaces and Modules

    EVENT DESCRIPTION = xxx.xxx.250.253: Cisco Configuration Management Trap:InformAlarm; xxx.xxx.150.252 [xxx.xxx.250.253]:Duplicate; xxx.xxx.180.252 [xxx.xxx.250.253]:Duplicate; xxx.xxx.250.253:Duplicate; xxx.xxx.1.252 [xxx.xxx.250.253]:Duplicate; xxx.xxx.111.252:Duplicate;

    ==================================

    DB:3.02:Ciscoworks Critical Alert From Dfm zk


    Personal preference, but I disabled the HSRP interface from being managed in DFM:

    DFM - Device Management - Device Details - {Device} - False (drop-down box next to the interface).

    HTH

  • RELEVANCY SCORE 3.02

    DB:3.02:Duplicate - Mirror da


    I'm trying to find out how to duplicate and mirror objects. Is it possible?

    DB:3.02:Duplicate - Mirror da

    or if you have the 2012 advantage pack... Mesh -> Flip Mesh

    http://www.linkedin.com/pub/christoph-schaedl/6/558/73b

  • RELEVANCY SCORE 3.02

    DB:3.02:The Ipod Cannot Be Synced 9f


    I am having real trouble trying to sync my iPod. It all started suddenly 4 days ago for no reason, and I still can't find a cure! When I try to sync, iTunes gives me the message "a duplicate file name was specified". My iTunes library is located on an external hard drive and it is attached to my iMac all the time. Any help GREATLY appreciated.

    DB:3.02:The Ipod Cannot Be Synced 9f


  • RELEVANCY SCORE 3.02

    DB:3.02:How Can I Add Duplicate Photos To A Slideshow. I Have Iphoto 11 And It Does Not Let Me Add A Duplicate Photo 8m


    I am doing a slideshow with iPhoto 11 and am unable to add duplicate photos to the slideshow. I was trying to find something in settings to allow it, but no luck.

  • RELEVANCY SCORE 3.02

    DB:3.02:How To Load Duplicate Aliases For The Members In Hyperion Planning 11.1.2.3 83


    I am trying to load account dimension metadata from GL to Planning, but most of the records are rejected due to duplicate aliases. Is it possible to add duplicate aliases in Planning?

    DB:3.02:How To Load Duplicate Aliases For The Members In Hyperion Planning 11.1.2.3 83

    AFAIK, I don't think you can use the same description for multiple members. What I would suggest is to concatenate your member name and alias and use that as the alias. As your account member is the placeholder, it would not confuse anyone. Ensure that you stay within the 80-character limit.
    Regards,
    Amarnath
    ORACLE | Essbase

  • RELEVANCY SCORE 3.01

    DB:3.01:Deleting Duplicate Documents jf


    Hi,
    I need to find all duplicate documents and delete only the repeating ones. I can see threads showing DQL to find duplicate documents, but not to delete the duplicates.
    Can someone tell me the query for finding duplicate documents and deleting only the repeating ones?
    Thanks,

    DB:3.01:Deleting Duplicate Documents jf

    Thanks for your response. I was hoping that it could be done in a single DQL. Will have to do it using DFC. Thanks!

  • RELEVANCY SCORE 3.01

    DB:3.01:How To "Make Sure You Have The Necessary Backups And Archived Redo Logs" mk


    I successfully duplicated a database to a remote host. Now I am trying to resynchronize the duplicate from the source based on the latest RMAN full backup (RMAN level 0; both the datafile and archivelog backups are tagged 'FULL_BACKUP').

    The way I find all the files to copy to the remote host is with a query to rc_backup_piece:

    select handle from rman.rc_backup_piece
    where to_char(start_time, 'DD-MON-RR') =
    (select to_char(max(start_time), 'DD-MON-RR')
    from rman.rc_backup_piece
    where tag = 'FULL_BACKUP');

    The rman duplicate command is:
    duplicate target database to dupe until time 'sysdate - 1';

    The rman fullbackup occurs last Friday evening
    The file copy to remote host occurs early Saturday morning
    The rman duplicate script occurs mid-Saturday afternoon.

    Inevitably, duplicate fails with an ORA-19505 error, requesting a file from an earlier RMAN backup.

    Why does duplicate need/want a file from an older backup?
    How else can I query for all the files necessary to perform a successful duplicate resynchronization?

    DB:3.01:How To "Make Sure You Have The Necessary Backups And Archived Redo Logs" mk

    To validate RMAN backups you can use VALIDATE BACKUPSET number or (when image copies are used) RESTORE VALIDATE FROM DATAFILECOPY number.

    Werner

  • RELEVANCY SCORE 3.01

    DB:3.01:Find Duplicate Items 8c


    Hi
    I'm trying to find the number of occurrences of duplicate items in a specific area in the docbase. We currently have a free-text field for object_name on a particular object type. I am using the following DQL statement:

    select distinct object_name, count(object_name) from tl_project_doc where folder ('/Projects/Delivery - Stations/Current Projects',descend) group by object_name

    This of course gives me all items, including links. What I'm looking for is duplicate items only and no links.
    Would be grateful if anyone has got a DQL that I could use.
    Many Thanks
    Andy

    DB:3.01:Find Duplicate Items 8c

    Hi Andy,
    Try this:

    select distinct object_name, count(*) from tl_project_doc where folder ('/Projects/Delivery - Stations/Current Projects',descend) and i_is_reference=0 group by object_name;

    Thanks,
    Vijay

  • RELEVANCY SCORE 3.01

    DB:3.01:Where Are Ps Elements 8 Photos Stored? 18



    Help... I am trying to clean out duplicate photos. My hard drive is full, and HP has asked me where PS Elements 8 stores my photos. Where does Adobe Photoshop Elements 8 store the photos on my computer so I can clean out all the duplicate photos?

    DB:3.01:Where Are Ps Elements 8 Photos Stored? 18


    Hi,

    I re-read your last reply and have a couple of comments.

    The entry in the properties window tells you where the actual image is not the thumbnail (that's in the database I mentioned earlier).

    I always get worried when people suggest deleting duplicate images that they will end up deleting their only copy. It is a good idea to make a full catalog backup first. Then only delete duplicates from within Elements.

    Brian

  • RELEVANCY SCORE 3.01

    DB:3.01:#Import Wmp.Dll Generates Duplicate Getcurrentstate Functions aj


    Hi,
    I am trying to import wmp.dll in a simple application to generate the .tlh and .tli files. Unfortunately, every time I try it, it generates duplicate GetCurrentState() functions, causing the compile to fail. I am using WMP.dll version 11.0.5721.5145.
    Any ideas? Also, if anybody knows where to find the API definitions for the older msdxm.ocx, I would be much obliged.
    Thanks,
    Lee Mulcahy

    DB:3.01:#Import Wmp.Dll Generates Duplicate Getcurrentstate Functions aj

    Somehow Becky's reply got marked as the 'Answer', but of course telling me to look in a different newsgroup is NOT an answer to my question.
    The real answer is that there are duplicate names on the high- and low-level functions, so you must use a parameter to tell the compiler to add a prefix to the high-level routines:

    #import "wmp.dll" high_method_prefix("SiI")

    This adds the prefix SiI to all the high-level functions and avoids the duplicates.
    Lee

  • RELEVANCY SCORE 3.01

    DB:3.01:Duplicate Values In Search Result Document Summary Of Form Library kc


    Hi All,
    I have deployed an InfoPath form to a form library. After crawling, I am trying to search the records present in the form library, but on the results page the document summary section is showing duplicate values.
    Any clue?
    Thanks in adv.
    Regards,
    Rajdeep.

    rajdeep

    DB:3.01:Duplicate Values In Search Result Document Summary Of Form Library kc

    Try checking your Local Office SharePoint Sites content source in Search Settings on your Shared Services Provider. Make sure that there is not more than one start URL pointing to your SharePoint site. Thanks, Corey

  • RELEVANCY SCORE 3.01

    DB:3.01:Duplicate Entries While Taking Reports In Change Mgmt 9z



    HI All,

    I am trying to take reports on the CHG:Infrastructure Change form.

    When trying to filter the records by selecting Tier 1, Tier 2, etc., I can see that Tier 1 and Tier 2 are displayed twice, as in the screenshot below.

    Kindly help me resolve these duplicate fields.

    This issue is happening in Report creator as well.

    ITSM--7.6.04

    Thanks Regards

    Ramanathan

    DB:3.01:Duplicate Entries While Taking Reports In Change Mgmt 9z


    Hi Saroj,

    I am planning to customize the change form by separating the labels.

    Thanks Regards

    Ramanathan

  • RELEVANCY SCORE 3.01

    DB:3.01:Regarding Duplicate Records In Sap 3z


    Hi All,
    I want to find all the duplicate records in Materials and delete them from SAP. How should I achieve this? Is there any functionality apart from SAP MDM to find and delete duplicate Material records?
    Regards,
    Rahul

    DB:3.01:Regarding Duplicate Records In Sap 3z

    Yes, you can set the Materials for deletion either through a BDC program or, if you have the Mercury tool, you can use that to process the transaction MMAM. Thanks

  • RELEVANCY SCORE 3.01

    DB:3.01:Tfs Integration Creating Duplicate Workitems After Movement 8k


    Hi All,

    I have used the TFS Integration tool to migrate work items of a project from one collection to another collection on the same TFS server.
    The work items are getting moved, but I find duplicates of the work items being created. I am trying to move 6312 work items, but after migration there are 32410 in the new collection.
    Please help me resolve this issue.

    Thanks in advance

    Thanks
    prashanth

    DB:3.01:Tfs Integration Creating Duplicate Workitems After Movement 8k

    Hi Prashanth,

    Thanks for your reply.

    I installed these
    TFS Integration Tools, then performed a one-way migration to migrate the My Tasks query (only this one query) work items from the current collection/team project to the new collection/team project. The migration works as expected; there are no duplicate
    work items in my new collection/team project.

    And as far as I know, there is no Duplicate option in the TFS Integration Tools.
    John Qiao [MSFT]
    MSDN Community Support | Feedback to us
    Develop and promote your apps in Windows Store
    Please remember to mark the replies as answers if they help and unmark them if they provide no help.

  • RELEVANCY SCORE 3.01

    DB:3.01:Combine Duplicate Rows In Excel 19


    I am trying to find an easy way to combine duplicate rows. See the below example from a spreadsheet with 6 or more columns of data. I would like to aggregate all checks under one row and delete the extras. I have thousands of rows with this type of data. My data would
    look like this after I sort by name.
    Jim Bloodgood X X
    Jim Bloodgood X X
    Jim Bloodgood X X
    Thanks!

    DB:3.01:Combine Duplicate Rows In Excel 19

    Should do it. Assumes a header row and sorted data
    Option Explicit
    Sub CombineColumnsToCommonRowSAS()
    Application.ScreenUpdating = False
    Dim i As Long
    Dim j As Long
    For i = Cells(Rows.Count, 1).End(xlUp).Row To 2 Step -1
    If Cells(i - 1, 1) = Cells(i, 1) Then
    For j = 2 To Cells(i, Columns.Count).End(xlToLeft).Column
    If Cells(i, j) <> "" Then Cells(i - 1, j) = Cells(i, j)
    Next j
    Rows(i).Delete
    End If
    Next i
    Application.ScreenUpdating = True
    End Sub

    Don Guillett MVP Excel SalesAid Software *** Email address is removed for privacy ***
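    The macro above assumes sorted data and merges each later duplicate row's non-blank cells up into the first row for that name, then deletes the extras. The same logic sketched in Python, with a list of lists standing in for the worksheet rows:

```python
def combine_rows(rows):
    """Merge consecutive rows that share the first cell: non-empty
    cells from later rows fill in the first row's blanks, and the
    later rows are dropped. Assumes rows are sorted by that cell."""
    combined = []
    for row in rows:
        if combined and combined[-1][0] == row[0]:
            merged = combined[-1]
            for j, cell in enumerate(row[1:], start=1):
                if cell != "":
                    merged[j] = cell  # fill the blank in the kept row
        else:
            combined.append(list(row))
    return combined
```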

  • RELEVANCY SCORE 3.00

    DB:3.00:Rsd Showing Duplicate Entry 3k



    Hi All,

    I am trying to figure out why RSD is showing many duplicate entries, even though I have set the match criteria to hostname/domain pair. Sorry, the screenshot is not clear.

    How can we avoid the duplicate entries in RSD?

    DB:3.00:Rsd Showing Duplicate Entry 3k



  • RELEVANCY SCORE 3.00

    DB:3.00:Layers And Psd Files d9



    Using Elements 11, I am trying to duplicate a layer in a PSD file; however, none of the options are available (they are all greyed out).

    Why is this?

    DB:3.00:Layers And Psd Files d9


    Thanks Barbara, it was the 8 bits per channel (or lack of) that was preventing it.

  • RELEVANCY SCORE 3.00

    DB:3.00:Ipv6 Duplicate Error Msg On Gns3 sf



    Hello,

     

    I am trying an IPv6 lab on GNS3, but I keep getting the following duplicate-address error message on my GNS3 router:

     

    %IPV6-4-DUPLICATE: Duplicate address FE80::C000:13FF:FE80:0 on FastEthernet0/0

     

    I tried changing my interface's link-local address, but no result. Please let me know how to get this resolved.

    DB:3.00:Ipv6 Duplicate Error Msg On Gns3 sf

    But after deactivating dead peer detection, the message disappeared and the issue is also resolved.

    And yes, the "show cdp neighbors" command output shows entries about this router. Both with the workaround and without it, it shows entries about this router itself.

    Currently I don't have another Windows PC to test this topology, but I will try it later.

  • RELEVANCY SCORE 3.00

    DB:3.00:Duplicate Lines On Ohr Reports - Identifying The Cause aa


    I am trying to find out the cause of duplicate lines on reports run from Oracle HR using Discoverer.

    When I have encountered this issue previously, it was the result of entries that were not end-dated.

    I have looked at the records in question and cannot find the cause.

    Any ideas?

    Many thanks

    Anna

    Edited by: 856528 on 04-May-2011 06:28

    DB:3.00:Duplicate Lines On Ohr Reports - Identifying The Cause aa


  • RELEVANCY SCORE 3.00

    DB:3.00:Duplicate An Object dm


    Hi,

    I am trying to find a function in Illustrator CS3 (Windows) which allows me to duplicate an object a given number of times with a horizontal and/or vertical offset, a function like the QuarkXPress "step and repeat". I have checked the Illustrator help and not found a way to do what I want without changing the properties (look) of the object.

    Thanks to anybody who can help me.

    Best regards
    DD

    DB:3.00:Duplicate An Object dm

    great! thank you joe, it is exactly what I needed.

    Best regards,
    DD

  • RELEVANCY SCORE 3.00

    DB:3.00:Duplicate Values 3a


    Hi all,

    I'm trying to find all the duplicate records in a table. I ran the query below:

    SELECT * FROM plsource a
    WHERE a.rowid > (SELECT MIN(b.rowid)
    FROM plsource b
    WHERE b.pl_no = a.pl_no
    and b.pltd_cust = a.pltd_cust
    and b.pltd_inter = a.pltd_inter);

    pl_no pltd_cust pltd_inter plitd_inter_no pltd_amt_cr pltd_batch_no
    1 1000 26 69 899 0
    1 1000 27 84 113 13
    4 1124 126 60 222 6
    9 1618 63 12 111 0
    ...I'm trying to create a primary key (pl_no, pltd_cust, pltd_inter) on the plsource table. The above query gives me about 1200 duplicate records in the table; for example, there are 2 duplicates with pl_no=1, pltd_cust=1000 and pltd_inter=26.

    But in order for us to delete the duplicate records, we need to find out which record is bad, and for that I need to select both records for each occurrence. For example, I need to retrieve these records for pl_no=1, pltd_cust=1000 and pltd_inter=26:

    pl_no pltd_cust pltd_inter plitd_inter_no pltd_amt_cr pltd_batch_no
    1 1000 26 69 899 0
    1 1000 26 81 666 43

    So we can decide which row needs to be deleted to create the primary key. Is there a way to select all duplicate records?

    THanks in advance

    Edited by: user10192995 on Jul 29, 2009 11:19 AM

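    One standard way to get every row of each duplicate group, not just the extras, is an analytic count in Oracle, e.g. filtering on COUNT(*) OVER (PARTITION BY pl_no, pltd_cust, pltd_inter) > 1 in an inline view. The same filter sketched in Python, with dicts standing in for the table rows:

```python
from collections import Counter

def all_duplicate_rows(rows, key_cols):
    """Return every row whose key appears more than once, so both
    copies of each duplicate can be compared side by side."""
    def key(row):
        return tuple(row[c] for c in key_cols)

    counts = Counter(key(r) for r in rows)
    return [r for r in rows if counts[key(r)] > 1]
```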

  • RELEVANCY SCORE 3.00

    DB:3.00:Clarification, How Can I Find And Eliminate Duplicate, Only Copied Twice, Pictures ? 3c


    How can I find and eliminate duplicate pictures? These pictures have not been changed, only copied more than once into the file. I am trying to tag hundreds of pictures that have been copied. Selecting a list of untagged pictures, I proceed to tag them but find that I am getting a lot of duplicates. I have to examine each tagged selection, go through the pictures in that tag, find the duplicates, and delete them. I would like to be able to ask the program to list all duplicates and then delete them. Over the months I have copied hundreds of pictures into the program; these pictures have not been edited or changed, only copied more than once.

    DB:3.00:Clarification, How Can I Find And Eliminate Duplicate, Only Copied Twice, Pictures ? 3c

    you can select all the images and choose the option to suggest stacks automatically. This will group similar photos in a stack. Then you can either keep the images in the stack or delete all the photos from the stack except the top image.

  • RELEVANCY SCORE 2.99

    DB:2.99:Find Duplicate And Include Duplicate....Confused. m7


    Hello Guru's,
    I am attempting to find duplicate phone numbers in a phone database. I have 2 tables (agency, tphone_number).
    I need to get the duplicate phone numbers and agency names from the 2 tables.
    agency:(agency_id,agency_name,county,state)
    tphone_number:(agency_id,phone_number).

    I need to find all duplicate phone numbers and include the agency_id, agency name, county, and state of each duplicate for review on possible removal or update.
    I know that to find the extra duplicate numbers I can:
    SQL> select phone_number from tphone_number a where rowid > (select min(rowid) from tphone_number b where b.phone_number = a.phone_number);

    But how do I create a list that will include all duplicate data and present agency_id,agency_name,phone_number,county,state of all the duplicates, not just show the single duplicate, but the duplicate plus both agencies that share that duplicate phone number.

    It's puzzling me, hence the post.
    Cheers in advance.

    DB:2.99:Find Duplicate And Include Duplicate....Confused. m7

    user542952 wrote:
    I added an order by to distribute the phone numbers together to show that they are in fact shared or duplicate.

    Makes no sense; of course they are, according to the HAVING.

    Besides, if you can abstract from phone_id/phone_number, that is precisely what sybrand has.

    Regards
    Peter

  • RELEVANCY SCORE 2.99

    DB:2.99:Zipexception - Duplicate Entry 7f


    Hi all,

    I am trying to zip a number of files (residing in different folders) into the same zip file. However, I encountered a duplicate entry exception because I tried to add two identical zip entries to the zip file.

    The way I try to avoid such an exception is as follows (I will simply ignore the duplicate exception due to the nature of my program):

    public void addToZipFile()
    {
        try
        {
            //some code here
        }
        catch(ZipException ex)
        {
            String msg = ex.getMessage();
            if(msg != null && msg.indexOf("duplicate") >= 0)
            {
                //ignore such an error
            }
            else
            {
                throw ex;
            }
        }
    }

    Is the above method the right way to deal with the duplicate entry exception? Any suggestions?

    Thanks

    DB:2.99:Zipexception - Duplicate Entry 7f

    Hi, thanks for your quick reply. The exception I catch should be ZipException.
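    Rather than matching on the exception's message text (the wording isn't guaranteed to contain "duplicate"), a safer pattern is to track which entry names have already been added and skip repeats before writing. A sketch of that idea using Python's zipfile module; the (arcname, data) input shape is an assumption for illustration:

```python
import zipfile

def add_entries(zip_path, entries):
    """Write each (arcname, data) pair into a new archive, silently
    skipping any arcname that has already been added."""
    seen = set()
    with zipfile.ZipFile(zip_path, "w") as zf:
        for arcname, data in entries:
            if arcname in seen:
                continue  # duplicate entry: skip instead of raising
            seen.add(arcname)
            zf.writestr(arcname, data)
    return sorted(seen)
```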

  • RELEVANCY SCORE 2.99

    DB:2.99:Programs To Find Duplicate Content dm



    Does anyone know any software products/tools available that will assist in finding duplicate content in the Content Server repository? The group I am supporting is looking for tools to help identify duplicate content.

    Any leads to what is available would be helpful.

    Thanks

    DB:2.99:Programs To Find Duplicate Content dm


    You can write your own DFC code and Java Swing application. You can code it to find the duplicated content based on some parameters and delete those objects.

    AFAIK there is no such solution/tool available.

  • RELEVANCY SCORE 2.99

    DB:2.99:Find The Duplicate Meterial No j3



    hi all,

    i have an issue:

    we have loaded 0material master data to 2 InfoObjects and 1 cube, and the user found some duplicate material numbers in the data.

    so how do I find these duplicate material numbers?

    is it possible to create an InfoSet and join both the objects?

    DB:2.99:Find The Duplicate Meterial No j3


    Hi Anshulr,

    I saw your link; it is very helpful to me. Thanks for your quick response.

    Actually I have to compare the material data of the InfoCube and the InfoObject. For this combination, will a customer exit work, or do we have to create an InfoSet?

    Please suggest.

    Thanks,

    Bhaskar

  • RELEVANCY SCORE 2.99

    DB:2.99:How Do I Find Duplicate Words In A Numbers Spreadsheet? a8


    Hello, I've created my first document in Numbers and am trying to figure out how to find duplicate words. Ideally, I'd like a list of words that repeat in the document and how many times. Is this possible? Thank you in advance for the advice.

    DB:2.99:How Do I Find Duplicate Words In A Numbers Spreadsheet? a8

    If your word list is laid out differently, a screen shot would be useful toward providing an alternate solution.Regards,Barry

  • RELEVANCY SCORE 2.99

    DB:2.99:Find Duplicate Files dk


    Hello,
     
    Is it possible to use SharePoint to find duplicate files? I need to identify all duplicate files on our network shared drives.
    Thanks

    DB:2.99:Find Duplicate Files dk

    I got the same need and wrote that query on MOSS Search Database. Hope that helps :
     
    -- Step 1: get all files with short names, MD5 signatures, and size
    select
    md5,
    right(accessurl, charindex('\', reverse(accessurl)) - 1) as ShortFileName,
    accessurl AS Url,
    llVal / 1024 as FileSizeKb
    into
    #listingFilesMd5Size
    from
    MSSCrawlURL y inner join MSSDocProps on ( y.DocID = MSSDocProps.DocID )
    where
    MSSDocProps.pid = 58 -- File size
    and llVal > 1024 * 10 -- 10 Kb minimum in size
    and md5 <> 0
    and charindex('\', reverse(accessurl)) > 1
    -- Step 2: filter duplicated items
    select count(*) AS NbDuplicates, md5, ShortFileName, FileSizeKb
    into #duplicates
    from #listingFilesMd5Size
    group by md5, ShortFileName, FileSizeKb
    having count(*) > 1
    order by count(*) desc
    drop table #listingFilesMd5Size
    -- Step 3: show the report with search URLs
    select *, NbDuplicates * FileSizeKb AS TotalSpaceKb, 'http://srv-moss/SearchCenter/Pages/results.aspx?k=' + ShortFileName AS SearchUrl
    from #duplicates
    order by NbDuplicates * FileSizeKb desc
    drop table #duplicates
     
    http://www.magesi.com/blog/?p=95 
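    The same two-stage idea (group candidates by size first, then confirm with a content hash) also works outside of MOSS. A minimal Python sketch for a local directory tree; the function name and the choice of MD5 are illustrative, not from the thread:

    ```python
    # Group files by size (cheap), then confirm duplicates with an MD5 hash,
    # so files with a unique size are never read at all.
    import hashlib
    import os
    from collections import defaultdict

    def find_duplicates(root):
        """Return {(md5_hexdigest, size): [paths]} for files sharing content."""
        by_size = defaultdict(list)
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                path = os.path.join(dirpath, name)
                by_size[os.path.getsize(path)].append(path)

        dupes = defaultdict(list)
        for size, paths in by_size.items():
            if len(paths) < 2:      # a unique size cannot have a duplicate
                continue
            for path in paths:
                with open(path, "rb") as fh:
                    digest = hashlib.md5(fh.read()).hexdigest()
                dupes[(digest, size)].append(path)

        return {key: group for key, group in dupes.items() if len(group) > 1}
    ```

    Grouping by size first means unique-size files are never hashed, which is the same trick the SQL above plays by keying on the crawler's stored MD5 and size columns.
    
    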

  • RELEVANCY SCORE 2.99

    DB:2.99:Sftp Adapter And Duplicate Check 7k



    Hi,

    My PI development scenario is FILE-PI-IDOC and I am trying to achieve a duplicate check. Under advanced settings I checked Duplicate Checking, and under modification value I gave 100 msec. The duplicate check works, but after some minutes the files are deleted automatically from the SFTP server location and the duplicate check stops happening. Can anyone please let me know how to achieve this and what the modification value should be exactly? I have unchecked the delete option.

    My requirement: a duplicate check, with all files remaining in the SFTP server location until deleted manually.

    Thanks,

    Sanjay Mohanty

    DB:2.99:Sftp Adapter And Duplicate Check 7k


    Hi,

    My PI development scenario ( FILE-PI-IDOC) and I am trying to achive duplicate check.Under advance setting I checked Duplicate checking and under modification value ,I gave 100 mSec.Here the duplicate check done but after some minutes the files are deleted automatically from SFTP server location and duplicacy check not happening. Can any one please let me know how to achive this and waht should be the modification value exactly.I have unchecked delete option.

    My req: Duplicacy check along with all files be there in SFTP server location till deleted manually.

    Thanks,

    Sanjay Mohanty

  • RELEVANCY SCORE 2.99

    DB:2.99:Iphoto Deleting Duplicates j7


    i think there may be one or possibly even 2 copies/duplicates of multiple photos in my iphoto. is there some way of telling my macbook/iphoto to delete all duplicate (or triplicate) photos in my library and keep only one set of original photos?

    i think the duplicate photos are eating up a lot of space on my computer and contributing to the "start up disk is almost full/delete files".

    trying to find the duplicate photos one by one and deleting them seems way too cumbersome. there must be an easier/faster way that i am just not aware of.

    thanks for your help,
    anthony

  • RELEVANCY SCORE 2.98

    DB:2.98:Deleting Duplicate Row In Datagridview Error 11


    I am trying to delete a row if the program finds a duplicate entry.

    DB:2.98:Deleting Duplicate Row In Datagridview Error 11

    i tried to do this, but i cannot detect the duplicate value; it catches only one value

  • RELEVANCY SCORE 2.98

    DB:2.98:Dtp Certain Psa Requests To Cube pf



    Dear All,

    I have a duplicate copy cube of inventory cube. Exactly same.

    The original inventory has been filled with data from 2LIS_03_BX, 2LIS_03_BF, 2LIS_03_UM .

    Now I am trying to fill the duplicate cube from scratch: filling the setup tables, pulling data through 2LIS_03_BX, 2LIS_03_BF, 2LIS_03_UM, and then DTPing to the duplicate cube.

    The pull from ECC to PSA is fine. But when I DTP from PSA to the duplicate inventory cube, the values double. This could be because, at the PSA level, since the datasources are the same, the init pulls are doubled: one done for my original cube and one done now for my duplicate cube.

    Is there a way to DTP only certain requests from the PSA to the cube? This is to separate the PSA pulls for the original cube from those for the duplicate cube.

    Regards,

    Jack Silverz

    DB:2.98:Dtp Certain Psa Requests To Cube pf


    Dear Jack,

    As Prasad said, since you have already loaded the data to your original cube from the PSA, the same data will be there in the PSA in the form of requests, and you need not fill the setup tables or do the rest of the steps again.

    You can simply create the DTP to the copy cube and load the data into the copy cube from the PSA.

    If you want to start from scratch, then you need to delete the data from both cubes and start over from there.

    Before filling the setup tables, make sure that you have deleted the data from the setup tables through LBWQ, LBWG and in RSA7 as well.

    Regards,

    Sai.

  • RELEVANCY SCORE 2.98

    DB:2.98:Duplicate Db Convert Parameters fx


    Hi all,

    11.2.0.3 on AIX 6.

    I am setting up Data Guard using duplicate database. Assume all my target standby db folders/directories are different from the source primary db folders/directories.

    What are the available "convert" parameters I can use? I mean, I want to list all "convert" parameters in RMAN duplicate. I can find only 3:

    PARAMETER_VALUE_CONVERT
    DB_FILE_NAME_CONVERT
    LOG_FILE_NAME_CONVERT

    Is there a temp_file_convert? etc.

    Thanks all,
    pK

  • RELEVANCY SCORE 2.98

    DB:2.98:Import Issue 8d


    Hi All,
    I have exported one table, and that table has duplicate rows. When I am trying to import that exported table into another user,
    it is throwing ORA-01452: cannot CREATE UNIQUE INDEX; duplicate keys found

    I need the same dump in the new user. Could you please provide a solution?

  • RELEVANCY SCORE 2.98

    DB:2.98:Search For Duplicate Files k3


    Can anyone recommend software that will search for duplicate files on a drive automatically? By that I mean find all duplicate files without entering a file name prior to a search. I have a number of external firewire drives and want to copy all the data to an xRaid and then delete any duplicates.
    Thanks..
    Jon

    DB:2.98:Search For Duplicate Files k3


  • RELEVANCY SCORE 2.98

    DB:2.98:Duplicate Component Name: "Xxxxxx" xc


    I'm trying to re-create an application my predecessor created. He was using NetBeans to develop his application and was using several libraries that are no longer trustworthy. I'm using Eclipse. I pulled all the entities across and am trying to clean up the mess. I've got most of my dependencies worked out, but I'm getting an error with the components.xml. My predecessor had several different projects that were dependent on one another, while I'm trying to get the whole thing into one project. However, I am now getting a 'Duplicate component name: xxxxxx' error for the entry in the components.xml and the @Name annotation in the Java classes. Can anyone suggest a reference where I could find a resolution to my problem?

    DB:2.98:Duplicate Component Name: "Xxxxxx" xc

    I am only as vague as you :-)
    Please provide stacktrace and the components.xml for us so that we can help you better.

  • RELEVANCY SCORE 2.97

    DB:2.97:Duplicate Record Error. jp



    Hi,

    I am trying to load data into an InfoObject and I am getting a duplicate records error. I have checked my records in the datasource, which doesn't have any duplicate records in it. Please suggest possible solutions.

    DB:2.97:Duplicate Record Error. jp


    I have encountered the same scenario when loading master data attributes using multiple v.7 datasources to a single MD InfoObject.

    I tried using both transactional MD attribute DataSources, following the steps below:

    1. DB connect load from table 1 via DTP of Customer basic data (creates Cust IDs and populates the majority of attributes)

    2. DB connect load from table 2 via DTP of Customers' bank details (populates supplementary attributes).

    There are no duplicates in either load, but customers from 2 will exist in 1.

    I am getting erroneous messages for every customer in the 2nd load that exists in the first, saying that this is a duplicate data record.

    The request itself turns green and all records are transferred, but 0 are added, with the consequence that no bank attributes are updated.

    Any ideas???

  • RELEVANCY SCORE 2.97

    DB:2.97:Duplicate Photos That Arent Duplicate ps



    i am trying to import photos from my camera roll but Revel keeps telling me they are duplicates - which they are not. i have checked through all the other photos i have uploaded and they are not there. could they be hidden anywhere?

  • RELEVANCY SCORE 2.96

    DB:2.96:Applying Archived Redo Logs To Duplicate Db f9


    Hi All,

    Oracle 9.2.0.4 on Solaris 8

    I was just wondering how I go about applying archived redo logs from my primary db to my duplicate after I have already run the duplicate?

    Duplicate command is:
    duplicate target database to db;

    This runs at 1 AM and finishes around 4 AM, but the user has some data in the primary db that they want put into the duplicate db, and I don't want to duplicate the entire db again as it takes over 3 hours and they can't wait...

    All input is appreciated.

    Thanks

    DB:2.96:Applying Archived Redo Logs To Duplicate Db f9

    Hi Jamie CC,

    In general, after the database has been opened with resetlogs you cannot roll forward with redo.

    Regards,

    Tycho

  • RELEVANCY SCORE 2.96

    DB:2.96:Duplicate Member Outline - Dimension Build Question dx


    I am building a dimension (customer) in a duplicate member outline. I'm using a parent/child dimension build load rule. I know that I need to have the fully-qualified parent name in this file. My issue is this - some of the fully-qualified parent member names are exceeding 80 characters - and the records are being rejected.

    Question - does the fully-qualified parent member name need to include ALL ancestors, or just enough to make the member uniquely identifiable? I'm trying to find a way to make the member names less than 80 characters.

    Thanks!
    - Jake

    DB:2.96:Duplicate Member Outline - Dimension Build Question dx

    It is not necessary to include all the ancestors for a fully qualified parent. You could add the ID or name of the immediate upper level (i.e., its parent name alone) to make it unique.

  • RELEVANCY SCORE 2.96

    DB:2.96:Hashing .... pc



    I am using a 2D array structure (ArrayLists) for storing data,

    and I have to duplicate-check this data row-wise.

    The columns in which I want to duplicate-check data may be 20 or 25, and the rows may come up to 50,000.

    So how can I hash the rows so that I can duplicate-check them?

    Any ideas or hints are cordially appreciated.

    Is hashing a good method for this? Where can I find good articles about hashing/duplicate checking?

    Is converting all the columns for duplicate checking into a String and calling hashCode good?

    regards
    Renjith,

    DB:2.96:Hashing .... pc

    Hashing is a good idea. By duplicate, you mean rows where the values of all columns are identical, right? Then start with overriding the equals() method of your row.

    Create a hashtable.
    Loop through all rows.
    Check if the row is already in the hashtable. (then you have a duplicate)
    Insert the row into the hashtable.

    You might need to override the equals() method of the columns as well, provided they are not strings or any other type that already has an equals method that works as it should.

    When you are finished you could look at the hashCode() method to increase performance.
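    The recipe above maps almost line-for-line onto Python, where a set of row tuples plays the hashtable's role (in Java you would override equals() and hashCode() on the row class instead). The sample rows below are made up:

    ```python
    # The hashtable recipe from the answer above: loop over the rows, and any
    # row already present in the set is a duplicate. Tuples are hashable, so a
    # whole row can serve as the key directly.
    def find_duplicate_rows(rows):
        seen = set()
        duplicates = []
        for row in rows:
            key = tuple(row)        # all 20-25 columns become one hash key
            if key in seen:         # step: "check if already in the hashtable"
                duplicates.append(row)
            else:
                seen.add(key)       # step: "insert the row"
        return duplicates
    ```

    A String concatenation of all columns also works as a key, but tuples avoid accidental collisions such as ("ab", "c") versus ("a", "bc"), which concatenate to the same string.
    
    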

  • RELEVANCY SCORE 2.96

    DB:2.96:Duplicate Keys 8s



    hi all,

    i have an internal table which has duplicate entries.

    after that, by using the sort function and delete adjacent duplicates, i deleted the duplicate entries.

    then i am using insert to insert the records. now it's going to short dump, saying that there are duplicate entries..

    i was unable to solve this issue.

    please give me some suggestions to understand the scenario.

    how to handle duplicate entries? if possible please send the code

    thanks

    laxmi

    DB:2.96:Duplicate Keys 8s


    Hi,

    If you are inserting the records into a database table,

    use MODIFY. If the record is there, it will modify it; otherwise it will INSERT.

    Example

    DATA: T_ZTABLE TYPE TABLE OF ZTABLE.

    MODIFY ZTABLE FROM TABLE T_ZTABLE.

    Thanks,

    Naren

  • RELEVANCY SCORE 2.95

    DB:2.95:Duplicate Database - Performance Question 8p


    Is the number of channels used to duplicate a database (using RMAN) dependent on the number of channels used to create the backupset?

    Example: a 200 GB database (10g), 2 channels used during hot level 0 daily backups, all files to be used for duplication still on disk. To increase the speed of a duplication, should I increase the number of channels used in creating my hot level 0 database backup? Or is there some RMAN command I can add to my run block that will increase the speed of the 200 GB duplication effort?

    The scenario is that I have many databases to duplicate in as short a time as possible. The larger databases take several hours each and I am trying to find a way to make this happen quicker.

    Thank you.

  • RELEVANCY SCORE 2.95

    DB:2.95:How To Find The Duplicate Cis In A Table Using Reporting cz



    Hi All, I want to know whether there are any duplicate entries (CIs) in my table. I want to run a query which shows me only the duplicate entries. Is there a way to create a report with only the duplicate entries? Please let me know whether I can create a script to fetch the duplicate entries from the tables. If yes, where do I deploy this script to generate a report which only contains the duplicate entries?

    DB:2.95:How To Find The Duplicate Cis In A Table Using Reporting cz


    Hello

    You can create a UI page using the following script and include it as a widget/gauge in your homepage/dashboard.

    I've created a UI page to display the list of duplicate CIs using the code below:

    <?xml version="1.0" encoding="utf-8" ?>
    <j:jelly trim="false" xmlns:j="jelly:core" xmlns:g="glide" xmlns:j2="null" xmlns:g2="null">
    <body>
    <div><b><u>Duplicate CI List</u></b>
    <br />
    <table id="sTable" align="center" style="width:600px">
    <tr bgcolor="#dbe5f1"><th><b>CI Name</b></th><th><b>CI Count</b></th><th><b>Environment</b></th></tr>
    <g2:evaluate jelly="true">
    var ciRec = new GlideAggregate("cmdb_ci");
    ciRec.addAggregate('COUNT', 'name');
    ciRec.groupBy('name');
    ciRec.groupBy('u_environment');
    ciRec.addHaving('COUNT', 'name', '>', '1');
    ciRec.query();
    ciRec;
    </g2:evaluate>
    <j2:while test="$[ciRec.next()]">
    <tr><td>$[ciRec.name]</td><td>$[ciRec.getAggregate('COUNT', 'name')]</td><td>$[ciRec.u_environment]</td></tr>
    </j2:while>
    </table>
    <br />
    </div>
    </body>
    </j:jelly>

  • RELEVANCY SCORE 2.95

    DB:2.95:Duplicate Files s3


    i just figured out that itunes has a whole lot of duplicate song files. is there an easy way for me to find all of the duplicates so that i can delete them?

    DB:2.95:Duplicate Files s3

    You might also look at the many useful Applescripts found at Doug's Applescripts for iTunes page. In particular is the one to "corral duplicates"...

    http://tinyurl.com/mwnv3

    Basically it finds all tunes it thinks may be duplicates and sticks them into their own playlist. You can then go in and review them and delete the ones that truly are duplicates and not simply other versions of the same song.

    Patrick

  • RELEVANCY SCORE 2.95

    DB:2.95:How To Identify Duplicate Rows df


    How can I check if there is any duplication in the column?

    For example, I am running this query and I am trying to find out if there are duplicate rows. Can someone help?

    select permit_no1 from U_permit
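    A common way to surface the duplicates behind a query like this is GROUP BY with HAVING COUNT(*) > 1. A sketch using Python's sqlite3 and a made-up U_permit table; the real table may live in another RDBMS, but the query shape is the same:

    ```python
    # Duplicate values in a column show up as groups with a count above 1.
    import sqlite3

    def duplicate_permits(conn):
        """Return (permit_no1, occurrences) for values appearing more than once."""
        return conn.execute(
            """
            SELECT permit_no1, COUNT(*) AS occurrences
            FROM U_permit
            GROUP BY permit_no1
            HAVING COUNT(*) > 1
            ORDER BY occurrences DESC
            """
        ).fetchall()

    # Illustrative sample data: P-100 is inserted three times.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE U_permit (permit_no1 TEXT)")
    conn.executemany(
        "INSERT INTO U_permit VALUES (?)",
        [("P-100",), ("P-200",), ("P-100",), ("P-300",), ("P-100",)],
    )
    ```

    The bare `select permit_no1 from U_permit` only lists values; the GROUP BY/HAVING form is what actually isolates the repeated ones.
    
    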

  • RELEVANCY SCORE 2.95

    DB:2.95:Finding Duplicates:Minus Set Operator In Dealing With Internal Tables zk



    Dear experts,

    I am a newbie to ABAP development. I have been given an assignment to find the duplicate list of vendors in the lfa table.

    Now "duplicate" does not mean that the text tokens must be exactly equal to conclude them as duplicates; it could also be like

    1111 Vendor ABC

    1222 ABC Vendor

    If anybody has a clue how to work on such a problem, please come forward.

    Right now I have just tried how to find exact duplicates; I found the ON CHANGE command, and it does work.

    Then I am trying a new way which should do the same thing.

    I did as per this algorithm:

    1. Compute the wholesome list in one internal table, itab1.

    2. Use delete adjacent duplicates into itab2.

    3. I feel itab3 = itab1 - itab2 will contain all the duplicates in itab3.

    Can anyone give me a hint? How can I do A-B?

    DB:2.95:Finding Duplicates:Minus Set Operator In Dealing With Internal Tables zk


    Dear experts,

    I am newbie to ABAP developement,i have been given an assignment to find the duplicate list of vendors in lfa table.

    Now duplicate list doesnot means that text tokens will be just exact to conclude them as duplicate ,it could also be like

    1111 Vendor ABC

    1222 ABC Vendor

    If anybody has clue ,how to work on such a problem ,plz come forward.

    Right now i just tried initially how to find exact duplicates,i found on change command,it do works.

    Then i am trying a new way which should just do the same thing.

    I did as per this algorithm

    1.Compute wholesome list in one internal table itab1

    2.Used delete adjacent duplicates in itab2.

    3.I feel itab3=itab1-itab2 will contain all duplicates in itab3.

    Can anyone give me a hint.How can i do A-B ?.
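    For the near-duplicates in the example (1111 Vendor ABC vs 1222 ABC Vendor), one common approach, sketched here in Python with made-up records rather than ABAP, is to sort each name's tokens and group on the result:

    ```python
    # "Vendor ABC" and "ABC Vendor" share the same tokens in a different order,
    # so sorting the lowercased tokens of each name yields a grouping key.
    from collections import defaultdict

    def duplicate_vendors(vendors):
        """vendors: iterable of (vendor_id, name); returns groups of suspects."""
        groups = defaultdict(list)
        for vendor_id, name in vendors:
            key = " ".join(sorted(name.lower().split()))
            groups[key].append((vendor_id, name))
        return [group for group in groups.values() if len(group) > 1]
    ```

    For step 3's exact-duplicate case, Python's collections.Counter gives the multiset difference directly: Counter(a) - Counter(b) keeps only the surplus entries.
    
    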

  • RELEVANCY SCORE 2.95

    DB:2.95:How To Delete Duplicate Tv Shows On Ipad Air? dd


    i am trying to delete duplicate tv shows from ipad air

    DB:2.95:How To Delete Duplicate Tv Shows On Ipad Air? dd


  • RELEVANCY SCORE 2.95

    DB:2.95:How Do You Duplicate A Page In Pages? 3x


    I am trying to make address labels with an Avery template in Pages and I can't figure out how to duplicate the page setup for additional pages.

    Thanks,
    Scott

    DB:2.95:How Do You Duplicate A Page In Pages? 3x

    I use Apple Maps pretty much every day, and it works just fine. Not as reliable as the old one, but pretty dang good for a year-old application.

    Pages 5 isn't nearly as good as Maps. Pages 5 is more like if Apple had taken away GMap and replaced it with a high-res, zoomable JPG of the earth's surface... and not included "Get Directions."

    Then Cook would claim it's "the Big Picture in Maps."

  • RELEVANCY SCORE 2.95

    DB:2.95:Duplicate Data Records Indicator / The Handle Duplicate Records ss



    Hi All,

    I am getting double data in two requests. How can I delete the extra data using the "Duplicate Data Records" indicator?

    I am not able to see this option in the PSA or in the DTP for handling duplicate records.

    Can you help me find the option in the PSA/DTP?

    Regards

    Amit Srivastava

    DB:2.95:Duplicate Data Records Indicator / The Handle Duplicate Records ss


    Hi,

    "Handle Duplicate Records" is only for master data. "Handle" means the load doesn't fail if duplicate records are fetched.

    Is this a one-time issue or a recurring scenario?

  • RELEVANCY SCORE 2.95

    DB:2.95:Finding Duplicate Smtp Addresses Using Ldp.Exe pj


    Hi,

    I am trying to find a way to find a number of duplicate proxy addresses in AD. I have read many forums that point towards using ldp.exe.

    My question is: how do I find multiple duplicate proxy addresses using this tool?

    I hope someone can help me out, as I have spent the past two days tearing my hair out trying to manipulate an ldifde export that has over 200,000 rows in it!

    If this can't be done using this tool, then surely there is one out there that can perform this task?

    Thanks in advance
    Rgds
    Lee

    DB:2.95:Finding Duplicate Smtp Addresses Using Ldp.Exe pj

    That there's no output is understandable since you removed the VBS statement that writes the output. :-)
    I'm afraid I have no way to help you with this since it seems to run okay on the servers I have access to.
    Just to verify, line 234 is:
    Wscript.StdOut.Writeline ln
    And ln is populated on line 233 like this:
    ln = k & vbTab & i
    I'd be inclined to rewrite the script using PowerShell if I were you. You can eliminate all the ADSI/ADO junk that used to take up so much of everyone's time.

    --- Rich Matheisen MCSE, Exchange MVP
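    As an alternative to manipulating the 200,000-row ldifde export by hand, the duplicate proxy addresses can also be counted with a short script. A Python sketch, assuming the standard LDIF one-attribute-per-line layout; the DNs and addresses are made up:

    ```python
    # Walk an ldifde/LDIF export and report any proxy address attached to more
    # than one object. Assumes the usual "dn: ..." record header followed by
    # "proxyAddresses: SMTP:user@example.com" attribute lines.
    from collections import defaultdict

    def duplicate_proxy_addresses(ldif_lines):
        owners = defaultdict(set)           # case-folded address -> set of DNs
        current_dn = None
        for line in ldif_lines:
            if line.lower().startswith("dn:"):
                current_dn = line[3:].strip()
            elif line.lower().startswith("proxyaddresses:") and current_dn:
                address = line.split(":", 1)[1].strip().lower()
                owners[address].add(current_dn)
        return {addr: dns for addr, dns in owners.items() if len(dns) > 1}
    ```

    Feeding it the export file line by line avoids ever loading all 200,000 rows into a spreadsheet.
    
    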

  • RELEVANCY SCORE 2.95

    DB:2.95:Common Controls Vs All Windows Forms 31


     
    I am trying to figure out the difference between All Windows Forms and Common Controls. Both of these are listed in the toolbox in VS 2005 Express. They have some duplicate listings of controls. I am curious to find out what the difference is between them.

                                                                         -thanks 

    DB:2.95:Common Controls Vs All Windows Forms 31

    Yes, they are the same. Most of the time you will pick a control from the categorized groups, i.e. the Common Controls group or the Data group. If you cannot find it there, you can search the All Windows Forms group.
     

  • RELEVANCY SCORE 2.95

    DB:2.95:Unique Key Violation Detected By Database (Duplicate Entry 'Test.1' For Key 'Sys_User_U') zs



    All,

    While creating a user in a domain-separated environment of SN, I am getting the below error, and I am unable to find that user id in the user field. Any suggestions will be appreciated.

    Unique Key violation detected by database (Duplicate entry 'test.1' for key 'sys_user_u')

    DB:2.95:Unique Key Violation Detected By Database (Duplicate Entry 'Test.1' For Key 'Sys_User_U') zs


    Hi Mishra,

    This error can occur when you write current.update in a before business rule. Try removing that and check.

    Regards,

    Harish.